Facebook’s dark design: it’s not just the algorithms

In the midst of our current Facebook debate, have we ignored a central question? Public scrutiny has focused almost entirely on the company and its practices. Congressional testimony from a whistleblower earlier this fall – and the Wall Street Journal’s continuing exposés – revealed the extent to which its employees knew, through their own research, of the damage caused by their product. And yet the product itself was strangely absent from much of the discussion.

Algorithms have been discussed, including how Facebook determines which posts users see, and how its rankings promote sensational content, fuel extremism and help spread misinformation. And user-interface experts have long noted the myriad small and subtle ways in which sites like Facebook nudge users into more frequent and more impulsive interactions.

These things are important, and their perverse effects are well known, if not always acknowledged. But the essence of a software product (such as the Facebook application) lies neither in the buttons and colors that appear on the screen, nor in the algorithms that prioritize one piece of data over another. Instead, it resides in an app’s concepts – the behavioral building blocks we interact with – which shape how we use and understand it, and determine the impact of our actions.

The concepts of ‘news feed’, ‘likes’, ‘friends’, ‘tagging’ and so on are at the heart of Facebook, and examining them carefully reveals how often Facebook’s design serves interests that are not those of its users, but of Facebook itself. In other words, these concepts drive Facebook’s broader societal impacts, and the damage they cause is not accidental, but intentional.

The purpose of the News Feed, according to Facebook, is to connect people to the stories that matter most to them. If so, you should be able to filter and sort posts just as you would items in an online store. And yet Facebook’s News Feed not only lacks the most basic controls, it isn’t even stable: refreshing your browser window brings up a fresh selection of posts, not only changing their order but even dropping posts you might have wanted to read.
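To make the contrast concrete, here is a minimal sketch in Python – with invented data and function names, not Facebook’s actual code – of the difference between a store catalog you can filter and sort deterministically and a feed whose selection shifts on every refresh:

```python
# Illustrative sketch only: a stable, user-controlled catalog view
# versus a feed that re-draws its selection each time it is refreshed.
import random

posts = [
    {"id": 1, "author": "alice", "topic": "news", "score": 0.9},
    {"id": 2, "author": "bob", "topic": "sports", "score": 0.4},
    {"id": 3, "author": "carol", "topic": "news", "score": 0.7},
]

def store_view(items, topic=None, key="score"):
    """Deterministic: the same filter and sort always yield the same list."""
    selected = [i for i in items if topic is None or i["topic"] == topic]
    return sorted(selected, key=lambda i: i[key], reverse=True)

def feed_view(items, k=2):
    """Unstable: each call draws a partly random selection, so posts
    may reorder or disappear entirely between refreshes."""
    return random.sample(items, k=min(k, len(items)))

print(store_view(posts, topic="news"))  # identical output on every call
print(feed_view(posts))                 # may differ on every call
```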

We’re so familiar with this concept that we don’t notice how strange it is. The concept of the News Feed has conditioned us to accept what seems like an almost random selection of posts, opening a void into which Facebook can insert the algorithms that override our own choices.

Imagine how many books Amazon would sell if it “connected us to the books that matter most” by showing us ever-changing lists of titles. Now, you might argue that such practical concerns aren’t what Facebook’s designers have in mind. It might surprise you, then, to read in their own manifesto of seven guiding principles the one entitled “useful”, which begins: “Our product is more utilitarian than entertaining, intended for repeated daily use, delivering effective value.”

Sometimes the problem is not an individual concept but the way several concepts are combined. We all know the concept of the upvote, in which users’ approvals or disapprovals of items (comments on a newspaper article, say) are aggregated to rank them by popularity. We have also seen (in Slack, for example) the concept of the reaction, where readers can respond to a post with a smiley or a heart. Facebook’s Like concept ingeniously merges these two: reacting to a post with a heart also casts an implicit vote. What not all users realize is that an angry reaction counts as an upvote too, and, according to a recent report, any emotional reaction counts for more than a plain like. A design that separated these concepts would let users make independent decisions: to express anger at a post, for example, without contributing to its promotion. This, however, would not serve Facebook’s interests.
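Here is a minimal sketch in Python – hypothetical names, not Facebook’s API – of what keeping the reaction and upvote concepts separate looks like, next to a merged design in which any reaction also promotes the post:

```python
# Illustrative sketch only: separated 'reaction' and 'upvote' concepts
# versus a merged Like in which every reaction also counts as a vote.
from collections import defaultdict

reactions = defaultdict(list)   # post_id -> list of emoji reactions
upvotes = defaultdict(int)      # post_id -> score used for ranking

def react(post_id, emoji):
    """Separated design: expressing an emotion does not affect ranking."""
    reactions[post_id].append(emoji)

def upvote(post_id):
    """Separated design: promoting a post is a deliberate, distinct act."""
    upvotes[post_id] += 1

def merged_like(post_id, emoji="like", weight=1):
    """Merged design, as the article describes it: any reaction, angry
    included, also adds to the post's ranking score."""
    reactions[post_id].append(emoji)
    upvotes[post_id] += weight  # reportedly, emotional reactions weigh more

react(42, "angry")                   # anger expressed; ranking untouched
merged_like(43, "angry", weight=5)   # anger expressed; post also boosted
```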

Problems can also arise in the way concepts are synchronized with one another. Some degree of automation, in which actions in one concept trigger actions in another, is often desirable; if you decline an invitation in your calendar, for example, you expect the event to be deleted. But such links do not always correspond to what the user wants. When you tag someone in a photo on Facebook, their name is attached to the image. In addition, however, the photo’s visibility changes: now all the friends of the tagged person can see it. In effect, this means that someone can share your photo not only with their friends but also with yours. You can disable this behavior, but unfortunately it is the default, and many users aren’t even aware of it. Worse, anyone can tag you, even someone who isn’t a friend, and it’s not clear what control you have in that case: Facebook’s help page ominously warns that tags from people you are not friends with may appear in your timeline review.
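A minimal sketch in Python – a hypothetical model, not Facebook’s implementation – shows how this kind of synchronization quietly widens a photo’s audience unless the default is changed:

```python
# Illustrative sketch only: tagging is synchronized with visibility,
# so a tag silently expands who can see the photo by default.

class Photo:
    def __init__(self, owner, audience):
        self.owner = owner
        self.audience = set(audience)  # who can currently see the photo
        self.tags = set()

def tag(photo, person, friends_of, expand_audience=True):
    """Attaching a name also triggers an action in another concept:
    by default, the tagged person's friends gain access to the photo."""
    photo.tags.add(person)
    if expand_audience:  # the default many users are unaware of
        photo.audience |= friends_of[person]

friends = {"dana": {"erin", "frank"}}
pic = Photo(owner="alice", audience={"bob"})
tag(pic, "dana", friends)
print(pic.audience)  # now includes 'erin' and 'frank', dana's friends
```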

In all of these cases, Facebook’s design is intricate and carefully considered. The problem isn’t some glaring design flaw that subverts a concept’s purpose. Rather, it is that the purpose being served may not be the one we users had in mind: it is Facebook’s, not ours.

Today’s consumers are more design-conscious than ever before. We expect our devices and products to be easy to use, with features that match our needs. As software products become ever more pervasive in our lives, we must balance an appreciation of the benefits they bring with a cool assessment of the risks they pose. Such an evaluation should start with a product’s basic concepts, asking a simple question: what needs are they intended to meet?


Daniel Jackson is a professor of computer science at MIT and author of The Essence of Software: Why Concepts Matter for Great Design.
