Facebook could make its algorithms really work for you

Separately, draft legislation would require any major platform using an “opaque” algorithm – one whose inputs are not obvious because it relies on user-specific data the user has never expressly provided – to offer a more transparent alternative based only on information users have expressly provided. Meanwhile, Facebook’s leaked research said users should have more options to customize their algorithmically curated experience.

How do we decide which of these options is best?

We don’t have to. These approaches could be combined into a single system that responds to a previous promise from Facebook’s president of global affairs, Nick Clegg, to give “people a better understanding and control over how its algorithms rank content.”

A Facebook blog post from January 2021 describes how its News Feed algorithm works. Machine learning is used to predict the likelihood that each user will share, comment on, like, react with a “haha,” and so on, for any post the user might see. A weighted sum then aggregates these probabilities into a single number called the “value” of the post to the user.

The weights in this sum play a central role. Facebook initially gave all emoji reactions five times the weight of a like, but it ultimately reduced the weight of the love and sad emojis to twice that of a like. The weight of the angry emoji has been reduced to zero, to cut the amount of toxic material users encounter. Facebook also estimates how close you are to each of your friends and incorporates this estimate so that posts from closer friends receive a higher value. The subject of the post is weighted too – this is how Facebook can reduce the distribution of political content when it wants to.
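To make that scoring step concrete, here is a minimal sketch in Python. Only the relative reaction weights mirror the figures described above (love and sad at twice a like, angry at zero); every other name and number – the feature labels, the comment and share weights, the closeness and topic multipliers – is an illustrative assumption, not Facebook’s actual code.

```python
# Illustrative sketch of a News Feed-style "value" score: a weighted sum of
# predicted engagement probabilities, scaled by closeness and topic weights.

ENGAGEMENT_WEIGHTS = {
    "like": 1.0,
    "love": 2.0,      # reduced from 5x a like to 2x, per the article
    "sad": 2.0,       # likewise
    "angry": 0.0,     # zeroed out to curb toxic content
    "comment": 15.0,  # assumed: richer engagement weighted more heavily
    "share": 30.0,    # assumed
}

def post_value(predicted_probs, closeness=1.0, topic_weight=1.0):
    """Combine predicted engagement probabilities into one 'value' number.

    predicted_probs: dict mapping engagement type -> probability in [0, 1],
        as estimated by a machine-learning model for this user and post.
    closeness: multiplier for how close the poster is to the user (assumed form).
    topic_weight: multiplier for the post's subject, e.g. politics (assumed form).
    """
    base = sum(ENGAGEMENT_WEIGHTS.get(kind, 0.0) * prob
               for kind, prob in predicted_probs.items())
    return base * closeness * topic_weight

# A post the model thinks you will probably like and might share:
print(post_value({"like": 0.6, "share": 0.1, "angry": 0.4}, closeness=1.5))
# -> (0.6*1 + 0.1*30 + 0.4*0) * 1.5 = 5.4
```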

That is the output side of the algorithm – but what about its inputs? What does Facebook feed into these predictions about you? We know the algorithm draws on a lot of your personal data: your demographics, group memberships, engagement history, keyword searches and so on. It uses all the information it can access to predict your engagement probabilities.

But what if we had more of a say in the information that feeds this process? Facebook should group the algorithm’s inputs into a manageable number of easy-to-understand collections and provide users with on/off toggle switches for each. Turn off the gender switch to enjoy a gender-neutral News Feed. Disable “engagement history” if you don’t want posts ranked based on your past actions on the platform. Disable “personal data” if you want rankings to be based solely on the content of posts and how other users have interacted with them, rather than anything specific to you.
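A minimal sketch of how such toggles could work: each switch corresponds to a named group of inputs, and anything the user has switched off is stripped out before the prediction model ever sees it. The group names and feature keys here are hypothetical.

```python
# Hypothetical input-group toggles: switched-off groups are removed from the
# user's feature set before any engagement prediction is made.

INPUT_GROUPS = {
    "gender": ["gender"],
    "engagement_history": ["past_likes", "past_comments", "watch_time"],
    "personal_data": ["age", "location", "group_memberships", "search_keywords"],
}

def apply_toggles(user_features, toggles):
    """Keep only the features whose groups the user has left switched on."""
    blocked = set()
    for group, enabled in toggles.items():
        if not enabled:
            blocked.update(INPUT_GROUPS.get(group, []))
    return {key: value for key, value in user_features.items() if key not in blocked}

# A user who wants a gender-neutral feed not shaped by past clicks:
features = {"gender": "F", "past_likes": 320, "watch_time": 42, "location": "Boston"}
print(apply_toggles(features, {"gender": False, "engagement_history": False}))
# -> {'location': 'Boston'}
```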

Different users prefer different levels of data privacy, and these toggle switches would allow users to choose what types of personal data they are comfortable sharing with the News Feed algorithm.

Facebook should also reveal the different forms of engagement its algorithms predict and allow users to adjust the weightings. Do you enjoy getting into political arguments with strangers online? Then raise the dials for angry and “haha” reactions, long comments and political content, while lowering the dial for your proximity to the poster. Don’t like getting into such arguments? Then turn the dials the other way.
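A sketch of the dial idea, building on the scoring sketch above: each dial a user sets simply overrides the platform’s default weight for that form of engagement or signal. The dial names and default numbers are assumptions.

```python
# Hypothetical user dials: each dial the user adjusts overrides the platform's
# default weight for that form of engagement or signal.

DEFAULT_WEIGHTS = {
    "like": 1.0, "haha": 1.0, "angry": 0.0,
    "long_comment": 15.0, "political_topic": 1.0, "closeness": 1.0,
}

def dialed_weights(user_dials, defaults=DEFAULT_WEIGHTS):
    """Return the weights used for this user's ranking: platform defaults,
    overridden by whatever dials the user has adjusted."""
    weights = dict(defaults)
    weights.update(user_dials)
    return weights

# Enjoys political sparring with strangers:
arguer = dialed_weights({"angry": 2.0, "haha": 2.0, "long_comment": 25.0,
                         "political_topic": 3.0, "closeness": 0.2})

# Would rather avoid such arguments entirely:
peacekeeper = dialed_weights({"long_comment": 5.0, "political_topic": 0.2,
                              "closeness": 2.0})
print(arguer)
print(peacekeeper)
```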

These dials would allow each of us to specify the types of content and interactions we enjoy, rather than having Facebook engineers and executives decide for us.

Choosing a persona

A fundamental problem, widely documented in the Facebook whistleblower leaks, is the discrepancy between what many of us want to see on social media and what the News Feed algorithm thinks we want based on our actions online. In short, our clicks and comments are often impulsive, yet they are used to determine the content we see – and the result is a proliferation of extremism, hate and misinformation. The proposed dials would not eliminate this problem, but they would help reduce it, because they would let users tell the algorithm what to prioritize.

Many users might find it overwhelming to have so many switches and dials to fine-tune. So, to make things easier, Facebook could provide a handful of preset configurations: news addict, casual browser, maximum privacy, and so on. We already have presets for our TVs and home audio systems – movie mode, sports mode, concert mode – so why not offer options like these for our social media systems as well? Facebook could even let users import configurations from third-party organizations. For example, a digital rights organization such as the Electronic Frontier Foundation could publish a recommended list of switch and dial settings, and users could simply click a button to import them into their News Feed settings.
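One way those presets and imports could be represented: a preset is just a named bundle of toggle and dial settings, and a third-party recommendation is the same bundle published as JSON. The preset names echo the examples above; their contents and the JSON format are hypothetical.

```python
# Hypothetical presets: named bundles of toggle and dial settings, plus an
# importer for configurations published by third parties as JSON.
import json

PRESETS = {
    "news_addict":     {"toggles": {"engagement_history": True},
                        "dials": {"political_topic": 2.0, "share": 1.5}},
    "casual_browser":  {"toggles": {"engagement_history": True},
                        "dials": {"political_topic": 0.5, "closeness": 1.5}},
    "maximum_privacy": {"toggles": {"personal_data": False, "gender": False,
                                    "engagement_history": False},
                        "dials": {}},
}

def import_preset(json_text):
    """Parse a configuration published by a third party (e.g. a digital
    rights group) with 'toggles' and 'dials' keys."""
    config = json.loads(json_text)
    return {"toggles": config.get("toggles", {}), "dials": config.get("dials", {})}

# Applying an imported recommendation with one click:
recommended = '{"toggles": {"personal_data": false}, "dials": {"angry": 0.0}}'
print(import_preset(recommended))
```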

Facebook has resisted algorithmic transparency in part to make the system harder to game: knowledge of the algorithm helps people craft content that rises to the top. But if each user’s News Feed used different inputs and weights, it would be harder to game the rankings, because what rises to the top for some users would sink to the bottom for others. And this level of transparency wouldn’t reveal proprietary information that Facebook arguably deserves to keep secret: we would know what the algorithm predicts, but not how it predicts it.

Perhaps Congress could require all major internet platforms that rely on data-driven algorithms to rank content to provide toggle switches controlling which inputs are used. Machine learning is often described as a black-box process. It’s time to let everyone choose what kind of data – and especially how much personal data – to put in the box.

Noah Giansiracusa is Assistant Professor of Mathematics and Data Science at Bentley University and author of “How Algorithms Create and Prevent Fake News.”
