How Companies and Social Media Use Algorithms to Make ‘Scary’ Predictions

WORCESTER — Technology and social media companies like Facebook can use artificial intelligence to track personal information and make predictions, even about people who don’t have a social media account, Jennifer Golbeck told a Worcester audience Wednesday evening.

Golbeck, director of the Social Intelligence Lab at the University of Maryland, College Park, spoke at the Worcester Economic Club dinner meeting on Wednesday. A computer scientist, Golbeck has studied developments in artificial intelligence, or AI, looked at how social media and other tech companies have used AI, and is working with her team to create new AI algorithms.

In an interview ahead of her speech, Golbeck said she wanted to dig deeper into the amount of personal data companies collect.

“We all kind of know and hear that line about how all of this data is collected about us and all of these algorithms affect us,” Golbeck said. “A lot of what I’m going to talk about is painting a picture of the vastness of this space, which really freaks people out, and then talking about some of the ways we can protect ourselves.”

Golbeck said people are often lectured about not sharing too much personal information online, but she said companies such as Facebook have ways of creating “ghost” profiles of people who choose not to use them.

“There’s really no way to personally stay out of the system,” Golbeck said. “You end up on it and are watched if you participate in the digital society.”

When friends and colleagues of people without a Facebook profile install the app on their phones, their contact lists are saved, and the algorithms can build profiles of non-users based on the characteristics and jobs of the people who have their numbers, Golbeck said.

Citizens in the United States need to fundamentally rethink their approach to data ownership, Golbeck said, and people should push elected officials to give them more choice in how their data is collected.

While Golbeck believes AI can be used for benevolent purposes, she focused her talk on how companies can use algorithms to collect data about people without their consent.

Golbeck recounted a story from a 2012 New York Times article about Target using a teenage girl’s shopping history to determine she was pregnant, and sending her coupons before her own father knew.

While the computer scientists who design the algorithms can’t definitively explain why certain purchases or social media activities are predictive, Golbeck said the algorithms can detect statistical anomalies that are.

Use of your data

Facebook activity has also been analyzed to infer a person’s IQ and other traits.

Online activity such as Facebook likes, viral videos and fake news spreads like a disease, Golbeck said. Researchers can use disease-spread models from the Centers for Disease Control and Prevention to track how information spreads on Facebook. Researchers also know that people tend to be friends with those who share similar characteristics, Golbeck said.

These two factors allow researchers to see how a liked Facebook page can spread to people with similar traits such as a high IQ.

Algorithms have advanced to the point where they can predict whether a person’s relationship will last and whether an attempt to get sober from alcohol will stick, Golbeck said.

However, algorithms can reproduce the racial biases that exist in human systems, Golbeck said. She cited an algorithm used in Florida to predict recidivism and inform parole decisions that was found to be biased against Black people.

“Artificial intelligence has this veneer of objectivity. It’s math. It’s data. It can’t be racist. It can’t be biased,” Golbeck said. “But that’s because artificial intelligence just takes data from what we humans do and learns to replicate it.”

Home apps like Amazon’s Alexa or Nest thermostats also collect data that businesses can use for later profit-making, Golbeck said.

While search engines like Google collect user data, the search engine DuckDuckGo has emphasized user privacy and does not track its users, Golbeck said.

Not all uses of algorithms have to be scary, Golbeck said, using an example of how Netflix comes up with recommendations on what users should watch next.

Healthcare spaces have used data collected about patients throughout their treatment for purposes that can have a profound impact on care, Golbeck said.

In commercial settings, too, Golbeck said AI can be used for good. If companies are transparent in their use of algorithms and give customers a way to opt out, that could solve a lot of problems, Golbeck said.

“This technology is not going away, and as companies start using it, as new companies come up that start using it, I think understanding what the challenges are and being prepared to take an ethical approach to it is something where I think there’s a lot of opportunity and a competitive advantage,” Golbeck said.

Sharon D. Cole