Planet Money: The Planet Money Indicator: NPR






And I’m Adrian Ma.

WONG: John Verdi was watching a classic movie the other night — a movie I’m obsessed with too — the 1991 action movie “Point Break.”

Was that, like, the 100th time you watched “Point Break,” the 50th time?

JOHN VERDI: It’s definitely in the tens.

MA: Oh, my God. "Point Break" – I just saw this movie a few weeks ago for the first time.

WONG: Congratulations. For those of you who haven't seen it, it stars Keanu Reeves as an FBI agent who goes undercover with a gang of bank-robbing surfers led by Patrick Swayze.

MA: Totally plausible plot.

WONG: (Laughs) So many great scenes. John watched "Point Break" on HBO Max, and afterward, the service recommended other crime thrillers from the early '90s — a movie called "Ricochet," starring Denzel Washington, and another called "Rush," about undercover cops working among drug dealers.

MA: And that's an experience most of us have had, right? I mean, we shop online or we stream music, and these apps or websites say, hey, you liked that? Maybe you'll like this thing, too.

WONG: These are examples of algorithms at work, and John is well aware of this Internet plumbing because he's an expert in digital privacy.

VERDI: Algorithms are computer programs that make decisions based on user input. So we all use algorithms every day, whether we know it or not.

MA: Algorithms are like the secret sauce of a lot of online apps and services – this powerful mix of computer code and data generated by you and me – you know, ordinary people doing stuff online. And those secret sauces – they’ve shaped our online experiences into something more personal, more focused. But increasingly, they are attracting the attention of government regulators.

WONG: Scrutiny is one thing, but what does a company have to be accused of doing for the feds to say, throw your secret sauce down the drain?


WONG: Earlier in March, the Federal Trade Commission announced that it had taken action against a company called Kurbo.


UNIDENTIFIED PERSON: With Kurbo, making smart food choices is as easy as understanding traffic lights.

WONG: So its parent company is WW International, formerly known as Weight Watchers, and Kurbo is a weight loss app for kids and families. The app collects personal information such as weight, physical activity, and what you eat.


UNIDENTIFIED PERSON: After a meal or snack, open Kurbo’s tracking feature. Enter the food name, color and number of servings.

WONG: And then it provides personalized feedback based on that data. John Verdi is senior vice president of policy at the Future of Privacy Forum, a think tank in Washington, D.C. And he says algorithms were an important part of Kurbo's service.

VERDI: Kurbo, in this case, used algorithms to better make recommendations about health, fitness, and weight loss. These algorithms were not only based on the knowledge and expertise of individuals in the company. They were also based on an analysis of user data.

MA: In other words, the Kurbo app wasn’t just providing generic advice like eating more fruits and vegetables. It offered personalized advice powered by algorithms built with data on clients’ diets and activity.

WONG: And what got Kurbo in trouble with the FTC is that there's a law specifying how children's data has to be handled. It's called the Children's Online Privacy Protection Act, and it says services that collect and use the personal information of anyone under the age of 13 must first get their parents' permission.

MA: Now the FTC is alleging that Kurbo marketed its app to children as young as 8 and didn’t get their parents’ permission. On top of that, the FTC claims that the company actually encouraged kids under 13 to, you know, fake their age when they signed up. The FTC says failure to obtain this permission means that this data on users under 13 was in fact collected illegally.

WONG: I contacted Kurbo for comment, and they said they disputed that their historical practices violated the law.

MA: Ultimately, Kurbo and the FTC reached a settlement that included a $1.5 million fine, which the company says is not an admission of wrongdoing. But the fine isn't the part of the settlement that caught people's attention. It's what else the government ordered Kurbo to do.

WONG: The company is required to delete all data illegally collected from children under the age of 13 and to destroy all algorithms built with that data – pour that secret sauce down the drain.

MA: This settlement is the third time in less than three years that federal authorities have ordered companies to destroy their algorithms because, regulators say, they were built with ill-gotten data. The first time, in 2019, involved a company called Cambridge Analytica. Remember that controversy?

WONG: Yeah. That one centered on targeted political advertising on Facebook. And then in 2021, the FTC settled with the makers of a photo-sharing app that the government said had improperly used customer photos to develop facial recognition algorithms.

MA: John says the Kurbo settlement confirms that the FTC is serious about using algorithmic destruction as a deterrent.

VERDI: This type of FTC action can be really scary for businesses. There's a good line in a James Bond movie — you know, once is a fluke, twice is a coincidence, and three times is enemy action.

MA: And those three examples — those are just the investigations that resulted in high-profile settlements. John says the FTC is watching tech companies very closely.

VERDI: Let's be clear — the vast majority of FTC investigations are not public. So we see settlements like the Kurbo settlement, and those are really just the tip of the iceberg. There's a huge iceberg of investigations under the water. Smart companies take that seriously and try to stay on the right side of the FTC.

WONG: So the FTC has signaled that it means business. But making sure companies actually follow through on deleting their data and algorithms — that's where it gets tricky.

KENESA AHMAD: Deletion is just a lot more complicated than people think.

WONG: Kenesa Ahmad is a partner at Aleada Consulting, a San Francisco firm that advises clients on privacy, security, and data.

AHMAD: It’s not just a matter of, oh, we automatically know which data fields should be deleted. No.

WONG: So it's not just Ctrl-F, delete? (Laughter).

AHMAD: Unfortunately, no. Unfortunately, no.

WONG: Kenesa says data can also be copied, sliced, and moved around to different parts of a business — like, the marketing team uses one set of data over here, and the customer service team uses a slightly different data set over there.

MA: In the case of a company like Kurbo, their trove of data contains information the FTC says was illegally collected mixed in with information that was collected the right way. And it's all potentially jumbled together and used in the company's algorithms.

AHMAD: How do you separate data with a legitimate origin from data with an illegitimate origin? It's really hard to trace that data lineage, and many companies don't have the resources to track it automatically.

WONG: So things could get quite complicated behind the scenes at Kurbo, which is why Kenesa's firm advises clients to be proactive about data best practices — map exactly how data is used across the business, and think hard about what data they collect and retain in the first place.

MA: Because if the FTC keeps using algorithm destruction as a penalty, it could really disrupt the way tech companies operate. Destroying the algorithms could mean erasing what makes the company money or what makes its app a better experience than its competitors'.

WONG: As for Kurbo users, their experience may also change as a result of the FTC settlement, as these algorithms may have found their way into many parts of the app. According to John Verdi of the Future of Privacy Forum, depending on how widely those algorithms were used, getting rid of them could disrupt the app for a whole host of customers, not just those whose data was deemed unlawfully collected.

VERDI: When you look at a situation like Kurbo's, many legitimate users of the service — people 13 and older — may find themselves in a situation where a service they enjoy, rely on, and in some cases paid for stops working, because the algorithms themselves are now illegal.

WONG: If more cases of algorithmic destruction emerge, it could lead to an Internet that's a little less personal, a little less anticipatory of our every need and want — at least until tech companies come up with a new and improved secret sauce recipe.

MA: Or they catch the next big wave, like Patrick Swayze’s character in “Point Break.”

WONG: Vaya con Dios, Adrian. Vaya con Dios.

MA: Vaya con Dios.


KEANU REEVES: (As Johnny Utah) Vaya con Dios.

MA: I would not have understood (ph) this reference two weeks ago.


MA: This episode was produced by Nicky Ouellet with help from Isaac Rodrigues. It was fact-checked by Corey Bridges. Our senior producer is Viet Le. Our editor is Kate Concannon. And THE INDICATOR is a production of NPR.


Copyright © 2022 NPR. All rights reserved. Visit our website terms of use and permissions pages for more information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR's programming is the audio record.
