The breathless pace of development means data protection regulators need to be prepared for another scandal like Cambridge Analytica, says Wojciech Wiewiórowski, the EU’s data watchdog.
Wiewiórowski is the European data protection supervisor, and he is a powerful figure. His role is to hold the EU accountable for its own data protection practices, monitor the cutting edge of technology, and help coordinate enforcement around the union. I spoke with him about the lessons we should learn from the past decade in tech, and what Americans need to understand about the EU’s data protection philosophy. Here’s what he had to say.
What tech companies should learn: That products should have privacy features designed into them from the beginning. However, “it’s not easy to convince the companies that they should take on privacy-by-design models when they have to deliver very fast,” he says. Cambridge Analytica remains the best lesson in what can happen if companies cut corners when it comes to data protection, says Wiewiórowski. The company, at the center of one of Facebook’s biggest publicity scandals, had scraped the personal data of tens of millions of Americans from their Facebook accounts in an attempt to influence how they voted. It’s only a matter of time until we see another scandal, he adds.
What Americans need to understand about the EU’s data protection philosophy: “The European approach is connected with the purpose for which you use the data. So when you change the purpose for which the data is used, and especially if you do it against the information that you provide people with, you are in breach of law,” he says. Take Cambridge Analytica. The biggest legal breach was not that the company collected data, but that it claimed to be collecting data for scientific purposes and quizzes, and then used it for another purpose: namely, to create political profiles of people. This is a point made by data protection authorities in Italy, which have temporarily banned ChatGPT there. Authorities claim that OpenAI collected the data it wanted to use illegally, and did not tell people how it intended to use it.
Does regulation stifle innovation? This is a common claim among technologists. Wiewiórowski says the real question we should be asking is: Are we really sure that we want to give companies unlimited access to our personal data? “I don’t think that the regulations … are really stopping innovation. They are trying to make it more civilized,” he says. The GDPR, after all, protects not only personal data but also trade and the free flow of data over borders.
Big Tech’s hell on Earth? Europe is not the only one playing hardball with tech. As I reported last week, the White House is mulling rules for AI accountability, and the Federal Trade Commission has even gone as far as demanding that companies delete their algorithms and any data that may have been collected and used illegally, as happened to Weight Watchers in 2022. Wiewiórowski says he is happy to see President Biden call on tech companies to take more responsibility for their products’ safety and finds it encouraging that US policy thinking is converging with European efforts to prevent AI risks and put companies on the hook for harm. “One of the big players on the tech market once said, ‘The definition of hell is European legislation with American enforcement,’” he says.
Read more on ChatGPT
The inside story of how ChatGPT was built from the people who made it