TikTok hit with $500M fine in Europe for past failings in safeguarding youth privacy


TikTok faces a hefty fine from Europe's lead privacy regulator for past shortcomings in protecting the privacy of younger users, a development that experts say could have implications for Canadian regulators and for young people using the popular platform here.

News broke on Friday that Ireland’s Data Protection Commission (DPC) — the lead privacy regulator for Big Tech companies whose European headquarters are largely in Dublin — was handing TikTok a fine of 345 million euros (roughly $500 million Cdn), following a two-year investigation into how the platform handled users’ privacy during the second half of 2020.

The investigation found that the sign-up process for teen users resulted in settings that made their accounts public by default, allowing anyone to view and comment on their videos. Those default settings also posed a risk to children under 13 who gained access to the platform even though they’re not allowed.

It was also found that a “family pairing” feature designed for parents to manage settings wasn’t strict enough, allowing adults to turn on direct messaging for users aged 16 and 17 without their consent. The commission also said the platform nudged teen users toward more “privacy intrusive” options when signing up and posting videos.

In a detailed response posted online, TikTok said it had addressed “most” of the issues at play in the Irish probe — and had done so prior to the start of the investigation.

That included making all accounts for teens under 16 private by default and disabling direct messaging for 13- to 15-year-olds. TikTok also says that new accounts set up by 16- and 17-year-olds will be pre-set to private, a change rolling out globally starting later this month.

The platform took issue with aspects of the watchdog’s decision — “particularly the level of the fine imposed” — and indicated that it’s evaluating “next steps” as a result.

WATCH | TikTok, data and heightened scrutiny: 

TikTok tries to convince users it’s keeping their data safe

TikTok is on a charm offensive, trying to prove it’s not a threat. CBC’s Ellen Mauro gets access to its new Transparency and Accountability Centre in L.A. and asks the company’s head of public policy what it’s doing to protect user data.

Karen Louise Smith, an associate professor in the department of communication, popular culture and film at Brock University in St. Catharines, Ont., said that privacy rulings from like-minded regulators tend to get followed closely in other jurisdictions.

She said she expects regulators in Canada — where a group of privacy watchdogs have been investigating TikTok for months — to pay close attention to the findings by the commission in Ireland.

“They typically kind of work in co-ordination,” said Smith, who researches openness, privacy and participation in digital society. “Any kind of ruling like that, that comes out anywhere in the world, has potentially positive impacts for Canadian youth.”

The Canadian probe continues, a spokesperson for the Office of the Privacy Commissioner of Canada confirmed Friday.

Warning about ‘dark patterns’

The DPC in Ireland has been criticized for not moving fast enough in its investigations into Big Tech companies since European Union privacy laws took effect in 2018. In TikTok’s case, German and Italian regulators disagreed with parts of a draft decision issued a year ago, delaying the final ruling further.

To avoid new bottlenecks, the European Commission, the 27-nation bloc’s executive in Brussels, has been given the job of enforcing new regulations to foster digital competition and clean up social media content, rules aimed at maintaining the EU’s position as a global leader in tech regulation.

LISTEN | Western tensions with TikTok: 

Front Burner (19:33): TikTok’s power and the push to ban it

TikTok is facing tough questions from many western democracies about the personal data it gathers and who has access to it. The app’s parent company is based in China, and now U.S. politicians want to make sure that country’s government can’t get access to Americans’ personal information. They aren’t liking the answers they’re getting. For transcripts of this series, please visit: https://www.cbc.ca/radio/frontburner/transcripts

In response to initial German objections, Europe’s top panel of data regulators said TikTok nudged teen users with pop-up notices that failed to lay out their choices in a neutral and objective way. The DPC described them as a form of “dark patterns” — a term describing design practices aimed at steering people toward making choices they otherwise wouldn’t.

Anu Talus, chair of the European Data Protection Board, said that “social media companies have a responsibility to avoid presenting choices to users, especially children, in an unfair manner — particularly if that presentation can nudge people into making decisions that violate their privacy interests.”

This particular finding caught the attention of Brock University’s Smith, who believes it may benefit Canadians, as TikTok and other tech companies, platform operators and app providers see the negative consequences of employing dark patterns.

“This hopefully is an incentive to stay away from those dark patterns,” she said.

WATCH | Dark patterns, web design and regulation in Canada: 

How dark patterns in web design are regulated under Canadian law

A U.S. regulator sued Amazon last month for allegedly duping customers into buying Prime memberships using a web design trick called ‘dark patterns.’ Here’s what Canada is doing about the practice.

The Irish watchdog, meanwhile, also examined TikTok’s measures to verify whether users are at least 13 years of age, but it found that no rules were broken.

Smith said that “privacy issues associated with age online are definitely a thorny issue,” but social media companies nonetheless must remain compliant with relevant laws and regulations.

But she said companies like TikTok have a special responsibility to safeguard the data of younger users, who may have “less capacity to make decisions about their digital footprints that may follow them for the rest of their lives.”

A need to be safe but also confident

Jenna Shapka, a professor and head of the department of educational and counselling psychology, and special education, at the University of British Columbia in Vancouver, said she doesn’t see a great deal of difference in how TikTok functions versus other social platforms.

“There isn’t anything special about TikTok that I see that’s different, that parents need to be more worried about TikTok than Snapchat or Instagram or anything like that,” she said, when discussing some of the steps the platform said it had taken to tighten protections for its teenage users.

Shapka said she also thinks children growing up today need to learn how these platforms work and how to thrive on them, as opposed to how to avoid any contact with them.

Some of the parent-involved tools and controls that TikTok and other platforms have developed may be of some benefit for younger users who are starting out in the digital world and learning their way, but less so for older teenagers, she said.

“I don’t think parents can use them to control or monitor [an older] teenager’s online activities,” Shapka said, and doing so could erode the relationship between parents and children.


