TikTok hit with U.S. lawsuit alleging illegal collection of children’s data


The U.S. Justice Department sued TikTok on Friday, accusing the company of violating a children’s online privacy law and running afoul of a settlement it had reached with another federal agency.

The complaint, filed together with the Federal Trade Commission in a California federal court, comes as the U.S. and the prominent social media company are embroiled in yet another legal battle that will determine if – or how – TikTok will continue to operate in the country.

The latest lawsuit focuses on allegations that TikTok, a trend-setting platform popular among young users, and its China-based parent company ByteDance violated a federal law that requires kid-oriented apps and websites to get parental consent before collecting personal information of children under 13. It also says the companies failed to honor requests from parents who wanted their children’s accounts deleted, and chose not to delete accounts even when the firms knew they belonged to kids under 13.


“This action is necessary to prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control,” Brian M. Boynton, head of the Justice Department’s Civil Division, said in a statement.

TikTok said it disagreed with the allegations, “many of which relate to past events and practices that are factually inaccurate or have been addressed.”

“We offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users and have voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors,” the company said in a statement.


Video: TikTok and ByteDance sue U.S. government over app ban law


The U.S. decided to file the lawsuit following an investigation by the FTC that looked into whether the companies were complying with a previous settlement involving TikTok’s predecessor, Musical.ly.


In 2019, the federal government sued Musical.ly, alleging it violated the Children’s Online Privacy Protection Act, or COPPA, by failing to notify parents about its collection and use of personal information for kids under 13.


That same year, Musical.ly — acquired by ByteDance in 2017 and merged with TikTok — agreed to pay $5.7 million to resolve those allegations. The two companies were also subject to a court order requiring them to comply with COPPA, which the government says hasn’t happened.

In the complaint, the Justice Department and the FTC allege TikTok has knowingly allowed children to create accounts and retained their personal information without notifying their parents. This practice extends to accounts created in “Kids Mode,” a version of TikTok for children under 13. The feature allows users to view videos but bars them from uploading content.

The two agencies allege the information collected included activities on the app and other identifiers used to build user profiles.

They also accuse TikTok of sharing the data with other companies – such as Meta’s Facebook and an analytics company called AppsFlyer – to persuade “Kids Mode” users to spend more time on the platform, a practice TikTok called “re-targeting less active users.”


Video: TikTok’s algorithm aggressively pushes harmful content to teens, study finds


The complaint says TikTok also allowed children to create accounts without having to provide their age, or obtain parental approval, by using credentials from third-party services. It classified these as “age unknown” accounts, which the agencies say have grown into millions.


After parents discovered some of their children’s accounts and asked for them to be deleted, federal officials said, TikTok required the parents to go through a convoluted deactivation process and frequently did not honor their requests.

Overall, the government said TikTok employed deficient policies that failed to prevent children’s accounts from proliferating on its app, and suggested the company was not taking the issue seriously. In at least some periods since 2019, the complaint said, TikTok’s human moderators spent an average of five to seven seconds reviewing accounts flagged as potentially belonging to a child. It also said TikTok and ByteDance have technology they could use to identify and remove children’s accounts, but do not use it for that purpose.

The alleged violations have resulted in millions of children under 13 using the regular TikTok app, allowing them to interact with adults and access adult content, the complaint said.

In March, a person familiar with the matter told The Associated Press that the FTC’s investigation was also looking into whether TikTok violated a portion of federal law that prohibits “unfair and deceptive” business practices by denying that individuals in China had access to U.S. user data.


Video: Some ByteDance employees still have access to U.S. user data: TikTok CEO


Those allegations were not included in the complaint, which is asking the court to fine the companies and enter a preliminary injunction to prevent future violations.


Other social media companies have also come under fire for how they’ve handled children’s data.

In 2019, Google and YouTube agreed to pay a $170 million fine to settle allegations that the popular video site had illegally collected personal information on children without their parents’ consent.

And last fall, dozens of U.S. states sued Meta Platforms Inc., which owns Facebook and Instagram, for harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

A lawsuit filed by 33 states claims that Meta routinely collects data on children under 13 without their parents’ consent, in violation of COPPA. Nine attorneys general are also filing lawsuits in their respective states, bringing the total number of states taking action to 41 plus Washington, D.C.

© 2024 The Canadian Press




