U.S. regulators say Facebook misled parents and failed to protect the privacy of children using its Messenger Kids app, including misrepresenting the access to private user data that it provided to app developers.

As a result, the U.S. Federal Trade Commission on Wednesday proposed sweeping changes to a 2020 privacy order with Facebook — now called Meta — that would prohibit it from profiting from data it collects on users under 18, including data collected through its virtual-reality products. The FTC said the company has failed to fully comply with the 2020 order.

Meta would also be subject to other limitations, including restrictions on its use of face-recognition technology, and would be required to provide additional privacy protections for its users.

“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”

‘Political stunt’

Meta called the announcement a “political stunt.”

“Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory. Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil,” Meta said in a prepared statement.

“We have spent vast resources building and implementing an industry-leading privacy program under the terms of our FTC agreement. We will vigorously fight this action and expect to prevail.”

Child-development experts concerned

Facebook launched Messenger Kids in 2017, pitching it as a way for children to chat with family members and friends approved by their parents. The app doesn’t give kids separate Facebook or Messenger accounts. Rather, it works as an extension of a parent’s account, and parents get controls, such as the ability to decide with whom their kids can chat.

At the time, Facebook said Messenger Kids wouldn’t show ads or collect data for marketing, though it would collect some data it said was necessary to run the service.

But child-development experts raised immediate concerns.

In early 2018, a group of 100 experts, advocates and parenting organizations contested Facebook’s claims that the app was filling a need kids had for a messaging service. The group included nonprofits, psychiatrists, pediatricians, educators and the children’s music singer Raffi Cavoukian.

“Messenger Kids is not responding to a need — it is creating one,” the letter said. “It appeals primarily to children who otherwise would not have their own social media accounts.”

Another passage criticized Facebook for “targeting younger children with a new product.”


‘Gaps’ and ‘weaknesses’ in privacy controls

Facebook, in response to the letter, said at the time that the app “helps parents and children to chat in a safer way,” and emphasized that parents are “always in control” of their kids’ activity.

The FTC now says this has not been the case. The 2020 privacy order, which required Facebook to pay a $5 billion US fine, also mandated an independent assessor to evaluate the company's privacy practices. The FTC said the assessor "identified several gaps and weaknesses in Facebook's privacy program."

The FTC also said Facebook, from late 2017 until 2019, “misrepresented that parents could control whom their children communicated with through its Messenger Kids product.”

“Despite the company’s promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls,” the FTC said.

As part of the proposed changes to the FTC’s 2020 order, Meta would also be required to pause launching new products and services without “written confirmation from the assessor that its privacy program is in full compliance” with the order.
