Meta is making users who opted out of AI training opt out again, watchdog says

Privacy watchdog Noyb sent a cease-and-desist letter to Meta Wednesday, threatening to pursue a potentially billion-dollar class action to block Meta’s AI training, which starts soon in the European Union.
In the letter, Noyb noted that Meta only recently notified EU users on its platforms that they had until May 27 to opt their public posts out of Meta’s AI training data sets. According to Noyb, Meta is also requiring users who already opted out of AI training in 2024 to opt out again or forever lose their chance to keep their data out of Meta’s models, since training data likely cannot be easily deleted once ingested. That’s an apparent violation of the General Data Protection Regulation (GDPR), Noyb alleged.
“Meta informed data subjects that, despite the fact that an objection to AI training under Article 21(2) GDPR was accepted in 2024, their personal data will be processed unless they object again—against its former promises, which further undermines any legitimate trust in Meta’s organizational ability to properly execute the necessary steps when data subjects exercise their rights,” Noyb’s letter said.