X, the social media giant, faces growing pressure after NOYB, an Austrian privacy advocacy group, filed a complaint against it. The group claims X used personal data to train its AI systems without obtaining user consent. The complaint, sent to data protection authorities in nine European countries, adds to X’s legal and ethical problems over how it handles data.
Earlier, the Irish Data Protection Commission (DPC) tried to restrict X’s use of EU user data for AI. X responded by agreeing to pause AI training on this data for now, giving users a chance to opt out. But NOYB argues this doesn’t go far enough: it only mitigates the problem without addressing whether the data use was legal in the first place.
Max Schrems, who leads NOYB, worries that the DPC does not enforce the rules rigorously enough, pointing to past cases where he believes it fell short. “We want to make sure Twitter [X] follows EU law. At the very least, this means asking users for permission in this case,” Schrems said.
Questions about X’s EU data handling, and its strategy of separating EU from non-EU data, remain unanswered. The case could set an important precedent for GDPR enforcement around AI training methods and the way AI models create embeddings from user data. Sakshi Grover, a senior research lead at IDC Asia Pacific, pointed to the tension between technological advances and data privacy rules: the GDPR and the EU AI Act both emphasize user consent and transparency, which means companies need strict data governance practices.
For X, this scrutiny could affect how it builds its “social graph” and monetizes user data through AI-powered ads and paid services. Neil Shah, a partner and head of research at Counterpoint Research, said that while X can use publicly available information, the hard part is processing and using user data for AI training without breaking GDPR rules. As laws on AI and data privacy continue to evolve, X’s case shows that companies need to be more transparent and adapt their compliance accordingly.