Texas Attorney General Ken Paxton filed a lawsuit against Meta over Facebook’s facial recognition practices, his office announced on Monday. The news was first reported by the Wall Street Journal, which notes that the lawsuit seeks civil penalties in the hundreds of billions of dollars. The suit alleges that the company’s discontinued use of facial recognition technology violated the state’s privacy protections for biometric data.
A press release announcing the lawsuit alleges that Facebook is storing millions of biometric identifiers contained in user-uploaded photos and videos. Attorney General Paxton says Facebook exploited users’ personal information “to grow its empire and reap historic windfall profits.”
“Facebook will no longer take advantage of people and their children with the intention of making a profit at the expense of someone’s safety and well-being,” Paxton said in a statement. “This is yet another example of Big Tech’s deceptive business practices and it must stop. I will continue to fight for the privacy and safety of Texans.”
Meta did not immediately respond to Ploonge’s request for comment.
The suit alleges that Facebook misled the public by concealing the nature of its practices and that Texans who used the app were ignorant of the fact that Facebook was capturing biometric information from photos and videos. It also alleges, without providing further context, that users were unaware that Facebook was disclosing users’ personal information to other entities who exploited it further.
“Facebook has often failed to destroy collected biometric identifiers within a reasonable time, exposing Texans to increasing risks to their well-being, safety and security,” the lawsuit reads. “Facebook has consciously captured biometric information for its own commercial benefit, to train and improve its facial recognition technology, and thus create a powerful artificial intelligence apparatus that reaches all corners of the world and ensnares even those who intentionally avoided using Facebook services.”
In November 2021, Meta announced it was shutting down its facial recognition system on Facebook and would no longer automatically identify opted-in users in photos and videos. The company also said it would delete more than a billion individual facial recognition templates as part of the shutdown. But Texas officials asked Meta to preserve that data for their investigation, likely delaying the system’s full shutdown.
This isn’t the first time Meta has faced legal action over its facial recognition practices. Last March, Facebook was ordered to pay $650 million for violating an Illinois law designed to protect state residents from invasive privacy practices. That law, the Biometric Information Privacy Act (BIPA), is a powerful state measure that has tripped up tech companies in recent years. The lawsuit against Facebook was first filed in 2015, alleging that the company’s practice of tagging people in photos using facial recognition without their consent violated state law.
After the ruling, 1.6 million Illinois residents received at least $345 each under the settlement’s final approval in California federal court. The final figure was $100 million higher than the $550 million Facebook proposed in 2020, which a judge found inadequate. Facebook turned off its automatic facial recognition tagging features in 2019, making them opt-in and addressing some of the privacy criticisms echoed by the Illinois class action lawsuit.
A $650 million settlement would have been enough to significantly impact any normal business, but Facebook shrugged it off, much as it did the FTC’s record $5 billion fine in 2019 following the agency’s investigation into the social media giant’s privacy practices.
Texas’ new lawsuit shows that sweeping privacy laws can have a significant impact not only on Meta’s operations but also on the practices of other major tech companies. In recent years, a string of lawsuits has accused Microsoft, Google and Amazon of breaking such laws by using people’s faces to train their facial recognition systems without explicit consent.