Meta Is Getting Swarmed By “Addictive Algorithm” Lawsuits
June 21, 2022
Psychologically damaging content on Meta, formerly Facebook, might be protected by Section 230 of the Communications Decency Act, but the secret sauce that spoon-feeds it to vulnerable adolescents is not. So goes the argument, as explained by the online publication Protocol (“Meta was sued for its algorithm. Section 230 might not be a shield.”), which also quotes Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University, who says the distinction that argument draws is illusory. “Ultimately, it’s the content that’s the problem,” he argues. Then, he says, you’re back to the fact that it’s really a Section 230 lawsuit.
In any case, no fewer than eight lawsuits over Meta’s algorithm have been filed in eight different state courts since early June. The allegation is that the company made a decision to “aggressively addict adolescents in the name of corporate profits,” says attorney Andy Birchfield, a principal at Beasley Allen, the law firm that drafted the suits, quoted in an article from Bloomberg.
Despite his skepticism about the algorithm-content distinction, Goldman acknowledges there is a chance a judge somewhere will buy it, and he thinks that’s why these lawsuits were filed in eight different venues. “Basically,” he says, “it’s like a lottery. You only really need to win one in order to open up a very, very big door for future litigation.”
Two other lawsuits making similar claims had already been filed against Meta in January by the Social Media Victims Law Center, founded by Seattle plaintiff attorney Matthew Bergman, with Snap Inc., parent company of Snapchat, named as co-defendant. Regarding one of those cases, Bergman told the Oregon Mail Tribune that his client is the mother of a teenager who struggled with eating disorders and other problems following her addiction to the platforms. The daughter needed two psychiatric hospitalizations and was “solicited by adult men with salacious photographs, groomed to provide salacious photographs and shamed because of her body type.”
Bergman founded the Social Media Victims Law Center last year, around the same time that former Facebook product manager and whistleblower Frances Haugen leaked the internal documents behind “The Facebook Files,” which showed the company was aware that its platform was associated with mental health issues in vulnerable teens. Bergman’s major plaintiff-side experience, however, is in asbestos litigation, more than two decades of it, notes a Washington Post article. He tells the Post he thinks Facebook makes the asbestos companies look like choirboys: it is addicting children, knowing their “frontal cortexes are undeveloped,” with the intention of maximizing profits.
A Meta spokesman tells the Post that the company, working in partnership with non-profits, provides resources to users who post about eating disorders, body-image issues, or self-harm, and that in a single quarter of last year it removed 96 percent of self-harm-related content. Meta also maintains that it has strengthened parental controls and that it uses AI to keep young children off its platforms.