A jury in the United States has ruled that Meta, the technology company behind Facebook and Instagram, failed to protect children on its platforms and misled the public about safety risks.
The case focused not only on harmful content but also on how the platforms themselves are designed. Lawyers for the plaintiffs argued that recommendation systems and inadequate safety tools exposed young users to harassment, exploitation and other harmful interactions.
The jury concluded that Meta violated consumer protection laws and ordered the company to pay €345 million in penalties. Evidence included internal documents and investigations showing how quickly accounts posing as minors encountered harmful behaviour.
Legal experts say the ruling could set a precedent: courts and regulators may increasingly hold technology platforms responsible for how the design of their systems affects users, particularly children.