Social Media Platforms Are Responsible For Users’ Posts, Brazil Rules

Brazil’s Supreme Court has ruled that digital platforms are responsible for content posted by their users, as reported by Rest of World. The ruling will take effect within weeks and orders tech giants such as Google, X, and Meta to monitor and remove content containing hate speech, racism, or incitement to violence.
Companies must clearly demonstrate that they have taken swift action to remove harmful content; if they fail to do so, the justices said, they will be held liable.
Brazil’s relationship with big tech
Brazil has repeatedly clashed with major tech firms. In 2024, the country’s top court ordered the immediate suspension of X (formerly Twitter).
The court’s crackdown on alleged election disinformation led to demands that X block certain accounts. In defiance, Elon Musk, who owns X, shut down the company’s office in Brazil rather than comply with the court orders. Justice Alexandre de Moraes then gave X 24 hours to name a legal representative in the country or risk having its service suspended. X did not comply, and de Moraes followed through, banning the platform.
In 2023, the country experienced violent protests, organized mainly online by supporters of former President Jair Bolsonaro. Authorities have since sought to curb what they see as harmful behavior spreading through social media.
Brazil’s ruling
Previously, social media networks were liable for user-generated content only if they failed to remove it after a court order; now they can be held responsible after a simple notification. If a platform is informed of illegal content and fails to act, it will be subject to a fine.
The court also introduced the concept of systemic failure, which holds providers accountable if they fail to implement preventive measures or remove illegal content. Platforms are now expected to publish self-regulation policies, ensure transparency in their procedures, and adopt standardized practices.
The ruling is broad, and it is expected to primarily affect content related to political criticism, reports of corruption, and sensitive discussions involving human rights.
Image: Matheus Câmara da Silva