The Brazilian Supreme Federal Court (STF) has concluded one of the most significant rulings of Brazil’s digital era: the review of the constitutionality of Article 19 of the Brazilian Internet Civil Rights Framework (Law 12.965/2014).
To date, the civil liability of digital platforms has been governed by Articles 19 and 21 of the Internet Civil Rights Framework (ICRF). Article 19 conditioned a platform’s liability on the existence of a prior court order mandating the removal of third-party content—an issue under discussion in Extraordinary Appeals 1.037.396 (Theme 987) and 1.057.258 (Theme 533), both with general repercussion.
It’s important to recall that the ICRF was drafted in a context of strong social mobilization and public participation. Article 19 aimed to safeguard freedom of expression and prevent private censorship by prohibiting digital platforms from being compelled to remove content without a court decision—except in specific statutory carve-outs, such as the non-consensual sharing of intimate images (Article 21) or copyright violations, which the ICRF refers to separate legislation.
Inspired by international models—like Section 230 of the U.S. Communications Decency Act—and European debates on intermediary liability, the article sought to balance freedom of expression with the defense of fundamental rights.
However, the digital landscape has changed. With the exponential growth of social media and the spread of illicit content, hate speech, and disinformation, the adequacy of the ICRF’s self-regulation model has been questioned. The requirement of a prior court order has often proven ineffective in protecting fundamental rights.
Freedom of Expression vs. Digital Responsibility
The debate involved a legitimate tension between two constitutional values: freedom of expression and the protection of fundamental rights. The STF’s decision represents a new paradigm of “responsibility with freedom,” requiring greater diligence and transparency from platforms—without opening the door to arbitrary censorship.
STF RULING OF JUNE 26
A majority of justices had already expressed the view that Article 19 is unconstitutional in certain situations—such as incitement to violence, crimes against honor, or the dissemination of serious disinformation. Justices Dias Toffoli and Luiz Fux, rapporteurs of the cases, proposed a reinterpretation of the provision based on Article 21 of the ICRF, which allows content removal upon simple notification in specific cases.
On June 26, 2025, the STF concluded the judgment of the aforementioned appeals, ruling that requiring a court order as the sole condition for liability is inadequate when dealing with content that seriously violates fundamental rights. The Court declared the partial unconstitutionality of Article 19.
Thus, it was established that in specific situations—such as incitement to violence, hate speech, serious disinformation, child pornography, terrorism, and crimes against honor—digital platforms may be held civilly liable even in the absence of a prior court order.
The ruling introduced a model of liability based on systemic failure, meaning the absence of effective moderation mechanisms and responses to user notifications.
WHAT CHANGES IN PRACTICE?
The STF’s decision will bring substantial changes to how platforms operate in Brazil. Key practical effects include:
- Duty of Care – Liability Without Court Order: For serious criminal content, such as:
  - Anti-democratic acts
  - Terrorism
  - Discrimination based on race, color, ethnicity, religion, national origin, sexuality, or gender identity
  - Child pornography
  - Human trafficking
  - Hate speech and incitement to violence
  - Violence against women
  - Incitement to suicide or self-harm
These types of content must be removed without the need for a court order, under penalty of civil liability. Digital platforms will be held accountable not for isolated posts, but for systemic failures—when they fail to implement effective content moderation, prevention, and response policies, thereby allowing the recurring and massive spread of illicit content.
- Extrajudicial Notification: Now accepted in other cases involving legally or constitutionally recognized rights violations (except crimes against honor—libel, slander, defamation—which still require a court order). A simple notification will suffice to trigger the platform’s duty to act. Failure to act may result in liability.
- Presumption of Liability: Applies to illicit content displayed in paid ads or boosted posts, given the platforms’ direct economic interest. Providers may be exempt if they prove timely action to remove the content.
- Private Communication Services: Email, videoconferencing, and private messaging (e.g., WhatsApp) remain protected by confidentiality and can only be held liable with a court order.
- Legal Representation in Brazil: All platforms operating in the country are required to have a legal representative and headquarters in Brazil to respond to legal proceedings and pay any applicable fines. The STF has determined that these companies must provide accessible contact information on their websites. The legal representative is also required to provide information about the platform’s operations, content moderation procedures, and complaint management policies. Additionally, the representative must produce “transparency, monitoring, and systemic risk management reports.” Information regarding advertising and paid content promotion must also be disclosed upon request.
- Service Distinction: The ruling preserves the requirement of a court order for private or confidential services, maintaining secrecy as the rule and liability as the exception.
- Marketplace Regime: Online marketplaces are governed by the Consumer Protection Code, meaning they are subject to strict liability. This applies when their participation in the consumer chain is evident—such as when they broker sales, manage payments, or provide customer support. In such cases, proof of fault is not required; it is sufficient to demonstrate the damage and its causal connection to the platform’s activities.
Three Liability Regimes for Platforms
To avoid a one-size-fits-all solution, the STF created three distinct liability regimes:
| Liability Regime | Requirement for Liability | Examples of Content |
|---|---|---|
| No court order + systemic failure | Must show systemic failure (widespread omission) | Incitement to violence, hate speech, terrorism, child abuse, violence against women |
| Valid extrajudicial notice | Notification from the affected party is sufficient | Intellectual property violations, misuse of image, privacy breaches |
| Prior court order required | Liability only after a specific judicial decision | Crimes against honor (libel, slander, defamation), confidential content (emails, private messages) |
Conclusion
The STF’s decision does not fully revoke the legal provision but interprets it in light of the Constitution, with binding effects on the judiciary. However, the Court urged the National Congress to legislate more precisely on the matter, considering ongoing challenges such as algorithm regulation, content moderation transparency, and user protection.