Big Tech & Censorship

The Role of Platforms in Shaping Public Discourse

By America's Overwatch Editorial Board · Updated January 20, 2026 · 13 min read

Key Takeaways

  • A handful of technology companies control the infrastructure of modern public discourse.
  • Content moderation decisions affect what billions of people can see and say online.
  • Concerns exist about bias, inconsistency, and lack of accountability in moderation practices.
  • Policy responses involve difficult trade-offs between free speech and platform rights.

A handful of technology companies—Google, Facebook (Meta), Twitter (X), Apple, Amazon—control much of the infrastructure of modern communication. Their decisions about what content to allow, promote, or suppress shape public discourse for billions of people.

This concentration of power raises fundamental questions about free speech in the digital age. Private companies have rights to moderate their platforms, but when those platforms function as public squares, what obligations should they have? The answers are not simple.

Platform Power

Scale: Major platforms have billions of users. Facebook reaches over 3 billion people. Google processes over 8 billion searches daily. YouTube has over 2 billion logged-in users monthly. No previous institution has had this reach.

Network Effects: Platforms become more valuable as more people use them, creating natural monopolies. Alternatives struggle to achieve critical mass. Users can't easily leave without losing connections.
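
One way to see why scale compounds: the number of possible connections among users grows roughly with the square of the user count, a rule of thumb often called Metcalfe's law. The short sketch below is purely illustrative; the function name and the user counts are ours, not platform data.

```python
# Illustrative only: possible pairwise connections among n users,
# a rough proxy for network value (the Metcalfe's law heuristic).
def pairwise_connections(n_users: int) -> int:
    return n_users * (n_users - 1) // 2

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} users -> {pairwise_connections(n):,} possible connections")
```

A network a thousand times smaller offers roughly a million times fewer possible connections, which is one reason challengers struggle to reach critical mass even when users are unhappy with an incumbent.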

Gatekeeping: Platforms decide what content to show, recommend, demote, or remove. Their algorithms determine what billions of people see. A tweak in ranking can make content viral or invisible.
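
To make the ranking point concrete, here is a deliberately simplified sketch of how changing a single weight can reorder what surfaces. The scoring function, weights, and posts are hypothetical and do not describe any platform's actual algorithm.

```python
# Hypothetical feed-ranking sketch: changing one weight reorders which
# posts surface. Not any real platform's algorithm.
posts = [
    {"title": "Post A", "engagement": 0.9, "source_trust": 0.2},
    {"title": "Post B", "engagement": 0.4, "source_trust": 0.9},
]

def rank(items, w_engagement, w_trust):
    score = lambda p: w_engagement * p["engagement"] + w_trust * p["source_trust"]
    return sorted(items, key=score, reverse=True)

print([p["title"] for p in rank(posts, w_engagement=1.0, w_trust=0.5)])  # Post A first
print([p["title"] for p in rank(posts, w_engagement=0.5, w_trust=1.0)])  # Post B first
```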

Infrastructure: Beyond social media, tech companies provide essential infrastructure: app stores, cloud computing, payment processing, email. Denial of these services can destroy businesses.

Content Moderation

Platforms have always moderated content to some degree. The question is how:

Community Standards: Platforms establish rules prohibiting certain content—illegal material, spam, harassment, violence, hate speech. Enforcement involves both automated systems and human reviewers.
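
As a rough illustration of how automated systems and human reviewers typically fit together, the sketch below routes content by a classifier's confidence score. The function name, thresholds, and scores are invented for illustration; real enforcement pipelines are far more complex.

```python
# Hypothetical moderation triage: automated scoring plus a human-review
# queue for borderline cases. Thresholds and scores are invented.
def triage(violation_score: float) -> str:
    """violation_score: 0.0 (clearly fine) to 1.0 (clearly violating)."""
    if violation_score >= 0.95:
        return "auto-remove"          # high-confidence violation
    if violation_score >= 0.60:
        return "human review queue"   # borderline: a person decides
    return "leave up"

print(triage(0.97))  # auto-remove
print(triage(0.70))  # human review queue
print(triage(0.10))  # leave up
```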

Fact-Checking: Some platforms label or demote content they deem false or misleading. Third-party fact-checkers or internal teams make these determinations.

Algorithmic Curation: Algorithms determine what content to recommend. This invisible editorial function shapes what people see without explicit removal.

Account Actions: Platforms can suspend or permanently ban users. High-profile bans, including that of a sitting president, demonstrate this power.

Key Concerns

Political Bias: Critics argue that moderation disproportionately affects conservative viewpoints. Studies show platform employees lean heavily left. Whether this affects moderation is contested.

Inconsistency: Rules are applied inconsistently. Similar content is treated differently based on who posts it. This creates perceptions of unfairness.

Lack of Transparency: Moderation decisions are often opaque. Users don't know why content was removed or accounts suspended. Appeals processes are inadequate.

Collusion: Coordinated action among platforms—simultaneous bans across multiple services—raises concerns about cartel-like behavior.

Government Pressure: Revelations have shown government officials pressuring platforms to suppress certain content. This may constitute unconstitutional state action.

Misinformation Determinations: Who decides what is "misinformation"? Expert consensus can be wrong. Contested claims get labeled as false. Legitimate debate is suppressed.

Policy Options

Section 230 Reform: Section 230 of the Communications Decency Act shields platforms from liability for content their users post. Some propose conditioning that immunity on neutral moderation practices or removing it for certain platform actions.

Antitrust: Breaking up large platforms or preventing acquisitions could increase competition and reduce concentrated power.

Common Carrier Rules: Treating platforms as common carriers would require them to serve all users without discrimination, similar to telephone companies.

Transparency Requirements: Mandating disclosure of moderation policies, decisions, and appeals could increase accountability without dictating content policies.

Market Solutions: Alternative platforms (Rumble, Substack, etc.) provide competition. Users can migrate if dissatisfied. But network effects make this difficult.

Do Nothing: Some argue that platforms are private companies entitled to moderate as they choose, and that government intervention would threaten their own First Amendment rights.

The Bottom Line

The concentration of communications power in a few private companies is unprecedented. These companies make decisions with profound implications for public discourse, yet they are accountable to no one but their shareholders.

The tension between platform rights and free speech values is real. Private companies have rights, including to moderate content. But when those companies control the public square, their decisions affect everyone's ability to speak and hear.

At America's Overwatch, we believe open discourse is essential to democracy. We support transparency in moderation, consistent application of rules, and meaningful competition among platforms. Citizens should understand how platforms shape what they see and demand accountability from these powerful institutions.
