Digital Media Law

Chapter 10:
The Internet

Chapter 10 explores the legal and regulatory framework for the internet — the dominant medium for communication, commerce, and content distribution in the modern era. Unlike broadcasting, the internet developed largely without direct government content control, fostering unprecedented freedom and innovation. This chapter explains how courts have applied First Amendment principles online, the limited role of the FCC, and the impact of laws such as Section 230, enacted as part of the Communications Decency Act of 1996 and codified in the Communications Act. It also covers emerging challenges, including platform moderation, cybersecurity, and global jurisdiction issues.

Key Concepts in this Chapter

First Amendment & the Internet

  • Courts have held that online speech receives full First Amendment protection (Reno v. ACLU, 1997).

  • The government generally cannot impose prior restraints or broad content restrictions online.

Section 230 of the Communications Decency Act

  • Shields online platforms from liability for most user-generated content.

  • Allows platforms to remove or moderate content in good faith without being treated as the publisher of their users' posts.

  • Critically important for the operation of social media, forums, and review sites.

  • Under increasing political scrutiny; proposals exist to amend or repeal it.

FCC and Net Neutrality

  • Net neutrality rules prevent internet service providers (ISPs) from blocking, throttling, or engaging in paid prioritization of lawful content.

  • Rules have shifted with political administrations; future status remains uncertain.

Jurisdiction & Global Reach

  • Online activity crosses borders, raising questions about which country’s laws apply.

  • U.S. companies may face foreign regulations (e.g., EU’s GDPR privacy law).

Platform Content Moderation

  • Private platforms (Twitter/X, Facebook, YouTube) can set and enforce their own content policies.

  • Bans and suspensions of users (e.g., high-profile political figures) fuel debates over free expression versus private control.

Cybersecurity & Data Privacy

  • Growing legal frameworks address consumer data protection, breach notification, and cybersecurity standards.

  • In the absence of a comprehensive federal statute, a patchwork of state laws (e.g., California's CCPA) and sector-specific federal laws (e.g., HIPAA, COPPA) governs data privacy.

Online Harms & Regulation

  • Challenges include harassment, misinformation, deepfakes, child sexual abuse material (CSAM), and extremist content.

  • Regulation is difficult because of First Amendment protections and global jurisdiction issues.

Test Your Knowledge

1. FOSTA/SESTA (2018) amended §230 to create a carve‑out primarily concerning:

 
 
 
 

2. Which provision protects good‑faith moderation/removal of content the service considers objectionable, even if it is constitutionally protected?

 
 
 
 

3. Which of the following is NOT an express statutory exception to Section 230?

 
 
 
 

4. In Barnes v. Yahoo!, the Ninth Circuit held that §230 did not bar a claim for:

 
 
 
 

5. What’s the best description of ‘neutral tools’ in §230 jurisprudence?

 
 
 
 

6. Under §230(e)(3), state laws that are inconsistent with §230 are:

 
 
 
 

7. A platform loses §230 protection when it becomes an information content provider by ______ unlawful content.

 
 
 
 

8. Twitter v. Taamneh (2023) addressed platform liability by:

 
 
 
 

9. In Gonzalez v. Google, the Supreme Court:

 
 
 
 

10. Which of the following is generally considered a protected ‘editorial function’ under §230?

 
 
 
 

11. Which statement best reflects the difference between §230(c)(1) and §230(c)(2)?

 
 
 
 

12. Under §230, who qualifies as ‘another information content provider’?

 
 
 
 

13. Which type of claim is MOST likely to avoid §230 because it targets the platform’s own promises or conduct rather than publisher decisions?

 
 
 
 

14. Which theory targets a platform’s product design (e.g., failure to implement safety features) rather than its publishing decisions—and thus sometimes sidesteps §230?

 
 
 
 

15. In FTC v. Accusearch, the court denied §230 immunity where the site operator:

 
 
 
 

16. Which case is most often cited for the ‘material contribution’ test that can defeat §230 immunity?

 
 
 
 

17. Section 230(c)(1) provides that a provider or user of an interactive computer service shall not be treated as the ______ of information provided by another information content provider.

 
 
 
 

18. Which is MOST accurate about algorithmic recommendations after 2023 Supreme Court decisions?

 
 
 
 

19. A platform’s monetization of user content (ads, revenue share) generally:

 
 
 
 

20. In Doe v. Internet Brands, a failure‑to‑warn claim survived §230 because the duty alleged:

 
 
 
 


Ideas for Future Study

  • Section 230 Debate: Should it be narrowed, expanded, or repealed?

  • Platform Power: Should dominant social media platforms be regulated like public utilities or common carriers?

  • Global Conflicts: How should U.S. companies handle conflicting laws between countries?

  • Online Privacy: Should the U.S. adopt a national privacy law like the EU’s GDPR?

  • Algorithmic Accountability: How should the law address the influence of recommendation algorithms?

Parting Thought

The internet has enabled unprecedented speech, creativity, and access to information — but also amplified harm and disinformation. As lawmakers consider new rules, should the focus be on preserving openness or preventing misuse? Where would you draw the line?