On Tuesday (February 21), the Supreme Court will hear oral arguments in a case that could alter the legal protections internet companies have enjoyed since the rise of social media over the last two decades.
The case, Gonzalez v. Google, centers on whether YouTube, a subsidiary of Google, can be held liable for promoting terrorist content through its recommendation algorithm. The plaintiffs allege that the content incited violence leading to the death of U.S. citizen Nohemi Gonzalez in the 2015 Paris terrorist attacks.
The case targets Section 230 of the 1996 Communications Decency Act, a controversial provision shielding internet providers from liability for content posted on their platforms by third parties.
Those in the tech industry argue that the protections are necessary, but lawmakers on both sides of the aisle have criticized the provision, albeit for different reasons.
Congress is currently at a stalemate over how to regulate content moderation, shifting attention to how the Supreme Court will rule in the first Section 230 case to reach it.
The primary change to emerge from the case could affect algorithms.
The case seeks to hold Google responsible for its recommendation algorithm rather than for its role as host of the content, arguing that Section 230's protections do not extend to algorithmic recommendations.
Google argues that Congress intended the protections outlined in Section 230 to apply to algorithms, which are essential to operating the modern internet. It also argues that exposing algorithms to liability could have far-reaching negative ramifications for smaller tech companies, such as Yelp and Reddit.