MIT Technology Review: “This February, all eyes will be on the biggest players in tech—Meta, Google, Twitter, YouTube. A legal provision tucked into the Communications Decency Act, Section 230 has provided the foundation for Big Tech’s explosive growth, protecting social platforms from lawsuits over harmful user-generated content while giving them leeway to remove posts at their discretion (though they are still required to take down illegal content, such as child pornography, if they become aware of its existence). The case before the court could have a range of outcomes; if Section 230 is repealed or reinterpreted, these companies may be forced to transform their approach to content moderation and to overhaul their platform architectures in the process.

But another big issue is at stake that has received much less attention: depending on the outcome of the case, individual users of sites may suddenly be liable for run-of-the-mill content moderation. Many sites rely on users for community moderation to edit, shape, remove, and promote other users’ content online—think Reddit’s upvote, or changes to a Wikipedia page. What might happen if those users were forced to take on legal risk every time they made a content decision?

In short, the court could change Section 230 in ways that won’t just impact big platforms; smaller sites like Reddit and Wikipedia that rely on community moderation will be hit too, warns Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project. “It would be an enormous loss to online speech communities if suddenly it got really risky for mods themselves to do their work,” she says…”