How the Supreme Court ruling on Section 230 could end Reddit as we know it

But another big issue is at stake that has received much less attention: depending on the outcome of the case, individual users of sites may suddenly be liable for run-of-the-mill content moderation. Many sites rely on users for community moderation to edit, shape, remove, and promote other users’ content online—think Reddit’s upvotes, or changes to a Wikipedia page. What might happen if those users were forced to take on legal risk every time they made a content decision?

In short, the court could change Section 230 in ways that won’t just impact big platforms; smaller sites like Reddit and Wikipedia that rely on community moderation will be hit too, warns Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project. “It would be an enormous loss to online speech communities if suddenly it got really risky for mods themselves to do their work,” she says.

In an amicus brief filed in January, lawyers for Reddit argued that its signature upvote/downvote feature is at risk in Gonzalez v. Google, the case that will reexamine the application of Section 230. Users “directly determine what content gets promoted or becomes less visible by using Reddit’s innovative ‘upvote’ and ‘downvote’ features,” the brief reads. “All of those activities are protected by Section 230, which Congress crafted to immunize Internet ‘users,’ not just platforms.”

At the heart of Gonzalez is the question of whether the “recommendation” of content is different from the display of content; this is widely understood to have broad implications for recommendation algorithms that power platforms like Facebook, YouTube, and TikTok. But it could also have an impact on users’ rights to like and promote content in forums where they act as community moderators and effectively boost some content over other content.

Reddit is questioning where user preferences fit, either directly or indirectly, into the interpretation of “recommendation.” “The danger is that you and I, when we use the internet, we do a lot of things that are short of actually creating the content,” says Ben Lee, Reddit’s general counsel. “We’re seeing other people’s content, and then we’re interacting with it. At what point are we ourselves, because of what we did, recommending that content?”

Reddit currently has 50 million active daily users, according to its amicus brief, and the site sorts its content according to whether users upvote or downvote posts and comments in a discussion thread. Though the site also recommends content that users might be interested in, much of its content recommendation system relies on these community-powered votes. As a result, a change to community moderation would likely drastically change how the site works.
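To make the mechanics concrete, here is a minimal sketch of that kind of vote-driven ranking, loosely modeled on the “hot” formula from Reddit’s old open-source codebase. The production system has long since evolved, so the constants and function below are illustrative rather than Reddit’s actual algorithm, and the posts are hypothetical.

```python
from datetime import datetime, timezone
from math import log10

def hot_score(ups: int, downs: int, posted: datetime) -> float:
    """Community-vote ranking: net votes on a log scale plus a recency
    term, so a newer post can outrank an older, higher-voted one."""
    net = ups - downs
    order = log10(max(abs(net), 1))       # diminishing returns on raw vote counts
    sign = 1 if net > 0 else -1 if net < 0 else 0
    seconds = posted.timestamp()          # recency bonus, roughly 2 points per day
    return round(sign * order + seconds / 45000, 7)

# Hypothetical posts: (title, upvotes, downvotes, posted at)
posts = [
    ("older, heavily upvoted",   5000, 200, datetime(2023, 2, 1, tzinfo=timezone.utc)),
    ("newer, lightly upvoted",     40,   5, datetime(2023, 2, 2, tzinfo=timezone.utc)),
    ("newer, heavily downvoted",   10, 500, datetime(2023, 2, 2, tzinfo=timezone.utc)),
]

# Sort the thread exactly the way votes dictate: highest score first.
for title, ups, downs, when in sorted(
        posts, key=lambda p: hot_score(p[1], p[2], p[3]), reverse=True):
    print(f"{hot_score(ups, downs, when):12.2f}  {title}")
```

The point of the sketch is that the ordering users see is a direct function of other users’ votes, which is exactly the kind of “recommendation” activity whose legal status Gonzalez puts in question.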

“Can we [users] be dragged into a lawsuit, even a well-meaning lawsuit, just because we put a two-star review for a restaurant, just because we clicked downvote or upvote on that one post, just because we decided to help volunteer for our community and start taking out posts or adding in posts?” Lee asks. “Are [these actions] enough for us to suddenly become liable for something?”

An “existential threat” to smaller platforms

Lee points to a case in Reddit’s recent history. In 2019, in the subreddit r/Screenwriting, users started discussing screenwriting competitions they thought might be scams. The operator of those alleged scams went on to sue the moderator of r/Screenwriting for pinning and commenting on the posts, thus prioritizing that content. The Superior Court of California in Los Angeles County dismissed the moderator from the lawsuit, which Reddit says was due to Section 230 protection. Lee is concerned that a different interpretation of Section 230 could leave moderators, like the one in r/Screenwriting, significantly more vulnerable to similar lawsuits in the future.
