Document Type

Article

Publication Date

2015

Abstract

TL;DR—On a Friday in 2005, the Los Angeles Times launched an experiment: a “wikitorial” on the Iraq War that any of the paper’s readers could edit. By Sunday, the experiment had ended in abject failure: vandals overran it with crude profanity and graphic pornography. The wikitorial took its inspiration and its technology from Wikipedia, but missed something essential about how “the free encyclopedia that anyone can edit” staves off abuse while maintaining its core commitment to open participation.

The difference is moderation: the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse. Town meetings have moderators, and so do online communities. A community’s moderators can promote posts or hide them, honor posters or shame them, recruit users or ban them. Their decisions influence what is seen, what is valued, what is said. They create the conditions under which cooperation is possible.

This Article provides a novel taxonomy of moderation in online communities. It breaks down the basic verbs of moderation—exclusion, pricing, organizing, and norm-setting—and shows how they help communities walk the tightrope between the chaos of too much freedom and the sterility of too much control. Scholars studying the commons can learn from moderation, and so can policy-makers debating the regulation of online communities.

Publication Citation

17 Yale Journal of Law & Technology 42 (2015).

Disciplines

Internet Law
