Document Type

Book Chapter

Publication Date

2010

Keywords

Internet, searching

Abstract

In the last few years, some search-engine critics have suggested that dominant search engines (i.e., Google) should be subject to “search neutrality” regulations. By analogy to network neutrality, search neutrality would require even-handed treatment in search results: It would prevent search engines from playing favorites among websites. Academics, Google competitors, and public-interest groups have all embraced search neutrality.

Despite this sudden interest, the case for search neutrality is too muddled to be convincing. While “neutrality” is an appealing-sounding principle, it lacks a clear definition. This essay explores no fewer than eight different meanings that search-neutrality advocates have given the term. None of them would lead to sensible regulation. Some are too ill-defined to measure; others measure the wrong thing.

Search is inherently subjective: it always involves guessing the diverse and unknown intentions of users. Regulators, however, need an objective standard to judge search engines against. Most of the common arguments for search neutrality either duck the issue or impose on search users a standard of “right” and “wrong” search results they wouldn’t have chosen for themselves. Search engines help users avoid the websites they don’t want to see; search neutrality would turn that relationship on its head. As currently proposed, search neutrality is likely to make search results spammier, more confusing, and less diverse.

Disciplines

Digital Communications and Networking

Recommended Citation

James Grimmelmann. "Some Skepticism About Search Neutrality." The Next Digital Decade: Essays on the Future of the Internet. Ed. Berin Szoka & Adam Marcus. TechFreedom, 2010. 435–459.