Will Google Use "Editorial Discretion" to Exclude Books from Book Search?

[Note: please be sure to read the comments with responses from Google's Alexander Macgillivray]

Joris van Hoboken recently brought this section of the Google Book Search Settlement Agreement to my attention:

Section 3.7(e) Google’s Exclusion of Books

Google may, at its discretion, exclude particular Books from one or more Display Uses for editorial or non-editorial reasons. However, Google’s right to exclude Books for editorial reasons (i.e., not for quality, user experience, legal or other non-editorial reasons) is an issue of great sensitivity to Plaintiffs and Google.  Accordingly, because Plaintiffs, Google and the libraries all value the principle of freedom of expression, and agree that this principle is an important part of GBS and other Google Products and Services, Google agrees to notify the Registry of any such exclusion of a Book for editorial reasons and of any information Google has that is pertinent to the Registry’s use of such Book other than Confidential Information of Google and other than information that Google received from a third party under an obligation of confidentiality.

In short, Google can refuse to provide access to a particular book in its Book Search service for “editorial reasons.” In such a case, Google will notify the Registry and provide a digital copy of the book, which could be made available outside of Google’s services (this provision is outlined in a subsection I haven’t reproduced here).

The Settlement Agreement is silent on what these “editorial reasons” might be. While it differentiates them from “non-editorial reasons” — which might include legal restrictions (the copyright holder hasn’t granted the necessary permissions, the book is child pornography and illegal in certain jurisdictions, etc.) or technical limitations (the pages simply aren’t scannable, etc.) — there are no definitions, guidelines, or limitations on what kind of editorial discretion Google could exercise in restricting access to certain texts.

Google’s approach to censoring content is a bit scattered, but clearly controversial. In its Web index, Google famously includes everything that is legal (including hate speech), and typically excludes only pages containing child pornography or those subject to a DMCA takedown notice. While you can filter your results to exclude some objectionable material, Google performs only minor censoring of its Web search product (notable exceptions include regional censorship in China or Germany, etc.). Nearly all known instances of the removal of content from Google’s index were, in one way or another, legally required. Thus, framed in the language of Section 3.7(e) above, these were “non-editorial” exclusions. To date, I know of no purely “editorial” exclusions from Google’s index: even if they don’t like a site, they still include it.

On YouTube, Google takes a more aggressive stance towards censoring objectionable material, exerting editorial discretion in removing content that violates its “community guidelines,” which include restrictions on pornography, sexually explicit content, graphic or gratuitous violence, and “videos showing bad stuff like animal abuse, drug abuse, under-age drinking and smoking, or bomb making”.

Now, I’m not sure how much input the YouTube “community” really has in crafting these guidelines, but Google appears to respect that YouTube started as, and continues to be, a user/community-driven space for sharing videos. As YouTube grew, certain norms of appropriate content emerged, and Google appears to be trying to respect and maintain those norms. By couching them as “community standards,” and by giving individual users the ability to report problematic videos, Google has, at least in appearance, handed some of its editorial discretion to the YouTube community. But, at the end of the day, it is still Google that gives the thumbs up or thumbs down to a particular video, a power carrying potentially dangerous consequences.

If Google appears to be following the norms of appropriateness and access within the historical context of YouTube, will it do the same for books within Book Search? While an amateur-based online video sharing website might create certain community standards for appropriate material, the American Library Association has long held the position of ensuring and protecting patrons’ intellectual freedom and full access to information, recognizing the essential value of the freedom to read any and all materials that are legal for the library to possess.

[While librarians do exert discretion over which books are acquired, these decisions are most commonly based on budget/space considerations (your local library branch can't buy and shelve all books), as well as regional considerations (there simply won't be much demand for How to Build an Igloo at the San Antonio Public Library). But these kinds of constraints are just what the Google Book Search project is supposed to help overcome.]

So what kind of editorial discretion does Google contemplate given the inclusion of Section 3.7(e)? Will it respect the existing norms of access for books in public libraries, and include all materials that the law allows (essentially making Section 3.7(e) irrelevant)? Or will it create a YouTube-like policy that might exclude books deemed (by whom?) to be offensive or otherwise filled with “bad stuff”?

This is a critical issue that the Settlement Agreement fails to adequately address. I will pose these questions at the Public Index, and hope that they might be further addressed at upcoming conferences discussing the Settlement. I welcome any further thoughts or reactions.
