Two Bits

Chapter 14

5 Eisenstein, The Printing Press as an Agent of Change. Eisenstein's work makes direct reference to McLuhan's thesis in The Gutenberg Galaxy, and Latour relies on these works and others in "Drawing Things Together."

6 Johns, The Nature of the Book, 19–20.

7 On this subject, cf. Pablo Boczkowski's study of the digitization of newspapers, Digitizing the News.

8 Conventional here is actually quite historically proximate: the system creates a pdf document by translating the XML document into a LaTeX document, then into a pdf document. LaTeX has been, for some twenty years, a standard text-formatting and typesetting language used by some sectors of the publishing industry (notably mathematics, engineering, and computer science). Were it not for the existence of this standard from which to bootstrap, the Connexions project would have faced a considerably more difficult challenge, but much of the infrastructure of publishing has already been partially transformed into a computer-mediated and -controlled system whose final output is a printed book. Later in Connexions's lifetime, the group coordinated with an Internet-publishing startup called Qoop.com to take the final step and make Connexions courses available as print-on-demand, cloth-bound textbooks, complete with ISBNs and back-cover blurbs.

9 See Johns, The Nature of the Book; Warner, The Letters of the Republic.

10 On fixity, see Eisenstein's The Printing Press as an Agent of Change, which cites McLuhan's The Gutenberg Galaxy. The stability of texts is also questioned routinely by textual scholars, especially those who work with manuscripts and complicated variora (for an excellent introduction, see Bornstein and Williams, Palimpsest). Michel Foucault's "What Is an Author?" addresses a related but orthogonal problematic and is unconcerned with the relatively sober facts of a changing medium.

11 A salient and recent point of comparison can be found in the form of Lawrence Lessig's "second edition" of his book Code, which is titled Code: Version 2.0 (version is used in the title, but edition is used in the text). The first book was published in 1999 ("ancient history in Internet time"), and Lessig convinced the publisher to make it available as a wiki, a collaborative Web site which can be directly edited by anyone with access. The wiki was edited and updated by hordes of geeks, then "closed" and reedited into a second edition with a new preface. It is a particularly tightly controlled example of collaboration; although the wiki and the book were freely available, the modification and transformation of them did not amount to a simple free-for-all. Instead, Lessig leveraged his own authority, his authorial voice, and the power of Basic Books to create something that looks very much like a traditional second edition, although it was created by processes unimaginable ten years ago.

12 The most familiar comparison is Wikipedia, which was started after Connexions but grew far more quickly and dynamically, largely due to the ease of use of the system (a bone of some contention among the Connexions team). Wikipedia has come under assault primarily for being unreliable. The suspicion and fear that surround Wikipedia are similar to those that face Connexions, but in the case of Wikipedia entries, the commitment to openness is stubbornly meritocratic: any article can be edited by anyone at any time, and it matters not how firmly one is identified as an expert by rank, title, degree, or experience; a twelve-year-old's knowledge of the Peloponnesian War is given the same access and status as an eighty-year-old classicist's. Articles are not owned by individuals, and all work is pseudonymous and difficult to track. The range of quality is therefore great, and the mainstream press has focused largely on whether Wikipedia is more or less reliable than conventional encyclopedias, not on the process of knowledge production. See, for instance, George Johnson, "The Nitpicking of the Masses vs. the Authority of the Experts," New York Times, 3 January 2006, Late Edition-Final, F2; Robert McHenry, "The Faith-based Encyclopedia," TCS Daily, 15 November 2004.

13 Again, a comparison with Wikipedia is apposite. Wikipedia is, morally speaking, and especially in the persona of its chief editor, Jimbo Wales, totally devoted to merit-based equality, with users getting no special designation beyond the amount and perceived quality of the material they contribute. Degrees or special positions of employment are anathema. It is a quintessentially American, anti-intellectual-fueled, Horatio Alger-style approach in which the slate is wiped clean and contributors are given a chance to prove themselves independent of background. Connexions, by contrast, draws specifically from the ranks of intellectuals or academics and seeks to replace the infrastructure of publishing. Wikipedia is interested only in creating a better encyclopedia. In this respect, it is transhumanist in character, attributing its distinctiveness and success to the advances in technology (the Internet, wiki, broadband connections, Google). Connexions, on the other hand, is more polymathic, devoted to intervening into the already complexly constituted organizational practice of scholarship and academia.

14 An even more technical feature concerned the issue of the order of authorship. The designers at first decided to allow Connexions to simply display the authors in alphabetical order, a practice adopted by some disciplines, like computer science. However, in the case of the Housman example this resulted in what looked like a module authored principally by me, and only secondarily by A. E. Housman. And without the ability to explicitly designate order of authorship, many disciplines had no way to express their conventions along these lines. As a result, the system was redesigned to allow users to designate the order of authorship as well.

15 I refer here to Eric Raymond"s "discovery" that hackers possess unstated norms that govern what they do, in addition to the legal licenses and technical practices they engage in (see Raymond, "Homesteading the Noosphere"). For a critique and background on hacker ethics and norms, see Coleman, "The Social Construction of Freedom."

16 Bruno Latour"s Science in Action makes a strong case for the centrality of "black boxes" in science and engineering for precisely this reason.

17 I should note, in my defense, that my efforts to get my informants to read Max Weber, Ferdinand Tönnies, Henry Maine, or Émile Durkheim proved far less successful than my creation of nice Adobe Illustrator diagrams that made explicit the reemergence of issues addressed a century ago. It was not for lack of trying, however.

18 Callon, The Laws of the Markets; Hauser, Moral Minds.

19 Oliver Wendell Holmes, "The Path of Law."

20 In December 2006 Creative Commons announced a set of licenses that facilitate the "follow-up" licensing of a work, especially one initially issued under a noncommercial license.

21 Message from the cc-sampling mailing list, Glenn Brown, Subject: BACKGROUND: "AS APPROPRIATE TO THE MEDIUM, GENRE, AND MARKET NICHE," 23 May 2003. Sampling offers a particularly clear example of how Creative Commons differs from the existing practice and infrastructure of music creation and intellectual-property law. The music industry has actually long recognized the fact of sampling as something musicians do and has attempted to deal with it by making it an explicit economic practice; the music industry thus encourages sampling by facilitating the sale between labels and artists of rights to make a sample. Record companies will negotiate prices, lengths, quality, and quantity of sampling and settle on a price.

This practice is set opposite the assumption, also codified in law, that the public has a right to a fair use of copyrighted material without payment or permission. Sampling a piece of music might seem to fall into this category of use, except that one of the tests of fair use is that the use not impact any existing market for such uses; the fact that the music industry has effectively created a market for the buying and selling of samples means that sampling now routinely falls outside the fair uses codified in the statute. Creative Commons licenses, on the other hand, say that owners should be able to designate their material as "sample-able," to give permission ahead of time, and by this practice to encourage others to do the same. They give an "honorable" meaning to the practice of sampling for free, rather than the dishonorable one created by the industry. It thus becomes a war over the meaning of norms, in the law-and-economics language of Creative Commons and its founders.

Conclusion.

1 See ~peters/fos/overview.htm, and see Clive Thompson, "Open Source Spying," New York Times Magazine, 3 December 2006, 54.

3 See especially Christen, "Tracking Properness" and "Gone Digital"; Brown, Who Owns Native Culture? and "Heritage as Property." Crowdsourcing fits into other novel forms of labor arrangements, ranging from conventional outsourcing and off-shoring to newer forms of bodyshopping and "virtual migration" (see Aneesh, Virtual Migration; Xiang, "Global Bodyshopping").

4 Golub, "Copyright and Taboo"; Dibbell, Play Money.