Criticism of Code and Other Laws of Cyberspace

From LessigWiki

Criticism of Code and Other Laws of Cyberspace, Code v.2 Wiki - see Anti-Lessig Reader for other works.



Chapter 1: Code is Law

Chapter 2: Four Puzzles from Cyberspace

Chapter 3: Is-ism

Chapter 4: Architectures of Control

Chapter 5: Regulating Code


Chapter 9: Translation

Chapter 10: Intellectual Property

Chapter 11: Privacy

From Review of "Code and Other Laws of Cyberspace" by Karen Cole

Lessig fares surprisingly poorly in the chapter on privacy. . . .

Unfortunately, by the end of the chapter Lessig embraces the vision of privacy as property, and therefore suitable as a commodity to be traded, sold, or bartered. Although he doesn't exactly endorse the World Wide Web Consortium's Platform for Privacy Preferences (P3P), he fails to disclose the elements of P3P that fit so well into his basic thesis: that P3P sets in code behaviors that would be much better served by a constitutional view of the rights of the individual within society. It is private law that has been developed by a consortium of private institutions with their own interests. . . .

The fact that Lessig misses the message of P3P perhaps shouldn't be such a surprise. Lessig is a legal scholar, not a system developer. The articles written about P3P by its developers are plain-language statements that give their interpretation of its intentions. The real impact of P3P is only visible on a reading of the protocols themselves,[4] and most people, even very smart people like Lessig, are not able to read and understand such protocol documents. . . .
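To give a sense of what "reading the protocols themselves" entails: P3P policies are expressed as machine-readable token strings and XML, not prose. A compact-policy HTTP header might look like the following (the particular tokens shown here are illustrative; the full vocabulary and its exact semantics are defined in the W3C P3P 1.0 specification):

```
P3P: CP="NOI DSP COR CURa ADMa OUR STP"
```

Each token is a coded claim about the site's data practices. Nothing in the header itself explains what those codes commit the site to, which is precisely the reviewer's point about the opacity of protocol-level privacy rules.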

Although I agree in many ways with Lessig's exhortation to look at and understand, or at least question, the codes we live with, I cannot get over the feeling that the code has grown so complex that we can no longer unravel it.

Chapter 12: Free Speech

Here Lessig misses the most important point about government bodies regulating radio frequency space. Frequency allocation works on a kind of "biggest bully wins" principle, sometimes even now that the space is regulated. Case in point: electrical motors are terrific radio transmitters, especially the powerful motors used in lifts/elevators. Now, why would it be better if private bodies, rather than a government body, regulated matters of interference? Let us not forget that radio communications are vital to the military. Would the military have to negotiate with private entities, or would it have to wage a radio frequency war, to get the space it wanted?

Had Lessig argued in his book, Code 2.0, about certain specific frequencies and their "free" use, then I could have followed his reasoning. As it stands, however, the reasoning is deeply flawed and contains grave errors. Regarding "free" use, there cannot be absolute freedom. Almost everything electrical can transmit radio waves. Should we tolerate electrical motors, televisions, computers, transformers, and the like broadcasting and messing up limited radio space?

And what is so good about Wi-Fi? The more users share the same frequencies, the lower the throughput gets. A better technology is needed for any of the effects Lessig mentions to take place.
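The shared-medium effect the reviewer describes can be sketched with a toy model: treat the channel as a fixed capacity that is time-shared among users, with each additional user wasting some airtime on contention. The 54 Mbps capacity figure and the 5% per-user contention loss below are illustrative assumptions, not measurements of any real 802.11 network.

```python
# Toy model of a shared wireless channel (illustrative, not 802.11-accurate):
# raw channel capacity is split among contending users, and each extra user
# wastes a fixed fraction of airtime on collisions and backoff, so per-user
# throughput falls faster than 1/n.

def per_user_throughput(capacity_mbps, users, contention_loss=0.05):
    """Effective throughput per user on a time-shared channel.

    Assumes each additional user wastes `contention_loss` of total airtime
    (a rough stand-in for CSMA/CA overhead, capped at 90%).
    """
    if users < 1:
        raise ValueError("need at least one user")
    overhead = min(contention_loss * (users - 1), 0.9)
    return capacity_mbps * (1 - overhead) / users

for n in (1, 2, 5, 10):
    print(n, "users ->", round(per_user_throughput(54.0, n), 2), "Mbps each")
```

Under these assumptions, ten users each see only a few Mbps of an original 54 Mbps channel, which is the degradation the reviewer has in mind.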

Chapter 13: Interlude

Chapter 14: Sovereignty

From Wired "Lessig Suffers From Bad Code" By Declan McCullagh

The real problem is that Lessig's proposed solution [to private regulation by code] is no better. He bemoans that too much of the Internet is run by companies and individuals instead of by bureaucrats and legislators -- and the private sector isn't limited by constitutional restrictions on the government.

What he fails to note, however, is that the framers of the Constitution placed strict limits on the power of the feds because citizens don't have a choice. We can't opt out of laws passed by the federal government, which enjoys a monopoly far more extensive than Microsoft ever did.

Fortunately, on the Net, consumers have choices. Anyone fed up with AOL's persnickety approach to prurience can switch to Netcom or any number of other, freer Internet providers.

But a broader point is worth making: The free market isn't perfect. It makes mistakes -- remember the Edsel? But, on the average, free-market economies outperform overly regulated ones. After finishing Code, you'll realize that Lessig's complaints aren't just about the Net. They're about laissez-faire capitalism in general, and he doesn't like it very much at all.


Chapter 15: The Problems We Face

To the extent that Lessig's arguments against Internet "freedoms" that have not been achieved through democratic processes are persuasive, they may resist dismissal as instances of the genetic fallacy.

However, the Internet originally emerged not only in the absence of such support, but for the most part in complete obscurity, especially from the market actors and regulators that had until then served as the (exclusive) legitimate agents of democratic action in the US communications sector. Thus, to the extent that his arguments on this point are persuasive, they constitute equally persuasive demonstrations that the Internet itself is an illegitimate accident of history.

Chapter 16: Responses

From Stanford Law Review "What Larry Doesn't Get" by David G. Post

The source of our disagreement here is clear. I have no quarrel with the notion that the code/architectures of cyberspace embed fundamental values, and I have no quarrel with the notion that each of us, confronting the design of these new cyberplaces, faces a choice among different values. . . .

But I do quarrel with the notion that because there are choices to be made among value-laden architectures, these are "political" decisions that should necessarily be subject to "collective" decision-making. Consider, by way of counter-example, the original, and still probably the most powerful, value-laden code/architecture of them all: the English language. The semantic and syntactic structures of English (and of all natural languages) are deep architectural constraints on our social life, as the critics (and, indeed, Lessig himself) have been fond of pointing out. . . .

I take it as obvious that we do not, and that we should not, subject those semantic and syntactic structures to the collective for decision-making. English will evolve, best, not by subjecting it to a series of decisions by the collective empowered to impose its will on all, but by a series of individual and sub-group decisions aggregated together. . . .

There is thus a building project at hand; cyberspace needs architectures where deliberation and reason and freedom can flourish, because -- we both believe -- people want to live in communities where deliberation and reason and freedom can flourish.(102) We can disagree about the extent to which the coercive power of the State needs to be invoked in order to get those communities built and to get people to live there.



The New Chicago School

To be integrated above

From review of ElectricRay on LibraryThing

Just as he rightly brings the utopians to book for believing their hype about this golden new age of freedom - of course governments and vested interests will figure out the net and how to regulate it effectively, as they have every other social revolution since Wat Tyler's time - I think his own vision is needlessly dystopian. It assumes that code will at some point be able to regularly, systematically, reliably and effortlessly know every single fact about every one of us - and hence that we are ultimately regulable.

But this isn't realistic. Just as it would be impossible to accurately predict the trajectory of a crisp packet blown across St Mark's Square, no matter how sophisticated your equipment and scientific knowledge, the web is too weird, people's applications for it too dynamic and unpredictable, and the "true meaning" of our communications too innately susceptible of multiple interpretations for any code to ever fully get the better of us (not even really close). For example, in my organisation I have spent months, with considerable IT infrastructural support, trying to figure out how to reliably capture simple, non-controversial attributes of regular documents which routinely and predictably pass between a small, easily identified community of users across a tightly defined and fully monitored part of our internal computer system - and this has so far proved quite impossible. The idea that one might reliably capture deliberately masked communications even from this minute sample seems absurd, and the idea that one could do this across the whole world wide web preposterous.

Just as the spammers and virus programmers keep ahead of the filters, our freedom is adaptable and valuable enough to keep ahead of the Man.