Why You Should Distrust "Trusted Computing"
By Volker Grassmuck
Wouldn't it be nice if you were able to trust your computer? If you could be confident that it would do only and exactly what you want it to do? Initiatives for "Trusted" and "Trustworthy Computing" imply that they will turn computers into just that kind of machine. In fact, there are good reasons to distrust them.

In the mid-1990s, Mark Stefik from Xerox PARC developed a computing environment for controlling the delivery, access to, and use of digital content. The digital revolution had empowered individuals to freely manipulate and distribute any text, image, and sound. The music companies, followed by other branches of the entertainment industry, came to see this as a threat to their business models, and pushed to solve this perceived problem of technology with yet more technology, known today as DRM (Digital Restrictions Management). Stefik still called them "Trusted Systems" and left no doubt about whose trust they are supposed to gain: "Trusted systems presume that the consumer is dishonest".

Since then, a whole range of DRM technologies has emerged. Based on cryptography, they include encryption and scrambling, watermarking, authentication, online registration, remote update, and revocation of rights. All of them have one thing in common: they were broken as soon as they arrived on the market. The entertainment industry therefore called upon lawmakers to create special legal protection for DRM. It started with the Copyright Treaty of the World Intellectual Property Organization (WIPO) in 1996. This was first implemented in US copyright law in 1998 as the Digital Millennium Copyright Act (DMCA). Europe followed in 2001 with the EU Copyright Directive, which is currently being transposed into the national copyright laws of the member states. Technically, it is still possible to circumvent DRM technology, but under the new provisions the act of circumvention is now illegal in itself.

On the technical front, a more radical approach has been pursued by the Trusted Computing Platform Alliance (TCPA), a large consortium set up in 1999 and superseded in April 2003 by the Trusted Computing Group (TCG). The idea of controlling not only the data but the whole computing environment has been around in the military field since the early 1970s. A cryptographic chip with a unique "endorsement key" is put in charge of starting the PC and validating the BIOS, operating system, hardware drivers, and application programs. This can be used to detect viruses and Trojans and to enforce access control.
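
To make the mechanism concrete, here is a toy sketch, in Python and purely for illustration; it is loosely modeled on a hash-chain "extend" operation and leaves out everything the actual chip and the TCG specification do beyond that. Each boot stage is measured into a single register, so that the final value fingerprints the entire software configuration:

    import hashlib

    def extend(register: bytes, component: bytes) -> bytes:
        # Fold a measurement of the next boot component into the register
        # (loosely modeled on a TPM-style "extend"; not the actual TCG spec).
        measurement = hashlib.sha1(component).digest()
        return hashlib.sha1(register + measurement).digest()

    # Simplified boot chain: each stage is measured before it receives control.
    register = b"\x00" * 20
    for stage in (b"BIOS", b"boot loader", b"operating system",
                  b"hardware drivers", b"application"):
        register = extend(register, stage)

    # The final value fingerprints the whole configuration; changing any
    # single stage changes it completely.
    print(register.hex())

Whoever can verify this value can tell whether the machine is running an "approved" configuration, which is exactly what the two features described next build on.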

The TCG design adds two features on top of this. It allows the system to report a kind of X-ray of the currently running software configuration to a third party, e.g. a bank or a music service ("remote attestation"). If the requested service sees programs it doesn't like, it will refuse to conduct the transaction. If it is satisfied with the user's configuration and decides to send data, it can lock that data to the current system state ("sealed storage"); it can only be decrypted if the system is in exactly the same state. As MIT cryptologist and Turing Award winner Ron Rivest put it: "The right way to look at this is you are putting a virtual set-top box inside your PC. You are essentially renting out part of your PC to people you may not trust."
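
Again only as a rough illustration, and again with none of the chip-internal key handling of the actual TCG design, sealed storage amounts to making the decryption key depend on that configuration fingerprint:

    import hashlib

    def seal(data: bytes, config_digest: bytes) -> bytes:
        # Derive a keystream from the measured configuration and XOR it with
        # the data. Purely illustrative; the real design keeps the keys inside
        # the chip and never hands them to software.
        stream, counter = b"", 0
        while len(stream) < len(data):
            stream += hashlib.sha256(config_digest +
                                     counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, stream))

    unseal = seal  # XOR is symmetric; unsealing needs the identical digest

    approved = hashlib.sha1(b"approved software configuration").digest()
    changed = hashlib.sha1(b"same machine after installing new software").digest()

    content = b"data sent by the service"
    sealed = seal(content, approved)
    assert unseal(sealed, approved) == content   # same state: readable
    assert unseal(sealed, changed) != content    # changed state: unreadable

The service, not the owner of the machine, decides which configuration counts as "approved"; that is the point of Rivest's set-top box comparison.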

Microsoft, though a member of the consortium, has its own plans. Within the larger framework of its "Trustworthy Computing Initiative", it calls its interpretation of TCG the "Next-Generation Secure Computing Base" (NGSCB), formerly known as Palladium. It involves not only a new crypto chip but also changes to the CPU, chipset, memory, graphics processor, and the USB hub that connects mouse and keyboard. It is essentially a complete redesign of the PC architecture.

Both forms of Trusted Computing (TC) supposedly address security problems, and both serve the content industry in controlling its works on the computers of the users. Both implement the distrust of the user that Stefik described, and have therefore rightly been called Treacherous Computing.

TC creates a whole range of problems. Encrypted data becomes unreadable not only when the crypto chip fails but even when the system is changed by updating or installing new software. It marks the end of the flexible general-purpose computer as we know it, which will be replaced by a special-purpose machine optimized for the needs of the content industry. Privacy is threatened, because DRM is intended to create high-resolution personalized usage profiles. The fair use provisions of copyright law have to be decided on a case-by-case basis; since they cannot be expressed in technology, TC will effectively abolish them. The TCG claims that in order to work, TC has to become ubiquitous. Legislating industry-wide adoption is not opportune today, so the consortium will exert its power through means like bundled licensing to prevent non-TC systems from being offered. This obviously raises antitrust issues. It blocks innovation, leads to customer lock-in, and reduces consumer choice. The high cost of developing and rolling out the technology will have to be borne by consumers.

A less obvious problem is that the interlocking technological, legislative, and industrial steps are hardening a path of development that makes other solutions unthinkable. Since it is becoming increasingly clear that DRM and TC are inefficient and a dead end with unacceptable costs for industry, consumers, and society at large, alternatives need to be considered.

Secure rather than "trusted" computing is possible today through means like firewalls, intrusion detection systems, layered permissions, and smart cards for generating and storing cryptographic keys. The major security issues, as is well known in computer science, have nothing to do with security technology itself but with its social acceptance. Trust is obviously not a technical feature but a quality of inter-human relations, and the object of a booming field of research into networks of trust and reputation. The intricacies of copyright law, such as fair use and parody, can likewise only be resolved at the social level. What we want is secure computing and trustable social relations.

This leaves the question of how creators can be compensated in the light of a media industry that only rewards a few stars. Practitioners and scholars all over the world are working on alternatives that include voluntary contributions and changed business models. Four Microsoft DRM specialists conclude their famous Darknet paper: "In short, if you are competing with the darknet, you must compete on the darknet's own terms: that is convenience and low cost rather than additional security". The digital revolution allows authors and users to circumvent media oligopolies altogether. A promising solution is to extend the existing system of lump-sum levies to the digital realm. No TC needed. You trust no one? Well, then allow no one to control your computer.

Volker Grassmuck is a researcher at the Humboldt University Berlin and the initiator of the Wizards of OS conferences on the social dynamics of open source systems.
http://waste.informatik.hu-berlin.de/Grassmuck/



--
Source: http://world-information.org/wio/wsis/2003/Texts