I had to make a visit to the doctor’s clinic yesterday, interrupting my writing for this website. So as not to lose much time, I grabbed a book I hadn’t yet opened but knew would be of benefit to what I was working on: Sherry Turkle’s 1995 book Life on the Screen. I had picked it up in Boston last year while attending the Free Software Foundation’s annual convention, LibrePlanet 2023. Now, having read the opening chapters, I get the unfortunate impression that to the author, at least the author of 1995, I may as well have been attending a Microsoft Appreciation convention.

Of course, working with the benefit of 29 years’ hindsight, I have Dr. Turkle at an extreme disadvantage. I’ve found all of her books invaluable as sociological histories. Further, I will personally attest that her dynamic psychological portraits of computer users are absolutely fearsome. I have often felt vivisected by her words on the page, and have diligently studied her sources and references for illumination of both my own human condition and society at large.

And therein lies the problem. Dr. Turkle is the preeminent authority on growing up in a world full of computers. So when she’s blind to any major facet of computer culture or technology, so too is everyone who relies on the authority of her texts.

Part One of the book is titled The Seductions of the Interface, and its first chapter is A Tale of Two Aesthetics. Already we reach a show-stopping bug in the book’s premise. The Macintosh is presented as a high-level simulation, which disallows any penetration into the machine itself.

The power of the Macintosh was how its attractive simulations and screen icons helped organize an unambiguous access to programs and data. The user was presented with a scintillating surface on which to float, skim, and play. There was nowhere visible to dive.

Anybody familiar with my writing knows that this is exactly my complaint about “cyberspace” writ large: it keeps users trapped at a high level. My work, unlike Turkle’s, goes deep into the history underlying these illusions. And so it should: I’m doing media history; she’s writing about psychology.

With the Macintosh laid out as the epitome of user capture in high-level cyberspace, the obvious polar opposite to the Macintosh style is the Free Software “aesthetic” of GNU/Linux. GNU and other GPL-licensed systems use the power of courts of law to ensure users can fully examine their machine top to bottom, source code and all. As I said in my talk last year, the GPL software license hacks through the envelopes of discontinuity which separate the post-modern subject from grounding in their material environment; in other words, it restores the medium as a clear message. This is in fact the spirit of the “aesthetic” which Turkle proceeds to lay out next in the book, with a disastrous twist.

IBM computers shipping with MS-DOS and Microsoft Windows, we are told, represent the fleeting “modernist,” “utopian” dream of understanding and controlling your computer top to bottom.

The modern and postmodern aesthetics were locked in competition for the second half of the 1980s with the IBM personal computer (and its clones) becoming the standard-bearer on the modernist side. The myth of the Macintosh was that it was like a friend you could talk to; the myth of the IBM, abetted by that company’s image as a modernist corporate giant, was that the computer was like a car you could control. Although most people who bought an IBM personal computer would have never thought to open it up, to modify the machine or its operating system, this possibility was implicit in the design of the system. As one user told me, “The source code is out there in the public domain. I never want to look at it. It would just slow me down. But I love it that it’s out there.”

What the heck is going on? IBM and Microsoft as the last vestiges of computer transparency and openness? I mean, I did a whole lot of finagling with my config.sys and autoexec.bat files in my day, but at no point was I getting to the bottom of anything. Changing the shell in win.ini and typing the IRQ and DMA values for my sound card into every video game was not exactly a lesson in computer hardware design. It wasn’t until I spent a summer installing Gentoo Linux that I first grappled with my computer as a material, physical machine instead of a box full of magic. A “simulation,” as Dr. Turkle calls it.

To give her the benefit of the doubt: GNU wasn’t exactly a viable home computer operating system for more than a handful of hackers during the time she was researching this book. What she did have to interview were thousands of consumers comfortable with MS-DOS, Windows, and the Macintosh. But the problem goes even deeper than that. What’s this about Microsoft source code being “in the public domain”? Microsoft’s source code was a closely guarded trade secret, never published anywhere. Our anonymous user is either being misquoted or severely mistaken.

This is no small quibble. The narratives in Dr. Turkle’s books are upstream of a great deal of contemporary sociological study and theory. There is no other author doing work as important as hers. The very importance of a book like this inflates a nitpick like mine into a fatal flaw. In her account, simulation (in the truest Baudrillardian sense) and post-modernism overtook modernism and the ability to know your machine in the ’90s, when MS-DOS went out of vogue. Dr. Turkle’s psychological profiles and explanations of what living in simulation does to people have been of immeasurable value. But the context in which she delivers these profiles, in its incompleteness and piecemeal nature, is fatally pessimistic and deterministic. Her account condemns computer users to total capture by big tech companies. For instance, she quotes Maury, “a sociology student whose fierce loyalty to MS-DOS and Microsoft Windows is based on the appeal of transparent understanding:”

I like the feeling that when I learned how to program in C [a computer language] I could really get to Windows because Windows is written in C. But then sometimes I want to get something done very quickly. Then I can just do it … on top of Windows… and I don’t need to worry about the computer at all. Then it’s as though Windows steps in between the machine and me and tries to take over. But I don’t have to let that happen. If I want to manipulate things, I can always get past Windows. I want to be able to do all this because of my personality. I am pretty compulsive. I like to set things up a certain way. For me, the greatest pleasure is to get the machine to do all that it can do. I want to optimize the hard drive and get it set up exactly the way I want… like allocate memory just the way I like it… I can do that with an IBM machine, not with a Mac.

First of all, learning C to write Windows applications would have been prohibitively expensive for most people who owned Windows computers. The GNU C Compiler (GCC) from Richard Stallman (founder of the Free Software Foundation) was the first no-cost, Free compiler available to the mass public, and that is to say nothing of the cost of the Windows Software Development Kit. There were far more Visual Basic coders for Windows than hobbyist C programmers. Secondly, there is a tacit implication here that, by knowing C, one can reprogram Windows. Just because Windows is written in C doesn’t mean you get the C code to Windows. The fact is ultimately immaterial, except for linking to the system libraries through their C header files in the SDK, which provide no code, only an interface to inscrutable, compiled binary blobs. No, benefiting from knowing the language your OS is programmed in is a feature of GNU/Linux, of other Free software, and of some Open Source software. It is not a feature of Windows or DOS, where getting underneath would require the totally different skills of reverse engineering. Yet, from reading the quote above, everything that is true of Free Software would naively be thought true of Microsoft software.
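To make the point concrete, here is a minimal sketch, my own illustration rather than anything from Turkle’s book, of what “knowing C” actually bought a Windows programmer. The SDK header declares the function and its types; the code behind the call ships only as a compiled binary you can link against but never read.

```c
/* A minimal Win32 program in C. windows.h supplies declarations only;
 * the implementation of MessageBoxA lives in a closed, compiled binary
 * (user32.dll). Knowing C lets you call across that boundary, not see
 * beneath it. */
#include <windows.h>

int main(void)
{
    MessageBoxA(NULL,                          /* no parent window */
                "The code behind this call is a binary blob.",
                "Transparent understanding?",
                MB_OK);
    return 0;
}
```

On a GNU/Linux system, by contrast, the library behind an equivalent call ships with its complete source code, and the license guarantees your right to read and change it.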

The fact that MS-DOS and Windows, being entirely file-based beyond the BIOS, were relatively more open to tinkering than the ROM-based Macintosh System software doesn’t make them anywhere near the bastion of transparency that Dr. Turkle misleadingly paints them to be. Furthermore, she tacitly paints them as fated to lose.

Of course, many people still prefer to work with transparent computation in its earlier, modernist sense. But in the course of the 1980s, there grew to be less for them to work with, less in off-the-shelf computing, less in research computing, and less that they could recognize as their own in the long shadow that computing cast over the larger culture. The aesthetic of simulation had become increasingly dominant in the culture at large. In 1984, William Gibson’s novel Neuromancer celebrated its approach to computing’s brave new worlds…. Gibson called that information space cyberspace, meaning the space that exists within a computer or matrix of computers. Cyberspace is not reducible to lines of code, bits of data, or electrical signals.

Here we go again. When you use proprietary software from Microsoft, Apple, and many other commercial outfits, your binaries are “not reducible to code” to you, personally, because you don’t have the source code. They become reducible to lines of code the moment whoever programmed them makes that code available to you. The only reason the simulation is post-modern is that you are denied the means of crawling down the stack to the material cause.

Here’s a fun story. My grandfather was a cabinetmaker in Bulgaria. He fashioned very beautiful pieces of furniture out of very fine woods, with delicate inlays and ornate flourishes. He was commissioned to build a large desk for a client, which he built in pieces to be assembled on site, in the client’s office. Well, during assembly, he was informed that he would not be paid the price that had been negotiated for the desk. My grandfather politely acquiesced, assembled the desk, and then very carefully covered every single screw hole on the desk with inlays of precisely the wood into which it had been drilled, rendering them virtually undetectable. My family can only guess at how long it was before the client discovered that it would be impossible ever to move the desk from his office without destroying a wall or window of his building.

Sometimes, the indivisibility or irreducibility of a perceptual “gestalt” is not a fact of nature. Sometimes it’s an intentional “fuck you,” a power move by one person or group of persons over another. My grandfather was saying “fuck you” to the client who stiffed him on his bill, and tech companies are saying the same when they sell you a computer you can’t control. And in both cases it’s certainly more than a mere “aesthetic.” It’s literally a matter of being in control of the thing you bought, or being controlled by it. The giant, gaping exit sign from this trap is everywhere: GPL-licensed software has proliferated across the world. But in the tale Turkle told in 1995, based on the sources she had, it may as well not have existed.

In the last chapter of his 1984 book Hackers, Steven Levy explains how Richard Stallman, “The Last of the True Hackers,” delivered the “fuck you” right back to the tech companies that were collectively ushering in the post-modern age of proprietary software.

It was painful for everybody, and when both companies came out with similar versions of LISP machines in the early 1980s it was clear that the problem would be there for a long time… The magic was now a trade secret, not for examination by competing firms. By working for companies, the members of the purist hacker society had discarded the key element in the Hacker Ethic: the free flow of information. The outside world was inside…

Stallman wanted to fight back. His field of battle was the LISP operating system, which originally was shared by MIT, LMI, and Symbolics. This changed when Symbolics decided that the fruits of its labor would be proprietary; why should LMI benefit from improvements made by Symbolics hackers? So there would be no sharing. Instead of two companies pooling energy toward an ultimately featureful operating system, they would have to work independently, expending energy to duplicate improvements… A virtual John Henry of computer code, RMS had single-handedly attempted to match the work of over a dozen world-class hackers, and managed to keep doing it during most of 1982 and almost all of 1983. “In a fairly real sense,” Greenblatt noted at the time, “he’s been outhacking the whole bunch of them.”

Stallman had no illusions that his act would significantly improve the world at large. He had come to accept that the domain around the AI lab had been permanently polluted. He was out to cause as much damage to the culprit as he could. He knew he could not keep it up indefinitely…

Richard Stallman did leave MIT, but he left with a plan: to write a version of the popular proprietary computer operating system called UNIX and give it away to anyone who wanted it. Working on this GNU (which stood for “Gnu’s Not Unix”) program meant that he could “continue to use computers without violating [his] principles.” Having seen that the Hacker Ethic could not survive in the unadulterated form in which it had formerly thrived at MIT, he realized that numerous small acts like his would keep the Ethic alive in the outside world.

Access to source code is essential, but far from sufficient for freeing oneself from the post-modern condition. Turkle’s work shines when it examines how identities are formed in relation to “objects to think with.” From the gears of Seymour Papert’s childhood to the robotic Turtle which drew on the floor, our relation to our material environment shapes and makes us. The stories and histories of those objects, their constitution and reconstitution in new forms and in new places, their obsolescence and retrievals according to McLuhan’s tetrad: all of these are narratives which must be told. McLuhan drew on accounts of blast furnaces and horse stirrups and every other sort of machine to demonstrate the massive impact our relation to objects has on our identity.

Computers are a tricky case. They’re so complex when rendered abstractly. I hope that by telling the history of their development, we can find places in our embodied, material world to put all these little bits of cyberspace which otherwise threaten to overwhelm us in the maelstrom. We need to relate the scales and proportions of these things to our bodies, our senses, the rate of time and the breadth of our reach as humans of flesh and blood. It is happening already, dimly, everywhere that people fall out of belief in the normalized mainstream perception of commodity computing “experiences.” The Turing Complete User isn’t dead yet. The books of the ’90s heralding the age of simulation were both premature and incomplete. That should be happy news to everyone who worried otherwise, Dr. Turkle included.