Ellen Ullman: We Have to Demystify Code
Because Our Adversaries are Armed with Algorithms, Too
Ellen Ullman is a former software engineer and a writer. She started programming in the late 1970s and was a first-hand witness to the rise of the internet and the various tech booms and busts of the last forty years. She wrote a memoir in 1997 titled Close to the Machine: Technophilia and its Discontents. The book is often described as a “cult classic.” Ullman has also written two novels, The Bug and By Blood. She writes about technology with the lyrical gifts of a poet and the penetrating mind of a philosopher. She recently published a new book of non-fiction called Life in Code: A Personal History of Technology (MCD: Farrar, Straus and Giroux, 2017). It chronicles her experiences in the tech world from 1994 to the present.
Morgan Meis: Let me quote the first few sentences of the book, which I love, and which go like this: “People imagine that programming is logical, a process like fixing a clock. Nothing could be further from the truth. Programming is more like an illness, a fever, an obsession. It’s like riding a train and never being able to get off.”
I take this opening to be something of a shot across the bow, a challenge to any of the standard ways of writing about technology, which tend to be either “techie” on the one side or outside critiques that don’t really get it on the other. Am I basically on the right track here?
Ellen Ullman: If you see the opening as my desire to begin a new way to talk about technology, I’m pleased. But the truth is I had no intention at all, no idea of shooting anything across any particular bow.
It was 1994. I got a phone call from James Brook, who was to be the co-editor of the City Lights book titled Resisting the Virtual Life. He said that Nancy Peters, the editor at City Lights, had suggested he call me about contributing something for the book. The next night, I was in my study trying to decipher some code, when I remembered Jim’s call. I started up Word and wrote, “People imagine that programming is logical . . . ,” then I went on to complete the first section in the essay, which survives today almost verbatim.
I had no idea of addressing anyone in particular. It turns out that what I wrote about was my internal experience of programming, the emotional one.
The fact that I became someone who writes about computing technology, as opposed to someone who lived inside the culture that was creating it, came as a shock to me.
MM: Got it. So let’s talk more about this “world” you were in as a programmer. I’m not sure I understand entirely why you wanted to go into this world in the first place. You have some wonderful passages in the book about “getting close to the machine.” Can you explain a bit what that means to you, what that compulsion to get close to the machine is all about?
EU: The words “close to the machine” refer to the programmer’s relationship to the innards of the computer. You can think of code as residing in inner and outer orbits around the machine’s kernel—in Unix, the guarded center of the operating system is indeed called the kernel. Then code moves outward toward the edges, getting farther away from that center.
The trick is to write code that interacts only with other code, “low-level code” that resides in the innermost orbits. The most prestigious work, in terms of operating systems, is to write code that runs in the kernel.
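To make the orbit metaphor concrete, here is a minimal sketch in C, assuming a Unix-like system (an illustration added here, not an example from the book): the same line of text printed at two distances from the kernel, once through the C library’s printf and once through write, a thin wrapper over the system call that hands the bytes directly to the operating system.

```c
/* A minimal sketch, assuming a Unix-like system: the same text printed
 * at two distances from the kernel. */
#include <stdio.h>      /* outer orbit: the C standard library */
#include <string.h>
#include <unistd.h>     /* inner orbit: thin wrappers over system calls */

int main(void)
{
    const char *msg = "hello from user space\n";

    /* Farther out: printf buffers the text in user space; the C library
     * decides when to hand it to the operating system. */
    printf("%s", msg);
    fflush(stdout);

    /* Closer in: write() asks the kernel directly to copy these bytes to
     * file descriptor 1, standard output. Code that runs inside the
     * kernel itself sits one orbit further in still. */
    if (write(STDOUT_FILENO, msg, strlen(msg)) < 0)
        return 1;

    return 0;
}
```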
I learned that I loved working with machines in the early 1970s, when I joined a group called the Ithaca Video Project. We taught ourselves to make videos using the first recorder that individuals could use, the Sony Portapak. We made documentaries, got involved in social and political activities, revered the work of the pioneering video artist, Nam June Paik. It was an exciting time. Small machines in our hands! Breaking the hold of behemoth corporations! If this sounds like the coming of the first PCs, it was. When I saw a 1978-era microcomputer, the TRS-80, in the window of a Radio Shack store, I thought: Is this anything like the Portapak? Can you use it for social action? Will it make art? And, without any forethought, I bought it. When I got my first decent program running, I sat back and marveled at it as if I’d just mined a ruby.
As the book goes on, I describe my first efforts at coding, the frustration but also the allure. Programming, in part, is the art of learning from failure. To write code is to create bugs. Then comes the teeth-grinding frustration of removing them, one by one by one. But, to stay in the profession, you have to feel a sense of excitement, to be intrigued by what you don’t know, to feel the pleasure of the hunt.
MM: To me, your book is unique in the way that you both take programming so seriously and, at the same time, have such a poetic sensibility and a feel for the intangible aspects of life (which you describe so well in terms of the sensual pleasures of cooking, your love for your cat, the simple needs of our bodies, like shitting and eating, as inextricable from our form of consciousness). My sense from reading the book is that, in the end, you are having trouble squaring the circle between your “felt sense” for the complexity of human life and your fascination with the logic of machines. Is that fair to say?
EU: I wouldn’t say I’m having “trouble.” I wouldn’t call it a “struggle.” I would say that both of us are looking at an ongoing difficulty, a conundrum. Conundrums are not solved “in the end.” The difficulties started long before either of us was born and will go on long after we’re dead (or for as long as humans exist).
To begin with, there is no hard break between logic and “the felt sense,” as your question assumes. Evolution has created us as an inseparable, roiling concoction of homo faber, the tool-makers; homo sapiens, the knowing ones; and two additions I would coin: homo sociales, the social ones, and homo affectus, the ones who feel. Comprehending the whole of human life, the relationships among the parts of us that have evolved separately over eons, the harmonies and antagonisms, is something people have been trying to figure out for centuries, without notable progress. If I seem to be having trouble, I’ll take comfort in knowing I’m not alone.
I’ll start with the human as tool-maker, the creator of the computing machine, because that is the part you have associated with logic, which you imply is the antagonist of the “deep feel.”
We make things: It’s in our nature. We create survival tools: for gathering food, killing prey, storing food and water, creating shelter.
Yet our need to survive as a species brings in the wilder, irrational part of our nature: sex. And food is more than sustenance: It is pleasure. We adorn the tools we make. There was no pressing survival need, I think, in painting patterns on clay vessels, no great reason to take time away from other needs to devise dyes. Anyone who looks at the artifacts left for us by the humans of the deep past must surely recognize our ancestors’ yearning for beauty.
The same is true for our tool called the computer. Computer code can be elegant; algorithms are often described as “beautiful.” But code cannot contain the wildness of natural language. In human language, we can fracture the formalities of grammar, refuse to spell correctly, use dialect, hip-hop, the cadences of marches, the complexities of fugues, jump in and out of characters’ perceptions. Poetry could not exist without natural language’s ability to rebel against the rules. We can “break” language but still be understood. Broken words can still “work.”
The computer, as our tool, exists with the roiling concoction of what we call human nature. And no one can say which part is more essential: shitter, fighter for survival, builder of cities, painter of vessels, writer of code.
MM: I’m convinced by much of what you are saying here, and have similar philosophical instincts myself. But just to push the point, you do suggest at different places throughout your book that the tension between the human and the machine has taken an ominous turn. Here, for instance, is a thought you have standing in a supermarket checkout line, “The lines at the checkout stands were long; neat packages rode along on the conveyor belts; the air was filled with the beep of scanners, as the food, labeled and bar-coded, identified itself to the machines. Life is pressuring us to live by the robots’ pleasures, I thought. Our appetites have given way to theirs. Robots aren’t becoming us, I feared; we are becoming them.”
This is quite a bit stronger than what you are expressing above, no?
EU: Ah, Morgan. I have been longing for questions that make me think hard. This question and the one before it come under the rubric of “Watch out what you wish for.”
The essay does indeed look at the fearful changes forced upon us in the digital age, the damage to our inner and social lives.
The fear is not of humanoid robots per se: “That long-standing fear—robots who fool us into taking them for humans—suddenly seemed a comic-book peril, born of another age . . .” The fear comes from what society is doing to us in the digital age: “Working long hours, our work life invading home life through email and mobile phones . . . the increasing rarity of those feasts that turn the dining room into a wreck of sated desire.”
But I hope you also see my love for technology in the essay. I am having a lavish dinner party. There will be course after course of wine and food. As I’ve said, we need to eat for logical reasons, survival, but also to satisfy the wilder part of our nature: desire, pleasure. Nonetheless, I imagine inviting a robot to the dinner. I know it can’t share our pleasures, so I wonder, as a good host, what would please a robot. The philosopher Jeremy Bentham said that we confer sentience on a being if we can say yes to the question, “Does it suffer?” I wanted to turn that question on its head: “Can I confer sentience on my guest robot if I see its pleasures?” It was my attempt to get close to the robot, even befriend it.
In the end, I cannot. The robot’s pleasures—standardization, consistency—are inimical to our own. The natural antagonism between our desires and theirs has become a battle, which we are losing. My love for technology battles with my fear of it. But “technology” does not have agency. It is humans who create it. The damage to society can be resisted. As the book goes on, I send out a call for us to take up arms, as it were, against the segregated culture that invents the new tools, writes the new code.
MM: Right. And this seems to segue nicely into another point that comes up frequently in your book. Perhaps the problem isn’t technology in itself, but the specific sector of society that is “in charge” of technology. You share many moving episodes in your life as a programmer where you are basically trapped in this very childish, teenage-boy culture that seems completely cut off from the richer aspects of human life we’ve been talking about. This culture of male immaturity often seems to have no idea what to do with a poet/woman/programmer such as yourself. Yet these are the people who are writing the code that more and more dominates our lives. Do you think there is any necessary connection between the immaturity and unsociability among those who write code and the code itself, or is it a historical accident that the two are so intertwined?
EU: Thanks, Morgan, but I’m no poet. I write poetry only when I’m distressed, and it’s awful.
I have to begin by saying that, in talking about the teenage-boy culture, I’m speaking in generalities. I know that the first essay is very hard on that culture. But as the book goes on, I also talk about the sweetly, brilliantly geeky guys I worked with, who taught me a great deal. If I could not remember the helpful, kind men I worked with, and the pleasures of coding, I could never encourage anyone to enter the profession.
Does the puerile culture come from the nature of coding itself? Ah! An easy question at last. No. There is nothing in coding that is inherently female or male. It takes a particular kind of person to stay with the profession, as I’ve said. High tolerance for failure. Failure inducing a sense of intrigue. A rush of pleasure when something works. This special sort of drive can overcome anyone.
Too many of those recently arrived co-workers were functionally illiterate outside of engineering disciplines. A few—not too many, thank god!—were horribly challenged interpersonally. I once worked with a guy with whom I was supposed to create a new human interface. He refused to talk to me. We communicated only by email and a shared white board. If he could not look me in the eye (I sat ten feet away from him), how could he possibly invent a rich interface that envisioned the complexity of the human beings on the other side of the screen?
Software engineers are not “in charge” of the new directions computing technology will take. But the culture they work in is created by the ones who are in charge. Venture capitalists (rich white men, now trending younger) choose which startups to fund. The founders they choose are, again, overwhelmingly men, white and Asian—and young. The application for support from Y Combinator, the premier source of seed money, asks for the applicant’s age (!). Investors are charmed by boys who wrote complicated software when they were 11.
The young-male culture seeps down from there. One founder, who was looking for a vice president, told the HR person that he wanted someone under 26 years old. The HR person told him it was illegal. They hired a man under 26 years old. Then the officers look for managers who are young guys, and they in turn look for young-guy programmers. And down we go into the coding rooms. From top to bottom: a closed society of young men, where sexism can be practiced with impunity.
MM: Along the lines of what you’re saying here, I notice that toward the end of your book a more, shall we say, manifesto-like tone creeps in. You are making the call, if I understand you, for breaking code wide open. You write, “I dare to imagine the general public learning to write code.” But you are also realistic about how unrealistic this sounds. Are you still feeling vaguely hopeful about this possibility?
EU: There’s no need for a “shall we say.” Yes, there is a manifesto, and it doesn’t creep in. It stomps in.
You’re missing the point of “imagine.” Not only do I imagine, I dare to imagine. A thread that runs throughout the book is the segregated culture of technology and its sorrowful effects on human life. At some point, I believe, the book has to offer a conception of a way forward.
I make it clear that I don’t expect everyone to become a professional programmer. The point is to demystify code, let the general public learn that code is written by human beings, who have their preconceptions and biases, and can be changed by human beings.
It’s no news to say that we’re bound up in the chains of algorithms (I use this medieval metaphor intentionally). The algorithms are rife with bias—or might not be. We can’t tell because they are not open to public view. There must be a new army of programmers who can question the intentions in the code. When our social and political adversaries are armed with algorithms, we, too, must take up the study of algorithms.
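To make that concrete, here is a hypothetical sketch in C, invented for illustration and not drawn from the book or from any real system: a tiny scoring routine in which a programmer’s preconception is just an ordinary, readable line of code, visible to anyone once the source is open to view.

```c
/* A hypothetical scoring routine, invented for illustration only. */
#include <stdio.h>

struct applicant {
    double income;            /* annual income, in dollars */
    double years_at_address;  /* residential stability */
    int    zip_code;          /* where the hidden assumption lives */
};

/* Compute a made-up eligibility score. */
static double score(const struct applicant *a)
{
    double s = 0.0;
    s += a->income / 1000.0;
    s += a->years_at_address * 2.0;

    /* A preconception written as if it were a fact of the world: a range
     * of zip codes is simply penalized. With the source open, the bias is
     * there for anyone to read and to question. */
    if (a->zip_code >= 94100 && a->zip_code <= 94199)
        s -= 25.0;

    return s;
}

int main(void)
{
    struct applicant a = { 52000.0, 3.0, 94110 };
    printf("score: %.1f\n", score(&a));
    return 0;
}
```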
The difficulty, of course, is how to get into the closed technology culture. I found ways that may indeed bring in the former outsiders. Yet I know that a small army of new programmers cannot rid society of the traps, dangers, and damnations computing technology has visited upon us. But it’s not nothing. It’s a start, shall we say?
__________________________________
Ellen Ullman’s memoir Life in Code is available now from MCD.