Is it Too Late to Save the Internet from Itself?
Noam Cohen Talks to Andrew Keen About Digital Capitalism Run Amok
2017 has been the year that writers turned on Silicon Valley. First there was Jonathan Taplin’s Move Fast and Break Things, then Franklin Foer’s World Without Mind, and they are now joined by Noam Cohen’s The Know-It-Alls. Cohen, a New York Times writer focusing on technology, has seen Silicon Valley up close, and his book is a series of biographical snapshots of the leading figures of the Internet revolution. The Know-It-Alls is a highly original analysis of the mentality and ideology of the multi-billionaires who are shaping our collective digital future. From Jeff Bezos and Mark Zuckerberg to Peter Thiel and Larry Page, Cohen has drawn an engaging and relentlessly critical portrait of our new digital overlords.
Andrew Keen: What story were you trying to tell in The Know-It-Alls?
Noam Cohen: I wanted to explain the harsh libertarian values that seemed to be taking root in America, through the lens of the rise of the Web and Internet. One of the first books I read as I began this project was the marvelous Age of Fracture, by Daniel Rodgers, which showed how hyperindividualism won the narrative war in the last 25 years of the 20th century.
At the start, I believed that the Web would be an amazing tool for collaboration and political organization—I wrote a piece for Dissent making that case back in 2001. Over time, I came to recognize that other than Wikipedia and Craigslist, I was wrong. Monopolies control how we access search, commerce, social networking. Tracking and surveillance are standard, and vital to these monopolies’ business models; these companies resist even basic regulation over housing, driving, advertising to young people, or advertising meant to sway elections. The arrogance was hard to avoid, and frankly scared me. These hubristic leaders—the know-it-alls of my title—in the past few years acquired wealth and power of unprecedented scope with unprecedented speed.
I wanted to see how we got to this dire state of affairs. From my reporting at the Times, and a couple of relatives I was close to, I had a good sense of how the typical computer programmer approaches the outside world. It wasn’t too pretty. But the people I was thinking about didn’t have much power, nor did they seek it, even. I wanted to explore what it would mean for a hacker to have the reins of power in our country, and I wanted to understand how hackers like Marc Andreessen, Sergey Brin, Larry Page and Mark Zuckerberg got those reins. That’s the story of The Know-It-Alls, and unfortunately it couldn’t be more relevant to today.
AK: What did looking back at the history of artificial intelligence tell you about the story of the Internet?
NC: Until I worked on The Know-It-Alls, I hadn’t realized how fundamental artificial intelligence research was to the development of the field of computer science. I tell this part of the story through the life of John McCarthy, a legendary Stanford computer science professor, who is the subject of the first chapter of my book. His personal life is fascinating—he was a “red diaper” baby raised in Los Angeles who moved to the right politically as an adult. A math prodigy, McCarthy came up with the term artificial intelligence for an influential conference he helped put together in 1956 and spent the rest of his life in pursuit of making a thinking machine. Why? One professor speculates that McCarthy and the other men had “uterus envy”—they, too, wanted to create life.
Computer science was the best tool for making a thinking machine, and McCarthy shifted from pure mathematics to become a pioneer in computer science, creating a powerful language, LISP, that is still used and revered today. At the Massachusetts Institute of Technology, McCarthy cofounded the A.I. lab, where he was a mentor to the first generation of “hackers,” as described brilliantly in Steven Levy’s book of the same name. McCarthy moved to Stanford in 1962, where he created a new A.I. lab, spurring on succeeding generations of hackers and continuing his pursuit of genuinely thinking machines. Needless to say, McCarthy never fulfilled his A.I. dreams, and this disappointment is palpable in his later life.
In place of research, McCarthy migrated to politics: he became an outspoken advocate for the hackers’ view of the world. He encouraged students to challenge authority, and led the computer science department in protest when Stanford tried to regulate what students could see on the Internet; he shared their hostility to outsiders and resistance to addressing the lack of racial and gender diversity among their ranks; he encouraged their faith that machines, like, say, computers, could make the world a better place; as an A.I. enthusiast, he believed that a computer programmed correctly and with access to the necessary data could do anything a person could. The personality traits of McCarthy, who died in 2011, are still the dominant ones in Silicon Valley.
AK: So McCarthy was an early version of later techno-billionaires like Larry Page and Sergey Brin or Mark Zuckerberg?
NC: McCarthy didn’t care about money—in a moving scene during Stanford’s celebration of the 40th anniversary of the Computer Science Department in 2006, McCarthy laments how his field has been overtaken by businesses. He refers to a San Francisco Chronicle piece about the celebration: “One of things this Chronicle article said was that somehow the essence of computer science faculty was starting companies, or at least that was very important—I have a somewhat negative view of this. I haven’t started any companies. … It’s my opinion that there’s considerable competition between doing research, doing basic research and running a company. I don’t expect to convince anybody because things have gone differently from that.”
AK: Much of the story of The Know-It-Alls seems to take place on the campus of Stanford University. Tell me more about Stanford’s role in the history of the Internet.
NC: The drive to start companies represents the other half of my story in The Know-It-Alls, and for an explanation I looked to Stanford, which happens to be at the heart of Silicon Valley. My second chapter is about Frederick Terman, Stanford’s provost in the 1950s and 60s, who encouraged faculty members and students to go into business. He shrewdly reasoned that if Stanford allowed potentially lucrative research to go to market, even helped in the process, Stanford would end up seeing a significant part of those profits, either through direct licensing agreements or through donations, or both. Terman had early experience with this—in the late 1930s, he encouraged two of his engineering students, David Packard and William Hewlett, to go into business together. He helped them get funding and their first big contract, from Walt Disney, to provide audio oscillators for his ambitious film, Fantasia. In 1977, Hewlett and Packard raised more than $9 million to build the Frederick Terman Engineering Building on the Stanford campus, just as Terman would have expected.
Stanford’s endowment was growing along with its reputation. When the Web arrived, Stanford helped nurture Yahoo, Google and later Instagram to billion-dollar status. McCarthy’s lab proved crucial to the entrepreneurial Silicon Valley culture, even if he didn’t believe in it personally. Thanks to the research at McCarthy’s labs and others, A.I. achieved halfway progress—computers can’t write poetry or explain the cosmos, as McCarthy had hoped, but they can make people think they are thinking. This limited form of A.I. is excellent at helping people shop or find friends, as well as getting people to give up their personal information and stay logged onto a site. Silicon Valley companies are making money from A.I., even if they aren’t making the world any better in the way A.I. researchers had hoped.
There is a perfect postscript to the Terman story: In 2010, Stanford built a new science and engineering quad, which included a new engineering center, named in honor of a different Stanford figure, Jen-Hsun Huang, who has a master’s degree in electrical engineering and cofounded the tech company Nvidia. Huang and his wife donated $30 million to the project. The next year, the Terman engineering center was demolished. However, inside the Huang center, on the second floor, you will find the Frederick Emmons Terman Engineering Library.
Terman’s legacy has been a guiding star of the rise of Silicon Valley as a political powerhouse and social wrecking ball. Even the great first disruptor was disrupted. No one is safe. Which is ultimately what I am trying to say in The Know-It-Alls.
AK: How inevitable is this rather tragic Silicon Valley story you present? You seem to suggest there was a very sharp fork in the road in the mid-1990s—a choice between Netscape founder Marc Andreessen’s libertarian and commercial vision of the Internet and World Wide Web inventor Tim Berners-Lee’s more communitarian and less pecuniary approach. Why did the Internet take Andreessen’s fork? Was it inevitable?
NC: Indeed, the fight between Tim Berners-Lee and Andreessen over the best way to grow the World Wide Web is an early example of why my book is called The Know-It-Alls, and not something like The Folks Who Want to Learn From Others. The younger, harder-working and supremely arrogant Andreessen always seemed destined to win, and yes, the libertarian, highly commercialized Web and Internet are among the consequences of his victory.
I write about an early meeting in Chicago in 1993 between Berners-Lee and Andreessen, which Berners-Lee described as having a “strange tension,” mainly because even then he knew he was destined to be on the sidelines commenting about his own remarkable invention, the Web, while Andreessen would be in the driver’s seat, first as an undergraduate (!) who developed the popular browser Mosaic, and later as the co-founder and visionary behind the commercial browser Netscape Navigator. In that first meeting, Berners-Lee was unnerved that Andreessen and his team already talked about something being “on Mosaic,” rather than being “on the Web.” Well, Tim, buckle your seat belt.
The Mosaic team and the later one at Netscape scrapped Berners-Lee’s vision for the Web as expressed by his browser, which included an “edit” feature. Allowing a surfer of the Web to use an edit button to easily add to what appeared there was important, Berners-Lee wrote in his memoir, Weaving the Web, because he had intended the Web “to be a medium of all sorts of information, from the very local to the very global.” But to Andreessen’s mind, Berners-Lee was a snob. He tells Walter Isaacson, for The Innovators, that Berners-Lee, “had a very pure vision. He basically wanted it used for scientific papers. His view was that images are the first step on the road to hell. And the road to hell is multimedia content and magazines, garishness and games and consumer stuff.” Andreessen, by contrast, was a man of the people, “a Midwestern tinkerer type,” who believed that “If people want images, they get images. Bring it on.”
The Web became a passive medium, which people used to be entertained or to shop. Soon enough, Netscape Navigator was changed to include “cookies,” snippets of code that gave memory to the Web. The initial reason to add cookies was to help with commerce—the change let a website offer a shopping cart that remembered the products you wanted to buy at the end of your shopping. Before long, “third-party” cookies cropped up that allowed marketing companies to create online profiles of Web users, the basis of digital advertising. After news leaked out and parts of the public were outraged, Netscape considered removing third-party cookies, but ultimately decided not to. The programmer who came up with cookies, Lou Montulli, described his personal anguish before deciding to recommend doing nothing: “If third-party cookies were disabled ad companies would use another mechanism to accomplish the same thing, and that mechanism would not have the same level of visibility and control as cookies. We would be trading out one problem for another.”
In other words, to take up your question: was the descent of the Internet into a realm of tracking, surveillance, and intense commerce inevitable? Is Montulli right? Well, certainly, the odds were stacked in favor of the Andreessen model, since Andreessen had the venture-capital funds to hire programmers to create the browser. If the battle was going to be fought in the marketplace, without any counterbalance or regulation by the government, then yes, the outcome appears inevitable. The Andreessen vision is good for business; the Berners-Lee one is indifferent to business at best.
When speaking of alternative histories my mind goes to the classic film It’s a Wonderful Life, which presents two histories of Bedford Falls, NY—one with George Bailey, and one without. With George running his savings-and-loan, there is a happy, healthy small town; with George gone, there is an angry, glitzy, morally corrupt one, renamed Pottersville, after its richest citizen, Mr. Potter. Seems kind of familiar, no?
AK: Many of the know-it-alls you describe—from Google founders Larry Page and Sergey Brin to Peter Thiel and, of course, Mark Zuckerberg—are presented as variants of Mr. Potter. Do you think these characters have been corrupted by the dizzying wealth of Silicon Valley? Or is there something intrinsically uncivil, or perhaps uncivic, about these tech entrepreneurs that makes them natural architects of a digital Pottersville?
NC: Henry F. Potter was famously described by George Bailey as a “warped, frustrated old man,” which misses the mark in a few obvious ways when it comes to the know-it-alls. They are not old: even Bill Gates, whose profit-obsessed ways in the mid-1970s are covered in an early chapter of my book, is only 62 years old and, heck, Mark Zuckerberg is still too young to run for president. (Though, not for long!) And I don’t see them as particularly frustrated, either. Sure, these tech leaders face a number of hurdles—governments in the US and Europe incessantly discussing how to meddle in their affairs—but the favored tactic in Silicon Valley is to keep grinning and explain that they come in peace.
Ah, warped. That one may have some legs! I argue that in the case of the hackers in my book who became filthy rich company founders—Andreessen, Brin and Page, Mark Zuckerberg—they needed to be cajoled and twisted to turn their massive brains toward making a profit. Brin and Page were motivated by the example of Nikola Tesla, who had the better ideas, but lost out to the business-minded Thomas Edison. Making money, for better or worse, was how you got your ideas accepted. Zuckerberg to this day seems to believe in his quest to connect the world, and again, making money for his investors is how he can gather the resources for this mission. Andreessen believes in the market as a way of ensuring that companies deliver what the people want.
The bankers who led Web companies—Bezos and Thiel, for example—were never conflicted. Each intended that his company, Amazon and PayPal, respectively, would become the middleman for Internet commerce and take a cut of all transactions. Isn’t that the role of a smart banker, finding a way of making easy money from the work and transactions of others? Thus, it wasn’t such a surprise to learn that Bezos had a connection to McCarthy’s lab through David Shaw, a Stanford computer science PhD, who used his computer skills to make money in the stock market as one of the first “quants.”
AK: Okay. Enough with the criticism of the know-it-alls. How do we fix things? Can we reclaim the civic legacy of Tim Berners-Lee and establish a more equitable digital economy? Or might there be new technologies—blockchain, for example, or A.I. and the Internet of Things—that will enable a return to the original ideals of visionaries like Berners-Lee?
NC: Yes, where do we go from here? I’ve been struck that some recent critiques of Silicon Valley end by saying something to the effect of, “Citizens, stop selling yourselves short! Your attention is precious. Don’t waste it on social networking sites—read a book, look at a sunset.” I love the sentiment, but doubt these steps will be sufficient.
In my book, I advocate an Internet that is “local, small-scale and active.” Those qualities are what will ensure that the Internet is human-scale and thus has the chance to be empathic and civic-minded. This would mean breaking up the big monopolies like Facebook, Amazon, and Google. There isn’t enough diversity of business models, and the network effect means that big companies just keep getting bigger—there have been troubling reports that small tech businesses are having an even harder time of it lately. Also, I think being local is so important—I was talking with a friend who was writing about local publications in New York City. In essence, Facebook and Google have cut the legs out from under them by allowing advertisers to target by location. Facebook uses an algorithm to find people who live in Bayside, Queens, while a local news site or paper uses the genuine connections between neighbors. The paper has helped stitch the community together; Facebook has profited from the stitching that has already taken place.
Of course, breaking up monopolies and increasing regulation plays right into the libertarian line that governments prevent innovation and want a stultified economy that privileges bureaucrats. So I’ve been thinking, maybe a better way of framing the issue is to say that we have to start protecting the data rights of individuals. Your data is yours, and it can’t be contracted away in exchange for services, even great ones. I was intrigued to see that back in the 1970s, there was a committee appointed in the US to examine data privacy—it recommended restraining both the government and corporations from secretly collecting data and empowering individuals to know exactly what data had been collected about them. The resulting law applied only to the government, feeding the libertarian narrative that Big Brother is big government, when we all can see that Big Brother today is Silicon Valley.