At the New York City World’s Fair in 1964, the Ford Motor Company invited you to take a ride on the Magic Skyway, designed by Walt Disney, gliding over a Jetsons-like “City of Tomorrow” with towering metal spires and bubble-domed buildings. It even imagined the colonization of Antarctica, the logic being that because energy was so cheap and abundant in this tomorrow-world, any place on Earth could be made habitable. General Electric also employed a bit of Disney magic, co-producing “Progressland,” a $17 million, three-storey pavilion (and later inspiration for Epcot) with an animatronic “Carousel of Progress” at its centre.
The audience rotated around six stages, showing stories of how the American family had, from the 1880s to 1960s, gradually been freed up to spend less time on chores and more with each other. Above it all, an audio-animatronic father figure acted as host and narrator, with jokes about “a fellow named Tom Edison who’s working on an idea for snap-on electric lights!” before the family finally arrived in their General Electric Gold Medallion Home, surrounded by the newest models of kitchen equipment. A specially composed song, “There’s a Great Big Beautiful Tomorrow” by the Sherman Brothers (better known for the Mary Poppins score), promised with a jingle, “There’s a great big beautiful tomorrow, shining at the end of every day.”
This was all very much in the mould of previous World’s Fairs, going back to the coal-powered Great Exhibition of 1851, just more so. The 1964 event was especially lavish in its use of energy, with even more lavish use implied for the future. Electric companies were keen to show off nuclear power as the route to such energy abundance, but fossil fuels were still core to the event. Gas provided 80 percent of the cooling, something the gas companies proudly advertised to counter all the images of an atomic future on offer.
Even if nuclear power hadn’t been part of the Progressland story (and it very much was, with a “manmade sun” thermonuclear fusion demonstration as part of the show), the fair was a key event in the Cold War, articulating not just an image of the future, but how this was better than any communist visions. Lenin had been an electrical utopian in the early days of the USSR, proclaiming there would never be full communism until full electrification had been achieved. But stories of scarcity in communist nations were a mainstay of American Cold War rhetoric, and domestic gadgets made for a useful proxy for other tech developments, just as images of spaceships did elsewhere.
By the 1960s, oil had found a tight little spot for itself at the centre of the global economy, but there was a growing sense that the rise of nuclear meant fossil fuels’ days were numbered. Indeed, one of the reasons scientists in the 1950s didn’t worry too much about what carbon emissions might do to the climate was because they figured this phase of human history—where we did something as weird as power ourselves by exploding the buried remains of ancient bugs and trees—was very much a temporary state of affairs. Nuclear received most of the shiny happy electrical future hype, but renewables got a look in too. After all, the first really big electrical project had been hydro, at Niagara, and wind and solar were at least starting to talk about catching up too.
Sometimes dubbed “the Danish Edison,” Poul la Cour got into electricity in the 1870s, working first on telegraphs at the Danish meteorological office. Inspired by the utopian rhetoric surrounding electrical power at the time, he wanted to ensure this bright new future was available to people across Denmark, not just in the cities. He was awarded a research grant from the Danish parliament and in 1891 used wind power to light up the high school where he was a teacher. His designs soon spread throughout Denmark, and la Cour was canny enough to set up training courses, a professional Society of Wind Electricians and a journal.
Over in the US, brothers Joe and Marcellus Jacobs had grown up among screaming winds in the north-east corner of Montana, and built a small wind industry aimed at the off-grid rural market. They had little formal engineering training but liked to tinker. After playing around with a surplus First World War aircraft, adding propellers to sleds to get through the snow, they used this knowledge to build a wind generator using bits from an old Model T and the fan blade of an old wind-powered water pump. They started selling these to local farmers, setting up the Jacobs Wind Electric Company in 1928, eventually opening a factory in Minneapolis in 1932. Admiral Byrd took a Jacobs turbine on one of his trips to the South Pole and 22 years later visitors found it still spinning, even in the harshest of environments.
In Russia, the early Soviets were also determined to make the most of the promise of electrification, and Lenin instructed the Academy of Sciences to research wind power. In 1931 a 100ft wind turbine was constructed near Balaklava, overlooking the Black Sea. It moved around a circular rack so it could be positioned to face directly into the wind and had an output of 100kW. To put that in some context, the turbines the Jacobs brothers were pushing out at their factory in Minneapolis around the same time were just 1.5–3kW.
There were bigger ones, but still, this Balaklava project was of a whole new magnitude and the Russians had started plans for a giant machine 50 times larger. In the end, Soviet wind power research fell by the wayside during the war and wasn’t picked up again. No one knows what happened to the Balaklava turbine—there are some reports that it was used until it was destroyed in 1942, possibly doubling as an observation post for a while. An American entrepreneur named Palmer Putnam was inspired by the Russian talk of big wind and figured he could do better. Ten times better than the Balaklava turbine—the first megawatt.
As Alexis Madrigal describes evocatively in his brilliant book on the history of green tech, Powering the Dream, Putnam was an MIT-trained geologist but had no particular background in generating electricity. He did, however, have ambition. Plus he was charming and came from a well-connected family of high flyers—his mother was the first dean of Barnard, his father was a celebrated American Civil War veteran who’d gone on to run a prestigious publishing company and his cousin was an arctic explorer who married Amelia Earhart—and soon managed to raise cash.
Before long, Putnam’s 1,250kW turbine was taking shape on top of a mountain in Vermont, Grandpa’s Knob. It was a huge undertaking. The turbine’s tower came from a bridge-builder in Pennsylvania, where the blades were also built, but then everything was put together in Ohio before being shipped to Vermont. It was too heavy for local roads, so bridges had to be temporarily reinforced. It took ten rather hair-raising trips and that just took it to the bottom of the mountain. Reaching the top was another 2,000ft trek. Plus, there was no road, so they had to build their own.
Finally, it was all there and assembled. On 19 October 1941 it fed electricity into the grid. But a bearing broke in 1943 and by then American engineers had other troubles to be dealing with. For another two years, Putnam’s turbine just sat there while the world went to war. Finally, in 1945, someone found time to fix the bearing and it was switched back on again. It promptly broke again. After 1,100 hours of operation, a blade fell off, sailing 750ft through the night, knocking Putnam off his feet. The incident was speedily dubbed “the blade that failed,” and congressional hearings in 1951 cited it as a reason to write off the tech altogether. Putnam himself turned away from wind too, arguing instead for nuclear and solar.
Today, a phone tower stands on the top of Grandpa’s Knob. Still, the company that bankrolled the project put the patents in the public domain and got Putnam to write a book detailing everything that happened. When, in the 1970s, people came back to the idea of big wind, it’d help give them a head start.
Back at a nineteenth-century World’s Fair, the 1878 Paris Exposition Universelle, maths teacher Augustin Mouchot had presented his new and exciting ideas for solar-powered steam. This included a printing press that, even when cloudy, could produce 500 copies an hour of a special solar-themed publication, the Soleil-Journal. Mouchot had started off playing with a type of solar oven as a way to create steam without coal, adding a massive funnel of mirrors to concentrate the sunlight, an approach still used in today’s concentrated solar plants.
The French government gave Mouchot funding to take a scientific mission to Algeria to experiment with the abundant sunshine there (it’s not just fossil fuels that have histories wrapped up in colonialism). Coal was cheap, so Mouchot’s research funding was cut and he went back to teaching. But dreams of solar power hadn’t ended there.
In the 1900s, pioneering solar-preneur Frank Shuman started to build a power plant in his backyard in Philadelphia. Strips of blackened iron pipes covered with glass were filled with a liquid that had a low boiling point; the Sun would heat these pipes, turn the liquid to vapor and this would power a steam engine. It was clever: you concentrate the power of the Sun with the glass and blackened pipes, and you cut down the work required by picking a liquid that turns to gas more easily than water. Still, it needed a lot of space. He figured that if he could establish solar power in countries where coal was expensive but the Sun was plentiful, he’d be able to scale it up, and tried a project in Maadi, a suburb of Cairo. Sadly, despite some favorable coverage in the New York Times, the First World War largely put paid to the idea.
Still, on a smaller scale, solar hot water heaters were produced: an insulated box of water-filled pipes that could sit on the roof, using the Sun’s heat to warm your bath. In June 1979, 32 such ‘solar thermal panels’ would make their way to the roof of the White House, as Jimmy Carter used them to make a point about alternative energy. (Indeed, as a Nature editorial quipped in 2016, you can trace a history of American energy policy through solar panels at the White House: Carter put some up, Reagan took them down again, then during George W. Bush’s time in office a few were quietly installed in the gardens, seemingly without the president’s knowledge or interest, but Obama made another show of doing a roof install again.)
In the 1930s, George Keck, an architect working on a twelve-sided glass “House of Tomorrow” for the 1934 World’s Fair in Chicago, noticed that despite the freezing weather outside and the furnace not yet installed, workers inside the building were dripping with sweat. The principle was pretty straightforward; it wasn’t all that different from how Joseph Paxton had managed to grow Amazonian lilies in the UK back in the mid-nineteenth century. Keck started building homes with south-facing windows, dubbed “solar homes” by the press and popular with customers keen to cut heating costs post-war.
In 1947, the glass company behind Thermopane published a book entitled Your Solar House with 49 examples of different solar homes, each by a different architect, designed with different states in mind. Whole parts of Chicago and New York suburbs were built on solar home principles, and prefab designs meant components could be cheaply turned out in a factory. It could go awry with bad planning, either losing heat at night through the windows or overheating, and they were still some way from a modern super-energy-efficient “passivhaus.”
And at the same time, they had to compete with the same sort of stories of uncomplicated energy abundance electricity companies were pushing in the World’s Fair Progressland display; an image of the future where you can flick a switch for air con or heating, not needing to worry about how much electricity or gas it used, or even where that energy was coming from at all.
The real gee-whiz excitement for solar power was the promise of photovoltaics (solar PV); using the Sun to produce electricity and doing that without the need to turn a turbine. There’d been experiments with this since the mid-nineteenth century, but no one had managed to build anything that could really compete with steam, wind or hydro. An accidental discovery at Bell Labs in 1940 would change this. Russell Ohl was playing with some silicon samples and noticed one with a crack in it. Impurities had built up on either side of the crack, one side positively charged, the other negative. When he shone light on this odd little broken and dirty sample, a current would flow.
Ohl had inadvertently made a positive-negative (p-n) junction, the basis of the modern solar cell. It would take a lot more work before it was anywhere near efficient enough to use, but in April 1954, Bell Labs proudly presented their new solar cells to the world, using a strip of them to run first a toy Ferris wheel and then a radio transmitter that could broadcast music. Excited to get on board with the solar hype, the New York Times stuck the story on their front page. The new device, made from the same ingredient as sand, “may mark the beginning of a new era, leading eventually to the realization of one of mankind’s most cherished dreams—the harnessing of the almost limitless energy of the Sun.”
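The effect Ohl stumbled on is today described by what engineers call the single-diode model of a solar cell: illumination adds a photogenerated current on top of the junction’s ordinary diode behaviour. A minimal sketch in Python, with purely illustrative parameter values (nothing here comes from Bell Labs’ actual cells):

```python
import math

def cell_current(v, i_light, i_sat=1e-9, n=1.5, v_t=0.02585):
    """Ideal single-diode model of a solar cell, in amps.

    v: voltage across the cell; i_light: photogenerated current
    from illumination; i_sat: diode saturation current;
    n: ideality factor; v_t: thermal voltage at room temperature.
    """
    return i_light - i_sat * (math.exp(v / (n * v_t)) - 1.0)

# In the dark the junction is just a diode: no current at zero volts.
dark = cell_current(0.0, i_light=0.0)   # 0.0 A

# Shine light on it and, with no change to the junction itself,
# it delivers its short-circuit current.
lit = cell_current(0.0, i_light=0.03)   # 0.03 A
```

At zero voltage the dark cell delivers nothing while the illuminated one delivers its full photocurrent, which is essentially what Ohl saw when light fell on his cracked sample.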
It was all still hype though at this point. The first major order came from a hearing aid company, the idea being that small cells could be mounted on spectacles. But they went bankrupt before paying, so at first solar PV was largely left to toys. Solar cells had been developed by a telephone company and one early use was to power telephone lines in rural off-grid areas.
But the panels got covered in bird droppings so had to be cleaned every week and the silicon transistor (based on similar tech) soon meant the lines didn’t need so much electricity anyway, so the cells weren’t needed. The army was excited though. As soon as it heard about the Bell Labs breakthrough it sent its lead power researcher to find out more. He was immediately smitten by the idea and quickly got a team together to work out applications. But after months of searching, they could only find one project where solar power might be viable: a top-secret plan, codenamed Operation Lunch Box, to launch a satellite.
This wasn’t entirely original; Arthur C. Clarke had already written about space stations running on solar. It simply made more sense to use energy directly from the Sun in space, rather than bringing a store of fossil fuels. The US would be beaten to the first satellite launch by the Russians’ Sputnik—but in March 1958 Vanguard 1, the first solar-powered satellite, went into space as part of the International Geophysical Year (IGY) science program. It’s still up there, if you want to give it a wave, holding the record as the oldest human-made object in orbit. It’d be a while before solar made sense anywhere other than space, but ultimately the work that would go into powering satellites would bring down the cost of PV panels on Earth.
In his social history of American energy, Consuming Power, David Nye notes that the average American household in 1970 used more energy than a whole town in the Colonial period. A single TV at the time used as much energy in an hour as a team of horses could provide in a week. And yet, on the whole, this abundance wasn’t seen as remarkable. If anything, it was a vindication for the American way of life, the success of capitalism. With the growth of unions and better labor laws, the average American working day decreased from twelve hours a day in the middle of the 19th century to eight in the 1930s.
At first this trend seemed likely to continue but, in the end, no excess of leisure ever emerged. People spent money in the free time they had and on new goods that now seemed necessary to navigate a modern life, and so maintained long working hours to pay for them. Consumption, not relaxation, would become the pattern of aspiration for 20th-century life. Economies would soon become reliant on this pattern too, further yoking themselves to the fossil fuels that produced and shipped goods around the world.
The marketing techniques people like Josiah Wedgwood had pioneered in the eighteenth century—celebrity endorsement, careful segmentation of different audiences to sell to, illustrated catalogues—became ever-more elaborate and the advertising industry boomed, all lit up in electric signs. The first department stores had started to pop up from the nineteenth century onwards, with malls joining them from the middle of the twentieth century. Shopping became a leisure activity in itself, shopping sites becoming tourist attractions.
Plastics played a big part in this shiny new version of the American dream—along with whizzy new electrical devices, cars and concrete—and it’s worth sketching some of their history here. People had started to research ways to improve naturally occurring plastics in the mid-19th century. In the 1840s, Charles Goodyear patented processes for strengthening rubber (from trees), “vulcanising” it with sulphur. Charles Macintosh had worked out how to use a by-product of the coal gas industry to make waterproof raincoats a few decades before, and submarine telegraph cable-makers would create their own special recipes for treating gutta percha (indeed, it was playing with such mixes that led Willoughby Smith to an accidental discovery that helped build the first solar cells).
In 1862, artist turned chemist Alexander Parkes displayed his new material, Parkesine, at a World’s Fair held in the space left by the Great Exhibition in London. Made from gun cotton, the material was sold as a cheap replacement for ivory or mother-of-pearl. Back in the 18th century, manufacturers like Matthew Boulton had made a fortune with new techniques that offered a larger market the sort of goods once only available to the super rich, and Parkesine was just another stage of this. Parkes opened a factory in Hackney Wick, east London in 1866, but he wasn’t a great businessman (reports that some of his products exploded didn’t help) and he filed for bankruptcy after just two years.
A couple of years later, American inventor John Wesley Hyatt entered a competition offering $10,000 for an alternative to ivory to make billiard balls, utilizing an improved version of Parkes’s idea. He added camphor to the mix and gave it the name “celluloid.” Working with his brother Isaiah, he also developed a process of “blow moulding” to produce hollow tubes of celluloid, paving the way for mass production of cheap toys and ornaments. Another of the advantages of celluloid was that it could be mixed with dyes, including mottled shades, allowing the Hyatts to produce not just artificial ivory but coral and tortoiseshell too.
The next big plastics breakthrough came in 1907, when Belgian-born American chemist Leo Baekeland patented a manmade alternative to shellac—a resin secreted by the female lac bug that could be used for electrical insulation—which he named Bakelite. This soon found a range of applications—it was marketed as “the material of a thousand uses”—and was joined by a host of new plastics in the 1930s and 1940s. Nylon offered a sort of synthetic silk, useful for both parachutes and women’s stockings, and Plexiglas fed the burgeoning aviation industry.
Plastics could be durable; how long they last is one of their appeals (as well as their tragedy). But the growth of plastic goods intensified a growing culture of disposability. The mass production of cloth and paper in the nineteenth century, coupled with the emergence of germ theory, had fostered a culture of disposables: products that might be used once and thrown away. This model suited companies that wanted to keep selling to customers, and the number of disposable products grew in the 20th century, from the first disposable razors in 1903 to nappies in 1961.
Styrofoam, first developed for flotation devices in the war, found a new market as single-use coffee cups and take-away food containers. Colorful, squeezable polythene bottles soon became the norm for the packaging of a range of household goods, from washing-up liquid to shampoo to ketchup. The durability of the plastic was part of its attraction: a plastic straw, for example, doesn’t get soggy while you use it. But that durability also meant these products didn’t rot so easily once they’d been used. Along with the simple growth in how much stuff people were consuming, the persistence of plastics meant the waste piled up.
The first calls to curtail disposables came in 1953, from dairy farmers wanting to protect their animals from ingesting stray glass. As Heather Rogers describes in her illuminating book Gone Tomorrow, the packaging industry mobilized fast and within months had launched the non-profit organization Keep America Beautiful. Modeled on the sort of beautification organizations that had been running since the 19th century, it ran off a similar playbook to the jaywalking campaigns—shift the blame. The organization printed pamphlets for schools to encourage “lasting acceptance of good outdoor manners” and recruited a range of local community groups. Within a few years it was active in 32 states, with membership over 70 million, enjoying the active support of four federal departments.
The focus was quite squarely on the behavior of individual consumers, not the existence of packaging in the first place. As one executive from the American Can Company put it, “Packages don’t litter, people do”; or, from a 1963 education film narrated by Ronald Reagan, “Trash only becomes trash after it has first served a useful purpose. It becomes litter only after people thoughtlessly discard it.” It’s a persuasive line, not least because it offers the individual, on the face of things, some apparent agency. But, ultimately, it’s a fudge to avoid changing a system that probably never should have been allowed to establish itself in the first place. It’s an approach that the fossil fuel industry would make use of later too, most notably with BP’s promotion of carbon footprints as part of its “Beyond Petroleum” PR push in the early 2000s.
The packaging industry might have been quick to avoid blame, but by the mid-1960s the sort of world Disney presented in its “Carousel of Progress” was starting to lose its shine. In his 1965 State of the Union address, Lyndon B. Johnson celebrated what he called the nation’s “flourishing progress,” but invited the American people to consider what they wanted to do with all this flourishing, who’d benefit and how they might protect themselves from any negative side-effects.
“We worked for two centuries to climb this peak of prosperity. But we are only at the beginning of the road to the Great Society,” which, he said, “asks not how much, but how good; not only how to create wealth but how to use it; not only how fast we are going, but where we are headed.” This, he promised, would shape his policies on education, health, crime provision and the arts, but also work towards ending “the poisoning of our rivers and the air that we breathe.”
Johnson followed this up a month later with a special statement on the preservation of natural beauty to Congress, which warned of “the storm of modern change” as “modern technology, which has added much to our lives can also have a darker side. Its uncontrolled waste products are menacing the world we live in, our enjoyment and our health … the same society which receives the rewards of technology, must, as a cooperating whole, take responsibility for control.” There was a sense that consumption had got to a point where the country was saturated with pollution—“skeletons of discarded cars litter the countryside,” rivers were “over-burdened with waste.”
Johnson’s speech-writers were perhaps reacting to a larger cultural shift, something pollster Daniel Yankelovich termed “the new naturalism.” In a series of studies of college students in the late 1960s and early 1970s, Yankelovich discovered a widespread conviction that everything artificial was bad, while everything “natural” was good. Ideas like these had a history—one that flowed through conservation, and back to the transcendentalists and Romantic poets—but also reflected a more recent rebellion against the world their parents built and a desire to build new cultural values.
There’s a scene in the 1967 film The Graduate that illustrates this well. A friend of the Dustin Hoffman character’s parents takes him aside and says he has one word of advice for the young man: plastics. The idea of plastics, once at the forefront of the sort of shiny futures on display at World’s Fairs, was now used by the film to denote everything the Hoffman character despises about the world he’s invited to join: cheap, sterile, ugly, mass-manufactured and all too easily disposed of.
The great utopian promise of fossil-fueled abundance was starting to show its age, or possibly had just got a little too big for its boots. It wasn’t just heat and light and transport, but also broken toasters, not having as much free time as you’d been promised and, as Johnson’s speech-writer put it so vividly, skeletons of discarded cars littering the countryside. This would set the scene for a growth spurt in the environmental movement and—at first more in science and politics than with activists—a growing concern about all this carbon dioxide we were adding to the atmosphere too.
Johnson was far from the first president to be interested in conservation, but he would be the first to talk about the climate crisis. His statement on the preservation of natural beauty extended familiar calls to clean up the air and rivers with a reference to the carbon dioxide problem: “Air pollution is no longer confined to isolated places. This generation has altered the composition of the atmosphere on a global scale.” This, he noted, was in part through radioactive materials (this was the atomic 1960s after all), but also “a steady increase in carbon dioxide from the burning of fossil fuels.” It was a blink-and-you-miss-it mention, but the carbon dioxide problem had reached the White House.
This excerpt is adapted from Our Biggest Experiment: An Epic History of the Climate Crisis. Copyright © 2021 by Alice Bell. Excerpted by permission of Counterpoint Press.