Wouldn’t It Be Great To Directly Perceive the Warping of Space-Time?
Astronomer Chris Impey is Out Here Asking the Important Questions
A revolution is brewing. We’re on the verge of being able to “see” black holes in action. For 400 years, astronomers have learned about the universe solely by using light and other forms of electromagnetic radiation. They measure properties of the “stuff” of the universe by the ways it emits and interacts with radiation. Then, in 2015, gravitational waves were detected for the first time.
Gravitational waves are ripples in space-time that travel at the speed of light. They offer a unique window into the intense gravity of black holes, neutron stars, and supernovae, and will allow astronomers to test general relativity in new ways. They reach us from vast distances and can be used to probe the universe just after the big bang. Seeing with gravity eyes promises to transform our understanding of black holes.
There have been two major revolutions in the way we see the universe. The first began in 1610, when Galileo took a newly invented device called a telescope and aimed it at the night sky. His best telescope had lenses a half inch across, and it gathered 100 times more light than the eye. Ever since Galileo’s time, astronomers have worked to improve his simple spyglass. A hundred years ago they began using mirrors instead of lenses to gather the light, since large lenses sag under their own weight and don’t bring all colors to a focus at the same location. In the modern era, astronomers have built optical telescopes 10 meters in diameter, using either single monolithic mirrors or mosaics of smaller hexagonal segments. The gain in light-gathering power in the four centuries since the time of Galileo is a factor of a million.
Meanwhile, an additional gain in depth came from improving the way light is detected. The eye is an inefficient chemical detector. To give us the illusion of continuous motion, it must transmit the information that falls on the retina to the brain 10 times a second. That means it only gathers light, or “integrates,” for a tenth of a second. Photography was invented in the mid-19th century, and soon afterward astronomers were using it to take pictures of the night sky. Light is captured chemically in a process that’s no more efficient than the eye, but long exposure times give much greater depth. A real leap forward came in the 1980s, when digital imaging was perfected. Charge-coupled devices, or CCDs, now have 80–90% efficiency in converting incoming photons into electrons and then into an electrical signal that can easily be digitized. CCDs are near-perfect detectors. The gain in detection efficiency over the eye is a factor of 100,000.
Combining these two factors means that the best telescopes see a remarkable factor of 100 billion times deeper than the eye. This is the difference between a Northern Hemisphere dweller seeing just one external galaxy, M31, and a large telescope seeing 100 billion. It’s the difference between seeing stars a few hundred light years away and seeing light that has traveled for 13 billion years. CCDs have improved so much that the number of photons recorded with large telescopes in the past year exceeds the number of photons recorded by all the human eyes in history.
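The chain of factors above is simple arithmetic: light-gathering power scales with collecting area, so it grows as the square of the aperture diameter, and the depth gain is that area ratio multiplied by the detection-efficiency gain. A minimal sketch, assuming a dark-adapted pupil of about 8 mm (the exact pupil size is an assumption, not from the text):

```python
# Light-gathering power scales with collecting area, i.e. with diameter squared.
pupil_d = 0.008      # dark-adapted eye pupil in meters, ~8 mm (assumed)
telescope_d = 10.0   # modern large telescope aperture in meters

aperture_gain = (telescope_d / pupil_d) ** 2   # ~1.6 million
print(f"aperture gain: {aperture_gain:.1e}")

# Detection efficiency: the text's factor of 100,000 for CCDs over the eye.
efficiency_gain = 1e5

depth_gain = aperture_gain * efficiency_gain   # ~10^11, i.e. "100 billion"
print(f"combined depth gain: {depth_gain:.1e}")
```

With these rounded inputs the aperture gain comes out near a million and the combined gain near 100 billion, matching the figures in the text.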
The second revolution in seeing the universe played out over the first half of the 20th century. Ever since our early ancestors stared at the sky from the African savannah, astronomy has used a small sliver of the electromagnetic spectrum. From the bluest blue to the reddest red is only a factor of 2 in wavelength or frequency. The largest telescopes simply drill deeper in the same narrow slice of spectrum.
Technologies were developed to pry open the electromagnetic spectrum for astronomy. Viewing the universe in visible light is as limited as seeing in black and white compared to seeing in vivid color. Perhaps a better analogy comes from music: visible light is two adjacent keys on a piano, while the electromagnetic spectrum from radio waves to gamma rays is the full set of 88 keys. The first invisible waves to be used for astronomy were radio waves. At the end of the 19th century, Guglielmo Marconi showed that radio waves could be sent and detected over large distances, and, as we’ve seen, within 30 years Karl Jansky used a simple antenna to detect radio waves from the center of our galaxy. In the 1920s two astronomers at Mount Wilson Observatory used a device that converts a temperature difference into an electrical signal to detect infrared radiation from a number of bright stars, but infrared astronomy didn’t take off until more sensitive detectors were perfected in the 1970s. Observations at invisibly short wavelengths were impossible until astronomers could avoid the radiation being absorbed by the Earth’s atmosphere. The X-ray Sun was detected by a sounding rocket in 1949, and the archetypal black hole Cygnus X-1 was first spotted 15 years later. X-ray astronomy advanced rapidly, with a series of satellites in the 1970s. Cosmic gamma rays were predicted years before they were detected by satellites in the 1990s.
These capabilities give astronomers tools to detect radiation with wavelengths as long as 10 meters and as short as a thousandth of the size of a proton (frequencies from 10⁸ to 10²⁷ Hz). The extension in wavelength grasp from a factor of 2 to a factor of 10 billion billion attests to the power of technology to transform our view of the universe. Only a few sources can be detected at all wavelengths across the electromagnetic spectrum, and they’re all active galaxies powered by supermassive black holes.
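The span quoted above follows directly from the relation c = f × λ: converting each endpoint frequency to a wavelength recovers both the 10-meter radio waves and the sub-proton gamma-ray wavelengths. A quick sketch, using rounded values for illustration only:

```python
C = 2.998e8  # speed of light in m/s

def wavelength_m(freq_hz):
    """Wavelength in meters for a given frequency, from c = f * wavelength."""
    return C / freq_hz

low_f, high_f = 1e8, 1e27   # the frequency endpoints quoted in the text, in Hz

print(wavelength_m(low_f))   # ~3 m: long radio waves
print(wavelength_m(high_f))  # ~3e-19 m, roughly a thousandth of a proton's size
print(high_f / low_f)        # 1e19: "a factor of 10 billion billion"
```

The ratio of the two endpoints, 10¹⁹, is the "10 billion billion" extension in wavelength grasp described in the text.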
Everything we learn about the universe involves telescopes gathering radiation. It’s very easy to forget that we rely on indirect information. The universe is full of matter: dust grains, gas clouds, moons, planets, stars, and galaxies. We don’t see this matter directly; we infer its properties by the way it interacts with electromagnetic radiation. Chemical elements are diagnosed by the particular spectral lines they emit or absorb. Dust grains reveal themselves by absorbing light and emitting infrared radiation. Moons and planets are seen in the reflected light of nearby stars. Stars are seen by the radiation they leak out as a byproduct of nuclear fusion. Galaxies are mapped using Doppler shifts of spectral lines from their gas and stars.
All of this is indirect, and it only relates to the 5% of the universe that’s normal matter. The 95% that’s dark matter and dark energy is still invisible to us because it doesn’t interact with radiation. The astronomical objects are the actors, but the “stage” for this cosmic drama is also unseen. Astronomers trace the expansion of the universe using galaxies as markers of invisible space-time.
The detection of black holes is also indirect. The closest we get is high-energy radiation from the surrounding corona reflecting off the inner part of the accretion disk; the mass and spin of the black hole can then be diagnosed from X-ray spectral lines.
Wouldn’t it be nice to see the “stuff” of the universe without the intermediary of electromagnetic radiation? Wouldn’t it be great to directly perceive the warping of space-time? We could, if only we had “gravity eyes.” The best analogy for what that might be like for a person is telepathy. A brain is a lump of living tissue weighing about three pounds. In more detail, it’s an electrochemical network consisting of billions of neurons and trillions of connections between them. But this knowledge hasn’t told us where we store memories, emotions, momentary thoughts, or our sense of self. Seeing the universe in terms of gravity would be as profound as seeing someone else’s thoughts and feelings as they experience them.
From Einstein’s Monsters: The Life and Times of Black Holes. Courtesy of W.W. Norton. Copyright © 2018 by Chris Impey.