The Dangers of Brain Science Overdetermining Legal Outcomes
Jed S. Rakoff on Eugenics, Lobotomy, and Psychoanalysis
As you sit reading this, you probably experience an internal voice, unheard by any outsider, that verbally repeats the words you see on the page. That voice (which, in your case, speaks perfect English) is part of what we call your conscious mind. And the physical organ that causes what you see on the page to be simultaneously voiced internally is what we call your brain. The scientific study of how the brain relates to the mind is what we call cognitive neuroscience.
The brain is an incredibly complex organ, and for most of modern history it has defied serious scientific study. But the development in the past few decades of various technologies that enable us to observe certain operations of the brain, collectively called brain scans, has considerably increased our knowledge of how brain activities correlate with various mental states and behaviors.
The law, for its part, is deeply concerned with mental states, particularly intentions, and how they relate to behavior. As Justice Oliver Wendell Holmes, Jr., famously put it, “Even a dog distinguishes between being stumbled over and being kicked.” Distinctions of intent frequently determine, as a matter of law, the difference between going to prison and going free. Cognitive neuroscience holds out the promise of helping us to perceive, decide, and explain how intentions are arrived at and carried out. In theory, therefore, cognitive neuroscience could have a huge impact on the development and refinement of the law.
But there is reason to pause. Cognitive neuroscience is still in its infancy, and much of what has so far emerged that might be relevant to the law consists largely of hypotheses that are far from certainties. The natural impulse of forward-thinking people to employ the wonders of neuroscience in making the law more “modern” and “scientific” needs to be tempered with a healthy skepticism, or some dire results are likely. Indeed, the history of using “brain science” to alter the law is not a pretty picture. A few examples will illustrate the point.
In the early twentieth century, a leading “science” was eugenics, which put forward, among other ideas, a genetic theory about the brain. Eugenics also had philosophical accompaniments similar to social Darwinism (and, coincidentally, was first developed by Darwin’s half cousin, Francis Galton). Eugenicists claimed to “prove” that certain deleterious mental states, most notably “feeblemindedness,” were directly inheritable. It followed that the frequency of such unfortunate states of mind could be greatly reduced by prohibiting the carriers of the defective genes from procreating. Not only would this be advantageous to society as a whole, but it would also reduce the number of people destined for a life of misery and dependency because of their feeblemindedness.
So convincing was this argument, and so attractive its “scientific” basis, that eugenics quickly won the support of a great many famous people across the political spectrum, including Alexander Graham Bell, Winston Churchill, W.E.B. Du Bois, Havelock Ellis, Herbert Hoover, John Maynard Keynes, Linus Pauling, Theodore Roosevelt, Margaret Sanger, and George Bernard Shaw. Many of the major universities in the United States included a eugenics course in their curriculum.
The widespread acceptance of eugenics also prepared the way for the enactment of state laws that permitted the forced sterilization of women thought to be carriers of “feeblemindedness.” At first such laws were controversial, but in 1927 they were held constitutional by a nearly unanimous Supreme Court in the infamous case of Buck v. Bell. Writing for the eight justices in the majority (including such notables as Louis D. Brandeis, Harlan F. Stone, and William Howard Taft), Justice Oliver Wendell Holmes, Jr., found, in effect, that the Virginia state legislature was justified in concluding (based on eugenics) that imbecility was directly heritable, and that the findings of the court in the case showed that not just Carrie Buck but also her mother and illegitimate child were imbecilic. It was therefore entirely lawful to sterilize Buck against her will, because, in Holmes’s words, “Three generations of imbeciles are enough.”
In the first half of the twentieth century, more than fifty thousand Americans were sterilized on the basis of eugenics-based laws. It was not until Adolf Hitler became a prominent advocate of eugenics, praising it in Mein Kampf and repeatedly invoking it as a justification for his extermination of Jews, Gypsies, and gays, that the doubtful science behind eugenics began to be subjected to widespread criticism.
Yet even as eugenics began to be discredited in the 1940s, a new kind of “brain science” began to gain legal acceptance: lobotomies. A lobotomy is a surgical procedure that cuts the connections between the prefrontal cortex (the part of the brain most associated with cognition) and the rest of the brain (including the parts more associated with emotions). From the outset of its development in the 1930s, it was heralded as a way to rid patients of chronic obsessions, delusions, and other serious mental problems; indeed, many such results were initially reported. It was also regarded as the product of serious science, to the point that its originator, the Portuguese neurologist António Egas Moniz, shared the Nobel Prize for Medicine in 1949 in recognition of its development.
Lobotomy science was so widely accepted that in the United States alone at least forty thousand lobotomies were performed between 1940 and 1965. While most of these were not court ordered, the law, accepting lobotomies as sound science, required only the most minimal consent on the part of the patient; often the patient was a juvenile and the “consent” was provided by the patient’s parents. Lobotomies were also performed on homosexuals, who, in what was the official position of the American psychiatric community until 1973, suffered from a serious mental disorder by virtue of their sexual orientation.
Nonetheless, some drawbacks to lobotomies were, or should have been, evident from the outset. About 5 percent of those who underwent the operation died as a result. A much larger percentage were rendered, in effect, human vegetables, with a limited emotional life and decreased cognition. But many of these negative results were kept secret. For example, it was not until John F. Kennedy ran for president that it became widely known that his sister Rosemary had become severely mentally incapacitated as a result of the lobotomy performed on her in 1941, at her father’s behest, when she was twenty-three years old.
Still, by the early 1960s, enough of the bad news had seeped out that lobotomies began to be subject to public scrutiny and legal limitations. Eventually, most nations banned lobotomies altogether; but they are still legal in the United States and some other countries in limited circumstances.
While U.S. law in the mid-twentieth century tolerated lobotomies, it positively embraced psychiatry in general and Freudian psychoanalysis in particular. This was hardly surprising, since, according to Professor Jeffrey Lieberman, former president of the American Psychiatric Association, “by 1960, almost every major psychiatry position in the country was occupied by a psychoanalyst” and, in turn, “the psychoanalytic movement had assumed the trappings of a religion.” In the judicial establishment, the original high priest of this religion was the brilliant and highly influential federal appellate judge David L. Bazelon.
Having himself undergone psychoanalysis, Bazelon, for the better part of the 1950s and 1960s, sought to introduce Freudian concepts and reasoning into the law. For example, in his 1963 opinion in a robbery case called Miller v. United States, Bazelon, quoting from Freud’s article “Psychoanalysis and the Ascertaining of Truth in Courts of Law,” suggested that judges and juries should not infer a defendant’s consciousness of guilt from the fact that the defendant, confronted by a victim with evidence that he had stolen a wallet, tried to flee the scene. Rather, said Bazelon, “Sigmund Freud [has] warned the legal profession [that] ‘you may be led astray . . . by a neurotic who reacts as though he were guilty even though he is innocent.’” If Freud said it, it must be right.
In time, much of the analysis Bazelon used to introduce Freudian notions into the law proved both unworkable as law and unprovable as science, and some of his most important rulings based on it were eventually discarded, sometimes with his own concurrence.
Ultimately, Bazelon himself became disenchanted with psychoanalysis. In a 1974 address to the American Psychiatric Association, he denounced certain forms of psychiatric testimony as “wizardry” and added that “in no case is it more difficult to elicit productive and reliable expert testimony than in cases that call on the knowledge and practice of psychiatry.” But this change of attitude of Bazelon, and others, came too late to be of much use to the hundreds of persons who had been declared incompetent, civilly committed to asylums, or otherwise deprived of their rights on the basis of what Bazelon subsequently denounced as “conclusory statements couched in psychiatric terminology” (a form of testimony that persists to this day in many court proceedings).
A still more recent example of how the law has been misled by what once passed for good brain science must be mentioned, since it bears on the conviction of innocent persons described in previous chapters. Beginning in the 1980s, a growing number of prominent psychotherapists advocated suggestive techniques to help their patients “recover” supposedly repressed memories of past traumas, such as childhood incestuous rapes. Eventually, in the early 1990s, more than a hundred people were prosecuted in the United States for sexual abuse based on such retrieved memories, and even though in most of these cases there was little or no other evidence, more than a quarter of the accused were convicted.
But also beginning in the early 1990s, careful studies undertaken by memory experts, most prominently Professor Elizabeth Loftus, showed that many of the techniques used in helping people to recover repressed memories had the ability to implant false memories in them, thus casting doubt on the entire enterprise. Although at first met with resistance, the work of Loftus and other serious scientists was so rigorous and convincing that it prevailed in the end. But while many of those convicted on the basis of recovered memory evidence were then released, others were not, and some may still be in prison.
It is only fair to note that, just as the law has often been too quick to embrace, and too slow to abandon, the accepted brain science of the moment, it is equally the case that the law has sometimes asked of brain scientists more than they are equipped to deliver. Consider, for example, the process of civil commitment, by which persons with serious mental disorders are involuntarily confined to psychiatric wards, mental facilities, insane asylums, and the like. Although these facilities (many of which have now been shut down) are in some respects like prisons, especially since the patients are not free to leave the premises and must follow the orders of their keepers, commitment to these facilities differs from commitment to prison in that the committed individuals are confined for an indefinite period (sometimes forever) until their treatment is sufficiently successful as to warrant their release. Moreover, the commitment comes about not by their being criminally convicted by a jury on proof beyond a reasonable doubt, but rather by a judge making a determination—almost exclusively on the basis of psychiatric testimony—that it is more likely than not that they meet the civil standard for commitment.
What is that standard? In most American jurisdictions prior to 1960, the standard was that the individual was “in serious need of [mental] treatment.” This standard had the virtue of being one that was acceptable to most psychiatrists, for a large part of their everyday practice was determining who needed mental treatment and of what kind. But the standard suffered from a vagueness and concomitant arbitrariness that troubled courts and legislators. Accordingly, beginning in the mid-1960s, it was replaced, in nearly all jurisdictions, by the current standard: that a particular person, by reason of mental problems, “is a danger to himself or others.”
What this means, in practice, is that the psychiatrist testifying at a civil commitment hearing must make a prediction about whether the person is likely to engage in violence. But if there is one thing psychiatrists are not very good at, it is predicting future violence. Indeed, in an amicus brief submitted to the Supreme Court in 1980, the American Psychiatric Association reported that its members were frequently no better than laypeople in predicting violence. The “future danger” test, it argued, was therefore not a very useful one.
In subsequent years, psychiatrists and others have tried to develop more refined instruments for predicting danger, but with only limited success. Yet it remains the test, and the law thus forces psychiatrists called to testify at a civil commitment hearing to make the very prediction they have stated officially they cannot make.
Clearly, then, brain science, or what has passed for it, has had a problematic impact on the law, suggesting that proposed interactions should be approached with caution. Which brings us back to cognitive neuroscience. It cannot be doubted that cognitive neuroscience has made considerable advances in recent years. But despite the considerable publicity it has received, it is still not able to produce well-tested, reliable procedures for detecting and measuring specific mental states in specific individuals. Neuroscience has something to offer the legal system and, given the rapid rate of its current development, may have even more to offer in the future. For now, however, the lessons of the past suggest that a too-quick acceptance by the legal system of the latest neuroscientific discoveries may be fraught with danger.
__________________________________
From Why the Innocent Plead Guilty and the Guilty Go Free by Jed S. Rakoff. Used with the permission of Farrar, Straus and Giroux. First appeared in The New York Review of Books. Copyright © 2021 by Jed S. Rakoff. All rights reserved.