Comfortably Numb
$26.00
Description
Charles Barber was educated at Harvard and Columbia and worked for ten years in New York City shelters for the homeless mentally ill. The title essay of his first book, Songs from the Black Chair, won a 2006 Pushcart Prize. His work has appeared in The New York Times and Scientific American Mind, among other publications, and on NPR. He is a lecturer in psychiatry at the Yale University School of Medicine and lives in Connecticut with his family.
In 1988, almost by accident, I began working with homeless people suffering from mental illness in New York City. This was meant to be a short-term vocation, a year at most. But for the next fourteen years, I worked with the homeless mentally ill in Manhattan in a variety of settings–first on the streets, then in shelters, then in supportive residential programs. All of my clients suffered from, as the psychiatric textbooks put it, “severe and persistent mental illness.” That is, they were diagnosed with various forms of schizophrenia, extreme mood disorders such as bipolar disorder and major depression, and a range of personality disorders. Most of my clients had been or were addicted to some combination or other of alcohol, heroin, crack, cocaine, benzodiazepines, and PCP. A very large percentage had chronic physical ailments like diabetes, HIV, and hepatitis. Despite the rather remarkable burden of their collective afflictions, my clients were also often engaging, interesting, and without exception astonishingly resilient.
To quell their unruly moods and their troublesome delusions and hallucinations, my patients were taking all manner of psychiatric medications. Some of these medications had been around since the 1950s and 1960s–mood stabilizers like lithium, antipsychotics such as Haldol and Thorazine–while others, at the time, were brand-new, with strange and exotic names like Prozac, Paxil, and Zoloft. Each year over the course of the 1990s, new psychiatric medications were introduced and consumed en masse by my clients. Some of these new medications arrived with great fanfare and extremely high expectations. In particular, a class of agents called “atypical antipsychotics”–Risperdal, Clozaril, and Zyprexa are the best known–had been shown in early clinical studies to be far superior to the Haldols and the Thorazines. Overnight, it seemed, almost all patients were converted to these new drugs, as well as new-generation antidepressants and mood stabilizers. It was not at all unusual for my clients to be taking three, four, five, or six different types of psychiatric drugs in a given day–a combination not unlike the number of street drugs many of them had once been addicted to.
I am not a psychiatrist. My job was first as a counselor, then as director of a number of clinical and residential programs, and finally as a researcher at medical schools. But I became oddly enthralled by the ongoing parade of medications that entered my clients’ mouths (and sometimes their arms, via injection). I became deeply immersed in the sheer zeitgeist of all that was involved in their ever more complex pharmacological regimens: from the monthly filling of their multiple prescriptions (which would have cost hundreds and hundreds of dollars if not paid for by Medicaid); to the cheerful colors and happy-sounding, near-poetic names of the drugs (and the colors became more vibrant and the names more poetic as the 1990s wore on); to the regular visits of perky drug reps ready to hand out free meals, pens, calendars, and coffee cups to anybody who would listen; and, not least, to the complicated and broadly variable impact of the drugs on my clients’ symptoms, personalities, and physical health. The influence of the drugs ranged greatly: from near-miraculous apparent “cures,” to therapeutic numbing, to no effects whatsoever, to, in one case, a near-fatal attack.
In the late 1980s, when I told people outside the field about my work–say, friends at cocktail parties in suburban, upper-middle-class Connecticut, where I grew up–no one seemed to quite comprehend what I did for a living. The prevailing tenor of these conversations was one of confusion. It would require real effort on my part to explain to these highly educated, eminently bourgeois people the nature of the problems which my clients faced. Sipping white wine, my friends and the friends of my parents struggled to grasp terms like “bipolar disorder” and “schizophrenia.” If I happened to mention the medications my clients were taking, the names fell upon barely comprehending ears. (Or I would be asked, in so many words: “Those are those zombie meds that they gave out in the old mental hospitals, aren’t they?”) It was also evident that these professors and lawyers and businesspeople, while not lacking in compassion, suspected my clients of having taken way too many drugs and/or being possessed of a seedy moral shiftlessness. The consensus was that while my clients were no doubt victims of multiple forms of injustice, their own characterological defects or weaknesses were the primary cause of their problems. I was sure that I was suspected of slumming–of immersing myself in a noble but deeply unsuitable venture for someone of my background and education. Returning to my work on the streets and in the shelters of New York City, I felt that what I was doing was at the very margins of American society.
But by the end of the 1990s, at these same cocktail parties, not only did people enthusiastically appreciate what I did, they were likely to share with me in no longer hushed tones that their friend or son or “someone very close to me” was suffering from depression or some other major psychiatric illness, and many were taking many of the same drugs my clients were. Words like Prozac and Paxil and lithium were tossed around along with the salted peanuts and the shrimp. Upon learning of the nature of my work, people would gather around, and I would be solicited for advice on various technical questions, like how long it took for Zoloft to fully enter the bloodstream and the advantages of Depakote as compared to lithium, or whether a neighbor’s behavior was classically bipolar or merely hypomanic. Everyone seemed to be filled with a new and abrupt compassion for my clients, who, it was now universally agreed, were–of course how could it be otherwise!–suffering from chemical imbalances and inner torments that, while unseen, were as physiological and real as diabetes or cancer. My career choice was to be applauded, and it was universally agreed that I was engaged in something important and meaningful. Even the terminology had changed: I no longer worked in shelters with psychotic people, but in the brand-new shiny field of “Mental Health.”
There was of course nothing unique about the cocktail parties I was attending. These same discussions and attitudinal changes were flowing vigorously through the popular culture. Over the 1990s, mental illness became highly visible and quite suddenly almost chic. One celebrity after another confessed to their long-secret psychiatric anguishes. A movie about a genius with paranoid schizophrenia, A Beautiful Mind, won multiple Oscars. Radio talk show hosts began to talk semiknowledgeably about borderline personality disorder and the differences between SSRI and MAOI antidepressants. The wife of the vice president of the United States, Tipper Gore, revealed in a national op-ed that she had suffered from clinical depression. In 1999, Bill Clinton convened a high-profile summit meeting on the nation’s mental health, and his surgeon general released the first report on that topic. Even George W. Bush, not typically known for his progressive stances, issued his own remarkably forward-looking report on mental health in 2002 and publicly supported “mental health parity”–equality in the insurance coverage of physical and mental ailments. Bush declared: “Political leaders, health-care professionals, and all Americans must understand and send this message: mental disability is not a scandal–it is an illness.” In a reflection of how much support for “mental health” has moved to the fore, in 2004 California passed Proposition 63, “the millionaires for mental health tax,” by which citizens with personal income over $1 million were levied an additional tax to fund the expansion of public psychiatric services. Over the course of the 1990s and into the 2000s, there was a widespread optimism about the new medications and about the curative possibilities of advances in psychiatry in general. I would attend psychiatric conferences and hear breathless predictions that the genetic causes of schizophrenia would be identified within a decade, and a cure would follow not long thereafter.
But the biggest sign of the newfound acceptance and fascination with the new biological psychiatry lay within our own bloodstreams. Each year more and more Americans were taking psychiatric medications, particularly the SSRI (selective serotonin reuptake inhibitor) antidepressants like Zoloft, Paxil, and Prozac.
What started as a drip developed into a stream, a river, and then a torrent. Introduced to the market in 1988, Prozac appeared on the cover of Newsweek in 1990 (“A Breakthrough Drug for Depression”); was the subject of a best-selling and extremely influential book, Peter Kramer’s Listening to Prozac, in 1993; exceeded a billion dollars in sales in 1995; and, that same year, again graced the Newsweek cover, this time with even greater claims (“Shy? Forgetful? Anxious? Fearful? Obsessed? How Science Will Let You Change Your Personality with a Pill”). Prozac famously launched the concept of “cosmetic psychopharmacology,” or the use of drugs for people who are patently not ill. In an oft-quoted passage from Listening to Prozac, Peter Kramer wrote, “With Prozac I had seen patient after patient become . . . better than well. Prozac seemed to give social confidence to the habitually timid, to make the sensitive brash, to lend the introvert the social skills of a salesman . . .” In addressing lifestyle issues rather than actual diseases, Prozac vastly expanded its market base and paved the way for a succession of lifestyle-enhancing medications–Viagra, most notably; Lipitor and other cholesterol medications; a series of other psychiatric drugs–which have overwhelmingly driven Big Pharma’s profits over the last decade.
It worked beyond anybody’s expectations. Sales of Prozac hit $2 billion in 1998. By 2002, more than 11 percent of American women and 5 percent of American men were taking antidepressants, which amounted to about 25 million people. Ultimately, Prozac became the best-selling drug in the history of the pharmaceutical industry. No one knows how many people in America have tried antidepressants at one point or another, but given that only about a quarter of people who start a course of antidepressants continue to take them for longer than ninety days, it is entirely conceivable that sixty, seventy, or eighty million Americans may have taken them.
And despite an FDA warning in 2004 of the increased risk of suicidal behavior for young people associated with SSRI antidepressants and a great deal of public relations bloodying of Big Pharma in 2006 and 2007, antidepressant use actually went up during that period. In 2006, 227 million antidepressant prescriptions were dispensed to Americans–more than any other class of medication–and up by 30 million prescriptions since 2002.
While there were many positive aspects to this shift (the most important being the reduction in stigma toward the people with whom I worked), the rapidity with which the transformation occurred and the certainty with which the new consensus was endorsed were slightly bizarre to me. While the new attitudes about psychiatry were a vast improvement, representing a far more enlightened and realistic set of beliefs about what mental illness is really like, they also seemed in some ways just as sketchily and prematurely arrived at as the earlier convictions. There was, in short, a newfound faddishness about the whole phenomenon. It felt as if, overnight, the divide between my homeless, mentally ill clients and the wealthy, prosperous denizens of suburban America–who had heretofore lived in such separate worlds that it was hard for me to fathom that they occupied the same continent–was gone. The cocktail party set and the homeless mentally ill were now inextricably linked, not least by the pharmaceuticals that ran through their bloodstreams.
Many of the greatest advances in psychiatry in the last two decades have actually occurred outside the realm of psychotropic medications: in particular, a series of extraordinarily elegant and subtle refinements in social and therapeutic techniques, such as cognitive-behavioral therapy (CBT), that have produced outcomes that would be the envy of most drug trials. For example, CBT has often been shown to be more effective than antidepressants in treating mild and moderate depression, and with a significantly lower recurrence rate. CBT has also been used with an impressively high degree of effectiveness for a dizzying number of conditions–among them, bulimia, hypochondriasis, obsessive-compulsive disorder, substance abuse, posttraumatic stress disorder, even as a proven means of reducing criminal behavior. Cognitive-behavioral treatments have been shown in analyses to reduce criminal recidivism by 25 percent. One can only imagine the hype that would surround a pill if it was found to reduce criminal behavior at such a rate. Ads for “Pacify,” as it might be called, would dominate the airwaves, and the public debate, for years.
Furthermore, two innovative treatment approaches–the Stages of Change model and Motivational Interviewing–have provided an entirely new paradigm of how caregivers conceive of the process by which people change and how to motivate them to do so. Their tenets, in a nutshell, are that change should be viewed as a cyclical process rather than a linear one; that the job of changing is the responsibility of the patient, not the caregiver, thereby putting the recipient of care “in charge” and reversing the centuries-old hierarchical construct of the doctor-patient relationship; and that the approach the caregiver takes in assisting the client to change must vary according to the client’s “stage of change”–that is, the client’s insight and motivation to move forward on a particular problem. The positive outcomes of these various approaches in alleviating some of the most intractable of human problems–such as addictions and the most severe mental illnesses, like schizophrenia–have been proven repeatedly and spectacularly. But no one outside the field even knows about these alternative approaches. Why should they? There are no products associated with these developments to sell to the masses, no billions to be made on Wall Street.
Furthermore, research on the brain has revealed the unexpected “plasticity” of the organ–i.e., its capacity to change its function and structure throughout life. Brain imaging has shown that social and psychological experiences exert measurable changes in the brain. Specifically, learning and social experience–such as psychotherapy, for what is psychotherapy other than a particularly intensive form of learning?–are capable of producing changes in the brain at the level of neuronal and synaptic connections, and thereby in the functioning of nerve cells. There is no longer any doubt that psychotherapy can change the brain at functional and structural levels. That is, in perhaps the greatest irony of the neuro revolution, psychotherapy can be viewed as a “biological” treatment, along with pharmaceutical approaches.
But these exciting, if at times complex and nuanced, advances have gone almost completely unnoticed by the media and the public, and underappreciated even within the field of psychiatry. In the last two decades, there has been a tremendous bias in academic psychiatry against psychological and social forms of inquiry. The psychosocial realm is tolerated but often barely so–viewed as well-intentioned and sort of cute, but ultimately and soundly relegated to the margins, literally and figuratively. The programs at which I have worked at the Yale and Columbia medical schools that were engaged in such social approaches were typically off-campus, sometimes in near-derelict settings.
In this era, one could hardly get an article published in an academic journal of psychiatry that was purely qualitative (i.e., didn’t include statistics), or that told the story of an individual patient, or that included any personal thoughts or feelings on the part of the authors about the people or the work they were engaged with. All that would be deemed not appropriately robust for the new standards of the profession. In our particularly American zeal for simple explanations, quick fixes, and overwhelming the enemy with technology, we’ve too quickly lost sight of the centrality of social and environmental factors. And despite undeniable progress in the pharmacological realm, the enduring truth is that the human factor, and the human approach, remains critical to healing.
Meanwhile, back in the shelters in New York City, my clients remained essentially unchanged by the pharmacological advances which had surrounded them over the 1990s. In the early 2000s, they were suffering from the same exact set of monstrous afflictions that had beset them a decade earlier. The new medications had brought some relief, and in a few cases dramatic improvements, but in general they were a disappointment. Analyses over the last couple of years have shown that the new medications are no more effective, or only minimally more effective, than the old ones, while costing at least ten times as much. Additionally, some of the new antipsychotics cause rapid and intense weight gain, leading to high rates of diabetes. Two massive government studies released in 2006 on the “real world” efficacy (as opposed to that reported in clinical trials) of both antidepressants and antipsychotics showed that most patients do not get better taking the drugs. Only about a third of patients taking antidepressants, for example, improved dramatically after a first trial. Altogether, results were not much better than the outcomes of placebo studies. Said Robert Freedman, M.D., editor-in-chief of the American Journal of Psychiatry: “The results of STAR*D [the depression trial] continue to be sobering. . . . the rate of remission continues to be quite low, which underscores the persistence of depression, and its resistance to current treatments.”
In some ways, things were actually worse by the 2000s. The emergence of managed care–the handmaiden of the medication revolution–had severely shortened hospital stays. Under managed care, psychiatric hospitalization came to be viewed not as an opportunity to work on treatment issues or to arrive at a thoughtful discharge plan, but primarily as a place to tinker with medications. Once a medication regimen was arrived at, and patients were no longer considered unsafe, they were out, sometimes on the streets. My clients were routinely hospitalized and rehospitalized, and discharged and redischarged, always before they were ready. This caused incalculable confusion and pain.
No, it was the people at the cocktail parties who had changed–not my clients.
Going back to New York, I would wonder: How did this happen, and in such a short time? How did attitudes about mental illness . . . no, “mental health”–change so quickly? Back in the shelters, I would wonder: How did biological psychiatry get to be so omnipresent, so powerful, so hip, so quickly?

Public perceptions of mental health issues have changed dramatically over the last fifteen years, and nowhere is this more apparent than in the rampant overmedication of ordinary Americans. In 2006, 227 million antidepressant prescriptions were dispensed in the United States, more than any other class of medication; in that same year, the United States accounted for 66 percent of the global antidepressant market. In Comfortably Numb, Charles Barber provides a much-needed context for this disturbing phenomenon.
Barber explores the ways in which pharmaceutical companies first create the need for a drug and then rush to fill it, and he reveals the increasing pressure Americans are under to medicate themselves (direct-to-consumer advertising, fewer nondrug therapeutic options, the promise of the quick fix, the blurring of the distinction between mental illness and everyday problems). Most importantly, he convincingly argues that without an industry to promote them, non-pharmaceutical approaches that have the potential to help millions are tragically overlooked by a nation that sees drugs as an instant cure for all emotional difficulties.
Here is an unprecedented account of the impact of psychiatric medications on American culture and on Americans themselves.
“In Charles Barber’s compelling new book, ‘Comfortably Numb: How Psychiatry Is Medicating a Nation,’ the author contends that we underwent a major shift in attitudes toward mental illness and medications . . . Barber brings a street-smart perspective to all this . . . [and he] offers something several of the other books don’t: practical, therapeutic alternatives to antidepressants.”
—Salon.com
“A fine, informed writer on cultural history as well as neuroscience, psychotherapy, and economics, Barber convincingly argues against the overprescription of psychiatric drugs in the United States and sums up the history of U.S. psychiatry from the asylum to the community to glitzy but still elementary neuroscience. A blockbuster essential for all libraries.”
—Library Journal (starred review)
“A sharply critical look at the way antidepressants are marketed and prescribed in the United States . . . Barber articulately and persuasively counsels that it’s time to abandon the quick-fix, pop-a-pill approach.”
—Kirkus
“Comfortably Numb chronicles the extraordinary psychopharmaceuticalization of everyday life that has arisen in recent years and appears to be growing apace. Barber marks out the inconvenient truths on our path to emotional climate change but also offers alternatives to readers who wish to avoid pharmageddon.”
—David Healy, author of Let Them Eat Prozac
“In this passionate yet fair-minded book, Charles Barber explores the disturbing medicalization and medication of unhappiness in America today. The author understands that while medication has an important role to play in the treatment of severe mental illnesses such as schizophrenia, Big Pharma has seduced Americans into believing they need drugs for the normal sorrows of life. Almost 70 percent of antidepressants worldwide are sold in the U.S. The author asks the critical question of whether Americans are crazier than the rest of the world or whether we have simply developed a crazy dependency on legal drugs.”
—Susan Jacoby, author of The Age of American Unreason
Additional information
Weight: 1 oz
Dimensions: 1 × 6 × 10 in