Should antidepressants be used for major depressive disorder?

From a 2019 review in BMJ Evidence-Based Medicine:

Conclusions: The benefits of antidepressants seem to be minimal and possibly without any importance to the average patient with major depressive disorder. Antidepressants should not be used for adults with major depressive disorder before valid evidence has shown that the potential beneficial effects outweigh the harmful effects.

Wikipedia notes:

[Harvard psychology professor Irving] Kirsch’s analysis of the effectiveness of antidepressants was an outgrowth of his interest in the placebo effect. His first meta-analysis was aimed at assessing the size of the placebo effect in the treatment of depression.[7] The results not only showed a sizeable placebo effect, but also indicated that the drug effect was surprisingly small. This led Kirsch to shift his interest to evaluating the antidepressant drug effect.

The controversy surrounding this analysis led Kirsch to obtain files from the U.S. Food and Drug Administration (FDA) containing data from trials that had not been published, as well as data from published trials. Analyses of the FDA data showed the average effect size of antidepressant drugs to be 0.32, clinically insignificant according to the National Institute for Health and Clinical Excellence (NICE) 2004 guidelines, which required a Cohen’s d of at least 0.50.[8] No evidence was cited to support this cut-off and it was criticised for being arbitrary;[9] NICE removed the specification of criteria for clinical relevance in its 2009 guidelines.[10][11]

Kirsch challenges the chemical-imbalance theory of depression, writing “It now seems beyond question that the traditional account of depression as a chemical imbalance in the brain is simply wrong.”[12] In 2014, in the British Psychological Society’s Research Digest, Christian Jarrett included Kirsch’s 2008 antidepressant placebo effect study in a list of the 10 most controversial psychology studies ever published.[13]

In September 2019, Irving Kirsch published a review in BMJ Evidence-Based Medicine which concluded that antidepressants are of little benefit in most people with depression and thus should not be used until evidence shows that their benefits are greater than their risks.
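
For context on the effect-size threshold discussed above: Cohen’s d expresses a drug-placebo difference in units of the pooled standard deviation, so Kirsch’s figure of 0.32 means treated patients improved, on average, by about a third of a standard deviation more than placebo patients, below the 0.50 that NICE’s 2004 guidelines treated as clinically relevant. A minimal sketch of the calculation in Python follows; the sample scores are invented for illustration and are not drawn from the FDA data.

from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    var_t, var_c = stdev(treatment) ** 2, stdev(control) ** 2
    # Pool the two sample variances, weighting each by its degrees of freedom.
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical improvement scores on a depression rating scale
# (made-up numbers, for illustration only).
drug    = [12, 9, 11, 14, 10, 13, 8, 12]
placebo = [10, 8, 9, 12, 9, 11, 7, 10]
print(f"Cohen's d = {cohens_d(drug, placebo):.2f}")  # about 0.89 for these invented samples

By the conventional benchmarks, a d of 0.2 is described as a small effect, 0.5 as medium, and 0.8 as large, which is why the 0.32 Kirsch found was read as falling short of NICE’s old cut-off.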

Marcia Angell writes in the New York Review of Books:

It seems that Americans are in the midst of a raging epidemic of mental illness, at least as judged by the increase in the numbers treated for it. The tally of those who are so disabled by mental disorders that they qualify for Supplemental Security Income (SSI) or Social Security Disability Insurance (SSDI) increased nearly two and a half times between 1987 and 2007—from one in 184 Americans to one in seventy-six. For children, the rise is even more startling—a thirty-five-fold increase in the same two decades. Mental illness is now the leading cause of disability in children, well ahead of physical disabilities like cerebral palsy or Down syndrome, for which the federal programs were created.

A large survey of randomly selected adults, sponsored by the National Institute of Mental Health (NIMH) and conducted between 2001 and 2003, found that an astonishing 46 percent met criteria established by the American Psychiatric Association (APA) for having had at least one mental illness within four broad categories at some time in their lives. The categories were “anxiety disorders,” including, among other subcategories, phobias and post-traumatic stress disorder (PTSD); “mood disorders,” including major depression and bipolar disorders; “impulse-control disorders,” including various behavioral problems and attention-deficit/hyperactivity disorder (ADHD); and “substance use disorders,” including alcohol and drug abuse. Most met criteria for more than one diagnosis. Of a subgroup affected within the previous year, a third were under treatment—up from a fifth in a similar survey ten years earlier.

Nowadays treatment by medical doctors nearly always means psychoactive drugs, that is, drugs that affect the mental state. In fact, most psychiatrists treat only with drugs, and refer patients to psychologists or social workers if they believe psychotherapy is also warranted. The shift from “talk therapy” to drugs as the dominant mode of treatment coincides with the emergence over the past four decades of the theory that mental illness is caused primarily by chemical imbalances in the brain that can be corrected by specific drugs. That theory became broadly accepted, by the media and the public as well as by the medical profession, after Prozac came to market in 1987 and was intensively promoted as a corrective for a deficiency of serotonin in the brain. The number of people treated for depression tripled in the following ten years, and about 10 percent of Americans over age six now take antidepressants. The increased use of drugs to treat psychosis is even more dramatic.

Marcia Angell follows up in the July 14, 2011 issue of the New York Review of Books:

One of the leaders of modern psychiatry, Leon Eisenberg, a professor at Johns Hopkins and then Harvard Medical School, who was among the first to study the effects of stimulants on attention deficit disorder in children, wrote that American psychiatry in the late twentieth century moved from a state of “brainlessness” to one of “mindlessness.” By that he meant that before psychoactive drugs (drugs that affect the mental state) were introduced, the profession had little interest in neurotransmitters or any other aspect of the physical brain. Instead, it subscribed to the Freudian view that mental illness had its roots in unconscious conflicts, usually originating in childhood, that affected the mind as though it were separate from the brain.

But with the introduction of psychoactive drugs in the 1950s, and sharply accelerating in the 1980s, the focus shifted to the brain. Psychiatrists began to refer to themselves as psychopharmacologists, and they had less and less interest in exploring the life stories of their patients. Their main concern was to eliminate or reduce symptoms by treating sufferers with drugs that would alter brain function. An early advocate of this biological model of mental illness, Eisenberg in his later years became an outspoken critic of what he saw as the indiscriminate use of psychoactive drugs, driven largely by the machinations of the pharmaceutical industry.

When psychoactive drugs were first introduced, there was a brief period of optimism in the psychiatric profession, but by the 1970s, optimism gave way to a sense of threat. Serious side effects of the drugs were becoming apparent, and an antipsychiatry movement had taken root, as exemplified by the writings of Thomas Szasz and the movie One Flew Over the Cuckoo’s Nest. There was also growing competition for patients from psychologists and social workers. In addition, psychiatrists were plagued by internal divisions: some embraced the new biological model, some still clung to the Freudian model, and a few saw mental illness as an essentially sane response to an insane world. Moreover, within the larger medical profession, psychiatrists were regarded as something like poor relations; even with their new drugs, they were seen as less scientific than other specialists, and their income was generally lower.
