A History of Psychiatry: From the Era of the Asylum to the Age of Prozac

Edward Shorter writes in this 1998 book:

* For historians of psychiatry who wrote 30 or 40 years ago—the last time anyone attempted an overview of the discipline—the story seemed relatively straightforward. First there were those wicked biological psychiatrists in the nineteenth century, then psychoanalysts and psychotherapists came along to defeat the biological zealots, establishing that mental illness resulted from unhappiness in childhood and stress in adult life. Freud’s insights opened a new frontier in our understanding of mental illness and little more needed to be said. Between the 1950s and the 1990s, a revolution took place in psychiatry. Old verities about unconscious conflicts as the cause of mental illness were pitched out and the spotlight of research turned on the brain itself. Psychoanalysis became, like Marxism, one of the dinosaur ideologies of the nineteenth century. Today, it is clear that when people experience a major mental illness, genetics and brain biology have as much to do with their problems as do stress and their early-childhood experiences. And even in the quotidian anxieties and mild depressions that are the lot of humankind, medications now can lift the symptoms, replacing hours of aimless chat. If there is one central intellectual reality at the end of the twentieth century, it is that the biological approach to psychiatry—treating mental illness as a genetically influenced disorder of brain chemistry—has been a smashing success. Freud’s ideas, which dominated the history of psychiatry for the past half century, are now vanishing like the last snows of winter. The time has therefore come for a new look.

* Part and parcel of European culture, the fateful notion of degeneration was picked up by the eugenists, by social-hygienists intent on combating mental retardation with sterilization, and by antidemocratic political forces with a deep hatred of “degenerate” groups such as homosexuals and Jews. Psychiatry’s responsibility for all this is only a partial one. Academic psychiatrists in the 1920s were not generally associated with right-wing doctrines of racial hygiene, though there were exceptions to this, such as the Swiss psychiatrist Ernst Rüdin who after 1907 worked at the university psychiatric clinic in Munich, and the Freiburg professor Alfred Hoche who in 1920 coauthored a justification for euthanasia. 101 Academic medicine in Germany on the whole stood waist-deep in the Nazi sewer, and bears heavy responsibility for the disaster that followed. After 1933, degeneration became an official part of Nazi ideology. Hitler’s machinery of death singled out Jews, people with mental retardation, and other supposedly biological degenerates for campaigns of destruction.102 The Nazi abuse of genetic concepts rendered any discussion of them inadmissible for many years after 1945. The notions of degeneration and inheritability became identical in the minds of the educated middle classes. Both were synonymous with Nazi evil. After World War II, any reference to the genetic transmission of psychiatric illness, whether as one factor among many or as inexorable degeneration, became taboo. The mere discussion of psychiatric genetics would, in civil middle-class dialogue, be ruled out of court for decades to come.

* The first biological psychiatry as a clinical approach died long before the Nazis. It was not necessarily discredited by research findings. That’s not the way paradigms change within medicine. People simply lost interest in brain anatomy once a new way of looking at psychiatric illness appeared on the horizon. The new approach saw illness vertically rather than cross-sectionally…

* Many histories of psychiatry see psychoanalysis as the end point of the story, the goal to which all previous events had been marching. Yet with the hindsight of half a century since Freud’s death in 1939, we are able to achieve a different perspective, in which psychoanalysis appears not as the final chapter in the history but as an interruption, a hiatus. For a brief period at mid-twentieth century, middle-class society became enraptured by the notion that psychological problems arose as a result of unconscious conflicts over long-past events, especially those of a sexual nature. For several decades, psychiatrists were glad to adopt this theory of illness causation as their own, especially because it permitted them to shift the locus of psychiatry from the asylum to private practice. But Freud’s ideas proved short-lived. In the longer perspective of history, it was only for a few moments that the patient recumbent upon the couch, the analyst seated silently behind him, occupied the center stage of psychiatry. By the 1970s, the progress of science within psychiatry would dim the lights on this scenario, marginalizing psychoanalysis within the discipline of psychiatry as a whole. In retrospect, Freud’s psychoanalysis appears as a pause in the evolution of biological approaches to brain and mind rather than as the culminating event in the history of psychiatry.

Yet it was a pause of enormous consequence for psychiatry. Freud’s psychoanalysis offered psychiatrists a way out of the asylum. The practice of depth psychology, based on Freud’s views, permitted psychiatrists for the first time in history to establish themselves as an office-based specialty and to wrest psychotherapy from the neurologists. Moreover, psychiatrists aspired to a monopoly over this new therapy. In the mind of the public, psychotherapy and psychoanalysis became virtually synonymous. If patients wanted one of the fashionable new depth therapies they would have to go to a psychiatrist for it, for the American Psychoanalytic Association initially insisted that only MDs could be trained as analysts, and later that only psychiatrists could be so. In retrospect, this insistence was bizarre, for psychoanalysis required no more medical training than astrology, and the attempt to impose a medical monopoly over Freud’s technique was a self-interested ploy to exclude psychologists, psychiatric social workers, and other competitors from the newly discovered fountain of riches.

Ultimately, psychoanalytically oriented psychiatrists were unable to preserve their monopoly. After the 1960s, all manner of nonmedical types demanded admission to the training institutes, for there was no intrinsic reason why professors of English could not do analysis as well as psychiatrists. Even worse, what had previously passed for the scientific basis of psychoanalysis began to collapse. It could not be simultaneously true that one’s psychological problems were caused by an abnormal relationship to the maternal breast and by a deficiency of serotonin. As evidence began to accumulate on the biological genesis of psychiatric illness, psychiatry began to regain the scientific footing it had lost at the beginning of the analytic craze: The brain was indeed the substrate of the mind. By the 1990s a majority of psychiatrists considered psychoanalysis scientifically bankrupt. Thus Freud’s model of the unconscious and the elaborate therapeutic techniques he devised for laying bare its supposed contents failed to stand the test of time. Accordingly, analysis largely vanished from psychiatry, discredited as a medical approach to the problems of mind and brain, although nonmedical psychoanalysis continued to flourish. The whole affair turned out to be the artifactual product of a distinctive era. Psychoanalysis failed to survive because it was overtaken by science, and because the needs that it initially met became dulled in our own time.

* In psychoanalysis by its very nature, doctor and patient communicate in the enterprise of soul-searching, creating the suggestion that one is being cared for emotionally. Thus psychoanalysis became popular initially because it filled a sentimental gap in the consultation. It offered a doctor-patient relationship in which patients basked in what they believed to be an aura of concern.

Numerous physicians other than Freud understood these psychological cravings, but Freud was the first to elaborate a therapy that would appeal to middle-class sensibilities, in particular to the desire for leisurely introspection. Yet his theories possessed a powerful additional resonance because, owing to his own ethnic origin and social position, he had privileged access to a group of patients who were especially needy in psychological terms: middle-class Jewish women in families undergoing rapid acculturation to West European values.

…Although by 1860 every city in Western Europe had a contingent of Jews, the Jews of Vienna were distinctive in constituting virtually the city’s entire middle class. Whatever circle one examines—journalists, bankers, businesspeople, academics—all had a significant Jewish component by the end of the nineteenth century. This tremendous preponderance of Jews in the middle classes reflected the great social progress the Jews of Europe had made since the end of the eighteenth century, when they lived largely sequestered in the small towns of Poland, Russia, and the Ukraine. As a result of the Jewish emancipation of the nineteenth century, the small-town Jews of the east flocked to the cities of the west, using the high-school diploma as a launching pad for careers in the liberal professions. In 1890, for example, 33 percent of students at the Vienna University were of Jewish origin.1 Fully one half of the professors of Vienna’s medical faculty were Jewish.2 As many as two-thirds of the city’s physicians were Jews.3 Thus, rather than being marginalized or scorned for his ethnic background as some have claimed, the young Sigmund Freud found in Vienna an intensely Jewish setting where he had every prospect of advancement by dint of hard work.

* The early analysts became well known for searching out sexual material. Viennese psychiatrist Emil Raimann, who knew Freud and his patients well, complained that Freud was able to persuade these complaisant and easily suggestible young women to say anything he wished them to. “The patients who consult Freud know in advance the information he wants to extract from them. These are patients who have let themselves be convinced of the causal significance of their sexual memories. Individuals in whom sexual motives play no role are aware that they would consult Freud in vain.” (Raimann noted that in working-class families in Vienna there was plenty of sexual contact, even incest, but no hysteria. Yet among the closely guarded young women of the city’s better families, where there was no possibility of sexual trauma, hysteria flourished.)

…[the Freudians were] the only ones to offer a road map of how one got from sexual desire and repression of it to neurosis. On the basis of this map, psychoanalysis, a term Freud first used in 1896, would turn into a movement.17 It launched itself on the world as a group of doctrines comprising three main areas: study of the patient’s resistance to thoughts that attempted to press into the conscious mind from the unconscious; concentration on the causal significance of sexual matters; and an emphasis on the centrality of early childhood experiences.18 The core doctrine, from which Freud never wavered, was that neurotic symptoms represented a trade-off between sexual and aggressive drives and the requirements of reality.

* Freud was so intent on propagating his own views that, by turning psychoanalysis into a movement rather than a method of studying subrational psychology, he denied analysis the possibility of ever acquiring a scientific footing. The master’s insights were to become articles of faith, incapable of disproof. And the efforts of others to criticize Freud’s wisdom would always be considered evidence of “resistance,” of personal pathology, never as scientific hypotheses to be dealt with in the way that science treats all hypotheses. Alfred Adler fell away, as did Wilhelm Stekel, Freud’s physician-patient who had suggested establishing the Wednesday group in the first place. Such far-distant fans of analysis as the Zurich academics Carl Jung and Eugen Bleuler would soon turn heretic, as did later many others. The efforts of all these critical individuals to nudge Freud away from the bedrock of childhood sexuality on which he built his theories would fail. But a core of faithful remained. And it was these loyal captains who, in the belief that they possessed an inner truth, took psychoanalysis to the wide world.

* Did Freud and his followers really know these truths? Or were they simply self-suggesting one another into accepting highly dubious propositions as being somehow “confirmed”? Freud tended to see himself more as an adventurer than a scientist, once telling Fliess flatly, “I am actually not at all a man of science, not an observer, not an experimenter, not a thinker. I am by temperament nothing but a conquistador—an adventurer, if you want it translated—with all the curiosity, daring, and tenacity characteristic of a man of this sort.”20 His inner circle was rife with toadyism, for the other analysts were economically dependent on Freud for referrals. (He kept a pile of their calling cards in his drawer, and would dole them out to patients according to his whim.)21 “Freud never realized how much of a suggestive impact he had on his followers,” writes historian Paul Roazen, “and therefore could be led to think that his findings were being genuinely confirmed by independent observers.”22 The issue of validity would therefore haunt psychoanalysis until its eclipse within psychiatry.

* As psychoanalysis set out to take over psychiatry, therefore, it was with a doctrine that was therapeutically uncertain, intellectually highly speculative to say the least, and best adapted to the psychological needs of a deracinated group in transition: young middle-class Jewish women who aspired to be like their non-Jewish counterparts. It would be hard to imagine a therapy less appropriate for the needs of people with serious psychiatric illnesses.

* Given the intrinsic inappropriateness of psychoanalysis for psychiatry, there must have been some other force driving it forward in Europe than the power of the idea itself. That force was middle-class enthusiasm. Freud’s ideas proved tremendously popular among the educated classes as a codification of the kind of search for self-knowledge that had run through bourgeois culture throughout the entire second half of the century. Psychoanalysis was to therapy as expressionism was to art: Both represented exquisite versions of the search for insight.

* Regular psychiatrists were bemused at the grassfire spread of psychoanalysis within the middle classes. One physician at the Budapest psychiatric clinic tried to account for it along the following lines: “The flood of patients seeking salvation through psychoanalysis is explainable partly from the publicity, partly from the receptiveness of our time to introversion and introspection.” It was a procedure of obvious appeal to “hypersexual neurotics,” he said.31 Thus we have a core of physicians dubious, even contemptuous of “hypersexual neurotics” and their problems, and an educated middle class keening at the doctor’s office for further self-insight.

* What ultimately converted a chic therapeutic boomlet into a mass ideology shaping almost every aspect of American thought and culture was the Holocaust. In the 1930s, fascism drove many analysts who were Jewish from Central Europe to the United States, where they lent the stripling little American movement the glamour and heft of the wide world. On the face of it, this massive transfer of culture from the German-speaking world to the English had positive results for psychoanalysis, reinforcing the homespun American heterodoxy with the prestige of internationally acclaimed figures.81 In the long run, however, the migration of the European analysts proved fatal for psychoanalysis in the New World, for the refugees brought with them a stifling orthodoxy, a reflexive adherence to the views of Freud and his daughter Anna that American analysis was never able to outgrow and that ultimately caused, within medicine at least, its death from disbelief.

* American psychiatry before World War II was biological psychiatry and within a few years after the war it was largely a psychoanalytical psychiatry.

* From the viewpoint of the history of psychiatry, the vicissitudes of the Jews in the Old World and New were a matter of capital importance. The common theme linking the misadventures of psychoanalysis on both sides of the Atlantic was the desire of recently acculturated middle-class Jews for some symbol of collective affirmation. Although Freud sought mightily to downplay any kind of ethnic specificity in psychoanalysis, the subtext of Freud and his followers to the non-Jewish charter culture was: We Jews have given this precious gift to modern civilization.

Why would Jews need such a symbol any more than any other ethnic group? In the history of modern times, Jewish people have had to endure not just one but two great shocks. Every people that undertakes the long journey from small-scale life in the traditional village to middle-class life in the big city undergoes one major shock: the shock of assimilation and integration, the psychological upheaval that goes with newness of arrival. In their move from shtetl life in the small towns of eighteenth-century Poland and the Ukraine to such bustling cities as Berlin, Frankfurt, and Vienna, the Jews underwent this shock just as everybody else did.

But then a second shock lay in store for the Jews: the Holocaust, and the forced transplantation of hundreds of thousands of individuals who themselves had only recently become middle-class, from a comfortable and bourgeois European existence to the nightmare of scrambling for a passage to America. This second shock was experienced by no other cultural group.155 It profoundly shaped the desire of the American Jews for some kind of a special symbol of self-affirmation, a collective badge of pride in the chaos of the living city. That symbol, I argue, was psychoanalysis.

At the turn of the twentieth century, the Jews of Central Europe were experiencing the cultural confusion of a massive deracination. Between the 1860s and 1900, countless numbers of people were torn from the ghettoes and shtetls of Eastern Europe, without becoming as yet newly rooted among the middle classes of the West. Many of the Jews of Berlin and Vienna had left their religion behind and were rapidly trying to assimilate by changing their names and by converting to Protestantism (less so to Catholicism). Yet despite their best intentions, despite their knowledge of the plays of Schiller and of the refinements of the German language, they encountered a baffling wall of anti-Semitism. There was something about psychoanalysis that made it, according to historian John Cuddihy, a “plausible ideology for [a] decolonizing people.”156

Jewish patients with psychoneurosis were therefore drawn to it. Perhaps psychoanalysis was seized upon because it extended the possibility of finding one’s identity from within, as opposed to the external signposts that orthodox Judaism offered. And it may have appealed to Jewish women in particular. Perhaps these cloistered but well-read and highly curious women—members of a “middle-class drenched in spirit” in the words of Viennese novelist Robert Musil—were simply more self-reflective, more psychologically minded than the women of the non-Jewish lower-middle classes below them who worked alongside their husbands in shops, or the women of the nobility above, busy with the social whirl of the salon.157 Or perhaps Jewish men and women alike adored psychoanalysis because it was “our thing.” In any event, psychoanalysis in the early days had a very specific social address.

It was above all among the middle-class Jews of Berlin, Budapest, and Vienna that psychoanalysis proved such a hit. Historian Steven Beller finds the Jews of Vienna, as outsiders, using psychoanalysis to “make a political attack on Viennese society by an alliance of scientific rationality with instinct” against the city’s traditional sensual baroque culture.158 In Budapest, there were descriptions of psychoanalysis in the Jewish quarter, the Leopoldstadt, as an almost “incomprehensible and impenetrable secret doctrine or ceremony….”

Historian Paul Harmat concludes, “Psychoanalysis was most popular among enlightened Jewish circles as a result of their minority situation.” 159 Of course, non-Jews had recourse to analysis as well. Yet among patients, there seems to have been a kind of Jewish tropism. The analysts themselves also tended heavily to be Jewish, and many of them assumed that Jewishness helped one to appreciate Freud’s wisdom fully. As Freud said in 1908 to the Berlin analyst Karl Abraham, on the occasion of a malentendu with Carl Jung (then one of the few non-Jews in the movement), “Please be tolerant, and don’t forget that it is actually easier for you than for Jung to follow my ideas … because you stand closer to me as a result of racial affinity, while he, as a Christian and son of a pastor, finds the way to me only in the face of great inner resistance.” On another occasion, Freud reassured Abraham, “May I say that what attracts me to you are our related, Jewish characteristics. We understand each other.”160 Freud’s inner circle was almost entirely Jewish, and Ferenczi said to Freud of the one non-Jewish member, the Londoner Ernest Jones, “It has seldom been so clear to me as now what a psychological advantage it signifies to be born a Jew…. you must keep Jones constantly under your eye and cut off his line of retreat.”161

Within the middle-class Jewish public, psychoanalysis became signposted as belonging to some larger Jewish worldview. Humorist Salomo Friedländer, writing in the 1920s under the pseudonym “Mynona,” made analysis the portal through which Christians who wanted to convert to “true Judaism” must pass. In one tale Friedländer allows the wildly anti-Semitic Count Reschock to fall in love with the beautiful Rebecka Gold-Isak. Losing his bearings completely, the Count decides to convert to Judaism to win his prize. Rebecka insists that he must become truly Jewish before she will accept him. The Count’s first step on the path of a Jewish identity is an analysis with Professor Freud. “This destroyer of fig-leaves,” as Friedländer termed Freud, “robbed the noble Reschock soul of its protective coat with such anatomical certainty that the Count fell with a cry into the arms of his alarmed servant.” (Reschock goes on to have a famous surgeon convert him from a blonde Prussian warrior into a “Jewish Torah student.”)162 Jewish and non-Jewish readers alike found the Friedländer fable delicious, yet accepted implicitly its premise that psychoanalysis was identified with Judaism. If the history of psychoanalysis is written as a history of ideas, these social themes are unimportant. But if we try to understand its rise and decline as a movement, the singular tropism that many Jews felt toward analysis, both as doctors and patients, is of considerable significance.

With the passage of time, in Europe at least, psychoanalysis lost its Jewish stamp. Although it had originated among the Jews of Vienna and Berlin, as it developed, it ceased to be their property. There was certainly no Jewish tropism among the chief physicians of the many private clinics that offered psychoanalysis. And in Switzerland and England, psychoanalysis was known to be a specifically non-Jewish affair. As Swiss psychiatrist Max Müller commented of the 1920s, “It was characteristic of the psychoanalytic movement in Switzerland that, unlike other countries, it did not consist predominantly or almost exclusively of Jewish physicians and lay-analysts.”163 And the two most prominent advocates of analysis in Switzerland before 1914—Eugen Bleuler and Carl Jung—were, if anything, anti-Semitic. (It is perhaps indicative of the mood of the Bleuler household that, upon discovering that Viennese psychiatrist Erwin Stransky was Jewish, Bleuler’s wife expressed great astonishment and said, “Well then you must at least have an Aryan soul in you.”)164 Commenting on the plethora of Jews in psychoanalysis generally, Ernest Jones noted, with relief, that apart from the refugees, “in England … only two analysts have been Jews.”165 Before 1933, a number of Jewish physicians figured prominently among the opponents of analysis.

After 1933, all this changed. As a movement, analysis in Europe was destroyed. Its main representatives who fled to the New World were Jews. For these battered and profoundly disoriented survivors, psychoanalysis became one of the Jewish accomplishments that could be presented to the host population as a ticket of entry. Among the refugee Jews—both physicians and nonphysicians—psychoanalysis became a badge of Jewish solidarity in the face of a population of Anglo-Saxons perceived to be racially hostile, psychologically insensitive, and culturally backward. Said Martin Grotjahn of his fellow émigré analysts, “Psychoanalysis symbolized for them the light of the Old Country to be carried to the New Country.”167 But it was a light that Jews had created, and in whose warmth they would bask for several decades.

The American Jews had not experienced the trauma of emigration. Yet they too had arrived as outsiders, and as psychoanalysis acquired new prestige in medicine after the Second World War many Jewish physicians and patients alike were drawn to it as a symbol of collective self-affirmation: This is what we have created. By it we shall become better and in doing so bring enlightenment to others.

After 1945, American Jews took on psychoanalysis as a kind of mission civilisatrice, a healing gift to all the world, which is not at all an overwrought formulation considering the prose with which Jewish analysts themselves described their mission to humanity. How things have changed for us, Franz Alexander assured his colleagues in 1953, “as soon as all that you professed is accepted and the world is asking you sincerely and avidly to explain the new truth. They turn to you now: ‘Please tell us all about it. How does the new knowledge help us, how can we use it constructively to cure a neurotic or psychotic patient … to alleviate social prejudice and international tension, and to prevent war.’”168 Is it any wonder that Jews themselves would preferentially have recourse to this new knowledge?

Why had psychoanalysis spread so rapidly after World War II? asked psychologist Seymour Sarason. “Most analysts (and a significant portion of the psychiatrists who received training during the war years) were Jewish. For them, Hitler and fascism were not abstractions but threats to existence. And for them, Freud represented a Moses-like figure whose contributions had opened up new vistas about the nature of humans….” 169 For Sarason and Alexander, Jews were a gifted but marginal population, still ill at ease and unintegrated.

Surveys establish the extent to which Jewish physicians predominated in the practice of psychoanalysis. In 1959, two researchers drew up a profile of psychiatrists who believed in psychoanalysis: Eighty percent of them were of Jewish origin and tended to be upwardly mobile, insight-oriented, and deracinated (in contrast to the biologically oriented psychiatrists in the sample, who tended to be mainly Protestant). On a number of characteristics, the psychoanalytically inclined Jewish psychiatrists stood out from the non-Jews: They were agnostic, as opposed to the organically oriented Protestant psychiatrists who retained some shreds of their religious faith. They were more leftist, and they were more aware of the importance of social class, as opposed to the Protestant group who were somewhat embarrassed by the subject.170 When Arnold Rogow quizzed a sample of 35 psychoanalysts and 149 nonanalyst psychiatrists in 1965, he found 26 percent of the analysts willing to declare they were Jewish; a further 17 percent were willing to say they had Jewish mothers; a third were unwilling to say anything about religious affiliation. (By contrast, the figures for the nonanalyst psychiatrists were lower in all three categories.)171 On the basis of these statistics, it is fair to infer that a majority of the practitioners of psychoanalysis were of Jewish origin though of course numerous non-Jews entered the field as well. How about patients? It seems to be the case that Jews overconsume most psychiatric services in proportion to their numbers in the population. This is certainly true of psychoanalysis. In Rogow’s study, one third of the analysts said they had practices consisting heavily or overwhelmingly of Jews.172 A variety of other studies revealed the same finding in other ways.173 Most dramatic perhaps was a random, nationwide survey of the adult American population in 1976, which found that 59 percent of Jewish respondents had at some point in time received psychotherapy (in contrast to the non-Jewish help-seeking rate of 25 percent).174 In other words, more than half of all American Jews had sought out psychotherapy at a time when psychotherapy was overwhelmingly psychoanalytically oriented. It is not stretching the facts to refer to psychoanalysis in the middle decades of the twentieth century as a kind of Jewish “our thing.”

* Yet since Jews are under discussion here, this might be the place to mention the role that the loss of a social base appears to have played in the plunging popularity of analysis. In my opinion the main source of this loss was the increasing social assimilation of the American Jews. They no longer required psychoanalysis as a badge of collective identity because they were no longer affirming themselves. Instead they were becoming like everyone else.

* Yet a handful of intellectuals in particular became identified with the antipsychiatry movement.130 And the force of their ideas brewed up a mass hostility to the advance of biological thinking within psychiatry. The movement’s basic argument was that psychiatric illness is not medical in nature but social, political, and legal: Society defines what schizophrenia or depression is, and not nature. If psychiatric illness is thus socially constructed, it must be deconstructed in the interest of freeing deviants, free spirits, and exceptional creative people from the stigma of being “pathological.”131 In other words, there really was no such thing as psychiatric illness. It was a myth.

Although antipsychiatry movements had flourished throughout the nineteenth century, their late-twentieth-century rebirth began with the virtually simultaneous publication in the early 1960s of a series of exceptionally influential books on psychiatry. Most famous perhaps of these was Michel Foucault’s Madness and Civilization, published in 1961 (see p. 276), which argued that the notion of mental illness was a social and cultural invention of the eighteenth century. Yet there were several other blockbusters, and collectively they became the intellectual springboard from which the theorists of deinstitutionalization of the late 1960s would launch themselves.

Earliest of the founding fathers—they were all men—was Thomas Szasz, a Budapest-born psychoanalyst, who had trained just after World War II in Chicago. When called to active service in the Navy in 1954, Szasz, who was then 34, used the time to put down on paper a notion that had long troubled him, that mental illness was in fact a “myth,” a medical misapprehension foisted on individuals who had problems in living. In his 1961 book The Myth of Mental Illness, he called the whole notion of psychiatric illness “scientifically worthless and socially harmful.”132 The book enjoyed wide currency, and the American intellectual class began asking: If there is no such thing as mental illness, how can we justify locking people up in asylums?

* The works of Foucault, Szasz, and Goffman were influential among university elites, cultivating a rage against mental hospitals and the whole psychiatric enterprise. Yet the book that did most to inflame the public imagination against psychiatry was a novel written by Ken Kesey. Kesey had just finished taking a creative writing course at Stanford when he volunteered for government LSD experiments conducted at a Veterans Administration hospital at Menlo Park. He stayed on to take a job as an orderly at the hospital. Out of this experience came his 1962 novel One Flew Over the Cuckoo’s Nest, a book that formed the image of psychiatry for an entire generation of university students. Kesey’s notion of psychiatric illness was embodied in the novel’s antihero, Randle McMurphy…

By the end of the 1960s, the antipsychiatric interpretation of “so-called psychiatric illness” had gained the catbird seat among intellectuals both in the United States and Europe. In these circles, a consensus had formed that the discipline of psychiatry was an illegitimate form of social control and that psychiatrists’ power to lock people up must be abolished, along with institutionalized psychiatric care itself, Pinel’s therapeutic asylum.

Even though these interpretations were very popular among college students and intellectuals, actual patients found them less convincing. Joanne Greenberg, author as “Hannah Green” of I Never Promised You a Rose Garden, had a real psychiatric illness. She hated the Kesey book. She later said, “Creativity and mental illness are opposites, not complements. It’s a confusion of mental illness with creativity…. Craziness is the opposite [of imagination]: it is a fort that’s a prison.”

* Long before the rise of the antipsychiatry movement, the destruction of the asylum had begun. Patients were to be returned to “the community.” That the very phrase now turns to ashes in one’s mouth is evidence of one of the greatest social debacles of our time.

* Amid this horrendous publicity for psychiatry, on which the antipsychiatric movement would later feed, several basic realities were obscured. One is that most patients younger than 65 were discharged relatively rapidly from mental hospitals: They did not experience prolonged stays, to say nothing of lifelong incarceration. In the years 1946 to 1950 at Warren State Hospital in Warren, Pennsylvania, almost 80 percent of all patients under 65 were released within five years.144 Second, much of the bizarre posturing and disordered movement that Deutsch and later antipsychiatric writers ascribed to “hospitalism,” meaning the iatrogenic results of institutionalization, turned out to be an inherent biological feature of such illnesses as schizophrenia that, in affecting the entire brain, affect the entire nervous system as well.145 Third, even though conditions in mental hospitals were unsettling enough, there were worse alternatives. One was being tossed to the mercy of the streets.

* In the United States, the number of patients in state and county mental hospitals declined from its historic high of 559,000 in 1955 to 338,000 in 1970, further to 107,000 in 1988, representing a decrease over the 30-year period of more than 80 percent.148 The red bricks lost four-fifths of their patients. In 1955, 77 percent of all psychiatric “patient care episodes” occurred in mental hospitals; in 1990, only 26 percent. Amplifying the shift was a fivefold expansion in the total volume of care in mental-health organizations over that period, from 1.7 million episodes in 1955 to 8.6 million in 1990.149 This was a shift in the locus of care virtually without precedent in the history of medicine.

* Reducing the threshold of what constitutes psychiatric illness was partially doctor-driven, partially patient-driven. Psychiatrists have an obvious self-interest in pathologizing human behavior and have been willing to draw the pathology line ever lower in their efforts to tear as much counseling as possible away from competing psychologists and social workers.

* Ignoring the perils of school-teacher psychiatry, educational professionals grasped gratefully for this new pathologizing of boyhood. In 1968, “hyperkinetic reaction of childhood (or adolescence)” entered the official nomenclature, supposedly manifest in restlessness and distractibility.5 In 1980, this became officially known as “attention deficit disorder with hyperactivity.”6 It is still unclear whether there is some core group of those diagnosed as “ADD” who have a real organic disorder. The point, however, is that medical therapy for it could be provided only by MDs, prescribing an amphetamine-like compound called “Ritalin” (methylphenidate). By 1995 doctors were writing 6 million prescriptions for Ritalin a year, and 2.5 million American children were on the drug.7 This is one way of maintaining market share. Since ancient times, both boys and girls have become anxious about scary stories. Yet it would have occurred to no one across the centuries to give psychiatric diagnoses to these anxieties about phantasms, not at least until the advent of “posttraumatic stress disorder” (PTSD), a syndrome initially associated with the trauma of combat (see pp. 304-305). Whether a distinctive veterans’ psychiatric syndrome involving stress actually exists is unclear. But even if it exists, once PTSD became inserted into official psychiatric lingo, the popular culture grabbed it and hopelessly trivialized it as a way of psychologizing life experiences. By 1995, therapists were talking about “PTSD” in children exposed to movies like Batman. According to one authority, 80 percent of children who had watched media coverage of a crime hundreds of miles distant exhibited symptoms of “posttraumatic stress.”8 The anxieties of the children themselves were nothing new under the sun. New was psychiatry’s willingness to persuade parents that the quotidian problems of maturation represent a distinct medical disorder.

The boundaries of what constitutes depression have been expanded relentlessly outward. Depression as a major psychiatric illness involving bleakness of mood, self-loathing, an inability to experience pleasure, and suicidal thoughts has been familiar for many centuries. The illness has a heavy biological component. Depression in the vocabulary of post-1960s American psychiatry has become tantamount to dysphoria, meaning unhappiness, in combination with loss of appetite and difficulty sleeping. Thus it comes as no surprise that the incidence of depression so defined has been rising steadily and occurring at ever younger ages.9 In 1991, the National Institute of Mental Health began organizing a “National Depression Screening Day,” in the context of its “Mental Illness Awareness Week.” Such programs encourage family doctors to diagnose depression more often in their patients and refer them to psychiatrists. Although this is partly legitimate—a missed major depression may result in a patient’s suicide—the ultimate effect is psychiatric empire-building against other kinds of care. Indeed, the American Psychiatric Association jubilates over “record numbers” each year.10 As a consequence of this continual hammering of the depression theme, depression has become the single commonest disorder seen in psychiatric practice, accounting for 28 percent of all patient visits.11 (The availability of drugs such as “Prozac,” said to be specific for an entire “spectrum” of affective disorders [see pp. 323-324], has doubtless contributed as well to increasing the diagnosis of depression: Physicians prefer to diagnose conditions they can treat rather than those they can’t.)

Personality disorders have become a whole sandbox for empire-building. Although the concept of a disorder of the personality—in which everybody suffers but the patient—remains scientifically rather murky, in practice imputed personality disorders have taken off. Diagnoses such as antisocial personality disorder arose preferentially in private psychiatric practice and were virtually unknown in other medical settings.12 Multiple personality disorder (MPD) roared in from obscurity to become epidemic in the 1980s.13 Other so-called disorders of personality represented merely the exaggeration of familiar character traits. Yet the entire notion of giving patient-status to people because they are troublesome to others represented a pathologizing of essentially normal if irksome behavior. Thus these diagnoses of personality, as well as the other ballooned disease labels, greatly lowered the threshold at which individuals were said to be ill.

* Leaders in the field started speaking of “minor depression, mixed anxiety-depression, and mild neurocognitive disorder as … conditions that may deserve consideration as separate categories.” The notion of “subthreshold symptoms” gained currency as a means of reaching out to “previously subthreshold patients.”14 This is the language of empire-building and market-conquest. The evidence that these conditions represent diseases, or disorders, in the sense that mumps and major depression are disorders, is extremely slim. In insider discussions, psychiatrists were perfectly frank with one another about shifting the focus from disease to unhappiness.

* In the 1990s, psychiatry was being bent out of shape by a colossal kind of failure-to-fit. Psychiatrists had been trained for one thing and ended up treating another. They had trained as residents to treat the major psychiatric illnesses. But once in office practice, they gravitated to the commoner and more lucrative psychoneuroses. In doing so, they found themselves in direct competition with the social workers and psychologists. Rather than returning to the main psychiatric diseases, the terrain of choice of biological psychiatry, they went in the opposite direction, expanding the definition of illness to include behavior and symptoms previously reckoned as “subthreshold,” and catering to the great American public’s demand for psychotherapy in dealing with problems of living.

* Increasingly, the view became accepted that psychoanalysis was not for illness but for the interior voyage. While insisting that psychoanalysis was still valid therapy for “the major psychoses,” analyst Robert Michels decided to take a more embracing stance: The discipline was ideal for “the optimalization of experience and the enhancement of sensitivity.”96 Indeed, said critic Adolf Grünbaum, picking up on comments that Michels had made elsewhere, analysis was most akin to “an edifying experience of the kind provided by, say, a season ticket to the opera.”97 From Studies in Hysteria in 1895 to a ticket to the opera in 1994: what an odyssey! By the mid-1990s, psychoanalysis had by no means gone out of fashion among intellectuals, and it was all the rage in many departments of languages and sociology. In 1994, the University of Dublin began offering an undergraduate arts degree in psychoanalysis.98 The late-twentieth-century trajectory of psychoanalysis had carried it beyond the discipline of psychiatry and into the ether of arts and letters where, however it fared, it would no longer be identified as a privileged treatment for psychiatric illness.

* American psychoanalysis had always exhibited strenuous resistance to the collection of data on the outcome of therapy.

* “There is virtually no evidence that therapies labeled ‘psychoanalysis’ result in longer-lasting or more profound positive changes than approaches that are given other labels and that are much less time-consuming and costly.”


From Paralysis to Fatigue: A History of Psychosomatic Illness in the Modern Era

Edward Shorter writes in this 1993 book:

This cultural pressure is the crux of the book. The unconscious mind desires to be taken seriously and not be ridiculed. It will therefore strive to present symptoms that always seem, to the surrounding culture, legitimate evidence of organic disease. This striving introduces a historical dimension. As the culture changes its mind about what is legitimate disease and what is not, the pattern of psychosomatic illness changes. For example, a sudden increase in the number of young women who are unable to get out of bed because their legs are “paralyzed” may tell us something about how the surrounding culture views women and how it expects them to perform their roles.

Psychosomatic illness is any illness in which physical symptoms, produced by the action of the unconscious mind, are defined by the individual as evidence of organic disease and for which medical help is sought. This process of somatization comes in two forms. In one no physical lesion of any kind exists and the symptoms are literally psychogenic; that is to say, they arise in the mind. In the second an organic lesion does exist, but the patient’s response to it—his or her illness behavior—is exaggerated or inappropriate. Culture intervenes in both forms, legislating what is legitimate, and mandating what constitutes an appropriate response to disease. Our late-twentieth-century culture, for example, which values individual dynamism, regards physical paralysis and sudden “coma” (both common before 1900) as inappropriate responses.

Psychosomatic illnesses have always existed, because psychogenesis—the conversion of stress or psychological problems into physical symptoms—is one of nature’s basic mechanisms in mobilizing the body to cope with mental distress. People have always tried to achieve some kind of plausible interpretation of their physical sensations. They cast these sensations on the model of well-defined medical symptoms available in a kind of “symptom pool.” Only when an individual’s act of making sense amplifies the sensations, or attributes them to disease when none exists, does psychosomatic illness come into play.

The two actors in this psychodrama of making sense of one’s sensations are, and always have been, doctors and patients. The interaction between doctors and patients determines how psychosomatic symptoms change over the years. Doctors’ notions of what constitutes “genuine” organicity may alter, perhaps as a result of increased scientific knowledge or of new cultural preconceptions. Although patients’ notions of disease tend to follow doctors’ ideas—a kind of obedience that has started to break down at the end of the twentieth century—patients may also change their notions of the legitimacy of symptoms for reasons that have little to do with medicine.

* Given the reluctance of the unconscious mind to be made a fool of, patients have always tended to reject psychological interpretations of physical symptoms. They find this kind of attribution unsettling because it seems to make inaccessible to them the remedies of medicine, conferring upon their symptoms a kind of hopelessness. Patients often think, Who after all can control the action of his or her unconscious mind?

* This lamentation about the lack of insight in somatizing patients constitutes a steady stream in medical literature. Every decade has its offerings. Here is Herbert Berger in 1956 on the subject of his first few years of medical practice in a small town: “The certainty that I lived in a belt of inbred neurotics became firmly fixed in my mind. Coming from a large urban center myself, I felt fairly certain that the residents of my community had intermarried … and that this explained the large number of functionally incompetent individuals whom I met.” Later he realized that this was just a typical general practice. “Gradually I have come to recognize that these individuals never wish to be told that they are just nervous. The word ‘imagination’ is anathema to them for they are certain that they are seriously ill, and they expect and demand that the physician treat their disease with considerable respect. It is often necessary to medicate these people.” Referral to a psychiatrist, said Berger, was impossible. “The patient is often reluctant to admit even to himself that he is mentally sick, whereas he can continue to believe that he is organically ill as long as he visits the office of a non-psychiatrist.” Berger treated these patients with placebo therapy (giving them injections of a muscle relaxant called mephenesin) plus a kind of Dubois-Dejerine-style psychotherapy.98

Over the years a kind of informal consensus on the management of the somatizing patient established itself within internal medicine and neurology: Seek out the convenient fiction. “Almost every one is filled with the belief that he is debilitated,” wrote Baltimore physician Daniel Cathell in 1882. “Say to the average patient, ‘you are weak and need building up,’ and you will instantly see by his countenance that you have struck his key-note. So much is this the case, that many of the sick, fully impressed with this idea, will want you to treat them with tonics and stimulants, even when their condition is such that these medicines are not at all indicated.”99 In Harley Street it was rather more fashionable to tell patients they had “malnutrition and dyspepsia producing nervous exhaustion” (rather than the reverse).

* When the doctors’ idea of “legitimate” disease changes, the patients’ idea changes as well. When the doctors shifted their paradigm from reflex neurosis emphasizing motor hysteria to the central-nervous paradigm of sensory symptoms, the patients shifted accordingly: Symptoms of psychosomatic illness passed from the motor side of the nervous system to the sensory. Anxious to present legitimate disease, somatizing patients in the last quarter of the nineteenth century and the first quarter of the twentieth abandoned the classic hysteria of the past and adopted sensory symptoms that would correspond to the new medical paradigms of central-nervous disease and psychogenesis. Pain and fatigue came to the forefront of the consultation as examples of symptoms that “exhausted cerebral centers” would be likely to produce. For what better corresponded to the notion of intrinsic cerebral deficits than the highly subjective sensations of pain and tiredness?

* In the social history of medicine there is no more striking phenomenon than the disappearance of classic hysteria. Enthroned in the middle of the nineteenth century as the quintessential illness of the “labile” woman, the fits and paralyses that had been summoned from the symptom pool since the Middle Ages—spreading almost epidemically during the nineteenth century—virtually came to an end by the 1930s. Although doubtless caused by many circumstances, this change was in part a consequence of changing medical paradigms.

* At the psychiatric hospital in Florence, for example, grave hysteria declined from 4 percent of all admissions in 1898-1908 to 0.1 percent in 1938-48.2 Whereas the total number of patients diagnosed as hysterical at Cery Hospital, the university psychiatric clinic of Lausanne, did not change between 1910-29 and 1970-80, the kinds of symptoms that “hysterical” patients presented did alter significantly: Eighty-one percent of all hysteria patients in the former period displayed muscular tetany and agitation; only 27 percent did so in the latter. Fainting declined from 47 to 31 percent of all patients, and globus hystericus (lump in throat) from 13 to 5 percent. The dissociative conditions so popular at the turn of the century also dropped off sharply: “Twilight states” (états crépusculaires), which is to say second states, declined from 57 to 24 percent of all hysteria patients; amnesia dropped from 32 to 18 percent. By contrast, general fatigue rose from being present in 4 percent of all hysteria patients to 13 percent, and visceral problems from 8 to 22 percent. Whereas no patients had complained of sexual frigidity in 1910-29, 22 percent (all of them women) did so in 1970-80.

* In 1916 almost all German neurologists came to agree upon the purely psychological origin of shell shock. Since then we have trained an entire generation of physicians in this tradition. Shell shock is now nipped in the bud, meaning that we would let the front soldiers rest for a couple of days instead of sending them home as in 1914-18, where their symptoms would become fixated and contagious to others. By 1945 the military district had over 30,000 beds and over 3000 neurological cases; and the neurotic division almost never contained more than 30 or 35 soldiers.

* From the viewpoint of the patient, pain and fatigue had the benefits (1) of corresponding to what doctors under the influence of the central-nervous paradigm expected to see, and (2) of being almost impossible to “disprove.” Highly subjective sensations, neither pain nor tiredness can be said not “really” to exist, in the way that the Babinski test can “disprove” a hysterical paralysis or an ophthalmic diploscope can “disprove” the presence of achromatopsia (claimed inability to see colors). One could disprove medically many motor symptoms by demonstrating their lack of an anatomical basis. The potential anatomic basis of fatigue and pain was, by contrast, so much more complex and difficult to investigate that patients could retain the symptoms far longer before physicians would start murmuring the word “hysteria.” Advancing medical knowledge had the ironical result of driving somatization deep into the nervous system, where a “million-dollar workup” would be required to clarify matters.

Writing the history of chronic fatigue as part of the symptom pool involves disentangling it from the diagnosis of neurasthenia. This is a chicken-egg problem: Did a rise in the frequency of fatigue prompt adoption of the diagnosis neurasthenia? Or did Beard’s creation of neurasthenia elicit a rise in the complaint of tiredness among patients who wanted to be taken seriously? Both are likely.

* The psychosomatic symptoms of the 1990s are not very different from those of the 1920s. Now as then, pain and fatigue continue to be the commonest physical complaints. But there are two significant differences between the psychosomatic patients of the 1990s and those of the 1920s. Sufferers today are more sensitive generally to the signals their bodies give off, and they are more ready to assign these symptoms to a given “attribution”—a fixed diagnosis of organic disease. Many patients today have acquired the unshakable belief that their symptoms represent a particular disease, a belief that remains unjarred by further medical consultation.

This increase in illness attribution stems, at the level of the doctor-patient relationship, from the loss of medical authority and from the corresponding increase in the power of the media to suggest individuals into various fixed beliefs. At the cultural level, these new patterns come from a distinctively “postmodern” disaffiliation from family life. If the psychosomatic problems of the nineteenth century resulted from an excess of intimacy in the familial psychodrama, those of the late twentieth century have been the result of the opposite phenomenon: a splintering of close personal ties and the lack of intimacy. These changes of the late twentieth century have had the effect of making people more sensitive to bodily signals than ever before and more willing to shift the attribution of their plight from internal demons to external toxins.

A New Sensitivity to Pain

Our culture witnesses a kind of collective hypervigilance about the body, a sensitivity to variations in weight, for example, that has sufficed to make many fortunes in the industry devoted to dieting and slimming, or a bowel consciousness that keeps pharmacy shelves stocked high with medically unnecessary laxatives. This kind of extreme alertness to the body’s normal functions is itself without historical precedent. But even more striking is a willingness to amplify bodily signals so that they become evidence of disease and justify seeking help or taking medication.

People today believe themselves to be highly symptomatic. After reviewing various studies, one scholar writes: “Only 5 to 14 percent of the general population do not experience symptoms in a given two-week period. The average adult has four symptoms of illness on one out of every four days.” She concludes: “There are probably many people with vague symptoms in search of a diagnosis.”1

Some of these symptoms are psychogenic; some come from organic disease. People today are more sensitive to both.

* In addition to psychogenic pain, fatigue is the other great somatoform symptom of the end of the twentieth century. For many reasons one might expect people leading frenetic, compartmentalized lives in crowded cities to feel tired. But we are talking about fatigue as an illness rather than simply feeling tired at the end of the day. Many individuals who are chronically fatigued believe something is physically wrong with them and end up having more than just a symptom. From their physician or from some other source, they acquire the diagnosis of chronic fatigue syndrome. Accordingly, fatigue is both a symptom and a syndrome, or pattern of illness.

* In the 1990s it is above all chronic fatigue syndrome—consisting of a combination of severe fatigue, weakness, malaise and such mental changes as decreased memory—that has won out over its competitors, just as reflex hysteria triumphed over spinal irritation in the nineteenth century.

The saga of chronic fatigue syndrome represents a kind of cautionary tale for those doctors who lose sight of the scientific underpinning of medicine, and for those patients who lose their good sense in the media-spawned clamor that poisons the doctor-patient relationship. As a precondition, we have a pool of nonspecific symptoms in search of a diagnosis. These symptoms include, in the experience of Donna Stewart, a psychiatrist who has dealt extensively with fixed-diagnosis somatizers, “transient fatigue, headaches, muscle or joint aches, backaches, digestive upsets, respiratory complaints, vague pains, irritability, dizziness, poor concentration, and malaise.” It is chronic somatizers, Stewart continues, who are “especially prone to elaborate on non-specific symptoms, and tend to embrace each newly described disease of fashion as the answer to long-standing, multiple, undiagnosed complaints.”28

How does a given symptom become a disease of fashion? An epidemic of illness attribution, or epidemic hysteria, seems to involve two phases: (1) appropriating a genuine organic disease—whose cause is difficult to detect and substantiate—as a template; (2) broadcasting this template to individuals with often quite different symptoms, who then embrace this template as the explanation of their problems. This broadcasting is effected by sympathetic physicians, patient support groups, and the media.

* Chronic fatigue syndrome is without a doubt the illness attribution that has dominated the last two decades of the twentieth century. One researcher estimated in 1990 that “at least one million Americans [are] currently carrying a diagnosis of CFIDS [chronic fatigue immune dysfunction syndrome], and possibly another five million are ill and yet to be diagnosed.”32 By 1990, some four hundred local support groups for the illness had arisen in the United States, and the Centers for Disease Control of the U.S. Government, in Atlanta, were receiving a thousand to two thousand calls a month about chronic fatigue syndrome.33 Many similar stories of wildfirelike spread elsewhere could be told.

A whole subculture of chronic fatigue has arisen in which those patients too tired to walk give each other hints about how to handle a wheelchair and exchange notes about how to secure disability payments from the government or from insurance companies.34 The whirl of activities within this subculture sounds so diverting that one can understand why the members would be reluctant to part with their symptoms. Among various local associations for chronic fatigue in England, for example, we encounter the following notices: “Berks and Bucks. On 21st May [1988] there will be a stall for M.E. [myalgic encephalomyelitis, the English version of chronic fatigue] at the Young Farmer’s RALLY at the Child-Beale Wildlife Trust near Pangbourne. Please do look out for anything yellow that you can spare,” wrote the local organizer, “and either post it to me or let me know so that I can arrange for its collection (Stall themes are colours).”

“Gloucestershire. Seventeen members, together with partners and friends, attended a coffee morning at Lapley Farm, Coaley on March 5th. This was an excellent turnout for such a large and scattered county…. Next: Family Ploughmans Lunch, also at Lapley Farm, on Saturday, June 4th. We are hoping to arrange a meeting for the autumn in Cheltenham.”35 Chronic fatigue thus can become a way of life.

* Yet infectious mononucleosis never really achieved phase two—diffusion to large numbers of somatizers in an epidemic of symptom attribution—because doctors looked for the characteristic misshaping of cells before granting mono as a diagnosis. It was really after the discovery in 1968 of Epstein-Barr virus as the cause of mononucleosis that EBV became a disease of fashion, because the vast majority of the population bears EBV antibodies in the blood. Disproof was impossible. Finally “evidence” was at hand that sufferers were “really ill”: Their blood tests (and everybody else’s) showed the antibodies. This particular proof seemed to be dramatically delivered in 1984, when an epidemic of still-inscrutable character occurred at Lake Tahoe. EBV antibodies were detected in blood samples of some of the victims, and the case for organicity seemed to be clinched.42 In the mid-1980s EBV was warmly embraced as the explanation of one’s difficulties, a series of learned medical articles strengthening the supposition of organicity.43 EBV was christened in the press “the Yuppie flu,” an infection to which fast-tracking professionals were thought especially vulnerable.

Unfortunately, the very ubiquity of Epstein-Barr virus caused its downfall as an illness attribution. In 1988 Gary Holmes at the Centers for Disease Control, along with coworkers, realized that the correlation was poor between those patients who had hematological evidence of chronic EBV infection and those who had the symptoms of chronic fatigue. Holmes therefore rebaptized chronic Epstein-Barr virus infection as chronic fatigue syndrome, or CFS.44 This renaming did not sit well with patient groups, who promptly renamed their condition CFIDS, chronic fatigue immune dysfunction syndrome, to better insist on its organicity.45

These two templates, neuromyasthenia and mononucleosis/EBV, therefore provided the presumption of organicity for self-labeled sufferers of chronic fatigue in the United States and Canada. Donna Greenberg, professor of psychiatry at Harvard, wrote of these diagnoses: “Chronic mononucleosis and chronic fatigue syndrome represent neurasthenia in the 1980s…. It is in the nature of chronic fatigue that [the diagnosis] will inevitably recruit subjects with depressive disorders, anxiety, personality disorders, and other common medical syndromes such as allergic rhinitis or upper respiratory infections.”46 Exactly as appendicitis had given way to colitis, and reflex neurosis to neurasthenia, so in the United States chronic EBV gave way to CFIDS as somatization attempted to keep one jump ahead of science.

* In a curious inversion of the normal diffusion of scientific findings, the media advocates of CFS seize immunological data as they become available in the lab and apply them willy-nilly to their pet illnesses. “Not just the blues,” trumpeted Newsweek, as a cover story of November 12, 1990, on chronic fatigue syndrome alerted readers to new findings about “a newly discovered herpes virus called HHV-6.” Research on patients’ “interleukin-2” levels had also proved promising, the story said.63 Although individual sufferers may display disparate immunological abnormalities, no pattern of findings has emerged common to CFS patients as a whole. Nor is it clear how widespread these abnormalities are in the general public, nor to what extent they are shared by individuals with other psychiatric illnesses. Driving forward the pseudoscience underlying CFS has not been the medical profession itself—it has been the media.

In the United States, a widely read story in Rolling Stone magazine in 1987 gave the signal for converting chronic fatigue into a media frenzy. In the article, entitled “Journey into Fear: The Growing Nightmare of Epstein-Barr Virus,” the journalist-sufferer, once “in control of my career and my life,” explained how an “enigmatic disease” had rendered her “unable to lift my toothbrush or remember my phone number.” Of course her physicians had been unhelpful. “After rendering their diagnoses, my doctors made it clear they had served me to the limit of their ability. One of them, the internist, tried to comfort me: ‘At least it isn’t terminal.’” The writer cried a good deal and felt “a sadness akin to the raw grief of mourning.” Then one day she read about the Lake Tahoe “epidemic” and realized what she had.

The writer located a physician-enthusiast. Because she carried with her copies of all her blood reports “rolled up and stuffed in my bag,” she pulled them out for him to look at. Sure enough, she had the Lake Tahoe disease. He explained to her that her reports displayed the “reactivation phenomenon,” a phenomenon unknown to his medical colleagues generally.

“I understand there are doctors who leave the room after speaking to one of these patients and can’t stop laughing,” he told her.

The message to Rolling Stone readers was that a terrible epidemic was ravaging the country and that a mainline physician was the last person one would want to put one’s trust in.

* Television has spread this plague of illness attribution even more rapidly than the print media. A “chronic fatigue” story on “TV Ontario,” for example, prompted more than fifty-one thousand viewers to try to phone the station during the forty-minute segment.68 A short spot on chronic fatigue on Channel 3 in Philadelphia produced seven hundred calls to the station—a record for that particular program—and a further two thousand inquiries to the CFIDS Association.69

On September 23 and 30, 1989, NBC aired a two-part show in the “Golden Girls” series, featuring Dorothy’s struggle with chronic fatigue. Her first doctors, mainline physicians, had been beastly. As Dorothy is about to leave for an appointment with “her virologist,” her friend Rose tells her: “Good luck, I hope he finds something wrong with you…. Oh, I don’t mean something wrong wrong, I just mean something wrong so you’ll know you’re right when you know there’s something wrong and you haven’t been wrong all along.” (This is the exact functional equivalent of nineteenth-century young women hoping to be admitted to hospital for ovariotomies.)

In the program Doctor Chang, the virologist, reassures Dorothy that “she really is sick and not merely depressed…. There are new diseases arising all the time,” he says.

“So,” Dorothy says with relief, “I really have something real.”70

Dorothy’s encounter with chronic fatigue demonstrates the oppositional stance to mainline medicine of this subculture of invalidism, a refusal to accept medical reassurance. The chronic fatigue sufferers of today are far more skeptical of medical authority than were victims of ovarian hysteria in the 1860s or brucellosis patients of the 1930s. In 1990 Woman’s Day bannered “The Illness You Can’t Sleep Off.” “Can you imagine,” asked the author, “how it feels to know there is something terribly wrong with you and have one doctor after another tell you there can’t be?”71 This theme of medical incompetence and indifference runs throughout the movement, which elevates the patients’ subjective knowledge of their bodies to the same status as the doctors’ objective knowledge. This presumption of privileged self-knowledge of one’s body dovetails perfectly with media marketing strategies.

The rejection of psychiatric diagnoses by chronic fatigue patients is much more violent than are the normal reactions of medical patients to psychiatric consultation, and is itself a characteristic of the illness. Anything smacking of psychiatry or psychology is completely taboo. The chronic fatigue subculture evaluates internists, for example, not on the basis of the quality of their clinical judgment but their friendliness to the diagnosis. The work of Stephen Straus, a distinguished internist at the National Institutes of Health in Bethesda, was initially greeted by hosannas because in 1985 he seemed to take the EBV explanation at face value. Three years later, however, Straus became an object of vilification when he said that psychopathology might help to explain the symptoms as well.72 “Expecting Stephen Straus to talk about CFS for very long without inevitably mentioning psychiatric disorders is like expecting Blaze Starr to walk without jiggling,” wrote one disappointed sufferer.73

The chronic fatigue subculture brims with folklore about choosing physicians thought to be sympathetic. How does one pick a doctor? A patients’ organization advised selecting one who would share test results and let the patient keep a copy—a bizarre request in the context of normal medical practice.74 Chronic fatigue patients, reluctant to disclose emotional symptoms, are often quite resistant to psychological probing of any kind from the doctor.75 Needless to say, psychiatrists are unwelcome in the subculture of chronic fatigue. The several psychiatrists who appeared at a chronic fatigue symposium in 1988 in London were called, by one physician-enthusiast, “colourful and frankly strange remnants of prehistoric medicine” and “as mad as hatters.”76 Behind this fear of psychiatry is the horror that one’s symptoms will be seen as “imaginary,” which characterizes most patients with fixed illness attributions. Thus patients welcome the occasional blood abnormalities that turn up in their testing.77

Another characteristic of the subculture of invalidism is its “pathoplasticity,” the willingness to change symptoms and attributions as new fads appear. Chronic fatigue sufferers are quite willing to believe that they also have other illnesses that are stylish at the moment. Monilia infections, sometimes called candida or total body yeast infections, enjoyed a certain currency during the 1980s. “Could Yeast Be Your Problem?” headlined one American chronic fatigue newsletter.78 An English sufferer suggested an “anti-candida diet,” including “half an avocado pear sprinkled with lemon juice.”79 A number of English patients expressed their concerns about yeast in letters to Doctor Dawes: “I put myself on an anti-candida diet, and persuaded my doctor to give me Nystatin [a fungicide],” wrote one patient. “He is gradually reducing the amount of Nystatin I am taking but he was reluctant to allow me to have Nystatin in the first place. I am not sure that he is the best judge of how much I should be taking.” (Doctor Dawes responded: “A number of people need to take it for a year or two.”)80

Other patients believe they have chronic fatigue and multiple food allergies (“causing immediate sensations in my stomach and legs”).81 Pyramiding the syndromes one atop the other, one person wrote to a physician-enthusiast, “I have CFS and was recently told I have Candida and given a special diet that excluded food items to which Candida sufferers are allergic. I was about to start when I saw you on TV and now wonder, what happens if I am also allergic to foods on the Candida diet.”82

Still other patients believe that they have chronic fatigue and hypoglycemia (“It took me two years to find a doctor who understood.”)83 Or that they have TMJ syndrome, polio, and Lyme disease. One sufferer believed she was being poisoned by the mercury fillings in her teeth. She failed, however, to get better after having all the fillings removed.84 Indeed, the only current disease chronic fatigue patients are sure they do not have is highly stigmatized AIDS. The occasional suggestion that whatever organism ails them is similar to the one producing AIDS is greeted with dismay.85

One study has demonstrated how closely the diseases of fashion are interwoven with one another. Fifty patients with “environmental hypersensitivity,” a disease attribution closely related to chronic fatigue, were asked what else they thought they had. Ninety percent were found to be “suffering from at least one other media-popularized condition,” including EBV, food allergy, candidiasis hypersensitivity, and fibrositis. More than 10 percent of the patients reported eight or more diseases of fashion. In 1985, when the study began, all patients attributed their problems to environmental sensitivity, but by 1986 many had shifted to Candida albicans as the main cause, and by 1987 EBV had become particularly popular. Most of the patients were on disability; none expected to return to his or her former job (88 percent were women). The author concluded: “These patients are suggestible and at high risk for acquiring diagnoses that are popularized by the media.”86

Such hypersuggestibility is conceivable only in a population that has quite lost its moorings in the folk culture of body knowledge. In the United States there was once a common set of assumptions, or folk culture, about health and illness that was handed down from generation to generation. These assumptions gave people a commonsensical understanding of their own sensations. Instead, individuals today are buffeted by every new “finding” on television or in the morning paper. Accompanying this loss of contact with a folkloric inheritance and its tranquil interpretation of bodily symptoms has been a loss of willingness to believe in “what the doctor says.” For example, the percentage of patients in the United States willing to use the family doctor as a source of “local health care information” declined from 46 percent in 1984 to 21 percent in 1989.87 As for selecting which hospital to attend, more than 50 percent of patients polled in 1989 said that “they or their family have the most influence in selection of a hospital”—as opposed to listening to the doctor—up from 40 percent in previous years.88 (Non-American readers will recall that private American hospitals compete for patients.) According to a Gallup poll in 1989, 26 percent of patients said they respected doctors less now than ten years ago (14 percent said more). And of those who respected doctors less, 26 percent said, “they [the doctors] are in it for the money.” Seventeen percent claimed that doctors “lack rapport and concern.”89

The late twentieth century is writing a new chapter in the history of psychosomatic illness: fixed belief in a given diagnosis. The diagnosis itself may be changeable, based on fashion, but the fixity of belief remains the same, a questing after certainty resulting from the rising influence of the media upon public opinion and the corresponding decline of medical authority.

* Although the term postmodern has been bandied about in a nonspecific way, it does have a specific meaning in the area of family life: the triumph of the desire for individual self-actualization over commitment to the family as an institution.90 This kind of larger commitment, not a commitment to specific individuals but to the ideal of “family,” characterized the modern family of the nineteenth- and early-twentieth century. In the postmodern family, the notion of “relationship” has taken priority over the concept of the family as a building block of society. Indeed since the 1960s the relationship has often supplanted the concept of marriage itself. Sexual relationships involving periods of living together are becoming the antechamber to marriage.91 Adulterous relationships often exist on the side for both partners, and after divorce the partners are spun once again into the world of relationships. So the notion of “relationship” has deeply pervaded the institution of marriage.

The intrinsic logic of the relationship lies in achieving self-actualization, or personal growth, instead of pursuing communitarian objectives. It is this search for psychological fulfillment by the individual partners that gives the postmodern family its remarkable fragility, for once personal growth ceases within marriage, the marriage itself terminates. Paul Glick, a senior demographer at Arizona State University, wrote in 1987: “The relatively fragile state of American family life at present is undeniable in view of the prospect that close to one-half of the first and second marriages of young adults will end in divorce.”92 Accordingly, instability is becoming the rule rather than the exception.

The keynote of postmodern life is the solitude and sense of precariousness arising from ruptures in intimate relationships. As the average age at marriage rises, the number of young people living alone increases. Divorce further accelerates singlehood. And the social isolation of the elderly has greatly increased.

* What are the consequences of postmodernity for psychosomatic illness? People who are socially isolated tend to have higher rates of somatization in general than those who are not. One scholar concluded, after a review of the literature on health and loneliness, that “loneliness is linked with reported feelings of ill health, somatic distress, and visits to physicians as well as physical disease.”

* By removing “feedback loops,” social isolation intensifies the tendency of individuals to give themselves fixed self-diagnoses. The advantage of living closely with others is that one can test one’s ideas. I’m feeling poorly today. Do I have chronic fatigue syndrome? No, it’s because you slept poorly last night. This is the kind of feedback that occurs routinely in living together with others. We profit from the collective wisdom about health and illness of our co-residents. These feedback loops cease to function when one lives alone, and function imperfectly in living solely with one other individual, for one is either cut off from the collective wisdom entirely or has substantially reduced access to it.

The unmarried, divorced, and widowed tend to be easy prey for chic media-spawned diseases because they have few “significant others” with whom they may discuss interpretations of their own internal states. Of fifty patients with chronic fatigue syndrome seen at Toronto Hospital, “most were unmarried women and at least 4 had been divorced.” Their average age was thirty-three, and fully 50 percent had had a major depression before the onset of the fatigue.99 Of eight patients in one study who were “allergic to everything,” four were married, two divorced and two single.100 As for “twentieth-century disease,” psychiatrist Donna Stewart describes a population of young, middle-class female sufferers whose personal lives were in chaos. Of her original eighteen patients reported in 1985, seven were married, eight single, and three divorced.101 Lacking feedback loops, such individuals have only the media against which to test readings of their internal sensations, and the media purvey the most alarmist view possible.

In the nineteenth century the “restricted” Victorian woman gave us an image of the motor hysteria common among women. In the late twentieth century somatization has become the lot of both sexes. Both men and women have been victims of the shattering of the family, and both experience the kinds of pain and fatigue distinctive to our century. It is the lonely and disaffiliated who give us the image of our own times, who are the latter-day equivalent of the hysterical nineteenth-century woman in her hoop skirts and fainting fits. The difference is that, whereas the nineteenth-century woman was virtually smothered by the stifling intimacy of family life, the disaffiliated of the late twentieth century expire in its absence.

The development of psychosomatic symptoms can be a response to too much intimacy or too little. And if our forebears of the “modern” family suffered the former problem, it is we of the postmodern era who endure the latter. The disaffiliated, having lost their faith in scientific medicine and unable to interpret body symptoms in social isolation, seek out alternative forms of cure. The therapies are largely placebos, if not directly harmful to the body as in the case of colonic irrigation—a revival of the outdated practice of curing reflex neurosis by “getting those poisons out of there.” This alternative subculture represents a population that has lost its faith in medical reassurance, that in the absence of folkloric family wisdom seeks its knowledge of the body from the media, and that has taken the full blow of the “relationship” stresses of postmodern life. It is a generation that did not invent psychosomatic illness, but finds itself singularly vulnerable to pain and fatigue that have no physical cause.

Posted in CFS | Comments Off on From Paralysis to Fatigue: A History of Psychosomatic Illness in the Modern Era

I Remember The First Time My Back Went Out

I believe it was late 1992 or early 1993. I had just passed the Reform Beit Din for my initial conversion to Judaism. I had just started placing and responding to singles ads. I had met a woman over the phone that week and I was fantasizing about her. I think we’d had one good conversation. I had been largely bedridden (about 18 hours a day) for the previous four years with Chronic Fatigue Syndrome (CFS). I was living with my parents in Newcastle, CA. They were away for a few days. I rolled out of bed one night to go pee when my lower left back suddenly seized up and I was absolutely helpless. I couldn’t get up. Nothing like it had happened before. We lived on seven acres. Nobody was close. I started crying aloud for help but nobody could hear me. I panicked. I thought about the woman I’d just met and I dreamed she’d come to rescue me, but no rescue came.

After about 30 minutes, I managed to roll on to my side and push myself up. The pain was severe for a couple of days and then it gradually lessened. I couldn’t believe how vulnerable I was. Not just CFS, but my lower back could go into spasm and I would be essentially paralyzed.

After that, about every year or so, my lower left back would go out similarly and I would be hobbled for a couple of days and then gradually return to normal.

Now I’m reading about John Sarno MD’s methods and I am trying to explore the hidden emotional forces in my back pain. I’m wondering if I had a desire to become helpless so this new woman would rescue me.

I remember in the weeks prior to my February 1988 collapse into CFS (when I was taking 21 units at college and working about 30 hours a week in addition to strenuous workouts every other day), I kept getting this unwanted and embarrassing thought — “I’m going to break through to success or I’m going to break down. Either way, I’ll get the love that I need.”

Howard Schubiner MD blogs:

It is important to realize that Mind Body Syndrome is not a new diagnosis. When Dr. Sarno described Tension Myositis Syndrome (TMS) in the 1970s, he created a new name for a syndrome that has actually been known for hundreds of years. I agree with Dr. Sarno that we do need a name for this syndrome (and I will explain why in future blogs). However, when you look at the history of medicine you will find many examples of MBS. I highly recommend the book by the University of Toronto historian Edward Shorter, From Paralysis to Fatigue: A History of Psychosomatic Illness in the Modern Era. Dr. Shorter uses the term psychosomatic, which is commonly used in medicine, but it is a term I prefer not to use because it has a connotation of being unkind to patients, implying that they are somehow less than normal, or somewhat “crazy.” As I often say, I know that people with MBS are not crazy because I have MBS and I know I’m not crazy.

In any case, the reason people get MBS, or physical (or psychological) symptoms due to emotions which are often unconscious, is that they are human. They have a human brain that processes emotions in certain ways and they lead human existences that often involve great stress. That is why there has always been MBS and there will always be MBS. However, the type of symptoms that the brain creates in our bodies does change over time.

For example, we know (courtesy of Dr. Shorter) that a common manifestation of great stress and emotions in the 1600s and 1700s was the development of paralysis. A story that captures this is about a young man who was beaten, abused, and berated his whole life by his father. When he was approximately 25 years old, while being berated once again, he had a great surge of energy and suddenly went to hit his father with his fist. At that very moment, his arm became paralyzed and he couldn’t move it at all. We know that he didn’t suddenly have a stroke because he regained use of the arm fully within a short time. We therefore know that the cause of the paralysis was a combination of emotions, all of them unconscious (i.e., he was unaware that he was feeling them); the main emotions were anger, fear, and guilt. In those centuries, doctors did not consider this type of reaction to be caused by psychological factors, but rather by some kind of physical condition. In the 1900s, doctors learned how to tap on the tendons of an arm or leg and determine immediately if there was a stroke or some other severe neurologic condition. We now call these reflexes the deep tendon reflexes, and we use them all the time. When they are normal in someone with sudden paralysis, we know that there is no neurologic condition and that the cause of the paralysis is MBS.

Since doctors have been able to use deep tendon reflexes, the number of people with paralysis due to stress and emotions has dropped drastically so that it’s relatively rare. Why? The cause of MBS is in the mind, in our unconscious mind that is trying to help us cope with great stress. The unconscious mind will find some physical symptoms to use when necessary and it will choose a physical symptom that makes some kind of sense. And typically, it will choose a physical symptom that will not be seen as “psychological.” Since paralysis is now seen as psychological, it is rarely used by the unconscious mind. We are more likely now to get back pain, headaches, fatigue, and stomach pains, which are more likely to be seen as physical conditions and therefore more acceptable to ourselves and to the doctors.

This is one reason why there are so many people today with these chronic symptoms that often do not respond to biomedical treatments. Since so few doctors are aware of MBS, they often are not treating the underlying cause of the symptoms; the treatment merely tries to cope with the symptoms of the problem and is therefore less likely to be successful…

MBS is not new. As long as there have been humans, there have been physical symptoms caused by stress and emotions. It is important to realize that physical symptoms, even very severe physical symptoms, can be caused by stress and emotions. In fact, the emotions that tend to have the largest effect on us are precisely those that we are unaware of. There are two ways to think about how these symptoms can be produced.

The first way is to understand how the neurologic system works. Pain is a learned response, i.e. the body actually learns how to produce certain symptoms by experiencing them. For example, I had a patient who fell and hurt her back as a teenager. A decade later, she was in a very difficult situation in a job where she felt trapped and unable to get out of her problems there. At that moment, suddenly her back seized up and she had tremendous pain. The nerves that send signals from the back to the brain had fired when she fell as a teenager and those nerve connections had been “learned” at that time. When a significant emotional situation arose where she had no way out, her body responded in a way that it already knew, by producing the back pain it had learned 10 years earlier.

A good way to understand how MBS works is by thinking about phantom limb syndrome. In this syndrome, which is very common among amputees, pain or other sensations can be felt in the part of the body (arm or leg usually) that is missing. There is obviously no disease in that area, yet we can feel pain (often severe) that appears to be coming from the missing body part. What has happened is that the nerves that send signals to the brain have been sensitized and are continuing to fire and those signals are interpreted as pain by the brain. A vicious cycle is formed of sensitized nerves that send signals to the brain, then those signals get amplified in the brain (by a structure called the anterior cingulate cortex; more about that area of the brain in upcoming posts), and then signals are sent out to the body by the autonomic nervous system (the fight, flight or freeze system). This pain is real, very real. However, there is no tissue breakdown, no tissue disease in the body. This is exactly what happens in Mind Body Syndrome. We may feel pain in an area of the body, for example, the head or back or stomach, yet there is no tissue breakdown, no tissue disease there. Of course, pain can be caused by tissue breakdown or disease, such as occurs in cancer, infections, or fractures. When the doctors are unable to find disease after a careful and thorough search, the diagnosis of MBS is usually correct. It is important to realize that MBS is a physiologic process, i.e. a process that occurs due to normal reactions of the body. When we get scared, our heart speeds up; when we get nervous, our stomach tightens up or we get clammy hands. These are physiologic processes, normal reactions that are 100% reversible. That is why MBS is curable. It can be reversed by interrupting the vicious cycle.

Posted in Back, Personal | Comments Off on I Remember The First Time My Back Went Out

Recent Shows

00:00 Who are the real sex pests?
06:00 Millennial Woes addresses the sex pest accusation, https://www.youtube.com/watch?v=FH6rrwDJsco
20:00 Where Woes went wrong, https://trad-news.blogspot.com/2020/12/woes-finally-lauches-sex-pest-defence.html
1:13:00 The American Conservative magazine conference with Michael Anton, Chris Buskirk, https://www.youtube.com/watch?v=wOk4VBdfIlw
1:54:00 Back Pain and Tension Myositis Syndrome, https://www.tmswiki.org/forum/threads/back-pain-and-tension-myositis-syndrome-tms.11990/
2:04:00 Prof John Mearsheimer – US Foreign Policy under President Biden, https://www.youtube.com/watch?v=KaTGGdsomf4
2:05:20 R&B Lecture: “Daughters of Esther and Peace Between Abrahamics” by Roseanne Cherrie Barr, https://www.youtube.com/watch?v=TMdn4yZeU8o
2:07:30 Dooovid makes Roseanne Barr laugh with a Luke Ford quote
2:15:00 Reb Dooovid joins the stream
2:34:00 Dooovid’s ability to find weak points
2:36:00 Dooovid found help for his anger in Hinduism
2:36:40 Dooovid’s multiple truth hypothesis
2:42:00 Prominent SPLC Board Member Vanishes from Website Amid Racism, Sexism Scandal, https://pjmedia.com/news-and-politics/tyler-o-neil/2019/03/26/prominent-splc-board-member-vanishes-from-website-amid-racism-sexism-scandal-n64720
2:51:30 Project Veritas releases CNN Tapes

Posted in America | Comments Off on Recent Shows

Why Did Blacks Make More Progress Before Civil Rights Than After?

From comments to Steve Sailer:

* By every standard you can measure….blacks were much better off before “civil rights.”

Prior to the left “helping” blacks with desegregation, blacks had thriving businesses, an intact family unit, a much lower rate of illegitimacy, strong churches and church attendance, lower rates of crime and substance abuse, etc., etc., etc.

Welfare incentivized single motherhood….and fatherless boys do MASSIVE crime. Out of wedlock births are now 76%, the more intelligent (leadership class) fled to white hoods to escape high crime leaving blacks without decent leaders, substance abuse and gang crime exploded, the entire family structure was destroyed, and now blacks are dependent on govt. handouts.

* Black people, on the whole, had much more self-respect in the earlier era, even though they didn’t advertise it to anywhere near the degree they do now. Sort of ironic.

The various pathologies which characterise way too much of black culture today were trivial back then by comparison. We now have a society which disparages personal responsibility and celebrates every kind of immorality, and is much more racist than ever before. Also sort of ironic.

I’ve seen single black moms struggling with their sons. It can’t be easy. Our cultural propaganda makes it nigh impossible. No one dares tell kids that they shouldn’t have kids of their own, outside a stable family unit. The results are everywhere.

* Anything subsidized grows: TANF, SNAP, Section 8, heating assistance, free school lunches and breakfasts, free preK-12 education, Pell grants, etc., etc. We are subsidizing the reproduction of the least able people. The crop of neck- and face-tattooed carjackers that bedevil our streets has been brought into being by the good intentions of people like Nancy Pelosi and Chuck Schumer.

* America has been sliding downhill overall since 1970, with blacks sliding even more than whites. Striving for equality is just one of many things that America can’t do as well as it could 50 years ago. Personally, I look at the Apollo 17 mission returning from the moon for the last time one month before the Supreme Court decided the U.S. Constitution includes a right to abort fetuses, and I wonder if the nation turned its back on the blessings of God available to it. Secularists can formulate that idea in their own secular terms if inclined.

* Moving to the North with lots of good jobs in factories for those with limited education or skills undoubtedly helped. But as the workforce for manufacturing declined, it probably affected blacks the first and the most, as there were no comparable employment opportunities to replace them. Toss in badly misguided social policies and we have experienced a social disaster.

They were then sold on the transformative promise of college education, with the result that culturally they place a high premium on credentialism while being fleeced like no other group by the higher ed industry. That has led to the current moment of millions of people with useless degrees and no practical skills believing only a systemic force organized solely to hold them back is responsible for all the disappointments.

* Things that hit the fan for Blacks around 1970: Black fathers disappearing and Black marriage rates plummeting; deindustrialization; lots more whites going to college on financial aid; drug use and selling by blacks going way up; rising crime rates.

Plus let us not forget the ability of white people to replace black labor with Hispanic labor due to mass illegal migration, the stagnation of wages that has now lasted decades, and the loss of labor union membership.

* Perverse incentives in the 1960s turned the black lower class feral: that’s well understood by now.

Perverse incentives today are turning the black elite destructive and useless. I’m thinking of incentives like: white credulity, meaning that a hate hoax results in career advancement and monetary rewards; booming employment as DIE enforcers, resulting in talent (such as it is) getting channelled away from honest productive work, and the proliferation of professional black racists; intensification of affirmative action (from a thumb on the scale, to a foot on the scale), eroding the need to study and perform even at the levels a mediocre person would be capable of.

* There’s also been a general race-blind falling behind of the working class, which disproportionately affects blacks compared to whites and so ceteris paribus increases the black-white gap. It’s something that the civil rights movement is partially responsible for because it helped suck the oxygen away from old-school economic liberalism that was dedicated to giving the working class stability and economic resources. It’s still a pretty common response to pointing out how much more economically equal the US used to be, or certain other countries currently are, to say that, well, those places are racist so we shouldn’t copy them. Civil rights also poisoned many whites against the left broadly.

You’re also ignoring the fact that the black-white relationship never stabilized around no net discrimination, but has been ratcheting up more and more net anti-white discrimination for decades.

Posted in Blacks | Comments Off on Why Did Blacks Make More Progress Before Civil Rights Than After?