From Paralysis to Fatigue: A History of Psychosomatic Illness in the Modern Era

Edward Shorter writes in this 1993 book:

This cultural pressure is the crux of the book. The unconscious mind desires to be taken seriously and not be ridiculed. It will therefore strive to present symptoms that always seem, to the surrounding culture, legitimate evidence of organic disease. This striving introduces a historical dimension. As the culture changes its mind about what is legitimate disease and what is not, the pattern of psychosomatic illness changes. For example, a sudden increase in the number of young women who are unable to get out of bed because their legs are “paralyzed” may tell us something about how the surrounding culture views women and how it expects them to perform their roles.

Psychosomatic illness is any illness in which physical symptoms, produced by the action of the unconscious mind, are defined by the individual as evidence of organic disease and for which medical help is sought. This process of somatization comes in two forms. In one no physical lesion of any kind exists and the symptoms are literally psychogenic; that is to say, they arise in the mind. In the second an organic lesion does exist, but the patient’s response to it—his or her illness behavior—is exaggerated or inappropriate. Culture intervenes in both forms, legislating what is legitimate, and mandating what constitutes an appropriate response to disease. Our late-twentieth-century culture, for example, which values individual dynamism, regards physical paralysis and sudden “coma” (both common before 1900) as inappropriate responses.

Psychosomatic illnesses have always existed, because psychogenesis—the conversion of stress or psychological problems into physical symptoms—is one of nature’s basic mechanisms in mobilizing the body to cope with mental distress. People have always tried to achieve some kind of plausible interpretation of their physical sensations. They cast these sensations on the model of well-defined medical symptoms available in a kind of “symptom pool.” Only when an individual’s act of making sense amplifies the sensations, or attributes them to disease when none exists, does psychosomatic illness come into play.

The two actors in this psychodrama of making sense of one’s sensations are, and always have been, doctors and patients. The interaction between doctors and patients determines how psychosomatic symptoms change over the years. Doctors’ notions of what constitutes “genuine” organicity may alter, perhaps as a result of increased scientific knowledge or of new cultural preconceptions. Although patients’ notions of disease tend to follow doctors’ ideas—a kind of obedience that has started to break down at the end of the twentieth century—patients may also change their notions of the legitimacy of symptoms for reasons that have little to do with medicine.

* Given the reluctance of the unconscious mind to be made a fool of, patients have always tended to reject psychological interpretations of physical symptoms. They find this kind of attribution unsettling because it seems to make inaccessible to them the remedies of medicine, conferring upon their symptoms a kind of hopelessness. Patients often think, Who after all can control the action of his or her unconscious mind?

* This lamentation about the lack of insight in somatizing patients constitutes a steady stream in medical literature. Every decade has its offerings. Here is Herbert Berger in 1956 on the subject of his first few years of medical practice in a small town: “The certainty that I lived in a belt of inbred neurotics became firmly fixed in my mind. Coming from a large urban center myself, I felt fairly certain that the residents of my community had intermarried … and that this explained the large number of functionally incompetent individuals whom I met.” Later he realized that this was just a typical general practice. “Gradually I have come to recognize that these individuals never wish to be told that they are just nervous. The word ‘imagination’ is anathema to them for they are certain that they are seriously ill, and they expect and demand that the physician treat their disease with considerable respect. It is often necessary to medicate these people.” Referral to a psychiatrist, said Berger, was impossible. “The patient is often reluctant to admit even to himself that he is mentally sick, whereas he can continue to believe that he is organically ill as long as he visits the office of a non-psychiatrist.” Berger treated these patients with placebo therapy (giving them injections of a muscle relaxant called mephenesin) plus a kind of Dubois-Dejerine-style psychotherapy.98

Over the years a kind of informal consensus on the management of the somatizing patient established itself within internal medicine and neurology: Seek out the convenient fiction. “Almost every one is filled with the belief that he is debilitated,” wrote Baltimore physician Daniel Cathell in 1882. “Say to the average patient, ‘you are weak and need building up,’ and you will instantly see by his countenance that you have struck his key-note. So much is this the case, that many of the sick, fully impressed with this idea, will want you to treat them with tonics and stimulants, even when their condition is such that these medicines are not at all indicated.”99 In Harley Street it was rather more fashionable to tell patients they had “malnutrition and dyspepsia producing nervous exhaustion” (rather than the reverse).

* When the doctors’ idea of “legitimate” disease changes, the patients’ idea changes as well. When the doctors shifted their paradigm from reflex neurosis emphasizing motor hysteria to the central-nervous paradigm of sensory symptoms, the patients shifted accordingly: Symptoms of psychosomatic illness passed from the motor side of the nervous system to the sensory. Anxious to present legitimate disease, somatizing patients in the last quarter of the nineteenth century and the first quarter of the twentieth abandoned the classic hysteria of the past and adopted sensory symptoms that would correspond to the new medical paradigms of central-nervous disease and psychogenesis. Pain and fatigue came to the forefront of the consultation as examples of symptoms that “exhausted cerebral centers” would be likely to produce. For what better corresponded to the notion of intrinsic cerebral deficits than the highly subjective sensations of pain and tiredness?

* In the social history of medicine there is no more striking phenomenon than the disappearance of classic hysteria. Enthroned in the middle of the nineteenth century as the quintessential illness of the “labile” woman, the fits and paralyses that had been summoned from the symptom pool since the Middle Ages—spreading almost epidemically during the nineteenth century—virtually came to an end by the 1930s. Although doubtless caused by many circumstances, this change was in part a consequence of changing medical paradigms.

* At the psychiatric hospital in Florence, for example, grave hysteria declined from 4 percent of all admissions in 1898-1908 to 0.1 percent in 1938-48.2 Whereas the total number of patients diagnosed as hysterical at Cery Hospital, the university psychiatric clinic of Lausanne, did not change between 1910-29 and 1970-80, the kinds of symptoms that “hysterical” patients presented did alter significantly: Eighty-one percent of all hysteria patients in the former period displayed muscular tetany and agitation; only 27 percent did so in the latter. Fainting declined from 47 to 31 percent of all patients, and globus hystericus (lump in throat) from 13 to 5 percent. The dissociative conditions so popular at the turn of the century also dropped off sharply: “Twilight states” (états crépusculaires), which is to say second states, declined from 57 to 24 percent of all hysteria patients; amnesia dropped from 32 to 18 percent. By contrast, general fatigue rose from being present in 4 percent of all hysteria patients to 13 percent, and visceral problems from 8 to 22 percent. Whereas no patients had complained of sexual frigidity in 1910-29, 22 percent (all of them women) did so in 1970-80.

* In 1916 almost all German neurologists came to agree upon the purely psychological origin of shell shock. Since then we have trained an entire generation of physicians in this tradition. Shell shock is now nipped in the bud, meaning that we would let the front soldiers rest for a couple of days instead of sending them home as in 1914-18, when their symptoms would become fixated and contagious to others. By 1945 the military district had over 30,000 beds and over 3,000 neurological cases, yet the neurotic division almost never contained more than 30 or 35 soldiers.

* From the viewpoint of the patient, pain and fatigue had the benefits (1) of corresponding to what doctors under the influence of the central-nervous paradigm expected to see, and (2) of being almost impossible to “disprove.” Highly subjective sensations, neither pain nor tiredness can be said not “really” to exist, in the way that the Babinski test can “disprove” a hysterical paralysis or an ophthalmic diploscope can “disprove” the presence of achromatopsia (claimed inability to see colors). One could disprove medically many motor symptoms by demonstrating their lack of an anatomical basis. The potential anatomic basis of fatigue and pain was, by contrast, so much more complex and difficult to investigate that patients could retain the symptoms far longer before physicians would start murmuring the word “hysteria.” Advancing medical knowledge had the ironical result of driving somatization deep into the nervous system, where a “million-dollar workup” would be required to clarify matters.

Writing the history of chronic fatigue as part of the symptom pool involves disentangling it from the diagnosis of neurasthenia. This is a chicken-egg problem: Did a rise in the frequency of fatigue prompt adoption of the diagnosis neurasthenia? Or did Beard’s creation of neurasthenia elicit a rise in the complaint of tiredness among patients who wanted to be taken seriously? Both are likely.

* The psychosomatic symptoms of the 1990s are not very different from those of the 1920s. Now as then, pain and fatigue continue to be the commonest physical complaints. But there are two significant differences between the psychosomatic patients of the 1990s and those of the 1920s. Sufferers today are more sensitive generally to the signals their bodies give off, and they are more ready to assign these symptoms to a given “attribution”—a fixed diagnosis of organic disease. Many patients today have acquired the unshakable belief that their symptoms represent a particular disease, a belief that remains unjarred by further medical consultation.

This increase in illness attribution stems, at the level of the doctor-patient relationship, from the loss of medical authority and from the corresponding increase in the power of the media to suggest individuals into various fixed beliefs. At the cultural level, these new patterns come from a distinctively “postmodern” disaffiliation from family life. If the psychosomatic problems of the nineteenth century resulted from an excess of intimacy in the familial psychodrama, those of the late twentieth century have been the result of the opposite phenomenon: a splintering of close personal ties and the lack of intimacy. These changes of the late twentieth century have had the effect of making people more sensitive to bodily signals than ever before and more willing to shift the attribution of their plight from internal demons to external toxins.

A New Sensitivity to Pain

Our culture witnesses a kind of collective hypervigilance about the body, a sensitivity to variations in weight, for example, that has sufficed to make many fortunes in the industry devoted to dieting and slimming, or a bowel consciousness that keeps pharmacy shelves stocked high with medically unnecessary laxatives. This kind of extreme alertness to the body’s normal functions is itself without historical precedent. But even more striking is a willingness to amplify bodily signals so that they become evidence of disease and justify seeking help or taking medication.

People today believe themselves to be highly symptomatic. After reviewing various studies, one scholar writes: “Only 5 to 14 percent of the general population do not experience symptoms in a given two-week period. The average adult has four symptoms of illness on one out of every four days.” She concludes: “There are probably many people with vague symptoms in search of a diagnosis.”1

Some of these symptoms are psychogenic; some come from organic disease. People today are more sensitive to both.

* In addition to psychogenic pain, fatigue is the other great somatoform symptom of the end of the twentieth century. For many reasons one might expect people leading frenetic, compartmentalized lives in crowded cities to feel tired. But we are talking about fatigue as an illness rather than simply feeling tired at the end of the day. Many individuals who are chronically fatigued believe something is physically wrong with them and end up having more than just a symptom. From their physician or from some other source, they acquire the diagnosis of chronic fatigue syndrome. Accordingly, fatigue is both a symptom and a syndrome, or pattern of illness.

* In the 1990s it is above all chronic fatigue syndrome—consisting of a combination of severe fatigue, weakness, malaise and such mental changes as decreased memory—that has won out over its competitors, just as reflex hysteria triumphed over spinal irritation in the nineteenth century.

The saga of chronic fatigue syndrome represents a kind of cautionary tale for those doctors who lose sight of the scientific underpinning of medicine, and for those patients who lose their good sense in the media-spawned clamor that poisons the doctor-patient relationship. As a precondition, we have a pool of nonspecific symptoms in search of a diagnosis. These symptoms include, in the experience of Donna Stewart, a psychiatrist who has dealt extensively with fixed-diagnosis somatizers, “transient fatigue, headaches, muscle or joint aches, backaches, digestive upsets, respiratory complaints, vague pains, irritability, dizziness, poor concentration, and malaise.” It is chronic somatizers, Stewart continues, who are “especially prone to elaborate on non-specific symptoms, and tend to embrace each newly described disease of fashion as the answer to long-standing, multiple, undiagnosed complaints.”28

How does a given symptom become a disease of fashion? An epidemic of illness attribution, or epidemic hysteria, seems to involve two phases: (1) appropriating a genuine organic disease—whose cause is difficult to detect and substantiate—as a template; (2) broadcasting this template to individuals with often quite different symptoms, who then embrace this template as the explanation of their problems. This broadcasting is effected by sympathetic physicians, patient support groups, and the media.

* Chronic fatigue syndrome is without a doubt the illness attribution that has dominated the last two decades of the twentieth century. One researcher estimated in 1990 that “at least one million Americans [are] currently carrying a diagnosis of CFIDS [chronic fatigue immune dysfunction syndrome], and possibly another five million are ill and yet to be diagnosed.”32 By 1990, some four hundred local support groups for the illness had arisen in the United States, and the Centers for Disease Control of the U.S. Government, in Atlanta, were receiving a thousand to two thousand calls a month about chronic fatigue syndrome.33 Many similar stories of wildfirelike spread elsewhere could be told.

A whole subculture of chronic fatigue has arisen in which those patients too tired to walk give each other hints about how to handle a wheelchair and exchange notes about how to secure disability payments from the government or from insurance companies.34 The whirl of activities within this subculture sounds so diverting that one can understand why the members would be reluctant to part with their symptoms. Among various local associations for chronic fatigue in England, for example, we encounter the following notices: “Berks and Bucks. On 21st May [1988] there will be a stall for M.E. [myalgic encephalomyelitis, the English version of chronic fatigue] at the Young Farmer’s RALLY at the Child-Beale Wildlife Trust near Pangbourne. Please do look out for anything yellow that you can spare,” wrote the local organizer, “and either post it to me or let me know so that I can arrange for its collection (Stall themes are colours).”

“Gloucestershire. Seventeen members, together with partners and friends, attended a coffee morning at Lapley Farm, Coaley on March 5th. This was an excellent turnout for such a large and scattered county…. Next: Family Ploughmans Lunch, also at Lapley Farm, on Saturday, June 4th. We are hoping to arrange a meeting for the autumn in Cheltenham.”35 Chronic fatigue thus can become a way of life.

* Yet infectious mononucleosis never really achieved phase two—diffusion to large numbers of somatizers in an epidemic of symptom attribution—because doctors looked for the characteristic misshaping of cells before granting mono as a diagnosis. It was really after the discovery in 1968 of Epstein-Barr virus as the cause of mononucleosis that EBV became a disease of fashion, because the vast majority of the population bears EBV antibodies in the blood. Disproof was impossible. Finally “evidence” was at hand that sufferers were “really ill”: Their blood tests (and everybody else’s) showed the antibodies. This particular proof seemed to be dramatically delivered in 1984, when an epidemic of still-inscrutable character occurred at Lake Tahoe. EBV antibodies were detected in blood samples of some of the victims, and the case for organicity seemed to be clinched.42 In the mid-1980s EBV was warmly embraced as the explanation of one’s difficulties, a series of learned medical articles strengthening the supposition of organicity.43 EBV was christened in the press “the Yuppie flu,” an infection to which fast-tracking professionals were thought especially vulnerable.

Unfortunately, the very ubiquity of Epstein-Barr virus caused its downfall as an illness attribution. In 1988 Gary Holmes at the Centers for Disease Control, along with coworkers, realized that the correlation was poor between those patients who had hematological evidence of chronic EBV infection and those who had the symptoms of chronic fatigue. Holmes therefore rebaptized chronic Epstein-Barr virus infection as chronic fatigue syndrome, or CFS.44 This renaming did not sit well with patient groups, who promptly renamed their condition CFIDS, chronic fatigue immune dysfunction syndrome, to better insist on its organicity.45

These two templates, neuromyasthenia and mononucleosis/EBV, therefore provided the presumption of organicity for self-labeled sufferers of chronic fatigue in the United States and Canada. Donna Greenberg, professor of psychiatry at Harvard, wrote of these diagnoses: “Chronic mononucleosis and chronic fatigue syndrome represent neurasthenia in the 1980s…. It is in the nature of chronic fatigue that [the diagnosis] will inevitably recruit subjects with depressive disorders, anxiety, personality disorders, and other common medical syndromes such as allergic rhinitis or upper respiratory infections.”46 Exactly as appendicitis had given way to colitis, and reflex neurosis to neurasthenia, so in the United States chronic EBV gave way to CFIDS as somatization attempted to keep one jump ahead of science.

* In a curious inversion of the normal diffusion of scientific findings, the media advocates of CFS seize immunological data as they become available in the lab and apply them willy-nilly to their pet illnesses. “Not just the blues,” trumpeted Newsweek, as a cover story of November 12, 1990, on chronic fatigue syndrome alerted readers to new findings about “a newly discovered herpes virus called HHV-6.” Research on patients’ “interleukin-2” levels had also proved promising, the story said.63 Although individual sufferers may display disparate immunological abnormalities, no pattern of findings has emerged common to CFS patients as a whole. Nor is it clear how widespread these abnormalities are in the general public, nor to what extent they are shared by individuals with other psychiatric illnesses. Driving forward the pseudoscience underlying CFS has not been the medical profession itself—it has been the media.

In the United States, a widely read story in Rolling Stone magazine in 1987 gave the signal for converting chronic fatigue into a media frenzy. In the article, entitled “Journey into Fear: The Growing Nightmare of Epstein-Barr Virus,” the journalist-sufferer, once “in control of my career and my life,” explained how an “enigmatic disease” had rendered her “unable to lift my toothbrush or remember my phone number.” Of course her physicians had been unhelpful. “After rendering their diagnoses, my doctors made it clear they had served me to the limit of their ability. One of them, the internist, tried to comfort me: ‘At least it isn’t terminal.’” The writer cried a good deal and felt “a sadness akin to the raw grief of mourning.” Then one day she read about the Lake Tahoe “epidemic” and realized what she had.

The writer located a physician-enthusiast. Because she carried with her copies of all her blood reports “rolled up and stuffed in my bag,” she pulled them out for him to look at. Sure enough, she had the Lake Tahoe disease. He explained to her that her reports displayed the “reactivation phenomenon,” a phenomenon unknown to his medical colleagues generally.

“I understand there are doctors who leave the room after speaking to one of these patients and can’t stop laughing,” he told her.

The message to Rolling Stone readers was that a terrible epidemic was ravaging the country and that a mainline physician was the last person one would want to put one’s trust in.

* Television has spread this plague of illness attribution even more rapidly than the print media. A “chronic fatigue” story on “TV Ontario,” for example, prompted more than fifty-one thousand viewers to try to phone the station during the forty-minute segment.68 A short spot on chronic fatigue on Channel 3 in Philadelphia produced seven hundred calls to the station—a record for that particular program—and a further two thousand inquiries to the CFIDS Association.69

On September 23 and 30, 1989, NBC aired a two-part show in the “Golden Girls” series, featuring Dorothy’s struggle with chronic fatigue. Her first doctors, mainline physicians, had been beastly. As Dorothy is about to leave for an appointment with “her virologist,” her friend Rose tells her: “Good luck, I hope he finds something wrong with you…. Oh, I don’t mean something wrong wrong, I just mean something wrong so you’ll know you’re right when you know there’s something wrong and you haven’t been wrong all along.” (This is the exact functional equivalent of nineteenth-century young women hoping to be admitted to hospital for ovariotomies.)

In the program Doctor Chang, the virologist, reassures Dorothy that “she really is sick and not merely depressed…. There are new diseases arising all the time,” he says.

“So,” Dorothy says with relief, “I really have something real.”70

Dorothy’s encounter with chronic fatigue demonstrates the oppositional stance to mainline medicine of this subculture of invalidism, a refusal to accept medical reassurance. The chronic fatigue sufferers of today are far more skeptical of medical authority than were victims of ovarian hysteria in the 1860s or brucellosis patients of the 1930s. In 1990 Woman’s Day bannered “The Illness You Can’t Sleep Off.” “Can you imagine,” asked the author, “how it feels to know there is something terribly wrong with you and have one doctor after another tell you there can’t be?”71 This theme of medical incompetence and indifference runs throughout the movement, which elevates the patients’ subjective knowledge of their bodies to the same status as the doctors’ objective knowledge. This presumption of privileged self-knowledge of one’s body dovetails perfectly with media marketing strategies.

The rejection of psychiatric diagnoses by chronic fatigue patients is much more violent than are the normal reactions of medical patients to psychiatric consultation, and is itself a characteristic of the illness. Anything smacking of psychiatry or psychology is completely taboo. The chronic fatigue subculture evaluates internists, for example, not on the basis of the quality of their clinical judgment but their friendliness to the diagnosis. The work of Stephen Straus, a distinguished internist at the National Institutes of Health in Bethesda, was initially greeted by hosannas because in 1985 he seemed to take the EBV explanation at face value. Three years later, however, Straus became an object of vilification when he said that psychopathology might help to explain the symptoms as well.72 “Expecting Stephen Straus to talk about CFS for very long without inevitably mentioning psychiatric disorders is like expecting Blaze Starr to walk without jiggling,” wrote one disappointed sufferer.73

The chronic fatigue subculture brims with folklore about choosing physicians thought to be sympathetic. How does one pick a doctor? A patients’ organization advised selecting one who would share test results and let the patient keep a copy—a bizarre request in the context of normal medical practice.74 Chronic fatigue patients, reluctant to disclose emotional symptoms, are often quite resistant to psychological probing of any kind from the doctor.75 Needless to say, psychiatrists are unwelcome in the subculture of chronic fatigue. The several psychiatrists who appeared at a chronic fatigue symposium in 1988 in London were called, by one physician-enthusiast, “colourful and frankly strange remnants of prehistoric medicine” and “as mad as hatters.”76 Behind this fear of psychiatry is the horror that one’s symptoms will be seen as “imaginary,” which characterizes most patients with fixed illness attributions. Thus patients welcome the occasional blood abnormalities that turn up in their testing.77

Another characteristic of the subculture of invalidism is its “pathoplasticity,” the willingness to change symptoms and attributions as new fads appear. Chronic fatigue sufferers are quite willing to believe that they also have other illnesses that are stylish at the moment. Monilia infections, sometimes called candida or total body yeast infections, enjoyed a certain currency during the 1980s. “Could Yeast Be Your Problem?” headlined one American chronic fatigue newsletter.78 An English sufferer suggested an “anti-candida diet,” including “half an avocado pear sprinkled with lemon juice.”79 A number of English patients expressed their concerns about yeast in letters to Doctor Dawes: “I put myself on an anti-candida diet, and persuaded my doctor to give me Nystatin [a fungicide],” wrote one patient. “He is gradually reducing the amount of Nystatin I am taking but he was reluctant to allow me to have Nystatin in the first place. I am not sure that he is the best judge of how much I should be taking.” (Doctor Dawes responded: “A number of people need to take it for a year or two.”)80

Other patients believe they have chronic fatigue and multiple food allergies (“causing immediate sensations in my stomach and legs”).81 Pyramiding the syndromes one atop the other, one person wrote to a physician-enthusiast, “I have CFS and was recently told I have Candida and given a special diet that excluded food items to which Candida sufferers are allergic. I was about to start when I saw you on TV and now wonder, what happens if I am also allergic to foods on the Candida diet.”82

Still other patients believe that they have chronic fatigue and hypoglycemia (“It took me two years to find a doctor who understood.”)83 Or that they have TMJ syndrome, polio, and Lyme disease. One sufferer believed she was being poisoned by the mercury fillings in her teeth. She failed, however, to get better after having all the fillings removed.84 Indeed, the only current disease chronic fatigue patients are sure they do not have is highly stigmatized AIDS. The occasional suggestion that whatever organism ails them is similar to the one producing AIDS is greeted with dismay.85

One study has demonstrated how closely the diseases of fashion are interwoven with one another. Fifty patients with “environmental hypersensitivity,” a disease attribution closely related to chronic fatigue, were asked what else they thought they had. Ninety percent were found to be “suffering from at least one other media-popularized condition,” including EBV, food allergy, candidiasis hypersensitivity, and fibrositis. More than 10 percent of the patients reported eight or more diseases of fashion. In 1985, when the study began, all patients attributed their problems to environmental sensitivity, but by 1986 many had shifted to Candida albicans as the main cause, and by 1987 EBV had become particularly popular. Most of the patients were on disability; none expected to return to his or her former job (88 percent were women). The author concluded: “These patients are suggestible and at high risk for acquiring diagnoses that are popularized by the media.”86

Such hypersuggestibility is conceivable only in a population that has quite lost its moorings in the folk culture of body knowledge. In the United States there was once a common set of assumptions, or folk culture, about health and illness that was handed down from generation to generation. These assumptions gave people a commonsensical understanding of their own sensations. Instead, individuals today are buffeted by every new “finding” on television or in the morning paper. Accompanying this loss of contact with a folkloric inheritance and its tranquil interpretation of bodily symptoms has been a loss of willingness to believe in “what the doctor says.” For example, the percentage of patients in the United States willing to use the family doctor as a source of “local health care information” declined from 46 percent in 1984 to 21 percent in 1989.87 As for selecting which hospital to attend, more than 50 percent of patients polled in 1989 said that “they or their family have the most influence in selection of a hospital”—as opposed to listening to the doctor—up from 40 percent in previous years.88 (Non-American readers will recall that private American hospitals compete for patients.) According to a Gallup poll in 1989, 26 percent of patients said they respected doctors less now than ten years ago (14 percent said more). And of those who respected doctors less, 26 percent said, “they [the doctors] are in it for the money.” Seventeen percent claimed that doctors “lack rapport and concern.”89

The late twentieth century is writing a new chapter in the history of psychosomatic illness: fixed belief in a given diagnosis. The diagnosis itself may be changeable, based on fashion, but the fixity of belief remains the same, a questing after certainty resulting from the rising influence of the media upon public opinion and the corresponding decline of medical authority.

* Although the term postmodern has been bandied about in a nonspecific way, it does have a specific meaning in the area of family life: the triumph of the desire for individual self-actualization over commitment to the family as an institution.90 This kind of larger commitment, not a commitment to specific individuals but to the ideal of “family,” characterized the modern family of the nineteenth and early twentieth centuries. In the postmodern family, the notion of “relationship” has taken priority over the concept of the family as a building block of society. Indeed since the 1960s the relationship has often supplanted the concept of marriage itself. Sexual relationships involving periods of living together are becoming the antechamber to marriage.91 Adulterous relationships often exist on the side for both partners, and after divorce the partners are spun once again into the world of relationships. So the notion of “relationship” has deeply pervaded the institution of marriage.

The intrinsic logic of the relationship lies in achieving self-actualization, or personal growth, instead of pursuing communitarian objectives. It is this search for psychological fulfillment by the individual partners that gives the postmodern family its remarkable fragility, for once personal growth ceases within marriage, the marriage itself terminates. Thomas Glick, a senior demographer at Arizona State University, wrote in 1987: “The relatively fragile state of American family life at present is undeniable in view of the prospect that close to one-half of the first and second marriages of young adults will end in divorce.”92 Accordingly, instability is becoming the rule rather than the exception.

The keynote of postmodern life is the solitude and sense of precariousness arising from ruptures in intimate relationships. As the average age at marriage rises, the number of young people living alone increases. Divorce further accelerates singlehood. And the social isolation of the elderly has greatly increased.

* What are the consequences of postmodernity for psychosomatic illness? People who are socially isolated tend to have higher rates of somatization in general than those who are not. One scholar concluded, after a review of the literature on health and loneliness, that “loneliness is linked with reported feelings of ill health, somatic distress, and visits to physicians as well as physical disease.”

* By removing “feedback loops,” social isolation intensifies the tendency of individuals to give themselves fixed self-diagnoses. The advantage of living closely with others is that one can test one’s ideas: I’m feeling poorly today. Do I have chronic fatigue syndrome? No, it’s because you slept poorly last night. This is the kind of feedback that occurs routinely when one lives with others. We profit from the collective wisdom about health and illness of our co-residents. These feedback loops cease to function when one lives alone, and function imperfectly when one lives with only one other individual, for one is either cut off from the collective wisdom entirely or has substantially reduced access to it.

The unmarried, divorced, and widowed tend to be easy prey for chic media-spawned diseases because they have few “significant others” with whom they may discuss interpretations of their own internal states. Of fifty patients with chronic fatigue syndrome seen at Toronto Hospital, “most were unmarried women and at least 4 had been divorced.” Their average age was thirty-three, and fully 50 percent had had a major depression before the onset of the fatigue.99 Of eight patients in one study who were “allergic to everything,” four were married, two divorced and two single.100 As for “twentieth-century disease,” psychiatrist Donna Stewart describes a population of young, middle-class female sufferers whose personal lives were in chaos. Of her original eighteen patients reported in 1985, seven were married, eight single, and three divorced.101 Lacking feedback loops, such individuals have only the media against which to test readings of their internal sensations, and the media purvey the most alarmist view possible.

In the nineteenth century the “restricted” Victorian woman gave us an image of the motor hysteria common among women. In the late twentieth century somatization has become the lot of both sexes. Both men and women have been victims of the shattering of the family, and both experience the kinds of pain and fatigue distinctive to our century. It is the lonely and disaffiliated who give us the image of our own times, who are the latter-day equivalent of the hysterical nineteenth-century woman in her hoop skirts and fainting fits. The difference is that, whereas the nineteenth-century woman was virtually smothered by the stifling intimacy of family life, the disaffiliated of the late twentieth century expire in its absence.

The development of psychosomatic symptoms can be a response to too much intimacy or too little. And if our forebears of the “modern” family suffered the former problem, it is we of the postmodern era who endure the latter. The disaffiliated, having lost their faith in scientific medicine and unable to interpret body symptoms in social isolation, seek out alternative forms of cure. The therapies are largely placebos, if not directly harmful to the body as in the case of colonic irrigation—a revival of the outdated practice of curing reflex neurosis by “getting those poisons out of there.” This alternative subculture represents a population that has lost its faith in medical reassurance, that in the absence of folkloric family wisdom seeks its knowledge of the body from the media, and that has taken the full blow of the “relationship” stresses of postmodern life. It is a generation that did not invent psychosomatic illness, but finds itself singularly vulnerable to pain and fatigue that have no physical cause.
