Wednesday, February 16, 2011
It is our pleasure to introduce to this blog guest columnist Jan Henderson, PhD. She is a historian of science and medicine who writes about the history of the medical profession as well as changing attitudes towards health care on her blog, The Health Culture. The following is the first of a two part series.
History of Patient Modesty - Part 1: How Bodily Exposure Went from Unacceptable to Required
Guest post by Jan Henderson, PhD
Even doctors can be embarrassed when it comes time to expose their private parts to medical personnel. In an essay that appeared in The Journal of the American Medical Association, a doctor describes her discomfort as she arrives for a colonoscopy appointment.
[A]s a person not exactly looking forward to the morning’s adventure, I found the receptionist’s demeanor and lack of eye contact wrapped me tight within a cold, impersonal cocoon. I was a subject. Though I hadn’t shared my sentiments with anyone, I felt both vulnerable and completely sheepish about having a very human reaction to such a common procedure. But this was my bottom and I was not happy to share it with others. Here to be exposed and invaded, in truth I was embarrassed and sought compassion. As anyone else would, I wanted to know that my discomfort, self-consciousness, and loss of control were understood. Instead, she exuded efficiency and delivered transparent quality assurance and poise.
The need to reveal private and intimate parts of our bodies is a routine occurrence in medical practice today. Though it may offend our modesty, we take it for granted that the embarrassing moments of a colonoscopy, a Pap test, or a prostate exam are necessary for our health.
Has it always been so? Have doctors always expected patients to disrobe? Have young male technicians always exposed the chests of female patients in need of a routine EKG? Have patients always been willing to allow doctors and their staff to view parts of the body normally seen by only the most intimate of partners?
The answer is a resounding no. Exposing the body for medical purposes is a relatively recent development. It began in the 19th century, before anyone now alive can remember. Prior to that time — for thousands of years — doctors considered it socially unacceptable and morally improper to examine an unclothed patient, especially a woman (the doctors at the time were all men). Over a period of just decades, however, doctors began to place stethoscopes on ladies’ bosoms and use visual scopes to examine the bladder, rectum, and vagina.
This was a significant change, both in the practice of medicine and in the experience of patients. How and why did this change come about? Part of the explanation comes from a change in the medical understanding of disease. A contributing factor was the erosion of a sharp distinction between physicians and surgeons. In what follows I give a brief account of why the practice of medicine changed and – in part 2 of this post — how the medical profession sought to convince patients to accept the change.
When physicians listened to patients
The practice of Western medicine, from the time of the ancient Greeks and Romans to the early 19th century, was based on the humoral theory of health and illness. The theory asserted that the interior of the body was filled with four humors or fluids: blood, yellow bile, black bile, and phlegm. When the humors were in balance – in a stable equilibrium – the individual was healthy. When out of balance, the patient became ill. This may seem quaint to us today, but note that this theory of internal balance lives on in traditional Chinese medicine and continues to inform the contemporary practice of acupuncture.
According to humoral theory, each individual could fall out of balance in a unique way, depending on the history and current circumstances of his or her life. Physicians might group illnesses into broad categories, such as fevers, fluxes (dysentery), or dropsies (edema), but the idea that many people could have the same disease (appendicitis, cirrhosis, diabetes) – though proposed in the 17th century — was not accepted until the 19th. In effect, there were as many “diseases” as there were patients.
To diagnose an illness, therefore, the physician needed to listen carefully to the patient’s account of sensations, symptoms, and life events. The patient’s narrative was considered much more important and revealing than any signs or symptoms a physician might observe. If the diagnosis was in doubt, the patient’s account took precedence over the physician’s observations.
The physical exam prior to modern medicine
In the era of humoral medicine, physicians practiced four methods of diagnosis, none of which required observing the unclothed body or touching the patient on a part of the body that was normally unexposed. The first, and most important, was eliciting the patient’s account of his or her own history.
The second was observation of the patient’s appearance, with special attention to the eyes and the face. This might include a look inside the mouth, including the tongue. Physicians would note the skin color and any peculiar behavior, listen to a cough or a wheeze, and note the smell of putrefaction, if present.
Occasionally a physician would feel an exposed part of the body for heat. Thermometers had been available since the 17th century, but the evidence they provided was not valued. A patient’s temperature did not always correspond to a subjective sense of warmth — a patient could have a fever, but feel chilled. And the patient’s account was paramount.
The third method was to feel the pulse at the wrist. Physicians did not count the number of beats in an interval of time. They listened for the quality of the pulse – how the pulse hit the fingers or varied over time. Again, this is similar to the practice of traditional Chinese medicine today.
The fourth method was to perform a visual inspection of various bodily excretions, such as urine, feces, sputum, pus, vomit, or blood.
How did such a physical exam lead to treatment of an illness? In a word, it did not. Based on experience, physicians were often able to offer a diagnosis – too much blood, too much bile – and a prognosis – a quick recovery or an imminent decline. Patients considered the prognosis valuable, since it was useful to know how long one might expect to be incapacitated.
The few treatments available – primarily bloodletting and purging – probably did more harm than good. This period is called the era of “heroic” medicine: Those who survived the treatment were heroes. For good reason, patients typically consulted physicians only when an illness seemed life threatening.
So, one reason physicians were highly respectful of patient modesty up until the 19th century was that the prevailing theory of disease did not require the patient to disrobe. Another part of the explanation involves the status of physicians in society and their social superiority to surgeons.
When surgeons got no respect
In ancient Rome and Greece, the theories of the physician and the practical skills of the surgeon were combined in one practitioner. Starting in the Middle Ages, however, when the lost writings of antiquity were rediscovered, a division occurred. Physicians acquired their medical training in universities, whereas surgeons learned their skills by serving an apprenticeship. Physicians, whose studies required proficiency in Latin, were highly regarded for their book learning and mental acumen. Surgeons, on the other hand, worked with their hands, an activity beneath the dignity of the gentleman physician.
Physicians and surgeons each treated a different class of patients. Physicians preferred members of the well-to-do upper classes. Surgeons attended to those who couldn’t afford the more expensive physicians. Each practiced medicine in a manner appropriate to their social standing and to the social standing of their patients. Physicians used their minds to theorize. Surgeons used their hands to lance boils. It would have been totally improper for a physician to ask a lady to remove her garments.
Surgeons, of necessity, did observe unclothed parts of the body. Before anesthesia and asepsis, surgery was of course quite limited. But surgeons operated on hernias, bladder stones, and anal fistulas. Even surgeons, however, were obliged to honor the wishes of a patient who was unwilling to submit to direct visual inspection or a manual exam.
In an 18th century account, a surgeon describes his treatment of a female patient. It took eight days before the patient revealed she had a tumor in her groin. “She would not allow me to see it, but told me it was as big as a small hen’s egg, and by gentle pressure of the hand receded, and never gave her any pain.” It took another four days — and then only because there was increasing pain — before the surgeon “prevailed upon her to let me see it.”
The idea that transformed medicine
In certain European countries – northern Italy, the Netherlands – the sharp distinction between physicians and surgeons began to break down in the 18th century. These new, modern doctors were willing both to theorize and to perform autopsies.
Human autopsies had been done as early as 1600. The ancient Greek understanding of human anatomy was based on animals, so human autopsies greatly improved anatomical knowledge. The most significant contribution of autopsies in the 18th century – the one that led to modern scientific medicine – was not anatomical knowledge, however. It was that physician/surgeons began to correlate the patient’s symptoms before death with what an autopsy revealed when the patient died.
The idea that a disease might be associated with a specific location in the body – in an organ or in localized tissues – had been proposed by Giovanni Battista Morgagni in his book The seats and causes of diseases investigated by anatomy, published in 1761. The “seats” in the title refer to bodily locations. Morgagni’s assertion — that internal lesions were located at specific bodily sites – was accepted only gradually over the next 100 years. This turned out to be the idea that transformed medicine from the humoral theory to the scientific medicine we know today.
Patient modesty inspires the invention of the stethoscope
Once medicine subscribed to this new anatomical approach to disease, the question became: How can we determine what’s happening inside the body by examining the outside? The attempt to answer this question prompted the invention of techniques such as percussion (tapping), auscultation (listening), and succussion (shaking the body and listening for a splash). It also led to the invention of diagnostic instruments. One of these instruments was the stethoscope, and its invention was prompted by the need to accommodate patient modesty.
In 1816, a French doctor, Rene Laënnec, was consulted by a young woman suffering from heart disease. Laënnec first tried to use percussion – tapping on the chest with the fingers – to gain information about the internal organs. This was not satisfactory, however, partly because the patient was female and partly because she was obese, which interfered with the production of meaningful sounds.
The doctor next considered an ancient technique – one that goes back to Hippocrates – that was then making a comeback: auscultation. By placing an ear on the chest, one could listen to the sounds of the heart. In his account of the stethoscope’s discovery, Laënnec writes that he found this technique “inadmissible” because the patient was a young woman.
Then he had an inspiration. He was aware that sound can travel through a solid body, such as a piece of wood. If you scratch one end, you can hear the sound at the other end. Spying a square of paper lying nearby, he rolled it into a cylinder, placed one end on the woman’s chest near her heart, and placed the other end at his ear. He was “not a little surprised and pleased” with how clearly and distinctly he was able to hear the sounds of the heart.
An initial rift in the doctor/patient relationship
This first primitive stethoscope underwent a number of improvements over the ensuing years. It was another 20 years, however, before it was generally accepted by the medical profession. By allowing a respectable distance between doctor and patient, the stethoscope was able to overcome prevailing social conventions of modesty – at least with regard to listening to sounds inside the body.
One early stethoscope was several feet long and allowed the doctor to stand in a separate room. Most patients did not require such extremes. In 1829 a practitioner wrote of the flexible stethoscope – which allowed a greater distance than the original rigid instruments — that it could be used with ladies “in the highest ranks of society without offending fastidious delicacy.”
The stethoscope ushered in other hands-on diagnostic techniques. Percussion, for example, had been described and recommended to physicians more than 50 years before the invention of the stethoscope. It became an acceptable practice, however, only after the stethoscope came into common use.
Other instruments for examining the body were developed in short order, many of them much more of an infringement on patient modesty than holding a stethoscope to the chest. Doctors were soon using scopes and specula to examine the bladder, vagina, and rectum.
The introduction and acceptance of the stethoscope was a major landmark in the history of medicine. This was not simply because of the information it provided — that was available by placing an ear on the chest. The stethoscope initiated a shift in the power relationship between doctor and patient.
No longer was the patient’s account of symptoms of primary importance. Doctors were increasingly able to diagnose an illness without any input from the patient. They became much more independent of their patients when it came to formulating a diagnosis. Medical professionals began to adopt a more self-reliant view of their abilities. This distance between doctor and patient became a salient characteristic of modern medicine.
Continued in part two.
Posted by Doug Capra at 6:57 PM