Fall from Grace is a candid, personal history of an academic physician and biotechnology executive that reflects on medicine as it was in the mid-twentieth century and chronicles the changes in society and medicine during the second half of that century. The book investigates the social revolution of those times; the scientific and technological advances that occurred; the influence of the computer and the digital revolution; the entry of corporate management into health care; and the effects of the profit motive on the care of patients. All of these have had enormous influence on the role of the physician in health care. The inadequacies, over the years, of the fee-for-service system and the consequent governmental involvement in reimbursement systems are discussed and compared with other health care payment systems around the world. The net effect of these various forces has been to benefit patients through greatly improved technology yet has caused medicine to evolve from an art form focused on personal care to a more technical exercise largely controlled by fiscal considerations. These changes also refashioned the role of the physician from healer and counselor into manager of an impersonal health care team. The book provides a view of the current state of medicine, patients, and physicians and a perspective on the future.
Fall From Grace
A Physician's Retrospective on the Past Fifty Years of Medicine and the Impact of Social Change
By J. Joseph Marr
iUniverse. Copyright © 2015 J. Joseph Marr, MD.
All rights reserved.
THE WAY WE WERE
ENTERING AN ERA AT ITS CLOSE
The changes in medicine and its practice during the past half century are well beyond anything that would have been imagined at the halfway mark of that century. The situation and the role of the physician have altered considerably. The status then enjoyed by the physician has been eclipsed by technical progress in medicine. In addition, there are many interesting and compelling scientific and technological careers available that did not exist in midcentury. They have served as magnets for younger people with high aspirations. These are the types of people who would have entered into medicine when it was the acme of careers. They have within them varying mixtures of the entrepreneur, the curious, the intellectual, the scientist, the humanist, and a touch of the pragmatist.
The science and technology that developed over these past fifty years have made it possible to have careers in information technology, astrophysics, space exploration, undersea exploration, and new sources of energy, among others. The technology of war even played a role, and an important one. Paramount in all this was the computer. It changed society and then the world, and, as it changed the world, it changed medicine. But it changed it for the better. It brought diagnostic power undreamed of and made efficient an inefficient process. It brought the recognition that medicine, unbeknownst to itself, controlled a large segment of the economy. This was a phenomenon that occurred almost by accident: As more people were born—and kept alive by sanitation and vaccination—more medicine was needed to care for them in their adult years. The costs became significant, and many people could not afford them. Physicians engrossed in the practice of a consuming art—the Magnificent Obsession of movie fame and many novels—noted but did not assimilate the societal reaction to what physicians saw only as improved methods to care for people. Soon enough, business recognized it and began to organize and manage medical care and, relatively quickly, began to profit handsomely from it.
Those of us born in the late thirties or very early forties entered medical school in the later fifties or early sixties. It was a time that I have heard described as a "Golden Age of Medicine." In surveys taken at the time, physicians were ranked second only to Supreme Court justices in public esteem. A golden age, of course, is relative to the observer. Physicians were at the top of a revered profession dedicated to the care of others, and they were almost solely responsible for the management and delivery of that care. The fact that care was very unevenly distributed and closely related to ability to pay was not a consideration. The physician wore the garb of a priest and seer; his opinions were respected, given great credence, and sought in areas outside of medicine. He was the educated person, in the broad, liberal arts sense of the term. In addition, he knew a certain amount of science, and he knew the workings of the human body and psyche as well. He was a shaman at what would be the end of the age of shamans. There is some hyperbole here, used to crystallize the image, but not too much. It was like that. Younger people thinking about careers aspired to enter medicine for reasons intellectual, altruistic, compassionate, and aspirational. The career provided a good living, but although that may have been subsumed in the career choice, it was not a driving force. The concept of helping one's fellow man had not yet become a cliché, and "My son the doctor" was a humorous descriptor but still one that many parents wished to be able to use.
Consider that medical care at midcentury truly was not too far from the time when infectious diseases ravaged populations and certain age groups—the very young and the very old. With the exception of some specific vaccinations and early antibiotic research, there had been very few real advances in medical care and therapeutics since the time of Galen. There were very real benefits to health from better sanitation, improved nutrition, and less crowded living conditions, and life expectancy was beginning to increase. But the nineteen centuries leading to our own twentieth all were about the same in terms of medical therapeutic results. There had been incremental gains in the understanding of some physiological processes and certainly a growing appreciation of the importance of public health, but the upstroke in therapeutics and disease prevention really would not take place until just before the mid-twentieth century. That was not very long ago.
Consider that we have had societies of some sophistication for about the past five thousand years; we are talking about just 1 percent of that time span. Perhaps it should not be a surprise that medical care has undergone such a metamorphosis in the past half century. What we call modern medicine moved into its adolescence at midcentury, and we know how quickly adolescents change.
There were three advances in medicine that were present at the beginning of the twentieth century and marked an inflection point that separated the century from the fifty centuries that had gone before. These were: vaccination, the beginnings of what was termed the "germ theory," and the importance of sanitation and public health in the control of disease. A second inflection point came about midcentury with the expansion of biochemistry and immunology that brought modern science into medicine. The emergence of molecular biology and the rapid expansion of technology and computers midway in the second half of the century were a third inflection point, and it is that upstroke that we are riding yet today. Note that the last two of these changes in the sophistication of medicine and medical care occurred in or around the mid-twentieth century and largely over a span of about forty years.
These changes in medicine and medical science did not develop in isolation; they paralleled advances in other areas. The nineteenth century had brought electricity and power grids, steel, the telephone, the telegraph, the automobile, and, as early as midcentury, the first oil well. The Industrial Revolution, building upon all of these, brought the enormous expansion of the railroads and steamships. The advent of the railroad was the first real change in locomotion since the time of the Romans. Think about that! The momentum of new ideas and the savoir faire to bring them into commercial reality provided an enthusiastic start to the twentieth century. Much of this was on view at the Chicago World's Fair (the Columbian Exposition) of 1893, which both summarized the astounding scientific and technical advances of the nineteenth century and presaged what might be anticipated from the twentieth. Despite the heroic efforts of its organizers and planners, it did not come even close to what the twentieth century would bring us.
We entered that century with an automobile that was functional but primitive, the zeppelin, and the beginnings of radio. Within ten years, the Wright brothers had made their flight; the helicopter made its first flight; Einstein published the Theory of Relativity; synthetic plastic, the gyrocompass, and the first sonar appeared; and William Kellogg invented Corn Flakes. At the end of the century, transcontinental flight was routine; we had walked on the moon; space travel had become commonplace; the Internet and the World Wide Web were in use internationally; the home computer had become a standard household implement; the digital cell phone had been invented; digital images had replaced film; and, as a capstone, we had invented Viagra.
Vaccinations and Germs
It was as recent as 1885 that Louis Pasteur, a polymath chemist, demonstrated to a skeptical world that rabies could be prevented by a vaccine. He was not the first to demonstrate the benefits of immunization. Smallpox immunization (variolation) was introduced into Great Britain in 1721 by Lady Mary Wortley Montagu, the wife of the British ambassador to Turkey. That technique, which used dried material from smallpox scabs, had been practiced in the Middle East and China from the 1100s. Lady Mary Wortley Montagu saw its value and, against great opposition, introduced it to Great Britain. Edward Jenner, a physician in Great Britain in the late 1700s, saw the clinical similarities between cowpox and smallpox and noted that milkmaids acquired cowpox in the course of their work but did not get smallpox. Accordingly, he applied the principle of variolation (from varius, meaning spotted) using cowpox virus—obtained from a cow named Blossom, hence the name "vaccination" (from the Latin, vaccinus, of cows)—to cross-immunize against smallpox. In 1796, he demonstrated that immunization with cowpox protected against a challenge with smallpox. As a result, Jenner's cowpox material replaced attenuated smallpox virus for purposes of immunization. He probably was not the first to note this inverse association of cowpox and smallpox but undoubtedly was the person who brought it into the mainstream practice of medicine.
Pasteur was well aware of all this; he already had done experiments to demonstrate the value of immunization for chicken cholera and anthrax. However, the dramatic success of an experimental vaccine in Joseph Meister, a nine-year-old boy who had been bitten by a rabid dog, made Pasteur a hero. It also made vaccination a major tool in the medical armamentarium and catalyzed vaccine research not only in his eponymous Pasteur Institute but also in pharmaceutical companies that developed in the years that followed. This demonstration took place only sixty-five years before the midcentury we are concerned with here. Childhood immunizations were few at midcentury, and most of us simply acquired the diseases themselves. However, the technology and science of immunization were developing rapidly and would produce successful vaccines for polio in the middle fifties and a succession of vaccines for childhood diseases in the ensuing half century. This scientific base of biological and biochemical knowledge as applied to medicine was emblematic of other fields of scientific endeavor that would fuse the world of science to the world of medicine. The best known of these, at that time and for several decades thereafter, was the field of antibiotics.
It was Pasteur, that amazing investigator, who indirectly brought about the field of antibiotic research. He did not do any of it himself, but he created the scientific milieu that brought it about. Around 1856, he had been invited to investigate the problem of spoiled fermentation of beer and suggested, on the basis of good evidence, that microbes caused this problem. He was ridiculed by various experts, since evidence then, as now, carried little weight compared to personal conviction. In 1865, having studied beer, wine, and milk, he investigated a disease that killed silkworms and showed that it was due to a microorganism. Thereafter, he showed that chicken cholera was an infectious disease and that it could be prevented with a vaccine. Later, he did the same with anthrax.
The knowledge that diseases could be caused by microorganisms led to exploration of this issue, and medical personnel, pragmatic as they are, thought immediately of how these organisms could be eliminated. It was Alexander Fleming's observation in 1928 that a mold could kill certain bacteria that led ultimately to the discovery of penicillin. This discovery was not capitalized upon until about ten years later by Howard Florey and his research group and was first used in humans in 1941. Further work, catalyzed by the demand due to World War II, made the drug available in limited quantities by 1943. That year was only nine years before the mid-twentieth century.
The only other antimicrobial available during this time was sulfanilamide, the progenitor of many sulfa drugs, which was discovered in 1932 and was in general use in the late 1930s. This first commercially successful antimicrobial was created by Bayer in Germany and sprinkled as "sulfa powder" on an untold number of wounds during World War II. It also was a drug that would cross the "blood-brain barrier," an anatomic and physiologic barrier that prevents many compounds from crossing from the bloodstream into the brain, and could treat some forms of bacterial meningitis, a disease that was otherwise untreatable and usually fatal. This was only about twelve years before the mid-twentieth century.
The appearance of antibiotics made it clear that chemistry could create compounds or improve on those found by screening methods. This, in turn, ushered in what would be the enormous academic and commercial emphasis on what was cheekily termed "rational chemotherapy." This was research focused on finding some biochemical reaction or series of reactions important to a microorganism and inhibiting the process by means of a chemical that mimicked one of the components in the biochemical sequence. Infectious diseases were a main focus for several decades for two major reasons: they were endemic or epidemic in the human population and caused significant morbidity and mortality, and they were caused by microorganisms that had some metabolic sequences that differed from those of humans. Thus, inhibition of these reactions theoretically could be done without harm to the human host. This is somewhat simplistic, because many metabolic pathways are similar—we all are products of the same evolutionary process—and toxic reactions can occur. Moreover, the human body sometimes perceives these chemicals as "foreign" and reacts immunologically against them. When this happens, one becomes allergic to a compound, and it becomes functionally useless even though it remains biochemically potent. Nevertheless, the antibiotics were perceived as "magic bullets" and served as the Holy Grail of most therapeutic agents that were to be developed later for other fields—that is, a specific action without significant side effects. This dream would prove to be elusive, if not unattainable, since it is one thing to inhibit biochemistry in a microorganism, evolutionarily distant, and quite another to inhibit a reaction in a human and hope that there was not another reaction similar to it in the same person—a fundamentally illogical hope. At that time—and to a large degree it remains true now—infections were the only diseases that could be cured by medical means.
Even now, most diseases are palliated, not cured. Surgical cures are something else.
At midcentury the entry of science into medicine was in its very early stages. Our understanding of the physiology and biochemistry of humans was rudimentary. We saw the promise yet had no idea of the enormous academic and industrial operations and organizations that would arise to move medicine rapidly, and almost precipitously, into the diagnostic and therapeutic golden age of the late twentieth century. The best-known example was the pharmaceutical industry or, in some respects, the academic–industrial complex. This industry was unique in its ability to focus basic science on therapeutic problems and then develop the resulting compounds for medical use. It was frowned upon by many academic scientists and physicians as being outside the academy and done for profit, but history shows clearly that it was these companies that produced many effective antimicrobials and carried both science and medicine forward in the process. The adverse effects of the profit motive would not be felt until sometime later. It was the welcomed entrance of commerce into medicine.
Public Health and Sanitation
The US Public Health Service was given its impetus in 1798 when President John Adams signed into law the Act for the Relief of Sick and Disabled Seamen. The next year, Congress expanded this to include all officers and sailors in the US Navy. The Marine Hospital Service, as it was called then, spent the next century engaged in public health, as it applied to oceans, lakes, and waterways, under the direction of a supervising surgeon headquartered in Washington, DC. This position became the surgeon general, and in 1912 the Public Health and Marine Hospital Service was renamed the Public Health Service (PHS), and its powers were broadened to include investigations of human diseases of many types (essentially all infectious), sanitation, water supplies, and sewage disposal. From 1930 to 1944, during the Roosevelt administration, the PHS expanded to include engineers, dentists, scientists, nurses, and physicians. Thus, as the country entered the 1950s, it had in place a government medical organization that concerned itself with the many aspects of public health. It was not an organization that was known by most citizens, nor did practicing physicians pay much attention. However, its very existence made the statement that the government had a responsibility to care for its people, and that responsibility was manifest in the PHS. This organization played an enormous role in the prevention of disease through sanitation, maintenance of clean water supplies, quarantine (when necessary), and immunization. It is difficult to overstate the importance of the PHS in the progressive development and maintenance of the public health in this country. So, at midcentury, we had a large governmental organization that practiced medicine, both therapeutic and preventive, alongside the private practice community.
Table of Contents
THE WAY WE WERE
EXPANSION OF THE MEDICAL CARE SYSTEM
THE ADVENT AND DOMINANCE OF TECHNOLOGY
ADMINISTRATION OF MEDICINE
FOR-PROFIT AND NONPROFIT
DEFENSIVE MEDICINE
HUBRIS IN MEDICINE
THE REMAINS OF THE DAY
CODA AND PERSPECTIVE