Do you like internal combustion engines?
Thank a few white men. (Jean Lenoir, Nikolaus Otto, Karl Benz, Rudolf Diesel, Gottlieb Daimler, Emil Jellinek, Henry Ford among others.)
Are you a fan of flush toilets and indoor plumbing?
Thank white males Alexander Cumming, Thomas Twyford, and Isaiah Rogers.
Toilet paper?
Thank Joseph Gayetty, W.M.
How about washing machines and dryers?
Thank white males Alva Fisher and J. Ross Moore.
“When you’ve got your health, you’ve got just about everything” ran the tag-line in a famous Geritol commercial from the 1970s, and the guys we most have reason to be grateful for are undoubtedly those who’ve developed the medical practices and the drugs and devices that have transformed our lives over the past hundred fifty years.
Before the turkey gets carved, it’s worth taking a moment to remember a few of these brilliant, persistent, and lucky men, and recall their accomplishments. Even when they’ve won Nobel Prizes in Medicine, their names are virtually unknown. They’re not mentioned in the Core Curriculum or celebrated by Google on their birthdays.
Pain
If you ever had surgery, did you opt for anesthesia?
If so, thank a few more white males, beginning with William Clarke in New York and Crawford Long in Georgia, who both used ether in minor surgeries in 1842. A paper published four years later, after William Morton's own work in Boston, spread the word. Chloroform came into use during the next decade. There are now scores of general and regional anesthetics, sedatives, and muscle relaxants, administered in tandem. The first local anesthetic has also been superseded. It was cocaine, pioneered by a Viennese ophthalmologist, Carl Koller, in 1884.
Ever take an analgesic?
Next time you pop an aspirin, remember Felix Hoffmann of Bayer. In 1897, he converted salicylic acid to acetylsalicylic acid, much easier on the stomach. Aspirin remains the most popular and arguably the most effective drug on the market. In 1948 two New York biochemists, Bernard Brodie and Julius Axelrod, documented the effect that acetaminophen (Tylenol), synthesized by Harmon Morse in 1878, had on pain and fever. Gastroenterologist James Roth persuaded McNeil Labs to market the analgesic in 1953.
Infectious Diseases
Most Americans today die of heart disease or cancer, but before the twentieth century it was infectious diseases that struck people down, and children were the primary victims. In pre-industrial England, which still had the most developed economy in the world in the late 17th century, half of all children didn't survive to the age of 15. With the phenomenal growth of cities during the 19th century, cholera, typhoid fever, and tuberculosis became the leading killers.
In 1854, a London medical inspector, John Snow, proved that a cholera epidemic in Soho was caused by infected sewage seeping into the water supply. Until then it was thought the disease spread through the air. The sanitary disposal of sewage and the provision of clean water, possible thanks to mostly anonymous metallurgists and engineers — an exception is the famous Thomas Crapper, who pioneered the u-shaped trap and improved, though he didn't invent, the flush toilet — have saved more lives than any drug or surgical innovation.
Dramatic improvements in food supply have also had an incalculable effect on health. Agricultural innovations, beginning with those introduced in England in the 18th century, were disseminated globally by the end of the 20th century — the “Green Revolution.” Famines struck Europe as recently as the late 1860s. (The man-made famines of the 20th century are another story.) A transportation revolution made possible the provision of more than sufficient protein, calories, and nutrients worldwide. Needless to say, it was white males who designed and built the roads, canals, railroads, and ports and airports, and the ships, trains, planes, and trucks that used them, and the mines, and then wells, pipelines, and tankers that supplied the fuel they ran on.
Whatever the merits of taking vitamins and supplements today, no one has to take vitamin C to prevent scurvy, or vitamin B to prevent pellagra, or vitamin D and calcium to prevent rickets. And, for the time being, we all live in a post-Malthusian world. The global population was about 800 million to 1 billion when the gloomy parson wrote his famous book in 1798. It’s now over 7 billion.
***
Dr. Snow had no idea what was actually causing cholera. It was Louis Pasteur who gave the world the germ theory of disease, as every schoolchild once knew. Studying the fermentation of wine, he concluded that it was caused by the metabolic activity of microorganisms, as was the souring of milk. The critters were responsible for disease, too, he recognized, and he identified three killer bacteria: staphylococcus, streptococcus, and pneumococcus. Nasty microorganisms could be killed or rendered harmless by heat and oxygenation, Pasteur discovered, and, once weakened, would prevent the disease in those who were inoculated with them. He went on to develop vaccines for chicken cholera, anthrax, and rabies. Edward Jenner had demonstrated in the late 1790s that the dreaded smallpox could be prevented by injecting patients with material from the pustules of cowpox victims, a much milder disease. (The word vaccine comes from vacca, Latin for cow.) Pasteur, however, was the first to immunize patients by modifying bacteria rather than through cross-vaccination.
A parade of vaccines followed. People in their mid-60s and older can remember two of the most famous: the Salk and Sabin vaccines against poliomyelitis, a paralyzing disease that had panicked American parents in the late ‘40s and early ‘50s. Children preferred Albert Sabin’s 1962 version: the attenuated virus was administered on a sugar cube. Jonas Salk’s inactivated vaccine, available in 1955, was injected.
In 1847, more than a decade before Pasteur disclosed his germ theory, the Viennese obstetrician Ignaz Semmelweis documented the effectiveness of hand washing with chlorinated water before entering a maternity ward. He brought mortality rates from puerperal fever down from 8% to 1.3%. Two decades later, having read a paper by Pasteur, Joseph Lister demonstrated the effectiveness of carbolic acid to sterilize wounds and surgical instruments. Mortality rates fell from around 50% to about 15%. The efforts of both men, especially Semmelweis, were met with ridicule and disdain.
Pasteur's German rivals Robert Koch and Paul Ehrlich made monumental contributions to biochemistry, bacteriology, and hematology, but left the world no "magic bullet" (Ehrlich's term). Koch identified the organism causing tuberculosis, the leading killer of the 19th century, but his attempts at finding a vaccine failed. His purified protein derivative from the bacteria, tuberculin, could be used to diagnose the disease, however. It was two French researchers, Albert Calmette and Camille Guérin, who developed a successful vaccine, first administered in 1921, though it was not widely used until after World War II.
Ehrlich joined the search for antibacterial drugs that were not denatured bacteria or viruses. He synthesized neoarsphenamine (Neo-Salvarsan), effective against syphilis, a scourge since the late 15th century, though it had toxic side effects. It was not until the 1930s that the first generation of antibiotics appeared. These were the sulfa drugs, derived from dyes containing sulfur-nitrogen (sulfonamide) groups. The first was a red dye synthesized by Joseph Klarer and Fritz Mietzsch. In 1935, Gerhard Domagk at I. G. Farben demonstrated its effectiveness in cases of blood poisoning.
The anti-bacterial properties of Penicillium had already been discovered at this point by Alexander Fleming. The Scottish bacteriologist had famously left a window open in his lab when he went on vacation in 1928, and returned to find that a mold had destroyed the staphylococcus colony in one of his petri dishes. But it's one thing to make a fortuitous discovery and another to cultivate and purify a promising organic compound and conduct persuasive trials. This was not done until 1941. Thank Oxford researchers Howard Florey and Ernst Chain. A Pfizer chemist, Jasper Kane, figured out how to mass-produce penicillin, and by 1943 it was available to American troops. The wonder drug of the 20th century, penicillin killed the bacteria that caused meningitis, diphtheria, rheumatic fever, tonsillitis, syphilis, and gonorrhea. New generations of antibiotics followed, as bacteria rapidly developed resistance: among them, streptomycin in 1943 (thank Selman Waksman), tetracycline in 1955 (thank Lloyd Conover), and, the most widely prescribed today, amoxicillin.
Diagnostic technologies
Microscope: While the Delft draper Antonie van Leeuwenhoek didn't invent the microscope, the single-lens instruments he began making in the 1660s, with tiny lenses of far greater curvature and magnifying power, made him the first person to see and describe blood corpuscles, bacteria, protozoa, and sperm.
Electron microscope: Physicist Ernst Ruska and electrical engineer Max Knoll constructed the prototype in Berlin in 1933, building on Hans Busch's work on magnetic lenses. Eventually, electron microscopes would be designed with two-million power magnification. Leeuwenhoek's had about two hundred.
Stethoscope: Thank the French physician René Laennec, who introduced the instrument in 1816. British nephrologist Golding Bird substituted a flexible tube for Laennec's wooden cylinder in 1840, and the Irish physician Arthur Leared added a second earpiece in 1851. Notable improvements were made by Americans Howard Sprague, a cardiologist, and electrical engineer Maurice Rappaport in the 1940s (a double-sided head), and by Harvard cardiologist David Littmann in the early 1960s (enhancing the acoustics). The device undoubtedly transformed medicine, and with good reason became the symbol of the health care professional.
Sphygmograph: The first machine to measure blood pressure was created by a German physiologist, Karl von Vierordt in 1854.
X-rays: Discovered by Wilhelm Röntgen at Würzburg in 1895, this was probably the single most important diagnostic breakthrough in medical history. Before Röntgen noticed that invisible rays coming from a cathode tube traveled through objects and created images on a fluorescent screen, physicians could only listen, palpate, examine stools, and taste urine.
PET scans: James Robertson designed the first machine in 1961, based on the work of a number of American men at Penn, Wash U., and Mass General. The scanner provides an image from the positron emissions coming from a radioactive isotope injected into the patient, and is particularly useful for mapping activity in the brain.
CAT scans: The first model was developed by electrical engineer Godfrey Hounsfield in London in 1972, drawing on the work of South African physicist Allan Cormack in the mid-1960s. It generates three-dimensional and cross-sectional images by using a computer to reconstruct X-ray measurements taken from many angles.
MRI: Raymond Damadian, a SUNY professor of medicine with a degree in math, performed the first full-body scan in 1977. His design was anticipated by theoretical work by Felix Bloch and Edward Purcell in the 1940s, and, later, Paul Lauterbur. MRIs map the radio signals given off by hydrogen nuclei in a strong magnetic field, and are particularly useful in imaging soft tissue — and without exposing the patient to ionizing radiation.
Ultrasound: Ian Donald, a Glasgow obstetrician, in the mid-1950s adopted a device already used in industry that generated inaudible, high frequency sound waves. The machine quickly and cheaply displays images of soft tissue, and now provides most American parents with the first photo of their baby.
Endoscopes: Georg Wolf produced the first flexible gastroscope in Berlin in 1911, and this was improved by Karl Storz in the late '40s. The first fiber optic endoscope was introduced in 1957 by Basil Hirschowitz, a South African gastroenterologist, drawing on the work of British physicist Harold Hopkins. The scope is indispensable in diagnosing GI abnormalities.
Angiogram: Werner Forssmann performed the first cardiac catheterization — on himself — in Eberswalde in 1929. He inserted a catheter into his lower left arm, walked downstairs to a fluoroscope, threaded the catheter to his right atrium, and injected a radiopaque dye. The technique was further developed by Dickinson Richards and André Cournand at Columbia in the '40s, and then extended to coronary arteries, initially accidentally, by Mason Sones at the Cleveland Clinic in 1958.
X-rays and scopes were quickly used in treatment as well as diagnosis. Röntgen himself used his machines to burn off warts. Similarly, in 1964, Charles Dotter and Melvin Judkins used a catheter to open a blocked artery, improving the technique in 1967. Andreas Gruentzig then introduced balloon angioplasty in 1975, an inflated balloon opening the narrowed or blocked artery. In 1986, Jacques Puel implanted the first coronary stent at U. of Toulouse, and soon afterwards a Swiss cardiologist, Ulrich Sigwart, developed the first drug-eluting stent.
***
The men who developed five of the most dramatically effective and widely used drugs in internal medicine deserve mention.
In the late '30s, two Mayo Clinic researchers hoping to cure rheumatoid arthritis, Philip Hench and Edward Kendall, isolated four steroids extracted from the cortex of the adrenal gland atop the kidneys. The fourth, "E," was very difficult to synthesize, but Merck chemist Lewis Sarett succeeded, and in 1948 the hormone was injected into fourteen patients crippled by arthritis. Cortisone relieved the symptoms. Mass-produced, with much difficulty, by Upjohn chemists in 1952, it was refined by their rivals at Schering three years later into a compound five times as strong, prednisone. In addition to arthritis, corticosteroids are used in the treatment of other inflammatory diseases, like colitis and Crohn's, and in dermatitis, asthma, hepatitis, and lupus.
Anyone over fifty can remember peptic ulcers, extremely painful lesions on the stomach wall or duodenum. They were thought to be brought on by stress. “You’re giving me an ulcer!” was a common expression. Women were especially affected, and a bland diet was the only treatment, other than surgery. The lesions were caused by gastric acid, and two British pharmacologists and a biochemist, George Paget, James Black, and William Duncan, investigated compounds that would block the stomach’s histamine receptors, reducing the secretion of acid. There were endless difficulties. Over 200 compounds were synthesized, and the most promising, metiamide, proved toxic. Tweaking the molecule, replacing a sulfur atom with two nitrogen atoms, yielded cimetidine in 1976. As Tagamet, it revolutionized gastroenterology. It was also the first drug to generate over $1 billion in annual sales. Its successors, the proton pump inhibitors Prilosec and its near-twin Nexium, more than doubling the acid reduction, have also been blockbuster drugs.
Cimetidine was the culmination of one line of research that began in 1910, when a London physiologist, Henry Dale, isolated a uterine stimulant he called "histamine." Unfortunately, when it was given to patients, it caused something like anaphylactic shock. The search began for an "antagonist" that would block its production, even before it was recognized as the culprit in hay fever (allergic rhinitis). The most successful antagonist was one developed in 1943 by a young chemist in Cincinnati, George Rieveschl: diphenhydramine, marketed as Benadryl. Ten to thirty percent of the world's population suffers from seasonal allergies, so this was hailed as a miracle drug. In the early '80s a second generation of antihistamines appeared that didn't cross the blood-brain barrier and thus didn't sedate the user. Loratadine (Claritin), the first, was generating over $2 billion in annual sales before it went generic.
Diabetes, resulting in high blood glucose levels (hyperglycemia), has been known for two millennia. It was a deadly disease, type 1 rapidly fatal, type 2, adult onset, debilitating and eventually lethal. By the end of the 19th century, the islets of Langerhans in the pancreas had been identified as the source of a substance that prevented it, insulin, but this turned out to be a fragile peptide hormone, broken down by an enzyme in the pancreas during attempts to extract it. In 1921, Canadian surgeon Frederick Banting and medical student Charles Best determined a way to disable the production of the enzyme, trypsin. Injected into a teenager with type 1 diabetes, insulin was immediately effective. There is still no cure for diabetes, but today the 380 million sufferers globally can live normal lives thanks to Banting and Best.
Finally, millions of men and their wives and girlfriends owe a big debt to British chemists Peter Dunn and Albert Wood, and Americans Andrew Bell, David Brown, and Nicholas Terrett. They developed sildenafil, intended to treat angina. It works by suppressing an enzyme that degrades a molecule that relaxes smooth muscle tissue, increasing blood flow. Ian Osterloh, running the clinical trials for Pfizer, observed that the drug induced erections, and it was marketed for ED. Viagra made the cover of Time Magazine after it was approved in March 1998. The blue pill still generates about $2 billion annually in sales, despite competition, and is prescribed for 11 million men.
***
Two incredible machines built in the mid-20th century revolutionized the practice of medicine. Both remove blood from the body.
During World War II, the Dutch physician Willem Kolff constructed a machine to cleanse the blood of patients suffering from renal failure by filtering out urea and creatinine. Over 400,000 Americans are on dialysis today.
In 1953, after 18 years of work, John Gibbon, a Philadelphia surgeon, produced a machine that oxygenated blood and pumped it around the body, permitting operations on the heart, like those performed a decade later by Michael DeBakey in Houston and René Favaloro in Cleveland. The two surgeons pioneered coronary bypass grafts, using a blood vessel in the leg or chest to re-route blood around a blocked artery. About 200,000 operations are performed each year, down from about 400,000 at the turn of the century, thanks to stents. Gibbon's machine enabled the most widely covered operation in history, the heart transplant, first performed by South African surgeon Christiaan Barnard in 1967, based on research by Norman Shumway and others. Over 2,000 Americans receive heart transplants each year.
The cardiac device Americans are most likely to encounter is the defibrillator, now in airports, stadiums, supermarkets, and other public places. Thank two Swiss professors, Louis Prévost and Frédéric Batelli, who, in 1899, induced ventricular fibrillation, an abnormal heart rhythm, in dogs with a small electrical shock, and restored normal rhythm with a larger one. It was not until the 1940s that a defibrillator was used in heart surgery, by Claude Beck in Cleveland. A Russian researcher during World War II, Naum Gurvich, discovered that biphasic waves, a large positive jolt followed by a small negative pulse, were more effective, and a machine was constructed on this basis by an American cardiologist, Bernard Lown. Improvements by electrical engineers William Kouwenhoven and Guy Knickerbocker, and cardiologist James Jude at Hopkins in 1957, and subsequently by Karl and Mark Kroll, and Byron Gilman in the '90s made the device much smaller and portable.
Over three million people worldwide don't have to worry about defibrillators or face open-heart surgery. These are the recipients of pacemakers, and they can thank a Canadian electrical engineer, John Hopps. His predecessors were deterred by negative publicity about their experiments, which were believed to be machines to revive the dead. Gurvich had faced this as well. Hopps' 1950 device used a vacuum tube. With the invention of the transistor, a wearable pacemaker became possible, and Earl Bakken designed one in 1958. Not long afterward, the Swedish engineer Rune Elmqvist and surgeon Åke Senning created an implantable pacemaker. The first recipient eventually received 26 and lived to age 86. Lithium batteries, introduced in 1976, enabled the creation of devices with a much longer life.
Cardiac Drugs
Cardiac stimulants have been around since the late 18th century. Thank William Withering, who published his experiments with the folk-remedy digitalis (from foxglove) in 1785.
Anti-anginal drugs were introduced a century later, also in Britain: amyl nitrite in the mid-1860s and nitroglycerin a decade later. Both compounds had been synthesized by French chemists. Thank Thomas Lauder Brunton and William Murrell.
The first diuretics, to reduce edema (swelling) and lower blood pressure, were alkaloids derived from coffee and tea. These were not very effective, but better than leeches. Mercury compounds were pioneered by the Viennese physician Arthur Vogel in 1919. These worked, but were tough on the kidneys and liver. The first modern diuretics, carbonic anhydrase inhibitors, were developed in the 1940s, with the American Karl Beyer playing a leading role.
The first anti-coagulants date from the '20s. A Johns Hopkins physiologist, William Howell, extracted a phospholipid from dog livers that he called heparin and that appeared to prevent blood clots. The first modern anti-coagulant, and still the most widely prescribed, was warfarin (Coumadin), developed as a rat poison by Karl Link in Wisconsin in 1948. Its effectiveness and lack of toxicity were revealed when an army recruit took it in a suicide attempt.
Anti-arrhythmic drugs, to stabilize the heartbeat, were introduced in the opening decade of the 20th century. The first was derived from quinine. The big breakthrough occurred in 1962. Thank, once again, the Scotsman James Black, who synthesized propranolol in that year, the first beta-blocker. What they block are the receptors of epinephrine and norepinephrine. These two chemicals (catecholamines) increase the heart rate, blood pressure, and blood glucose levels, useful for many purposes, but not a good thing in patients with cardiac arrhythmia, irregular heartbeats. Beta-blockers are also prescribed to lower blood pressure.
ACE inhibitors block an enzyme, produced by the kidneys and lungs, that generates a hormone constricting blood vessels. The unpromising source for the first inhibitor was the venom of the Brazilian pit viper. It was extracted, purified, and tested by three Squibb scientists in 1975, David Cushman, Miguel Ondetti, and Bernard Rubin. It's still widely prescribed, though many other ACE inhibitors have since been designed. They are used for patients with congestive heart failure or who have had a heart attack, as well as those with hypertension.
Finally, mention must be made of the statins, which, though over-hyped and over-prescribed, lower serum cholesterol and reduce the risk of a second heart attack. A Japanese microbiologist, Akira Endo, derived from a species of Penicillium a substance that inhibited the synthesis of cholesterol, but it was too toxic to use on humans. In 1978, a team at Merck under Alfred Alberts had better luck with another fungus, and called the compound lovastatin. Statins work by inhibiting the activity of an enzyme called HMG-CoA reductase (HMGR).
Cancer Drugs
In the forty-three years since Richard Nixon’s “war on cancer” was launched, the disease has received the lion’s share of government, foundation, and pharmaceutical industry funding, though heart disease kills more people — 596,577 Americans last year to 576,691 for cancer, according to the most recent data. This makes it particularly difficult, and invidious, to single out individual researchers.
There is still, of course, nothing close to a magic bullet, though cancer deaths have dropped about 20% since their peak in 1991. Around 27% of cancer deaths this year will be from lung cancer, so the rate will continue to fall as more people stop smoking.
The originators of a few therapies with good five-year survival rates ought to be singled out and thanked.
Seattle oncologist Donnall Thomas performed the first successful bone marrow transplant in 1956. The donor was an identical twin of the leukemia patient. With the development of drugs to suppress the immune system’s response to foreign marrow, Thomas was able to perform a successful transplant from a non-twin relative in 1969. About 18,000 are now performed each year.
One of the more notable successes of chemotherapy has been in the treatment of the childhood cancer acute lymphoblastic leukemia (ALL). Sidney Farber in the late ‘40s carried out clinical trials with the antifolate aminopterin, synthesized at Lederle by the Indian biochemist Yellapragada Subbarow. This proved the first effective compound in treating the disease. It was superseded by methotrexate, and now, as in all chemo treatments, a combination of agents is used. The five-year survival rate for ALL has jumped from near zero to 85%.
Early detection is the key to successful treatment in all cancers, and survivors of breast cancer can thank at least four men who pioneered and popularized mammography over a fifty-year period beginning in 1913: Albert Salomon, Stafford Warren, Raul Leborgne, and Jacob Gershon-Cohen.
A second key to the comparatively high survival rates for women with breast cancer is tamoxifen. First produced in the late '50s by British endocrinologist Arthur Walpole, it was intended as a "morning-after" birth control pill because it blocked the effects of estrogen. However, it failed to terminate pregnancy. Researchers had meanwhile discovered that some, though not all, women with breast cancer recovered when their ovaries were removed. Walpole thought tamoxifen might block the estrogen receptors on breast cancer cells, inhibiting their proliferation, and persuaded a professor of pharmacology, Craig Jordan, to conduct experiments. These demonstrated the drug's efficacy, and after clinical trials it was approved and marketed in 1973. Think of Arthur W. the next time you see one of those ubiquitous pink ribbons.
Most chemo agents are cytotoxic compounds that do not distinguish between abnormal cells and healthy cells that also divide rapidly. The nasty side effects range from hair loss and nausea to decreased production of red blood cells, nerve and organ damage, osteoporosis and bone fusion, and loss of memory and cognition. More selective drugs, monoclonal antibodies, have been used for some time. These were first produced by Georges Köhler and César Milstein in 1975 and "humanized" by Greg Winter in 1988, that is, re-engineered with recombinant DNA so that most of the antibody is human protein and less likely to provoke an immune reaction. Over 30 "mab" drugs have been approved, about half for cancer.
Research has also been underway for years into delivery systems using "nano-particles" that will target tumors exclusively. Another approach, pioneered by Judah Folkman, has been to find drugs that will attack the blood supply of tumors, angiogenesis inhibitors. This turned out not to be the magic bullet Folkman hoped for, but more than fifty of these drugs are in clinical trials, and a number are effective owing to other mechanisms, and are currently used.
Psychiatric medicine
Drugs have revolutionized the practice of psychiatry since the 1950s, and brought relief to millions suffering from depression, anxiety, and psychoses. For obvious reasons, these are some of the most highly addictive and widely abused drugs.
A few men to thank:
Adolf von Baeyer, Emil Fischer, Joseph von Mering: barbiturates, first synthesized in 1865 but not marketed until 1903. The most commonly prescribed today are phenobarbital (Luminal) and mephobarbital (Mebaral).
Bernard Ludwig and Frank Berger: meprobamate, the tranquilizer Miltown. By the end of the '50s, a third of all prescriptions in America were for this drug.
Leo Sternbach: the anxiolytic (anti-anxiety) benzodiazepines, first synthesized in 1955. The most successful initially was diazepam, Valium, marketed in 1963. The most widely prescribed benzodiazepine today is alprazolam, Xanax. It's also the most widely prescribed psychiatric drug, with nearly 50 million prescriptions. It enhances the effect of GABA, the brain's main inhibitory neurotransmitter, and suppresses stress-inducing activity of the hypothalamus.
Leandro Panizzon: methylphenidate (Ritalin). The Swiss chemist developed it in 1944 as a stimulant, and named it after his wife, whose tennis game it helped improve. Until the early ‘60s amphetamines were used, counter-intuitively, to treat hyperactive children. Thirty years after its patent expired, the controversial dopamine reuptake inhibitor is still the most widely prescribed medication for the 11% of children who’ve been diagnosed with ADHD.
Klaus Schmiegel and Bryan Molloy: the anti-depressant fluoxetine, the first SSRI, or selective serotonin reuptake inhibitor, which increases serotonin levels. Marketed as Prozac in 1988, it made the cover of Newsweek and is still prescribed for over 25 million patients.
Paul Janssen: risperidone (Risperdal), the most widely prescribed antipsychotic drug worldwide. The Belgian researcher developed many other drugs as well, including loperamide HCL (Imodium). When commenters on web articles advise trolls to take their meds, they might want to specify risperidone.
Seiji Sato, Yasuo Oshiro, and Nobuyuki Kurahashi: aripiprazole (Abilify), a partial agonist at dopamine receptors, and the top-selling drug at the end of 2013, grossing $1.6 billion in Q4.
***
A few observations.
Japanese and Indian researchers will make important contributions to future drugs, as the trio responsible for Abilify reminds us.
And, naturally, some women have played roles in the advances that have been summarized. Mary Gibbon, a technician, assisted her husband on the heart-lung machine. Lina Stern did important research on the blood-brain barrier, and it was in her lab that Gurvich improved the defibrillator. Jane Wright conducted early trials of methotrexate that helped demonstrate its efficacy. Lucy Wills did pioneering work on anemia in India. Rosalyn Yalow helped develop radioimmunoassay, which measures concentrations of antigens in the blood. Anne-Marie Staub did interesting work on antihistamines, though her compounds proved toxic.
They are exceptions. Our benefactors have not only been overwhelmingly European males, but are mostly from England and Scotland, Germany, France, Switzerland, and the Netherlands, as well as Americans and Canadians whose families emigrated from those countries. And, of course, Jews, who’ve won 28% of the Nobel Prizes in Medicine.
Some of the beneficiaries in particular might want to think about this.
Muslims boast that their faith has over 2 billion followers throughout the world. If this number is accurate it has far less to do with the appeal of Islam or with Arab or Turkish conquests, and everything to do with the work of some Northern Europeans and Jews, along with the “imperialists” who built roads, canals, and ports and the vehicles that use them, as well as schools and hospitals — like the traveling eye clinics in Egypt funded by the Jewish banker Ernest Cassel, which nearly eliminated blinding trachoma, then endemic.
The fact that we in the U.S. idolize our entertainers as no society has before is not going to cut off the supply of outstanding medical researchers. Very bright and inquisitive people usually don’t pay much attention to popular culture. But it diminishes us.
It’s the ingratitude, though, not the indifference, that’s more troubling.
Biting the hand that feeds is a core principle of Leftists. For 150 years, they’ve sought to concentrate power in their own hands by exploiting the resentment of ignorant people against a system that has finally enabled mankind to spring the Malthusian trap.
Multiculturalism, with its simple-minded relativism, has broadened the scope of the party line. Not only shadowy “capitalists” are vilified, but whites and males. Ignorant people can now think well of themselves by opposing “racism” and “the patriarchy” — and by voting for an unqualified and deceptive poseur because, though a male, he is not white.
The first step parents can take to help spare America from being “fundamentally transformed” is to insist that history be properly taught. This means, among other things, recognizing the accomplishments of a few men who’ve found cures for or relieved the symptoms of diseases that have killed and tortured humans for millennia.