Mid-1800s–1945 The modern era of the pharmaceutical industry began with local
apothecaries that expanded their traditional role of distributing
botanical drugs such as
morphine and
quinine to wholesale manufacture in the mid-1800s. Intentional
drug discovery from plants began with the extraction of
morphine – an
analgesic and sleep-inducing agent – from
opium by the German apothecary assistant
Friedrich Sertürner somewhere between 1803 and 1805. Sertürner later named this compound after the Greek god of dreams,
Morpheus. Multinational corporations including
Merck,
Hoffmann-La Roche, Burroughs-Wellcome (now part of
GSK),
Abbott Laboratories,
Eli Lilly, and
Upjohn (now part of
Pfizer) began as local apothecary shops in the mid-1800s. By the late 1880s, German dye manufacturers had perfected the purification of individual
organic compounds from
tar and other mineral sources and had also established rudimentary methods in
organic chemical synthesis. The development of synthetic chemical methods allowed scientists to systematically vary the structure of chemical substances, and growth in the emerging science of
pharmacology expanded their ability to evaluate the biological effects of these structural changes.
Epinephrine, norepinephrine, and amphetamine By the 1890s, the profound effect of
adrenal extracts on many different tissue types had been discovered, setting off a search both for the mechanism of chemical signaling and efforts to exploit these observations for the development of new drugs. The blood pressure raising and vasoconstrictive effects of adrenal extracts were of particular interest to surgeons as
hemostatic agents and as a treatment for shock, and several companies developed products based on adrenal extracts containing varying purities of the active substance. In 1897,
John Abel at the
Johns Hopkins University identified the active substance as
epinephrine, which he isolated in an impure state as the sulfate salt. Industrial chemist
Jōkichi Takamine later developed a method for obtaining epinephrine in a pure state and licensed the technology to
Parke-Davis. Parke-Davis marketed epinephrine under the trade name
Adrenalin. Injected epinephrine proved to be especially efficacious for the acute treatment of
asthma attacks, and an inhaled version is sold over the counter in the United States (Primatene Mist). By 1929 epinephrine had been formulated into an inhaler for use in the treatment of nasal congestion. While highly effective, the requirement for injection limited the use of epinephrine, and orally active derivatives were sought. A structurally similar compound,
ephedrine, was identified by Japanese chemists in the
Ma Huang plant and marketed by
Eli Lilly as an oral treatment for asthma. Following the work of
Henry Dale and
George Barger at
Burroughs-Wellcome, academic chemist
Gordon Alles synthesized
amphetamine and tested it in asthma patients in 1929. The drug proved to have only modest anti-asthma effects but produced sensations of exhilaration and palpitations. Amphetamine was developed by
Smith, Kline and French as a nasal decongestant under the trade name
Benzedrine Inhaler. Amphetamine was eventually developed for the treatment of
narcolepsy,
post-encephalitic parkinsonism, and mood elevation in depression and other psychiatric indications. It received approval as a New and Nonofficial Remedy from the
American Medical Association for these uses in 1937, and remained in common use for depression until the development of
tricyclic antidepressants in the 1960s.
Restrictions in use of amphetamines and barbiturates The 1950s and 1960s saw increased awareness of the addictive properties and abuse potential of barbiturates and amphetamines, which led to increasing restrictions on their use and growing government oversight of prescribers. Today, amphetamine is largely restricted to use in the treatment of
attention deficit disorder and phenobarbital in the treatment of
epilepsy.
Benzodiazepines In 1958,
Leo Sternbach discovered the first
benzodiazepine,
chlordiazepoxide (Librium). Dozens of other benzodiazepines have been developed and are in use, some of the more popular drugs being
diazepam (Valium),
alprazolam (Xanax),
clonazepam (Klonopin), and
lorazepam (Ativan). Due to their far superior safety and therapeutic properties, benzodiazepines have largely replaced the use of barbiturates in medicine, except in certain special cases. When it was later discovered that benzodiazepines, like barbiturates, significantly lose their effectiveness and can have serious side effects when taken long-term,
Heather Ashton researched benzodiazepine dependence and developed a protocol to discontinue their use.
Insulin A series of experiments performed from the late 1800s to the early 1900s revealed that
diabetes is caused by the absence of a substance normally produced by the pancreas. In 1889,
Oskar Minkowski and
Joseph von Mering found that diabetes could be induced in dogs by surgical removal of the pancreas. In 1921, Canadian professor
Frederick Banting and his student Charles Best repeated this study and found that injections of pancreatic extract reversed the symptoms produced by pancreas removal. Soon, the extract was demonstrated to work in humans, but the development of insulin therapy as a routine medical procedure was delayed by difficulties in producing the material in sufficient quantity and with reproducible purity. The researchers sought assistance from industrial collaborators at
Eli Lilly and Co. based on the company's experience with large-scale purification of biological materials. Chemist
George B. Walden of Eli Lilly and Company found that careful adjustment of the
pH of the extract allowed a relatively pure grade of insulin to be produced. Under pressure from the University of Toronto and a potential patent challenge by academic scientists who had independently developed a similar purification method, an agreement was reached for the non-exclusive production of insulin by multiple companies. Before the discovery and widespread availability of insulin therapy, the life expectancy of diabetics was only a few months.
Early anti-infective research: salvarsan, prontosil, penicillin and vaccines The development of drugs for the treatment of infectious diseases was a major focus of early research and development efforts; in 1900, pneumonia, tuberculosis, and diarrhea were the three leading causes of death in the United States and mortality in the first year of life exceeded 10%. In 1911
arsphenamine, the first synthetic anti-infective drug, was developed by
Paul Ehrlich and chemist
Alfred Bertheim of the Institute of Experimental Therapy in Berlin. The drug was given the commercial name Salvarsan. Ehrlich, noting both the general toxicity of
arsenic and the selective absorption of certain dyes by bacteria, hypothesized that an arsenic-containing dye with similar selective absorption properties could be used to treat bacterial infections. Arsphenamine was prepared as part of a campaign to synthesize a series of such compounds and exhibited partially selective toxicity. Arsphenamine proved to be the first effective treatment for
syphilis, a disease that until then had been incurable and led inexorably to severe skin ulceration, neurological damage, and death. Ehrlich's approach of systematically varying the chemical structure of synthetic compounds and measuring the effects of these changes on biological activity was pursued broadly by industrial scientists, including
Bayer scientists Josef Klarer, Fritz Mietzsch, and
Gerhard Domagk. This work, also based on the testing of compounds available from the German dye industry, led to the development of
Prontosil, the first representative of the
sulfonamide class of
antibiotics. Compared to arsphenamine, the sulfonamides had a broader spectrum of activity and were far less toxic, rendering them useful for infections caused by pathogens such as
streptococci. In 1939, Domagk received the
Nobel Prize in Medicine for this discovery. Nonetheless, the dramatic decrease in deaths from infectious diseases that occurred before
World War II was primarily the result of improved public health measures such as clean water and less crowded housing, and the impact of anti-infective drugs and vaccines was significant mainly after World War II. In 1928,
Alexander Fleming discovered the antibacterial effects of
penicillin, but its exploitation for the treatment of human disease awaited the development of methods for its large-scale production and purification. These were developed by a U.S. and British government-led consortium of pharmaceutical companies during World War II. There was early progress toward the development of vaccines throughout this period, primarily in the form of academic and government-funded basic research directed toward the identification of the pathogens responsible for common communicable diseases. In 1885,
Louis Pasteur and
Pierre Paul Émile Roux created the first
rabies vaccine. The first
diphtheria vaccines were produced in 1914 from a mixture of
diphtheria toxin and
antitoxin (produced from the serum of an inoculated animal), but the safety of the inoculation was marginal and it was not widely used. The United States recorded 206,000 cases of diphtheria in 1921, resulting in 15,520 deaths. In 1923, parallel efforts by
Gaston Ramon at the Pasteur Institute and
Alexander Glenny at the Wellcome Research Laboratories (later part of
GlaxoSmithKline) led to the discovery that a safer vaccine could be produced by treating diphtheria toxin with
formaldehyde. In 1944,
Maurice Hilleman of Squibb Pharmaceuticals developed the first
vaccine against Japanese encephalitis. Hilleman later moved to
Merck, where he played a key role in the development of vaccines against
measles,
mumps,
chickenpox,
rubella,
hepatitis A,
hepatitis B, and
meningitis.
Unsafe drugs and early industry regulation Prior to the 20th century, drugs were generally produced by small-scale manufacturers with little regulatory control over manufacturing or claims of safety and efficacy. To the extent that such laws did exist, enforcement was lax. In the United States, increased regulation of vaccines and other biological drugs was spurred by tetanus outbreaks and deaths caused by the distribution of contaminated smallpox vaccine and diphtheria antitoxin. The Biologics Control Act of 1902 required that the federal government grant premarket approval for every biological drug and for the process and facility producing such drugs. This Act was followed in 1906 by the
Pure Food and Drugs Act, which forbade the interstate distribution of adulterated or misbranded foods and drugs. A drug was considered misbranded if it contained alcohol, morphine, opium, cocaine, or any of several other potentially dangerous or addictive drugs and its label failed to indicate the quantity or proportion of such drugs. The government's attempts to use the law to prosecute manufacturers for making unsupported claims of efficacy were undercut by a Supreme Court ruling restricting the federal government's enforcement powers to cases of incorrect specification of the drug's ingredients. In 1937, over 100 people died after ingesting "
Elixir Sulfanilamide" manufactured by S.E. Massengill Company of Tennessee. The product was formulated in
diethylene glycol, a highly toxic solvent that is now widely used as antifreeze. Under the laws extant at that time, prosecution of the manufacturer was possible only under the technicality that the product had been called an "elixir", which implied a solution in ethanol. In response to this episode, the U.S. Congress passed
the Federal Food, Drug, and Cosmetic Act of 1938 (FD&C Act), which for the first time required pre-market demonstration of safety before a drug could be sold, and explicitly prohibited false therapeutic claims.
1945–1970 Further advances in anti-infective research The aftermath of
World War II saw an explosion in the discovery of new classes of antibacterial drugs including the
cephalosporins (developed by Eli Lilly based on the seminal work of
Giuseppe Brotzu and
Edward Abraham),
streptomycin, the
tetracyclines (discovered at Lederle Laboratories, now a part of
Pfizer),
erythromycin (discovered at Eli Lilly and Co.), and their extension to an increasingly wide range of bacterial pathogens. Streptomycin, discovered during a Merck-funded research program in Selman Waksman's laboratory at Rutgers in 1943, became the first effective treatment for tuberculosis. At the time of its discovery, sanatoriums for the isolation of tuberculosis-infected people were a ubiquitous feature of cities in developed countries, with 50% of patients dying within 5 years of admission. A
Federal Trade Commission report issued in 1958 attempted to quantify the effect of antibiotic development on American public health. The report found that over the period 1946–1955, there was a 42% drop in the incidence of diseases for which antibiotics were effective and only a 20% drop in those for which antibiotics were not effective. The report concluded that "it appears that the use of antibiotics, early diagnosis, and other factors have limited the epidemic spread and thus the number of these diseases which have occurred". The study further examined mortality rates for eight common diseases for which antibiotics offered effective therapy (syphilis, tuberculosis, dysentery, scarlet fever, whooping cough, meningococcal infections, and pneumonia), and found a 56% decline over the same period. Notable among these was a 75% decline in deaths due to tuberculosis. During the years 1940–1955, the rate of decline in the U.S.
death rate accelerated from 2% per year to 8% per year, then returned to the historical rate of 2% per year. The dramatic decline in the immediate post-war years has been attributed to the rapid development of new treatments and vaccines for infectious disease that occurred during these years. Early batches of polio vaccine were later found to have been contaminated with the simian virus SV40; the contamination appears to have originated both in the original cell stock and in monkey tissue used for production. In 2004 the
National Cancer Institute announced that it had concluded that SV40 is not associated with cancer in people. Other notable new vaccines of the period include those for
measles (1962,
John Franklin Enders of Children's Medical Center Boston, later refined by Maurice Hilleman at Merck),
rubella (1969, Hilleman, Merck), and
mumps (1967, Hilleman, Merck). The United States incidences of rubella, congenital rubella syndrome, measles, and mumps all fell by >95% in the immediate aftermath of widespread vaccination. The first 20 years of licensed
measles vaccination in the U.S. prevented an estimated 52 million cases of the disease, 17,400 cases of
mental retardation, and 5,200 deaths.
Development and marketing of antihypertensive drugs Hypertension is a risk factor for atherosclerosis,
heart failure,
coronary artery disease,
stroke,
renal disease, and
peripheral arterial disease, and is the most important
risk factor for
cardiovascular morbidity and
mortality in industrialized countries. Prior to 1940, approximately 23% of all deaths among persons over age 50 were attributed to hypertension, and severe cases were treated by surgery. Early developments in the treatment of hypertension included quaternary ammonium ion sympathetic nervous system blocking agents, but these compounds were never widely used: their side effects were severe, the long-term health consequences of high blood pressure had not yet been established, and they had to be administered by injection. In 1952 researchers at CIBA (Gesellschaft für Chemische Industrie in Basel, predecessor to
Novartis) discovered the first orally available vasodilator,
hydralazine. A major shortcoming of hydralazine monotherapy was that it lost its effectiveness over time (
tachyphylaxis). In the mid-1950s Karl H. Beyer, James M. Sprague, John E. Baer, and Frederick C. Novello of
Merck and Co. discovered and developed
chlorothiazide, which remains the most widely used antihypertensive drug today. This development was associated with a substantial decline in the mortality rate among people with hypertension. The inventors were recognized by a Public Health
Lasker Award in 1975 for "the saving of untold thousands of lives and the alleviation of the suffering of millions of victims of hypertension". A 2009
Cochrane review concluded that
thiazide antihypertensive drugs reduce the risk of death (
RR 0.89), stroke (RR 0.63), coronary heart disease (RR 0.84), and cardiovascular events (RR 0.70) in people with high blood pressure. In the ensuing years, other classes of antihypertensive drugs were developed and found wide acceptance in combination therapy, including
loop diuretics (Lasix/
furosemide,
Hoechst Pharmaceuticals, 1963),
beta blockers (
ICI Pharmaceuticals, 1964)
ACE inhibitors, and
angiotensin receptor blockers. ACE inhibitors reduce the risk of new-onset kidney disease (RR 0.71) and death (RR 0.84) in diabetic patients, irrespective of whether they have hypertension.
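The relative-risk (RR) figures quoted above are ratios of event rates between treated and untreated groups; an RR of 0.89 for death means the treated group died at 89% of the control group's rate, an 11% relative reduction. A minimal sketch of this arithmetic, using hypothetical event counts chosen only for illustration:

```python
def relative_risk(events_treated, n_treated, events_control, n_control):
    """RR = (event rate in treated group) / (event rate in control group)."""
    risk_treated = events_treated / n_treated
    risk_control = events_control / n_control
    return risk_treated / risk_control

# Hypothetical trial: 89 deaths per 1,000 treated vs. 100 per 1,000 controls.
rr = relative_risk(89, 1000, 100, 1000)
print(round(rr, 2))  # prints 0.89, i.e. an 11% relative risk reduction
```

Note that RR is a relative measure: the same RR of 0.89 corresponds to very different absolute benefits depending on how common the outcome is in the untreated population.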
Oral contraceptives Prior to World War II, birth control was prohibited in many countries, and in the United States even the discussion of contraceptive methods sometimes led to prosecution under
Comstock laws. The history of the development of
oral contraceptives is thus closely tied to the
birth control movement and the efforts of activists
Margaret Sanger,
Mary Dennett, and
Emma Goldman. Based on fundamental research performed by
Gregory Pincus and synthetic methods for
progesterone developed by
Carl Djerassi at
Syntex and by
Frank Colton at
G.D. Searle & Co., the first oral contraceptive,
Enovid, was developed by G.D. Searle & Co. and approved by the
FDA in 1960. The original formulation incorporated vastly excessive doses of hormones and caused severe side effects. Nonetheless, by 1962, 1.2 million American women were on the pill, and by 1965 the number had increased to 6.5 million. The availability of a convenient form of temporary contraceptive led to dramatic changes in social mores, including expanding the range of lifestyle options available to women, reducing the reliance of women on men for contraceptive practice, encouraging the delay of marriage, and increasing pre-marital co-habitation.
Thalidomide and the Kefauver-Harris amendments In the U.S., a push for revisions of the
FD&C Act emerged from Congressional hearings led by Senator
Estes Kefauver of Tennessee in 1959. The hearings covered a wide range of policy issues, including advertising abuses, questionable efficacy of drugs, and the need for greater regulation of the industry. While momentum for new legislation temporarily flagged under extended debate, a new tragedy emerged that underscored the need for more comprehensive regulation and provided the driving force for the passage of new laws. On 12 September 1960, an American licensee, the William S. Merrell Company of Cincinnati, submitted a new drug application for Kevadon (
thalidomide), a
sedative that had been marketed in Europe since 1956. The FDA medical officer in charge of reviewing the compound,
Frances Kelsey, believed that the data supporting the safety of thalidomide was incomplete. The firm continued to pressure Kelsey and the FDA to approve the application until November 1961, when the drug was pulled off the German market because of its association with grave congenital abnormalities. Several thousand newborns in Europe and elsewhere suffered the
teratogenic effects of thalidomide. Without approval from the FDA, the firm had distributed Kevadon to over 1,000 physicians in the United States under the guise of investigational use. Over 20,000 Americans received thalidomide in this "study," including 624 pregnant patients, and at least 17 newborns are known to have suffered the effects of the drug. The
thalidomide tragedy resurrected Kefauver's bill to enhance drug regulation that had stalled in Congress, and the
Kefauver-Harris Amendment became law on 10 October 1962. Manufacturers henceforth had to prove to the FDA that their drugs were effective as well as safe before they could go on the US market. The
FDA received authority to regulate the advertising of prescription drugs and to establish
good manufacturing practices. The law also required that all drugs introduced between 1938 and 1962 be shown to be effective. A collaborative study by the FDA and the
National Academy of Sciences showed that nearly 40 percent of these products were not effective. A similarly comprehensive study of over-the-counter products began ten years later.
1970–1990s Statins In 1971,
Akira Endo, a Japanese biochemist working for the pharmaceutical company
Sankyo, identified
mevastatin (ML-236B), a molecule produced by the fungus
Penicillium citrinum, as an inhibitor of
HMG-CoA reductase, a critical enzyme used by the body to produce
cholesterol.
Animal trials showed very good inhibitory effects, as did clinical trials; however, a long-term study in dogs found toxic effects at higher doses, and mevastatin was judged too toxic for human use. It was never marketed, owing to the tumors, muscle deterioration, and occasional deaths it produced in laboratory dogs.
P. Roy Vagelos, chief scientist and later CEO of
Merck & Co, was interested and made several trips to Japan starting in 1975. By 1978, Merck had isolated
lovastatin (mevinolin, MK803) from the fungus
Aspergillus terreus; it was first marketed in 1987 as Mevacor. In April 1994, the results of a Merck-sponsored study, the
Scandinavian Simvastatin Survival Study, were announced. Researchers tested
simvastatin, later sold by Merck as Zocor, on 4,444 patients with high cholesterol and heart disease. After five years, the study concluded that patients saw a 35% reduction in their cholesterol, and their chances of dying of a heart attack were reduced by 42%. In 1995, Zocor and Mevacor both made Merck over US$1 billion. Endo was awarded the 2006
Japan Prize, and the
Lasker-DeBakey Clinical Medical Research Award in 2008 for his "pioneering research into a new class of molecules" for "lowering cholesterol".
21st Century Over the past several decades, biologics have risen in importance relative to small-molecule treatments. The biotech subsector, animal health, and the Chinese pharmaceutical sector have also grown substantially. On the organisational side, large international pharmaceutical corporations have seen a substantial decline in their share of total value, and the core generic sector (substitutes for off-patent brands) has been devalued by competition. Torreya estimated the pharmaceutical industry to have a market valuation of US$7.03 trillion as of February 2021, of which US$6.1 trillion was the value of publicly traded companies. The small-molecule modality held 58.2% of the valuation share, down from 84.6% in 2003, while biologics rose to 30.5% from 14.5%. Between 2003 and 2021, the valuation share of Chinese pharma grew from 1% to 12%, overtaking Switzerland, which now ranks third with 7.7%. The United States still had by far the most highly valued pharmaceutical industry, with 40% of global valuation. 2023 was a year of layoffs for at least 10,000 people across 129 public biotech firms globally, albeit mostly at small firms; this was a significant increase in reductions versus 2022, due in part to worsening global financial conditions and a reduction in investment by "generalist investors". Private firms also saw a significant reduction in
venture capital investment in 2023, continuing a downward trend that began in 2021, which also led to a reduction in
initial public offerings being floated. Industry analyses have highlighted that some of the most impactful remedies of the early 21st century were made possible only through M&A activity, notably
Keytruda and
Humira. Research and development