The Endless Frontier (Progress in Biology and Medicine since 1944)

Biology in Our Times 

50th Reunion of the Yale Class of 1966 

Richard H. Kessin 

Let’s start at the beginning—our beginning. In 1944, Oswald Avery, Maclyn McCarty, and Colin MacLeod showed that a long viscous molecule called DNA could penetrate Pneumococcus bacteria and change them in a heritable way. This result was a crucial proof that DNA carried the code for all life, and it was essential to our current revolution in molecular biology and medicine.

Professor Avery and his colleagues worked in a private institute supported by the Rockefellers. Most government research, like the enormous Manhattan, radar, and penicillin projects, had been directed to practical ends, as World War II demanded. In his book The Great American University, sociologist Jonathan Cole recounted a conversation, which also occurred in 1944, between President Roosevelt and Vannevar Bush, then head of the U.S. Office of Scientific Research and Development: “Roosevelt called me into his office and said, ‘What’s going to happen to science after the war?’ ‘It’s going to fall flat on its face,’ I said. He said, ‘What are we going to do about it?’ And I told him, ‘We’d better do something about it damn quick.’”

Roosevelt died in April of 1945, but Vannevar Bush persisted in defending the idea of basic research driven by the curiosity of scientists themselves. He had the precedents of physics, electronics, and vaccines, all of which grew out of fundamental observations, to strengthen his case. Bush eventually convinced President Truman and Congress. By 1950, the National Science Foundation had been organized to fund basic research in many fields, while most biological and medical research had been assigned to a reorganized National Institutes of Health. This placement of basic research combined government funding with university capabilities. It has become the basic framework—later joined by research in the private sector—in which biological science operates today in most countries.

Gradually American scientists gained access to resources that would drive the revolutions in molecular biology, medicine, chemistry, and physics. Research universities like Yale began to expand science teaching and research with grants to individual scientists, grants for program expansion, and student fellowships from NSF, NIH, and private foundations.  

This funding is not small scale. In 2015, Yale received $263,582,736 from NIH alone to support 657 separate projects, plus additional money to cover the overhead of physical and administrative infrastructure. I was recently seated next to a fellow at a Yale alumni event who lamented any dependence on government. Yet these hundreds of millions of dollars each year unleash the imaginations of thousands of Yale faculty and students, as far smaller sums did for me 50 years ago.

Largesse does come with costs, of course. We have to account for this money and conform to many regulations. We maintain expensive facilities, including heavily controlled ones for animals—Yale keeps tens of thousands of mice to model disease and important biological processes. Strict oversight is in place to protect human subjects, and clinical trials face severe ethical and statistical scrutiny. For 15 years I taught courses at Columbia’s College of Physicians and Surgeons on The Responsible Conduct of Research, which is now a requirement everywhere. The NIH also requires an effort to recruit diverse graduate and medical students. The result is that students in science and medicine now include many more women and growing ethnic and economic diversity—which I see as a huge expansion of talent.

*** 

Let’s follow the progress of biology and medicine since we got our own unique sets of DNA more than 70 years ago. By the time we were in middle school, the institutional structures of modern biology were in place in the US and other advanced countries. What happened? 

By 1953 the structure of DNA had been discovered by Crick, Watson, Wilkins, and Franklin—a structure that immediately suggested how DNA copies itself and how it can mutate. A code was obviously embedded in the sequence of its four famous subunit nucleotides: A, C, G, and T. The best minds worked on the code until it was solved in the early 1960s.

The most notable biological event of our 1950s childhood was polio. In 1953 its summer outbreaks were as terrifying to parents as diphtheria had been until the 1930s. A half dozen years later they were gone, thanks to the vaccines of Salk and Sabin, both built on the poliovirus cell-culture methods pioneered by John Enders (Yale ’19). Many of us were, in effect, guinea pigs for the vaccine, getting either the active form or a placebo. Importantly, though, polio vaccine development did not depend on understanding poliovirus genetics; the scientists managed to make a vaccine without knowing what genes the virus contained. For newer vaccines against hepatitis, Ebola, or SARS-CoV-2, by contrast, genetic knowledge is critical, and it makes the vaccines much safer: we can leave most of the pathogen’s genes out of them.

*** 

While we were Yale undergraduates, biologists were learning how bacteria turn genes on to take advantage of a new sugar source, or how a virus inserts its genome into the DNA of its bacterial host and lies low until a challenge to the host unleashes it. This phase of bacterial genetics was intellectually elegant, but genetic methods applicable to bacteria and viruses did not translate immediately to higher organisms. Biologists were temporarily a bit stuck.

Higher organisms began to yield their genetic secrets in the 1970s. By cutting DNA at specific places, researchers could make large amounts of a particular sequence—several thousand nucleotides to start. In turn we discovered ways to determine the sequence of DNA nucleotides. At first, a few hundred nucleotides a day was a good clip, but still dauntingly far from 3.2 billion, the size of the human genome. Remarkably, over the next 35 years the pace kept increasing as the technology became more sophisticated, drawing on special chemistry, optics, and informatics, and everything was automated. Finally, in 2001 the entire human genome was sequenced. The map of this genetic microworld was made globally available for research online, along with the bioinformatics tools needed to analyze and compare genomes. Today it does not require a wizard to sequence a human genome in days for a few thousand dollars.

A critical accelerating factor during those years was the entry of many start-up companies and well-established industries into the research environment. They provided reagents, DNA sequencing, and other services, freeing research laboratories for further experimentation. The project of modern biology would not have worked without nimble entrepreneurs to commercialize essential techniques and take on translational work; in the process they became major contributors to science and the economy. Since the Bayh-Dole Act of 1980, which allows universities to license their own discoveries, university-industry collaborations have flourished.

*** 

The years since we graduated have seen remarkable twists and unanticipated findings in biology—not just uninterrupted progress. The successes of vaccines and antibiotics bolstered the credibility of biological science, but this was no triumphal march. For example, the biomedical community assumed too soon that infectious disease was under control. Bacteria and viruses evolve. Drug resistance soon became a serious problem. The War on Cancer, loudly proclaimed in the 1970s, fell short because basic understanding of cells, cancerous or otherwise, was not yet sufficient to attack the problem successfully. More recently, after human genome sequencing became possible, the lack of quick progress on treating solid tumors spawned newspaper articles asking, “Where are the cures?” Still today we lack good vaccines for tuberculosis, malaria, and HIV, while the treatment of various neurodegenerative diseases is not moving as rapidly as we had hoped. 

Meanwhile basic research produced its share of surprises. In the late 1970s, for instance, we learned that human and animal genes have interruptions in their sequences. The coded genetic information therefore has to be spliced into a continuous message before it can be translated into a protein. More complex than bacterial genes, animal genes can produce proteins whose segments vary depending on the tissue. With these intervening sequences our genes are huge, yet 97% of the 3.2 billion nucleotides in human DNA do not code for proteins. Because splicing out the intervening sequences requires a complex molecular nanomachine made of many proteins, it sometimes fails unpredictably. Spinal muscular atrophy, the most common genetic disease of children after cystic fibrosis, is such a case. A clinical trial using creative molecular methods is underway to restore correct splicing. [The trial was successful, and the therapy became one of the first treatments of an inherited disease at its molecular root, followed by others.]

After decades of effort, we have a reasonable though incomplete understanding of how genes in animals turn on and off to regulate the development of an organism from a fertilized egg into hundreds of cell types. We understand more about cell division and about the proteins involved in forming synapses between neurons. And we have a number of new molecular technologies to deploy. Are we justified in thinking that new approaches to genetic diseases and cancer will succeed?

The first encouraging factor is rapid, much cheaper DNA sequencing, which also enriches fields beyond biology and medicine—from agriculture to archeology, paleontology, and forensics. Sequencing is helping us grow blight-resistant chestnut trees by identifying genes that confer resistance to the blight. In anthropology, new methods have permitted the sequencing of the Neanderthal genome by Svante Pääbo and his colleagues in Leipzig. Who knew that DNA is stable when encased in bone or teeth? A jawbone sitting in a dusty museum cabinet is now a potential trove of genetic information. We now understand that Neanderthals interbred with modern humans some 40,000 years ago, and that we still carry their contributions, sequences of DNA recognizable in our own genomes.

A second cause for optimism is the role of collaborative research at scale. My Columbia colleague Riccardo Dalla-Favera realized a few years ago that half of lymphoma cases have a good prognosis, while for the other half the prospects are dismal. Sequencing the DNA of tumors recovered from patients whose therapy had failed, his team found mutations affecting processes that do respond to known drugs—providing therapeutic options for such patients. Searching tumor genomes this way is now common in cancer biology, but it is a huge task. A large collaborative project, The Cancer Genome Atlas, is cataloguing mutated genes in many kinds of cancer from thousands of patients. This broader knowledge will allow cancers to be treated according to the mutations that drive rapid cell division or evasion of normal immune-system surveillance—not just by their tissue of origin.

Induced pluripotent stem cells, discovered by Shinya Yamanaka in Kyoto in 2006, are yet a third discovery with extraordinary potential. Introducing four genes (carried on a harmless virus) into a skin cell causes it to revert to a primitive state, no longer resembling a skin cell at all, but capable of dividing exponentially in a culture dish. With the right chemical coaxing, these induced pluripotent stem cells can be made to differentiate into any type of cell—from red blood cell precursors to precursors of the different neurons in the brain.

A good example of the benefits of such globally leveraged research is a new study at Yale that uses these induced stem cells to address autism. A child with autism does not have a defect in a particular, predictable gene, the way a child with cystic fibrosis would. Instead, hundreds of gene variants have loose connections to autism. Prof. Flora Vaccarino and her colleagues take skin cells from autistic and non-autistic members of the same families and convert the resulting stem cells into specific brain neurons. Brain tissue derived from autistic patients turns out to have more inhibitory neurons than tissue from unaffected relatives, and Dr. Vaccarino’s team was able to reverse this effect in a culture dish. This may well become a tool for studying autism and for screening therapeutic drugs without risk to patients. The same technique is being applied to other previously intractable problems.

*** 

Among the amazing discoveries of the last few years is a gene-editing technique called CRISPR, found in bacteria and adapted for use in the cells of higher organisms. CRISPR allows accurate correction of gene defects—like the mutation that gives rise to sickle cell anemia or the one that causes cystic fibrosis. Since its development in 2012 by Jennifer Doudna and her colleagues at Berkeley and by Feng Zhang and others at MIT, it has become evident that CRISPR could repair a defective gene in an induced pluripotent stem cell, which could then be reintroduced into a patient—a method already used to cure muscular dystrophy and liver diseases in mice. CRISPR has many other uses: it could spread destructive genetic elements within an insect population, for example, and be designed to eliminate malaria-carrying mosquitoes, a technique called a gene drive. But it has dangers. It could be used to correct mutations in unfertilized human eggs, or in an attempt to alter other characteristics of a future generation, like height or intelligence. Serious ethical concerns abound; a consensus is emerging that the germ line should not be touched, but ethicists do not hold sway in all parts of the world.

Another example is epigenetics, a new field spanning biology and medicine. It turns out that DNA can be modified by the environment through the addition of a methyl group (a single carbon with its hydrogens) to one of its nucleotides; this modifies gene expression by causing that stretch of DNA to fold up, sequestering its genes. Some argue that such modification occurs after child abuse or other trauma. Genes induced [or repressed] by stress may not switch back when the trauma is removed, potentially explaining the persistence of post-traumatic stress; the biochemical basis of this persistence is beginning to be understood. In animals and humans epigenetic changes are not, in theory, passed to the next generation, while in plants they can be, according to plant experts I have asked. Epigenetics is thus a potential case of inheritance of acquired characteristics, an idea that was drummed out of biology as heresy more than a hundred years ago.

*** 

Science will always generate ethical, philosophical, legal, and religious conflicts, as well as differences of scientific opinion. Disagreements concerning vaccines, evolution, genetic testing, genetically modified crops, and global warming have been with us for a long time and will not go away. Given the inevitability of controversy arising from biology, it is all the more critical that basic science continue to be done in universities—where a wide, responsible range of perspectives can be brought to bear on these issues. Scientists need to take a role in these debates, but so do constitutional experts, philosophers, and scholars from other fields. The inherent connections among disciplines should be reinforced from the moment students arrive. We need technicians, but we also need scientists and humanists who can talk to each other and to the public. We need people who can stand up to Congressional oversight committees, whose members’ ignorance of science at the height of the Ebola epidemic was appalling. [That has not changed since the Ebola epidemic; the SARS-CoV-2 pandemic is an example.]

*** 

Public debate, by contrast, seems difficult in the face of what the science writer George Johnson called “the gradual extinction of accepted truths.” Explaining even the best available science to parents who show up dead set against the measles, mumps, and rubella vaccine will not change many minds. The same has often been true for GMO crops or, famously, evolution. From time immemorial people have rejected ideas they consider harmful. An unnerving belief appears to be growing among many people, even highly educated ones, that all opinions are equivalent, that there are no generally accepted truths, and that individual conviction somehow trumps experiment and data.

Science, medicine, and public health have a tough time countering such attitudes. In northwest Connecticut I write a newspaper column on scientific questions in which I explain controversial issues carefully and calmly. I find it helps—sometimes—to explain how the immune system works to parents who reject vaccines. I call such tenacious refusal to accept science the Galahad Syndrome: like Tennyson’s Sir Galahad, these parents believe that “their strength is as the strength of ten, because their hearts are pure.” If they take good care of their children, the children will be good kids, but they will not be immune to measles and other devastating infections.

So, despite all obstacles, will we benefit from continued advances in biology, in basic science and in applied areas? Will our own maladies and those of the wider world, such as malaria, TB, Ebola, and HIV, yield to continued effort? With the powerful tools that have appeared recently and our improved understanding of how cells function, I believe there will be a lot of progress—but, as in recent decades, progress will be uneven and unpredictable.

I take courage from a friend, a retired pediatric oncologist, who started treating children with leukemia in 1960, an era when all his patients died. “How could you bear it?” I asked. “Well,” he said, “I knew that if enough dedicated and intelligent people worked on the disease and tried to help these families, something good would happen.” He was right. I’m with him.