Culture

DNA's 50th Anniversary

Celebrating the wonders of genetic technologies


"We have discovered the secret of life," declared Francis Crick as he walked into the Eagle Pub in Cambridge, England, on February 28, 1953.

Crick and his collaborator James Watson announced their discovery to a wider, less tipsy world in the April 25, 1953, issue of Nature, coyly beginning their article, "We wish to suggest a structure for the salt of deoxyribose nucleic acid (D.N.A.). This structure has novel features which are of considerable biological interest." This article revealed the now-famous double helix structure of DNA. The two scientists, drawing on X-ray diffraction images produced by British crystallographer Rosalind Franklin at King's College London, had succeeded in describing the structure of the DNA molecule, with its complementary base pairs of adenine and thymine and of guanine and cytosine.

Watson and Crick immediately realized that this complementary structure, in which adenine always paired with thymine and guanine always coupled with cytosine, meant that if one knew the bases on one side of the chain, one automatically knew the bases on the other side. "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material," they noted with characteristic understatement. The two shared the 1962 Nobel Prize in Physiology or Medicine with Maurice Wilkins, whose lab had produced the X-ray diffraction images of DNA.
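
The pairing rule is simple enough to sketch in a few lines of code. Here is a minimal, purely illustrative Python sketch (nothing Watson and Crick ever wrote) that reads one strand off the other, base by base:

    # Base-pairing rules: each base has exactly one partner, so the
    # sequence of one strand fixes the sequence of the other.
    PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def paired_strand(strand: str) -> str:
        """Return the base paired opposite each position of the given strand."""
        return "".join(PAIRS[base] for base in strand)

    print(paired_strand("ATGC"))  # prints "TACG"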

But it took another 13 years for researchers to begin to understand how the four bases in DNA could specify the construction of the proteins that do most of the work in the cells of living organisms. By 1966, researchers Marshall Nirenberg, Har Gobind Khorana, and Severo Ochoa had deciphered the "genetic code." They discovered that the bases are read in three-letter "codons," each of which specifies an amino acid. A three-letter word drawn from a four-letter alphabet can be spelled 64 different ways, more than enough to encode the 20 amino acids found in living beings. (Some amino acids are specified by more than one codon, as it turns out.)
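
The counting is easy to verify. A minimal, illustrative Python sketch that enumerates every possible three-letter codon from the four-letter DNA alphabet:

    from itertools import product

    BASES = "ACGT"

    # Every ordered triple of bases is a possible codon: 4 x 4 x 4 = 64.
    codons = ["".join(triple) for triple in product(BASES, repeat=3)]
    print(len(codons))  # prints 64, comfortably more than the 20 standard amino acids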

This technical understanding soon bore practical fruit. In 1972, Paul Berg created the first recombinant DNA molecules, and in 1973 Herbert Boyer and Stanley Cohen spliced DNA from one organism into another to produce the first genetically engineered bacteria. Boyer and venture capitalist Robert Swanson launched the first biotech company, Genentech, in 1976. Today there are more than 9,000 such companies. In 1980, the first gene patent was awarded and the first human gene was introduced into a bacterium. Ever since, genetic science and technological development have been accelerating.

Our understanding of DNA hasn't merely created companies. It has also opened new frontiers in understanding and predicting human diseases. The first screening test for a genetic disease, phenylketonuria, was devised more than four decades ago, in 1961. To the dismay of bioethical mandarins, in 1996 the Genetics and IVF Institute became the first medical facility to offer commercial testing for the BRCA1 breast cancer gene. Today genetic tests are proliferating, giving researchers, physicians, and patients ever-greater insight into human health problems. For example, recent research finds that people unlucky enough to carry risk variants of two genes, CYP46 and APOE4, are 10 times more likely to develop Alzheimer's disease than those who do not carry them.

The federal government unsurprisingly has been a major player in genetic research. However, on its most ambitious project private industry managed to steal the feds' thunder. The government's Human Genome Project was launched in 1990, promising to decipher all human genes by 2005. This stately pace quickened when government researchers were challenged by maverick research genius Craig Venter, who declared in 1998 that his private company, Celera, would decipher all human genes by 2000. Although the race was declared a gentlemanly tie, there is no doubt that Venter's group won. The working drafts of the complete human genome were published in Science and Nature in February 2001, 48 years after Watson and Crick revealed the double helix.

The Internet, newspapers, magazines, radio, and television are filled almost every day with headlines about genetic breakthroughs and the ethical and political conundrums they pose. If you doubt the salience of biotechnological issues, consider that President Bush's first prime-time speech to the nation, in August 2001, wasn't about the economy or national defense but about federal funding for embryonic stem cell research.

Scientists, industry leaders, and politicians are not the only people energized by the expanding genetic frontier that Watson and Crick first began to map 50 years ago. Artists are finding inspiration in our genes as well. New York City will host a DNAge Festival to mark the 50th anniversary. Works by artists who are fascinated or horrified by the aesthetic possibilities of biotechnology will be featured in the exhibition "Genomic Issues" at the Graduate Center of the City University of New York. Thirty emerging artists will get their say, too, with a downtown show entitled "Brave New World" at the Organization of Independent Artists on Hudson Street. There will even be a performance of a new musical suite, "GENOME: The Autobiography of a Species in Twenty-Three Movements," composed for the occasion. Dancer/choreographer John Pennington will perform a piece he created in collaboration with a molecular biologist.

What will the next 50 years of the Age of Genetics hold? I foresee a convergence between science and art as the Promethean possibilities of genetic research become more widely available. The future will see miracles, cures, ecological restoration, vivid new art forms, and a greater understanding of the wellsprings of human compassion. But what can liberate can also tyrannize if misused. Evil minds will undoubtedly try to pervert the gifts of bioscience by creating such monstrosities as designer plagues or a caste of genetic slaves. But history has shown that with vigilance—not with blanket prohibitions—humanity can secure the benefits of science for posterity while minimizing the tragic results of any possible abuse.