Holocene

To follow up on my post about the Anthropocene, check out this excellent song by Bon Iver named after our current geologic epoch.


It’s Our Pluto

Pluto reminds me a lot of that kid in high school who never even tried to fit in. They were always on the outside of the social circles, and I would wonder whether they were cool and mysterious or just an awkward loner. Similarly, Pluto has carried an enigmatic reputation since its discovery 85 years ago: It swings around the outside of the Solar System at an angle – tiny and distant, accompanied most closely by its large moon Charon. We’ve never even gotten a good look at Pluto – the most detailed images we have, such as those above, are impressionistic blurs – and scientists are still debating its actual size.

This lack of knowledge didn’t stop Pluto from becoming a symbol in the public rancor that followed its exclusion from the group of planets by the International Astronomical Union (Fun fact: Neil deGrasse Tyson, this blog’s hero, helped ignite this debate when he removed Pluto from the planet displays at the American Museum of Natural History in New York City in 2000). And though I agree with Pluto’s classification as a dwarf planet (an unfortunate name), I think Pluto has been unfairly loved, hated or ignored for reasons that have little to do with its own merits.

This is why I have been so excited about the New Horizons spacecraft since I learned about it 4 years ago. When it passes by Pluto and its moon Charon in July after 9 years of travel, Earth will finally get a clear look at this solar system outsider. We will see the surfaces of Pluto and Charon and be able to identify specific features. And in traditional human style, we’ll be able to name everything.

The team behind New Horizons has asked the world for help naming Pluto and Charon’s peaks, valleys and everything in between. They’ve launched a campaign called “Our Pluto” to solicit public suggestions based on themes like travelers, underworlds and historic explorers. People can nominate and vote on names, and the New Horizons team will submit the winning names to the IAU so that naming can begin after the July flyby.

“Pluto belongs to everyone,” said New Horizons science team member Mark Showalter in a press release from the SETI Institute. “So we want everyone to be involved in making the map of this distant world.”

Help end Pluto’s mystery. Voting ends April 7.

A Time of Our Own: Defining the Anthropocene

In one of the most memorable sequences of his TV show Cosmos, Carl Sagan mapped the entire history of the universe onto a “cosmic calendar.” On this 12-month scale, it took until September for the Earth to form, and humanity did not enter the picture until 10:30 p.m. on December 31 – a sobering reminder of how small a slice of time we occupy.
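The arithmetic behind that calendar is easy to check for yourself. Here is a short Python sketch that assumes a 13.8-billion-year-old universe and rounded ages for two events, so the dates it prints are approximate:

```python
# Rough sketch of Sagan's "cosmic calendar": compress the ~13.8-billion-year
# history of the universe into a single 365-day year and see where events land.
UNIVERSE_AGE_YEARS = 13.8e9
MINUTES_PER_YEAR = 365 * 24 * 60

def minutes_into_year(years_ago: float) -> float:
    """Calendar minutes elapsed (since Jan 1, 00:00) by the time of an event."""
    return (1 - years_ago / UNIVERSE_AGE_YEARS) * MINUTES_PER_YEAR

events = [
    ("Earth forms", 4.54e9),              # ~4.54 billion years ago
    ("First humans (genus Homo)", 2.5e6), # ~2.5 million years ago
]

for name, years_ago in events:
    total_minutes = minutes_into_year(years_ago)
    day, minute_of_day = divmod(total_minutes, 24 * 60)
    hour, minute = divmod(int(minute_of_day), 60)
    print(f"{name}: day {int(day) + 1} of 365, around {hour:02d}:{minute:02d}")
```

Run it and Earth forms on roughly day 245 (early September), while the first humans arrive in the last hour and a half of December 31 – right in line with Sagan’s 10:30 p.m.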

Despite our relatively brief residence on this planet, humans have severely altered Earth in ways no other species has come close to – we cultivate vast amounts of land for our food, emit billions of tons of chemicals into the atmosphere annually and move more sediment with our mining activities than all the world’s rivers combined. A paper in PLOS ONE found discarded stone tools from early hominids littering an area of the Sahara Desert – a sign that we’ve been reshaping the planet’s landscapes for millions of years.

Even more worryingly, our activities are killing off many of Earth’s species – a paper in Science last year estimated that between 11,000 and 58,000 of Earth’s approximately 5-9 million animal species are going extinct every year, an extinction rate on par with those observed in past mass extinctions.

In the face of these massive effects, Nobel Prize-winning atmospheric chemist Paul Crutzen and biologist Eugene Stoermer argued in 2000 for emphasizing “the central role of mankind in geology and ecology by proposing to use the term ‘anthropocene’ for the current geological epoch.” Since then, the proposal has gained support from a large segment of the scientific community (there are now three journals devoted to the concept), and in 2009 the International Commission on Stratigraphy, the scientific body in charge of defining the geologic time scale (Earth’s “calendar”), formed the Anthropocene Working Group to work out the details of adding a new epoch to the geologic time scale.

The March of Time

Because of the overwhelming span of Earth’s history (4.54 billion years), scientists use the geologic time scale to discuss its various ages. Earth’s history is divided into five kinds of units in descending size: eons, eras, periods, epochs and ages. As you can see in the diagram below, these divisions are not uniform in duration because they are based on the Earth’s strata, or layers of rock and fossils (hence the “geologic” part). This chart shows the current standard for the GTS, with the Earth’s formation at the bottom and the present day at the top (Ma = millions of years ago).

Geologists mark new divisions based on where significant changes in the strata occur, usually because of some Earth-altering event. In the picture below, the black band running through this rock formation marks the boundary between the Cretaceous and Paleogene periods (the larger “Tertiary” period represented in the chart above is increasingly being discarded by scientists in favor of the two smaller Paleogene and Neogene periods). This band was caused by the Cretaceous-Paleogene extinction event, more commonly known as the asteroid impact that killed off the dinosaurs. In the Paleogene period that followed, increasing numbers of mammalian fossils show how mammals went from being Earth’s losers to one of its dominant forms of life.

Scientists have traditionally recognized five major extinction events in Earth’s history. Each one caused massive shifts in the planet’s number and diversity of species and marked the start of a new division of geologic time. The current geologic epoch, the Holocene, began about 11,700 years ago at the end of the last glacial period (or “ice age”).

Anthropo-why?

A feature article by science journalist Richard Monastersky in last week’s issue of Nature examines how scientists accustomed to studying the distant past are split over how to define the present day.

An ally of the push is botanist John Kress, interim undersecretary for science at the Smithsonian, which has hosted two symposiums on the Anthropocene. The Smithsonian has so fully embraced the epoch that it plans to include a section on it in its renovated fossil hall at the National Museum of Natural History. “Never in its 4.6 billion-year-old history has the Earth been so affected by one species as it is being affected now by humans,” Kress told CBS News at the time of last fall’s symposium.

Another advocate is Australian climate scientist Will Steffen, who even gave a TEDx talk on the Anthropocene in 2010. Referring to graphs of metrics including global biodiversity, greenhouse gas levels and hemispheric temperatures, Steffen said, “In each case […] we have left that envelope of environmental stability which typifies the Holocene.”

On the other side of the debate are scientists who question whether it’s appropriate to create a new division of geologic time that has little or no basis in the rock and fossil record. A 2012 review paper by geologists Whitney Autin and John Holbrook made its intentions clear in its title: “Is the Anthropocene an issue of stratigraphy or pop culture?”

“If there is an underlying desire to make social comment about the implications of human-induced environmental change,” the authors wrote in GSA Today, “Anthropocene clearly is effective. However, being provocative may have greater importance in pop culture than to serious scientific research.”

Autin and Holbrook’s assertion strikes at the increasingly contentious nature of this debate, with some scientists feeling that the push is being driven more by the media and climate scientists like Steffen than by geological facts. “What you see here is, it’s become a political statement,” ICS chairman Stan Finney told Nature. “That’s what so many people want.”

Other scientists suggest that it’s simply too early in the proposed Anthropocene to define it. In the words of paleoclimatologist Eric Wolff, “it might be wise to let future generations decide, with hindsight, when the Anthropocene started, acknowledging only that we are in the transition towards it.”

In the middle of the debate is Jan Zalasiewicz, the chairman of the Anthropocene Working Group. In a 2008 review paper advocating the new epoch, Zalasiewicz wrote that the changes humans have made to the Earth since the Industrial Revolution are plausibly large enough to leave a unique impression in the rock record. “These changes,” he and his colleagues wrote cautiously, “although likely only in their initial phases, are sufficiently distinct and robustly established for suggestions of a Holocene–Anthropocene boundary in the recent historical past to be geologically reasonable.” That paper got him the job of leading the working group, according to Nature, along with the unenviable task of impartially sorting through many opinions.

Anthropo-when?

In a review article, British scientists Simon Lewis and Mark Maslin discuss one of the most pressing questions about the Anthropocene besides whether it should exist: when does it start? Several dates have been put forward over the past decade and a half: Paul Crutzen initially suggested the beginning of the Industrial Revolution in the late 18th century, while others have suggested the Agricultural Revolution about 11,000 years ago.

In January, the Anthropocene Working Group announced tentative support for dating the epoch’s start to the beginning of the Nuclear Age – specifically the July 16, 1945, Trinity nuclear test in New Mexico.

Lewis and Maslin, however, argued that two different dates make the most sense for the start of this new epoch: 1610 or 1964. While these years may seem a bit arbitrary compared to the other events put forth, the authors sought to find a time when definite changes occurred on the Earth that would be preserved in the strata – a reference point called a Global Boundary Stratotype Section and Point.

Following Christopher Columbus’ initial voyage to the Americas in 1492, a wave of European settlement changed the continents drastically. In particular, the deaths of more than 50 million Native Americans from disease and warfare led to the regrowth of vegetation on over 50 million hectares of formerly cultivated land. This drew a large amount of carbon dioxide out of the atmosphere in a very short span – a change that shows up in ice cores and rock samples. The Columbian Exchange that occurred during this time also caused many plant and animal species to appear suddenly outside their native ranges, which will stand out in the fossil record.

The other candidate date is 1964, when radioactive carbon levels peaked in the atmosphere. This year is close to the beginning of the “Great Acceleration” – the rapid growth of human population and consumption during the second half of the 20th century – and comes a year after the Partial Test Ban Treaty ended above-ground nuclear testing.

Epoch TBD

The Anthropocene Working Group will present its initial recommendations to the ICS next year, but the voting and revision process could take years. Of course, that’s just the blink of an eye in the geologic time scale.

Drugs from DNA: 23andMe’s Next Move

23andMe, the California company that sells direct-to-consumer DNA tests, announced yesterday that it will be entering the drug development business. Normally, this wouldn’t turn heads – there is no shortage of companies doing treatment research in America and worldwide – but none of those companies is sitting on a database of genetic data from nearly 700,000 customers.

“I believe that human genetics has a very important role to play in finding new treatments for disease,” said Richard Scheller, 23andMe’s newly hired chief science officer and head of therapeutics, in a statement Thursday. “I am excited about the potential for what may be possible through 23andMe’s database. It is unlike any other.”

Scheller previously led research and early development at Genentech, the company widely considered to have founded the field of biotechnology (Fun fact: Genentech developed the first synthetic human insulin, which reached the market in 1982).

Founded in 2006, 23andMe became famous for its easy-to-use tests that could tell customers about their ancestry, whether they were likely to dislike cilantro and which diseases they were at risk of developing. More than 850,000 people purchased the service, which debuted at $1,000 per kit before eventually dropping to $99. In 2013, however, the Food and Drug Administration sternly rebuked 23andMe for marketing a test with medical implications without seeking federal approval. 23andMe removed all health information from its tests, and only late last month was it allowed to begin reintroducing tests for whether someone is a carrier of a genetic disorder.

While 23andMe had to stop giving medical information to consumers, it never stopped asking them whether it could store their genetic data for future use. About 680,000 people agreed, creating an easily accessible archive larger than any of its predecessors. With this data, Scheller and other scientists could conduct near-instantaneous genome-wide association studies, which essentially test whether traits are associated with specific genetic variants. These variants often take the form of single-nucleotide polymorphisms – single base pairs that differ from one person to another – which is what makes 23andMe’s large, consistently genotyped database so useful for study.
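To give a sense of what that kind of testing involves, here is a toy Python sketch of the statistic at the heart of most association studies – a chi-square test comparing allele counts between people who report a trait and people who don’t. The counts below are invented for illustration; a real GWAS repeats this across hundreds of thousands of variants and corrects for multiple testing and ancestry:

```python
# Toy example of the core test in a genome-wide association study (GWAS):
# for a single SNP, ask whether one allele is more common among people with
# a trait than among people without it. All counts below are made up.
from scipy.stats import chi2_contingency

#                variant allele, reference allele
allele_counts = [
    [420, 580],  # people reporting the trait
    [310, 690],  # people not reporting the trait
]

chi2, p_value, dof, expected = chi2_contingency(allele_counts)
print(f"chi-square = {chi2:.1f}, p-value = {p_value:.2e}")

# Only extremely small p-values (conventionally below 5e-8) are treated as
# genuine associations, because so many variants are tested at once.
```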

This won’t be the first use of 23andMe’s vast DNA database for drug research – in January the company received $10 million from Genentech in a deal to sequence the genomes of people in its Parkinson’s disease community. The company has also allowed dozens of university researchers access to its data for free.

Don’t expect significant results soon, though – genome-wide association studies are notoriously difficult because of the still largely unknown interactions between many genes that cause traits. At least 16 separate genes seem to factor into one’s eye color, and scientists still aren’t even sure what they all are. Meanwhile, more than 500 specific genetic mutations have been linked to cancer.

23andMe has experienced this difficulty first-hand – a Parkinson’s disease breakthrough that the company’s co-founder bragged about in 2011 didn’t pan out. But that shouldn’t dampen the excitement about this news, because genomics is truly the frontier of drug and therapy development, and 23andMe just built a really big fort there.

10,000

In its daily report on the Ebola virus, the World Health Organization today quietly published a disturbing statistic – more than 10,000 people are believed to have died from Ebola in West Africa in the outbreak that started last year. As the rate of new Ebola infections continues to decline (Liberia has had no new infections for several weeks), reaching this death toll punctuates an epidemic that took West Africa to the brink of collapse and had the entire world waiting to see if it was next.

Research has shown that humans have difficulty comprehending large numbers, and it’s even harder to remember that each unit in a total was a living, breathing human being with his or her own dreams, sadness and joy. According to the latest U.S. Census, more than 80% of American cities and towns have fewer than 10,000 residents. Think of your hometown, or a nearby suburb, or even your high school or university. That many people have died in just three countries from one epidemic.

On the scale of epidemics, this outbreak of Ebola is mercifully small. It never reached “pandemic” status, meaning it never became a global epidemic, and it seems far less deadly when compared to the world’s assorted outbreaks of flu, such as the 2009 “swine flu” outbreak that killed nearly 300,000 people, or the 1918 flu pandemic that killed more than 75,000,000 (nearly 5% of the entire human population). This disparity comes from Ebola’s low contagiousness – unlike flu, it cannot spread through the air. In fact, Ebola sits near the bottom of lists of diseases’ basic reproduction numbers – the number of people each infected person will go on to infect in a fully susceptible population. An Ebola victim will infect 2 people on average, while individuals with the once-common childhood diseases mumps or pertussis will infect 12-17 people on average.
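Those numbers compound quickly. Here is a naive back-of-the-envelope Python sketch – ignoring immunity, behavior change and every other real-world complication – of how a few generations of unchecked spread differ at those two values:

```python
# Naive illustration of why the basic reproduction number matters: with no
# immunity or intervention, each "generation" of infection multiplies the
# case count by roughly R0. This ignores real epidemic dynamics entirely.
def cases_per_generation(r0: float, generations: int, initial_cases: int = 1) -> list[int]:
    """New cases arising in each successive generation of unchecked spread."""
    return [round(initial_cases * r0 ** g) for g in range(1, generations + 1)]

print("Ebola-like (R0 ~ 2):           ", cases_per_generation(2, 5))
print("Mumps/pertussis-like (R0 ~ 14):", cases_per_generation(14, 5))
```

Five generations at an R0 of 2 produce 32 new cases; at an R0 of 14 they produce more than 500,000 – one reason airborne childhood diseases spread so explosively while Ebola, for all its lethality, did not.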

What Ebola lacked in numbers, it made up for in lethality. The 10,000 deaths came out of roughly 24,350 known infections, meaning Ebola killed about 40% of the people who caught it – and that is after the heroic interventions of local and international medical professionals brought this epidemic’s fatality rate down from the 80-90% observed in other outbreaks. Ebola has one of the highest fatality rates ever observed in a human disease; even the 1918 flu pandemic killed only about 2.5% of those it infected.

It’s still too early to tell what the world will learn from this outbreak. There was a lot of fear, a lot of incompetence and a lot of tragedy. But there was also a lot of bravery, survival and determination to keep going. Humanity has recovered and thrived after much worse, and hopefully the health knowledge and infrastructure put in place in West Africa will prevent another epidemic from spreading so violently. But as epidemiology professor Tara Smith so eloquently wrote in an article last October, “We will never be free of epidemics.”

Honoring 350 Years of Scientific Publishing

On March 6, 1665, England’s Royal Society published the first issue of Philosophical Transactions, an unassuming, printed pamphlet that became the world’s first scientific journal. This publication’s birthday actually predates the modern use of the word “science” – its title refers to what was then called “natural philosophy” (Philosophy, which means “love of wisdom” in Greek, was used in past centuries to refer to almost all academic inquiry. If you are studying to become a Doctor of Philosophy, or Ph.D., you’ve encountered this meaning.)

What is perhaps most amazing about this anniversary is not that this journal is still around today, but how closely it resembled modern scientific journals at its outset. Submissions were carefully dated and registered to ensure fairness, and members of the Royal Society were expected to peer review every paper published.

That first issue contained, among other things, the first account of Jupiter’s Great Red Spot, an obituary for the influential mathematician Pierre de Fermat, and an introduction that neatly summarizes the importance of scientific publishing. Addressing the benefits of communicating discoveries to other scientists, founding editor Henry Oldenburg wrote:

To that end, that such Productions being clearly and truly communicated, desires after solid and usefull knowledge may be further entertained, ingenious Endeavours and Undertakings cherished, and those addicted to and conversant in such matters, may be invited and encouraged to search, try, and find out new things, impart their knowledge to one another, and contribute what they can to the Grand design of improving Natural knowledge, and perfecting all Philosophical Arts, and Sciences.

The journal survived financial hardships, fierce critics, wars and a split into two journals to make it to 2015, and it published some amazing works along the way. From Antonie van Leeuwenhoek’s first observations with a microscope (“In the year 1675, I discover’d living creatures in Rain water”), to the 1919 photograph of a solar eclipse that confirmed Albert Einstein’s theory of general relativity, the pages of Philosophical Transactions and its two successors (one for physical sciences, the other for biological sciences) are a window into the history of scientific progress.

Luckily for those of us “addicted to and conversant” in science, the entire archive of Philosophical Transactions and every other Royal Society journal is freely available for the month of March.

Suicidology, or the Science of Taking One’s Life

Last week, the American Journal of Preventive Medicine published a paper seeking to find out why the suicide rate for middle-aged American men and women has risen nearly 40 percent since 1999, with a particularly sharp increase since 2007.

The authors drew data from the Centers for Disease Control and Prevention’s National Violent Death Reporting System (probably the most morbidly fascinating database in existence) for the years 2005-2010, and used as a starting point the fact that Americans ages 40-64 were the hardest-hit group in the late-2000s recession.

The paper found that suicides in which “external circumstances” (defined as job, legal or financial problems) were a major factor rose from 32.9 percent to 37.5 percent of completed suicides in this age group between 2005 and 2010, with the chart below showing an increase corresponding with the start of the latest recession. In particular, the use of suffocation, a method commonly used in suicides driven by external circumstances, rose by 27.7 percent among middle-aged people during this 5-year period.

A Fading Taboo

In many parts of ancient Europe, suicide seems to have been viewed as routine business. According to the historian Livy, Romans could apply to the Senate for permission to commit suicide, and if it was granted, they would be given a free bottle of the poison hemlock. Stoicism, a school of philosophy popular in ancient Rome, viewed suicide as a dignified act so long as it was not done out of fear or weakness. Emperor Nero, Caesar’s assassins Brutus and Cassius, and Mark Antony and Cleopatra are just a few of the famous figures from this era who took their own lives.

While the medieval Catholic Church viewed suicide as a sin, it remained a topic discussed and debated by theologians, scholars and rulers well into the Enlightenment. The 19th century saw suicide become romanticized and even trendy, particularly among artists and young men inspired by the protagonist of Johann Wolfgang von Goethe’s blockbuster novel The Sorrows of Young Werther.

Indeed, it seems that only in the 20th century did suicide become a topic discussed in hushed tones and rarely mentioned in print. A 1968 article in Science could only guess at the annual number of American suicides because many ended up being “disguised by the listing of another cause of death on the death certificate.”

In the 21st century, the taboo about suicide seems to be receding in favor of earnest discussions to fix the problem. The first World Suicide Prevention Day was celebrated in 2003, and issues such as the disturbing rate of suicide among American veterans are now widely covered and discussed.

Suicide Science

With advances in neuroscience and psychology, research into the causes of suicide has shifted from abstract discussions of mental disorder to identifying specific proteins and compounds in the brain associated with depression and suicide.

A 2007 article in the journal Progress in Neuro-Psychopharmacology and Biological Psychiatry found that low levels of the protein brain-derived neurotrophic factor (BDNF) were associated with major depression and suicidal behavior. A 2005 article in Molecular Brain Research reported abnormally low levels of BDNF in the brains of suicide victims, particularly those who had not been taking psychiatric medication. Further research has shown that people with suicidal tendencies tend to have excess receptors for serotonin, a neurotransmitter that helps regulate mood.

Beyond the neurobiology of depression, science has also begun to tackle what in particular causes some people with depression to take suicidal action while others don’t. Some underlying factors are well established – men are more likely than women to commit suicide, people with a history of impulsiveness are more likely to at least attempt suicide, and, as alluded to above, life circumstances and mental illness are the clearest predictors of suicidal action. But a 2005 article in the Journal of Consulting and Clinical Psychology suggested that the “capability” to go through with suicide is something that must be acquired through life experience, particularly exposure to “painful and provocative events” such as fights and self-harm. A 2007 study by the same team found that physical and sexual abuse during childhood was a significant predictor of attempting suicide in adulthood, and research has even shown that childhood abuse can cause lasting epigenetic changes in the human brain.

“People who have been abused or have abused themselves would habituate to the experiences of pain and acquire the ability to act on suicidal thoughts,” Harvard psychologist Matthew Nock said in a 2010 article in Scientific American.

Preventing the Problem

Research into actual suicide prevention has taken several paths. Some studies have looked at the effectiveness of catching depression before it becomes a problem through screening at regular doctor’s appointments, but limited data has prevented firm conclusions.

Other researchers have looked into various methods for treating depression, from the effectiveness of different antidepressant drugs to whether making patients sign a suicide prevention contract actually works. Psychotherapy has been shown to work for many, but not all, people with depression.

To read more about suicide, check out these links from the American Foundation for Suicide Prevention.

Addressing “The Dress”

I know you’ve seen it, whether from TV, social media or a friend pushing their phone into your face and asking “What color is this dress?”

(Disclosure: I only see it as blue and black.)

Don’t worry; there’s nothing wrong with your loved ones who see the dress differently than you do. The biological explanation lies in the way human vision has evolved to judge colors under varying light conditions, and in how those judgments differ from person to person.

Human beings evolved to see in daylight, but daylight changes color. That chromatic axis varies from the pinkish red of dawn, up through the blue-white of noontime, and then back down to reddish twilight. “What’s happening here is your visual system is looking at this thing, and you’re trying to discount the chromatic bias of the daylight axis,” says Bevil Conway, a neuroscientist who studies color and vision at Wellesley College. “So people either discount the blue side, in which case they end up seeing white and gold, or discount the gold side, in which case they end up with blue and black.” (Conway sees blue and orange, somehow.)
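If you want to see roughly what that “discounting” does to the numbers, here is a crude Python sketch of a von Kries-style white-balance correction. The pixel value and the two assumed illuminants are invented for illustration – they are not measured from the actual photo:

```python
# Crude sketch of "discounting the illuminant": divide an ambiguous RGB pixel
# by the color of the light the brain assumes, then rescale the brightness.
# The pixel and illuminant values are invented for illustration only.
import numpy as np

ambiguous_pixel = np.array([110.0, 100.0, 140.0])  # a bluish-grey fabric pixel

def discount_illuminant(pixel: np.ndarray, illuminant: np.ndarray) -> np.ndarray:
    """Von Kries-style correction: divide out the assumed light, renormalize."""
    corrected = pixel / illuminant
    return np.round(corrected / corrected.max() * 255).astype(int)

assume_bluish_daylight = np.array([0.80, 0.90, 1.35])  # cool, blue-tinted light
assume_warm_twilight = np.array([1.20, 1.00, 0.80])    # warm, gold-tinted light

print("Discounting bluish light: ", discount_illuminant(ambiguous_pixel, assume_bluish_daylight))
print("Discounting goldish light:", discount_illuminant(ambiguous_pixel, assume_warm_twilight))
```

Divide out an assumed blue cast and the same pixel comes out warm and whitish (the white-and-gold camp); divide out an assumed gold cast and it comes out strongly blue (the blue-and-black camp).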

Check out this excellent article from Wired analyzing the science behind why people can’t agree on the color of a dress in a poor-quality photo.

23andMe and the Genomics Frontier

Last Thursday, the US Food and Drug Administration authorized the biotechnology company 23andMe to sell a consumer carrier test for Bloom syndrome, a recessive inherited disorder that causes short stature, heightened skin sensitivity to sunlight and an elevated risk of developing cancer. The importance of this news lies not in the test itself, but in the fact that it marks a stark reversal from the FDA’s previous dealings with the California-based start-up.

A New Age

Founded in 2006, 23andMe set out to sell humans a peek at the code that defines them. Simply spit into a vial, mail it away, and you’d soon be able to learn details gleaned from your DNA, ranging from your ancestry to whether you’re destined to hate cilantro. But the test’s most attention-grabbing feature was its analysis of one’s genetic predisposition to health problems like Parkinson’s disease or heart disease.

“The advent of retail genomics will make a once-rare experience commonplace,” wrote Thomas Goetz in a November 2007 article in Wired that declared 23andMe’s launch the start of the “Age of Genomics.”

“Simply by spitting into a vial, customers of these companies will become early adopters of personalized medicine. We will not live according to what has happened to us (that knee injury from high school or that 20 pounds we’ve gained since college) nor according to what happens to most Americans (the one-in-three chance men have of getting cancer, or women have of dying from heart disease, or anyone has for obesity). We will live according to what our own specific genetic risks predispose us toward.”

Though the test originally cost nearly $1,000, 23andMe soon cut the price to $399 in order to build up a database of genetic data for use in pharmaceutical and biotechnology research. Time named the test its Invention of the Year in 2008, and continued buzz allowed the company to raise more than $50 million from investors and eventually drop its selling price to $99.

“You’re donating your genetic information,” 23andMe co-founder and CEO Anne Wojcicki told Time in 2008. “We could make great discoveries if we just had more information. We all carry this information, and if we bring it together and democratize it, we could really change health care.”

The Dark Years

2010 turned out to be a bad year for 23andMe. The FDA has long allowed certain testing kits, such as pregnancy tests, to be sold directly to consumers, and 23andMe sold its “personal genome service” (PGS) under the assumption that it qualified as such a home testing kit. The FDA had reservations, however, and sent the company a letter that year saying it considered the kits “medical devices,” subject to much heavier regulation. The company was also dragged before a congressional committee to respond to damaging allegations from a sting operation conducted by the US Government Accountability Office. For a more thorough summary of this affair, check out this post from the excellent (and sadly inactive) science blog Genomes Unzipped.

After years of debate, a stern warning letter in November 2013 seemed to seal 23andMe’s fate – stop selling health tests until you get approval.

“The Food and Drug Administration (FDA) is sending you this letter because you are marketing the 23andMe Saliva Collection Kit and Personal Genome Service (PGS) without marketing clearance or approval in violation of the Federal Food, Drug and Cosmetic Act (the FD&C Act),” wrote Alberto Gutierrez of the FDA’s Office of In Vitro Diagnostics and Radiological Health. “[…] Most of the intended uses for PGS listed on your website, a list that has grown over time, are medical device uses under section 201(h) of the FD&C Act. Most of these uses have not been classified and thus require premarket approval or de novo classification, as FDA has explained to you on numerous occasions.”

23andMe was chastened, and quickly announced that it would discontinue all of its health-related testing, limiting its products to ancestry information and raw genetic data. Though the FDA intended to prevent inaccurate diagnostic use of a person’s genome, its rebuke of 23andMe was widely viewed as overbearing and pointless. Science writer Razib Khan argued in Slate that the decision “highlights the tension between the paternalistic medical establishment that arose to deal with the dangers of 19th-century quack medicine, and a ‘techno-populist’ element of American society pioneering personal health assessment and decision-making by leveraging new information technologies.”

“The glaring weakness in an aggressive strategy against interpretative services is that there will always be firms such as 23andMe, and there’s no reason that they need to be based out of the United States. Not only that, but there are open-source desktop applications, such as Promethease, that provide many of the same results by combining individual raw data with public peer-reviewed literature, if less slickly than 23andMe. To truly eliminate the public health threat that the FDA is concerned about, the U.S. government would have to constrict and regulate the whole information ecology, not just a strategic portion of it, from scientists distributing research about genetic variants, to international genome sequencing firms returning raw results on the cheap.”

On the other side, some argued that 23andMe had been brazen in its defiance of the FDA and should have seen this coming. In a column for Forbes, health writer Matthew Herper called out the company for having gone more than six months at a time without communicating with the FDA.

“Either 23andMe is deliberately trying to force a battle with the FDA, which I think would potentially win points for the movement the company represents but kill the company itself, or it is simply guilty of the single dumbest regulatory strategy I have seen in 13 years of covering the Food and Drug Administration.”

Today

In its announcement last week, the FDA not only authorized the test for Bloom syndrome but also said it intends to exempt future carrier tests from “premarket review” by the agency. The language of the announcement seems to express a newfound willingness to accept direct-to-consumer genetic testing as part of 21st-century healthcare.

“The FDA believes that in many circumstances it is not necessary for consumers to go through a licensed practitioner to have direct access to their personal genetic information,” wrote Gutierrez. “Today’s authorization and accompanying classification, along with FDA’s intent to exempt these devices from FDA premarket review, supports innovation and will ultimately benefit consumers.”

“This is a major milestone for our company and for consumers who want direct access to genetic testing,” Wojcicki said in a press release. “We have more work to do, but we remain committed to pursuing a regulatory path for additional tests and bringing the health reports back to the US market.”

Now, it seems only time will tell when the Personal Genome Service will return to the United States. I know I’m very eager to try it when it does.