Antibiotic Resistance Trends

Hello loyal readers – I have a post in the works about everyone’s favorite controversial science topic that isn’t climate change – vaccines! But until then, here’s an interesting addendum to my last post on antibiotic resistance.

This new online tool from the Centers for Disease Control and Prevention lets you view how antibiotic resistance has increased over the past 20 years in four different kinds of foodborne bacteria. Check it out, because who doesn’t want to see something depressing on a nice, late-summer day?

‘The end of antibiotics’

In this example of the Kirby-Bauer Disk Diffusion Susceptibility Test, E. coli bacteria are streaked on a dish with white paper disks containing seven separate antibiotics. The clear areas around the disks show where bacteria have been killed by the antibiotics. The E. coli at right is resistant to most of the antibiotics tested.

Many of us are familiar with the legend – in 1928, Scottish scientist Alexander Fleming noticed that mold infiltrating his petri dishes killed the bacteria in them, thus discovering penicillin. While antimicrobial substances had been used with some success by the ancient Egyptians and Greeks, Fleming’s discovery revolutionized medical treatment by allowing doctors to kill the sickening microbes that Louis Pasteur had proven to exist some 60 years before.

Now, more than 100 antibiotics exist, and they are a vital and omnipresent part of modern medicine. By one estimate, antibiotics save the lives of roughly 200,000 Americans annually and add 5-10 years to life expectancy at birth.

The tragedy of the commons

The “tragedy of the commons” is one of the most famous and sobering concepts in analyzing human behavior – if enough humans behave only in their self-interest in using a resource, they could deplete or ruin that resource for their entire group. Ecologist Garrett Hardin’s famous 1968 exploration of this social dilemma focused on overpopulation and pollution, but it is a concept that applies all too well to the use of antibiotics.

“Antibiotics are a limited resource,” wrote the Centers for Disease Control and Prevention in a 2013 report on antibiotic resistance. “The more that antibiotics are used today, the less likely they will still be effective in the future.”

Antibiotics have often been deemed “magic bullets,” and their wonderfully curative properties make them seem almost unstoppable. But that unrelenting force behind all life on Earth also works against antibiotics – evolution. Just as humans have evolved to resist many diseases and toxins that once killed us, bacteria can and do evolve to resist the antibiotics that kill them. But while humanity’s relatively long gestation and maturation periods make our evolution proceed slowly, bacteria can reproduce roughly every 20 minutes, or 72 generations per day. With that kind of reproduction rate, the odds are that eventually a mutation will occur in a bacterium that allows it to survive an antibiotic treatment.
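To get a feel for that arithmetic, here’s a minimal back-of-the-envelope sketch in Python. The 20-minute doubling time comes from the paragraph above; the per-division mutation rate is a hypothetical round number chosen purely for illustration, not a measured value.

```python
# Rough sketch: how many chances per day does a bacterial population
# get to stumble onto a resistance mutation?

MINUTES_PER_DAY = 24 * 60
DOUBLING_MINUTES = 20  # figure cited in the post

generations_per_day = MINUTES_PER_DAY // DOUBLING_MINUTES  # 72

# Hypothetical per-division mutation rate, for illustration only.
mutation_rate = 1e-9
population = 1e9  # a billion bacteria, roughly one small infection

divisions_per_day = population * generations_per_day
expected_resistant_mutants = divisions_per_day * mutation_rate

print(f"Generations per day: {generations_per_day}")  # 72
print(f"Expected resistant mutants per day: {expected_resistant_mutants:.0f}")  # ~72
```

Even at a one-in-a-billion mutation rate, a modest population throws off dozens of candidate mutants every single day – which is why “eventually” tends to arrive quickly.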

The problem results from two phenomena – the overprescription and overuse of antibiotics in people, and the preventative use of antibiotics in livestock.

As much as 50 percent of the time, antibiotics are used unnecessarily in medical treatment, according to the CDC. Every exposure of bacteria to an antibiotic is another opportunity for a resistant mutant to emerge and survive – and another treatment that wipes out the helpful bacteria that keep humans’ microbial populations in check.

Not only can people spread these drug-resistant bacteria to others, as demonstrated in the chart above, but bacteria themselves can spread their drug resistance to other bacteria they encounter through a process called gene transfer.

The chart below from the National Institutes of Health depicts one method of gene transfer – bacterial conjugation, in which a gene can pass directly from one bacterium to another. Several other methods, including transformation and transduction, also exist.

Gene transfer is what makes antibiotic resistance such a persistent and fast-developing problem. It’s like someone figuring out the combination to a safe and giving it to everybody they meet, and all of those people passing it along to everybody they meet. Soon, the safe can stop nobody.
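The safe-combination analogy is easy to simulate. Here’s a toy sketch of that spread – the population size and contact rate are made-up numbers for illustration, not biological parameters:

```python
import random

random.seed(1)  # deterministic for the example

population = 10_000
carriers = {0}  # one bacterium starts out with the resistance gene
contacts_per_round = 3  # each carrier shares with 3 random neighbors per round

rounds = 0
while len(carriers) < population:
    new_carriers = set()
    for donor in carriers:
        new_carriers.update(
            random.randrange(population) for _ in range(contacts_per_round)
        )
    carriers |= new_carriers
    rounds += 1

print(f"All {population:,} carry the gene after {rounds} rounds")
```

With each carrier reaching just three others per round, the gene saturates a population of 10,000 in roughly a dozen rounds – the exponential spread that makes resistance so hard to contain.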

The other big issue is the use of antibiotics in livestock to help them grow more quickly and avoid potentially costly infections.

It’s estimated that 30 million pounds of antibiotics are given annually to livestock just in the U.S. That is several times the amount given to the country’s 318.9 million people, because antibiotics are routinely given to perfectly healthy animals. Bacteria in these animals are constantly exposed to antibiotics, which cull the weak bacteria while giving the strong ones more opportunities to develop mutations to resist them.

The U.S. Food and Drug Administration has tried to curtail this overuse of antibiotics by farmers since as far back as 1977, but it has been stymied by powerful agricultural interests in the business world and Congress.

The polar opposite of the U.S. is the Netherlands, where the government and agricultural businesses collaborated to end the use of preventative antibiotics in livestock more than a decade ago. Stringent and common-sense cleaning and veterinary practices have allowed the country’s livestock industry to continue to flourish.

The timeline from the CDC report shows how quickly bacteria can develop resistance to an antibiotic – oftentimes, it can happen in just a year or two.

Fitter bacteria = deadlier bacteria

One hope that some officials clung to in the fight against antibiotic-resistant bacteria was the hypothesis that developing resistance to antibiotics made bacteria less deadly and more vulnerable to other methods of killing them.

A paper published this week in Science Translational Medicine refutes this hunch, however, by finding that bacteria that have developed drug resistance actually proved to be more deadly and infectious to mice than their non-resistant brethren.

The conclusion “raises a serious concern that drug-resistant strains might be better fit to cause serious, more difficult to treat infections, beyond just the issues raised by the complexity of antibiotic treatment,” wrote the authors.

“We’re in the post-antibiotic era”

That was the grim pronouncement made by Dr. Arjun Srinivasan, an associate director of the CDC, in a 2013 PBS Frontline documentary.

“For a long time, there have been newspaper stories and covers of magazines that talked about ‘The end of antibiotics, question mark?'” Srinivasan said. “Well, now I would say you can change the title to ‘The end of antibiotics, period.'”

One of the most visible and deadly consequences of antibiotic resistance is the persistent spread of methicillin-resistant Staphylococcus aureus, or MRSA. First seen in the 1960s, it is a deadly bacterium that often strikes hospitals, infecting the weak and healthy alike. Though the rate of MRSA infections has declined in recent years, more than 11,000 people die from MRSA-related causes each year.

Things don’t look great for the future, but doctors, scientists and lawmakers are still working to stop antibiotic resistance from causing medicine – and ultimately humanity – to backslide.

“In a world with few effective antibiotics, modern medical advances such as surgery, transplants, and chemotherapy may no longer be viable due to the threat of infection,” warned the White House in a 2014 report.

I personally hope that world never comes.

Turf War

If you’ve been following the FIFA Women’s World Cup these past few weeks (go USA), you’ve probably heard talk about turf, or more specifically, artificial turf. This is the first World Cup to ever be played on artificial turf, and many players are not happy about it. Allegations of gender discrimination were made, and a lawsuit was filed and later dropped. What I found myself wondering, and you may be too, is why make a big deal over turf?

The grass is always greener when it’s fake

For thousands of years, humans played sports on regular old grass. But when roofed sports stadiums started being built in the 1960s, a fairly significant problem arose – grass needs sunlight to grow. After a hilarious experiment with painting dead grass green, the Houston Astrodome installed a brand-new synthetic turf called ChemGrass in 1966. The turf was soon renamed AstroTurf by its maker, the chemical company Monsanto, and was installed in many similar stadiums nationwide. Even outdoor fields that received plenty of sunlight were soon being replaced with AstroTurf as owners realized that this fake grass never needed to be cut, watered or replanted, and would stay usable and attractive in any kind of climate. In 1970, the commissioner of the National Football League predicted every team would soon be playing on “weather-defying phony grass,” as one newspaper article described it.

AstroTurf usage increased up through the ’80s, until nearly two-thirds of NFL teams, at least a dozen baseball teams and a handful of English soccer teams were playing on it. Eventually, players and fans began to react negatively to AstroTurf for its effects on the game and their health (discussed below), and many fields were transitioned back to natural grass or to a newer version of artificial turf called FieldTurf.

The Mechanics

There were several reasons that sports teams ditched AstroTurf, but they all boiled down to the fact that playing on it was almost nothing like playing on natural grass.

AstroTurf was essentially a nylon carpet laid over concrete. It got much hotter than grass – so much so that baseball players standing on it for long periods in hot weather had their cleats melt. It was hard and springy, meaning balls that struck it bounced much higher than on normal grass. The same hardness meant that players who hit it didn’t fare too well either. And the nylon surface was much easier to grip – a boon for some looking to run faster, but a major stress on the knees of many players, as a 1992 study in The American Journal of Sports Medicine showed.

Modern artificial turf (nowadays usually a version of FieldTurf) is more complex. Individual grass-like fibers are attached to a hard polymer base, with a layer of rubber granules simulating the cushioning dirt of a natural grass field.

The Controversy Continues

While modern-day turf is undoubtedly much better to play on, the jury is still out on how safe it really is. A 2013 study in Portugal found that amateur soccer players were injured at higher rates on artificial turf than on grass (though other studies have found no difference in injury rates).

Artificial turf was never popular in soccer, and FIFA, the sport’s global governing body, never allowed it to be used for World Cup matches before this year. But natural grass fields are difficult to grow in Canada, where the Women’s World Cup is being held this year, so FIFA bit the bullet and signed off on it.

The reaction was swift.

“There is no player in the world, male or female, who would prefer to play on artificial grass,” U.S. forward Abby Wambach told the Washington Post. “There’s soccer on grass, and then there is soccer on turf.”

Forward Sydney Leroux described playing on turf as “running on cement” to Vice Sports, and her celebrity friend Kobe Bryant even tweeted a picture of her bruised and bloodied legs after playing on turf to prove the abrasiveness of the surface.

FIFA was accused of sexism – not a hard allegation to make for an organization whose president once suggested women’s soccer would be more popular if the players wore “tighter shorts.” A lawsuit was filed by dozens of soccer players, though it was dropped in January.

FIFA hasn’t ruled out using turf again, though the issue is largely moot for the foreseeable future – all three of the coming World Cups (2018, 2019 and 2022) will use grass fields.

Drugs from DNA: 23andMe’s Next Move

23andMe, the California company that sells DNA sequencing tests, announced yesterday that it will be entering the drug development business. Normally, this wouldn’t turn heads – there is no shortage of companies doing treatment research in America and worldwide – but none of those companies is sitting on a database of genetic data from nearly 700,000 customers.

“I believe that human genetics has a very important role to play in finding new treatments for disease,” said Richard Scheller, 23andMe’s newly hired chief science officer and head of therapeutics in a statement Thursday. “I am excited about the potential for what may be possible through 23andMe’s database. It is unlike any other.”

Scheller previously led research and early development at Genentech, the company widely considered to have founded the field of biotechnology (fun fact: Genentech marketed the first synthetic insulin brand in 1982).

Founded in 2006, 23andMe became famous for its easy-to-use tests that could tell users their ancestry, whether or not they like cilantro and which diseases they are at risk of developing. More than 850,000 people purchased the service, which debuted at $1,000 per kit before dropping to $99. In 2013, however, the Food and Drug Administration sternly rebuked 23andMe for marketing a test with medical implications without seeking federal approval. 23andMe removed all health information from its tests, and only late last month was it allowed to begin reintroducing tests for whether someone is a carrier of a genetic disorder.

While 23andMe had to stop giving medical information to consumers, it never stopped asking them if it could store their genetic data for future use. About 680,000 people agreed, creating an easily accessible archive larger than any of its predecessors. By accessing this data, Scheller and other scientists could conduct near-instantaneous genome-wide association studies – essentially, testing whether traits are associated with specific genetic variations. These variations often take the form of single-nucleotide polymorphisms – single base pairs that vary from one person to another – which is what makes 23andMe’s genome-wide data so useful for study.
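To make that concrete, here is a minimal sketch of the statistical heart of a genome-wide association study: a contingency-table test of whether allele counts at a single SNP differ between people who have a trait and people who don’t. The counts are invented for illustration; a real study repeats a test like this across hundreds of thousands of SNPs, with corrections for multiple testing and population structure.

```python
from scipy.stats import chi2_contingency

# Allele counts at one SNP (invented numbers, purely illustrative).
# Rows: trait present / trait absent; columns: allele A / allele G.
table = [
    [420, 180],  # people with the trait
    [350, 250],  # people without it
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.2e}")
```

A tiny p-value here would flag the SNP as associated with the trait – though, as the eye-color example below shows, association is only the first step toward understanding causation.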

This won’t be the first use of 23andMe’s vast DNA database for drug research – in January, the company received $10 million from Genentech in a deal to sequence the genomes of people at risk of developing Parkinson’s disease. And the company has also allowed dozens of university researchers access to its data for free.

Don’t expect significant results soon, though – genome-wide association studies are notoriously difficult because of the still largely unknown interactions between many genes that cause traits. At least 16 separate genes seem to factor into one’s eye color, and scientists still aren’t even sure what they all are. Meanwhile, more than 500 specific genetic mutations have been linked to cancer.

23andMe has experienced this difficulty first-hand – a Parkinson’s disease breakthrough that the company’s co-founder bragged about in 2011 didn’t pan out. But that shouldn’t dampen the excitement about this news, because genomics is truly the frontier of drug and therapy development, and 23andMe just built a really big fort there.

10,000

In its daily report on cases of the Ebola virus, the World Health Organization quietly published a disturbing statistic today – more than 10,000 people are believed to have died from Ebola in West Africa in the outbreak that started last year. As the rate of new Ebola infections continues to decline (Liberia has had no new infections for several weeks), reaching this death toll punctuates an epidemic that took West Africa to the brink of collapse and had the entire world waiting to see if they were next.

Research has shown that humans have difficulty comprehending large numbers, and it’s even harder to think of each count in a total as a living, breathing human being who had his or her own dreams, sadness and joy. According to the latest U.S. Census, more than 80% of American cities and towns have fewer than 10,000 people living in them. Think of your hometown, or that suburban city, or even your high school or university. That many people have died in just three countries from one epidemic.

On the scale of epidemics, this outbreak of Ebola is mercifully small. It never reached “pandemic” status, meaning it never became a global epidemic, and it seems not so deadly when compared to the world’s assorted outbreaks of flu, such as the 2009 “swine flu” outbreak that killed nearly 300,000 people, or the 1918 flu pandemic that killed more than 75,000,000 (nearly 5% of the entire human population). This disparity comes from Ebola’s low contagiousness – unlike flu, it cannot spread through the air. In fact, Ebola comes in near the bottom of lists of diseases’ basic reproduction numbers – the number of people each infected person will go on to infect in an unvaccinated population. An Ebola victim will infect 2 people on average, while individuals with the once-common childhood diseases mumps or pertussis will infect 12-17 people on average.
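A few lines of arithmetic show why that gap in the basic reproduction number matters so much. This sketch ignores immunity, interventions and the depletion of susceptible people – a huge simplification – but it captures the multiplying effect:

```python
def cumulative_cases(r0: float, generations: int) -> int:
    """Total infections stemming from one index case after a given
    number of transmission generations, assuming everyone is susceptible."""
    return sum(int(r0 ** g) for g in range(generations + 1))

# R0 values taken from the figures cited above.
for disease, r0 in [("Ebola", 2), ("mumps", 12), ("pertussis", 17)]:
    print(f"{disease:9} (R0={r0:2}): ~{cumulative_cases(r0, 5):,} cases "
          f"after 5 generations")
```

Five generations of spread turn one Ebola case into a few dozen, but one pertussis case into well over a million – the difference airborne transmission makes.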

What Ebola lacked in numbers, though, it made up for in lethality. The 10,000 deaths came out of a total of about 24,350 known infections, meaning Ebola killed about 41% of the people who caught it – and that is after the heroic interventions of local and international medical professionals brought this epidemic’s fatality rate down from the 80-90% death rates observed in other outbreaks. Ebola has one of the highest fatality rates ever observed in a human disease; even the 1918 flu pandemic had a fatality rate of only about 2.5%.
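The fatality figure checks out from the post’s own numbers:

```python
deaths = 10_000
known_infections = 24_350

case_fatality_rate = deaths / known_infections
print(f"Case fatality rate: {case_fatality_rate:.1%}")  # ~41.1%
```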

It’s still too early to tell what the world will learn from this outbreak. There was a lot of fear, a lot of incompetence and a lot of tragedy. But there was also a lot of bravery, survival and determination to keep going. Humanity has recovered and thrived after much worse, and hopefully the health knowledge and infrastructure put in place in West Africa will prevent another epidemic from spreading so violently. But as epidemiology professor Tara Smith so eloquently wrote in an article last October, “We will never be free of epidemics.”

Suicidology, or the Science of Taking One’s Life

Last week, the American Journal of Preventive Medicine published a paper seeking to find out why the suicide rate for middle-aged American men and women has risen nearly 40 percent since 1999, with a particularly sharp increase since 2007.

The authors drew data from the Centers for Disease Control and Prevention’s National Violent Death Reporting System (probably the most morbidly fascinating database in existence) for the years 2005-2010, and used as a starting point the fact that Americans ages 40-64 were the hardest-hit group in the late-2000s recession.

The paper found that suicides with “external circumstances” (defined as job, legal or financial circumstances) as a major factor rose from 32.9 percent to 37.5 percent of completed suicides in this age group between 2005 and 2010, with the chart below showing an increase corresponding with the start of the latest recession. In particular, the use of suffocation, a method commonly used in suicides driven by external circumstances, rose by 27.7 percent among middle-aged people during this five-year period.

A Fading Taboo

In many parts of ancient Europe, suicide seems to have been viewed as routine business. According to the historian Livy, Romans could apply to the Senate for permission to commit suicide, and if it was granted, they would be given a free bottle of the poison hemlock. Stoicism, a school of philosophy popular in ancient Rome, viewed suicide as a dignified act if undertaken not out of fear or weakness. Emperor Nero, Caesar’s assassins Brutus and Cassius, and Mark Antony and Cleopatra are just a few of the famous people from this era to have committed suicide.

While the medieval Catholic Church viewed suicide as a sin, it was a topic oft discussed and debated by theologians, scholars and rulers continuing into the Enlightenment. The 19th century saw suicide become romanticized and trendy in many ways, particularly among artists and young men inspired by the protagonist of Johann Wolfgang von Goethe’s blockbuster novel The Sorrows of Young Werther.

Indeed, it seems only in the 20th century did suicide become a topic to be discussed only in hushed tones and rarely mentioned in print. A 1968 article in Science could only guess at the annual number of American suicides because many ended up being “disguised by the listing of another cause of death on the death certificate.”

In the 21st century, the taboo about suicide seems to be receding in favor of earnest discussions to fix the problem. The first World Suicide Prevention Day was celebrated in 2003, and issues such as the disturbing rate of suicide among American veterans are now widely covered and discussed.

Suicide Science

With advances in neuroscience and psychology, research into the causes of suicide has shifted from abstract discussions of mental disorder to recognizing specific neural proteins and compounds associated with depression and suicide.

A 2007 article in the journal Progress in Neuro-Psychopharmacology and Biological Psychiatry found that low levels of the protein brain-derived neurotrophic factor (BDNF) were associated with major depression and suicidal behavior. In a 2005 article published in Molecular Brain Research, abnormally low levels of BDNF were found in the brains of suicide victims, particularly those who had not taken any psychiatric drugs. Further research has shown that people with suicidal tendencies tend to have excess receptors for serotonin, a neurotransmitter molecule that helps regulate mood in humans.

Beyond the neurobiology of depression, science has also begun to tackle what in particular causes some people with depression to take suicidal action while others don’t. Some underlying factors are at work – men are more likely than women to commit suicide, people with a history of impulsiveness are more likely to at least attempt suicide, and, as alluded to above, life circumstances and mental illness are the clearest predictors of suicidal action. But a 2005 article in the Journal of Consulting and Clinical Psychology suggested that the “capability” to go through with suicide is something that must be acquired through life experience, particularly exposure to “painful and provocative events” such as fights and self-harm. A 2007 study by the same team found that physical and sexual abuse during childhood was a significant predictor of attempting suicide in adulthood, and research has even shown that childhood abuse can cause permanent epigenetic changes in the human brain.

“People who have been abused or have abused themselves would habituate to the experiences of pain and acquire the ability to act on suicidal thoughts,” Harvard psychologist Matthew Nock said in a 2010 article in Scientific American.

Preventing the Problem

Research into actual suicide prevention has taken several paths. Some studies have looked at the effectiveness of catching depression before it becomes a problem through screening at regular doctor’s appointments, but limited data have prevented firm conclusions.

Other researchers have looked into various methods for treating depression, from the effectiveness of various antidepressant drugs to whether making patients sign a suicide prevention contract actually works. Psychotherapy has been proven to work for many, but not all, people with depression.

To read more about suicide, check out these links from the American Foundation for Suicide Prevention.

23andMe and the Genomics Frontier

Last Thursday, the US Food and Drug Administration authorized the biotechnology company 23andMe to sell a consumer test for the gene mutations that cause Bloom syndrome, a recessive inherited disorder that causes short stature, heightened skin sensitivity to sunlight and an elevated risk of developing cancer. The importance of this news comes not from the test itself, but from the fact that it marks a stark reversal in the FDA’s previous dealings with this California-based start-up.

A New Age

Founded in 2006, 23andMe set out to sell humans a peek at the code that defines them. Simply spit into a vial, mail it away, and you’d soon be able to learn about details gleaned from your DNA, ranging from your ancestry to whether you’re destined to hate cilantro. But the tests’ most attention-grabbing feature was its analysis of one’s genetic predisposition to health problems like Parkinson’s or heart disease.

“The advent of retail genomics will make a once-rare experience commonplace,” wrote Thomas Goetz in a November 2007 article in Wired that declared 23andMe’s launch the start of the “Age of Genomics.”

“Simply by spitting into a vial, customers of these companies will become early adopters of personalized medicine. We will not live according to what has happened to us (that knee injury from high school or that 20 pounds we’ve gained since college) nor according to what happens to most Americans (the one-in-three chance men have of getting cancer, or women have of dying from heart disease, or anyone has for obesity). We will live according to what our own specific genetic risks predispose us toward.”

Though the test originally cost nearly $1,000, 23andMe soon cut the price to $399 in order to build up a database of genetic data for use in pharmaceutical and biotechnology research. Time named the test its Invention of the Year in 2008, and continued buzz allowed the company to raise more than $50 million from investors and eventually drop its selling price to $99.

“You’re donating your genetic information,” 23andMe co-founder and CEO Anne Wojcicki told Time in 2008. “We could make great discoveries if we just had more information. We all carry this information, and if we bring it together and democratize it, we could really change health care.”

The Dark Years

2010 turned out to be a bad year for 23andMe. The FDA has long allowed testing kits, such as pregnancy tests, to be sold directly to consumers. 23andMe sold its “personal genome service” (PGS) under the assumption that it was allowable as a home testing kit, but the FDA had reservations, and sent the company a letter that year saying it considered the kits “medical devices,” and thus much more heavily regulated. The company was also dragged before a congressional committee to respond to damaging allegations from a sting operation conducted by the US Government Accountability Office. For a more thorough summary of this affair, check out this post from the excellent (and sadly inactive) science blog Genomes Unzipped.

After years of debate, a stern warning letter in November 2013 seemed to seal 23andMe’s fate – stop selling health tests until you get approval.

“The Food and Drug Administration (FDA) is sending you this letter because you are marketing the 23andMe Saliva Collection Kit and Personal Genome Service (PGS) without marketing clearance or approval in violation of the Federal Food, Drug and Cosmetic Act (the FD&C Act),” wrote Alberto Gutierrez of the Office of In Vitro Diagnostics and Radiological Health. “[…] Most of the intended uses for PGS listed on your website, a list that has grown over time, are medical device uses under section 201(h) of the FD&C Act. Most of these uses have not been classified and thus require premarket approval or de novo classification, as FDA has explained to you on numerous occasions.”

23andMe was chastened, and quickly announced that it would discontinue all of its health-related testing, thus limiting its products only to ancestry information and raw genetic sequencing. Though the FDA intended to limit inaccurate diagnostic use of a person’s genome, its rebuke of 23andMe was widely viewed as overbearing and pointless. Science writer Razib Khan argued in Slate that this decision “highlights the tension between the paternalistic medical establishment that arose to deal with the dangers of 19th-century quack medicine, and a ‘techno-populist’ element of American society pioneering personal health assessment and decision-making by leveraging new information technologies.”

“The glaring weakness in an aggressive strategy against interpretative services is that there will always be firms such as 23andMe, and there’s no reason that they need to be based out of the United States. Not only that, but there are open-source desktop applications, such as Promethease, that provide many of the same results by combining individual raw data with public peer-reviewed literature, if less slickly than 23andMe. To truly eliminate the public health threat that the FDA is concerned about, the U.S. government would have to constrict and regulate the whole information ecology, not just a strategic portion of it, from scientists distributing research about genetic variants, to international genome sequencing firms returning raw results on the cheap.”

On the other side, some argued that 23andMe was brazen in its defiance of the FDA, and it should have seen this coming. In a column for Forbes, health writer Matthew Herper called out the tech company for having gone more than six months at a time without communicating with the FDA.

“Either 23andMe is deliberately trying to force a battle with the FDA, which I think would potentially win points for the movement the company represents but kill the company itself, or it is simply guilty of the single dumbest regulatory strategy I have seen in 13 years of covering the Food and Drug Administration.”

Today

In its announcement last week, the FDA not only authorized the single test for Bloom syndrome, but also exempted all future carrier tests from “premarket review” by the agency. In particular, the language of the announcement seems to express a newfound willingness to accept direct-to-consumer genetic testing as part of 21st-century healthcare.

“The FDA believes that in many circumstances it is not necessary for consumers to go through a licensed practitioner to have direct access to their personal genetic information,” wrote Gutierrez. “Today’s authorization and accompanying classification, along with FDA’s intent to exempt these devices from FDA premarket review, supports innovation and will ultimately benefit consumers.”

“This is a major milestone for our company and for consumers who want direct access to genetic testing,” Wojcicki said in a press release. “We have more work to do, but we remain committed to pursuing a regulatory path for additional tests and bringing the health reports back to the US market.”

Now, it seems only time will tell when the Personal Genome Service will return to the United States. I know I’m very eager to try it when it does.