Smallpox: vials found in NIH lab

July 9th, 2014

I was glancing through The Wall Street Journal. this morning (that period after “Journal” is intentional, as I found out recently in their 125th anniversary issue) and saw an article about smallpox, that old enemy of mankind. The CDC issued a media statement saying six vials labeled with the technical name of the disease, variola, had been found in an old storage room belonging to an FDA lab on the NIH campus in Bethesda, Maryland. Forty-two years ago the FDA took over that lab, among others, and only now were those labs being moved to the main FDA location in the DC area. The vials themselves date back ~60 years and will now be tested to see if the material in them is viable (i.e., live smallpox viruses).

I reviewed the CDC’s biosafety levels (BSLs); they range from 1 to 4, with more serious infectious agents occupying the higher levels. A BSL-3 agent can cause serious or deadly disease, but either doesn’t spread from person to person (at least not easily) and/or has a known preventive measure or treatment. Plague, rabies virus and West Nile virus fit into this category. Smallpox is obviously a BSL-4 bug, the most dangerous kind, in the company of Ebola virus. A February 15, 2012 Reuters article, “How secure are labs handling the world’s deadliest pathogens?” described the precautions used in such a lab in Galveston, Texas. The boss there gained entry by swiping a key card and was scanned by 100+ closed-circuit cameras as he opened seven additional locked doors before he reached the lab, where another card swipe and a fingerprint scan were necessary for entry. The Washington Post article on the recently found vials has a six-minute video on BSL-4 procedures with a comment that there are three overlapping types of safety precautions: containment of the hazardous material, personal protection and overall administrative controls.

And this may get you into BSL-3

The air flow and exhaust systems used in Galveston, the full-body suits with their own air supply and the intruder drills that are conducted all made me somewhat more comfortable. But that’s in a government-funded laboratory. Even in the United States, a privately funded lab may not be subject to the same rules and regulations. Elsewhere the procedures that must be followed vary. In 2011 there were 24 known BSL-4 labs in the world with six in the U.S. (The GAO said we had considerably more.) In 2013 there was considerable protest in Boston over the proposed BSL-3 and BSL-4 lab there.

We don’t see these anymore.

I’ve written about smallpox before, but a brief history, available online on a dermatology website, was worth reviewing. The disease likely originated in Africa about 12,000 years ago, caused a major epidemic during an Egyptian-Hittite war in 1350 B.C.E. and left typical scars on the mummy of Pharaoh Ramses V, who died in 1157 B.C.E. It got to Europe somewhere between the 5th and 7th centuries C.E.; millions died in Europe and the Western Hemisphere before Edward Jenner developed vaccination in 1796. The term came from the Latin word for cow (vacca), as Jenner used fluid from a cowpox-infected dairymaid’s hand to inoculate an eight-year-old boy. In 1967 WHO estimated there were 15 million cases of smallpox and 2 million deaths from the disease. Total smallpox deaths over the past 12,000 years have been estimated at 300-500 million, more than in all the world’s wars combined.

In 1980 the World Health Organization declared the globe smallpox-free. In this country, we quit vaccinating the general population in 1972 and stopped giving the inoculation to new military personnel in 1990.

My wife’s old shot record shows she got her first vaccination against smallpox in 1956 and her last booster in 1980. We were both assigned to bases in the Far East in the early and mid-1980s. I can’t find my military vaccination record from that time frame, but logically I wouldn’t have had a booster after 1986, when I got back to a base in Texas. Since immunity is unlikely to last more than ten years, at this stage we’d both be vulnerable to smallpox, like most everyone else.

The only known stocks of the virus remain in government laboratories in the United States and Russia. Starting in the early 1990s there has been considerable international pressure to destroy those specimens, but thus far neither country wants to give in. One rationale for keeping them is that the genetic structure of the virus is known, so it could conceivably be recreated by terrorists.

In 2004 the CDC said it had enough smallpox vaccine stockpiled to vaccinate everyone in the United States. I haven’t found any updates on that statement. But the U.S. military was still giving those shots to personnel going to USCENTCOM areas (the Middle East and the “stans”) until the middle of May, 2014; to those going to Korea for another two weeks after that; and to some special-mission troops beyond that, with an end date unspecified.

So now it’s the middle of 2014 and, in one manner or another, smallpox is still lingering, fortunately not as an active disease. The CDC is testing those rediscovered vials of the virus and we’ll hear in a couple of weeks if they were still viable.

Long-term acute care hospitals

June 30th, 2014

An article in The New York Times (NYT) several days ago opened a new topic and revisited an old discussion in our household. The title was telling, “At These Hospitals Recovery is Rare, but Comfort is Not,” and the piece talked about what Medicare calls long-term care hospitals (LTCHs). I had never heard of the term.

The article said there were 400 of these facilities in the United States, but lots of practicing physicians are unaware of them. I did an online search and found a 20-bed facility in this category about 15 miles from where we live and a 63-bed hospital in Denver, roughly 65 miles away. I couldn’t find any in the southeastern part of Wyoming, 40-50 miles north of us.

The Medicare website on Long-Term Care hospitals says they focus on those whose inpatient stay is likely to be more than 25 days. The contrast is stark as this is an age when many surgeries are done in a technically “outpatient” fashion (the current definition of an inpatient says you’re in the hospital at least two midnights). Medicare says LTCHs specialize in treating patients, even those with more than one serious medical condition, who may improve with time and eventually return home.

Yet the NYT piece talks of patients who are critically ill, may be unresponsive, even comatose, and, except for those who are younger and have been in an accident, may stay for months, years, or the rest of their lives. In 2010 another NYT article discussed significant issues with some LTCHs.

At that point my wife and I both said, “Don’t put me in a LTCH!” We are 73 years old, relatively healthy at the present time, and enjoy life. We have living wills and medical durable power of attorney documents naming each other as first decision makers if we can’t choose for ourselves.

I’ve mentioned before how my parents approached this quandary. Mom had a cardiac arrest at age 74, was resuscitated by Dad who was still a practicing physician, and lived 16 more years. But when she was in her declining last four years and they had moved from totally independent living to a seniors’ residence, they encountered a situation that influenced their future decisions. Mom had a minor acute illness and moved short-term into an associated facility.

She was there for a few days to get some antibiotics and nursing care, but in the next room was a woman, the wife of one of their friends, who had been in extended care for seven years. For the last four of those she no longer recognized her husband, yet he requested treatment of her bouts of pneumonia on three separate occasions. Dad and Mom each said, “Don’t do that to me!” They had signed the appropriate end-of-life documents before Mom showed signs of initial dementia.

A 2011 article in Kaiser Health News stresses that making end-of-life decisions can be tough, especially if they aren’t made in advance. But a professor of ethics was quoted as saying more than 90% of all families who have a loved one in intensive care want to hear prognostic information that will help them make those difficult decisions.

Hospital care has changed, a lot, since I last saw inpatients. It used to be that the physician who organized your treatment was the same one you saw in her or his outpatient office. Now the primary care physicians I know, unless they are part of a residency program, don’t see their long-term patients at all when those patients are hospitalized. Instead patients see an ER doc, a hospitalist (a physician whose focus is inpatient care) and, if they go to an ICU, an intensivist.

Intensivists are physicians who have completed specialty training, often in internal medicine or anesthesiology, and then taken an additional two-to-three-year fellowship in critical care medicine (some are triple board certified, in lung disease (pulmonology), for example). They are often thought of as primary critical care physicians, and in major academically oriented clinics and their associated hospitals (e.g., the Cleveland Clinic), they may provide most or all of the physician care in the ICU.

Do you need an intensivist?

The article from the NYT said we spend over $25 billion a year on long-term acute care in the United States. The path to landing in a LTCH often begins in an ICU. The major task for intensivists is keeping patients alive during critical illnesses. That often means deciding on short- or long-term ventilator support; the possibility of a tracheostomy (a surgically created hole through the front of the neck into the trachea, AKA windpipe) to allow this; feeding tubes of several varieties; or long-term intravenous access.

I don’t ever want to be on a ventilator long-term. I might allow one short-term if I had a clearly treatable, potentially curable patch of pneumonia, for instance, but I would want to set a time allowance for that.

When my mother quit eating, her physician wanted to create a long-term method of feeding her, a percutaneous endoscopic gastrostomy (PEG). If someone cannot eat and needs to be fed long term, one method of doing so is to place a PEG tube through the wall of the abdomen directly into the stomach.

This could be done for someone who has had a stroke and is at risk of aspirating food if fed normally. In my Mom’s case, by then she had developed significant dementia and Dad said, “We’ve made our decisions; she is not having a PEG tube.”

She could have gone into a LTCH and lived a while longer, but Dad knew that her refusal to eat meant she had come to a logical stop point.

There are an estimated 380,000 patients in LTCHs at present. Some (roughly 10 to 15%) are there for appropriate reasons and have a reasonable chance of recovery; many are not. A study by a Duke critical care specialist found half who enter these facilities die within a year; a majority of the remainder are in “custodial care.”

I don’t choose to join their ranks.

So there are some decisions that you and your family may want to make. I’d suggest you read the NYT articles and think about what your choices might be. It’s never easy, but a careful discussion in advance with your long-term goals in mind makes sense.


Dengue fever and its major mosquito vector

June 21st, 2014

I don’t like being bitten by mosquitoes any more than the rest of you do, but worldwide the real reason to avoid them, kill them or alter them is the enormous disease burden they cause. One recent estimate, surprising to me, said “mosquitoes have been responsible for half the deaths in human history.” I was aware, having lived as an Air Force physician in the Philippines and traveled in South America and Africa, that malaria was one enormously dangerous, mosquito-borne disease, but there’s a long list of other illnesses that contribute to the threat from these insects.

This one doesn’t carry dengue

From 1690 to 1905 major epidemics of yellow fever struck parts of southern and eastern America: Boston, New York, Philadelphia and New Orleans, killing over 40,000 people. A 2006 PBS website gives short summaries of nine of the outbreaks and alludes to even larger mortality figures.

And then there’s dengue, a disease primarily transmitted by the bite of infected female Aedes aegypti mosquitoes. They don’t make the telltale sound that alerts you to other mosquitoes; they also strike during the daytime and may follow their human target, biting repeatedly.

Dengue attacks 400 million people every year worldwide, mostly in the tropics and sub-tropics. Three-fourths of those infected never develop symptoms, and of the remaining 100 million, a large majority have a mild to moderate nonspecific acute illness with a fever. But 5% can have severe, even life-threatening disease with terrible joint and muscle pain (it’s called break-bone fever), hemorrhages and shock. The World Health Organization estimates 22,000 die from dengue yearly, but other estimates range from 12,000 to 50,000.
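
To put those figures side by side, here’s a minimal back-of-the-envelope sketch in Python; reading the 5% severe fraction as applying to symptomatic cases is my assumption, not something the sources spell out.

```python
# Back-of-the-envelope dengue arithmetic from the figures above.
# Assumption: the 5% "severe" fraction applies to symptomatic cases.
infections = 400_000_000           # yearly infections worldwide
symptomatic = infections * 0.25    # three-fourths never develop symptoms
severe = symptomatic * 0.05        # ~5% of symptomatic cases turn severe
deaths = 22_000                    # WHO estimate (others: 12,000-50,000)

print(f"symptomatic: {symptomatic:,.0f}")                # 100,000,000
print(f"severe: {severe:,.0f}")                          # 5,000,000
print(f"deaths per severe case: {deaths / severe:.2%}")  # ~0.44%
```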

The first known case in the United States occurred in Philadelphia in 1780 and was documented by Benjamin Rush, the distinguished physician who was a signer of the Declaration of Independence.

The Centers for Disease Control and Prevention (AKA the CDC) has an entire chapter on dengue in its “Infectious Diseases Related to Travel” publication and a shorter version with links for travelers. Their maps of disease distribution focus on warmer areas in Africa, Central and South America, Asia and Oceania.

There has been no vaccine available to prevent the disease and no specific anti-viral treatment for those with severe cases of dengue. Because of known bleeding complications, those who get dengue are advised to avoid taking aspirin or any of the nonsteroidal anti-inflammatory drugs, AKA NSAIDs, such as ibuprofen.

The continental United States was essentially dengue-free for over fifty years, but marked increases in dengue infection rates have occurred in our hemisphere, mostly in South America and Mexico.

Now Aedes aegypti is back in Florida, Texas and Hawaii. An article in The New Yorker mentioned a small 2009 outbreak of dengue in Key West with fewer than 30 cases, but that was the first real brush with the disease there in over seventy years. In 2010 there were twice as many cases. An entomologist (insect specialist) with the Florida Keys Mosquito Control District reminded the reporters that the manner in which the populace lived was crucial; from 1980 to 1999 there were only sixty-four cases on the Texas side of the Rio Grande and 100 times as many just across the river.

What was the difference? Likely screens on windows, cars with AC running and windows closed and how often people were exposed outdoors. Key West, in a 2013 followup, had seen no further cases, but the World Health Organization called dengue the “fastest-spreading vector-borne viral disease,” saying cases had gone up thirty-fold over half a century.

Why has this happened and what can be done about it?

How can we do this?

Is this another consequence of global warming? After all, dengue has appeared in France and Croatia for the first time. But I just watched an online video by Dr. Paul Reiter, a world-famous medical entomologist who spent much of his professional career at the CDC’s Dengue Control branch. It was obvious that he does not believe in man-made global warming (I do) or that any form of global temperature change is responsible for the spread of malaria or dengue.

How about used tires? He thinks they are great incubators for mosquitoes and billions of those tires have been moved around the globe. So Aedes aegypti has adapted to the city, in part because of our habit of having water-containing used tires around the places where we live.

I don’t have any old tires in my yard and I change the dog’s water bowl and the bird water outside frequently.

A few new ideas are out there: a British company called Oxitec has genetically modified (GM) mosquitoes, making the males able to mate, but also giving them a gene which kills their offspring soon after they hatch. An initial field trial in Brazil was successful in markedly reducing the population of disease-carrying adult females (remember, males don’t bite humans for a blood meal; females do).

Further field trials of these GM mosquitoes, designated OX513A, have met with considerable opposition, and an engineer involved has published a paper examining the ethical issues. The lifespan of mosquitoes is short and they don’t appear to be a major food source for other creatures; the most significant issue is likely fully informing the people in the test area, some of whom may consider OX513A to be just another threat.

A French pharmaceutical company recently announced an experimental vaccine for dengue was moderately successful in a late-stage, placebo-controlled clinical trial involving 10,000 children in Southeast Asia, reducing dengue incidence by 56%. A similar clinical trial is underway in South America.

It’s a bad disease, coming back at us, but perhaps there’s some good news on the horizon.


Using (or at least minimizing) our food waste

May 21st, 2014

I recently read an article in The New York Times with the interesting title, “Recycling the Leftovers.” It was written by Stephanie Strom, one of their regular correspondents, and covered a variety of programs in America for recycling food scraps. Lynnette and I have been separating our own waste streams for at least ten years and have a garbage bin, a trash sack, a recycle sack and a composting pail in our kitchen and laundry room. Our waste-collecting company keeps adding new items that can be recycled, but at present we only put out two containers for them: trash goes to the curb to be picked up weekly and recyclables go out every other week.

Composting is one approach to food waste.

Now the city of Austin, Texas has plans to markedly extend its food waste pilot project; Strom’s article says 14,000 Austin residences currently have a third garbage bin, one for food scraps, collected weekly. Twenty-five years ago the city started with a “Dillo Dirt” program; the city made over a quarter million dollars last year selling the end product, compost made from yard clippings and treated sewage sludge. The newer approach, adding organic waste, currently has enrolled less than 10% of the city’s ~185,000 households; the plan is for all of them to be offered the service. I’m unaware of a city-wide program here in Fort Collins for food scrap recycling; our scraps end up in a vermiculture bin that’s outdoors, but in a fenced-in corner. The worms doing most of the work in turning food waste into compost have thus far survived our winters.

The concept is being highlighted nationally by the U.S. Food Waste Challenge (FWC), a joint project of the U.S. Department of Agriculture (USDA) and the Environmental Protection Agency (EPA). The goal of the FWC is to bring about a “fundamental shift” in the way we manage and even think about food and food waste. The USDA/EPA wants to have 400 organizations enrolled this year and 1,000 by 2020, and they are well on their way already, with an impressive list of governmental and private partners including companies, colleges and universities, K-12 schools and at least one major hospital having joined.

We as individuals can’t join the FWC, but there is a webpage of suggestions for consumers. Basically it says shop thoughtfully, cook with care, serve portions that you’ll eat then and there, save whatever can be kept (while eating what would otherwise spoil) and, if possible, grow part of your meal. It also mentions we should shop our own refrigerators first; plan meals before we go grocery shopping so as to buy only those items we actually need; freeze, preserve or can seasonal produce; ask restaurants to bag up leftovers; and be realistic at all-you-can-eat buffets.

I was at a writers’ meeting recently and drove to the event with my long-time writing mentor. She said her family almost always eats everything she buys, but even with a husband and three teenagers on board I knew she was being modest. She obviously shops carefully and plans ahead.

Our lunch yesterday featured a quiche my wife (professionally a Jung) made that was “Jung Fun.” It wasn’t your typical recipe, but used up everything in the vegetable drawer that needed to be eaten ASAP. We still occasionally have spoiled vegetables and fruits, especially when our CSA gives us more than its usual abundance, but those go into the compost bin.

We did go to the CSA a few days ago to purchase four beefsteak tomato plants. We’ve got a special above-ground gadget for planting tomatoes and have consistently done well with those we bought at a nursery, but, having grown up eating beefsteak tomatoes, I’m really looking forward to having an abundance of them. Our local grocery store generally has good produce, much of it grown locally or regionally, yet it’s been my experience that homegrown tomatoes are several orders of magnitude better than anything I can buy at a store.

Beefsteak tomatoes are yummy!

The EPA’s Food Recovery Challenge webpage has a horrifying set of statistics from 2011 (they’re still collecting/collating the 2013 stats apparently, but what happened to 2012?). Almost all (96%) of the 36 million tons of food waste generated in 2011 ended up in landfills or incinerators. The food sent to landfills breaks down and releases methane, a nasty greenhouse gas twenty times as effective at increasing global temperature as CO2. More than a third of all methane released into the atmosphere comes from landfills (domesticated livestock accounts for 20% and natural gas and oil systems another 20%).

While all that food is being wasted and much of it is contributing to global climate changes, 14.9% of U.S. households were food insecure in 2011, not knowing where they’d get their next meal. Fortunately we have a strong local Food Bank serving Larimer County and their “Plant It Forward” campaign’s 2014 goal is to obtain 15,000 pounds of produce donated by local gardeners.

So where are you in the nationwide quest to cut food waste?

 

Food Safety Issues: America in 2014

March 19th, 2014

Having written recently about China’s food problems, I knew there were some remaining in the United States, but their scope amazed me. Each year forty-eight million of us suffer from food poisoning. Over 125,000 of that group are ill enough to be hospitalized and 3,000 die.

Having seen those numbers on a government website, I decided to review the modern timeline of food-related illness in America and how our laws help prevent it.

One step in meat processing

My initial thought was of Upton Sinclair’s 1906 novel, The Jungle, a powerful exposé of the American meat-packing industry. After its publication, public outcry led President Theodore Roosevelt to appoint a special commission to verify Sinclair’s tale of the horrors of meat production in Chicago and elsewhere, and eventually led to the Meat Inspection Act of 1906 and the Pure Food and Drug Act.

For many years a so-called “Poke and Sniff” system prevailed. The 1906 law said federal employees could inspect and approve (or disapprove) any animal carcasses which were to be sold across state lines. The inspectors could physically monitor the slaughter line, touching and smelling the animals and cuts of meat. They could remove any meat that was rotten, damaged or had abrasions or growths. Some felt that provided only minimal protection for the public, but that’s what we had for over eighty years.

I grew up in Wisconsin in the ’40s and ’50s. My father, in addition to his medical practice, was the local Public Health Officer, and I remember going to inspect local area dairy herds with his sanitarian when I was a teenager. I don’t recall major food safety issues surfacing in those decades, although there may have been some isolated cases that I didn’t pay attention to.

I was in medical school from 1962 to 1966. During that time, two women died in Michigan from botulism, a rare but extremely serious paralytic disease caused by a toxin produced by a bacterium. In their case the toxin was in canned tuna fish. There were other botulism outbreaks in 1971, 1977, 1978 and 1983, with 59 people being affected in the largest such episode. All were related to food being improperly canned or prepared.

In 1985 a huge outbreak of another form of food poisoning happened. This one involved at least 16,284 people (and perhaps up to 200,000) in six different states and was caused by bacterial contamination of milk.

Some new laws only applied to a few food items.

The Department of Agriculture’s food safety and inspection timeline appears to skip over a considerable period of time, although a number of laws were passed to strengthen federal regulation of the food chain. The 1957 Poultry Products Inspection Act and the 1970 Egg Products Inspection Act added to the government’s ability to prevent food-related illness in specific areas, but wouldn’t have prevented the major food-related episodes I just mentioned.

Then in late 1992 and early 1993 an E. coli outbreak sickened 623 people and killed 4 children in four western states (Washington, Idaho, Nevada and California). It was eventually traced to contamination of undercooked Jack in the Box hamburgers with that common bowel bacterium. Those affected developed bloody diarrhea and, in a few cases, severe kidney disease from an entity termed hemolytic-uremic syndrome (HUS). HUS is the most common cause of acute kidney failure in children and usually occurs when an infection in the digestive system produces toxic substances that destroy red blood cells, causing severe kidney injury. The CDC traced the meat back to five slaughter plants in the United States and one in Canada.

In 1998 the USDA introduced a brand-new method for inspecting meat. The Hazard Analysis and Critical Control Point (HACCP) system had been pioneered by NASA. That agency had protected our astronauts by adopting a system of critical control points, anywhere a germ, invisible to the naked eye, could find its way into food meant for a space mission.

Building on the NASA approach, the USDA also mandated that inspectors could order meat plants to do microbial testing. The meat industry became responsible for establishing and submitting its own HACCP plans. USDA would then review each plan, approve it if it seemed appropriate, and inspectors could monitor the plants’ compliance with their own safety plans. The problem is the age-old one of the fox guarding the henhouse; inspectors no longer had the power to physically examine the meat on the line. The acronym HACCP was often derided as “Have a cup of coffee and pray.”

On January 10th, 2014 two articles were published that changed my mind: the first, on UPI.com, simply said, “U.S. food safety a big issue in 2014.” It mentioned that already in 2014 the U.S. Department of Agriculture had shut down a meat-processing facility in Minnesota.

The other online article was written by Dr. Margaret A. Hamburg, the Commissioner of Food and Drugs, i.e., the head of the FDA. It discusses the Food Safety Modernization Act (FSMA), signed by President Obama in early January, 2011. It was a reaction to the figures I mentioned at the start of this article.

This law gave the FDA “a legislative mandate to require comprehensive, science-based preventive controls across the food supply.”

But let’s look at its provisions, some of which make eminent sense while others, in my opinion, ask for the impossible.

On the one hand, the FSMA required food facilities to have a written preventive control plan. I agree with that idea, but note it’s a complex process with multiple steps involved. Such a plan includes evaluating possible hazards; figuring out what one has to do to markedly alleviate or totally eliminate them; noting how you will monitor these control measures (and keep appropriate records); and specifying what you will do when issues arise. Oh, and by the way, you had a year and a half to do all that.

Other parts of the FSMA involved standards for safely producing and harvesting vegetables and fruits, plus another set involving the prevention of “intentional contamination” of food. The latter may be quite difficult. As the law is written, 600 such foreign food facilities must be inspected in its first year, with the number doubling for each of five additional years. Let’s see, that’s 600; 1,200; 2,400; 4,800; 9,600 and 19,200. Where in the world would the FDA get enough trained inspectors? And that’s assuming that the foreign countries would allow such detailed examinations of their food-producing and exporting businesses.
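
That doubling series is easy to check; here’s a quick sketch of the arithmetic (mine, not the FDA’s published schedule beyond the 600-facility starting point):

```python
# FSMA foreign-facility inspections: 600 in year one, doubling for
# each of five additional years, per the provision described above.
inspections = [600 * 2**year for year in range(6)]
print(inspections)        # [600, 1200, 2400, 4800, 9600, 19200]
print(sum(inspections))   # 37800 inspections over the six years
```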

One of every six Americans becomes ill from foodborne disease each year. Only a small fraction of them (approximately 1/4th of 1%) need to be hospitalized, and even of those who do, only 2.3% die. But another way of looking at those mortality statistics is to say it’s equivalent to almost 10% of the number who die from motor vehicle accidents each year in this country.
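
A minimal sketch of that arithmetic follows; the hospitalization count of 128,000 is the CDC’s published estimate (the “over 125,000” quoted earlier), and the roughly 33,000 annual U.S. motor-vehicle deaths is my assumption for that era, not a figure from the article.

```python
# Foodborne-illness arithmetic from the figures quoted above.
ill = 48_000_000          # Americans sickened each year (one in six)
hospitalized = 128_000    # CDC estimate ("over 125,000" above)
deaths = 3_000
mva_deaths = 33_000       # assumed annual U.S. motor-vehicle deaths

print(f"{hospitalized / ill:.2%} of the ill are hospitalized")   # ~0.27%, i.e., 1/4 of 1%
print(f"{deaths / hospitalized:.1%} of those hospitalized die")  # ~2.3%
print(f"{deaths / mva_deaths:.0%} of motor-vehicle deaths")      # ~9%
```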


They’re transplanting what?

February 24th, 2014

I read an article in The New York Times that gave me pause for a moment. It was on fecal transplants. Initially that didn’t seem to make sense. Then I remembered there had been something on this topic last year in the New England Journal of Medicine, a Dutch study done at an academic center with the title “Duodenal Infusion of Donor Feces for Recurrent Clostridium difficile.” So I did a Google search and found the Mayo Clinic’s website for medical professionals, where the subject was titled “Quick, inexpensive and a 90 percent cure rate.”

Why in the world would you need this kind of a transplant? Well, let’s start with the possibility of eliminating a majority of the 14,000 deaths a year in this country alone. The CDC website on the bacterial overgrowth that can cause the issue is a good resource, but let’s start with a few basics. Your intestines normally have lots of different kinds of bacteria, up to 1,000 varieties according to some experts. The term “gut flora” is often used to mean our normal sea of bowel bacteria. But when you take antibiotics, especially for a prolonged time, you run a risk of upsetting the balance between bacterial species and having some (that are normally harmless) cause major problems.

I may need another roll after this one!

One of these kinds of potentially nasty “bugs” is technically called Clostridium difficile, or C. diff for short. The WebMD site has an easy-to-understand short tutorial on C. diff. When it becomes the predominant gut flora, it releases toxins that attack the bowel lining and cause severe diarrhea with up to 15 watery stools a day, fever, weight loss, abdominal pain and blood or pus in the stools. The disease often hits older patients (those over 65, so I’m in that higher-risk category) and, in the past, was usually treated with one of three antibiotics given orally. Up to a quarter of those so treated need a second round of antibiotics.

The Dutch study randomly assigned patients to standard treatment with a drug called vancomycin, or the same drug plus four liters of a bowel-cleansing solution, or the drug plus that bowel washout plus infusion of a solution of donor feces through a tube inserted through the patient’s nose and into the stomach (typically called an NG tube, shorthand for nasogastric). Fewer than a third of those in the first two groups had their diarrhea resolve, while 81% of those given the fecal infusion (13 of 16) improved after one treatment, and only one of the three remaining patients didn’t improve after a second infusion.
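
As I read that paragraph, the cure rates stack up as follows (a sketch of the arithmetic, not the paper’s own table):

```python
# Fecal-infusion arm of the Dutch trial, per the description above.
treated = 16
cured_first = 13    # 13 of 16 resolved after one infusion
cured_second = 2    # 2 of the 3 remaining improved after a second

print(f"after first infusion: {cured_first / treated:.0%}")                   # 81%
print(f"after second infusion: {(cured_first + cured_second) / treated:.0%}") # 94%
```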

One Mayo Clinic branch had tried a fecal transplant in 2011 in a patient with severe C. diff colitis (inflammation of the large bowel). In that case the medical staff infused the patient with their brother’s stool via a colonoscope instead of an NG tube, therefore going up the intestinal tract rather than down and getting right to the colon. The patient had been bedridden for weeks prior to the procedure, but was able to go home within one day after it.

Since then the same Mayo branch has done 24 fecal transplants. Every one of the patients had their infection go away within a short period of time; only two had a recurrence of the disease (both had other illnesses). The senior nurse who played the major role in starting the Mayo program interviewed every patient and said their quality of life improved tremendously. Mayo now uses the procedure only for those who have severe relapsing C. diff infections, but is researching its use in other medical diseases.

Then in 2012 Mark Smith, a doctoral candidate, launched OpenBiome with three colleagues. It’s a nonprofit 501(c)(3) organization they founded after a family member/friend had gone through seven rounds of vancomycin for a C. diff infection that lasted a year and a half. They call the procedure Fecal Microbiota Transplantation (FMT) and, according to the New York Times article, they’ve supplied more than 135 frozen, ready-to-use fecal microbiota preparations to over a dozen hospitals in the last five months. Much of the work is done in M.I.T.’s laboratories. All the medical facility needs is a doctor with an endoscope.

So have we solved the C. diff overgrowth problem or nearly so? I went back to a July 12, 2010 article in The New York Times titled “How Microbes Defend and Define Us.” It described the work of a University of Minnesota gastroenterologist, Dr. Alexander Khoruts, who not only performed a fecal transplant on a woman with an intractable C. diff gut infection, but also looked closely at what bacteria were in her intestines before and after the procedure.

In this case the donor was the patient’s husband and the analysis of the gut flora revealed his bacteria had taken over, supplanting the abnormal bacteria that were there before the transplant.

Khoruts continued to use the new procedure, performing fifteen transplants by 2010 with 13 cures, but according to the NYT article he is now concerned that OpenBiome’s model is just an early step. The Food and Drug Administration, in early 2013, classified fecal transplants as biological drugs. As such, any clinician who wished to use them would need to file an Investigational New Drug (IND) application, much as a pharmaceutical company would in developing a new antibiotic.

Since then the FDA has relaxed its ruling slightly, saying doctors performing FMTs for C. diff wouldn’t be prosecuted. Smith and colleagues want FMT to be classified as a tissue, not a drug, allowing more research to be done on the procedure in other diseases and conditions and, at the same time, letting clinicians use FMT, at least for C. diff, without an IND permit or fear of the FDA reversing its stance on such therapy.

There are a host of other diseases where FMT has been suggested as possibly effective in treatment. Some seem farfetched to me at first glance, but investigators appear interested in pursuing research on many of those conditions. I bet they would need an IND in such cases, even if FMT is reclassified as a tissue.

We all have bacteria in our colons, but in other places too.

Khoruts and others think FMT for C. diff is just the tip of the iceberg. The NIH has been carrying out a huge Human Microbiome Project since 2007, with the first five years of the investigational study devoted to cataloging the microbiome of our noses, skin, mouths, GI tracts and urogenital systems. That term refers to the aggregate of all the microorganisms, including bacteria and fungi, that live on and in our bodies. From 2013 on they have shifted gears, aiming at an integrated dataset of biological properties in microbiome-associated diseases.

Having read a number of papers and looked at a variety of source materials on the concept I’m no longer astounded by the idea. It still sounds strange, but obviously reputable academic centers have pioneered this research with great results.

One question that seems unresolved was highlighted on a patient website: is my bowel flora the same as that of someone who lives in another part of the world and eats a totally different diet?

But it seems like FMT, in one form or another, is here to stay.

Electronic Health Records & Medical Scribes

February 5th, 2014

Turn over the data entry to someone else, doctor.

Recently, in the online version of The New York Times, I saw an article by Katie Hafner titled “A Busy Doctor’s Right Hand, Ever Ready to Type.” It described a new movement among medical personnel: hiring scribes to make entries into an Electronic Health Record (EHR).

The concept made great sense to me, but it’s clearly not a new one. Our ophthalmologists have, over the last fourteen years, routinely had an assistant who entered data into some form of a medical record, allowing the physician to concentrate on examining us.

Only five years back the use of an EHR was clearly the exception for other medical personnel, with perhaps a tenth of physician office practices and hospitals utilizing them. Now that percentage is well over two-thirds.

So what are the problems with universal acceptance of EHRs?

One that I touched on in my previous post on EHRs is interoperability between different health-record systems. My translation of that term is that Dr. A, using, for instance, Epic at a UCH site like our local hospital, should be able to access and read my medical record from the Department of Defense or Veterans Affairs systems. At the moment I doubt that’s even remotely possible, and there will obviously be issues with patient confidentiality. Those should eventually be solvable, although the mechanism for doing so is well beyond my computer skill level.

But, for an individual practitioner, on a day-by-day patient-care basis, there’s a whole other set of issues.

I had mentioned in a recent post our pleasure at watching a Family Practice intern who kept eye contact with her patient (in this case my wife) while she examined her and informed her about test results.

The intern wasn’t entering data, and there’s the rub with an EHR. She presumably had the choice of doing her examination and keeping as much eye contact as possible with her patient while remembering all the accumulated data points, versus typing while she asked questions and, if she were a typical doc typist, looking at the keyboard and the screen much of the time.

The opposite end of the spectrum was a nurse who, in order to give Lynnette an ibuprofen tablet, spent twelve minutes (I timed the interaction) between my request for her pain med and the pill being put in her mouth, mostly on the computer, occasionally glancing up to ask a question (e.g., “On a scale of one to ten, what is your pain level? What is your full name and date of birth?”, the fourth time she’d asked that during her shift).

As the EHR has grown more complex, with more mandated information being necessitated by organizational, certifying and governmental entities, the potential for increased human-machine time has grown hugely, while the doctor-patient segment of a physician’s day is squeezed.

The potential for burnout of physicians, especially in emergency medicine, family practice and primary care internal medicine, has increased. The link is to a free article that appeared in the Archives of Internal Medicine in 2012 comparing physicians’ burnout and their satisfaction with balancing work and outside life to those of others in the United States. The bottom line was that, of the 7,000+ docs who filled in a survey, over 45% had some symptoms of burnout, and they were much less satisfied with their ability to find a counterpoise between their work time and the rest of their life than those with comparable professional degrees.

Burnout meant less enthusiasm for work, development of cynicism and less of a sense of accomplishment than those of us who practiced medicine years ago had. There are lots of components as to why this has become more common among “front-line” physicians, but as I’ve talked to some recently the EHR has been a very significant contributor.

This was a somewhat unexpected development for me, although based on what I had seen with my radiologists attempting to dictate into an earlier version of an EHR in 1988-1991, not one that I should have been surprised by.

Adding one more to the medical team should be easy.

There is a growing industry providing medical scribes to physicians and others and, since 2010, certification has been available through a non-profit, the American College of Medical Scribe Specialists. I was somewhat surprised that patients not only haven’t objected to a scribe being present, but often have warmly welcomed them. They may be introduced as “my data entry specialist.” Obviously, in teaching hospitals, patients see a team of physicians already. Only the most intimate parts of a physical examination would need to be conducted on a one-on-one basis. Then the scribe could be on the other side of a curtain and the doctor would verbally describe her or his findings.

If I had the choice of my physician looking at me almost all of the time and, in essence, dictating her findings (my own doctor is female) or having to type much of the time, my choice would be simple.

Then there’s the possibility of a remote scribe. I had envisioned a future EHR which had set areas to be filled in and a practitioner being able to wear a headset and dictate into the EHR directly. I hadn’t realized that some practices already have scribes who may be thousands of miles away from the patient-physician encounter, sometimes in India.

I went back to the New York Times article I mentioned initially and saw a quote from a family medicine physician who said, “Having the scribe has been life-changing.” An article in the journal Health Affairs said two-thirds of a primary care doctor’s time at work was spent on clerical duties that could be done by others. Another doctor said, “Making physicians into secretaries is not a winning proposition.” She had surveyed over 50 primary care practices in the past five years, finding those who used scribes were more satisfied with their work and their choice of careers.

Doctors have been dictating patient records for fifty years, but those transcriptions often made their way to the chart many hours later. Having a scribe could cut that lag time immensely.

With our growing need for primary care physicians and the tendency for medical students to avoid those specialties, aiming toward more financially rewarding and less laborious fields in medicine, the advent of medical scribes may be not only a significant improvement in the lives of those already in front-line medical areas, but an inducement for new prospective physicians to join their ranks.

I’m heartily in favor of the idea.


Electronic Health Records: Conquering a major “con”

January 18th, 2014

The question is how to connect the two.

My first electronic medical record encounter was in 1975 at a not-for-profit hospital in California. I could enter orders for my dialysis patients and retrieve lab test results. I thought it was “better than sliced bread.” I don’t remember any negatives about the system other than not being able to connect to it from the private medical office I shared with another nephrologist. So there were lots of “pros” and no major “cons” as far as I was concerned.

Of course, it wasn’t a complete Electronic Health Record (EHR) and I couldn’t dictate the results of a physical exam or anything else into the system.

In mid-1988 I became the commander of a small Air Force hospital in Texas that was a test site for the Composite Health Care System (CHCS), a Department of Defense effort to have a system-wide EHR. During the preceding six months, when I had been the deputy commander, I was aware there was a rudimentary system in our x-ray department, one that let our radiologists dictate a report. But they had to speak slowly, in an absolute monotone, for it to work.

I attended my first CHCS meeting, with the Assistant Secretary of Defense for Health Affairs (ASD/HA) and all three military Surgeons General seated at the front of a large room. CHCS had morphed into an endless series of blah-colored screens that my docs, nurses and other medical personnel could use to retrieve and enter patient data. At that point I thought it was an elephant designed by committee, a prototype that had a long, long way to go before it was a viable EHR.

I was the junior commander in the room, having been a bird colonel for only three years. Many of the others were long-time colonels or one-stars and even, in a few cases, two-star generals/admirals. After a few introductory remarks, the ASD/HA said, “Colonel Springberg, you’re the new kid on the block; what do you think of CHCS?”

All eyes turned to me and I blurted out, “Frankly, sir, I think it sucks.”

Shocked silence for a moment, then he asked, “What do you mean?”

“My docs hate it, sir. It needs to have a touch-screen or a mouse-able interface or be on a Mac with some colorful screens. As it is, there’s row after row of green lines of questions that can easily put you to sleep.”

I survived that meeting (perhaps just barely) and my own Surgeon General showed up in my office back in Texas a few weeks later. That wasn’t unusual, as he fairly frequently came to the base for events at the Medical Service Training Wing and stopped to talk to me on the way. This time I was concerned he’d want to chastise me for my remarks.

“Peter, do you remember that CHCS meeting?” he asked somewhat rhetorically. “Do you remember what you said?”

My heart skipped a beat or two.

“Well, I agree with you. I just can’t say those kinds of things. Keep it up!”

Twenty-plus years later, DOD was still using a version of CHCS for healthcare administrative purposes and had something called AHLTA (the Armed Forces Health Longitudinal Technology Application; DOD does love acronyms) as its EHR.

Then in May, 2013, the Secretary of Defense announced a plan to replace AHLTA with a commercial EHR with a short-term goal of coordinating with Veterans Affairs to “develop data federation, presentation and enhanced interoperability.”

After I looked up the term “data federation,” it made sense. We’re talking about software allowing an organization to use data from a variety of sources in a number of places with the data itself remaining “in the cloud.”

If you’re speaking about medical records for people who move around the globe and often later stay in an allied system (the VA) after they retire, it would be great to be able to access all or part of an EHR without the need to move physical patient charts.

Then how do they find my old medical records?

I’ve got part of my old military health record sitting on a file cabinet in this room, but what I really would like is for all my records to be accessible to any doc I see, whether it’s my own ex-Air Force Family Practice physician here in Fort Collins, someone at a VA clinic I might happen to stop at on a trip, or a civilian doctor in Canada or Europe I see in an emergency room.

My left shoulder has been painful for six weeks. I saw my physician, got a referral for physical therapy and drove nearly twenty miles to see the PT who works for the local hospital chain (now a part of the University of Colorado) and was moved some time back to an outlying location. She’s really good, so I became one of her groupies, patients who, when they need physical therapy, decide they’ll follow the PT they like best.

If I were still on active duty (it’s been nearly sixteen years since I retired), she might have been sent to Italy or Guam. But twenty miles was doable.

After her usual thorough exam she started entering data into a computer. Epic, the EHR used by University of Colorado Health (UCH), meets the 2010 Patient Protection and Affordable Care Act standards; it was adopted at the main UCH hospital in Denver in 2012, reached the affiliated northern Colorado hospitals and clinics in July 2013, and will extend to other UCH locations by mid-2014. So if I’m seeing a practitioner at any UCH location, they can pull up my EHR onscreen.

There’s now a non-profit, the Healthcare Information and Management Systems Society (HIMSS), an organization formed with the goal of improving healthcare through information technology. As I thought about the issue on the way to the gym yesterday, I realized one problem is defining who can see my medical data.

Medical data privacy is crucial to many, and my first thoughts along this line were rapidly discarded. I don’t want Joe Ripoff in Otherplace, Elsewhere, to easily access my records, and I couldn’t initially think of a way that all medical personnel anywhere could have easy entry to my EHR without some hacker also being able to duplicate the necessary passcode. And if I carried a card in my wallet, it could be pick-pocketed. Even if I had my own personal code, I might forget it or be unconscious.

Then I had an idea that could safeguard my medical record while allowing any practitioner I see while traveling to gain entry to all my stored records. It turned out not to be a new idea at all; others have suggested it for the last fifteen or so years.

My dog has an implanted microchip so if he’s lost someone can scan him and find whom he belongs to. I would be willing to have such a chip in, for instance, the flesh of my arm, modified to contain my entire EHR.

If that technology would allow a medical team anywhere to scan my arm and then retrieve my medical data, it might be worth considering.

This sounded like science fiction, but apparently it’s possible, and it has also caused a furor. I Googled the idea and found that Snopes.com had debunked the rumor that the Affordable Care Act mandated such microchips be implanted in everyone. Supposedly, according to the canard, the chip, about the size of a grain of rice, would also link to your bank account. (It’s not true.) However, an EHR microchip, while conceivable, has been resisted by some religious groups and by many who are concerned that it would lead to Big Brother government being able to track all of our movements. Some have even said the data would be accessible to anyone with a scanner.

I think those objections, except for religious ones, are a stretch. And the data could be encrypted.

So my level of paranoia on the issue being quite low, I’m ready for a microchip.

It should absolutely be your choice, of course, whether you get one or not.


Food Safety Issues–Part one: China

January 9th, 2014

Two recent articles in The New York Times caught my attention and highlighted a marked disparity between China, the most populous country in the world, and the United States, third in the global population list, but with a markedly differing approach to many problems.

To begin with, I knew who was in first, second and third place for the greatest number of residents (citizens and others), but was curious to see who followed, so I Googled “countries with the largest population” and found the numbers on an unexpected website (I suppose I shouldn’t have been surprised). As of July, 2013, the CIA’s listing on their webpage, The World Factbook, says China has a hair under 1.35 billion people, India has 1.22 billion, the entire twenty-eight-country European Union (that number of members is also as of July, 2013) has just over half a billion and America has 316 million. They are followed by Indonesia at 251 million, Brazil slightly over 200 million and Pakistan at 193 million. The other countries with over 100 million inhabitants are Nigeria (175 million), Bangladesh (163 million), Russia at 143 million, Japan with 127 million, Mexico with 116 million and the Philippines with nearly 106 million.

When it comes to land area, Russia clearly leads the pack with over 6.6 million square miles; Canada is second with 3.8 million square miles and, somewhat surprisingly to me, the United States is third with 3.7 million, slightly over China’s size (Alaska, with 586 thousand square miles of land, is the reason). But China’s population density (365 per square mile) is more than four times America’s (84 per square mile).
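
A quick sketch verifying the density comparison from the rounded figures above (small differences from the quoted 365 and 84 come from rounding both land areas to 3.7 million square miles):

```python
# Population density check using the figures quoted above.
china_pop, us_pop = 1_350_000_000, 316_000_000
area_sq_mi = 3_700_000   # both countries are roughly 3.7M square miles

china_density = china_pop / area_sq_mi   # ~365 people per square mile
us_density = us_pop / area_sq_mi         # ~85 people per square mile
print(round(china_density), round(us_density))      # 365 85
print(f"ratio: {china_density / us_density:.1f}x")  # 4.3x, "more than four times"
```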

A significant question is what information is available to people in various countries and what influence the people have on decisions that may affect their health and that of their children. I’m going to stick to China and the United States, but I think I could probably extrapolate to a number of the others in the over-100-million population group.

China needs a “Save Our Soil” stamp

The NYT article about China, written by Edward Wong, was titled “Pollution Rising, Chinese Fear for Soil and Food.” It’s datelined from a village in Hunan Province, the breadbasket of China. Crops raised in the eastern and southern parts of the country include rice, yams, carrots, turnips, cabbage and lotus, while millet, corn and soybeans predominate in northern and northeastern areas. Other major crops include sorghum, barley, tea, cotton and peanuts.

The Hunan village mentioned in the story grows rice, sweet potatoes, turnips, carrots and cabbage. The problem is that the fields on which these crops are produced are far too close to industrial plants; many factories, smelters and mines surround them and the wastewater from those plants is toxic. In May, 2013, officials in Guangdong Province, in the far south, said they had discovered excessive levels of cadmium in 155 batches of rice collected from markets, restaurants and storehouses. Of those, well over half were from Hunan Province.

On December 30, 2013, the Chinese Vice Minister of Land and Resources, Wang Shiyuan, said an area about the size of Belgium or Maryland (about 12,000 square miles), comprising roughly 2% of China’s 135 million hectares (roughly 520,000 square miles) of arable land, was too polluted for growing crops safely. And early in 2013 Wang’s ministry had commented that the results of a five-year, $1 billion soil-pollution survey were being held as a “state secret.” This came out in Bloomberg News online, along with a comment from Minister Wang: “Farming on the land with medium-to-heavy pollution should be discontinued.”

One-sixth of China’s rice is produced in Hunan Province, but so is much of its cadmium, chromium, lead and non-metal arsenic.

A Chinese official admitted the pollution was due to intense industrial development, but also mentioned three other factors I thought were much less likely to be involved (chemical fertilizers, mechanized farming and household garbage).

Cadmium’s effects have been studied in detail in those exposed to inhalation of the metal: they include lung, kidney, bone and reproductive changes. Ingested cadmium is exceedingly toxic to those same systems of the body.

Although the total arable land in China increased in the latest survey, the per capita figure has shrunk, secondary to both population growth and a quickening pace of urbanization. Nearly eleven thousand square miles of former farmland have been absorbed into cities since a 1996 survey. China’s arable land, 135.4 million hectares at the end of 2012, translates into 0.101 hectares per person, far under the world average of 0.225. The redline figure for China is 120 million hectares reserved for agriculture; below that, even at their present population, they would be unable to produce enough food crops for all.
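
The per-capita figure checks out against the numbers in this paragraph (a sketch of the arithmetic; the quoted 0.101 presumably used a slightly smaller population figure):

```python
# Arable-land arithmetic from the figures quoted above.
arable_hectares = 135_400_000   # China's arable land, end of 2012
population = 1_350_000_000
world_avg = 0.225               # hectares per person, per the article

per_capita = arable_hectares / population
print(f"{per_capita:.3f} hectares per person")               # 0.100 (0.101 quoted)
print(f"{per_capita / world_avg:.0%} of the world average")  # ~45%
```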

But toxic chemicals aren’t the only Chinese food issue. A January 2, 2014 BBC article, “Donkey Meat Recalled in China,” described another: donkey meat is apparently a common snack food there, but the Wal-Mart corporation said that government testing revealed that two of its stores in an eastern area of the country (Shandong Province) had sold product contaminated with fox meat.

Wal-Mart plans to reimburse customers who purchased the donkey meat and to upgrade its own DNA testing.

Chinese consumer confidence has plummeted since the melamine scandal of 2007-2008. Initially, pet food contaminated with an industrial compound and exported to the United States caused kidney failure in dogs and cats. Then infant formula, frozen yogurt and one brand of a canned coffee drink in China itself caused six infant deaths and sickened at least 300,000 people. A February 2013 Huffington Post article gave a followup on a theory of why so few died. About 1% of humans have a gut bacterium that metabolizes melamine into a more toxic chemical. So, if that concept is correct, China was very lucky.

Think of the numbers sickened and killed if that microbial species had been present in most of their population.

There is some very good news coming from China as well, however. The world’s largest genomics corporation, started as the Beijing Genomics Institute in 1999 and now called B.G.I., is carrying on major projects to unravel the genetic structure of thousands of economically and scientifically important animals and plants, one goal being to apply the knowledge gained to better treat or even prevent diseases. A January 6, 2014 article in The New Yorker titled “The Gene Factory” featured B.G.I., and our former Chinese graduate student (now with a Pharmacology PhD from the University of Colorado) spoke highly of the work of one of its leading figures.

Maybe China can move this way.

B.G.I. is collaborating with the Bill and Melinda Gates Foundation and major American universities to increase global food production by ten percent. It is also sequencing the genes of rice, cucumbers and chickpeas.

So there’s both bad and some good food safety news coming from the world’s most populous country. But the majority of its people are kept in the dark as to the extent of the problems.

There’s hope in sight: in early 2013 the Chinese State Council set a 2015 goal for measuring soil pollution comprehensively and establishing initial programs for treating those injured by unsafe agricultural products.

Hopefully they will let their citizens know the results of the survey.

 

There’s Silver in Them There Pills

December 25th, 2013

Like most medically-trained people (and hopefully many of the rest of us), I’ve been highly concerned about the rise of drug-resistant microorganisms, bacteria that can’t be treated with our standard antibiotics. A recent article in The Wall Street Journal with the intriguing title “Antibiotics of the Future” offered considerable hope, but let’s look at some background on the subject first.

The WSJ article said that two million patients each year in the United States develop infections that doctors can’t combat with our normal antibiotics; earlier in the year, the CDC, in a report titled “Antibiotic Resistance Threats in the United States, 2013,” estimated that at least 23,000 of them die. The CDC divides the microorganisms, all bacteria except for Candida (a fungus), into three threat levels: urgent, serious and concerning. The first of the three urgent threats is Clostridium difficile, which causes severe, life-threatening diarrhea, often in patients who have been hospitalized and are already on antibiotics; it leads to a quarter-million infections, 14,000 deaths and a billion dollars in medical expenses yearly. The second is the group of carbapenem-resistant Enterobacteriaceae, abbreviated CRE (the carbapenems are powerful antibiotics considered the “drugs of last resort,” used when all other antimicrobials, old and modern, fail or seem likely to fail; the Enterobacteriaceae are bacteria that are part of the normal gut flora).

CRE infections most often happen in patients being treated for other serious conditions. They may be on a respirator, have a long-term catheter in their bladder or have been on other antibiotics. One estimate says there are 9,000 CRE infections a year, causing at least 600 deaths. Patients in intensive care units not infrequently have IV catheters placed in large veins of the neck, chest or groin so that hospital personnel can give medications and draw blood samples over a prolonged period. If these catheters get infected they can cause a bloodstream infection (sepsis is the medical term). About half of all hospital patients whose CRE goes on to cause a bloodstream infection die.

The third urgent threat is the bacterium Neisseria gonorrhoeae, which causes the STD gonorrhea. The CDC estimates more than 800,000 cases occur yearly in the United States and 30% of these are resistant to some antibiotic, but almost all can be treated, at this time, with a two-drug cocktail. Gonorrhea causes severe reproductive-system complications and, the CDC says, “disproportionately affects sexual, racial and ethnic minorities.”

Then there is MRSA, methicillin-resistant Staphylococcus aureus. This bug is classified as serious, not urgent, yet there are roughly 80,000 severe MRSA cases a year and over 11,000 of those patients die. Most major MRSA cases are seen in healthcare settings among patients with weakened immune systems (e.g., those on hemodialysis or receiving cancer therapy), but less serious MRSA can cause problems in otherwise healthy people, including athletes who share towels or razors, children in day care and members of the military in cramped quarters. Some of these infections, usually of the skin, can become severe and life-threatening.

Except for Candida, the CDC report excludes non-bacterial diseases, but I received a reader comment a while back from a person whose website (Mphonline.org) has a post on deadly viruses. Like parasitic diseases such as malaria, viruses through the ages have killed enormous numbers of people. Now we’re facing a future in which bacterial illnesses could overtake them as the prime infectious threats to mankind.

An article in the December 23, 2013 online version of the New York Times described an increased death rate among dolphins, with many dying of viral disease. A number of them also showed evidence of antibiotic-resistant bacteria, presumably from environmental contamination. Dolphins have been termed the modern equivalent of the canary in the coal mine, a biological early-warning system: miners once carried caged canaries because the birds would die of methane or carbon monoxide before the gas reached levels hazardous to humans.

In January 2013 The New England Journal of Medicine published an article titled “The Future of Antibiotics and Resistance.” The lead author, Dr. Brad Spellberg, works where I did my research fellowship. He and two colleagues note that antibiotic-resistant bacteria are considered a leading risk to human health in a major yearly publication of the World Economic Forum (WEF).

The WEF’s 2013 Global Risks report analyzed fifty risks (e.g., economic disparity, religious fanaticism, rising greenhouse-gas emissions, terrorism, water-supply crises), examining their likelihood over the next decade, their impact if they actually happen and how interconnected they are with one another. It used those rankings to generate three major risk cases: one on threats to economic and environmental systems, a second on so-called “digital wildfires” of misinformation, and a third, “The Dangers of Hubris on Human Health,” devoted to antibiotic-resistant bacteria.

In a study done in Europe, 50% of French patients experiencing a flu-like syndrome (FLS) expected their physician to prescribe an antibiotic; FLS may be caused by influenza or other viruses, and antibiotics are of no use against these viral diseases. The WEF piece also mentioned an article reporting that 98% of Chinese children seen in a Beijing pediatric hospital for common colds were given antibiotics.

Huge quantities of antibiotics are being used in animals as well. Animals raised for their meat are often given antibiotics as growth promoters. A 1950 article in Science News announced results from Lederle Laboratories showing that lacing hog feed with trace amounts of an antibiotic could increase the yield of meat by half. Then in 1977 the FDA sent out a notice that it would withdraw approval of non-medical use of penicillin and the tetracyclines, but no hearings on the subject followed that non-binding pronouncement.

A federal district judge finally ordered those FDA hearings in 2012, but an article online less than two weeks ago said only suggestions to the animal-growing industry have resulted. In 2009, more than 3,000,000 kilograms of antibiotics were given to US patients; in 2010, 13,000,000 kilograms were used for animals.
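To put those two totals side by side (a rough comparison, since the figures come from different years):

    # Human (2009) versus animal (2010) antibiotic use in the US, in kilograms,
    # using the round numbers quoted above.
    human_kg = 3_000_000
    animal_kg = 13_000_000
    print(animal_kg / human_kg)  # ~4.3: animals received over four times the human total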

Back to the Wall Street Journal article: it mentions four new approaches to treating these deadly bugs. The two I found most intriguing were research to befuddle bacteria by blocking the signaling chemicals they use to become infectious, and using silver to ease the entry of antibiotics into the microbes.

There’s a way to go before these concepts are translated into bedside medicine, but there is more than a glimmer of hope on the horizon.