Richard G. Petty, MD

Parkinson's Disease and Cholesterol

Within the last week we have talked about the association between Helicobacter pylori and Parkinson’s disease and the way in which Parkinson’s disease may often get better if people are treated with a cocktail of antibiotics. We have also discussed the association between Parkinson’s disease, allergies and inflammation.

Now new research from the University of North Carolina at Chapel Hill has found that people with low levels of LDL cholesterol are more likely to have Parkinson’s disease than people with high LDL levels. This is the form of cholesterol sometimes referred to as "bad cholesterol." The study followed the strange observation that people with Parkinson’s disease have lower rates of heart attack and stroke than people who do not have the disease. It is also known that cigarette smoking, which increases a person’s risk for cardiovascular disease, is associated with a decreased risk of Parkinson’s disease.

Few scientific stories are clear cut: it usually takes a while to get things right. Just to prove it, a study from the Netherlands found that high total cholesterol levels were associated with lower rates of Parkinson’s disease, but only in women.

So what to make of all this: infections, allergies and now cholesterol?

To try and understand this, I think that we need to introduce another actor to the stage. Since the early 1950s the medical community has been concerned about a striking concentration of amyotrophic lateral sclerosis (ALS) and Parkinsonism-dementia among the Chamorro people on the island of Guam. A number of lines of evidence have suggested that this group of illnesses has been caused by some neurotoxic agent in the environment, though nobody has been able to work out exactly what it is. One of the most attractive recent theories is that it might have something to do with toxins from Cycas plants. So the idea is that similar cholesterol-containing neurotoxins can come either from Helicobacter or from eating Cycas plants, or animals that have fed on the plants.

There is a complex inter-relationship between LDL- and HDL-cholesterol, and HDL-cholesterol appears to be anti-inflammatory: high levels of HDL-cholesterol are associated with low levels of inflammation. And it has recently been shown that simvastatin may cut the risk of developing Parkinson’s and Alzheimer’s diseases, not just by lowering cholesterol but through its anti-inflammatory activity.

It may also be that low levels of cholesterol impair the activity of another factor: one that interests me is coenzyme Q10.

From a practical perspective, this new evidence reinforces a point that I made in Healing, Meaning and Purpose and on this blog: "boosting" one component of the blood or lowering another is not sensible. Whether dealing with cholesterol or immunity, we need to modulate and harmonize all the systems of our bodies and our minds.

Follow our systems for modulating the inflammatory mediators in your body, and that alone should, at least theoretically, reduce your risk of many illnesses.

I shall keep you posted as this story continues to develop.

Reducing Your Cancer Risk

“The doctor of the future will give no medicine, but will interest his patients in the care of the human frame, in diet, and in the cause and prevention of disease.”
— Thomas Alva Edison (American Inventor, 1847-1931)

I am sure that you will agree that prevention is better than cure. And this is a good time of the year to review where you are in your life and what you want or need to do for yourself and your loved ones.

According to a study reported in the Lancet in November 2005, more than one third of cancer deaths are attributable to nine modifiable risk factors.

To evaluate exposure to risk factors and relative risk by age, sex, and region, the investigators analyzed data from the Comparative Risk Assessment project and from new sources, and they applied population-attributable fractions for individual and multiple risk factors to site-specific cancer mortality provided by the World Health Organization.

Of the seven million deaths from cancer worldwide in 2001, approximately 2.43 million (35%) were attributable to nine potentially modifiable risk factors. Of these deaths, 0.76 million were in high-income and 1.67 million in low- and middle-income nations; 1.6 million were in men and 0.83 million in women.
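As a quick sanity check, the subtotals reported above are internally consistent; here is a minimal sketch (the figures are the study's, the rounding tolerance is mine):

```python
# Consistency check of the reported cancer-mortality figures (in millions).
total = 7.0                # all cancer deaths worldwide, 2001
attributable = 2.43        # deaths attributable to the nine risk factors

# The attributable fraction should match the reported 35%.
print(f"Attributable fraction: {attributable / total:.0%}")

# Regional and sex subtotals should each sum back to 2.43 million.
high_income, low_middle = 0.76, 1.67
men, women = 1.6, 0.83
print(f"By region: {high_income + low_middle:.2f} million")
print(f"By sex:    {men + women:.2f} million")
```

Both splits add up to the same 2.43 million total, and 2.43/7.0 rounds to the 35% the paper reports.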

Smoking, alcohol use, and low consumption of fruits and vegetables were the leading risk factors for death from cancer worldwide and in low- and middle-income countries. In low- and middle-income regions, Europe and Central Asia had the highest proportion (39%) of deaths from cancer attributable to the nine risk factors studied.

For women in low- and middle-income countries, sexual transmission of human papilloma virus (HPV) was also the leading risk factor for cervical cancer. Smoking, alcohol use, and overweight and obesity were the most important causes of cancer in high-income countries.

Between 1990 and 2001, mortality from cancer decreased by 17% in those aged 30 to 69 years and rose by 0.4% in those older than 70 years, according to the authors, but this decline was smaller than the decline in mortality rates from cardiovascular disease for both men and women. The decline in mortality in men was largely due to reductions in deaths from lung, prostate, and colorectal cancers; in women, lung cancer mortality increased in the 1990s, while death rates for breast and colorectal cancer decreased. An article published almost ten years ago in the journal Cancer Epidemiology, Biomarkers & Prevention estimated the worldwide risk for cancer attributable to infectious agents at 16%.

The nine factors were:

  1. High body mass index
  2. Low fruit and vegetable intake
  3. Physical inactivity
  4. Smoking
  5. Alcohol abuse
  6. Unsafe sex
  7. Urban air pollution
  8. Indoor use of solid fuels
  9. Injections from healthcare settings contaminated with hepatitis B or C virus

This all makes good sense, but it is good to see high quality research in reputable journals confirming what we suspected. The research also gives us further compelling reasons for taking a good look at our lifestyles and hopefully the motivation to do something to improve them. And in the case of air pollution and injection of contaminated products, to be active in getting things cleaned up.


“Keep your own house and its surroundings pure and clean. This hygiene will keep you healthy and benefit your worldly life.”
— Sathya Sai Baba (Indian Spiritual Teacher, c.1926-)


“Length of life does not depend so much on a good physical constitution as it does on the best use of the six non natural things, which if we rule aright, we shall live long and healthy lives: to divide the day properly between sleep and waking; to adjust our air to the needs of the body; to take more or less food and drink according to our age, our temperament and whether we live an active or inactive life; to take exercise or rest according to the quantity of food and whether we are lean or fat; to know ourselves and be able to rule our emotions and subject them to our reason.  Whoever handles these wisely will live long and seldom need a doctor.”

–Giorgio Baglivi (Italian Physician, Pathologist, Researcher and Author of De Fibra Motrice, 1669-1707)

“The best doctor prevents illness, a mediocre one treats illnesses that are about to occur, and an unskilled one treats current illnesses.”
–Chinese Proverb

Parkinson's Disease, the Intestine and Infections

Early in my career, one of my mentors was the eminent scientist and clinician Robert Mahler. He recently passed away at the age of 81, but in the last two years of his life he was an author on two papers (1, 2) about an ailment with which he struggled for many years: Parkinson’s disease.

Despite the best treatment, he was severely incapacitated by the illness, at one stage needing a wheelchair to get from his car to his office. But his fine mind remained undimmed by the illness, and he was intrigued by reports of an association between stomach ulcers and Parkinson’s disease, and of dramatic improvements in the symptoms of some people with Parkinson’s disease who were being treated with antibiotics for gastric ulcers. (Last year Barry Marshall and Robin Warren were awarded the Nobel Prize in Physiology or Medicine for their pioneering work on Helicobacter – a bacterium associated with peptic ulcers. I mentioned in an earlier post that I have a strong sense that there are more prizes to come on the interaction between infectious agents, inflammation, genes, the psyche and the environment.)

Robert was one of the test subjects in a research study, and his Parkinsonian symptoms got much better when he was treated with antibiotics. There are now several important pieces of research on this fascinating topic. In some people, eradicating Helicobacter may convert rapidly progressive Parkinsonism to a quieter disease, although only a minority of sufferers have evidence of current infection.

There seems to be an interaction between aging, genes and this infectious agent. Clearly not everyone is helped by antibiotic treatment, but this is a whole new line of very promising research.

Stress and the Skin

You have probably noticed how stress can have an impact on some people’s skin. Increasing stress can initiate or worsen skin disorders such as psoriasis and atopic dermatitis. There has also been a lot of discussion about whether stress can also exacerbate acne and cause cold sores to erupt.

A new study published in the December issue of the American Journal of Physiology-Regulatory, Integrative and Comparative Physiology sheds important light on this association.

It is well known that one of the physical effects of stress is to increase levels of a range of steroid hormones called glucocorticoids. The best known glucocorticoid is cortisol or hydrocortisone. So the question was whether the missing link between stress and skin problems might be one or other of the glucocorticoids.

Researchers from the Veterans Affairs Medical Center, San Francisco and the University of California at San Francisco and Yonsei University Wonju College of Medicine, Wonju, Korea decided to study this possible connection.

You may have heard that the skin is the largest organ in the body and provides the critical barrier between the environment and the internal organs. Its most important function is providing a permeability barrier that prevents us from drying out. When we are healthy we are approximately 65-70 percent water. We are able to survive and function in dry environments because the skin forms a permeability barrier that prevents the loss of water.

The physical location of the permeability barrier is in the outermost layer of the epidermis that is known as the stratum corneum. The stratum corneum is composed of dead cells surrounded by lipid membranes. The stratum corneum layer continuously sloughs off, and therefore has to be constantly regenerated. The epidermal cells in the lower epidermis are continuously proliferating to provide new cells, which then differentiate, move toward the surface and ultimately die, to form a new stratum corneum. This process is going on in your skin right now, though it can be disrupted by damage such as sunburn. If the process becomes overactive, it can lead to the development of thick, hardened skin.

It was already known that psychological stress disturbs this elegantly balanced system by decreasing the proliferation of epidermal cells and inhibiting their differentiation. As a result the function of the permeability barrier is impaired.

To test the hypothesis that glucocorticoids would have adverse effects on skin function, they stressed some hairless mice by putting them in small cages in constant light and forcing them to listen to the radio for 48 hours.

Before being stressed one group of mice was treated with mifepristone, which you may know by its two other names, RU-486, or the “morning after” pill, which blocks the action of glucocorticoids. A second group was given a drug called antalarmin, which blocks glucocorticoid production. A third group was stressed but received neither drug and a fourth group remained unstressed in ordinary cages and without the continuous light and sound to which the other groups were exposed.

The mice that received mifepristone or antalarmin showed significantly better skin function compared to the stressed mice that did not receive either treatment.

The experiment demonstrated the important role that glucocorticoids play in inducing the skin abnormalities brought on by psychological stress. Although we hope that the study will lead to a way to treat people who suffer from these skin conditions, there is still a long way to go. It’s always difficult to extrapolate from mice to people. Second, there may be serious side effects of modulating glucocorticoid activity. Glucocorticoids are essential hormones that play many important roles. Blocking their action could have negative outcomes. This is one of the reasons why we are skeptical about advertisements that claim that some herbal concoction can “cure” cortisol-related obesity. If something could really modify the activity of cortisol or other glucocorticoids in the body, it would likely have many undesirable effects.

The research team is now looking at the effect of psychological stress on the skin’s production of antimicrobial peptides, which play a role in defense against infection. It has long been thought that psychological stress might also reduce the ability of the skin to protect from infections.

I never like to leave a report involving animal experiments without also saying a heartfelt thank you to the animals that participated in the experiments.

This research is interesting and may have a number of spin offs. But I have another rather obvious question: since we already know that there is a link between stress and some skin problems, why not focus on stress management techniques, rather than trying to find new medicines to help counteract the biochemical effects of stress?

Hidden Harbingers of Weight: Salt Intake and Obesity

In Healing, Meaning and Purpose, I discuss some of the evidence for four previously little recognized causes of obesity:

  1. Stress
  2. Salt intake
  3. Pesticides
  4. Viruses

Each of these has been widely discussed in the professional literature, but little has percolated out into the general population except in advertisements for agents like Cortislim. I remain skeptical about these products. Tinkering with just one of the 260 hormones and neurotransmitters implicated in the control of weight is unlikely to be crowned with success. And their ingredients may also have the potential for causing problems. Recent advertisements have also mentioned that one of these products may elevate mood. Sad to say, in the last year we have seen two people who developed manic symptoms after taking one of the supplements. We are urging colleagues to see if there are any other cases, or whether these two were just coincidental.

I recently mentioned some of the evidence for viruses as a cause of weight gain.

Now a new publication from the Universities of Helsinki and Kuopio, out in this month’s issue of the journal Progress in Cardiovascular Diseases, provides powerful support for the salt hypothesis.

The researchers report that an average 30-35% reduction in salt intake over 30 years in Finland was associated with an extraordinary 75% to 80% decrease in both stroke and coronary heart disease mortality in the population under 65 years. During the same period the life expectancy of both male and female Finns increased by 6 to 7 years.

As expected, reducing salt intake has a beneficial effect on blood pressure.

But in my view the most interesting finding of the study is the close link between salt intake and obesity.

As bartenders, pub landlords and tavern owners have known since the beginning of time, increasing a person’s intake of sodium produces a progressive increase in thirst. (You didn’t think that those peanuts on the bar were put there out of the goodness of the establishment’s heart did you??!)

The progressive increase in the average intake of salt explains the observed increase in the intake of sugar-containing beverages, which, in turn, has caused a marked net increase in the intake of calories during the same period in the United States.

Here is an extraordinary statistic: Between 1977 and 2001, energy intake from sweetened beverages increased on average by 135% in the United States. During the same period, the energy intake from milk fell by 38%. The net effect was an increase in energy intake of 278 kcal per person a day. The American Heart Association has estimated that, to burn the average increase of 278 kcal a day and avoid the development or worsening of obesity, each American should now walk or vacuum 1 hour 10 minutes more every day than in 1977. As we all know, that has not happened.
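The walking figure follows from simple energy arithmetic; here is a minimal sketch, assuming a burn rate of roughly 4 kcal per minute for brisk walking (my assumption, not a figure from the article):

```python
# Back-of-the-envelope check: how long must you walk each day to burn
# the extra 278 kcal of daily energy intake cited above?
extra_kcal_per_day = 278          # net increase in daily energy intake
kcal_per_minute = 4.0             # assumed burn rate for brisk walking

minutes = round(extra_kcal_per_day / kcal_per_minute)
hours, mins = divmod(minutes, 60)
print(f"About {hours} h {mins} min of extra walking per day")
```

With that assumed burn rate, 278 kcal works out to about 70 minutes, which matches the American Heart Association estimate of 1 hour 10 minutes.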

In the decade from 1976-1980 to 1988-1994 the overall prevalence of obesity increased 61% among men and 52% among women. During 1999 to 2002, the prevalence of obesity was 120% higher among men and 99% higher among women as compared with the 1976 to 1980 figures. The increased intake of salt, through induction of thirst and increased consumption of high-energy beverages, has clearly made a significant contribution to the rise of obesity in the United States.

It is also of note that until 1983 the use of salt in the United States did not increase, and may even have shown a continuous decreasing trend. The prevalence of obesity was relatively low and remained essentially unchanged from the early 1960s to the early 1980s.

This new study suggests that a comprehensive reduction in salt intake, which would also reduce the intake of high-energy beverages, could be a powerful tool in the so far unsuccessful attempts to combat obesity in industrialized societies.

There is now conclusive population-wide evidence that indicates that we could achieve powerful beneficial health effects simply by reducing our overall salt intake. These benefits include a decrease in obesity.

As an aside, the population-wide long-term experience from Finland indicates that a remarkable decrease in salt intake has not caused any adverse effects.

A number of years ago we were engaged in some experiments in which we replaced regular table salt – sodium chloride – with potassium chloride. For the first three weeks food seemed rather tasteless. But then we all suddenly discovered a new universe of flavors that had previously been hidden under a thick coating of salt. So the dulling of flavor after such a dietary change is only temporary.

Although the paper doesn’t say so, there is also some data that salt may itself increase cortisol release.

The bottom line?

We now have clear, empirical data to support three out of the four points that I made in Healing, Meaning and Purpose, and there is some less robust data for the fourth.

I urge you to try gradually to reduce your personal intake of salt, and to encourage your family and friends to do the same. I mentioned that food may initially seem a little less flavorful, but then things change rapidly and for the better.

And your body will love you for the change!

Medicine and the Transformation of Illness

Something important has been happening in the medical field over the last century. And like most important concepts, once I mention it, everyone says, “Oh, that’s obvious.” Yet I have seen little discussion of it except in an occasional book or speculative paper.

The concept is this: modern medicine has been transforming the nature of illness in far-reaching ways. There are many illnesses that once were fatal, and which have now been transformed into chronic problems. Yet most conventional health care providers are still wedded to the short-term resolution of symptoms.

Let me give you three examples:

  1. The first is diabetes mellitus. There are two main types, and at least ten subtypes. Type 1 diabetes is what used to be known as juvenile onset diabetes or insulin-dependent diabetes. It usually comes on in childhood or adolescence and is associated with severe damage to the beta cells in the pancreas that produce insulin. People with this problem usually become very sick very quickly and need insulin to keep them alive. Until 1922, when the first patient was treated with insulin derived from cows, the illness was usually fatal. Insulin transformed it into a chronic illness. People were kept alive, but now we saw the emergence of diabetic eye disease (cataracts and retinopathy), disease of the blood vessels supplying the limbs, heart and kidneys, kidney failure, infections and many other chronic problems. In 1935 Sir Harold Himsworth, the father of a friend of mine, identified a second type of diabetes. He published a classic paper on his discovery of insulin resistance in 1936. This is what is now known as Type 2 diabetes, and used to be called maturity onset diabetes. This is a more chronic illness, but carries many of the same complications. The point about these two types of diabetes is not just that they have disturbances of glucose and lipid metabolism. That on its own matters little. It is the long-term consequences of the elevated glucose and lipids that cause all the problems.
  2. The second is hypertension. Again, this often used to be a fatal illness. Until the invention of the sphygmomanometer most people did not know that they had high blood pressure, and most often would die of strokes. Hypertension is now also a chronic illness. The problem is not the blood pressure itself, but the long-term consequences of an elevated blood pressure. That is why most physicians are now trying to prevent the damage to the heart, eyes and kidneys, instead of just focusing on the blood pressure numbers themselves.
  3. The third is Lyme disease. This is a bacterial illness that is acquired by being bitten by a tick. It is said to be the fastest growing infectious disease in the United States, primarily because we are spending more time venturing into the wilderness, and the deer population – a major carrier of the tick – is increasing in most Eastern states. Lyme disease can make people very ill. We identify acute and chronic types. The acute can usually be treated if identified quickly and if the correct treatment is given. But sometimes identification can be very difficult, and inadequate or even inappropriate treatment may lead to the chronic form. We have even seen people who have been treated exactly as the experts say, but have still developed the chronic form of Lyme disease. The biggest problem with Lyme disease is that it is a great masquerader: it can look like so many other illnesses, from multiple sclerosis and rheumatoid arthritis to chronic fatigue syndrome and syphilis.

We could pick out other examples. I have mentioned some of the problems of thinking that attention deficit disorder is just a problem with getting good grades in school, when in reality inadequately treated ADD is associated with a range of long-term problems that occur outside of school hours.

For many years now some practitioners have been warning about the long-term consequences of symptomatic treatment alone. One of the most eloquent critics of this way of treating people is the Greek homeopath and teacher George Vithoulkas. I like and respect George, but he takes a militant view, saying that conventional treatment simply suppresses illnesses, rather than treating them. His solution is to use homeopathy for everything. He is a genius and also a natural healer, so he can probably get away with that. Most of us cannot.

So the fundamental tenets of Integrated Medicine include medical treatment to deal with the acute problem, but a combination of approaches to prevent the problem from becoming chronic. Or if it has become chronic, then how to change its course over time.

As I’ve said before: Combinations are Key. Not randomly giving an antibiotic as well as a homeopathic remedy, but precisely tailoring the combination to the individual.

Climate Change and Civilization

I have several times now sounded warnings about the impact of climate change on health and the re-emergence of some infectious diseases.

Today I’d like to tell you about a remarkable theory that could have profound implications for how we see ourselves and our place in the world.

Dr. Nick Brooks from the Tyndall Centre for Climate Change Research at the University of East Anglia has recently proposed a remarkable hypothesis. It has long been assumed that civilization started where it did because conditions had become very hospitable, and that people banded together for mutual protection and for hunting and agriculture.

By contrast, Nick Brooks proposes that severe climate change was the primary driver in the development of civilization. He proposes that without climate changes thousands of years ago, we might have remained farmers, herders and hunter-gatherers.

The early civilizations of Egypt, Mesopotamia, South Asia, China and northern South America were founded between 6000 and 4000 years ago when global climate changes, driven by natural fluctuations in the Earth’s orbit, caused a weakening of monsoon systems resulting in increasingly arid conditions. These first large urban, state-level societies and monumental architecture such as the pyramids, emerged because diminishing resources forced previously transient people into close proximity, in areas where water, pasture and productive land were still available.

It is certainly remarkable that all of the places where the first urban civilizations developed were once humid and productive environments that are now largely covered by desert. One theory has been that many of these deserts, including the Sahara, were the result of over-grazing by goats.

In a presentation to the British Association Festival of Science earlier this month, Dr. Brooks said, “Civilization did not arise as the result of a benign environment which allowed humanity to indulge a preference for living in complex, urban, ‘civilized’ societies. On the contrary, what we tend to think of today as ‘civilization’ was in large part an accidental by-product of unplanned adaptation to catastrophic climate change. Civilization was a last resort – a means of organizing society and food production and distribution, in the face of deteriorating environmental conditions.”

He added something that we now see all over the world: for many, if not most people, the development of civilization meant a harder life, less freedom, and more inequality. The transition to urban living meant that most people had to work harder in order to survive, and suffered increased exposure to communicable diseases. Health and nutrition are likely to have deteriorated rather than improved for many.

The new research challenges the widely held belief that the development of civilization was simply the result of a transition from harsh, unpredictable climatic conditions during the last ice age, to more benign and stable conditions at the beginning of the Holocene epoch some 10,000 years ago.

This work also presents some profound philosophical implications, because it challenges deeply held beliefs about human progress, the nature of civilization and the origins of political and religious systems that have persisted to this day. It suggests that civilization may not be our natural state, but the unintended consequence of adaptation to climatic deterioration – a condition of humanity “in extremis.”

Dr Brooks said: “Having been forced into civilized communities as a last resort, people found themselves faced with increased social inequality, greater violence in the form of organized conflict, and at the mercy of self-appointed elites who used religious authority and political ideology to bolster their position. These models of government are still with us today, and we may understand them better by understanding how civilization arose by accident as a result of the last great global climatic upheaval.”

This is an extraordinary throwback to the ideas of romantic philosophers like Jean-Jacques Rousseau, who contended that Man is good by nature but is corrupted by society and civilization.

Though we need to continue to be very alert to the dangers inherent in climate change, could it be that it is actually an invitation to evolve?

A Valuable History Lesson

“Those who cannot remember the past are condemned to repeat it.”
–George Santayana (Spanish-born American Philosopher, Humanist and Poet, 1863-1952)

I’m a bit of a history buff, and like most armchair historians who know something about medicine, I find it fascinating to try to work out why some civilizations underwent rapid collapse. Was the decline of Rome really due to malaria, lead pipes or societal malaise? Why did the huge Khmer Empire in Southeast Asia vanish in just a few years? Was it ecological failure, disease or pollution? The list goes on.

The map shows some of the major empires in Eurasia around A.D.1200. See how few still exist.

One of these historical puzzles seems to be close to solution, and provides important lessons for us today. In the second year of the Peloponnesian war, the city state of Athens was devastated by an epidemic known as the Plague of Athens. Historians and scientists have been debating the cause of the Plague for years. When I was a young schoolboy, the debate was already a century old.

According to historical records, the plague began in Ethiopia and passed through Egypt and Libya to Greece in 430-426 B.C.E. It forever changed the balance of power between Athens and Sparta, effectively ending the Golden Age of Athenian dominance in the ancient world. It is thought that up to one third of the Athenians, including their leader, Pericles, died in the epidemic. Most of our knowledge about the Plague came from the fifth century B.C.E. Greek historian Thucydides, who himself was taken ill with the plague but recovered. Though Thucydides gave a detailed description, researchers have not managed to agree on the identity of the plague. Several diseases have been suggested, including bubonic plague, smallpox, anthrax and measles.

Now a study in the International Journal of Infectious Diseases helps answer this question that has puzzled historians for decades: What destroyed ancient Athens, the cradle of democracy? Analysis carried out by Manolis Papagrigorakis and colleagues from the University of Athens, using DNA collected from teeth obtained from an ancient Greek burial pit points to typhoid fever as the disease responsible for this devastating epidemic. Typhoid fever (or enteric fever) is an illness caused by the bacterium Salmonella Typhi. It is common throughout the world, more so in tropical and semitropical climates. It is transmitted by ingestion of food or water contaminated with feces from an infected person.

There are some classic physical symptoms of typhoid: In the first phase there is coughing, a fever, sweating and a rash of “rose spots,” particularly on the abdomen. Typhoid has a unique feature: normally when you get a fever your pulse rate increases, but in typhoid the pulse slows. In the second phase of the illness people get severe headaches, muscle pain and diarrhea. And it is the diarrhea that usually dehydrates and kills people. You may have heard about typhoid in the last few days, after reports that a well-known terrorist was supposed to have died from it.

It is humbling to realize that entire civilizations have been put to the sword, not by force of arms, but by microbes. Climate change or a breakdown in sanitation or food inspection can all lead to a reappearance of typhoid: within the last hundred years there have been outbreaks all over the Western world, and it is endemic in many less developed countries.

I mention in Healing, Meaning and Purpose that one of the reasons for the persistence of the gene for cystic fibrosis is thought to be that carriers of the gene are resistant to typhoid.

We must never forget the power of micro-organisms and how rapidly they can re-appear if we let down our guard or if we neglect the impact of climate change on their growth and viability.

It is no coincidence that H.G. Wells vanquished the Martians not with guns, but with microbes.

Vaccinating Against the Snivels?

I’m using a deliberately provocative title for a story that is quite serious, both medically and ethically.

We have today heard about two new developments. Researchers from the University of Rochester Medical Center announced that they are starting trials of a new vaccine aimed at eliminating childhood ear and sinus infections as well as many cases of bronchitis in adults. Second, and on the same day, the University of Rochester announced that it had won a $3.5 million grant from the National Institute of Deafness and Communication Disorders, one of the divisions of the National Institutes of Health, to develop the new vaccine. The team at Rochester helped to develop the vaccine marketed by Wyeth as Prevnar. It is used to protect infants and toddlers against some strains of bacteria that can cause pneumonia, meningitis and ear infections.

The vaccine will target nontypeable Haemophilus influenzae, or NTHi. Now that vaccines exist for various forms of streptococcal bacteria and Haemophilus influenzae B, the previous leading causes, NTHi has become the leading cause of ear and sinus infections, and of bronchitis in adults.

The reason this news is so important for all of us is that, unlike virtually all other vaccines on the market, this one will not be aimed at saving lives, but at preventing what are usually nuisance illnesses. But please note my use of the word “usually.”

Dr. Michael Pichichero, a professor of microbiology, immunology, pediatrics, and medicine at the University of Rochester Medical Center who is leading the trial, was quoted as saying, “While ear infections are never fatal, they can cause serious damage in some children.” He went on to say that “83 percent of U.S. children experience one or more ear infections by age 3 and in some cases hearing loss becomes permanent.”

This is the point, and also why the National Institute of Deafness and Communication Disorders has chipped in. Most of these infections get better on their own, so an initial reaction might be to say, “why bother with this at all?” The trouble is that not only can they lead to long-term problems with hearing, but I have also seen more than one person develop a cerebral abscess as a result of a severe infection: the illness is not always innocuous.

Another problem is that infections of the sinuses and ears bring children to clinics and emergency rooms, and are the leading reason for antibiotic prescriptions, even though many of the infections are viral, and viruses do not, of course, respond to antibiotics. Every expert agrees that antibiotics are overused in the United States, which wastes money and also helps the evolution of bacteria that ultimately resist all antibiotics.

What’s the downside of these announcements?

  1. We have already run into a great many problems with previous vaccinations, some long-term and some subtle, so we desperately need long-term safety data. I know many natural healers who blame many of our health woes on vaccinations.
  2. I hope that the researchers have built into the contract a clause to allow them to report efficacy and safety data without having to clear everything with the study’s sponsor.
  3. We are facing major resource problems, not just in the provision of health care, but also in the funding for research. So should we be diverting resources into these kinds of problems while every 30 seconds another child dies of malaria?
  4. There has been a great deal of discussion about what is often called the “hygiene hypothesis”: the idea that the increasing rates of some allergic illnesses and asthma are a result of children having avoided minor infections early in life that would have stimulated their immune systems. The hypothesis is not proven, but there is a great deal of circumstantial evidence.
  5. If successful, would the vaccine create a host of secondary problems ten or twenty years from now?

Peripheral Neuropathy

Treating peripheral neuropathy can be one of the toughest problems facing a clinician. Peripheral neuropathy simply means disease affecting the peripheral nerves.

There are a great many causes of peripheral neuropathy. This is just a partial list to give you an idea of the things that a clinician has to think about before starting treatment:

  1. Metabolic illnesses: Diabetes mellitus; porphyria; chronic renal failure; amyloidosis and disturbances in circulating proteins
  2. Vitamin deficiencies: Vitamins B1, B3, B6 and B12
  3. Drugs and chemicals: Alcohol; Heavy metals like arsenic, lead and mercury; organic pesticides; several drugs used in cancer chemotherapy; isoniazid; nitrofurantoin
  4. Infections: Lyme disease; Herpes zoster (shingles); Diphtheria; Brucellosis; Leprosy; Tetanus; Botulism
  5. Malignant illnesses
  6. Inflammatory and autoimmune illnesses: Rheumatoid arthritis; Systemic lupus erythematosus; Polyarteritis nodosa; Sarcoidosis; Guillain-Barre syndrome; Celiac disease
  7. Physical injury: Trauma, stretching and compression of nerves, which can include things like carpal tunnel syndrome.
  8. Congenital illnesses

Many causes of peripheral neuropathy, particularly diabetes, may also damage the autonomic nervous system that controls the heart, blood pressure, swallowing, intestinal and bladder function.

Neuropathic symptoms typically start in the feet, because the nerves running down there are longer and more vulnerable than the ones going to the hands.
The most common symptoms are:

  1. Numbness
  2. Tingling
  3. Abnormal sensations called dysesthesias
  4. A characteristic form of pain, called neuropathic pain or neuralgia: people usually describe it as “pins and needles,” a steady burning sensation or “electric shocks.” These pains can be difficult to describe: typically pains, like stubbing your toe or stepping on something sharp, are transmitted through pain fibers. Neuropathy also involves other neurological pathways, so that the brain receives impressions that it cannot process.

There has been a revolution in our understanding of neuropathic pain in recent years. It is now considered to be a disease rather than a symptom. Normal pain is designed to protect you: you put your foot on a hot plate and you pull it away immediately. Neuropathic pain is different: it is non-protective and it persists, and therefore behaves like a disease.

Multiple different classes of medications have been shown to be effective in some people with neuropathic pain, though most are not approved for use by the Food and Drug Administration:

  1. Lidocaine patches and creams
  2. Capsaicin creams
  3. Opioid analgesics
  4. Tricyclic antidepressants
  5. Serotonin-norepinephrine reuptake inhibitors (SNRIs)
  6. Anticonvulsants: Carbamazepine; gabapentin; pregabalin

Earlier this week, data presented at the European Federation of IASP (International Association for the Study of Pain) Chapters (EFIC) indicated that an innovative combination of painkillers might hold the key to unlocking the severe and relatively untreatable pain of peripheral neuropathy.

Dr Magdi Hanna, Director of the Pain Clinical Research Hub at King’s College Hospital in London, has been studying the combination of the strong opioid oxycodone (OxyContin) with gabapentin (Neurontin) in over 300 patients with severe diabetic neuropathy. This combination demonstrated a significant 33% improvement on top of the best pain relief achievable using the maximum tolerated dose of gabapentin as monotherapy. The study was partly funded by one of the manufacturers of the medicines.

This study is good news, but even in this study there were a great many people who were not helped. In another blog item, I’m going to talk about some of the unorthodox approaches that have helped some people.
