Richard G. Petty, MD

Fast Foods, Exercise and Your Liver

We have known for many years that insulin resistance can cause non-alcoholic fatty liver disease (NAFLD). Now a new study published in the journal Gut reports that too much fast food and too little exercise can harm the liver in a matter of weeks.

In an experiment looking a lot like Morgan Spurlock’s Super Size Me, Swedish researchers selected 18 thin, healthy volunteers – 12 men and 6 women – to attempt a 5 to 15% body weight increase by eating at least two fast-food-based meals per day for four weeks. The participants in this intervention group also restricted their level of physical activity to no more than 5000 steps a day. A comparison group, matched for age and sex, ate a normal diet and maintained normal exercise levels.

The plan was to see whether doubling calorific intake and increasing total body weight had any impact on the participants’ liver health.

Changes in major liver enzymes, such as alanine aminotransferase (ALT), and in hepatic triglyceride content (HTGC) were used to indicate liver damage. Abnormally high ALT levels are frequently seen in people who consume a lot of alcohol or who have been infected with the hepatitis C virus. HTGC measures fatty acid levels in the liver; too much fat in the liver leads to a condition called fatty liver disease.

At the end of the four weeks, the researchers found that:

  • Fast-food consumers had put on an average of 6.5 kg (14.3 lbs.)
  • Five participants increased their weight by 15%
  • One person gained 12 kg (26.4 lbs.) in two weeks
  • Sharp increases in ALT occurred after just one week on the fast food diet
  • The average ALT level increased more than four-fold, from 22 U/l to 97 U/l, over the 4 weeks
  • ALT rose to liver damage levels in 11 participants
  • No such changes were seen in the comparison group
  • The increases in ALT levels were linked to weight gain and increased sugar and carbohydrate intake. One subject developed fatty liver disease, and there was a large rise in liver cell fat content in the other participants


Although nobody should be surprised that gorging on junk food and becoming a couch potato is bad for the body, the speed and extent of the liver damage is alarming.

“Don’t dig your grave with your own knife and fork.”
–English Proverb

“I saw few die of hunger; of eating, a hundred thousand.”
–Benjamin Franklin (American Author, Inventor and Diplomat, 1706-1790)

Nutrigenetics: A Peek Into the Future

At any given time only about ten percent of your genes are thought to be active. They are switched on and off in response to all kinds of internal and environmental changes. This is particularly true in the metabolic pathways, where gene activation is an essential part of the normal response to dietary changes. We also know that many of us differ in our nutritional requirements for genetic reasons.

Anybody who has looked into diet and nutrition knows that there is no one approach that works for everyone, and the Holy Grail of weight management is to be able to identify which diet will work for whom.

This goal has just come a little closer with a report from researchers in Greece, London and Colorado, published in the Nutrition Journal.

The paper, “Improved weight management using genetic information to personalize a calorie controlled diet,” is available for free download. The study population consisted of 50 patients who had failed to lose weight. They were offered a nutrigenetic test screening 24 variants in 19 genes involved in metabolism. A comparison group of 43 patients attending the same clinic was selected using algorithms to match age, sex, frequency of clinical visits and BMI at the initial clinic visit; this second group did not receive a nutrigenetic test. BMI reduction at 100 and over 300 days and fasting blood glucose were measured.

The results are very promising. After 300 days of follow-up, individuals in the nutrigenetic group were more likely to have maintained some weight loss (73%) than those in the comparison group (32%). The average BMI reduction in the nutrigenetic group was 1.93 kg/m2 (a 5.6% loss), compared with an average BMI gain of 0.51 kg/m2 (a 2.2% gain) in the comparison group. Among patients with a starting fasting blood glucose of >100 mg/dL, 57% (17/30) of the nutrigenetic group but only 25% (4/16) of the non-tested group had levels reduced to <100 mg/dL after more than 90 days of weight management therapy.
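As a quick check on how these figures hang together, here is a minimal arithmetic sketch in Python; the baseline BMI it derives is only an implication of the two reported numbers, not a figure quoted in the paper.

```python
# Rough arithmetic check of the figures quoted above (not code from the study).

# Glucose normalization: proportions reconstructed from the reported counts.
nutrigenetic_group = 17 / 30   # 57% of tested patients fell below 100 mg/dL
comparison_group = 4 / 16      # 25% of untested patients did
print(f"Nutrigenetic group: {nutrigenetic_group:.0%}, comparison group: {comparison_group:.0%}")

# BMI change: a 1.93 kg/m2 reduction reported as a 5.6% loss implies a starting
# BMI of roughly 1.93 / 0.056 ≈ 34.5 kg/m2 (an inference, not a study figure).
implied_baseline_bmi = 1.93 / 0.056
print(f"Implied baseline BMI: {implied_baseline_bmi:.1f} kg/m2")
```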

The paper concludes by saying that the addition of nutrigenetically tailored diets resulted in better compliance, longer-term BMI reduction and improvements in blood glucose levels.

This is a small “proof of concept” study, and the effects are not enormous, but there is already more than enough here to justify pursuing this genetic approach vigorously.

Folic Acid: Too Much of a Good Thing?

If you believe everything that you read in the media or hear on those infomercials, you would think that you should spend all day munching pounds of fruits and vegetables while chasing them with megadose supplements.

While that may sound good in theory, in practice things are not so simple and this approach may actually do you harm.

A good example has recently come to light in a report from the Institute of Food Research in the United Kingdom, just published in the British Journal of Nutrition.

We have talked before about the potential value of fortifying food with folic acid. Apart from reducing the risk of neural tube defects in babies, it may also reduce the risk of depression.

The new report suggests that fortifying flour with folic acid may lead to a range of health problems.

Folic acid is a synthetic form of folate, a B vitamin found in a wide variety of foods including liver and leafy green vegetables. Folates are metabolized in the intestine, whereas folic acid is metabolized in the liver. The liver’s capacity to do this is easily saturated, and at doses of half the amount being proposed for fortification, significant quantities of unmetabolized folic acid could enter the bloodstream.

This excess folic acid could cause a number of problems:
  • It may interfere with some treatments for leukemia and arthritis
  • It may cause problems for women being treated for ectopic pregnancies
  • It may be a concern for men with a family history of colon cancer
  • It may be a concern for people with blocked arteries being treated with a stent
  • In women undergoing in-vitro fertilization, it may increase the likelihood of conceiving multiple embryos
  • Unmetabolized folic acid accelerates cognitive decline in the elderly with low levels of vitamin B12 (if they have normal levels of B12, folic acid may slow brain aging)
  • While dietary folates have a protective effect against some cancers, folic acid supplementation may increase the incidence of colon cancer
  • Folic acid may increase the incidence of breast cancer in postmenopausal women, though other studies have shown the opposite

The trouble is that it could take 10-20 years for any potential harmful effects of unmetabolized folic acid to become apparent.

The latest study follows a letter to the Food Standards Agency from Sir Liam Donaldson, the Chief Medical Officer for England, requesting further expert consideration of two recent studies linking folic acid to bowel cancer before the government gives the final go-ahead for mandatory fortification of food with folic acid. However, the Food Standards Agency has stuck to its position that fortification is safe. Mandatory fortification has already been introduced in the United States (where it has been required since 1998), Canada and Chile, and it has cut neural tube defect rates by up to half.

Professor Nicholas Wald, director of the Wolfson Institute of Preventive Medicine, said:

“Fortification would prevent many cases of spina bifida and would also benefit the health of the country as a whole. Further delay in this public health measure will result in hundreds more babies being disabled by this serious disorder, or pregnancies being needlessly terminated due to a neural tube defect.”


When it comes to analyzing risks and benefits, one of the most important things is to realize that more is not necessarily better, and that folate and folic acid are not the same thing at all.

Second is the point that we discussed before: some people have the right genes to be able to metabolize folic acid with impunity, while others may get a range of problems from taking it.

Red Meat and Breast Cancer

There is an important study that was published in the April issue of the British Journal of Cancer, but which I haven’t seen reported in the United States.

The researchers did a survival analysis to assess the effect of meat consumption and meat type on the risk of breast cancer in the UK Women’s Cohort Study. Between 1995 and 1998 a cohort of 35,372 women was recruited, aged between 35 and 69 years with a wide range of dietary intakes, assessed by a 217-item food frequency questionnaire. The researchers also took into account smoking, weight, fruit and vegetable intake, class, education and use of hormone replacement therapy.

The results showed that eating even small amounts of red meat daily can increase the risk of breast cancer by 56 per cent in older women.

As little as 2oz (57g) of beef, lamb or pork a day showed an effect. Post-menopausal women who ate larger amounts, 3.6oz (103g), of processed meats such as sausage, bacon or ham had an increased risk of 64 per cent.

Even younger, pre-menopausal women had a slightly raised risk if they ate red meat daily.

Earlier analysis from the study found that pre-menopausal women who had the greatest intake of fibre cut their breast cancer risk by half.

The Leeds work supports other studies. In November, a study from the United States found that women who ate the largest amounts of red meat had a rising risk of breast cancer. But different studies have presented conflicting views.

One reason why red meat may contribute to a raised risk of breast cancer is that it is a rich source of saturated fat. The women who ate the most meat were also more likely to be fatter.

The publication of the study in the United Kingdom generated a great many negative comments from the meat industry.

One spokesperson, on hearing that as little as two ounces may be enough to increase the cancer risk, said:

"Two ounces is absolutely tiny. I have never heard such rubbish. It’s a tiny amount. This is ridiculous, it’s silly."


I doubt very much that it is "silly," though it is a surprise that the study has not attracted more attention.

The best way to look at this data is not to say that everyone should become vegetarian, but to recommend that women should:

  • Eat a balanced diet
  • Limit alcohol consumption
  • Exercise regularly
  • Achieve and maintain a healthy weight
  • Try to maintain a regular sleep schedule

Non-pharmacological and Lifestyle Approaches to Attention-Deficit/Hyperactivity Disorder: 1. Diet


You can find some articles on Attention-Deficit/Hyperactivity Disorder (ADHD) here, and also some of the evidence that ADHD is a “real” illness and not just a label for socially unacceptable behavior. That being said, it is essential to take extra care when making the diagnosis. Mud sticks, and diagnostic mud sticks like glue. It can be hard to “unmake” a diagnosis.

As with any problem, the most effective way of helping it is to address the physical, psychological, social, subtle and spiritual aspects of the situation.

Medicines can definitely have a place in the management of ADHD, and the reason for treating ADHD is not so that people get better grades in school or do better at their jobs. It is to prevent the long-term problems that may follow from inadequately treated ADHD.

There is a large and growing body of research on non-pharmacological approaches to treating ADHD. A literature search has turned up over two hundred papers, over half of which report some empirical research. Some of the research is summarized in a short paper aimed at health care professionals.

Research has shown that more than 50% of American families who receive care for ADHD in specialty clinics also use complementary or alternative medical (CAM) therapies, if you include things like modifying their diet or other aspects of their lifestyle. Yet only about 12% of families report their use of CAM to their clinician. Despite that low reporting rate, a national survey of pediatricians showed that 92% of them had been asked by parents about complementary therapies for ADHD. The trouble is that many pediatricians have not been taught very much about the pros and cons of these approaches.

The most commonly used CAM therapies for ADHD are dietary changes (76%) and dietary supplements (> 59%). I have talked about food additives and one type of diet in the past. Now let’s look in a little more detail.

The 3 main dietary therapies for ADHD are:

  • The Feingold diet,
  • Sugar restriction, and
  • Avoiding suspected allergens.

Sometimes these diets are used in combination.

The Feingold Diet
The Feingold diet is the most well known dietary intervention for ADHD. It aims to eliminate 3 groups of synthetic food additives and 1 class of synthetic sweeteners:
  • Synthetic colors (petroleum-based certified FD&C and D&C colors);
  • Synthetic flavors;
  • BHA, BHT and TBHQ; and
  • The artificial sweeteners aspartame, neotame, and alitame.

Some artificial colorings such as titanium dioxide are allowed.

During the initial weeks of the Feingold program, foods containing salicylates (such as apples, almonds, and grapes) are removed and are later reintroduced one at a time so that the child can be tested for tolerance. Most of the problematic salicylate-rich foods are common temperate-zone fruits, as well as a few vegetables, spices, and one tree nut.

During phase 1 of the Feingold diet, foods like pears, cashews, and bananas are used instead of salicylate-containing fruits. These foods are slowly reintroduced into the diet as tolerated by the child.

The effectiveness of this diet is controversial. In an open trial from Australia, 40 out of 55 children with ADHD had significant improvements in behavior after a 6-week trial of the Feingold diet. Twenty-six of the children (47.3%) remained improved following liberalization of the diet over a period of 3-6 months.

In another study, 19 out of 26 children responded favorably to an elimination diet. What is particularly interesting is that when the children were gradually put back on to a regular diet, all 19 of them reacted to many foods, dyes, and/or preservatives.

In yet another study, this one a double-blind, placebo-controlled food challenge in 16 children, there was a significant improvement on placebo days compared with days on which children were given possible problem foods. Children with allergies had better responses than children who had no allergies.

Despite this research many pediatricians, particularly in the United States, do not believe the evidence regarding the effectiveness of elimination diets or additive-free diets warrants this challenging therapy for most children.

There is an interesting difference in Europe. In 2004 a large randomized, blinded, cross-over trial of over 1800 three-year-old children was published. The results showed consistent, significant improvements in the children’s hyperactive behavior when they were on a diet free of benzoate-preservatives and artificial flavors. They had worsening behavior during the weeks when these items were reintroduced. On the basis of this and other studies, in 2004 schools in Wales banned foods containing additives from school lunches. It has been claimed that since the ban, there has been an improvement in the afternoon behavior of students.

The biggest problem with the Feingold and other elimination diets is that they are hard to follow and to maintain. But for some children and families, the inconvenience and stricter attention to food have worthwhile results.

It is also essential to ensure that children on any kind of diet maintain adequate nutrition: there have been many examples of that simple rule not being followed.

Sugar Restriction
The notion that sugar can make children “hyper” entered the mainstream over twenty years ago, and is now on the list of things that “everyone knows.” But happily it is not true. At least 12 double-blind studies have failed to show that sugar causes hyperactive behavior. Some researchers suggest that sugar or ingestion of high-carbohydrate “comfort foods” is actually calming, and that children who seek these foods may be attempting to “self-medicate.”

There are plenty of very good reasons for children to avoid candy, but hyperactivity is not one of them.

Food Allergies
There is clear evidence that children – and perhaps adults – with ADHD are more likely to have allergies. That leads to the obvious question of whether children with ADHD are allergic or sensitive to certain foods. (It is useful to differentiate “allergies,” which are the result of abnormal reactivity of the immune system to proteins in food, from “sensitivities,” which are the direct result of substances in food: the two have different treatments.)

It is certainly true that food allergies and food sensitivities can generate a wide range of biological and behavioral effects. Gluten sensitivity (celiac disease) is known to be linked to an increased risk of ADHD and other symptoms.

In an open study of 78 children with ADHD referred to a nutrition clinic, 59 improved on a “few foods” trial that eliminated foods to which children are commonly sensitive. For the 19 children in this study who were able to participate in a double-blind cross-over trial of the suspected foods, there was a significant effect for the provoking foods to worsen ratings of behavior and to impair psychological test performance.

For more than 30 years one of the tests used to track allergies has been the radioallergosorbent test (RAST), though it is not much used these days since technology has moved on. In one allergy-testing study using 43 food extracts, 52% of 90 children with ADHD had an allergy to one or more of the foods tested. Over the next few years several researchers carried out open-label studies in which children with ADHD and food allergies were treated with sodium cromoglycate, a medicine that prevents the release of inflammatory chemicals such as histamine from mast cells. Some of the reports suggested that it could help some children.

Other popular dietary interventions include eating a low glycemic index diet to avoid large swings in blood sugar. Another strategy has been to “go organic” to reduce the burden of pesticides, hormones, antibiotics, and synthetic chemicals in the child’s system. These diets need more scientific study, but they are probably safe, if expensive.

There are plenty of practitioners and commercial entities who claim to be able to identify food sensitivities with all kinds of methods from blood and muscle testing to electrical and energetic techniques. Some may be helpful, but few have been proven to be effective.

What Should Parents do About Diet, Nutrition, Allergies and Sensitivities?
It is very difficult to predict whether an individual child will be helped by changes in diet. However, as long as the child’s needs for essential nutrients are met these diets should be safe.

It is an extremely good idea for parents to keep a diet diary for one to two weeks to see if anything obvious jumps out. Then try an additive-free diet that is low in sugar and avoids foods suspected of exacerbating symptoms. You will normally find the answer – yes or no – within a few weeks.

What is the Evidence for Food Sensitivities and ADHD in Adults?
Not a lot!

There are plenty of people who have reported that dietary restrictions have helped them, but there is very little evidence. One of the problems with looking for food sensitivities is that there is a high placebo response rate. But if you have adult ADHD, it may be worth investigating. Just make sure that any diet that you use is nutritionally sound. And if you don’t find anything, consider another approach.

Half the Population Has Genes to Make Them Fat

I have talked a bit about my skepticism concerning the genetic contribution to obesity, insulin resistance and diabetes.

I was fascinated to see a huge fifteen-year study that has just been published in the journal Science. I felt a touch of pride: back when the earth was new, I helped train more than one of the authors.

The study involved over 42,000 people and found an association with body mass index at every age from seven to 70 in populations throughout the UK and Europe.

Unlike previous work, it shows a very common genetic link with mild obesity rather than a rare genetic link with extreme obesity.

There were 42 scientists in the group, and they found that people who carry one copy of a variant in a gene called FTO – as half of the general population does – gain an average of 2.6lb or put just over half an inch on their waists, and have a one-third higher risk of being obese. People who carry two copies of the variant – the case for one in six of the population – gain almost 7lb more than those who lack the variation and are at a 70 per cent higher risk of obesity.
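To see what these relative risks mean at the population level, here is a minimal, hypothetical sketch; the 20% baseline obesity prevalence among non-carriers is an assumption chosen purely for illustration, not a number from the study.

```python
# Illustrative only: scaling a hypothetical baseline obesity prevalence by the
# per-genotype relative risks reported for the FTO variant.

baseline_prevalence = 0.20  # assumption: 20% of non-carriers are obese

genotypes = {
    # genotype: (approximate population fraction, relative risk vs non-carriers)
    "no copies": (1 / 3, 1.00),
    "one copy": (1 / 2, 1.33),    # "one-third higher risk of being obese"
    "two copies": (1 / 6, 1.70),  # "70 per cent higher risk of obesity"
}

overall = 0.0
for name, (fraction, relative_risk) in genotypes.items():
    prevalence = baseline_prevalence * relative_risk
    overall += fraction * prevalence
    print(f"{name:10s}: obesity prevalence ≈ {prevalence:.1%}")

print(f"Population-wide prevalence under these assumptions ≈ {overall:.1%}")
```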

The researchers then tested a further 37,000 people from Bristol, Dundee and Exeter as well as a number of other regions in Britain, Italy and Finland. In every case the same variant in the FTO gene – which is expressed in the brain and pancreas, among other key tissues – was associated with type 2 diabetes and obesity.

They also showed that in children, this particular FTO variant was associated with increased body weight.

We hope that in the future, once we have found additional obesity genes, it may be possible to offer advice based on a person’s genetic make-up. We all know that folk are eating more and doing less exercise, but some people gain more weight than others. Similarly, two people on the same diet and exercise plan can lose different amounts of weight. There are undoubtedly some unrecognized factors in weight gain, but genes remain in the mix.

Never be disheartened if your first attempt at diet and exercise is not crowned with success: they are only two of a dozen factors that play into weight control.

Do not fall into the fatalistic trap of thinking that biology is destiny.

We are talking about a factor that may modulate the way in which we control our weight.

As promised, I shall soon be publishing a book detailing specific methods for dealing with whichever ones are important in your life.

Your future lies in your hands: not in a string of chemicals.

Race and Diabetes

It’s another one of those, “Everyone knows that…” facts. For forty years we have all been taught that some ethnic groups are at higher risk of developing insulin resistance and type 2 diabetes mellitus. So now “everyone knows that” African Americans, Native Americans and people from the Indian sub-continent are all genetically predisposed to these medical maladies.

Now it looks as if “everyone” might have been wrong.

James Neel first proposed the theory of the “thrifty genotype” in 1962. He suggested that cycles of feast and famine early in human history created a gene that helps the body use scarce nutrients – a gene that leads to obesity and diabetes in sedentary modern populations with ready and continuous access to food.

Several months ago I pointed out some of the problems with the thrifty genotype theory, and why many of us have become more convinced about the concept of the “thrifty phenotype.” I have many friends, colleagues and former trainees who have dedicated themselves to hunting for diabetes genes. As early as the mid-1980s I was worried that they were going to vanish down a rabbit hole.

It seemed illogical that a gene or genes could “explain” an illness that was, until recently, very rare. It would have to be a gene that was somehow switched on and off by diet or some other environmental factor. That is certainly possible, but it seemed implausible, given that there are dozens of genes designed to control food intake and metabolism. But my friends the gene jockeys had the louder voice, and it was good for them to see what they could find. Now, twenty years later, more than 250 genes have been studied as possible causes of type 2 diabetes, but together these genes explain less than 1 percent of diabetes prevalence worldwide.

There is an interesting piece of research published in the journal Perspectives in Biology and Medicine by a team of researchers from the United States and Australia that supports what I was saying. The study was co-authored by Michael Montoya, an anthropologist at the University of California, Irvine, together with an epidemiologist and a population geneticist. Together they analyzed existing genetic studies published across a variety of disciplines. The team found no evidence to support the thrifty genotype theory.

They also found that in most existing studies of the genes suspected of contributing to diabetes in ethnic minorities, researchers had failed to control for the potential impact of social and environmental factors. When those factors were taken into account, factors such as poverty, housing segregation or poor diet were stronger predictors of diabetes than genes.

As Montoya said,

“Our study challenges the presumption that Native American, Mexican American, African American, Australian Aborigine, or other indigenous groups are genetically prone to diabetes because the evidence demonstrates that higher rates of diabetes across population groups can be explained by non-genetic factors alone. Our study shows that by focusing on genes, researchers miss the more significant and alterable environmental causes of diabetes.”

One of Montoya’s co-authors, Stephanie Malia Fullerton, a population geneticist and bioethicist at the University of Washington added,

“When it comes to diabetes, we’re finding that genes are no more important for ethnic minorities than for anyone else.”

This new critique of genetic and ethnic studies will need to be replicated, and it is a little bit of a surprise that such important work was published in Perspectives rather than one of the journals dedicated to epidemiology.

I have no inside knowledge about why the study was published where it was. But it often happens that it can be very difficult to get new research published if it contradicts the mainstream. There have been examples of experts squashing data that contradicts their own, but it is uncommon. Most of the time the difficulty in getting revolutionary new data published is not because of some conspiracy, but because any kind of evidence, particularly if it is radically different, attracts the most concentrated scrutiny by independent reviewers.

If this new data analysis is confirmed, it is going to mean a radical re-think about the ways in which we screen, manage and advise people from different ethnic groups.

It also confirms something that I’ve said a hundred times: Biology is Not Destiny.

Fats, Inflammation and Depression

We have talked before about the associations between inflammation and psychiatric illnesses.

There is yet more evidence in the shape of a study just published in the journal Psychosomatic Medicine by Janice K. Kiecolt-Glaser and her colleagues from Ohio State University College of Medicine in Columbus.

The study involved 43 older adults with a mean age of 66.67 years, and the results suggest that the imbalance of omega-6 and omega-3 fatty acids in the typical American diet could be associated with the sharp increase in heart disease and depression seen over the past century. The more omega-6 fatty acids people had in their blood compared with omega-3 fatty acid levels, the higher their levels of the inflammatory mediators tumor necrosis factor-alpha and interleukin-6, and the greater the chance that they would suffer from depression. These are the same inflammatory mediators associated with insulin resistance, type 2 diabetes and coronary artery disease, all of which are more common in depression. And depression is more common in diabetes, arthritis and coronary artery disease than expected.

Our hunter-gatherer ancestors consumed two or three times as much omega-6 as omega-3, but today the average Western diet contains 15 to 17 times more omega-6 than omega-3. Six individuals in the study had been diagnosed with major depression, and they had nearly 18 times as much omega-6 as omega-3 in their blood, compared with about 13 times as much for subjects who didn’t meet the criteria for major depression.

Depressed patients also had higher levels of tumor necrosis factor-alpha, interleukin-6, and other inflammatory compounds. And as levels of depressive symptoms rose, so did the omega-6 to omega-3 ratio. So it seems as if the effects of diet and depression reinforce each other. People who had few depressive symptoms and/or were on a well-balanced diet had low levels of inflammation in their blood. But when they became more depressed and their diets became worse – which is very common when people are depressed – the inflammatory mediators in their blood surged.

Omega-3 fatty acids are found in foods such as fish, flax seed oil and walnuts, while omega-6 fatty acids are found in refined vegetable oils used to make everything from margarine to baked goods and snack foods. The amount of omega-6 fatty acids in the Western diet increased sharply once refined vegetable oils became part of the average diet in the early 20th century.
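To make the ratio concrete, here is a minimal sketch that tallies omega-6 and omega-3 for a day’s eating; the gram figures are illustrative round numbers chosen to land near the ratios quoted above, not measured values.

```python
# Illustrative only: how the omega-6 : omega-3 ratio shifts when refined-oil
# foods are partly swapped for oily fish, flax seed oil and walnuts.
# All gram figures are made-up round numbers for the sake of the arithmetic.

def omega_ratio(foods):
    """Return total omega-6 divided by total omega-3 for (omega6_g, omega3_g) pairs."""
    total_n6 = sum(n6 for n6, _ in foods)
    total_n3 = sum(n3 for _, n3 in foods)
    return total_n6 / total_n3

typical_western_day = [
    (8.0, 0.2),  # margarine and baked goods made with refined vegetable oils
    (5.0, 0.3),  # fried snack foods
    (3.0, 0.5),  # grain-fed meat and eggs
]

adjusted_day = [
    (2.0, 0.2),  # smaller serving of refined-oil foods
    (1.0, 1.8),  # oily fish such as salmon or sardines
    (3.0, 1.0),  # walnuts and flax seed oil
]

print(f"Typical day:  omega-6/omega-3 ≈ {omega_ratio(typical_western_day):.0f} : 1")
print(f"Adjusted day: omega-6/omega-3 ≈ {omega_ratio(adjusted_day):.0f} : 1")
```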

Depression alone is known to increase inflammation, the researchers note in their report, while a number of studies have found omega-3 supplements prevent depression.

So this is more evidence for the value of eating fatty fish like salmon, mackerel or sardines two or three times a week, but be sure to avoid fish that may contain a lot of mercury. If you add more fruits and vegetables to your diet, you will also reduce your levels of omega-6 fatty acids.

I have just finished analyzing all the new literature on using fish oils for the prevention and treatment of psychological and psychiatric problems, and I am going to post my findings in the next couple of days.

Anti-inflammatories and Colon Cancer

I just had a very good question after I published my list of Twelve Tips to Reduce Your Risk of Colorectal Cancer.

Dear Dr. Petty,

“That’s a great list, but I am wondering why you haven’t included aspirin or other non-steroidal anti-inflammatory drugs (NSAIDs)? I thought that they had been shown to reduce the risk of colon cancer.”

This is an excellent question, and I deliberately omitted mention of anti-inflammatories because the research suggests that they may cause more harm than good.

There is a report in today’s edition of the Annals of Internal Medicine from the United States Preventive Services Task Force, a highly regarded and independent panel of experts in primary care and prevention, that confirms that screening for colorectal cancer is still important and everyone over 50 should have it. But they urge caution on taking preventive drugs, saying that on balance the health risks of aspirin outweigh the benefits when it comes to preventing colon cancer. This advice holds even for those people with a family history of the disease, as long as they have only an average risk of colon cancer. (20 per cent of people who get colorectal cancer also have a close relative with the disease, with proportionally more cases among African Americans than other races.)

They found good evidence that high doses of aspirin (i.e. 300 mg a day or more) and possibly ibuprofen protect against colorectal cancer, but this comes with an increased risk of intestinal bleeding, stroke and kidney failure.

In low doses – under 100 mg a day – the Task Force says that good evidence supports the notion that aspirin protects against heart disease. However, at this dosage it will have no preventive effect on colorectal cancer.

The US Preventive Services Task Force regularly reviews the available research evidence and issues advice based on what they regard the strength of the evidence to be. They use a grading system to help guide practice. For example, a grade A recommendation is equal to "strongly recommends", while a B is just "recommends", and C is "no recommendation for or against".

In this case the Task Force has issued a grade D "recommends against" to the routine use of aspirin and NSAIDs to prevent colorectal cancer.

So for now I recommend following the Twelve Tips that I published yesterday.

Twelve Tips to Reduce Your Risk of Colorectal Cancer

Colon cancer or, more accurately, colorectal cancer includes cancerous growths in the colon, rectum and appendix. It is the third most common form of cancer and the second leading cause of cancer death in the Western world. Taking men and women together, it causes more deaths than either breast or prostate cancer.

And the key point is that with early screening and a few simple dietary modifications, you can dramatically reduce your risk of getting it.

These are the 12 Tips to Slash Your Risk of Colorectal Cancer

  1. Receive regular colorectal cancer screenings beginning at age 50 if you are at normal risk
  2. If you are at higher risk due to a personal or family history of colorectal cancer, other cancers or inflammatory bowel disease, have a discussion with your health care provider about screenings before age 50
  3. Eat between 25 and 30 grams of fiber each day from fruits, vegetables, whole grain breads and cereals, nuts, and beans
  4. Eat a low-fat diet: colorectal cancer has been associated with diets high in saturated fat, particularly fat from red meat
  5. Eat foods with folate, such as leafy green vegetables
  6. Try to drink at least 80 fluid ounces of pure water a day unless you have a medical reason for not doing so
  7. Drink alcohol in moderation: 2 units of alcohol or less each day
  8. If you smoke, here is another good reason for quitting. Alcohol and tobacco in combination are linked to colorectal cancer and other gastrointestinal cancers
  9. Exercise for at least 20 minutes three to four days a week. Moderate exercise such as walking, gardening or climbing stairs may help reduce your risk
  10. If you get any persistent symptoms such as blood in the stool, a change in bowel habits, weight loss, narrower-than-usual stools, abdominal pains or other gastrointestinal complaints, it is essential to report them to your health care provider
  11. Maintain a healthy weight. Obesity may increase the risk of colorectal cancer
  12. Maintain a good intake of calcium and vitamin D: this combination has been shown to reduce the risk of colorectal cancer

For more information, I recommend visiting the Web site of the American Cancer Society.

I keep their details in the “Resources” section on the left hand side of this blog.
