Is Your Sunscreen Killing You? Probably Not!

I recently wrote an “Ask the Expert” column for a client of mine, a wonderful company dedicated to providing quality, evidence-based health information to the public. Although this topic is not related to nutrition, it is timely, so I wanted to share it with you here.

My passion is to help people make sense of the confusing mix of hype and fear that makes up the majority of mainstream media reporting about a variety of health topics.

What’s Worse for Health: Sun or Sunscreen?
Expert Advice from Suzanne Dixon, MPH, MS, RD, Epidemiologist and Registered Dietitian

Make a habit of using a broad-spectrum sunscreen or physical block

Recent news reports have highlighted a study out of Sweden, which found that women who avoid sun exposure have higher rates of death due to any cause (all-cause mortality) compared with women who regularly get more sun. Somehow these findings have been getting buzz as “proof” that sunblock is deadly, even though the research did not examine sunscreen use at all.

What did the new study actually find?

The Swedish researchers collected information on sun exposure and health habits from approximately 30,000 women who were 25 to 64 years old at the start of the study. The women were not asked about sunscreen use. Over 20 years of follow up, the women who had the least sun exposure were twice as likely to die of any cause compared with women with the most sun exposure.

On the surface, these results seem to suggest that avoiding sun exposure is bad for health, but the study authors failed to take into account important considerations. When compared with the group with “active sun exposure habits,” those women who indicated they were not exposed to sun were significantly more likely to:

  • be overweight or obese,
  • be sedentary, meaning they engaged in very little physical activity compared with the sun-exposed group, and
  • have a hereditary risk of melanoma.

None of these factors—all of which are important predictors of the risk of death due to any cause—were properly corrected or controlled for by the researchers. Further, they did not consider other diseases that may contribute to risk of death. While they assessed the use of medications for diabetes and cardiovascular disease, this approach does not provide information about other chronic diseases, such as autoimmune conditions or mobility issues that may increase risk of death, and which tend to make it less likely a woman will spend time in the sun.

Activities used to determine “active sun exposure”—sunbathing, winter holidays in the mountains, and vacations to warm, sunny locations—are also likely related to overall health; women who are frail or ill are less able to engage in these active, sun-seeking habits.

Separating truth from fiction

Until a better-designed study comes along, people should continue to heed the advice of dermatologists and other experts, as the evidence still points to the importance of avoiding excess sun exposure for good health.

  • Get the D. If you do not sunburn easily, for your health, aim to get 15 minutes of sun exposure three times per week. However, if you do sunburn, you shouldn’t engage in this practice (I’m a redhead, I sunburn easily, and I do not ever expose myself to the sun intentionally). You can adjust the amount to account for living at northern latitudes, but even people with darker skin can develop skin cancer, so don’t overdo it.
  • Screen, for sure. Make a habit of using a broad-spectrum sunscreen. It’s true the jury is still out on whether chemical-based sunscreens may have other long-term health effects, but for now the risks of skipping sunscreen appear to outweigh the risks of applying it. If you’re wary of chemical sunscreens, try physical blocks. Some health experts have raised concerns about metal nanoparticles in some physical formulas, but you can find zinc- or titanium-based sunscreens that do not contain nanoparticles (though they may leave a white sheen when applied).
  • Move more. As noted, the women in the sun-exposed group were significantly more active than the sun avoiders. Regular physical activity is strongly linked to better overall health, so add movement into your day, every day. If you’re a couch potato, just 20 to 30 minutes of brisk walking will do the trick.
  • Stay slim and trim. The sun avoiders were more likely to be overweight or obese. Maintain a healthy body weight throughout adulthood, and if you’re already overweight, losing just a few pounds can significantly improve health and reduce risk of death. And even if you don’t lose a single pound, getting more physical activity will improve your health. Heavy active people are much healthier than heavy inactive people!

What’s the danger?

Finally, keep in mind that having adequate vitamin D levels is linked with better health. Sweden is very far north, and its population gets little sun for much of the year, so it is possible that in this group, avoiding what sun is available lowers blood vitamin D to insufficient or deficient levels. In that setting, sun avoidance may be more harmful than it would be elsewhere.

(Journal of Internal Medicine, 2014, 276; 77–86)


Should You Supplement Vitamin D? A New Look at Old Data

Two new vitamin D studies offer tantalizing clues about the sunshine nutrient’s role in health, though the newly published research challenges long-held assumptions about vitamin D and bone health. As with most nutrition research, details matter, and considering those details can help you make informed choices about the role of vitamin D in your self-care plan.

Both papers, published in the British Medical Journal, used a technique called meta-analysis, which combines existing studies:

  • One paper found people with low vitamin D blood levels were significantly more likely to die from heart disease and cancer compared with people who had higher levels. Supplementing with vitamin D3, though not D2, appeared to reduce risk of death from any cause by 11% in older adults as well.
  • The other study supported a connection between higher vitamin D levels and reduced risk of chronic disease. However, the authors concluded that vitamin D supplements alone do not improve bone health and recommended against routine vitamin D supplementation for healthy adults.

Do they stand up to scrutiny?

While meta-analysis may alert researchers to important connections between nutrition and health, the method has downsides. An editorial accompanying the studies noted there is a huge range in the type and quality of the studies included in these meta-analyses, which may lead to erroneous conclusions.

It is possible that low vitamin D levels are caused by disease, rather than disease being caused by low levels, or are an indicator of other factors tied to poor health, such as smoking and obesity.

Considering context

A look at the big picture can help you better understand the connection between vitamin D and optimal health:

  • Consider deficiency. Health experts note that as much as two-thirds of the population of the United States and Europe are vitamin D deficient. Considering this and their findings, the study authors reported up to 13% of deaths in the United States and 9% in Europe can be attributed to low vitamin D levels.
  • Know your nutrients. Dr. Robert P. Heaney, MD, Professor of Medicine at Creighton University, points out that if a person is low in other critical nutrients, such as magnesium, calcium, vitamin K, or even dietary protein, adding vitamin D is unlikely to improve bone health.
  • Supplement wisely. Start with a diet of whole, healthy, unprocessed foods, such as vegetables, fruit, legumes, nuts and seeds, and lean protein. Work with your doctor or dietitian to see where you may be coming up short on critical nutrients, and supplement accordingly.
  • Watch the maximums. For most nutrients, health agencies have set safe upper limits of intake. To avoid potential downsides of supplementing, do not exceed these safe upper limits unless directed to do so by your doctor.
  • Consider food and supplements. Many people aim to meet all of their nutritional needs with dietary supplements. A smarter strategy? Consider dietary and supplemental sources of key nutrients. Make sure these add up to meet your nutrition goals.

(BMJ 2014;348:g1903; BMJ 2014;348:g2035; BMJ 2014;348:g2280)

UPDATE: Is Saturated Fat the New Health Miracle? Not So Fast.

It’s hard to believe it was just yesterday that I blogged about a study that I believe was widely misinterpreted in the mainstream media. Within days of the original publication, the study authors issued corrections to the research. Science moves fast!

In my original blog on this topic, I pointed out that while most media reports focused on the “exoneration” of saturated fat as a cause of heart disease, key findings were ignored. The study authors’ corrections further weaken their conclusion, “Current evidence does not clearly support cardiovascular guidelines that encourage high consumption of polyunsaturated fatty acids and low consumption of total saturated fats.”

We are back to where we began: For best health, in terms of heart disease and just about any other chronic disease one can name, we need to base our diets around unprocessed, whole foods, including nuts, legumes (beans and peas), vegetables and fruit, whole grains, vegetable and olive oils, and small portions of animal products, if desired.

If you’d like to read more about what other researchers in this field are saying about the corrections, be sure to check out Science Magazine’s coverage, and the article by Knight Science Journalism at MIT. According to Paul Raeburn, author of the MIT article, the best explanation for the controversy over saturated fats raised by the recent research just may be error.

Is Saturated Fat the New Health Miracle? Not So Fast.

I recently wrote a Week in Wellness column for a client of mine, a wonderful company dedicated to providing quality, evidence-based health and nutrition information to the public through venues such as large retailer kiosks and websites. The topic is so timely, I wanted to share it with you here – with permission, all you copyright hawks out there 🙂

My passion is to help people make sense of the confusing mix of hype and fear that makes up the majority of mainstream media reporting about food, nutrition, and health… I hope you enjoy reading it as much as I enjoyed writing it.

“Healthy” Fats – What Does That Even Mean Anymore?

A new analysis of existing research published in the Annals of Internal Medicine has been hot news, as it calls into question the long-accepted link between heart disease and eating saturated fat. In decades past, we were told to eat less meat, butter, cheese, and other high-fat dairy to reduce heart disease risk, until this advice gave way to the concept of focusing on fats with health-promoting properties. Does taking this and other new research into account change how we define “healthy fats”?

Despite the hype, the simple answer is, “Not really.” Taking all the study’s findings into account hardly adds up to an open invitation to step up saturated fats:

  • People eating the most trans fats had more heart disease than those eating the least, confirming something we already know: trans fats, found primarily in processed foods, are bad for the heart.
  • There was no difference in heart disease rates between people eating the most saturated fat, compared with people eating the least. This was true for omega-6 fats as well; they neither protected nor harmed the heart.
  • People eating the most omega-3 fats—think fish, nuts, and seeds—had significantly less heart disease than those eating the least.

In other words, the findings do not indicate that saturated fats are good for heart health, or a good way to maintain a healthy weight—they were neutral. The most important takeaway, though largely ignored, is to avoid trans fats and emphasize omega-3 fats.

The bottom line on healthy fats

The importance of omega-3s was further highlighted by another meta-analysis on fats and health—also published last week, though to much less fanfare. This analysis, published in JAMA Internal Medicine, looked at 70 randomized controlled trials, and the findings strongly demonstrated that omega-3 fats from fortified foods and supplements significantly reduce high blood pressure risk, a major contributor to heart disease.

So, what are the “best” fats to support a healthy heart and weight?

  • Omega-3s, found in fatty fish, such as salmon and sardines, and in nuts and seeds, particularly walnuts and flax seeds.
  • Monounsaturated and polyunsaturated fats, found in olive and sesame oils, and in nuts and seeds.
  • Saturated, which don’t need to be sought out in the diet, but are okay when consumed as part of small quantities of animal products, such as high-fat dairy and meat.

Also, when looking to make fat a healthy part of your diet—especially if weight management is your goal—don’t overlook how you prepare it. Study findings also published in the British Medical Journal demonstrate that fried, fatty foods may be particularly harmful for those at highest genetic risk for obesity. Considering genes and fried food together in 37,000 people, researchers found each of these factors may intensify the harm of the other. (In other words, the heaviest people in the group had both the genes and the fried-food-heavy diet contributing to their weight woes.)

Back to basics

So, the fats you choose matter, and—whether or not you have the gene that makes you more at risk for obesity—avoiding fried food makes sense. However, according to Dr. Dariush Mozaffarian of the Harvard School of Public Health, the take home message on fats and health is to move away from a single-nutrient focus. Instead, build your diet around a variety of unprocessed, whole foods, including nuts, legumes (beans and peas), vegetables and fruit, whole grains, vegetable and olive oils, and small portions of animal products, if desired.

Does an “Acidic Body” Cause or Worsen Cancer?

I was asked recently if it’s important to prevent the body from being “too acidic,” for someone who has cancer. This topic has come up again and again over the nearly two decades during which I’ve worked in the cancer nutrition field. Given how common this belief is, it’s certainly worth discussing.

Possible origin of the belief that an “acidic body” causes cancer

There is a significant contingent of the complementary, alternative, and integrative medicine communities that believes keeping the body “alkaline,” or less acidic, is one of the key factors for optimal disease prevention. There also is a belief that an “acidic environment favors cancer.” This belief likely arose from the fact that cancer cells, with their unusual, extremely rapid metabolic activities, tend to make the micro-environment in and around a tumor more acidic. However, this acidic tumor micro-environment does not appear to measurably shift total body pH to be more acidic.

Further, there is no evidence that making the body itself more alkaline will have any effect on the acidity of the tumor. So, the observation that tumors are acidic may have led some people to conclude (mistakenly) that acidic environments cause tumors. In reality, it is tumors that cause acidic environments. Thus far, research simply does not support the idea that a large focus on shifting total body pH to be more alkaline will improve cancer outcomes. Perhaps future studies will show some benefit of “being more alkaline,” but for now, that research does not exist.

Of interest, cancer researchers are investigating ways to alter the pH of cancer treatment medications, so they can better penetrate into tumors. This is an exciting line of study, but it doesn’t point to any benefit of trying to “de-acidify” the body overall as part of cancer treatment.

One exception

As an aside, it’s important to note that in later stages of cancer, when a person has significant metastatic disease – tumors that have spread throughout the body – the body can become acidic. However, this is not related to the tumor acidity itself. In advanced cancer, a condition called cachexia can occur.

Cachexia causes the body to inappropriately use lean tissue, such as muscle, for energy, and to fail to use fat and carbohydrates – the more appropriate sources of fuel (calories). This causes wasting and weakness, and can increase body acidity. However, this cannot be reversed with diet alone, and making the body more alkaline will not stop cancer cachexia from occurring in advanced cancer cases.

Even if you believe it, how do you measure it?

The other problem we face is how to measure acidity. Urine acidity, a test that many alternative medicine practitioners use to convince people to take “alkalinizing supplements,” is very short term and reactive. A change in what you put in your mouth can immediately and drastically alter urine pH within hours. But we don’t know whether this means that what you put in your mouth has done much to your blood and body’s pH levels over the long term. One short-term (one-week) study demonstrated that urine and blood pH could be increased with an alkaline mineral supplement, but even these study authors concluded that they had no idea whether this had any implications for long-term health.

In the end, most agree that diet can nudge blood and body pH, but for how long, and to what effect, isn’t clear in the majority of cases.

It is important to note, though, that while body acidity can be altered by general diet patterns (more on that below), these changes are very small. A pH of 7 is neutral; above 7 is alkaline, and below 7 is acidic. Blood pH is naturally slightly alkaline, with healthy levels ranging from approximately 7.35 to 7.45.

What if you could shift your body pH?

The body fights VERY hard to keep blood pH in that range, and it’s controlled mainly by the kidneys and the respiratory system. If the body becomes too acidic (acidosis) or too alkaline (alkalosis), it’s due to either respiratory or metabolic causes; these very serious medical conditions are referred to as metabolic or respiratory acidosis and metabolic or respiratory alkalosis. Regardless of the hows and whys, if a person’s blood pH varies much from the normal 7.35 to 7.45 range, that person is in big trouble, medically speaking. And what their urine pH is doing isn’t a big part of the equation.

In the end, you cannot push blood or total body pH very hard in either direction with diet alone. The body simply recalibrates, to bring it back to baseline. And what that baseline is may vary from person to person, within that narrow range of approximately 7.35-7.45.

Then what is the role of diet in body acidity and alkalinity?

We do know that over the long term, nutrients such as phosphorus, calcium, and magnesium, all of which can affect acidity, may play a role in specific health conditions. For example, it has long been believed that a diet with too much phosphorus and not enough calcium and magnesium may contribute to osteoporosis. The theory is that the body pulls calcium from bones for use as a buffer against the acidifying effects of phosphorus, and to help metabolize the excess phosphorus. This is one reason a phosphorus-heavy diet (processed foods, animal foods, cola-type sodas) has been thought to contribute to bone loss. However, even this theory has come under scrutiny in recent years. Some researchers have shown that a higher-phosphate diet and more acidic urine may actually decrease calcium and bone loss.

The above example focuses on specific minerals, so let’s return to the general diet question. It is clear that the food-related factor that most impacts acidity and alkalinity – within that very narrow range – is total diet patterns. In general, the more plant-based the diet, the more alkaline the blood and urine tend to be. Overall, animal foods tend to increase acidity, while plant foods tend to decrease it.

There are exceptions – cranberries and plums, for example, tend to increase acidity – but most vegetables and fruit, even if acidic in nature (think citrus), actually create more alkalinity in the body. Research shows that vegans are the least ‘acidic,’ followed closely by vegetarians, and then by omnivores. The bottom line is that eating more plants will make the urine less acidic, and likely the body less acidic overall as well.

Does it matter why eating more plants decreases acidity?

Personally, I find it convenient that the very same nutrition approach that alkalinizes the urine (and probably the body overall), also is the thing that appears to reduce cancer risk. Available studies in humans support that a plant-based diet – a diet in which the bulk of calories come from minimally processed plant foods including vegetables, fruit, nuts, seeds, legumes, and whole grains – happens to reduce cancer risk and possibly reduce risk of recurrence, and to alkalinize the body.

In the end, if the motivation of acid/base balance helps a person make healthier choices that can simultaneously reduce acidity and reduce disease risk, does it matter? I believe that why people eat healthier isn’t as important as the fact that they do it. Reducing disease risk – cancer, heart disease, obesity, stroke, dementia – is the goal. To make a long story short, if people are interested in ‘alkalinizing,’ it’s helpful to focus on diet patterns.

What about alkalinizing dietary supplements?

I know of a naturopath who recommends that clients with acidic urine take high daily doses of magnesium sulfate mixed in seltzer water to “alkalinize” the body. This does alkalinize the urine immediately, though many people who try it end up with loose stools and even diarrhea. Unfortunately, even though magnesium is a mineral that tends to decrease body acidity, it’s not a good idea for most people to take high doses of it over the long term. Remember milk of magnesia, the laxative? Magnesium is the ingredient that causes the laxative effect.

If you want to increase magnesium intake without the unpleasant side effects, plants are a great solution. In particular, try greens. A serving of spinach, chard, collards, kale, purslane, yellow dock, or other green leafy vegetables provides a meaningful share of the Dietary Reference Intake (DRI, which replaced the RDA) for magnesium.

Back to food

Inevitably, when I explain what we do and don’t know about the relationship between acidity in the body and cancer, I am asked for a list of foods. Which ones are alkalinizing and which foods increase acidity?

Nearly all vegetables, fruit, mushrooms, spices and herbs, and the sweeteners honey and molasses make the urine less acidic. Exceptions include cranberries, plums and prunes, and corn, which tend to make the urine, and possibly the body, more acidic. This doesn’t mean you need to avoid them, because what matters is the overall dietary pattern. If most of the vegetables and fruit you eat fall in the “alkaline” category, the overall effect will be to alkalinize your urine.

As an example of how a mix of “alkaline” and “acid” foods can still result in less acidic urine, consider vegetarians and vegans. Despite the fact that most grains fall into the “acid food” category, and that most vegetarians and vegans eat plenty of grains, these two groups still have much more alkaline urine than omnivores. Clearly, the fact that the rest of a vegetarian diet is composed of “alkaline foods” outweighs the acidifying effect of grains.

Meat, poultry, cheese, fish, eggs, fats and oils, sweets, and most grains (as noted above) are “acidifying,” for the urine at any rate. Milk and dairy are considered neutral to slightly acid. Nuts, seeds, and legumes (beans and peas) are a mix, with some falling into the “acid” category and others being considered “alkaline,” in terms of urine.

It is likely that eating a mix of nuts, seeds, and legumes from each “category of acid/alkaline foods”, on balance, will not significantly alter urine pH. Again, despite the fact that some nuts and legumes are “acid forming,” and that vegetarians and vegans eat large amounts of these foods, these groups of people produce less acidic urine than omnivores. As far as beverages, tea, coffee, vegetable juice and most fruit juices tend to make urine less acidic.

The bottom line

Eating fruit and vegetables is linked with lower risk of all kinds of chronic diseases, including some types of cancer. Most fruit and vegetables are “alkalinizing,” but does this play much of a role in why these foods tend to be protective against chronic disease? We don’t know. It may play some role, but so do phytochemicals, vitamins, minerals, fiber, and the thousands of disease-fighting components found in plants.

The Benefits of Intellectual Flexibility

At first glance, the phrase “intellectual flexibility” brings to mind someone who has no principles, but I’d like to propose that intellectual flexibility is a good thing. If you’re passionate about food, health, and nutrition science, as I am, being intellectually inflexible can be a big liability.

The dawn of nutrition science

Nutrition is a young science. Even simple concepts, such as vitamins and minerals, only gained wider understanding in the 20th century. More complex dietary puzzles still aren’t solved. We have some good ideas about how nutrition and health are connected, to be sure, but the fine details elude even the experts.

For example, we know that a plant-based diet pattern consistently is linked with lower risk of the chronic diseases that plague modern man. This includes heart disease, diabetes, stroke, hypertension, osteoarthritis, and some types of cancer. Plant-based refers to any dietary pattern that is based around eating plenty of fresh and minimally processed vegetables and fruit, legumes (beans and peas), nuts, seeds, and whole grains (brown bread doesn’t count), regardless of whether it contains modest amounts of animal foods.

The Mediterranean diet is an example of this type of eating pattern. The Okinawa diet is another example. Both of these dietary patterns are linked with excellent health, especially when compared with the processed food and meat-heavy fare typical of the American diet.

Accepting the evidence and new ideas

Although many nutrition experts agree that eating more plants is a good way to improve health, agreement begins to break down when we delve into specific dietary components. This is where the benefits of intellectual flexibility are most evident. I experienced this in my own life very recently.

I had written a newsletter about a comprehensive analysis of research on multivitamins and mortality. Previous observational research suggested multivitamins increased risk of death, and vitamin skeptics raised the alarm. This new study, however, which included only controlled clinical trials—the gold standard of evidence—suggests the alarm is unwarranted. The study’s conclusion: taking multivitamins has no effect on risk of death due to any cause.

I wrote up the “take away” message, noting that it’s fine to take a multivitamin: it’s unlikely to harm health, but it’s not likely to help much either. End of story. Or so I thought. I received comments back from the medical editor indicating that I had failed to mention that many people take vitamins “for energy.” I chuckled to myself and responded that vitamins don’t give people energy (unless they have a vitamin deficiency). If someone claims vitamins give them energy, it’s probably the placebo effect.

My colleague pointed out a double-blind, placebo-controlled clinical trial, in which participants who received the multivitamin reported significantly greater reductions in anxiety and perceived stress compared with people who received the placebo. The vitamin recipients also rated themselves as less tired and better able to concentrate following treatment.

Maintaining my intellectual flexibility

What?! This flew in the face of everything I had ever believed about multivitamins as a harmless yet ineffective way to alter how people actually feel. I quickly searched the medical literature and confirmed that yes, vitamins can improve energy and feelings of well-being.

One study found that participants receiving a mix of B vitamins, minerals, and vitamin C had significant improvements in mental and physical stamina, concentration, and alertness compared with the placebo group. Other placebo-controlled, blinded trials concurred with these findings.

My strongly held conviction that vitamins cannot objectively improve physical or mental well-being was challenged, and I was forced to reexamine my beliefs. I concluded that I had been wrong, and I changed my opinion. That is intellectual flexibility: being open to new evidence, and willing to admit that previously held beliefs may be incorrect.

Are you (or your nutrition guru) intellectually inflexible?

This is just one example of intellectual flexibility, but there are many others I’ve experienced in my own life and work. Unfortunately, many self-proclaimed nutrition experts are unwilling to walk the path of intellectual flexibility.

There are vitamin D experts who refuse to acknowledge the possibility that high, though still normal, blood levels of vitamin D may increase the risk of some types of cancer. Paleo diet pushers who, despite strong evidence to the contrary, insist that legumes harm, rather than improve, health. And Atkins diet proponents who deny that higher intakes of meat are linked with increased risk of heart disease and cancer.

Why take your advice from a nutrition guru who is dogmatic and inflexible, and who clings to opinions about nutrition and health that have been challenged with good evidence? Doing so is a recipe for making poor nutrition choices. When it comes to nutrition science, exercising your intellectual flexibility is an important part of the learning process.

Gluten Free or Gluten Fear?

What and where is gluten?

Gluten and related proteins are found in wheat, rye, barley, and possibly oats. The jury is still out on whether oat proteins themselves, or proteins from other grains that contaminate oats, are the main problem for those on a gluten free diet.

Thousands of foods contain these grains, which means that completely avoiding gluten is no small task. Pretty much any processed food not specifically labeled gluten-free should be considered suspect.

Why the fear of gluten?

Celiac disease is a condition in which the immune system reacts to gluten and related proteins. For people with celiac disease (CD), there is no question that a 100% gluten free diet is absolutely necessary to maintain good health.

With CD, failure to avoid gluten can lead to very serious health problems, including thyroid disease, joint damage, neurological effects, alopecia (hair loss), skin rashes, intestinal cancer, and of course, damage to the intestinal tract. This can cause malabsorption of nutrients, leading to conditions such as osteoporosis and anemia.

In short, if you have celiac disease, avoiding gluten should be your number one diet priority.

Given the rising number of people affected by celiac disease, and the scary list of symptoms that can accompany it, it’s not surprising that fear of gluten is on the rise too. But there’s a catch.

Most of the people who have celiac disease (CD) don’t know it. And contrary to popular opinion, not everyone with CD has gastrointestinal symptoms. Only 35% of those with CD report experiencing diarrhea, for example.

Who is gluten free?

Numbers are hard to come by, but celiac disease experts believe that of the roughly 2 million Americans with CD, about 80%, or 1.6 million, don’t know they have it. Conversely, approximately 1.6 million people in the US without CD are believed to be following a gluten free diet.

Sadly, this means that many who need a gluten free diet aren’t on one, and many who are on a gluten free diet don’t need it. This is why I often refer to CD as the “simultaneously most under- and over-diagnosed” condition in the US.

For those who truly have CD, the time from onset of symptoms to diagnosis can be agonizingly long, up to 10 years for many adults. Yet, many people diagnose themselves with CD without any evidence that they have the condition.

Many of life’s common woes – fatigue, stress, anxiety, lack of energy, insomnia, weight gain – are attributed to gluten, despite a lack of evidence to support these claims.

Gluten intolerance without celiac disease

If you suspect you are sensitive to gluten, yet a blood test for celiac disease indicates you do not have the condition, you may have something called non-celiac gluten sensitivity. Undoubtedly, some people without celiac disease still don’t digest gluten very well. But as with celiac disease, it appears likely that many more people believe they have non-celiac gluten sensitivity than actually do.

As Dr. Stefano Guandalini, Director of the Celiac Disease Center at the University of Chicago, recently told the NY Times, a gluten-free diet “is not a healthier diet for those who don’t need it,” and many people are essentially “following a fad.” Dr. Guandalini hastened to add, “And that’s my biased opinion.”

Should you go gluten free?

You should consider going gluten free if you have reason to believe you have celiac disease. Why might you have CD?

If someone in your family has it, this increases the likelihood that you may develop the condition too. If you or anyone in your family has an autoimmune disease, especially type 1 diabetes, rheumatoid arthritis, autoimmune thyroid or liver disease, Addison’s disease, or Sjögren’s syndrome, this too can increase the odds of CD.

And to confirm your suspicions, ask your doctor for a blood test. There’s one caveat: Do not go gluten free before you get the blood test. Doing so can render the test completely inaccurate.

The gray area

What about non-celiac gluten intolerance? It is possible you’ll feel better without gluten in your diet, but for many people, feeling better results from cutting out junk food, not gluten. Going gluten free means far fewer baked goods, pretzels, chips, cookies, pies, and other nutrition bombs.

Simply paying better attention to what you put in your mouth, and eliminating most processed food, will make anyone feel better, gain energy, and lose weight. You can accomplish this without nixing gluten, and that’s not a bad idea, given that gluten-free does not mean healthy. Many people end up eating a less-nutritious diet when cutting out gluten.

To avoid this pitfall, talk to a dietitian or doctor who specializes in gluten free diets to help you sort it out. With his or her help, you can do something called an elimination diet. Gluten can be one of those things you eliminate, and you can track your symptoms accordingly.

With professional help, you can apply the elimination approach to many potential dietary trouble spots, in a systematic fashion. You may just discover some other dietary culprit that is making you feel poorly. And given how difficult it is to completely avoid gluten, you’ll probably be glad to narrow down your issues to something else.

For everyone else?

Don’t give in to the fad.

As Dr. Alan Leichtner, senior associate in medicine at Boston Children’s Hospital Division of Gastroenterology and Nutrition, says, “There are no studies showing that the gluten-free diet has an impact on anything other than celiac disease. The medical data simply aren’t there.”

Does Dairy Cause Breast Cancer?

Dairy is a common target of nutritional fear mongering. It’s also one of the most highly politicized nutrition topics in the United States. Pro-dairy groups will tell you that you must eat dairy for good health. Anti-dairy groups will tell you that you must avoid dairy for good health. Who’s right?

The truth? Somewhere in the middle.

This is because the connection between dairy and health is not black and white. Dairy may be helpful for reducing risk of some health threats, yet it may be implicated in increasing the risk of other diseases. If someone issues a blanket statement about dairy and health, they simply are not telling the truth. The science on dairy and health does not support that dairy is always good, or always bad.

Can dairy and breast cancer be studied accurately?

Unfortunately, the gold standard of research – a double-blind, placebo-controlled trial – isn’t available for dairy and breast cancer. It’s nearly impossible to randomly assign people to consume dairy or not consume dairy, have them stick to this dietary regimen faithfully, and follow them for the decades required to see how many people in each group get breast cancer.

Never mind that you can’t “blind” people to the intervention. If someone is assigned to drink milk, they know it. There isn’t a good placebo – or a non-dairy milk – that would fool anyone into thinking they are drinking milk when they aren’t. Ditto for cheese and yogurt. So it’s unlikely that we’ll get this type of evidence anytime soon.

Fortunately, there are dozens and dozens of observational studies on dairy and breast cancer. And while no single observational study can prove cause and effect, when we have a lot of these studies to consider, we can look for a pattern.

A great example is smoking and lung cancer. Even though we don’t have a controlled clinical trial on smoking and lung cancer, the observational studies all point the same way.

And about breast cancer?

How do dairy and breast cancer stack up? It turns out that dairy is, if anything, slightly protective against breast cancer. One large meta-analysis – a type of study that combines data from previous studies on the topic – found that women with the most dairy in their diets had about a 15% reduced risk of breast cancer.

However, other large observational studies and research reviews have found that dairy is neither protective against, nor increases the risk of, breast cancer. In essence, it is neutral.

To sum it all up, if dairy foods truly had a strong connection with breast cancer, the results of all of this observational research would consistently point in that direction. This isn’t what we see, which means there probably isn’t a strong connection between dairy and breast cancer risk, one way or the other.

The Take Home Message?

If breast cancer is your concern, dairy is pretty much a non-issue. If you enjoy dairy, have dairy. If you don’t like dairy, don’t have dairy. You may have other reasons for wanting to avoid dairy, but don’t let someone sway your decision by convincing you that dairy causes breast cancer.