I just read the recent (Oct 12, 2011) JAMA article on "Vitamin E and the Risk of Prostate Cancer." It was a long-term, prospective, randomized study of 35,533 men followed at 427 study sites in the US, Canada and Puerto Rico. The investigators came from major academic and research centers, among them Duke, the Cleveland Clinic, Brigham and Women's Hospital (a Harvard teaching hospital) and the National Cancer Institute.
This was an impressive study of the effects of Vitamin E and/or selenium versus placebo. It began in 2001, enrolling subjects described as "relatively healthy men." Seven years after it began, in September 2008, the independent data and safety monitoring committee decided the supplements should be stopped: there had been no benefit (no reduction in prostate cancer detection), and a futility analysis (a statistical tool) showed the trial was very unlikely ever to demonstrate one; if anything, the trend pointed the other way (more cases of prostate cancer). I hadn't heard of that term before and found a medical website that discusses a number of reasons for ending a study before its intended end date. I'll paste in the URL if you want to read a one-pager on what is called "interim analysis."
http://www.childrensmercy.org/stats/plan/interim.aspx
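Out of curiosity about how a futility analysis actually works, here's a rough sketch of the idea, sometimes called "conditional power." Every number in it is invented for illustration; this is not the SELECT data or the monitoring committee's actual stopping rule. The question it asks: given the interim results, how likely is the trial to end up showing the hoped-for benefit, even if the supplement performs as hoped from here on out?

```python
# A toy futility ("conditional power") check -- all numbers are invented,
# not the real SELECT data or its monitoring plan.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical interim results, per arm
n_interim = 8000        # men followed so far in each arm
cases_placebo = 420     # prostate cancers seen in the placebo arm
cases_vit_e = 440       # prostate cancers seen in the Vitamin E arm

# Hypothetical design assumptions
n_final = 12000         # planned men per arm by the end of the trial
hoped_reduction = 0.25  # the hope: Vitamin E cuts incidence by 25%

placebo_rate = cases_placebo / n_interim
hoped_vit_e_rate = placebo_rate * (1 - hoped_reduction)

def shows_benefit(c_pl, c_ve, n, alpha=0.05):
    """Two-proportion z-test: does Vitamin E look protective at the end?"""
    p_pool = (c_pl + c_ve) / (2 * n)
    se = np.sqrt(p_pool * (1 - p_pool) * 2 / n)
    z = (c_pl / n - c_ve / n) / se        # positive z favors Vitamin E
    p_two_sided = 2 * (1 - stats.norm.cdf(abs(z)))
    return z > 0 and p_two_sided < alpha

# Simulate the rest of the trial assuming the hoped-for benefit kicks in
n_remaining = n_final - n_interim
n_sims = 10000
wins = 0
for _ in range(n_sims):
    future_pl = rng.binomial(n_remaining, placebo_rate)
    future_ve = rng.binomial(n_remaining, hoped_vit_e_rate)
    if shows_benefit(cases_placebo + future_pl,
                     cases_vit_e + future_ve, n_final):
        wins += 1

print(f"Conditional power: {wins / n_sims:.0%}")
```

With interim numbers like these, already trending the wrong way, the conditional power comes out quite low; in plain English, even a real benefit would probably not show up by the trial's end. That is the kind of picture that leads a monitoring committee to stop a study for futility.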
In this study, though the researchers stopped giving supplements and published an article on the results to date (JAMA. 2009;301(1):39–51), which showed a higher (but not statistically significant) number of prostate cancer cases in the groups receiving Vitamin E, selenium or both, they continued to follow the participants.
The later data, through July 5, 2011, were quite impressive. There was a 17% higher incidence of prostate cancer in the group taking Vitamin E. In most scientific studies a p-value of 0.05 is considered significant; that translates to a probability of 5% or less that whatever happened did so by chance. If the data work out to a p-value of 0.01, there's only a 1% chance the result was a random occurrence. Here, after about eleven years, the p-value for Vitamin E increasing the chance a man was diagnosed with prostate cancer was 0.008. (I'll paste in a website that explains more of this stuff if you're remotely interested.)
http://www.childrensmercy.org/stats/definitions/pvalue.htm
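If you'd rather see it than read it, here's a tiny example of how a p-value for this kind of two-group comparison gets computed. The counts are invented for illustration, not the trial's actual numbers.

```python
# A small p-value illustration with made-up counts (not the SELECT results)
from scipy.stats import chi2_contingency

#                  cancer  no cancer
vitamin_e_arm = [    620,     8000 ]
placebo_arm   = [    530,     8000 ]

chi2, p_value, dof, expected = chi2_contingency([vitamin_e_arm, placebo_arm])
print(f"p-value = {p_value:.3f}")

# A p-value of 0.05 would mean a 5% chance of seeing a gap this large (or
# larger) between the two groups if the supplement truly had no effect.
# A p-value near 0.008 means well under a 1% chance -- hard to write off
# as random bad luck.
```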
Why all the math and statistics?
Well, for starters, a few years back a large study showed the exact opposite, but in a highly selected group: men in Finland who were smokers. Another study, done with physicians as the subjects, showed no effect on the incidence of prostate cancer. A post by a physician harshly criticized the SELECT trial as part of a lengthy defense of supplements, but made sweeping pronouncements without supplying data or references to specific articles.
I read the articles, the blog post and the new study in detail. I know that medical research projects often come to conclusions that, a few years later, are "proven" incorrect. But I think this study was carefully done, had a clear-cut purpose and enrolled a large enough group of subjects that I'm going to believe its conclusions.
Plus I'm certainly not a Finnish smoker.
Tags: Medical research, Prostate Cancer, supplements, Vitamin E