Could long-term exposure to vitamin D, a natural enhancer of the immune system, ultimately have a negative impact on that system? Specifically, could vitamin D trigger exaggerated autoimmune responses and increase the likelihood of adult-onset autoimmune diseases, particularly lupus (systemic lupus erythematosus, or SLE) and rheumatoid arthritis (RA)?
It doesn’t seem likely, say the authors of a study from the Harvard School of Public Health and Brigham and Women’s Hospital in Massachusetts. They base that conclusion on comparisons of data originally provided in the late 1980s and examined 20 years later.
The researchers started by studying responses from participants in the Nurses’ Health Study (NHS) regarding their high school eating habits. Among the 120,000 total participants, 73,629 completed dietary questionnaires in the first NHS study, launched in 1986, and another 45,544 supplied such data in NHS II, which began two years later. The authors then conducted medical reviews seeking reports of RA and/or lupus among the same nurses over the subsequent 20 years.
After adjusting for variables such as age, sun exposure, and caloric consumption, they found that only a small fraction of the participants who had reported taking vitamin D as teenagers went on to develop RA or SLE. Specifically, 652 of the nurses in the NHS study developed RA and 122 developed lupus over 351 months of follow-up. In NHS II, 148 participants developed RA and 54 developed lupus over 209 months.
For now, the authors conclude, there is little indication that vitamin D consumption during the teenage years increases the risk of either RA or lupus later in life. They add that vitamin D intake during other periods of life should also be examined for possible risk.