So you may have noticed that there is a study out there this week that purports to make a connection between Cardiovascular Disease (CVD) and weight and activity levels.
The whole thing is pissing me off. I first saw something about it on the web yesterday. The reporting is awful... it seems to say that exercise has no real bearing on cardiovascular fitness and we all just need to be thin. That, of course, is not what the article concludes. The researchers say their data show that exercise does not completely ameliorate the risk of CVD for people who are fat, but that, in their reading of their own research, exercise does have some positive effect.
So the reporting on this is misleading. Then there is the issue that this study was not designed to answer this exact question, which is perhaps the cause of a major flaw in the study, in my opinion. You see, the researchers were actually collecting data to study the effects of low-dose aspirin and vitamin E in the prevention of CVD. They collected the data on weight, height and activity levels at the beginning of their study... 11 years before... and never again. So the only correlation we can make is that the women in this study who happened to have a particular activity level and body type 11 years ago seem to have higher rates of CVD. They can't really say why, because the changes in activity level and body type that can happen to study subjects over 11 years are unknowable from this data.
The authors, to credit them with some sense of ethics, do acknowledge that this is a problem. "We did not account for changes in BMI and physical activity over time, which in a similar study of diabetes in the Women's Health Study did not affect our risk estimates."
What the heck do they mean by that, you may ask? Well, I suspect (having read that article as well) that they are falling back on the fact that, in the same study design the current study was drawn from, their measure of activity levels was shown in another study to be highly correlated at a 2-year interval. In other words, when someone was asked about activity levels again 2 years later, the same measure produced similar results. But the article that validates this used a sample of only 231 women. And no validity tests were run in any of the Women's Health Study research.
They might also mean that they found a similar result when they studied the diabetes question, but again, that study was a 7-year follow-up with no further report of weight, height or activity level in the interval.
It's absolutely frustrating to me. We (the public) tend to believe that the research we are presented with is accurate and informative. But what actually reaches us is often far from that. This contributes to the feeling of being on a rollercoaster when it comes to lifestyle recommendations (usually in the form of what's healthy for us to eat).
I remember a professor at CAL that I really didn't like. She wanted me, and her other students, to pay more attention to research and best practices. She wanted us to validate our interventions. She wanted us to look deeper at what was being said and done in our (or our clients') best interest and really find out whether it was. It seemed like too much trouble at the time, but now I don't know. We can't, of course, know everything, but the more I read of research that becomes news, the less I feel I can believe of what's reported.