
Monday, August 29, 2005

Statistical Obfuscation

After reading this article, which tries to draw a link between fast-food advertisements and child obesity (a questionable statistical connection at the very best), I was left feeling that something didn't quite ring true. So I ran all the statistical numbers through my handy-dandy calculator:

an average of 10.65 food advertisements per hour in their sample... researchers taped 40 hours of TV programming... The sample yielded 1,424 advertisements, 426 (or 29.9 percent) of them for food products.
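
For anyone who wants to run the same check, here is my calculator work as a quick Python sketch -- the variable names are mine, but the figures come straight from the quoted article:

# Sanity check of the article's own numbers
hours = 40          # hours of TV programming taped
total_ads = 1424    # total advertisements in the sample
food_ads = 426      # advertisements for food products

print(food_ads / hours)                       # 10.65 food ads per hour
print(round(food_ads / total_ads * 100, 1))   # 29.9 percent of all ads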


Well, those numbers all cross-referenced perfectly to within three decimal places. So I read on down a little further, and I realized why the alarm bells had gone off in my head the first time through the article. I present to you two lovely examples of how to manipulate statistical analysis by means of specious implication:

The researchers then coded each ad as being aimed at a child or an adult audience; foods by type; verbal or visual health-related messages; and characteristics of all human characters.


In other words, the data correlation was filtered through the personal prejudices of the research staff--who already had an implied agenda. In statistics, that's called second-order data corruption.

Also, because research demonstrates a connection between TV viewing and obesity for children and adults alike, parents could curb eating in their household by limiting their children's -- and their own -- television viewing.


This is called an assumed causality link, and it is in no way supported by the primary research data. The casual cross-association, "because [other] research demonstrates," is blatant and deliberate distortion.

And that, folks, is how you pass bad science off as statistical "facts." You produce real numbers, and then draw fatuous conclusions based upon your prejudices. The numbers are just a smokescreen.

Absolutely shameful.

4 Comments:

Blogger mman said...

Good point. Data can always be fudged. If you want to prove the grass is greener on the other side, that is what your data will eventually prove. Hopefully you can obtain a government grant to line your pockets while you conduct your scientific research.

16:27  
Blogger Becky said...

I've gotten to where I can hardly bring myself to pay attention to such things anymore. It's really hard to avoid becoming a total relativist. Even if there aren't obvious distortions like the ones you point out (I don't know what second-order data corruption is, but I know that setting up an experiment requires making some assumptions which will affect the results), there are possible distortions that aren't being mentioned. It's hard to know how to really interpret such things.

16:57  
Blogger Churt(Elfkind) said...

My understanding of experiments is that the assumptions are what you are attempting to discover the truth about. I much prefer to say, “discover the truth,” because in a true experiment that is what you are after. You don’t care if it’s result A or B. You just want to know what it really is. If you use assumptions for test data, then you’re just speculating, and your data, while entertaining perhaps, does not prove anything. Astronomy is full of examples (e.g., dark matter). I avoid saying speculation is useless, because it can be healthy to ponder what might be and then start real experiments. It’s just that people are taking it as the final stage instead of the beginning.

In the case of this “experiment,” the assumed possible results are that the marketing either does or does not affect obesity in children. Children don’t usually buy their own food; that’s what parents or guardians do. So at best it’s an indirect link. You might extrapolate a new experiment from that and test whether parents/guardians are the cause of their children’s obesity. For those who still don’t get it, the previous sentence is speculation/opinion. No testing has been done. If someone wants to give me a multi-million-dollar grant, I’ll research it. Otherwise, figure it out for yourself. If you think the last few sentences had nothing to do with the original subject, then you really aren’t paying attention.

Later,
N

07:12  
Anonymous Anonymous said...

Exaggerated data or not, if you have kids, cancel your cable. Yes, it is possible; I’ve seen it done. If you don’t have kids, cancel your cable.

As far as exaggerated research data goes, the only true studies being done are on global warming (Is Global Warming Fueling Katrina?). All others obviously have a spin.

09:20  
