Thursday, June 30, 2011

How Not to Take a Survey, Part II


            Previously, in “How Not to Take a Survey,” I discussed McKinsey’s opaque approach to its survey on employer reactions to the PPACA.  Now, it seems, McKinsey has bowed to pressure from the media and the Democratic Party and has released some of its survey data.  Those who follow this link can also download PDFs of the survey and the resulting data (I recommend reading the questionnaire at the very least).
            Again, I’m not sure how I feel about the political dimensions of this imbroglio.  The business press has every right to ask McKinsey questions; that’s part of the role that we assign to the media in a democratic society.  But should McKinsey be made to answer to the demands of a political party?  That thought makes me more than a little nervous.

Monday, June 20, 2011

Why Sample?

    One question I've received a few times since I wrote "How Not to Take a Survey" is this: given how thorny the issues of sampling are, why bother in the first place?  Wouldn't it be easier to put the question under study to every available member of the entire population?  There would then be no need to bother with the mathematics.  The answer, I think, is contained in the following joke, very often told to beginning graduate students:
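The point can also be made numerically: a modest random sample recovers a population proportion to within a small margin of error, at a tiny fraction of the cost of a census.  A minimal sketch in Python, with the population size and the "true" proportion invented purely for illustration:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical population: one million units, 30% of which would
# answer "yes" to some survey question.  (Numbers invented purely
# for illustration.)
N = 1_000_000
true_p = 0.30
population = [1] * int(N * true_p) + [0] * (N - int(N * true_p))

# A simple random sample of 1,000 -- one tenth of one percent of
# the population.
n = 1_000
sample = random.sample(population, n)
p_hat = sum(sample) / n

# Classical 95% margin of error for a proportion:
# 1.96 * sqrt(p_hat * (1 - p_hat) / n)
moe = 1.96 * (p_hat * (1 - p_hat) / n) ** 0.5

print(f"sample estimate: {p_hat:.3f} +/- {moe:.3f}")
```

Surveying one unit in a thousand already pins the proportion down to within about three percentage points; to halve that margin you would need to quadruple the sample, not the census.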

Sunday, June 19, 2011

What's in a Name?

    I've received a few questions as to why I'm blogging under the pseudonym "Student."  The answer is actually very simple: "Student" is a pseudonym with a very proud history in the field of statistics.  It was the name under which William Sealy Gosset, the Guinness brewer who developed what we now call Student's t-distribution, published his work, since his employer would not allow him to publish under his own name.  While I don't think I've ever accomplished anything comparable to Gosset's achievements, I'm proud to follow in his footsteps as an anonymous statistician.

Thursday, June 16, 2011

Introducing This Blog


            Everybody knows the famous quotes.  “There are three kinds of lies: lies, damned lies, and statistics.”  “67% of all cited statistics are made up on the spot.”  And the list goes on.  I’d like to think that my fellow statisticians and I have done a little more for the world than produce a great mass of lies, and I suspect that part of the reason we so often hear statements like the one attributed to the illustrious Samuel Clemens is a failure on our part to explain to the larger public what it is that we actually do.  Either way, though, the quotes point to an important fact: fuzzy numbers are often used to bolster weak arguments, particularly when those arguments are being made by politicians (though politicians certainly have plenty of company in this regard).

How Not to Take a Survey


           The world has actually been on shockingly good behavior regarding the use of statistics ever since the idea for this blog entered my head.  Fortunately, just as I was beginning to despair of ever finding a good topic for a first post, McKinsey and Company handed me this little gem of a report.

Some Background:

            To summarize, McKinsey’s proprietary research arm is claiming that, as a result of the Patient Protection and Affordable Care Act (often referred to as “Obamacare”), 30% of private-sector employers in the United States will stop offering employer-sponsored insurance (ESI for short) to their employees after 2014, when the law’s main provisions go into effect.  I’ll begin by noting that the Congressional Budget Office estimated a figure of about 7% for the same question, and studies by the RAND Corporation, the Urban Institute, and Mercer all suggest that the number of employers who currently offer traditional ESI but intend to end the benefit after 2014 is minimal.  Mercer also qualifies its findings by noting that in Massachusetts (where laws signed by former governor Mitt Romney in 2006 have produced a regulatory climate very similar in many ways to the PPACA) very few employers of any size have actually dropped traditional ESI since 2006.  So McKinsey is clearly an outlier here.
            Now, obviously, being an outlier isn’t what disqualifies the study.  It’s fully possible that McKinsey is correct, and has seen something that the CBO et al. either haven’t noticed or are refusing to see.  And it’s important to note that the Urban Institute and RAND both based their conclusions on simulations rather than polling data; this isn’t a criticism (I happen to think their simulation methods are rather good), but it needs to be pointed out that their methods are different.  Anyway, while the studies done by the opposing camp have their imperfections, McKinsey has done virtually everything it can to undermine the credibility of its own report, to the point that the report should not be taken seriously unless and until McKinsey releases more information to the public.
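One way to see how far outside the pack McKinsey sits: whatever the merits of the CBO's methodology, a 23-point gap between the two figures dwarfs the sampling uncertainty of any survey of plausible size.  A rough back-of-the-envelope calculation in Python, assuming (purely for illustration; McKinsey's actual design details are exactly what's in question) a simple random sample of 1,000 employers:

```python
# Standard error of an estimated proportion p_hat from a simple
# random sample of size n: sqrt(p_hat * (1 - p_hat) / n).
# The sample size here is an assumption for illustration only.
n = 1_000
p_mckinsey = 0.30  # McKinsey's reported figure
p_cbo = 0.07       # CBO's estimate for the same question

se = (p_mckinsey * (1 - p_mckinsey) / n) ** 0.5
gap_in_se = (p_mckinsey - p_cbo) / se

print(f"standard error of the survey estimate: {se:.4f}")
print(f"gap between the estimates: {gap_in_se:.1f} standard errors")
```

Under these assumptions the gap is on the order of fifteen standard errors, so the disagreement cannot be chalked up to the luck of the draw; it has to come from differences in methodology, question wording, or the populations being measured.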
I suggest reading the short article in its entirety, since it’s an excellent example of how not to publish survey data if you wish to be taken seriously, and of why the general public should be skeptical of survey data to begin with.