Monday, 12 December 2011
Most People Don't Know Their Business (so asking them is useless)
I’ll admit it; I am rapidly becoming a skeptic when it comes to interview-based data. And the reason is that people (interviewees) just don’t know their business – although, of course, they think they do.
For example, in an intriguing research project with my (rather exceptional) PhD student Amandine Ody, we asked lots of people in the Champagne industry whether different Champagne houses paid different prices for a kilogram of their raw material: grapes. The answer was unanimously and unambiguously “no”; everybody pays more or less the same price. But when we looked at the actual data (which are opaque at first sight and pretty hard to get), the price differences turned out to be huge: some paid 6 euros for a kilogram, others 8, and yet others 10 or even 12. Thinking it might be the (poor) quality of the data, we obtained a large sample of similar data from a different source – supplier contracts – which showed exactly the same thing. But the people within the business really did not know; they thought everybody was paying about the same price. They were wrong.
Then Amandine asked them which houses supplied Champagne for supermarket brands (a practice many in the industry thoroughly detest, but it is very difficult to observe who is hiding behind those supermarket labels). They mentioned a bunch of houses, both types of houses and specific, named ones, that they “were sure were behind it”. And, almost invariably, they were completely wrong. Using a clever but painstaking method, Amandine deduced who was really supplying the Champagne to the supermarkets, and she found out it was not the usual suspects. In fact, the houses that did it were exactly the ones no one suspected, and the houses everyone thought were doing it were as innocent as a newborn baby. They were – again – dead wrong.
And this is not the only context and project where I have had such experiences, i.e. it is not just a French thing. Together with a colleague at University College London – Mihaela Stan – I analyzed the British IVF industry. One prominent practice in this industry is the role of a so-called integrator: one medical professional who is always “the face” towards the patient, i.e. a patient always deals with one and the same doctor or nurse, and not a different one every time the treatment moves to a different stage. All interviewees told us that this really had no substance; it was just a way of comforting the patient. However, when we analyzed the practice’s actual influence – together with my good friend and colleague Phanish Puranam – we quickly discovered that the use of such an integrator had a very real impact on the efficacy of the IVF process; women simply had a substantially higher probability of getting pregnant when an integrator, who coordinates across the various stages of the IVF cycle, was used. But the interviewees had no clue about the actual effects of the practice.*
My examples are just anecdotes, but there is also some serious research on the topic. Olav Sorenson and David Waguespack published a study of film distributors in which they showed that these distributors’ beliefs about what would make a film a success were plain wrong (they merely made their beliefs come true by assigning more resources to the films they believed in). John Mezias and Bill Starbuck published several articles in which they showed that people do not even know basic facts about their own companies, such as the sales of their own business unit, error rates, or quality indicators. More often than not, people were several hundred percent off the mark when asked to report a number.
Of course interviews can sometimes be interesting; you can ask people about their perceptions, why they think they are doing something, and how they think things work. Just don’t make the mistake of believing them.
Much the same is true for the use of questionnaires. They are often used to ask for basic facts and assessments: e.g. “how big is your company”, “how good are you at practice X”, and so on. Sheer nonsense is the most likely result. People do not know their business, both in terms of the simple facts and in terms of the complex processes that lead to success or failure. Therefore, do yourself (and us) a favor: don’t ask; get the facts.
* Although this was not necessarily a “direct effect”; the impact of the practice is more subtle than that.
Wednesday, 7 December 2011
The Lying Dutchman: Fraud in the Ivory Tower
The fraud of Diederik Stapel – professor of social psychology at Tilburg University in the Netherlands – was enormous. His list of publications was truly impressive, both in terms of the content of the articles and their sheer number, as well as the prestige of the journals in which they appeared: dozens of articles in all the top psychology journals in academia, with a number of them in famous general science outlets such as Science. His seemingly careful research was very thorough in its design, and was thought to reveal many intriguing insights about fundamental human nature. The problem was, he had made it all up…
For years – as we now know – Diederik Stapel made up all his data. He would carefully review the literature, design all the studies (with his various co-authors), set up the experiments, print out all the questionnaires, and then, instead of actually running the experiments and distributing the questionnaires, simply make it all up. Just like that.
He finally got caught because, eventually, he did not even bother to fabricate new data. He used the same (fake) numbers for different experiments and gave those to his various PhD students to analyze, who then, slaving away in their adjacent cubicles, discovered in disbelief that their very different experiments led to exactly the same statistical values (a near impossibility). When they compared their databases, there was substantial overlap. There was no denying it any longer: Diederik Stapel was making it up. He was immediately fired by the university, admitted to his lengthy fraud, and handed back his PhD degree.
In an open letter, sent to Dutch newspapers to try to explain his actions, he cited the huge pressure to come up with interesting findings in the publish-or-perish culture of the academic world – a pressure he had been unable to resist, and which led him to his extreme actions.
There are various things I find truly remarkable and puzzling about the case of Diederik Stapel.
• The first one is the sheer scale and (eventually) outright clumsiness of his fraud. It also makes me realize that there must be dozens, maybe hundreds, of others just like him. They just do it a bit less, a bit less extremely, and probably a bit more cleverly, but they are subject to exactly the same pressures and temptations as Diederik Stapel. Surely others give in to them as well. He got caught because he was flying so high; he did it so much, and so clumsily. But I am guessing that for every fraudster who gets caught, due to hubris, there are at least ten who don’t.
• The second one is that he did it at all. Of course because it is fraud, unethical, and unacceptable, but also because he did not really seem to need it. You have to realize that “getting the data” is just a small part of the set of skills and capabilities one needs to get published. You have to really know and understand the literature; you have to be able to carefully design an experiment, ruling out potential statistical biases, alternative explanations, and other pitfalls; you have to be able to write it up so that it catches people’s interest and imagination; and you have to be able to see the article through the various reviewers and steps in the publication process that every prestigious academic journal operates. Those are substantial and difficult skills, all of which Diederik Stapel possessed. All he did was make up the data – just a small part of the total set of skills required, and something he could easily have outsourced to one of his many PhD students. Sure, you then would not have the guarantee that the experiments would come out the way you wanted them to, but who knows, they might have.
• That’s what I find puzzling as well: at no point does he seem to have become curious whether his experiments might actually work without him making it all up. They were interesting experiments; wouldn’t you at some point be tempted to see whether they might work…?
• I also find it truly amazing that he never stopped. It seems he has much in common with Bernard Madoff and his Ponzi scheme, or the notorious rogue traders at investment banks: Nick Leeson, who brought down Barings Bank with £827 million in fraudulent trading losses, Société Générale’s Jérôme Kerviel (€4.9 billion), and UBS’s Kweku Adoboli ($2.3 billion). The difference: for people like Madoff or the rogue traders, there was no way back; once they had started the fraud there was no stopping it. But Stapel could have stopped at any point. Surely at some point he must at least have considered this? I guess he was addicted – addicted to the status and aura of continued success.
• Finally, what I find truly amazing is that he was teaching the Ethics course at Tilburg University. You just can’t make that one up; that’s Dutch irony at its best.