"Indeed, the process has been so widely commented upon that one writer postulated a common life cycle for all of the attempts to develop regulatory policies: The life cycle is launched by an outcry so widespread and demanding that it generates enough political force to bring about establishment of a regulatory agency to insure the equitable, just, and rational distribution of the advantages among all holders of interest in the commons. This phase is followed by the symbolic reassurance of the offended as the agency goes into operation, developing a period of political quiescence among the great majority of those who hold a general but unorganized interest in the commons. Once this political quiescence has developed, the highly organized and specifically interested groups who wish to make incursions into the commons bring sufficient pressure to bear through other political processes to convert the agency to the protection and furthering of their interests. In the last phase even staffing of the regulating agency is accomplished by drawing the agency administrators from the ranks of the regulated...
As much as 90% of medical knowledge has been estimated to be substantially or completely wrong. We spend about $95 billion annually on medical research in the US, but average life span here has barely increased since 1978 — and most of the improvement was due to the drop in smoking rates. The picture of expert trustworthiness is no better, and possibly worse, in most other fields. One examination of published economics findings concluded that the wrongness rate is essentially 100%. In that light, is it surprising that we weren't as well-protected as we thought from investment and banking system disasters?
Why all the wrong? Usually because of a hunger for easy answers that you can’t get from chaotic, complicated systems. But that doesn’t stop Oprah — who must feed a daily show — or even scientists, whose careers are tied to making a splash in prestigious research journals.
These journals want the same sorts of exciting, useful findings that we all appreciate. And what do you know? Scientists manage to get these exciting findings, even when they're wrong or exaggerated. It's not as hard as you might think to get a desired but wrong result in a scientific study: gathering good data and analyzing it properly is tricky, leaving plenty of room for ambiguity and error, honest or otherwise. If you badly want to prove an experimental drug works, you can choose your patients very carefully, and find excuses for tossing out the data that looks bad. If you want to prove that dietary fat is good for you, or that fat is bad for you, you can just keep poring over different patient data until you find a connection that by luck seems to support your theory — which is why studies constantly seem to come to different findings on the same questions.
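The "keep poring over the data until something turns up" trap can be made concrete with a toy simulation. The sketch below (all numbers hypothetical, not drawn from any real study) generates a fake "fat intake" measure and 200 health "outcomes" that are pure random noise — by construction, none is truly related to fat intake. Yet if you test every outcome and keep whatever crosses the conventional significance cutoff, chance alone hands you several "findings":

```python
import math
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(42)  # fixed seed so the run is repeatable
n_patients = 100
fat_intake = [random.gauss(0, 1) for _ in range(n_patients)]

# 200 health "outcomes" of pure noise -- none is actually linked to fat intake.
spurious = []
for outcome_id in range(200):
    outcome = [random.gauss(0, 1) for _ in range(n_patients)]
    r = pearson_r(fat_intake, outcome)
    # With n = 100, |r| > 0.2 roughly matches the conventional p < 0.05 cutoff.
    if abs(r) > 0.2:
        spurious.append((outcome_id, r))

print(f"{len(spurious)} 'significant' links found among 200 pure-noise outcomes")
```

Each individual test has only about a 1-in-20 chance of a false alarm, but run 200 of them and a handful of alarms is the expected result — and those are exactly the ones that get written up.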
You might expect that other, more rigorous scientists would catch these sorts of shenanigans, but they often don't, and in fact the vast majority of published research is never even independently verified. And even when bad research is outed, hardly anyone notices — we've all long since moved on to the next exciting finding.
Not that there isn't a minority of expert advice that's good, and even critically important. Most people just don't know how to pick it out from the constant stream of flawed and conflicting findings — the housing market is recovering, the housing market is getting worse, video games deaden children's brains, video games boost rapid thinking.
That’s why much of the public has simply stopped listening to experts, and sometimes with potentially catastrophic results, as when parents don’t get their children recommended vaccines and treatments, or believe they can eat whatever they want, or invest their savings in whatever stocks seem exciting.
The rest of us often trust experts blindly, because we're programmed to do so practically from birth. Call it the "Wizard of Oz" effect: first with our parents, then our teachers, and then on to the authoritative voices in our textbooks and on TV news, we're brought up to believe there are always people whose knowledge and judgment should be trusted over our own. Experiments suggest that our brains' decision-making capabilities get put on hold when we're presented with what we think is expert advice, regardless of how bad the advice is.
Fortunately, just being aware of the extent to which even gold-plated expert advice tends to go wrong is a big first step towards being able to filter out the worst of it. So you’re already better off than you were a minute ago.
Trust me — I’m an expert on this subject.