Friday, July 24, 2009

Group-think, Scientific Fraud and Big Mistakes

John Derbyshire reviews a book detailing elaborate scientific fraud by a scientist at Bell Labs. I've done quite a bit of reading over the years about scientific fraud and the way not only managers but also entire university faculties can close ranks to protect a person accused of fraud. Meanwhile, the person who uncovers the fraud becomes the target of virulent attacks. This also happens in organizations with no connection to scientific research when someone reports malfeasance. We have personal experience with this.

In the case described in the book, an internal investigation was conducted, and there is no mention of serious retaliation against those who reported misgivings or fraud. Particularly in organizations dependent on certain research for income (including government grant money), those who find evidence of fraud need to be very careful in order to avoid having their careers destroyed and their lives ruined. A few rules of thumb from advocates for whistleblowers:
1. An accusation against a superior almost always fails.
2. Leave the organization and get another job before reporting fraud.
3. Warn family and friends that you intend to report fraud, and enlist their support beforehand.
4. Go directly to the media or outside authorities. Do not risk an internal report.
5. You must have hard proof before reporting fraud.
6. Your own actions must be completely above reproach and meticulously documented.
It is also important to understand the organizational culture in order to gauge how safe it is to report fraud, malfeasance, or even mistakes. How have others fared after uncovering malfeasance? Do fears that outsiders will eventually discover fraud or mistakes make discovery inside the organization welcome?

There are psychological reasons why people will do unethical things at work which they would never do in their personal lives. Included among these are pressures to conform, the tendency to trust authority figures before trusting people with less authority, and reluctance to believe that scientific fraud or even big, dumb mistakes could actually take place in "our" organization. There's also a kind of group wishful thinking that can sometimes take hold.

Fraud can be minimized within an organization by an atmosphere of openness. When I worked at a pharmaceutical company, whenever a new clinical trial was finished, a group of about 40 people from several departments would sit around a huge table going over the final report line by line, asking questions of the trial coordinators and the statisticians. Everyone in the company with a job connected in any way with science (from those with only a general scientific background to those with the highest medical and scientific degrees) also participated in reviews of raw data. Procedures like these send a signal that it would be difficult to slip fraud through the system. They also lessen the risk that "group-think" among those in a single department will lead everyone to unconsciously overlook a glaring problem. Sometimes someone with less specialized knowledge looks at the data from a different perspective and can ask a "dumb" question that reveals a potential problem.

I was at the table once when this happened: after a study was completed, someone asked about a deficiency in the design of the trial. The expensive study turned out to be much less useful than it could have been. Depressing, but it was better that we found the problem before the FDA did. If it had been fraud, it would have been even more critical to uncover it in-house.

Not all organizations engage in such rigorous reviews of scientific data. There are reasons to distrust scientific data beyond Post-Normal Science. However, the acceptance of "Post-Normal Science" as a "legitimate" way to approach scientific data reinforces the tendency to "overlook" fraud or mistakes in the design or execution of studies. It also confers legitimacy on those who punish scientists who don't go along with the consensus. In the case of global warming, suppressing data that didn't fit the desired political result was widely accepted as a good thing. It still is in some places, like the EPA. Dissenting scientists were demonized unmercifully a few years ago, though that punishing atmosphere has recently started to turn around. It has now been more than ten years since the hottest year in recent history, 1998, and it is getting harder to justify sensationalizing worst-case scenarios in order to push political changes. Finally, some dissenting scientists are being taken seriously.

Related: The powerful psychological drives favoring consensus and group-think can lead to disastrous mistakes with very serious consequences. There is a theory that the New York Times hired John Tierney to write the practical TierneyLab column in preparation for the changing "consensus" on global warming. Already, "global warming" has become "climate change" in information produced by advocates of controlling carbon emissions through government action. But the column covers other issues, too.

Concerning the way one researcher censored himself so as not to stray too far from the consensus about the possibility of the housing bubble that recently burst, Nicholas Wade writes in the TierneyLab column:
If the brightest minds on Wall Street got suckered by group-think into believing house prices would never fall, what other policies founded on consensus wisdom could be waiting to come unraveled? Global warming, you say? You mean it might be harder to model climate change 20 years ahead than house prices 5 years ahead? Surely not – how could so many climatologists be wrong?

What’s wrong with consensuses is not the establishment of a majority view, which is necessary and legitimate, but the silencing of skeptics. “We still have whole domains we can’t talk about,” Dr. Bouchard said, referring to the psychology of differences between races and sexes.
Update: Still fighting to keep temperature data and climate study methodology secret from those who might find some problems.
