Computer models may have really messed up on predictions

Funny how global warming advocates are so certain about their computer models. There is also so much trust put in government regulators. Well, perhaps they should read this.

Flawed computer models may have exaggerated the effects of an Icelandic volcano eruption that has grounded tens of thousands of flights, stranded hundreds of thousands of passengers and cost businesses hundreds of millions of euros.

The computer models that guided decisions to impose a no-fly zone across most of Europe in recent days are based on incomplete science and limited data, according to European officials. As a result, they may have over-stated the risks to the public, needlessly grounding flights and damaging businesses.

“It is a black box in certain areas,” Matthias Ruete, the EU’s director-general for mobility and transport, said on Monday, noting that many of the assumptions in the computer models were not backed by scientific evidence.

European authorities were not sure about scientific questions, such as what concentration of ash was hazardous for jet engines, or at what rate ash fell from the sky, Mr Ruete said. “It’s one of the elements where, as far as I know, we’re not quite clear about it,” he admitted.

He also noted that early results of the 40-odd test flights conducted over the weekend by European airlines, such as KLM and Air France, suggested that the risk was less than the computer models had indicated.

The acknowledgement that the computer models were flawed is likely to provide ammunition for critics who believe that authorities have shown excessive caution. The closure of much of the airspace over Europe over the past five days is estimated to have cost airlines a total of $200m a day in lost revenue. . . .



Blogger Al B. said...

There is a qualitative difference between the computer models that the global warming alarmists use and the ones that were used in this case. In this case, the models were testable, and turned out to be wrong. In the global warming case, the models cannot be verified, because they predict conditions up to 100 years from now. In this sense, they bear less relationship to science than they do to cold-reading.

Climatologists attempt to verify their GCMs by running them against historical data, which only goes back 100 years or so before crossing over into the realm of proxies, whose accuracy can be highly dubious, e.g., the infamous ‘hockey stick’ curve. They then tweak their models to more accurately predict the past. But, as Nassim Nicholas Taleb points out, you can curve-fit virtually any dataset, regardless of how random the data is. Whether the resulting model has any predictive value can only be established by predicting future results.
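To make the curve-fitting point concrete, here is a minimal sketch in Python with NumPy (the data and variable names are my own invention, not anything from an actual GCM): a high-degree polynomial can be "tweaked" to reproduce a purely random history almost perfectly, yet it tells you nothing about the future drawn from the same process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "historical record": 20 observations of pure noise,
# with the time axis scaled to [-1, 1] to keep the fit well-conditioned.
x_hist = np.linspace(-1.0, 1.0, 20)
y_hist = rng.normal(size=20)

# Tweak a degree-15 polynomial until it "predicts the past" very closely.
coeffs = np.polyfit(x_hist, y_hist, deg=15)
hindcast_mse = np.mean((np.polyval(coeffs, x_hist) - y_hist) ** 2)

# Now "predict the future": new noise from the same process, at later times.
x_future = np.linspace(1.1, 2.0, 10)
y_future = rng.normal(size=10)
forecast_mse = np.mean((np.polyval(coeffs, x_future) - y_future) ** 2)

# The hindcast error is tiny; the forecast error is far larger, because
# the model fit the noise, not any underlying signal.
print(hindcast_mse, forecast_mse)
```

Running this shows the in-sample (hindcast) error dwarfed by the out-of-sample (forecast) error, even though the data were random in both periods. That gap is exactly what hindcast-only validation cannot detect.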

4/20/2010 11:50 AM  
