3 thoughts on “Climate Models”

  1. Roy Clark is neither a climate scientist nor a meteorologist by trade or skill set, though perhaps since retirement he has become better educated in these areas.

    Neither am I. However, I've been following the subject for nearly three decades, and I was surprised that Clark fails to mention that even the surface temperature record is flawed by the inclusion of weather stations subject to the urban heat island (UHI) effect. This thoroughly contaminates the observational surface temperature record.

    To this day, NOAA and other groups refuse to exclude these contaminated stations; instead, they apply adjustments to the datasets that supposedly compensate for the induced errors.

    I consider this completely bogus.

    Contaminated stations should not be part of the data record when attempting to measure climate. Nor should the GCMs be tuned to match this flawed dataset. It’s no wonder to me that they continue to run hot.

    Tuning them using flawed observational data cannot make them better.
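
    To make the point concrete, here is a toy sketch in Python using entirely made-up numbers, not real station data: a modest background warming is assumed, four "rural" stations see only that plus noise, and one "urban" station adds a spurious UHI trend on top.

    ```python
    # Toy illustration only: synthetic numbers, not real station data.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1970, 2020)

    def trend_per_decade(series):
        # Ordinary least-squares slope, expressed in deg C per decade.
        return 10 * np.polyfit(years, series, 1)[0]

    background = 0.01 * (years - years[0])            # assumed 0.10 C/decade background warming
    rural = [background + rng.normal(0, 0.15, years.size) for _ in range(4)]
    urban = (background
             + 0.03 * (years - years[0])              # spurious 0.30 C/decade from urban growth
             + rng.normal(0, 0.15, years.size))

    print("rural-only network trend:   %.2f C/decade" % trend_per_decade(np.mean(rural, axis=0)))
    print("network with urban station: %.2f C/decade" % trend_per_decade(np.mean(rural + [urban], axis=0)))
    ```

    Averaging in the single contaminated station visibly inflates the network trend; the question raised above is whether post-hoc adjustments can reliably remove that bias, or whether such stations should simply be excluded.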

    I will not get into debating the physics behind the GCMs. I am not qualified to do that.

  2. Many, many years ago I was employed doing upgrades and warranty work for Dell. I was sent to a "famous" college here in the Midwest to upgrade a computer. I came armed with a memory Christmas tree and a pile of the fastest, largest memory chips money could buy. The professor explained that all he had to do was add the "magic" words "global warming" to the proposal and the coffers overflowed.

  3. There is another aspect of computer climate modeling that has been completely overlooked. It begins with the chaotic nature of the coupled, non-linear differential equations that form climate models. Edward Lorenz famously discovered that computer solutions of such equations yield completely different results from changes in initial conditions so small that they would be immeasurable in the real world. That makes future climates impossible to predict, even in principle, a fact which was trumpeted by the IPCC in its first report and then conveniently forgotten by everyone.
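
    To make Lorenz's discovery concrete, here is a minimal Python sketch of his own three-equation system, integrated with a crude fixed-step Euler method (a toy, not a production solver). The two runs differ only in one initial value, by one part in a trillion.

    ```python
    # Sensitivity to initial conditions in the Lorenz system (toy Euler integration).

    def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return x + dx * dt, y + dy * dt, z + dz * dt

    def run(x0, y0, z0, steps=5000):
        x, y, z = x0, y0, z0
        for _ in range(steps):
            x, y, z = lorenz_step(x, y, z)
        return x, y, z

    print("run A ends at:", run(1.0, 1.0, 1.0))
    print("run B ends at:", run(1.0, 1.0, 1.0 + 1e-12))   # perturbation far below any real measurement
    ```

    After 50 model time units the two end states bear no resemblance to each other, even though the starting points differed by an amount no thermometer could ever resolve.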

    The thing everyone overlooks is that computers sometimes have bit errors, either while moving bits around or as bit toggles in memory locations (sometimes due to single-event upsets produced by cosmic rays). Bit error detection started with parity checking, but a parity bit only detects odd numbers of flipped bits and corrects nothing. Error detection and correction codes have since become highly developed and can, in theory, handle all bit errors, but only if they are fully implemented. In a supercomputer running a 100-year climate projection at petaflop speeds, the overhead of the best known error detection/correction codes would make the run time exceed the time being simulated. Not a very useful thing.
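
    A few lines of Python show why a lone parity bit is such a weak guard: it flags an odd number of flipped bits, is blind to an even number, and cannot say which bit flipped, so it corrects nothing.

    ```python
    # Even parity: detects an odd number of bit flips, misses an even number, corrects none.

    def parity(bits):
        return sum(bits) % 2

    word = [1, 0, 1, 1, 0, 1, 0, 0]
    stored_parity = parity(word)

    one_flip = list(word)
    one_flip[3] ^= 1                                  # a single-event upset flips one bit
    print(parity(one_flip) != stored_parity)          # True  -> detected, but not correctable

    two_flips = list(word)
    two_flips[3] ^= 1
    two_flips[6] ^= 1                                 # two bits flip in the same word
    print(parity(two_flips) != stored_parity)         # False -> the error passes unnoticed
    ```

    Stronger codes (SECDED ECC, for example) correct a single flipped bit and detect a double flip per word, but, as noted above, protecting every bit movement end to end is where the overhead comes in.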

    The undetected bit error rate in such supercomputers has been studied and found to be much higher than expected. High enough, in fact, to ensure that computer runs as gigantic as 100-year climate projections would not be repeatable.

    The most honest thing climate modelers could do is run at least seven 100-year climate simulations on exactly the same computer, using exactly the same code and exactly the same initial-condition load, and compare the outputs of those runs. I would bet my life that at least one run would differ completely from all the rest, and I'd bet $10,000 that no two of the runs would give the same results at all, meaning that climate models are really just hugely expensive random number generators. A toy version of that comparison is sketched below.
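
    Here is a scaled-down, purely illustrative version of that experiment in Python. A chaotic logistic map stands in for a long simulation, and a single low-order bit of one floating-point value is flipped once per run, at a random step, to mimic an undetected memory error; the helper names and parameters are invented for the sketch.

    ```python
    # Toy repeatability test: inject one undetected bit flip per "run" and compare outcomes.
    import random
    import struct

    def flip_low_bit(x):
        # Flip the least-significant mantissa bit of a 64-bit float.
        (bits,) = struct.unpack("<Q", struct.pack("<d", x))
        (flipped,) = struct.unpack("<d", struct.pack("<Q", bits ^ 1))
        return flipped

    def run(steps=2000, flip_at=None, x=0.3, r=3.9):
        # Chaotic logistic map as a stand-in for a long, iterative simulation.
        for i in range(steps):
            if i == flip_at:
                x = flip_low_bit(x)
            x = r * x * (1.0 - x)
        return x

    random.seed(0)
    reference = run()
    for k in range(7):
        corrupted = run(flip_at=random.randrange(1000))
        print(f"run {k}: reference = {reference:.6f}, corrupted = {corrupted:.6f}")
    ```

    Every corrupted run ends somewhere different from the clean one, even though only the last bit of one number was ever touched; a real climate model is vastly larger, so this shows the mechanism, not the magnitude.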
