The pattern of falsifying appearances began early. Although he works at the National Center for Atmospheric Research (NCAR), Stephen Schneider was heavily involved in the work of the IPCC, as this biography notes:
Much of Schneider’s time is taken up by what he calls his “pro bono day job” for the Intergovernmental Panel on Climate Change (IPCC). He was a Coordinating Lead Author in Working Group II of the IPCC from 1997 to 2001 and a lead author in Working Group I from 1994 to 1996. Currently, he is a Coordinating Lead Author for the controversial chapter on “Assessing Key Vulnerabilities and the Risks from Climate Change,” in short, defining “dangerous” climate change. - Pubmedcentral.nih.gov
He continued this work by helping prepare the Summary for Policymakers (SPM) of the Fourth Assessment Report (AR4) released in April 2007.
Schneider, among others, created the appearance that the Summary was representative of the Science Report. However, he provides an early insight into the thinking when speaking about global warming.
The Summary for Policymakers is designed to convince everyone that global warming is due to human production of CO2. In the SPM of AR4, issued in April 2007, they say, “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.” The term “very likely” is from a table reportedly produced by Schneider and means greater than 90%. Professor Roy Spencer says this about probabilities in this context: “Any statements of probability are meaningless and misleading. I think the IPCC made a big mistake. They’re pandering to the public not understanding probabilities. When they say 90 percent, they make it sound like they’ve come up with some kind of objective, independent, quantitative way of estimating probabilities related to this stuff. It isn’t. All it is is a statement of faith.”
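Spencer’s point can be made concrete. The AR4 likelihood language is a calibrated vocabulary: each phrase corresponds to an assessed probability threshold. A minimal sketch of that lookup (thresholds taken from the AR4 uncertainty guidance; the function and variable names are mine, for illustration only):

```python
# Sketch of the AR4 likelihood scale: each calibrated phrase maps to a
# probability threshold. Thresholds are from the AR4 uncertainty guidance;
# the code itself is illustrative, not an IPCC artifact.
AR4_LIKELIHOOD = [
    ("virtually certain", 0.99),
    ("extremely likely", 0.95),
    ("very likely", 0.90),
    ("likely", 0.66),
    ("more likely than not", 0.50),
]

def likelihood_term(p):
    """Return the AR4 phrase for an assessed probability p (0..1)."""
    for term, threshold in AR4_LIKELIHOOD:
        if p > threshold:
            return term
    return "about as likely as not"

print(likelihood_term(0.92))  # "very likely"
```

The lookup converts a subjectively assessed number into an official-sounding phrase; nothing in the mapping makes the underlying “greater than 90%” any less a judgment call, which is precisely Spencer’s objection.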
So they create an appearance of certainty about a human cause of warming. But what is the reality? The only place where CO2 is causing temperature increase is in the IPCC computer models. In every record of any duration for any time period in the history of the Earth, temperature increase precedes CO2 increase. So an incorrect assumption that a CO2 increase will cause temperature increase is built into the computer models. That is damaging enough, but the computer models themselves are completely inadequate to represent global climate or make any predictions about future climate. But don’t believe me. The IPCC Technical Report (“The Physical Science Basis”), produced by Working Group I and released in November 2007, says so.
Problems begin with the definition of climate change used, because it requires that they consider only human causes. Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC) defines climate change as “a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods.” But you cannot determine the human portion unless you understand natural climate change. As Professor Roy Spencer said in his testimony before the US Senate EPW Committee, “And given that virtually no research into possible natural explanations for global warming has been performed, it is time for scientific objectivity and integrity to be restored to the field of global warming research.”
The media and public are allowed to believe that the IPCC makes climate predictions, but it does not. The First Assessment Report (Climate Change 1992) said, “Scenarios are not predictions of the future and should not be used as such.” The Special Report on Emissions Scenarios says, “Scenarios are images of the future or alternative futures. They are neither predictions nor forecasts.” Climate Change 2001 continues the warnings: “The possibility that any single emissions path will occur as described in this scenario is highly uncertain.” In the same Report they say, “No judgment is offered in this report as to the preference for any of the scenarios and they are not assigned probabilities of occurrence, neither must they be interpreted as policy recommendations.” This is a reference to the range of scenarios they produce using different possible future economic conditions. Of course, they didn’t build in the recent financial collapse.
Climate Change 2001 substitutes the word projection for prediction. Projection is defined as follows: “A projection is a potential future evolution of a quantity or set of quantities, often computed with the help of a model. Projections are distinguished from predictions in order to emphasise that projections involve assumptions concerning e.g. future socio-economic and technological developments that may or may not be realised and are therefore subject to substantial uncertainty.”
This and similar statements are based on the unproven hypothesis that human-produced CO2 is causing warming and/or climate change. The evidence is based solely on the output of 18 computer climate models selected by the IPCC. There are a multitude of problems, including the fact that every time they run the models they produce different results, so they use an average of all the runs. The IPCC then takes the averaged results of the 18 models and averages them again for the results in its Reports.
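The two-stage averaging described above can be sketched as follows (all numbers are invented for illustration; the IPCC’s actual ensemble statistics are more elaborate):

```python
import statistics

# Hypothetical illustration of two-stage ensemble averaging: each model is
# run several times (each run gives a different result), the runs are
# averaged per model, then the per-model means are averaged into a single
# multi-model mean. Model names and values are invented.
model_runs = {
    "model_a": [2.1, 2.4, 1.9],  # e.g. projected warming (deg C) per run
    "model_b": [3.0, 2.8],
    "model_c": [1.5, 1.7, 1.6],
}

# Stage 1: average the runs of each model.
per_model_mean = {m: statistics.mean(runs) for m, runs in model_runs.items()}

# Stage 2: average the per-model means.
multi_model_mean = statistics.mean(per_model_mean.values())
```

Note that the final number hides both the run-to-run spread within each model and the disagreement between models; the single multi-model mean is what reaches the Reports.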
Tim Palmer, a leading climate modeler at the European Centre for Medium-Range Weather Forecasts, said, “I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.” This comment is partly explained by the scale of the General Circulation Models (GCM). The models are mathematical constructs that divide the world into rectangles, and the size of the rectangles is critical to the abilities of the models, as the IPCC AR4 acknowledges: “Computational constraints restrict the resolution that is possible in the discretized equations, and some representation of the large-scale impacts of unresolved processes is required (the parametrization problem).” (AR4, Chapter 8, p. 596.)
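A back-of-envelope calculation shows why resolution matters: halving the cell size quadruples the number of surface cells (and time-step constraints raise the cost further), so any process smaller than one cell, an individual cloud for instance, cannot be resolved and must be parametrized. A rough sketch, assuming square cells and a round figure for Earth’s surface area:

```python
import math

# Rough illustration of the resolution constraint quoted above. A GCM
# covers the globe with a grid; anything smaller than one cell cannot be
# resolved directly and must be "parametrized". Figures are back-of-envelope.
EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of Earth

def cells_at_resolution(cell_side_km):
    """Approximate number of surface grid cells for square cells."""
    return math.ceil(EARTH_SURFACE_KM2 / cell_side_km**2)

print(cells_at_resolution(250))  # 8,160 cells at a coarse 250 km grid
print(cells_at_resolution(25))   # 816,000 cells at 25 km: 100x as many
```

The hundredfold jump in cell count for a tenfold improvement in resolution (before even counting vertical layers and shorter time steps) is the “computational constraint” the AR4 passage refers to.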
The IPCC uses surface weather data, which is inadequate in both space and time over most of the world for building an accurate model. The limitations of the surface data are exceeded only by an almost complete lack of information above the surface. An illustration of the surface problem appears in the IPCC’s own comment on the difficulties of modeling Arctic climates.
“Despite advances since the TAR, substantial uncertainty remains in the magnitude of cryospheric feedbacks within AOGCMs. This contributes to a spread of modelled climate response, particularly at high latitudes. At the global scale, the surface albedo feedback is positive in all the models, and varies between models much less than cloud feedbacks. Understanding and evaluating sea ice feedbacks is complicated by the strong coupling to polar cloud processes and ocean heat and freshwater transport. Scarcity of observations in polar regions also hampers evaluation.” (AR4, Chapter 8, p. 593.) Most of the information for the Arctic came from the Arctic Climate Impact Assessment (ACIA), and a diagram from that report illustrates the problem.