The Great ‘CO2 is Rising’ Keeling Curve Fraud
Written by John O'Sullivan
Unpicking the massive climate fraud is akin to peeling back layer after layer of a rotting onion. One benchmark essential to establishing the fraud is the claimed persistent rise in atmospheric levels of carbon dioxide.
We are told current levels of atmospheric CO2 are ‘unprecedented’ and ‘dangerous’ at just over 400ppm. The scientist famed for pioneering a standardized global metric for such levels of CO2 is Dr Charles Keeling.
Herein, we expose some of the tomfoolery around the claims made about CO2 levels and Dr Keeling.
Keeling cherry-picked ONLY UPPER ATMOSPHERE CO2 levels, throwing out data recorded when winds came off the ocean from lower levels. By contrast, NASA has published data documenting that CO2 in the upper atmosphere cools the atmosphere.
Wikipedia, that organ for mass brainwashing, tells us that:
“The Keeling Curve is a graph of the accumulation of carbon dioxide in the Earth’s atmosphere based on continuous measurements taken at the Mauna Loa Observatory on the island of Hawaii from 1958 to the present day. The curve is named for the scientist Charles David Keeling, who started the monitoring program and supervised it until his death in 2005.”
See graph below:
On May 9, 2013, the daily mean concentration of CO2 in the atmosphere measured at Mauna Loa surpassed 400 parts per million. Alarmists claimed this was unprecedented and dangerous. Policymakers told us that if we paid higher taxes it would solve the ‘crisis’ and prevent a 2-degree temperature rise.
By what mechanism higher taxation relates to precise, systematic control of earth’s climate is never made clear. But we are dealing with half-wits, aren’t we?
The program Keeling started now monitors CO2 at 13 sites, from the South Pole to Alaska. The National Oceanic and Atmospheric Administration (NOAA) runs a larger network that overlaps with the Scripps system, relied on by governments and activists to make bold pronouncements about impending climate doom.
The junk science claim, which has poisoned academic reasoning for 30 years, is the mistaken notion that man-made CO2 will drive up global warming.
This far-fetched claim is premised entirely on half-baked 19th-century speculation. In recent decades it has been disproven by the flatlining of global temperatures since 1998, despite (official) atmospheric CO2 levels rising by 18 percent.
But so many interest groups and do-gooders have become wedded to this narrative that it has poisoned sensible public debate. So much so that skeptics are not only derided but mercilessly attacked, de-platformed and unjustly prosecuted in the courts.
The greenhouse gas theory, the cornerstone of the scientific claims that CO2 is our climate’s control knob, specifically states that a rise in carbon dioxide will cause a concomitant rise in temperatures. But Mother Nature has confounded that hypothesis by refusing to comply.
As such, the ‘theory’ should now be abandoned. Dispassionate, independent scientific research has already exposed many shenanigans woven into the 30-year doomsaying narrative. Not least, though often overlooked, is the nonsensical metric of taking a global ‘snapshot’ of atmospheric CO2 levels from right beside an active volcano at Mauna Loa in Hawaii.
Can you think of anything more unscientific than basing your entire claims for rising levels of CO2 on measurements taken right beside an active volcano spewing out millions of tons of carbon dioxide?
You don’t need to be a well-qualified scientist to see this is a scam built to create a massive system of taxation and control. The willfully stupid are being spoon fed by promoters of this fraud. The perpetrators are a poisonous combination of tax-hungry governments and shill academic researchers who rely upon governmental grant funding to keep their universities in business.
The greenhouse gas theory, debunked in detail in the important new book ‘The Sky Dragon Slayers Victory Lap‘, tells us more CO2 will cause higher temperatures. That is the essence of the theory. But that claim is debunked not just by flat-lining modern temperatures, but also by the paleoclimatic historical record.
As Canadian climatologist Dr Tim Ball wrote, there is growing proof that the major assumption of the hypothesis was wrong. Dr Ball tells us:
“A more detailed look showed that the temperature increased before CO2 in complete contradiction to the hypothesis. Somehow it was shuffled aside, probably because of the diversionary claim that the lag was between 80 and 800 years. It doesn’t matter because it still contradicts the basic assumption. More compelling is the fact that temperature changes before CO2 change in every record for any period or duration. Most curious, the contradiction is ignored by proponents and too many skeptics. Figure 2 shows a shorter record of the relationship.
Figure 2: Lag time for the short record, 1958 to 2009.
It is logical to assume that if CO2 change follows temperature change in every record then CO2 cannot be a greenhouse gas. Despite this, the assumption that a CO2 increase causes a temperature increase is in all global climate computer models including those of the IPCC.
The IPCC faced another serious challenge created by the need to prove their hypothesis, rather than disprove it as normal science requires. It paralleled the need to eliminate the Medieval Warm Period because it showed the world was warmer than today before the Industrial Revolution. It was necessary to show or claim that the pre-industrial level of CO2 was lower than today. This campaign was underway before the ice cores information was released.
Most people think ice cores are the only source of pre-industrial CO2 levels. What most people don’t know is that thousands of direct measurements of atmospheric CO2 began in the nineteenth century. Joseph Black had studied the properties of CO2 in the 1750s and Joseph Priestley published on oxygen in 1775. Attempts to measure the proportions of the various atmospheric gases, including global measurements of CO2, followed these events beginning in 1812. Scientists took precise measurements with calibrated instruments, as Ernst Beck thoroughly documented.
In a paper submitted to the Hearing before the US Senate Committee on Commerce, Science, and Transportation, Professor Zbigniew Jaworowski states:
“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false.”
Of equal importance, Jaworowski states:
“The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv (Figure 2). In Figure 2 encircled values show a biased selection of data used to demonstrate that in 19th century atmosphere the CO2 level was 292 ppmv. A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that 9400 years ago CO2 atmospheric level was 333 ppmv, and 9600 years ago 348 ppmv, falsify the concept of stabilized and low CO2 air concentration until the advent of industrial revolution.”
Jaworowski’s claim that the modelers ignored the 19th-century readings isn’t correct. They knew about them, because T.R. Wigley introduced the 19th-century readings to the climate science community (Wigley, T.M.L., 1983, “The pre-industrial carbon dioxide level.” Climatic Change 5, 315-320). He did what many others have done: take a wide range of readings, eliminate only the high ones, and claim the pre-industrial level was approximately 270 ppm. I suggest this is what influenced the modelers, because Wigley was working with them through the Climatic Research Unit (CRU) at East Anglia. He was the key person directing the machinations revealed by the leaked CRU emails.
Wigley was not the first to misuse the 19th-century data, but he did reintroduce it to the climate community. Guy Stewart Callendar, a British steam engineer, pushed the thesis that increasing CO2 was causing warming. He did what Wigley did: selecting only those readings that supported the hypothesis.
There are 90,000 samples from the 19th century and the graph shows those carefully selected by G. S. Callendar to achieve his estimate. It is clear he chose only low readings.
You can see changes that occur in the slope and trend by the selected data compared to the entire record.
Ernst-Georg Beck confirmed Jaworowski’s research. An article in Energy and Environment examined the readings in great detail and validated their findings. In a devastating conclusion, Beck states:
“Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.”
So the pre-industrial level is some 50 ppm higher than the level put into the computer models that produce all future climate predictions. The models also incorrectly assume uniform atmospheric global distribution and virtually no variability of CO2 from year to year.
Beck found, “Since 1812, the CO2 concentration in northern hemispheric air has fluctuated exhibiting three high level maxima around 1825, 1857 and 1942 the latter showing more than 400 ppm.” Here is a plot from Beck comparing 19th-century readings with ice core and Mauna Loa data.
Variability is extremely important because the ice core record shows an exceptionally smooth curve, achieved by applying a long-term 70-year smoothing average. Smoothing is also applied to the Mauna Loa and all current atmospheric readings, which can vary by up to 600 ppm in the course of a day, as the Mauna Loa portion of the curve in the diagram shows. Smoothing on the scale of the ice core record eliminates a great deal of information; it is why statistician William Briggs says you never, ever smooth a time series. Eliminating the high readings prior to smoothing makes the loss even greater. Beck explains how Charles Keeling established the Mauna Loa readings by using the lowest readings of the afternoon. He also ignored natural sources, a practice that continues. Beck presumes Keeling decided to avoid these low-level natural sources by establishing the station 4,000 m up the volcano. As Beck notes, “Mauna Loa does not represent the typical atmospheric CO2 on different global locations but is typical only for this volcano at a maritime location in about 4000 m altitude at that latitude.” (Beck, 2008, “50 Years of Continuous Measurement of CO2 on Mauna Loa,” Energy and Environment, Vol. 19, No. 7.)
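The information-destroying effect of a long smoothing window can be illustrated with a small sketch on purely synthetic data (the series below is invented for illustration and is not actual ice core or Mauna Loa data): a 70-year moving average collapses the year-to-year variability, leaving only the slow trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual CO2-like series: a slow trend plus large short-term swings.
# Values are invented for illustration only.
years = np.arange(1000)
trend = 280 + 0.02 * years
series = trend + 30 * np.sin(2 * np.pi * years / 11) + rng.normal(0, 10, years.size)

def moving_average(x, window):
    """Centered moving average; output is shorter by window - 1 points."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

smoothed = moving_average(series, 70)  # 70-year window, as cited for ice cores

print(f"raw std:      {series.std():.1f} ppm")
print(f"smoothed std: {smoothed.std():.1f} ppm")
```

Running this shows the standard deviation of the smoothed series is a fraction of the raw series: the decadal-scale swings are effectively erased, which is the point being made about comparing a 70-year-smoothed ice core curve against high-frequency modern measurements.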
Keeling’s son continues to operate the Mauna Loa facility and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” Since the younger Keeling is a co-author of the IPCC reports, they accept the version that Mauna Loa is representative of global readings and that these reflect an increase since pre-industrial levels.
The Historic CO2 Record.
Al Gore and others created the delusion that CO2 levels are at an all-time high. Here is a plot of CO2 levels for 600 million years based on geologic evidence. It shows the current level is the lowest in the entire record, equaled only by a period between 315 and 270 million years ago. For the last 300 million years, the average is between 1000 and 1200 ppm. For most of the record, the levels were above 1000 ppm and as high as 7000 ppm.
There are other problems with the ice core record. It takes years for air to be trapped in the ice, so what is actually trapped and measured? Meltwater moving through the ice, especially when the ice is close to the surface, can contaminate the bubbles. Bacteria form in the ice, releasing gases even in 500,000-year-old ice at considerable depth. (“Detection, Recovery, Isolation and Characterization of Bacteria in Glacial Ice and Lake Vostok Accretion Ice.” Brent C. Christner, 2002 Dissertation, Ohio State University.) Pressure from the overlying ice causes a change below about 50 m: the brittle ice becomes plastic and begins to flow. The layers formed with each year of snowfall gradually disappear with increasing compression, so it requires a considerable depth of ice, accumulated over a long period, to obtain a single reading at depth. Jaworowski also identifies problems with contamination and losses during the drilling and core-recovery process.
Another measurement of CO2 provides further evidence of the effects of smoothing and the artificially low readings of the ice cores. Stomata are the small openings on leaves, which vary directly with the amount of atmospheric CO2. A comparison of a stomata record with the ice core record for a 2000-year period illustrates the issue. Stomata respond to atmospheric CO2 directly, like the measurements discussed by Beck, unlike the trapped bubbles in ice cores, which take decades to form.
Stomata data show higher readings and greater variability than the excessively smoothed ice core record, and align quantitatively with the 19th-century measurements, as Jaworowski and Beck assert. The average level for the 2000-year ice core record shown is approximately 265 ppm, while it is at least 300 ppm for the stomata record.
The pre-industrial CO2 levels are marginally lower than current levels and likely within the error margin. Neither they nor the present readings are high relative to the geologic record. The entire output of computer climate models begins with the assumption that pre-industrial levels were measurably lower. Eliminating this assumption further undermines the claim that warming in the industrial era was due to human additions of CO2 to the atmosphere. Combine this with the assumption that CO2 causes temperature increase, when all records show the opposite, and it is not surprising that IPCC predictions of temperature increase are consistently wrong.
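The lead-lag question quoted from Dr Ball earlier (which series changes first) is, in principle, a simple statistical test. The following sketch uses purely synthetic data with an invented delay, not real climate records, to show how a cross-correlation scan recovers which series leads and by how many steps.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic illustration only: y is a delayed, noisy copy of x.
# The delay of 8 steps is invented for demonstration.
true_lag = 8
n = 500
driver = np.cumsum(rng.normal(0, 1, n + true_lag))  # smooth random-walk signal
x = driver[true_lag:]                   # the "leading" series
y = driver[:n] + rng.normal(0, 0.3, n)  # same signal delayed, plus noise

def best_lag(x, y, max_lag=30):
    """Return the delay k (in steps) at which y[t] best matches x[t - k]."""
    def corr(k):
        return np.corrcoef(x[:n - k], y[k:])[0, 1]
    return max(range(max_lag + 1), key=corr)

print(best_lag(x, y))  # recovers the built-in delay
```

On this synthetic data the scan recovers the built-in delay; applied to real records, the same idea only identifies correlation-maximizing lags and says nothing by itself about mechanism.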
This is crucial because most don’t realize the IPCC links scientific model outputs with economic model outputs to create the Special Report on Emissions Scenarios (SRES). Many severely criticized these, including participant David Henderson, who said, “My main theme is what I see as the uncritical and over-presumptive way in which these various sources have dealt with the scientific aspects of the subject.” They are circular arguments that predetermine results, which explains why they are always wrong.
Read more on the CO2 fraud by Dr Ball in his article ‘CO2 Data Manipulation.‘
PRINCIPIA SCIENTIFIC INTERNATIONAL, legally registered in the UK as a company incorporated for charitable purposes. Head Office: 27 Old Gloucester Street, London WC1N 3AX.