Proven Negative Water Feedback Means CO2 Climate Impact Irrelevant

Written by Carl Brehmer

There is an ongoing scientific debate over whether carbon dioxide causes atmospheric warming or atmospheric cooling. The debate continues because no scientific study has yet been devised to directly measure the effect of carbon dioxide on the temperature of the atmosphere.

The current real-world experiment being tested is this: “What effect will doubling the carbon dioxide concentration in the atmosphere from pre-industrial times have on the mean global temperature?” Nobody knows, because it hasn’t happened yet and won’t for another 70 years or so, when carbon dioxide is expected to reach ~560 ppm, or 0.056% of the atmosphere.

At that time the global mean temperature will be compared to the pre-industrial global mean temperature and we will have one data point. Of course, no competent scientist would ever draw a definitive conclusion from just one data point. Beyond that, not only will those who are currently debating this question be long dead by then, but the scientists of that day will still not know, because the global mean temperature fluctuates naturally. A second data point would be helpful, but a redoubling of atmospheric carbon dioxide to 1120 ppm would take centuries to occur, if it occurs at all.

The same is not true for water vapor – the “most potent greenhouse gas.” The amount of humidity present in various climates around the world, and even within the same climate from one day to the next, is variable enough to measure its effect on the temperature of that climate, which I have done in a number of studies using real-world, open-air data. I have observed that in climates where ample ground moisture is present, the absolute humidity in g/kg goes up and down with the temperature; but where ground moisture is scarce, as in a desert or during a drought, the mean temperature is higher in the drier climate – contrary to the “greenhouse effect” hypothesis.

Effect of Humidity on City Temperatures

Here is a sample. I downloaded a month’s worth of temperature and humidity readings for four cities, which are typically fairly dry, from the National Weather Service, separated the more humid days from the less humid days, and averaged the temperature readings. In each case the mean temperature of the drier days was higher than that of the wetter days – in this study by 2–4 °C.
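The day-splitting procedure described above can be sketched in a few lines of Python. The readings below are hypothetical stand-ins; the article’s actual four-city data is not reproduced here:

```python
from statistics import mean, median

def dry_wet_comparison(readings):
    """Split (temp_C, humidity_gkg) readings at the median humidity and
    return the mean temperature of the drier and wetter halves."""
    h_med = median(h for _, h in readings)
    dry = [t for t, h in readings if h < h_med]
    wet = [t for t, h in readings if h >= h_med]
    return mean(dry), mean(wet)

# Hypothetical month of daily readings (temperature in C, humidity in g/kg)
readings = [(38, 4), (36, 5), (33, 9), (31, 11), (37, 3), (32, 10)]
dry_mean, wet_mean = dry_wet_comparison(readings)
print(dry_mean - wet_mean)  # a positive value means the drier days were warmer
```

With real station data, the same split could of course be done per day rather than per reading.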


“Power In” is NOT Equal to “Power Out”

Written by Joseph E Postma

I keep seeing, from alarmists, warmists, and luke-warmists alike, the initiating assumption that in order to conserve energy you set the power input equal to the power output.  In other words:

Power In = Power Out

Haven’t these people heard of entropy?  The fact that, for essentially NOTHING in the universe, power in = power out is learned in high school or even well before that.

So who are the people that claim that power in = power out, in direct and most basic violation of thermodynamics?  Can you really be a physicist while claiming that power in = power out, at 100% efficiency?  Nothing is 100% efficient, because of our friend entropy – no matter how efficiently you try to get work out of a system, you can never get as much power out as you put in; there are always losses.

So there’s that, and of course, why else is power in NOT EQUAL TO power out?  Power in is not equal to power out because the energy which constitutes those powers does not come from the same surface area.  For Earth, ‘power out’ does not equal ‘power in’ because the power gets put in on only half the planet, while the ‘power out’ comes from the whole planet.

There’s twice as much surface area from which power can come out as there is area on which power comes in, and so, if the flux out equalled the flux in, there would be twice as much energy coming out as comes in.  Equating flux will in general lead to a basic violation of conservation of energy.  Equating flux, in general, is not the correct way to conserve energy.
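The area argument above is simple arithmetic, and can be checked directly. This sketch uses 480 W/m² purely as an illustrative average input flux over the lit hemisphere:

```python
import math

R = 6.371e6                    # Earth radius, meters
A_in = 2 * math.pi * R**2      # day-side hemisphere receiving sunlight
A_out = 4 * math.pi * R**2     # whole sphere emitting to space

F = 480.0                      # illustrative average input flux, W/m2
E_in = F * A_in                # energy received per second, Joules
E_out_if_flux_equated = F * A_out  # energy out per second if flux out = flux in

print(E_out_if_flux_equated / E_in)  # 2.0: twice the energy out as came in
```

The ratio is exactly 2 regardless of the flux value chosen, since it is just the ratio of the two areas.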

I mean this is all very basic stuff, which I’ve written on extensively already.  The Earth is not flat, Sunshine is not cold, conserving flux is not the same thing as conserving energy, etc.

And that latter point seems to be the source of all the climate confusion, among all participants in the debate.  Only I and other people at PSI (Principia Scientific International, i.e. “the Slayers”) seem to be stating the factual, traditional-science case: that power is not the same thing as energy, that flux can’t be averaged, that real-time differential heat-flow equations are the only true solution for heat flow and temperature, etc.


UN Climate Scientists Plead for Immunity from Criminal Prosecution: AR5 in Crisis

Written by

The UN’s Intergovernmental Panel on Climate Change (IPCC) published its Fifth Assessment Report (AR5) this week and already it is in crisis, with accusations of malfeasance and pleas from climatologists for immunity from prosecution.

A critical backlash against AR5’s “junk science” is now in full swing, and policymakers in Britain and Australia are already in full retreat from the travesty. The ongoing collapse of the UN climate cabal’s credibility puts a fresh light on why climatologists got in early with their formal request for immunity from prosecution at the Rio de Janeiro UN climate summit of 2012.

Today, prominent statistician Steve McIntyre, one of the analysts often credited with exposing past IPCC ‘errors,’ points to why this fiasco may rise to the level of criminality. McIntyre shows how UN officials systematically hid adverse data that was contained in the final draft sent to reviewers but omitted from the report subsequently issued to the public. The world hasn’t seen this kind of orchestrated, institutionalized deceit since the world banking crisis of 2008.

The astonishing plea by the world’s climatologists for immunity from prosecution was first reported last year when it surfaced embarrassingly during the Rio summit. At the time John Bolton, a former U.S. Ambassador to the UN, was quick to question the motives: “The creeping expansion of claims for privileges and immunities protection for UN activities is symptomatic of a larger problem.”

This week, in ‘IPCC: Fixing the Facts’ McIntyre identifies the evidence that proves how UN authors cynically removed from their final report facts that contradicted the propaganda set out in the Summary for Policymakers (SPM) issued only last Friday. Such stark evidence reveals that climatologists failed to predict the flatlining of temperatures in recent decades.

McIntyre observes:

“Figure 1.4 of the Second Order Draft clearly showed the discrepancy between models and observations, though IPCC’s covering text reported otherwise. I discussed this in a post leading up to the IPCC Report, citing Ross McKitrick’s article in National Post and Reiner Grundmann’s post at Klimazwiebel. Needless to say, this diagram did not survive. Instead, IPCC replaced the damning (but accurate) diagram with a new diagram in which the inconsistency has been disappeared.”


National Geographic – Fooled into Disgracing Itself

Written by Nils-Axel Mörner

The September issue of National Geographic was devoted to the idea that we are facing disastrous flooding in the near future. They had the bad taste to illustrate this with a picture of the Statue of Liberty with the sea reaching up to her waist, some 70 m above the present sea level. This is a complete misconception of what is physically possible in nature.

The firm scientific facts that fully dismiss all such flooding ideas were presented by Professor Don J. Easterbrook last week on WUWT, and I don’t need to add to them.

There is another side to this tragedy, and that is the question of how, and on what grounds, a top magazine can be fooled into disgracing itself so badly. The IPCC and its supporting boy-scouts seem to have totally lost contact with reality in their claims of sea level rise and disastrous flooding of low-lying islands and coastal areas.

Claims of a sea level rise by 2100 on the order of 1–2 m or more are simply impossible, because they would overturn all the knowledge and all the observational facts we have accumulated over the entire history of scientific investigation.

In the article in National Geographic references were given to three scientists who were said to be responsible for the “facts” presented. Those persons are:

Philippe Huybrechts (Vrije Universiteit Brussel, Belgium)

Richard S. Williams Jr (Woods Hole Research Centre, US)

James C. Zachos (University of California, US)

They should all know better than to allow the falsification of facts and the discarding of all accumulated knowledge in geology and physics.


What is Energy?

Written by Joseph E. Postma

Not “Watt is energy.”  In physics – and it should be so everywhere else in anything calling itself science – what is the unit of energy?  The unit of energy has a name: the Joule, after English physicist James Prescott Joule.  The Joule is the unit of energy in science.  There are other equivalent metrics for energy, such as “ergs” or “electron volts,” but they are all equivalent to a certain number of Joules.

Watts, on the other hand, are a unit of flux – in particular, the temporal flux of Joules, meaning the number of Joules being “used” or “passing by” in one second.  The fundamental definition and unit of a Watt is a Joule per second: W = J/s, where the letters abbreviate the relevant quantities.  So one Watt is one Joule of energy used in one second.  We call this flux.

When we get to radiation, or light, and the measure of its strength, these are measured in Watts per square meter, meaning Joules per second per square meter, and this is called flux density.  It is a number of Joules, being used each second, over the area of a square meter: W/m² = J/s/m².  These are the units of the Stefan-Boltzmann equation, which is the single equation that exists for converting radiation, or light, into temperature.  What I mean by that is that the equation tells you the temperature of the light given its intensity, or conversely, the intensity of the light given its temperature.  The equation tells you that light has a direct equivalence to temperature, just as mass has a direct equivalence to energy.  The latter equation is Einstein’s E = mc², which shows that mass has an equivalence to energy.  Likewise, radiation has an equivalence to temperature via the Stefan-Boltzmann equation, F = σT⁴, where F is the flux density of the radiation, σ is a constant, and T is the temperature.

So then what’s wrong with the IPCC energy budget?  Let’s have a look at it again (see diagram):

IPCC energy budget with backradiation

What they’re doing to get this thing to “work” is adding together the flux densities of light.  Given that the Stefan-Boltzmann equation shows us that light flux density has an equivalence to temperature, what this diagram is doing is adding temperatures together to make it work.  When it adds 168 J/s/m² from sunlight to 324 J/s/m² from the atmosphere, it is saying that sunlight is −40°C and that the atmosphere is +1.8°C (because those are the equivalent temperatures of those light flux densities), and that if you add together something that is −40°C and something that is +1.8°C, you get +15°C. Not just that – the diagram tries to say that air is hotter than sunlight!
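The temperature equivalents quoted here follow from inverting the Stefan-Boltzmann relation, F = σT⁴. A minimal Python check of the three flux densities that appear in the diagram discussed above:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux_to_celsius(flux):
    """Equivalent blackbody temperature of a radiation flux density."""
    return (flux / SIGMA) ** 0.25 - 273.15

# sunlight, atmospheric backradiation, and surface values (W/m2)
for flux in (168, 324, 390):
    print(flux, round(flux_to_celsius(flux), 1))
# gives roughly -40, +1.8 and +15 degrees C respectively
```

Note that flux-to-temperature conversion is nonlinear (a fourth root), which is exactly why summing flux densities is not the same as summing their equivalent temperatures.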


What is the Standard Model Greenhouse Gas Theory?

Written by

In this short article we identify what is meant by the ‘Standard Model’ greenhouse gas theory. We see that climatologists have never formulated an agreed theory of the supposed greenhouse gas effect. Ironically, it took critics of the consensus to finally put together, from all the disparate models, a single unified version that could be practically examined and refuted.

So why is this now such an issue?

Frankly, more and more scientists are now beginning to accept that the so-called greenhouse gas theory of global warming appears to be in trouble. Despite atmospheric levels of carbon dioxide (CO2) rising more than 40 percent in recent decades, global temperatures have stubbornly remained flat for more than 15 years.

In a recent article on Principia Scientific International (PSI) independent analysts highlighted long-overlooked flaws. Among the flaws is the absence of ANY internationally agreed standard model of the GHE.

The closest to an ‘official’ version would be that provided by the UN’s Intergovernmental Panel on Climate Change in their Fourth Report. [1]

The IPCC also provides an even more simplistic version that is bereft of any actual numbers. What the IPCC offers up would be inadequate for a science lesson for schoolchildren, let alone as a reference source for serious authorities. As such, it may be regarded as the least informative of them all (see below).


On Langan’s Theory of Theories

Written by Dr. Pierre R Latour

Scientific method extended to all human thought


Pierre R Latour, Doctor of Philosophy in Chemical Engineering, Sept 21, 2013


Principia Scientific International promotes discussion and debate using the scientific method for learning and teaching about how nature works. The method is part of the intellectual framework of human thought collected under the all-encompassing topic of philosophy. A professional philosopher, Christopher Langan, published his Theory of Theories and Cognitive-Theoretic Model of the Universe, which extends the scope of the scientific method to guide all human thought in a search for truth. It confirms the validity of the scientific method.


The scientific method, inaugurated by Francis Bacon around 1600 and codified by Galileo and Newton, inspired the age of reason because it provided the way to elevate belief about how the natural world works into knowledge, describing and predicting nature’s behavior in its own language, mathematics. Belief defined by authority was not sufficient to declare truth.

The method calls for intellectual formulation of a postulate from belief, then testing its predictions by comparison with experimental measurements. If the observations match predictions, the postulate is elevated to a theory, a form of knowledge to be accepted until something better comes along.

Engineers add the further requirements of utility, efficiency and value in applying scientific knowledge to build things people like and need. When engineered systems work as planned, the theory employed gains greater stature as valid and true.

The basic sciences are physics, chemistry and biology, with extensions like astronomy, geology, medicine, psychology, agriculture, engineering, military science and political science.


The Greeks recognized there is much more to reality than nature: art, music, honesty, integrity, ethics, morality, epistemology, law, religion, good, evil, passion, emotion, success, life, death, truth, mathematics, beauty, love, knowledge, education, economics, history, fiction, war, peace.

Thomas Jefferson famously encapsulated this idea of the laws of Nature and of Nature’s God in the Preamble of his American Declaration of Independence, 1776: “When in the course of human events, it becomes necessary …. to assume the separate and equal Station to which the Laws of Nature and of Nature’s God entitle them …” And then: “We hold these Truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain inalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness. That to secure these Rights, Governments are instituted among Men, deriving their just Powers from the Consent of the Governed.”

Many men have died for that idea. So there is more to it than Nature and the study of Nature, which is science. The rest is the realm of Nature’s God – according to Jefferson.


The Anti-Science IPCC Global Warming Report 5

Written by Dr Charles Anderson

Fundamentally, the IPCC has never had any solid evidence of measurable global warming caused by man’s emissions of carbon dioxide.  The newest report just issued does not change this.  Yet the Summary Report issued to the press and politicians claims that catastrophic man-made global warming is now known to be more certain than ever.

This claim is made on the basis of General Circulation Model (GCM) computer simulations interpreted with an embarrassing flight of fancy. In the light of that claim, let us examine the predictions of the GCMs used in the prior reports, when we were informed that the science was already settled and well-known.  The draft report that was sent out to actual scientists for review had the following graph in it.

The various shaded areas show the range of certainty of the average global temperature according to the body of computer models.

The FAR was the first report, of 1990; the SAR the second, of 1995; the TAR the third, of 2001; and AR4 the fourth, of 2007.  Over this period the U.S. government alone spent about $150 billion funding climate-change-related research.  Each IPCC report claimed a higher level of confidence in catastrophic man-made global warming.

So we should expect the range of expected temperatures for 2015 from each successive report to fall within the range of the previous report, but to be narrower – both because the science is claimed to be better known and because the prediction time is becoming shorter. Because the colored ranges overlap, it is easiest to see how the certainty of the predictions of the settled science actually changed from report to report by looking at the color-coded brackets on the right side of the graph.  These represent each report’s prediction range for 2015.

So what actually has happened is that the settled science did claim a smaller temperature range in the second report than in the first report, but its prediction range did not lie entirely within the range claimed in the first report.  No, it admitted that the temperature increase might be smaller.  In the third report the 2015 temperature range was much wider than in the second report.  The 2015 temperature might be much higher than that predicted in the second report, or a bit lower. 

This represented a large increase in the claimed scientific uncertainty.  The fourth report claimed that knowledge had improved, and its range shrank compared to that of the third report, but apparently the knowledge was not as good as that of the second report, whose range was narrower.  While the range of the fourth report’s prediction lies entirely within that of the third report and that of the first report, it excludes the lower part of the second report’s range. Unfortunately for the IPCC, the fourth report’s temperature prediction did not allow for temperatures as low as have been measured in the meantime.

The black and red dots of recent years are well below the predicted range of the fourth report, and even a bit below the ranges of all the reports.  The only conclusion a rational person can draw is that the settled science incorporated into the many computer models was wrong.



Written by Nicola Scafetta, Earth Science Reviews


Nicola Scafetta (2013) Discussion on climate oscillations: CMIP5 general circulation models versus a semi-empirical harmonic model based on astronomical cycles, Earth-Science Reviews 126 (2013) 321–357
Abstract: Power spectra of global surface temperature (GST) records (available since 1850) reveal major periodicities at about 9.1, 10–11, 19–22 and 59–62 years. Equivalent oscillations are found in numerous multisecular paleoclimatic records. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC Fifth Assessment Report (AR5, 2013), are analyzed and found not able to reconstruct this variability. In particular, from 2000 to 2013.5 a GST plateau is observed while the GCMs predicted a warming rate of about 2 °C/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations more accurately fits the GST records at multiple time scales. For example, a quasi 60-year natural oscillation simultaneously explains the 1850–1880, 1910–1940 and 1970–2000 warming periods, the 1880–1910 and 1940–1970 cooling periods and the post-2000 GST plateau.
This hypothesis implies that about 50% of the ~0.5 °C global surface warming observed from 1970 to 2000 was due to natural oscillations of the climate system, not to anthropogenic forcing as modeled by the CMIP3 and CMIP5 GCMs. Consequently, the climate sensitivity to CO2 doubling should be reduced by half, for example from the 2.0–4.5 °C range (as claimed by the IPCC, 2007) to 1.0–2.3 °C with a likely median of ~1.5 °C instead of ~3.0 °C. Also, modern paleoclimatic temperature reconstructions showing a larger preindustrial variability than the hockey-stick-shaped temperature reconstructions developed in the early 2000s imply a weaker anthropogenic effect and a stronger solar contribution to climatic changes. The observed natural oscillations could be driven by astronomical forcings. The ~9.1-year oscillation appears to be a combination of long soli–lunar tidal oscillations, while quasi 10–11, 20 and 60-year oscillations are typically found among major solar and heliospheric oscillations driven mostly by Jupiter and Saturn movements.
Solar models based on heliospheric oscillations also predict quasi-secular (e.g. ~115 years) and millennial (e.g. ~983 years) solar oscillations, which hindcast observed climatic oscillations during the Holocene. Herein I propose a semi-empirical climate model made of six specific astronomical oscillations as constructors of the natural climate variability spanning from the decadal to the millennial scales, plus a 50% attenuated radiative warming component deduced from the GCM mean simulation as a measure of the anthropogenic and volcano contributions to climatic changes. The semi-empirical model reconstructs the 1850–2013 GST patterns significantly better than any CMIP5 GCM simulation. Under the same CMIP5 anthropogenic emission scenarios, the model projects a possible 2000–2100 average warming ranging from about 0.3 °C to 1.8 °C. This range is significantly below the original CMIP5 GCM ensemble mean projections spanning from about 1 °C to 4 °C.
Future research should investigate space–climate coupling mechanisms in order to develop more advanced analytical and semi-empirical climate models. The HadCRUT3 and HadCRUT4, UAH MSU, RSS MSU, GISS and NCDC GST reconstructions and 162 CMIP5 GCM GST simulations from 48 alternative models are analyzed.
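As a purely illustrative sketch of the kind of harmonic model the abstract describes – superposed oscillations at the stated astronomical periods – one can sum cosines. The amplitudes and phases below are invented for illustration and are NOT Scafetta’s fitted values:

```python
import math

# Periods (years) named in the abstract; amplitudes (C) and phase years
# are hypothetical placeholders, not values from the paper.
PERIODS = [9.1, 10.5, 20.0, 60.0, 115.0, 983.0]
AMPS    = [0.02, 0.03, 0.04, 0.11, 0.05, 0.15]
PHASES  = [2000.0] * 6   # hypothetical year at which each cycle peaks

def natural_component(year):
    """Sum of cosine oscillations: the 'natural variability' part of a
    semi-empirical harmonic model (anthropogenic term omitted)."""
    return sum(A * math.cos(2 * math.pi * (year - p) / P)
               for A, P, p in zip(AMPS, PERIODS, PHASES))

print(round(natural_component(2000.0), 2))  # all cycles at maximum here: 0.4
```

A real analysis would fit the amplitudes and phases to the temperature record and add the attenuated radiative warming component the abstract describes.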



Equating Flux

Written by Joseph E Postma

The last post explained the difference between energy and energy flux.  Energy is generally a simple static scalar quantity, while flux refers to an instantaneous expenditure of energy.

Physics – i.e. the real world and real-world reactions – occurs in real time.  Reality doesn’t wait around for an average of something to build up and then decide to act; reality acts as time flows by, from each infinitesimal moment to the next.  Reality reacts to instantaneous flux, not to average flux, because there is no “average” that reality waits around to react to.

The standard procedure for “conserving energy,” and then creating an energy budget and a subsequent greenhouse effect, is to numerically equate the terrestrial flux output with the solar flux input. This numerical procedure is justified by saying that “on average, the input and output must be equal if the system is in equilibrium.”  But this is done numerically on paper, not physically in reality, because the physics of reality reacts instantaneously to forces and doesn’t wait around for averages.

Equating Energy

So what is the basic thing we are actually trying to conserve with regard to solar input and terrestrial output?  The real physical quantity we want to conserve is energy, not flux.  Energy is a fundamental unit of physics, while flux always depends upon the particular, real-time, local situation.  So if we assume that, on average, the input and output energies are equal – which they should be – then we can consider those energies for any particular second.  Considering a particular second is convenient, since it allows us to directly convert the energy into flux later on.

In any given second, the Earth absorbs 1.22 × 10¹⁷ Joules of light energy from the Sun.  This is calculated with the Stefan-Boltzmann equation for the Sun, factored for the distance to the Earth, the Earth’s cross-sectional area, and its albedo.

In any given second, this energy, 1.22 × 10¹⁷ Joules, falls on one side of the planet – the day-side hemisphere. So, now that we know the total energy falling on the Earth in one second, and where it falls, we can convert the energy value into the units of the Stefan-Boltzmann equation, which are Joules per second per square meter. If we take the total energy and divide it by the surface area of a hemisphere of the Earth, we get a (linear) average of 480 Joules per second per square meter, or 480 W/m². Using the Stefan-Boltzmann equation, which equates flux to temperature, this is a temperature of +30°C, which is very nice and warm and will melt ice into water on the day-side, etc. It is a reasonable number.
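The arithmetic in this paragraph can be reproduced directly. A sketch assuming a spherical Earth of radius 6371 km and the energy figure quoted in the text:

```python
import math

SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
R = 6.371e6                 # Earth radius, m
E_per_second = 1.22e17      # J absorbed per second (figure from the text)

A_hemisphere = 2 * math.pi * R**2
flux = E_per_second / A_hemisphere       # average flux over the lit hemisphere
T = (flux / SIGMA) ** 0.25 - 273.15      # equivalent blackbody temperature, C

print(round(flux), round(T))  # roughly 480 W/m2 and +30 C
```

The hemispheric average is a linear average; the actual instantaneous flux at any spot depends on the Sun’s elevation there.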

However, we must again recall that reality reacts to forces instantaneously, not to averages of those forces after the fact.  The light energy falling on the day-side hemisphere in one second is not evenly (linearly) distributed, because the Earth is round, not flat. That means there is a locality dependence in the true, real-time value of the flux density: when the Sun is overhead it is strongest, near sunrise or sunset it is weakest, and in between it ranges smoothly.  When the Sun is directly overhead, and even well away from overhead, the flux density of the incident energy is strong enough not merely to melt ice into water, but also to evaporate water into vapour.  This – water vapour rising into the atmosphere from the strength of the Sun, in real time – is what basically creates everything we recognize as the climate.  The greenhouse effect models do not show this, and they actually contradict it, because they incorrectly average the power of the Sun to where it doesn’t physically exist, and thereby make the solar power far too cold (on paper) to be able to create the water cycle and the climate.
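The locality dependence described above is just the projection effect on a round Earth: the flux on a horizontal surface scales with the cosine of the solar zenith angle. A sketch, taking 960 W/m² as an illustrative sub-solar (Sun-overhead) value, i.e. twice the 480 W/m² hemispheric average:

```python
import math

F_PEAK = 960.0  # illustrative flux with the Sun directly overhead, W/m2

def local_flux(zenith_deg):
    """Instantaneous flux on a horizontal surface at a given solar zenith
    angle; zero once the Sun is below the horizon."""
    return F_PEAK * max(0.0, math.cos(math.radians(zenith_deg)))

for z in (0, 30, 60, 85, 95):   # overhead ... near sunset ... night
    print(z, round(local_flux(z)))
```

Averaging this cosine distribution over the hemisphere gives back F_PEAK/2, which is the 480 W/m² linear average.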

This diagram is a representation of real-time reality and the physics that drives the climate on the Earth (see diagram: Postma Earth Energy Budget):

Back to Equating Flux

With an energy input of 1.22 × 10¹⁷ Joules over a hemisphere in one second from the Sun, and an energy output from the Earth of 1.22 × 10¹⁷ Joules from the entire globe, i.e. both hemispheres, it is not physically correct to equate these values in terms of flux. These values are true and totally correct in terms of energy.  They cannot be made equal in terms of flux.

For example, if we say that the Earth is in numerical flux equilibrium with the Sun, and mistake this for conserving energy, then that would mean the Earth must emit the same flux of energy as it receives from the Sun.  Therefore the Earth would have to emit 480 W/m² on average, since that is what it receives from the Sun on an instantaneous basis.

Well, the Earth does not emit this flux of energy.  That is far too high a value.  If you converted that value into total energy emitted per second over the entire globe, it would be more energy than actually comes in.  The known and measured value for the flux output from the Earth is 240 W/m².
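The mismatch is simply the same per-second energy divided by two different areas. A sketch using the energy figure from the text:

```python
import math

R = 6.371e6   # Earth radius, m
E = 1.22e17   # J per second, both absorbed and emitted (figure from the text)

flux_in = E / (2 * math.pi * R**2)   # averaged over the lit hemisphere
flux_out = E / (4 * math.pi * R**2)  # averaged over the whole globe

print(round(flux_in), round(flux_out))  # ~480 vs ~240 W/m2: equal energy, unequal flux
```

Because the emitting area is exactly twice the receiving area, the two average fluxes differ by exactly a factor of two while the energy totals are identical.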


IPCC in Disarray: Time for a Review of Greenhouse Gas ‘Science’

Written by

As the UN’s Intergovernmental Panel on Climate Change (IPCC) flops with the release of its Fifth Report, global policymakers are being left in no doubt why. Skepticism about man-made global warming and doubts about the validity of the ‘science’ of the greenhouse gas ‘theory’ are at all-time highs.

The reason? Despite carbon dioxide (CO2) levels rising by 40 percent, global temperatures have flatlined since 1998. None of the IPCC’s climate models foresaw this. In fact, the greenhouse gas ‘theory,’ the scientific cornerstone of 30 years of climate alarm, unequivocally states that increased carbon dioxide in our atmosphere must cause more warming. But reality is disproving the theory.

The latest IPCC report is now reduced to conceding that “natural variability” does play a part. This admission contradicts another cornerstone of their main thesis: that natural causes are of little or no consequence. But as the ‘Slayers‘ of the theory have long shown, it was always flawed, because it made many dubious assumptions, including the following:

  • The earth is flat.
  • The earth does not rotate.
  • The sun shines all day and all night with equal intensity.
  • Energy interchange in the climate is entirely by radiation. 
  • Conduction, convection and latent heat transfer do not happen.
  • Energy flow parameters are constants with no variability.
  • Energy flow is “balanced” with input equal to output.
  • Air movements, wind, rain, hurricanes are ignored.
  • Chaos has been abolished.
  • Change in this system is entirely caused by increasing human-induced trace gases in the atmosphere.
  • The earth is dead: there are no living organisms, no trees, animals, birds or people.

At this point honest scientists would admit the ‘theory’ seems discredited. Rational minds would admit that a fresh look is needed at the counterclaims of dissenting scientists. Such scientists have found a rallying point at Principia Scientific International (PSI).


Do we really have a “33 °C Greenhouse effect”?

Written by Jan Zeman, Czech Technical University, Prague

The Wikipedia entry for the Greenhouse Gas Effect states:

If an ideal thermally conductive black-body was the same distance from the Sun as the Earth is, it would have a temperature of about 5.3 °C. However, since the Earth reflects about 30% of the incoming sunlight, this idealized planet’s effective temperature (the temperature of a black-body that would emit the same amount of radiation) would be about −18 °C. The surface temperature of this hypothetical planet is 33 °C below Earth’s actual surface temperature of approximately 14 °C. The mechanism that produces this difference between the actual surface temperature and the effective temperature is due to the atmosphere and is known as the greenhouse effect.

The statement is almost completely untrue. For instance, not even the math adds up: the difference between the two temperatures, +14 °C and −18 °C, is not 33 °C but 32 °C. But that is not the important point; what is important here is the fact that there is not a difference of 33 °C, nor of 32 °C, between the hypothetical and the real Earth surface temperature. In short, there is clearly confusion about what is meant scientifically by the “surface” of Earth.
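The −18 °C figure in the quoted Wikipedia passage comes from the standard effective-temperature calculation: sunlight absorbed over the Earth’s cross-section balanced against blackbody emission from the full sphere. A quick check, assuming a solar constant of 1361 W/m² and an albedo of 0.30:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant at Earth's distance, W/m2
ALBEDO = 0.30

# Absorbed S*(1-a)*pi*R^2 = emitted 4*pi*R^2*sigma*Te^4, so the R^2 cancels:
Te = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(round(Te - 273.15, 1))  # about -18 C
```

The factor of 4 is the ratio of the sphere’s surface area to its cross-sectional disk, which is precisely the averaging convention the article goes on to dispute.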

I don’t want to rewrite astronomical customs, but for such purposes as a black-body radiation flux equation to and from the planet using the Stefan-Boltzmann law, the surface of the Earth should be considered to be “the atmosphere” – not the surfaces of the sea and land. The reason is that it is only the uppermost layer of the planet’s mass – the layer bordering the vacuum of space beyond – that radiates to space in the sense defined by the Stefan-Boltzmann equation.

This confusion is a result of our human perspective. In the case of a large gas planet like Jupiter, which we observe from the outside, hardly anybody would suggest that the exterior of its small core, whose diameter is itself uncertain, is the “surface.”

Indeed, there is an even stranger boundary convention to consider, whereby the “surface” and atmosphere of Jupiter are distinguished arbitrarily at the point where its immense atmospheric pressure crosses 10 bar. Nonetheless, when applying the Stefan-Boltzmann law (i.e. black-body radiation) to Earth and its radiation budget, we should consider the gaseous atmosphere to be the Earth’s surface, not the actual surfaces of sea and land below.
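To see why the choice of “surface” matters for the radiation budget, compare the fluxes involved: a black body at the ~255 K effective temperature emits roughly what the planet absorbs from the Sun, while one at the ~288 K mean sea-level temperature emits considerably more. A rough sketch, using the same assumed solar constant and albedo as above:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0        # solar constant, W/m^2 (assumed value)
ALBEDO = 0.30

absorbed = S * (1.0 - ALBEDO) / 4.0  # ~238 W/m^2, averaged over the sphere
emitted_255 = SIGMA * 255.0 ** 4     # flux from a 255 K black body
emitted_288 = SIGMA * 288.0 ** 4     # flux from a 288 K black body

print(f"absorbed sunlight: {absorbed:.0f} W/m^2")
print(f"emitted at 255 K:  {emitted_255:.0f} W/m^2")
print(f"emitted at 288 K:  {emitted_288:.0f} W/m^2")
```

The balance between absorbed sunlight and emission at 255 K is why that temperature is called the planet’s effective radiating temperature.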


Debate: ‘Greenhouse Gas theory is False’

Written by Pierre Latour & Jack Barrett

UPDATED SEPT 30, 2013: Pierre Latour, Vice Chairman of PSI, recently published his takedown of the so-called greenhouse gas ‘theory.’ An alleged key component of the greenhouse effect (GHE) is the trace atmospheric gas carbon dioxide (0.04%), which has been blamed for causing global warming. But carbon dioxide (CO2) has been shown by PSI researchers to act only as a coolant in Earth’s atmosphere. Either PSI members are fools, or they will prove to be the instigators of perhaps the most important paradigm shift in science this century.

Latour’s essay triggered a lively response among defenders of the GHE faith unpersuaded of such claims. One such critic is Dr. Jack Barrett. Below we run Barrett’s critique and Latour’s reply. We hope Dr. Barrett and others will continue this lively and open debate, plus readers are also invited to post comments for wider consideration.


Carbon Capture: An Expensive Solution to a Climate non-problem

Written by Dr. Martin Hertzberg

Dr Martin Hertzberg, a co-founder of Principia Scientific International (PSI), pens a damning letter of complaint to the New York Times about the multi-billion dollar folly of carbon dioxide capture and storage.

In ‘Challenges Await Plan to Reduce Emissions‘ (September 20, 2013), authors Matthew L. Wald and Michael D. Shear addressed the Environmental Protection Agency’s (EPA) proposal to limit carbon emissions from new power plants and its multi-billion dollar costs.

But nowhere in their 1054-word piece did the authors indicate that any such levy (which will be passed directly onto hard-pressed consumers) is based on discredited ‘greenhouse gas’ science. As such it may be entirely pointless.

Dr. Hertzberg (a diehard Democrat and noted climate analyst) protests as follows:

       The above article summarizes the industry objections to the EPA’s proposal to limit CO2 emissions from power plants: that the technology is not sufficiently developed (not ready for prime time), or that it would be too costly. However, the most cogent reasons for rejecting such draconian measures of CO2 control are that they will have only a trivial effect on atmospheric CO2, and no effect whatever on the climate.

While the presence of 0.04% CO2 in our atmosphere is essential for life in the biosphere, the notion that such a minor constituent of the atmosphere can control the enormous forces and motions in the atmosphere is absurd. There is not one iota of reliable evidence that it does.

Furthermore, human emissions of CO2 are but a trivial fraction of all natural sources and sinks of CO2. The most recent research by Norwegian scientists shows that the recent modest increase in atmospheric CO2 is coming from the Southern Equatorial Ocean, and that it has little to do with human emissions. Human emissions, mainly from mid-latitudes, dissolve rapidly into the Earth’s oceans and re-circulate within them.

The oceans contain 50 times more dissolved CO2 than is contained in the atmosphere. The current small measured increase in CO2 is coming from the oceans: the same place CO2 changes came from during the 400,000 years shown in the Vostok ice-core data. That data show four glacial coolings, each followed by an interglacial warming, with atmospheric CO2 concentrations near their highest during the warmings and near their lowest during the coolings.
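The physical mechanism behind an ocean-driven CO2 change is the temperature dependence of CO2 solubility in water (Henry’s law): colder water holds more dissolved gas. A minimal sketch using textbook approximations (a Henry solubility of ~0.034 mol/(L·atm) at 298 K and a van ’t Hoff temperature coefficient of ~2400 K); these are generic tabulated values, not figures from the Norwegian study mentioned above:

```python
import math

KH_298 = 0.034       # Henry solubility of CO2 in water at 298.15 K, mol/(L*atm)
VANT_HOFF_C = 2400.0  # van 't Hoff temperature coefficient for CO2, in kelvin
T0 = 298.15

def co2_solubility(temp_k):
    """Van 't Hoff form of Henry's law: solubility rises as water cools."""
    return KH_298 * math.exp(VANT_HOFF_C * (1.0 / temp_k - 1.0 / T0))

cold = co2_solubility(275.0)  # cold high-latitude water
warm = co2_solubility(298.0)  # warm tropical water

print(f"cold water holds {cold / warm:.2f}x as much dissolved CO2 as warm water")
```

On these figures, water near 275 K dissolves roughly twice as much CO2 per unit volume as water near 298 K, which is why warming oceans tend to outgas CO2 and cooling oceans to absorb it.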
