
Global Climate: Cooling Fear Trumps Warming Fear

The global warming debate has become so politicised that the public have been forced to ‘choose their science’ along political lines. However, by examining history, it’s possible that cooler heads might prevail…

Andrew McKillop

21st Century Wire

GLACIERS ARE FRIGHTENING

How many TV films and press photos does the “unbiased, information-providing media” show us of the Arctic – in winter? Not in summer, but in winter. Many glaciers are retreating, but a large number are advancing. Attempts at painting a picture of the Antarctic ice sheet “melting rapidly” have been a failure, at least to date, but we can count on the “unbiased media” to do what it can to raise the global warming alarm.

By the 1830s the science now called glaciology was emerging – and alarming. Swiss and French scientists studying Alpine glacier advance and retreat drew on data covering historical, meteorological, economic and agricultural changes in the Alpine region. They constructed models of rapid glacier advance and retreat that were unsettling to established scientific opinion, and in particular to the also-emerging science of what is now called geology. At the time this infant science, founded in the 1790s, was dominated by the views of the “father of modern geology”, the Scottish scientist James Hutton.

Hutton was opposed to glaciology from the start – for purely ideological reasons. He was also, for example and for the same reasons, opposed to what later became known as overthrust folds, where geologically younger strata lie below older strata. Any such notion as tectonic plates and continental drift would likewise have been totally rejected by Hutton, for the same reasons.

He believed in “God’s work”, which is very slow and sure, and therefore predictable. What can or should surprise us is that Hutton in the 1790s believed that CO2 emitted by human beings was able to harm local or even regional climate. In extreme cases this CO2, called “phlogiston” by Hutton’s fellow scientist Joseph Priestley, could in their view make it impossible for humans to survive.

By the 1830s, Swiss and French glaciologists such as de Saussure, Contejean, Perraudin, Thurmann, Cuvier, Wetzel and others had concluded that glacier advance and retreat can be rapid on a historical timeframe. In as little as 30 years, with global or regional climatic, geomorphological, tectonic, volcanic and even human geographical factors accelerating the process either way, glaciers can move a long way, advancing or retreating. Today, surprisingly, the idiot media asks us to imagine that glaciers should either stay fixed, or only advance, because “otherwise there is catastrophic climate change”! By the 1830s, some 180 years ago, scientists were able to say that in either mode the process gains momentum and can continue for several decades or longer. Systems theory had not yet been invented, but today we know all about positive feedback.

THE WORLD COOLS

Glacial advance and retreat may be slow to start, but once started it accelerates. When advancing, glaciers have a major downward effect on local and regional temperatures – and in the early 19th century this was the major fear.

One simple reason for this fear was that the period of over 350 years from about 1450 to beyond 1800 is called “Europe’s Little Ice Age”. No single, completely accepted theory explains why Europe, North America and northern Asia all experienced this Little Ice Age – but it happened. In addition, the historically documented accounts of the damage caused, including repeated massive flooding in Europe’s North Sea region, constantly retreating farmlands, and recurring periods of famine, are impossible to ignore.

For many scientists, events such as the Maunder Minimum of sunspot activity, which lasted roughly 70 years from about 1645 to 1715, and the Dalton sunspot minimum of about 1790-1830 (spanning Sunspot Cycles 5 and 6), possibly intensified or aggravated but did not cause this cold phase – which was long in human historical terms, though not in geological terms. Warming phases during the Little Ice Age were short and aborted rapidly, for example through about 1770-1785. Although “heretical”, this can easily be compared with the warming phase of about 1980-2000.

Across the northern hemisphere, the year 1816 was called “the year without a summer”, but this is usually attributed only to the 1815 Tambora volcanic explosion, roughly 10 times more powerful than the 1883 Krakatau explosion. Tambora blasted about 12 cubic miles of dust, rock and gas into the atmosphere, burying much of the island of Sumbawa under incandescent ash and debris and instantly killing about 10,000 people. Sunsets in Europe and North America were tinged deep red for months. Its impact on global climate was very real.

But returning to what we can call “the birth date of global warming fear” – or at least of climate change fear linked to CO2 emissions from human activity – we can look to Joseph Priestley. At the tail end of the Little Ice Age, the English scientist Joseph Priestley, the discoverer of oxygen, firmly believed that accumulations of “phlogiston” – CO2 in the atmosphere from burning coal and wood – were major causes not of general warming but of bad weather: unpredictable and unseasonal weather, unusual rainfall and serious crop losses. Priestley, based in an English Midlands then accelerating into the coal-powered Industrial Revolution, became intensely attached to his theory. He supported the American revolution because he hoped English migration to the “new” continent could reduce the social damage, and the possible famine and death, that he thought would result from ‘runaway climate change’ caused by industry and its fossil fuel burning.

Priestley was declared a traitor to Britain by the press, had his Birmingham house burned down by a raging mob, and was lucky to escape alive to the US, where he died in 1804.

COOLING FEAR CONTINUES

The fear of cooling in Europe, especially in continental Europe, persisted a lot longer than Priestley’s early CO2 “phlogiston” theory, which even in England and the wider UK was heavily criticised by many contemporary scientists – despite Hutton’s support for it. Glacier advance was shown to be no respecter of human beings. The early glaciologists, botanists and pedologists showed that a period of glacier advance leaves disturbed drainage systems and impoverished soils behind, often taking centuries or more to recover and make good. Human attempts in Europe’s Alpine region to control or divert glacier advance were shown to be puny.

“The Little Ice Age”, above all, showed that any clear causal link with human activity was almost or totally impossible to demonstrate against the scale of the Earth’s changing climate. All the changes – towards large numbers of long cold winters, dull wet springs, and cool, short, often wet but above all “capricious” summers marked by hailstorms and huge sand storms in coastal areas – took place without it being possible to say that mankind was responsible. Some scientists today, of course “heretically”, believe that the “short warming” which mostly occurred in the northern hemisphere through about 1980-2000 may only have been the result of recovery from the long 1450-1820 cold period. Following this heretical reading of contemporary global climate history, they go on to attribute today’s Arctic ice melt, and the almost absent Antarctic ice melt, to the world geographical profile of climatic recovery from the Little Ice Age.

Cooling fear, at least among scientists and by now including American scientists, continued through the 19th century. By the 1860s the subject of energy had become a major theme in science. With it, the early thermodynamicists, including Carnot and Thomson, relatively quickly advanced the concept of an Absolute Zero, now set at -273.15 degrees Celsius. This is in fact very close to the average temperature of the Universe: what is called “fossil radiation” from the Big Bang has a temperature of about 2.7 kelvin, or roughly -270 degrees Celsius.
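
As an aside, the kelvin and Celsius scales mentioned above differ only by a fixed offset, so the figures quoted can be checked with a one-line conversion. The sketch below (in Python, purely illustrative) is not from the original article; it simply restates absolute zero and the background-radiation temperature in both scales.

```python
# Illustrative sketch: kelvin-to-Celsius conversion for the figures quoted above.
ABSOLUTE_ZERO_C = -273.15      # 0 K expressed in degrees Celsius
CMB_TEMPERATURE_K = 2.7        # approximate temperature of the Big Bang "fossil radiation"

def kelvin_to_celsius(t_kelvin: float) -> float:
    """Convert a temperature from kelvin to degrees Celsius."""
    return t_kelvin + ABSOLUTE_ZERO_C

print(kelvin_to_celsius(0.0))                 # -273.15, absolute zero
print(kelvin_to_celsius(CMB_TEMPERATURE_K))   # -270.45, the cosmic background radiation
```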

Interest in thermodynamics and in another fast-growing science – radiation – soon generated a scientific controversy over how to measure the Moon’s temperature. Questions linked to this quest concerned subjects such as the difference between temperature measurement and energy flux, the relationship between the build-up or decline of heat content (thermal mass) and temperature at any one instant, and the subject of thermal inertia. It became increasingly well known that, apart from thermal mass, other parameters influence the temperatures of the Earth’s atmosphere, biosphere, lithosphere and hydrosphere, and that in most cases temperature flux and thermal mass can be treated as unrelated over certain periods of elapsed time. Why the Moon should have such extreme variations of temperature – ranging through as much as 280 degC over a single lunar day-night cycle at certain latitudes, with extremes of cold making it impossible for any life to survive – was a major subject of scientific thinking from the 1870s, accelerated by developing knowledge of light radiation, radiative flux and absorption spectra.

In particular, the vast difference between the Moon’s temperatures and the Earth’s led to the concept of the “greenhouse behaviour” of the gases making up the Earth’s atmosphere.

Here again we have a major irony. By the 1890s the American scientist Samuel Pierpont Langley had used observations of the Moon’s reflected sunlight, received at the Earth at different angles, in his methods for measuring the temperature of the Moon. Where the Moon’s surface receives solar radiation, the local temperature can reach about 250 degF (roughly +120 degC), but on the unlit night side temperatures dip to around minus 160 degC.
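
The Fahrenheit and Celsius figures quoted above can be cross-checked with a simple unit conversion. The short Python sketch below is purely illustrative; the temperatures are the ones given in the text, not new measurements.

```python
# Illustrative sketch: cross-checking the lunar temperature figures quoted in the text.
def fahrenheit_to_celsius(t_f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (t_f - 32.0) * 5.0 / 9.0

sunlit_f = 250.0                             # sunlit-surface figure quoted above, in degF
sunlit_c = fahrenheit_to_celsius(sunlit_f)   # about 121 degC, i.e. roughly +120 degC
night_c = -160.0                             # night-side figure quoted above, in degC

print(round(sunlit_c, 1))                    # 121.1
print(round(sunlit_c - night_c, 1))          # 281.1, close to the ~280 degC swing cited earlier
```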

Langley’s related studies of the CO2 absorption spectra of the Earth’s atmosphere showed that CO2, and especially water vapour, help reduce temperature extremes, preventing the deadly cold of the Moon’s night-side surface while also limiting temperature rises under sunlight. In turn, the Swedish scientist Svante Arrhenius, a decade after Langley, used the latter’s observations of increased infrared absorption – seen when reflected sunlight from the Moon passes through the Earth’s atmosphere at a low angle, encountering more carbon dioxide – to estimate an atmospheric cooling effect from any future decrease of CO2. Arrhenius was concerned about the Earth’s climate shifting into another cooling phase, partly caused by a loss of CO2. He also realised that a cooler atmosphere would necessarily hold less water vapour, and that this would further depress temperatures by creating an additional cooling effect – in other words, positive feedback once cooling started.
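
Arrhenius’s estimate is usually summarised today as a logarithmic relation between CO2 concentration and radiative forcing. The Python sketch below is a hedged illustration only: the 5.35 W/m² forcing coefficient is a modern textbook value and the 0.8 K per W/m² sensitivity is an assumed parameter, not Arrhenius’s own figures. It simply shows why, in this framework, a halving of CO2 implies cooling of about the same size as the warming implied by a doubling.

```python
import math

# Hedged sketch of the logarithmic CO2-forcing relation associated with Arrhenius.
# ALPHA is a modern textbook approximation; LAMBDA is an assumed sensitivity parameter.
ALPHA = 5.35      # radiative forcing, W/m^2, per natural-log unit of CO2 change
LAMBDA = 0.8      # assumed temperature response, K per W/m^2

def temperature_change(c_new_ppm: float, c_ref_ppm: float) -> float:
    """Estimated surface temperature change for a change in CO2 concentration."""
    forcing = ALPHA * math.log(c_new_ppm / c_ref_ppm)   # W/m^2
    return LAMBDA * forcing                             # K

print(round(temperature_change(150.0, 300.0), 2))   # about -2.97 K for a halving of CO2
print(round(temperature_change(600.0, 300.0), 2))   # about +2.97 K for a doubling of CO2
```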

Arrhenius can be considered the real “father of global warming” as a scientific theory. His basic concern, stated by him many times in writing, was that global cooling was a constant threat to mankind. He believed that “anthropogenic” CO2 emissions, by building up the CO2 content of the atmosphere, could or might slow down the cooling trend – but not stop it – and that this would buy time for us all.

THE WARMIST IPCC

As we know, the IPCC has only existed since 1988 and is above all controversial. It has been obliged, by merited criticism from eminent scientists, to state formally: “Long-term prediction of future climate states is not possible”. Despite that, the same IPCC felt able to publish the totally inaccurate “hockey stick” curve, known as “Mann’s curve”, no fewer than six times in various reports, before it was forced to withdraw this laughably untruthful chart after “Climategate” became a world media event in 2009.

From its start the IPCC has had a bias in favour of “the CO2 warming hypothesis”. This is necessarily a “warmist” bias. There is no equivalent and countervailing “coolist” bias, at least in the media and at least in Western science – although the great majority of Russian climate scientists and geoscientists have a “coolist” bias.

The relationship between atmospheric CO2 and temperature revealed by ice core sampling shows that a build-up of CO2 typically occurs 800 to 1,000 years after an increase in temperature is observed. Hence, in all previous cases throughout geological time, a build-up of CO2 was a result of climate warming, not its cause: climate warming always preceded the CO2 increase. Today we are invited, or told by media pressure, to believe the exact opposite.
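
For illustration only, the kind of lead-lag relationship described above can be estimated from two aligned time series with a simple cross-correlation scan. The Python sketch below uses synthetic data with the lag built in by construction; it is not ice-core data, and real records involve far more careful dating and uncertainty handling.

```python
import numpy as np

# Illustrative sketch: estimating a lead-lag between a temperature proxy and a CO2
# series. The data here is synthetic, with a ~900-year lag built in by hand.
rng = np.random.default_rng(0)
years = np.arange(0, 20000, 100)                  # one sample every 100 years
temperature = np.sin(years / 3000.0) + 0.1 * rng.standard_normal(years.size)
co2 = np.roll(temperature, 9)                     # CO2 made to trail temperature by 9 samples

def best_lag(leader: np.ndarray, follower: np.ndarray, max_lag_steps: int) -> int:
    """Return the lag (in samples) at which the follower correlates best with the leader."""
    corrs = [np.corrcoef(leader[: len(leader) - k], follower[k:])[0, 1]
             for k in range(1, max_lag_steps + 1)]
    return int(np.argmax(corrs)) + 1

lag_steps = best_lag(temperature, co2, max_lag_steps=20)
print(lag_steps * 100, "year lag: CO2 trails temperature in this synthetic example")  # 900
```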

We are insistently, even hysterically, told that world temperatures will increase as CO2 levels rise.

In an unattributed return to the primitive thinking of Priestley, the arch-warmist James Hansen now claims that rising CO2 levels in the atmosphere have not caused global warming, because of “biosphere fertilization” by CO2. Priestley, in the 1790s, claimed that “phlogiston” – basically CO2, which he had obtained by burning biomass in closed containers until combustion stopped – was recycled by plants and sunlight. As a “healthy gas” his phlogiston helped plants to grow, providing us with food and fibre, but as a “dark gas” accumulating in the lower atmosphere, the same phlogiston would cause famine and mayhem by wreaking havoc on local or regional weather.

As is well known and demonstrated today, plants grow faster when “phlogiston” – that is, CO2 – levels increase, to concentrations as high as 2.5% of the atmosphere, or 25,000 parts per million.
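
The percent-to-ppm equivalence in that figure is a straightforward unit conversion, sketched below for clarity. The 2.5% value is the article’s own figure, not something produced by this snippet.

```python
# Illustrative sketch: converting an atmospheric concentration from percent to ppm.
def percent_to_ppm(percent: float) -> float:
    """Convert a concentration expressed in percent to parts per million."""
    return percent * 10_000.0   # 1% of the atmosphere equals 10,000 ppm

print(percent_to_ppm(2.5))      # 25000.0 ppm, matching the figure quoted above
```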

For a short while longer it is likely that claims of a Global Warming Apocalypse just around the corner, due to human emissions of CO2, will continue to get massive media support, attributing the “short warming” of 1980-2000 to CO2 from fossil fuel burning. This in no way prevents or excludes global cooling from being triggered and reinforcing itself through positive feedback. It has happened before – for long periods of human historical time. It can happen again, and, like glaciers, global cooling has no respect for human beings who get in the way!

 

 
