Core Essays

31 January 2017

Labor Union Membership, Right to Work, and Education

Labor unions have long had a tough time competing in the private sector and are now shrinking even as a percentage of government workers.  In 2015, union members were 11.1% of the workforce.  This fell to 10.7% in 2016, a loss of 240,000 union members.  Only 6.4% of private sector workers are now union members.  The mainstay of the unions is the government sector, with 29.6% of state government employees and 40.3% of local government employees being union members.  The local government figure is much inflated by the many teachers who pretend to manage 25 or 30 people in the classroom, but who are really blue-collar workers unable to negotiate their own compensation as individuals.  The percentage of government workers in unions has also been falling recently, so the percentages above are 15-year lows.

Within the last year, West Virginia and Kentucky have become Right to Work states, increasing the number of Right to Work states to 27.  There is a good chance that Missouri will soon become a Right to Work state.  In the 2016 election, a Republican who campaigned hard for Right to Work won the governorship in Missouri, despite the unions supplying his Democrat opponent, who opposed Right to Work, with more than $10 million in campaign funds and other support.  Missouri voters returned to office every state legislator who supported Right to Work.  Missouri, like other states with forced union dues collection, has been losing jobs to states with Right to Work laws.  The bordering states of Iowa, Nebraska, Kansas, Oklahoma, Arkansas, Tennessee, and Kentucky all have Right to Work laws.  Of its neighbors, only the basket-case state of Illinois still maintains forced unionization.

You would not know it based on the recent campaign rhetoric, but manufacturing jobs increased by 236,000 in 2016.  Despite that overall growth in manufacturing jobs, union membership decreased by 74,000.  The growth in manufacturing jobs has been in Right to Work states.  In 2016, union membership in the 25 states that were Right to Work states for the full year decreased by 290,000, falling from 7.1% to 6.5% of workers.  Membership in forced unionization states increased by 50,000, making it clear how dependent labor unions are on forced unionization.  Union membership increased in only one-quarter of the states with Right to Work laws, while it increased in 60% of those with forced unionization.  The percentage of Michigan workers in labor unions has fallen by 2.2 percentage points since Michigan became a Right to Work state in 2013.  Government worker unions lost their privileged powers in Wisconsin in 2011, and since then union membership there has fallen by 136,000 workers, or by 40%!

The loss of union power over the school systems in Wisconsin since 2011, and the freeing of school systems to pay teachers on individual merit, are improving education in those school systems that have moved to individual merit evaluation and compensation of teachers.  A Stanford University researcher, Barbara Biasi, has found that the school systems that have chosen to stick with union-favored seniority compensation programs rather than individual merit programs are falling behind the individual merit school systems.  Governor Scott Walker's Act 10 collective bargaining reform has allowed the thinking school systems to improve.  Who would have thought that evaluating and rewarding individual teaching ability would improve education?  Clearly the Democrats, who claimed this would undermine the government-run school system, would not admit the possibility.  How surprising that there is a correlation between being a capable teacher and classroom manager and being capable of negotiating your own working conditions and compensation!

17 January 2017

Repealing ObamaCare Costs How Many People Coverage?

Those who advocate the retention of ObamaCare are claiming highly exaggerated losses of coverage from its repeal.  They assume that no new coverage will come into play once it is repealed.  HHS Secretary Burwell cited a claim that 30 million Americans will lose their coverage.  Many others are claiming that 20 million will lose their coverage.  These exaggerations are exactly as I predicted here.

First, as I noted here, only 11.1 million people were covered for the full year in 2016, though 12.7 million signed up for coverage on ObamaCare health insurance websites.  They are very trusting people to sign up and provide their Social Security numbers on those very incompetent exchanges.  But I digress.  The lower 11.1 million figure is found by adding up the monthly premium payments and dividing by 12.  You can be sure that the advocates of ObamaCare use the 12.7 million number.  They also disregard the 6 million people who lost coverage because of ObamaCare, who, scaled for population growth, correspond to 6.1 million people today who would have been happy to stay on their old plans, especially since they will soon have to make still more changes due to the many instabilities ObamaCare has introduced.  Subtract the 6.1 million whose loss of insurance the ObamaCare advocates originally did not care about from the 11.1 million, and one has only 5.0 million new enrollees on ObamaCare.

The ObamaCare insurance loss alarmists also count all of the new enrollees on Medicaid compared to 2013 as being covered by ObamaCare.  As I noted here, of the 17 million new enrollees into Medicaid in 2014, only 3.3 million were newly eligible for coverage as a result of state expansions of Medicaid benefits under ObamaCare.  It is clear that the claim that 30 million will lose their insurance comes from adding the 17 million added to Medicaid to the 12.7 million who paid some portion of their premiums on ObamaCare-purchased insurance in 2016.  It is also clear that this is a very dishonest number.

A more honest number comes from adding the 5.0 million newly insured under the ObamaCare exchanges, from the fully paid premium-equivalent count, to the 3.3 million added by the state expansions of Medicaid coverage under the ObamaCare law.  That gives us 8.3 million people.  But even that is dishonest.

Why?  One reason this is dishonest is that in 2014 alone, employers dropped the health insurance coverage of 2 million people under their employer plans.  Small businesses alone dropped coverage for 2.2 million people as a result of ObamaCare and, to some unknown degree, of the super-extended poor growth of the Obama economy, some of which is also due to ObamaCare.  What is more, prior to ObamaCare, the number of people covered under employer plans had expanded year after year, so that growth in employer coverage was lost as well.  So, conservatively, we subtract another 2 million from the number of people who would lose coverage upon the repeal of ObamaCare.  We are now at 6.3 million people.

But in reality, these 6.3 million people will have the option to go without health insurance after the repeal, which may be wise if they are healthy or rich enough to reasonably self-insure.  In 2017, unless ObamaCare is repealed, the wealthy will not be able to afford to lose 2.5% of their income to its penalties, so they will have to buy ObamaCare health insurance no matter how little they need it.  Ending ObamaCare will also bring many employers back to offering insurance plans for their employees.  Many more cost-effective private plans will come into existence once the ObamaCare straitjackets are removed.  ObamaCare requires too many doctor visits, and doctors offer much less treatment per visit since it was passed.  This is the usual practice with government-controlled health care: it becomes very inconvenient, so people use it less and get less for the cost of the coverage.  Without ObamaCare, there will also be a return to competition in the medical insurance market.  Most areas of the country now have only one or two insurers offering ObamaCare health insurance plans.  This is not good for costs, and high costs keep many from enrolling in health insurance.

So, even if the Republicans do nothing to replace ObamaCare, the net number of people losing health insurance will be fewer than the number who lost it when ObamaCare was put in place.  If 6 million people happy with their insurance did not matter then, how can the advocates of ObamaCare claim that the fewer than 6 million people dependent upon ObamaCare matter now?  In fact, the repeal of ObamaCare does not mean that those people added to Medicaid by its expansion in some states will not continue to be covered by Medicaid in those states.  Those 3.3 million people were to become almost entirely the responsibility of the states after three years under ObamaCare in any case; 2016 was the third year, and federal subsidy payments were scheduled to plummet in 2017 anyway.  This is why many states did not allow themselves to be hooked into expanding their Medicaid rolls.  So, the repeal of ObamaCare will actually result in fewer than 3 million people losing their health insurance relative to the pre-ObamaCare number.
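For readers who want to check the running tally above, here is a minimal Python sketch.  The figures are the ones cited in this post; the variable names are mine.

```python
# Running tally of the coverage numbers cited above (all figures in millions).

signups_2016 = 12.7      # sign-ups on the ObamaCare exchanges
paid_average = 11.1      # full-year-equivalent paid enrollment (monthly premiums / 12)

prior_plan_losses = 6.1  # the ~6 million who lost pre-ObamaCare plans, scaled for population growth
net_new_exchange = paid_average - prior_plan_losses          # 5.0

medicaid_expansion = 3.3 # newly eligible under state Medicaid expansions
subtotal = net_new_exchange + medicaid_expansion             # 8.3

employer_drops = 2.0     # conservative figure for employer coverage dropped in 2014
net_dependent = subtotal - employer_drops                    # 6.3

# If the states keep the 3.3 million Medicaid expansion enrollees covered,
# the final figure falls to about 3 million or fewer.
net_losses_if_repealed = net_dependent - medicaid_expansion

print(f"Net new exchange enrollees: {net_new_exchange:.1f}M")
print(f"Plus Medicaid expansion:    {subtotal:.1f}M")
print(f"Minus employer drops:       {net_dependent:.1f}M")
print(f"If states keep Medicaid:    {net_losses_if_repealed:.1f}M")
```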

Only Democrat Socialists can get away with pretending that fewer than 3 million people are 30 million, or sometimes only 20 million.  These are the same people who claimed that 47 million Americans were uninsured before ObamaCare and then, in 2014, stopped counting the many millions of illegal immigrants among the uninsured, to make ObamaCare look much more effective at providing health insurance than it actually was.  Tricky Dicks, these socialists.  Too bad we do not have a free press willing to keep them honest.


10 January 2017

The Simple Physics Explaining the Earth's Average Surface Temperature

According to NASA, the Earth's energy budget is given by the diagram found here and here.  This energy budget is shown in Figure 1.


Fig. 1.  This NASA Earth Energy Budget is based on data from 10 years of observations as of 2009.  As of the time of this writing, it is still the Earth Energy Budget on NASA websites.

NASA says that the Earth emits 239.9 W/m² of longwave infra-red radiation into space.  This implies an effective Earth-system radiative temperature of

P = 239.9 W/m² = σT⁴ = (5.6697 × 10⁻⁸ W/m²K⁴) T⁴

so T = 255.0 K, by application of the Stefan-Boltzmann equation.

Now we can go a step further, since NASA says that 40.1 W/m² of longwave infra-red radiation emitted from the surface alone passes through the atmospheric window, without absorption by the atmosphere, directly into space.  This allows us to calculate the effective radiative temperature of the atmosphere alone.  Consequently,

P = (239.9 − 40.1) W/m² = σT⁴ = (5.6697 × 10⁻⁸ W/m²K⁴) T⁴,

so T = 243.7 K, the effective radiative temperature of the atmosphere alone as seen from space.
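For those who wish to check these two numbers, here is a minimal Python sketch of the Stefan-Boltzmann inversion, using only the NASA figures quoted above:

```python
# Effective radiative temperatures from the Stefan-Boltzmann law, T = (P / sigma)**0.25.

SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_temperature(power_w_m2: float) -> float:
    """Temperature of a black body emitting the given power density."""
    return (power_w_m2 / SIGMA) ** 0.25

print(radiative_temperature(239.9))         # whole Earth system: ~255.0 K
print(radiative_temperature(239.9 - 40.1))  # atmosphere alone:   ~243.7 K
```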

According to the U.S. Standard Atmosphere Table of 1976, this is the temperature at mid-latitudes at an altitude of 6846 meters, by interpolation of the table data.  This is very close to 7000 meters, for which gas molecule parameters are given in the U.S. Standard Atmosphere Table of 1976.

What the NASA Earth Energy Budget critically fails to note is that the Earth's gravity causes a linear temperature gradient in the troposphere, the Earth's lower atmosphere, which was well known to the American scientists who devised the U.S. Standard Atmosphere tables from the 1950s through the final table of 1976.  The U.S. Standard Atmosphere Table of 1976 is found in many editions of the CRC Handbook of Chemistry and Physics, such as the 71st and 96th Editions, and one can still obtain it from the government for a fee.  This gravitationally induced linear temperature gradient was also understood by Prof. Richard Feynman, who discusses it late in Vol. 1 of the Feynman Lectures on Physics.  Let me provide a simple explanation of this linear temperature gradient due to gravity and the heat capacity of air molecules.


There is a linear gradient in kinetic energy, EK, with altitude, since the total energy E of a molecule is given by E = mgh + EK, with mgh the potential energy and h the altitude.  The temperature of a perfect gas molecule is proportional to its kinetic energy, so a greater kinetic energy at sea level than at 7000 meters altitude means the gas molecule is warmer at sea level.

The temperature at 7000 meters is established by the fact that this is the effective radiative equilibrium altitude of the atmosphere with space.  We can use this fact to calculate the approximate temperature of the surface due only to the temperature gradient in the troposphere below, caused by the effect of gravity alone.  This is an approximation because the atmospheric radiative equilibrium with space really spans a range of altitudes, weighted primarily over 5000 to 11000 meters in the troposphere at the mid-latitudes approximated by the U.S. Standard Atmosphere Table of 1976.  There are also small contributions, mostly due to carbon dioxide, in the very cold tropopause and the warmer stratosphere.  I want to keep things simple here so that we can develop a good physical feel for the physics without becoming lost in reams of integrations and subsequent computer code that arrive at similar, though more exact, results.  Of course, factors other than the gravitational temperature gradient also affect the surface temperature, but we will isolate this one factor and consider the effect of the other factors later.

EK = (3/2) kT for a perfect monatomic gas molecule, where EK is the kinetic energy and k is the Boltzmann constant.  However, the lower atmosphere is made up almost entirely of diatomic molecules, with N2 and O2 making up more than 99% of the atmosphere.  EK = (5/2) kT for a diatomic perfect or ideal gas molecule and (6/2) kT for a polyatomic molecule with more than two atoms.  This is because a diatomic molecule has rotational kinetic energy about each axis perpendicular to the bond between its two atoms, with equal amounts of energy in each of the 5 degrees of freedom of the diatomic molecule.  Molecules such as CO2 and CH4, with more than two atoms, have 6 kinetic degrees of freedom.  This allows us to tie the total kinetic energy at an altitude to the translational velocities of molecules given in the U.S. Standard Atmosphere table of 1976 for dry air.  The total kinetic energy of the diatomic molecules making up more than 99% of the lower atmosphere is then 5/3 times the translational kinetic energy.

Conservation of energy for a diatomic gas molecule requires that, with 7000 meters chosen as the reference altitude since it is the altitude in effective radiative equilibrium with space:

EK0 = (5/3)(½ m v0²) = E7000 = (5/3)(½ m v7000²) + mgh,

where EK0 is the kinetic energy, and the total energy, of the gas molecule at sea level, v0 is its translational velocity there, E7000 is the total energy at 7000 meters altitude, v7000 is the translational velocity of the gas molecule at 7000 meters altitude, m is the mass of the molecule, g is the gravitational acceleration at 7000 meters altitude, and h is the altitude, here 7000 m.  From the U.S. Standard Atmosphere table of 1976, the mean gas molecule in the atmosphere has a mass of 28.964 amu, or 4.8080 × 10⁻²⁶ kg, which is greater than the mass of the most common N2 molecule and less than the mass of the second most common O2 molecule.  The gravitational acceleration at 7000 meters altitude is slightly less than that at sea level and is found in the table to be 9.7851 m/s².  The temperature there is 242.7 K, and the translational velocity of the mean molecule at 7000 meters altitude from the table is 421.20 m/s.  The value of EK0 is calculated to be approximately 1.040 × 10⁻²⁰ joules per mean-molecular-weight air molecule at sea level, based on the translational velocity at 7000 meters and assuming that we are only calculating the static gravitational effect.
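Here is a minimal Python sketch of this step, using only the table values just quoted:

```python
# Reproducing the EK0 figure above from the U.S. Standard Atmosphere (1976) values.

m = 4.8080e-26   # mass of the mean air molecule, kg (28.964 amu)
g = 9.7851       # gravitational acceleration at 7000 m, m/s^2
h = 7000.0       # reference altitude in radiative equilibrium with space, m
v_7000 = 421.20  # translational velocity at 7000 m from the table, m/s

# Total kinetic energy of a diatomic molecule = (5/3) x translational kinetic energy.
E_kinetic_7000 = (5.0 / 3.0) * 0.5 * m * v_7000**2
E_potential_7000 = m * g * h

E_K0 = E_kinetic_7000 + E_potential_7000
print(f"EK0 = {E_K0:.3e} J")  # ~1.040e-20 J, as stated above
```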

We can now put the gravitational-effect kinetic energy EK0 into the EK = (5/2) kT equation and calculate what T should be if there were no other cooling effects, such as the evaporation of water.  Note that air convection is not a net changer of the energy here, except for the effect of volume-expansion cooling as warm air rises and the pressure drops.  This temperature gradient exists in static air, yet there is no flow of heat.
 
The surface temperature due to the action of gravity alone on the atmosphere is found from:

EK0 = 1.040 × 10⁻²⁰ J = (5/2) kT = (5/2)(1.381 × 10⁻²³ J/K) T

T = 301.2 K

This temperature is actually warmer than the 289.5 K temperature implied by NASA in their Fig. 1 energy budget, since they always assume the surface emits as a black body.  With a 398.2 W/m² emission, the implied black-body temperature is 289.5 K.  More commonly, the average surface temperature is taken to be about 288 K.

If the Earth's surface were at 301.2 K due to the gravity temperature gradient alone, the equivalent black-body power density, which allows us to compare its effect to the other effects in the NASA energy budget, would be:

P = σT⁴ = (5.6697 × 10⁻⁸ W/m²K⁴)(301.2 K)⁴ = 466.6 W/m²
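These two steps can be checked with the following short Python sketch; the slight difference from 466.6 W/m² reflects the rounding of T to 301.2 K in the text:

```python
# Surface temperature implied by EK0, and its black-body power-density equivalent.

K_BOLTZMANN = 1.381e-23  # J/K
SIGMA = 5.6697e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
E_K0 = 1.040e-20         # J, from the calculation above

T_surface = E_K0 / (2.5 * K_BOLTZMANN)  # EK = (5/2) k T for a diatomic gas
P_equivalent = SIGMA * T_surface**4     # equivalent black-body power density

print(f"T = {T_surface:.1f} K")         # ~301.2 K
print(f"P = {P_equivalent:.1f} W/m^2")  # ~466.6-466.8 W/m^2, depending on rounding
```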

Note that this is not an actual flow of power to the surface.  The atmospheric temperature gradient due to gravity exists under conditions of energy conservation, with the energy of gas molecules always constant.  There is no heat flow in this effect.

Note that a few critics of the catastrophic man-made global warming hypothesis have calculated a thermodynamic lapse rate for a rising mass of air with a given heat capacity at constant pressure.  That lapse rate is 9.81 K/km, compared to the temperature gradient of 8.36 K/km that I just calculated from the Conservation of Energy in the gravitational field.  Either way, the temperature gradient due to gravity is large and requires strong cooling mechanisms to reduce the surface temperature to a mere 288 K.  In other words, the problem is not explaining why the surface temperature is 288 K instead of 255 K, but explaining why it is as low as 288 K.  The fact that the surface also absorbs solar radiation makes this explanation of the cooling mechanisms even more important.
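Here is a quick check of both gradients.  Note that the heat capacity value, c_p ≈ 1000 J/(kg·K), is an assumed round figure used to reproduce the 9.81 K/km rate, and the 8.36 K/km figure is the endpoint gradient from the calculation above:

```python
# Two lapse-rate estimates: the thermodynamic lapse rate g / c_p, and the
# endpoint temperature gradient over the 0-7000 m column calculated above.

g = 9.81      # m/s^2, sea-level gravitational acceleration
c_p = 1000.0  # J/(kg K), assumed round value for air at constant pressure

thermodynamic_lapse = g / c_p * 1000.0    # ~9.81 K per km
gravity_gradient = (301.2 - 242.7) / 7.0  # ~8.36 K per km

print(f"{thermodynamic_lapse:.2f} K/km vs {gravity_gradient:.2f} K/km")
```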

Let us now consider other inputs and outputs of energy from the NASA Earth Energy Budget of Figure 1.  We will use this effective gravitational power, the power absorbed by the surface from the sun directly, the power loss due to thermals, the power loss due to water evaporation, and the power loss due to surface longwave infra-red emission through the atmospheric window directly into space.

We take the power values for each of these from the NASA energy budget.  We will lump the effect of all other warming or cooling mechanisms, including the Earth's surface radiative emissions absorbed by the atmosphere and any back-radiation, into an unknown power flux R.

P = (466.6 + 163.3 − 18.4 − 86.4 − 40.1 + R) W/m² = σT⁴ = (5.6697 × 10⁻⁸ W/m²K⁴)(288 K)⁴,

Solving for R, we find that 

R = −94.9 W/m²

According to NASA in Figure 1, R equals −(358.2 − 340.3) W/m² = −17.9 W/m², the difference between the longwave surface emission absorbed by the atmosphere and the back radiation from the atmosphere.  This is a modest cooling mechanism for the surface.  What we see from our calculation, though, is that much more cooling than 17.9 W/m² is really needed.
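The solution for R can be checked with this short Python sketch, using the NASA budget figures as tabulated above:

```python
# Solving the surface energy balance above for the residual flux R (all in W/m^2).

SIGMA = 5.6697e-8
T_SURFACE = 288.0

gravity_equivalent = 466.6  # black-body equivalent of the gravity gradient
solar_absorbed = 163.3      # solar radiation absorbed by the surface
thermals = -18.4            # power lost to thermals
evaporation = -86.4         # power lost to water evaporation
window_emission = -40.1     # surface emission through the atmospheric window

emitted = SIGMA * T_SURFACE**4  # ~390.1 W/m^2 for a 288 K black body
R = emitted - (gravity_equivalent + solar_absorbed + thermals
               + evaporation + window_emission)
print(f"R = {R:.1f} W/m^2")     # ~-94.9
```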

Now, there certainly is surface radiation absorbed by the atmosphere.  I have maintained that there is very little back radiation, and that what there is occurs in those situations in which the air temperature near the surface is higher than that of the surface.  This is not the common daily condition, but it does happen at times.  I have also maintained that the electric dipoles that oscillate to emit radiation characteristic of a body at 288 K cannot also provide kinetic energy to the evaporation of water and to air molecules in collisions.  Energy must be conserved.  Some of the molecules in the surface do emit longwave radiation as though they were part of a surface at 288 K, but some do not, because they gave up their energy to the evaporation of water or in collisions with air molecules.  The total radiation energy emitted by those molecules that do radiate is less than that of a surface at 288 K which was not also being cooled by water evaporation and collisions with air molecules.  For radiation purposes, only a portion of the surface area emits longwave infra-red, and this portion is distributed nanoscopically over the entire surface area.  Other nanoscopic areas are dumping their energy instead into evaporating water.  Still others are exchanging it with cooler air molecules impinging upon a water surface.  At sea level, the average air molecule undergoes 6.92 × 10⁹ collisions per second with other air molecules.  Because there are more than 100 times as many water molecules at the surface of a square meter of water as the two-thirds power of the number density of air molecules in a cubic meter, the collision rate for air molecules with water molecules at the surface is more than 100 times that for air molecules with other air molecules.  This gives surface water molecules many opportunities to transfer energy to cooler air molecules.  The story is similar for the other materials of the Earth's surface.
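Here is a rough check of that ratio.  The number densities used for sea-level air and liquid water are standard values, assumed here for illustration; the areal density of molecules at a surface is estimated as the two-thirds power of the volume number density:

```python
# Checking the "more than 100 times" surface-collision claim above.

AVOGADRO = 6.022e23

n_air = 2.547e25                     # air molecules per m^3 at sea level (standard value)
n_water = 1000.0 / 0.018 * AVOGADRO  # liquid water molecules per m^3 (1000 kg/m^3, 18 g/mol)

areal_air = n_air ** (2.0 / 3.0)     # ~8.7e16 molecules per m^2 of surface
areal_water = n_water ** (2.0 / 3.0) # ~1.0e19 molecules per m^2 of surface

print(f"ratio = {areal_water / areal_air:.0f}")  # ~120, i.e. "more than 100 times"
```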

The NASA claims that the surface emits a total of 398.2 W/m² and that back radiation amounts to 340.3 W/m² are both absurdly exaggerated.  Surface infra-red emission is limited not only by conservation of energy and the competition with energy loss via water evaporation and gas molecule collisions, but also by the mean free path length of emitted infra-red radiation before its absorption by water vapor or carbon dioxide molecules.  That limited mean free path length also places a limit on the temperature differential between the surface and the air layer where absorption takes place.  The allowed energy transfer is governed by the equation:

P = σ(TS⁴ − TA⁴),

where TS and TA are the temperatures of the surface and the air layer, respectively.  Since the mean free path length for water vapor absorption of those wavelengths it can absorb is short in the lower troposphere, and that for carbon dioxide absorption is not much longer, this is a severe limit on the power transported by radiation from the surface.
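The following short sketch shows how small this net exchange is for modest differentials; the 1 K and 5 K values are illustrative assumptions, not measured figures:

```python
# Net radiative exchange P = sigma (T_S^4 - T_A^4) between the surface and a
# nearby absorbing air layer, for small illustrative temperature differentials.

SIGMA = 5.6697e-8

def net_exchange(t_surface: float, t_air: float) -> float:
    return SIGMA * (t_surface**4 - t_air**4)

print(f"{net_exchange(288.0, 287.0):.1f} W/m^2")  # ~5.4 W/m^2 for a 1 K differential
print(f"{net_exchange(288.0, 283.0):.1f} W/m^2")  # ~26.4 W/m^2 for a 5 K differential
```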

The NASA claim that the surface emits 398.2 W/m² includes only 40.1 W/m² emitted directly to space.  The remaining 358.1 W/m² that NASA claims is absorbed by the atmosphere implies, by the net-exchange equation above, that the atmosphere is at a temperature near 0 K!  This could not be more wrong.  On the other hand, a back radiation of 340.3 W/m² implies that the effective temperature of the atmosphere radiating to the surface is 278.3 K, which in the U.S. Standard Atmosphere of 1976 is the temperature at an altitude of about 1500 m.  So the atmosphere supposedly absorbs radiation as though it were cold space, yet emits radiation back to the surface as though it has a temperature of 278.3 K.  The implied mean free paths for the longwave infra-red radiation that can be absorbed and emitted by infra-red active, or greenhouse, gases are completely incompatible!
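The 278.3 K figure can be checked directly by inverting the Stefan-Boltzmann law for NASA's claimed back-radiation flux:

```python
# Black-body temperature implied by a back-radiation flux of 340.3 W/m^2.

SIGMA = 5.6697e-8
T_implied = (340.3 / SIGMA) ** 0.25
print(f"{T_implied:.1f} K")  # ~278.3 K
```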

What is more, the atmosphere is generally cooler than the surface, so it radiates only to still cooler parts of the atmosphere or to space.  Cooler parts of the atmosphere are usually at higher altitudes, so there is a strong preference for radiated energy from the atmosphere to be transported to higher altitudes.  Such transport to lower altitudes as does occur requires a temperature inversion in air layers.

In any case, it is very clear that the net effect of surface radiation absorbed by the atmosphere, of back-radiation, of any errors in the solar radiation absorbed by the surface or in the power lost to thermals or used to evaporate water, and of the neglect of other heating or cooling mechanisms, amounts to a net cooling effect of 94.9 W/m².  This does not leave much room for a warming of the surface attributable to back-radiation from greenhouse gases!

In fact, we need more cooling effects to explain the surface temperature being as low as it is.  A big part of that is surface longwave radiation absorbed by the atmosphere, but I believe other parts are cooling mechanisms due to infra-red active gases (greenhouse gases) and an underestimation of the cooling by water evaporation and thermals.

Infra-red active gases (water vapor, carbon dioxide) have a higher heat capacity than do nitrogen and oxygen molecules. Consequently, they can remove more energy from the surface upon collisions with it and add to the heat loss of the surface due to rising thermals.  In addition, they make minor contributions to the transfer of heat upward through the atmosphere by radiating infra-red from warmer air layers to the usually cooler air layers just above them.  Since that transport of energy is at the speed of light, this effect transfers energy faster than a rising thermal and acts as a cooling mechanism.

The advocates of catastrophic man-made global warming pose the problem of our average surface temperature as one of explaining why it is as high as it is.  The real problem is explaining why it is as low as it is.  This is the sad state of NASA's, and generally of the whole set of alarmist man-made global warming advocates', understanding of the climate.  For this poor service, NASA now spends more than half of its budget on climate change, with the emphasis heavily on catastrophic man-made global warming.

Update: 5 March 2017

08 January 2017

Reality-based climate forecasting by Paul Driessen

Reality-based climate forecasting

Continuing to focus on carbon dioxide as the driving force will just bring more bogus predictions

Paul Driessen

These days, even shipwreck museums showcase evidence of climate change.

After diving recently among Key West’s fabled ship-destroying barrier reefs, I immersed myself in exhibits from the Nuestra Senora de Atocha, the fabled Spanish galleon that foundered during a ferocious hurricane in 1622. The Mel Fisher Maritime Museum now houses many of the gold, silver, emeralds and artifacts that Mel and Deo Fisher’s archeological team recovered after finding the wreck in 1985.

Also featured prominently in the museum is the wreck of a British slave ship, the Henrietta Marie. It sank in a hurricane off Key West in 1700, after leaving 190 Africans in Jamaica, to be sold as slaves.

As Fisher divers excavated the Henrietta wreck, at 40 feet below the sea surface they found – not just leg shackles and other grim artifacts from that horrific era – but charred tree branches, pine cones and other remnants from a forest fire 8,400 years ago! The still resinous-smelling fragments demonstrate that this area (like all other coastal regions worldwide) was well above sea level before the last ice age ended and melting glaciers slowly raised the oceans to their current level: 400 feet higher than during the frigid Pleistocene, when an enormous portion of Earth’s seawater was locked up in glaciers.

Climate change has clearly been “real” throughout earth and human history. The question is, exactly how and how much do today’s human activities affect local, regional or global climate and weather?

Unfortunately, politicized climate change researchers continue to advance claims that complex, powerful, interconnected natural forces have been replaced by manmade fossil fuel emissions, especially carbon dioxide; that any future changes will be catastrophic; and that humanity can control climate and weather by controlling its appetite for oil, gas, coal and modern living standards.

If you like your climate, you can keep it, they suggest. If you don’t, we can make you a better one.

Not surprisingly, climate chaos scientists who’ve relied on the multi-billion-dollar government gravy train are distraught over the prospect that President Donald Trump will slash their budgets or terminate their CO2-centric research. Desperate to survive, they are replacing the term “climate change” with “global change” or “weather” in grant proposals, and going on offense with op-ed articles and media interviews.

“This is what the coming attack on science could look like,” Penn State modeler and hockey stick creator Michael Mann lamented in a Washington Post column. “I fear what may happen under Trump. The fate of the planet hangs in the balance.” (Actually, it’s his million-dollar grants that hang in the balance.)

A “skeptic” scientist has warmed to the idea that a major Greenland ice shelf may be shrinking because of climate change, a front-page piece in the Post claimed. Perhaps so. But is it manmade warming? Does it portend planetary cataclysm, even as Greenland’s interior and Antarctica show record ice growth? Or are warm ocean currents weakening an ice shelf that is fragile because it rests on ocean water, not land?

The fundamental problem remains. If it was substandard science and modeling under Obama-era terminology, it will be substandard under survivalist jargon. The notion that manmade carbon dioxide now drives climate and weather – and that we can predict climate and weather by looking only at plant-fertilizing CO2 and other “greenhouse gases” – is just as absurd now as before.

Their predictions will be as invalid and unscientific as divining future Super Bowl winners by modeling who plays left guard for each team – or World Cup victors by looking at center backs.

As climate realists take the reins at EPA and other federal and state agencies, the Trump Administration should ensure that tax dollars are not squandered on more alarmist science that is employed to justify locking up more fossil fuels, expanding renewable energy and “carbon capture” schemes, reducing US living standards, and telling poor countries what living standards they will be “permitted” to have.

Reliable forecasts, as far in advance as possible, would clearly benefit humanity. For that to happen, however, research must examine all natural and manmade factors, and not merely toe the pretend-consensus line that carbon dioxide now governs climate change.

That means government grants must not go preferentially to researchers who seek to further CO2-centrism, but rather to those who are committed to a broader scope of solid, dispassionate research that examines both natural and manmade factors. Grant recipients must also agree to engage in robust discussion and debate, to post, explain and defend their data, methodologies, analyses and conclusions.

They must devote far more attention to improving our understanding of all the forces that drive climate fluctuations, the roles they play, and the complex interactions among them. Important factors include cyclical variations in the sun’s energy and cosmic ray output, winds high in Earth’s atmosphere, and decadal and century-scale circulation changes in the deep oceans, which are very difficult to measure and are not yet well enough understood to predict or be realistically included in climate models.

Another is the anomalous warm water areas that develop from time to time in the Pacific Ocean and then are driven by winds and currents northward into the Arctic, affecting US, Canadian, European and Asian temperatures and precipitation. The process of cloud formation is also important, because clouds help retain planetary warmth, reflect the sun’s heat, and provide cooling precipitation.

Many scientists have tried to inject these factors into climate discussions. However, the highly politicized nature of US, IPCC and global climate change funding, research, regulatory and treaty-making activities has caused CO2-focused factions to discount, dismiss or ignore the roles these natural forces play.

The political situation has also meant that most research and models have focused on carbon dioxide and other assumed human contributions to climate change. Politics, insufficient data and inadequate knowledge also cause models to reflect unrealistic physics theories, use overly simplified and inadequate numerical techniques, and fail to account adequately for deep-ocean circulation cycles and the enormity and complexity of natural forces and their constant, intricate interplay in driving climate fluctuations.

Speedier, more powerful computers simply make any “garbage in-garbage out” calculations, analyses and predictions occur much more quickly – facilitating faster faulty forecasts … and policy recommendations.

The desire to secure research funding from Obama grantor agencies also perpetuated a tendency to use El Niño warming spikes, and cherry-pick the end of cooling cycles as the starting point for trend lines that allegedly “prove” fossil fuels are causing “unprecedented” temperature spikes and planetary calamity. 

Finally, the tens of billions of dollars given annually in recent years to “keep it in the ground” anti-fossil fuel campaigners, national and international regulators, and renewable energy companies have given these vested interests enormous incentives to support IPCC/EPA pseudo-science – and vilify and silence climate realists who do not accept “catastrophic manmade climate change” precepts.

The Trump Administration and 115th Congress have a unique opportunity to change these dynamics, and ensure that future research generates useful information, improved understanding of Earth’s complex climate system, and forecasts that are increasingly accurate. In addition to the above, they should:
  • Reexamine and reduce (or even eliminate) the role that climate model “projections” (predictions) play in influencing federal policies, laws and regulations – until modeling capabilities are vastly and demonstrably improved, in line with the preceding observations.
  • Revise the Clean Air Act to remove EPA’s authority to regulate carbon dioxide – or compel EPA to reexamine its “endangerment” finding, to reflect the previous bullet, information and commentary.
  • Significantly reduce funding for climate research, the IPCC and EPA, and science in general. Funding should be more broadly based, not monopolistic, especially when the monopoly is inevitably politicized.

This is not an “attack on science.” It is a reaffirmation of what real science is supposed to be and do.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (www.CFACT.org) and author of Eco-Imperialism: Green power - Black death and other books on environmental issues.  His commentary was published on this blog at his request.