Core Essays

31 January 2010

Are more moderate winters so bad?

J. Richard Wakefield commented on an earlier post of mine and directed my attention to a study he did of the temperature records of Belleville, Ontario going back to 1921.  The yearly average of the mean daily temperature shows an increase from the late 1980s on.  He points out that this need not mean that both the minimum and maximum temperatures of most days have increased, nor that all of the seasons have become warmer.  So he proceeds to examine those questions.

His major findings are:
  • Winters are becoming shorter as measured from the time of first to last freeze.
  • The number of days each year with low temperatures below -20C is decreasing.
  • The number of days with temperatures above 30C is also decreasing, while the minimum daily summer temperatures are increasing.
In other words, the weather is moderating, with the winters accounting for the warming that has occurred.
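For readers who want to try the same counts on another station's records, here is a minimal sketch of how such yearly metrics might be computed.  It assumes a pandas DataFrame of daily minimum and maximum temperatures in degrees C indexed by date; the column names and the Belleville file name are illustrative, not Wakefield's actual data or code.

```python
import pandas as pd

def yearly_weather_metrics(daily: pd.DataFrame) -> pd.DataFrame:
    """Summarize daily records (columns 'tmin' and 'tmax' in deg C, indexed by
    date) into the kinds of yearly counts Wakefield examines."""
    rows = []
    for year, g in daily.groupby(daily.index.year):
        frost = (g["tmin"] <= 0.0).to_numpy()
        spring_frosts = g.index[frost & (g.index.month < 7)]
        fall_frosts = g.index[frost & (g.index.month >= 7)]
        # Wakefield measures winter from the first fall freeze to the last spring
        # freeze; the frost-free season below is simply its complement in a year.
        frost_free = ((fall_frosts.min() - spring_frosts.max()).days
                      if len(spring_frosts) and len(fall_frosts) else None)
        rows.append({
            "year": year,
            "days_below_-20C": int((g["tmin"] < -20.0).sum()),
            "days_above_30C": int((g["tmax"] > 30.0).sum()),
            "frost_free_days": frost_free,
        })
    return pd.DataFrame(rows).set_index("year")

# Hypothetical usage with a CSV of Belleville daily min/max temperatures:
# daily = pd.read_csv("belleville_daily.csv", parse_dates=["date"], index_col="date")
# print(yearly_weather_metrics(daily))
```

Plotting the resulting counts year by year is all it takes to see the trends Wakefield describes.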

Wakefield examines the graphs of days above 30C and days below -20C and sees indications of three phases of temperature behavior in the Belleville record.  Phase 1 runs from 1933 to 1955, Phase 2 from 1955 to 1982, and Phase 3 from 1982 to 2006.  He then plots the temperature against the sum of the days in a year above 30C, the days below -20C, and the days in winter.  The result is the graph below:

The phase boundary years shift slightly, but three distinct phases do stand out.  Odds are, atmospheric CO2 global warming enthusiasts might try to explain these behavioral phases in terms of the sudden tipping points they are so fond of.  That would be great, if they could explain what the tipping point was.  I am much more inclined to look to natural forces for an explanation.

Whatever the explanation, the yellow Phase 1 was a time of weather extremes in Belleville.  It was not a pleasant period.  The green Phase 2 was an improvement in that the number of hot days above 30C had decreased and there was at least a slight drop in the number of days below -20C.  The pink Phase 3 has clearly higher temperatures and is the catastrophe of our time!  That catastrophe consists of shorter winters and a further decrease in the number of days below -20C!  I am sure the people of Belleville, Ontario are boiling mad about the many evil people in the world producing CO2 and causing the weather to differ in these respects from the delightful Phase 1 period!

No.  Whatever caused this change, I expect it is really very welcome in Belleville.  They have a longer growing season and little need for air conditioning in the summer months.  They also have fewer of those deadly days of -20C and below.  No rational person would fail to welcome this.  Indeed, most plants and animals are surely better off as well.  Most of the detailed seasonal studies I have seen of warming around the Earth seem to show that the greatest warming in the Northern Hemisphere has been in the winter.  Hardly anyone sees this as bad in their daily lives!

30 January 2010

Chairman Pachauri of the UN IPCC AR4 Report Lied

We live in a world that seems to accept lies very readily, at least if they are made by someone whose politics is approved by the progressive (socialist) elitists.  Examples of constant liars are Presidents Clinton and Obama.  So, Chairman Pachauri's lies about when he learned that the UN IPCC AR4 report of 2007 was very badly wrong in predicting that the glaciers of the Himalayan Mountains would be gone by 2035 may not do him the damage they ought to.  Nonetheless, the story of his lie is recited here.

28 January 2010

Greenpeace Experts Direct Parts of UN IPCC AR4 Report

Donna Laframboise of NOconcensus.org has been industriously looking further into the references and experts of the Nobel Prize-winning UN IPCC AR4 report of 2007.  She has found that the report, which advertises itself as the work of 2500+ scientific expert reviewers, 800+ contributing authors, and 450+ lead authors, and as based entirely on authoritative peer-reviewed journal sources, uses a number of Greenpeace experts and reports.  These claims are false.  Greenpeace is the "authoritative scientific" source for the following claims, with eight cited references (given by Donna):
  • Climate change will likely cause coral reef degradation.
  • The lowest cited power production (630 GW) from installed concentrated solar power plants in the year 2040.
  • Estimates of the current global installed peak capacity of solar photovoltaic power of 2400 MW in a 2004 report and 5000 MW in a 2006 report.
  • Main locations and cost of wind-energy investments.
  • An estimate that wind power will generate 29% of all electricity by 2030!  Most other estimates are between 3 and 5%.  The IPCC report decides on 7% for its analysis.
Donna Laframboise also found that among the expert reviewers of Working Group III of the UN IPCC AR4 report were three Greenpeace employees, two Friends of the Earth representatives, two Climate Action Networkers, one WWF International employee, one Environmental Defense employee, and someone from the David Suzuki Foundation.

The Greenpeace experts were:
  • Scientific Expert Reviewer Gabriela von Georne, Ph.D. in geology, a climate and energy campaigner for Greenpeace Germany.
  • Scientific Expert Reviewer Steve Sawyer, a boat-borne campaigner and now an energy lobbyist.
  • Scientific Expert Reviewer Sven Teske, BSc in engineering and masters in wind energy technology, protest vessel sailor and now renewable energy expert.
Donna also notes that Dr. Rajendra Pachauri, the IPCC Chairman, provided a foreword for a Greenpeace publication co-authored by Sven Teske and entitled New Zealand Energy Revolution: How to Prevent Climate Chaos.  Donna notes that this IPCC / Greenpeace relationship is a bit too cozy.

26 January 2010

ExtinctionGate and the UN IPCC AR4 Report

Among the many unjustified claims of the UN IPCC AR4 report of 2007 is a claim that man's emissions of CO2 are causing and will cause unprecedented rates of extinctions of plant and animal life on Earth.  Climate Change Reconsidered, a report by the Nongovernmental International Panel on Climate Change (NIPCC), does a good summary of the problems with the UN IPCC argument.

25 January 2010

A Host of Unscientific References in the UN IPCC AR4 Report

Donna Laframboise on her blog NOconcensus.org has turned up a host of unscientific references in the UN IPCC AR4 report of 2007.  Many of them are World Wildlife Fund reports and others of a similar level of objectivity.  Apparently, this is the result of looking into just a small fraction of the many references.

Thanks for your work on checking these references out, Donna Laframboise!  Donna also notes that the NASA website had claimed the Himalayan glaciers might be gone by 2030, five years earlier than even the discredited 2035 claim in the UN IPCC AR4 report!

AmazonGate Follows on the Heels of GlacierGate

The UN IPCC AR4 report of 2007 makes this claim:
Up to 40% of the Amazonian forests could react drastically to even a slight reduction in precipitation; this means that the tropical vegetation, hydrology and climate system in South America could change very rapidly to another steady state, not necessarily producing gradual changes between the current and the future situation (Rowell and Moore, 2000).  It is more probable that forests will be replaced by ecosystems that have more resistance to multiple stresses caused by temperature increase, droughts and fires, such as tropical savannas.
As reported by James Delingpole, Dr. Richard North discovered that the reference to Rowell and Moore is to a report by the World Wildlife Fund (WWF) and the International Union for Conservation of Nature.  Rowell is a journalist and environmental activist and Dr. Moore is a policy analyst with an interest in fires in Australia and Southeast Asia.  Neither is a specialist in the Amazon rain forests.  And even the cited report makes no mention of the 40% figure!  No one knows where that 40% may have come from.

Once again, the UN IPCC AR4 report has been found to quote unscientific reports, as it did with respect to the glaciers of the Himalayan Mountains, to hype the claims of impending disaster due to man's use of fossil fuels.  The politics of the UN is more important than the science.

The Collapsing Carbon Market

The Kyoto Protocol required developed countries which did not meet their carbon reduction quotas to purchase credits from clean energy projects in the developing world.  This created a carbon trading market, which meant that many people, including Al Gore, were able to make a great deal of money from the alarm caused by the theory of catastrophic anthropogenic global warming.  Those making such money became a powerful lobby with a strong incentive to further feed the man-made global warming frenzy.

The Guardian of the UK has now reported that the carbon market is no longer growing and may soon collapse.  The failure of the Copenhagen Conference to set new emissions standards to take effect when the Kyoto Protocol reduction schedule ends in 2012 is causing many banks to stop lending money for carbon-emissions offset projects.  These banks are not expanding their workforce on the carbon desks as anticipated and some staff are already being let go.  The financiers backing one large clean energy project in the developing world backed out this month.  The recession has also lowered emissions, leaving less overshoot on quotas needing to be offset.  The price of carbon offsets has fallen, especially in the U.S. and Australia, where attempts to pass carbon cap and trade legislation have failed.

The weakness of the carbon trading market, combined with the realization by many that the science behind the theory of catastrophic man-made global warming was fraudulent, is greatly weakening, and will further weaken, the forces pushing for drastic energy use restrictions and the attendant destruction of the economies of the developed world.

24 January 2010

Does Global Warming Cause More Natural Disasters?

Despite the claims of the Nobel Prize winning UN IPCC AR4 report of 2007, there is no evidence that the warming of the late 20th Century caused an increase in the frequency of natural disasters!  The UN IPCC AR4 claim was based upon an unpublished report by Robert Muir-Wood, which was published later in 2008 with the new caveat that: "We find insufficient evidence to claim a statistical relationship between global temperature increase and catastrophic losses."

Note that a number of papers on the evolving understanding of the more important role of natural forces were ignored in the AR4 report of 2007 because they were published after 2005, the claimed cut-off date for papers considered for the report.  Despite that, a major claim was based on a preliminary version of a paper which was not published until 2008, because it appeared at the time to support the idea that global warming would increase natural disasters.  Two scientific reviewers of the UN IPCC report had urged caution in making this claim, but their objections were ignored.

Global losses due to natural disasters such as hurricanes and floods have been going up at a rate of about 8% a year.  But, when population growth and more building in risky areas are taken into account, there is no evidence that global warming is contributing to the increase.  Roger Pielke, professor of environmental studies at the University of Colorado, is an expert on the disaster impacts of climate and hosted a workshop on disaster losses in 2006, where the Muir-Wood paper was first presented.  The researchers who attended the workshop agreed in a statement that there was no evidence that global warming caused an increase in the severity or frequency of natural disasters.  Pielke says that it is still true that no link between global warming and natural disasters has been found, and Mike Hulme, professor of climate change at the Tyndall Centre in the UK, agrees.

But, these inconvenient scientific facts have not prevented politicians from making the outrageous claims that there is a link between global warming and natural disasters.  The Sunday Times reviews some of these claims:
The claim by the Intergovernmental Panel on Climate Change (IPCC), that global warming is already affecting the severity and frequency of global disasters, has since become embedded in political and public debate. It was central to discussions at last month's Copenhagen climate summit, including a demand by developing countries for compensation of $100 billion (£62 billion) from the rich nations blamed for creating the most emissions. 
Ed Miliband, the energy and climate change minister [of the UK], has suggested British and overseas floods — such as those in Bangladesh in 2007 — could be linked to global warming. Barack Obama, the US president, said last autumn: "More powerful storms and floods threaten every continent."
Last month Gordon Brown, the prime minister [of the UK], told the Commons that the financial agreement at Copenhagen "must address the great injustice that . . . those hit first and hardest by climate change are those that have done least harm". 
The dogma of the religion of catastrophic anthropogenic global warming calls for increased natural disasters, so they must be linked to global warming, whether there is any scientific evidence or not. This religion is rapidly being exposed as a man-made fraud.

Edward Hudgins: Paternalists Want Power and Bank on Envy

In an article called Banking on Envy, Edward Hudgins, of the Atlas Society, says that the defeat of ObamaCare has caused Obama to invoke envy to attack the banks.  He says envy is the chief weapon, and frequent motivation, of the paternalist.  Obama claimed that the banks were responsible for the current recession and that the government had had to bail them out because the banks had made risky loans in pursuit of profit and bonuses.  He failed to note that the government forced the banks to take the money and they rapidly paid it back with interest.  And, of course, he failed to note that the banks had been pushed hard by the government over a couple of decades to make more and more risky loans to home buyers so that home ownership would increase, especially among the poor and lower middle class.

Hudgins says,
With chutzpah pouring out of his every pore, Obama announced support for restrictions on what he defines as risky bank activities “that are central to the legislation that has passed the House under the leadership of Chairman Barney Frank, and that we’re working to pass in the Senate under the leadership of Chairman Chris Dodd.”  Here he named the two members of Congress who most of all laid the groundwork for the current economic crisis. Both worked for years to force banks to make risky home loans to individuals who couldn’t afford them and both took campaign funds from the government-chartered Fannie Mae, which was making money packaging and marketing those bad loans. They sold the crack, and now they want to be deputized as drug-busting police.
Obama argued that the banks get cheap money from the Federal Reserve and use this "privilege" to trade for profit.  His implication is that profit is evil.  Now it is true that the banks get cheap money and that they try to use it to make a profit, but the government set this system up itself.  Hudgins says that Obama's paternalist tactics are revealed by his effort to gain more effective control of the already highly regulated banking industry.  He then goes on a great rampage against paternalists and their lust for power:
His proposed restrictions on banks will further constrain capital investment in the American economy and reduce so-called “risky” lending. These constraints will certainly slow economic growth. They will especially make it tough for entrepreneurs to create whole new industries that are by their nature risky ventures, for which they bear the costs of the risk, and by which they hope to make huge, well-deserved profits. But it really isn’t economic growth and prosperity that paternalists seek.

Obama is shameless to suggest that his government, which is overseeing the most irresponsible and wasteful spending spree with taxpayer money in American history, should be imposing discipline on private bank responsibility. But it’s not responsible, efficient, and honest government that paternalists seek.

Paternalists want power. They are obsessed by the need to control every aspect of our lives. That’s why Leftists reacted as they did to the Supreme Court’s ruling—released on the same day as Obama’s anti-bank press conference—that corporations have free speech rights to make their thoughts about candidates known during elections. Obama is vowing to find ways to stamp out these First Amendment freedoms. The freedom of others to criticize is a grave danger to control-freak paternalists.

One of the principal tactics that paternalists use to induce individuals to surrender their freedom is to stoke their envy, resentment, and hate against some scapegoat and to promise that they, the paternalists, are the only ones who can punish the villains. “Look, that one is richer than you! Let’s get ‘em!”

Envy, of course, is an essentially nihilistic sentiment that revels in tearing down. It is a form of social relativism that teaches that one’s worth or status is always in comparison to others. And the easiest way to raise one’s own pseudo-sense of worth and status is to tear down others, to judge one’s self as better off only if others are worse off. Envy is a path to individual and social destruction.
It is so good to have company that shares my sense of outrage at the elitist paternalists using low envy to satisfy their lust for power, with no care for the harm done to free markets and the rights of the individual.

22 January 2010

Take Heart if Barbara Mikulski is Not Your Senator

One of the burdens I bear is that as a Marylander, I am represented by two total losers in the Senate, Barbara Mikulski and Ben Cardin.  Here is a piece of history from 1990 according to Time Magazine:
Democrats cheered when Maryland's Barbara Mikulski declared that "the middle class have no more to give. The poor have nothing to give. So, let's go and get it from those who've got it." To Republican applause, G.O.P. Senator Bill Armstrong of Colorado proclaimed that "raising taxes in the face of a recession is a hare-brained idea."
The war cry of the Democrats is still that of Barbara Mikulski:  "So, let's go and get it from those who've got it."   David Boaz of the Cato Institute calls this the Mikulski Principle.  It is alive and well and controlling the Democrat agenda still, given this list of taxes called for by Obama or his advisers.

ObamaCare on the Precipice

I just read this from a monthly e-mail by Merriam-Webster:
After President Obama declared Congress was "on the precipice" of passing legislation to reform the nation's health care, the term precipice teetered at 90th place on last month's Most Looked Up List. The 17th century coinage precipice comes from the Latin word for headlong (prae means before and caput means head). The original (now obsolete) meaning of precipice named a "sudden or headlong fall."
I do not know about the original meaning of precipice being obsolete, given that Obama just foretold the sudden or headlong fall of ObamaCare!  It appears he has revived the original meaning of the word precipice!  I must be obsolete myself, since when I read what he said, my immediate image was of him throwing himself, Pelosi, and Reid headlong off a cliff.

Supreme Court Defends Free Speech

The Bipartisan Campaign Reform Act (BCRA) of 2002, often called the McCain-Feingold Act, placed severe limits on the free speech of corporations and labor unions for 60 days prior to a general election for ads mentioning the name of candidates for federal office.  It also restricted such speech for 30 days before a caucus or a primary election.  President George W. Bush signed the legislation into law, despite having reservations about the free speech restrictions.  He apparently thought the Supreme Court would strike many of its provisions down, so he abdicated his responsibility to veto the bill.  The Supreme Court, in McConnell vs. FEC, upheld most of the provisions of the BCRA of 2002, though it was clearly its duty to find the restrictions on freedom of speech unconstitutional.

Finally, yesterday, the Supreme Court ruled in a 5-4 decision in Citizens United vs. Federal Election Commission that the BCRA of 2002 could not restrict the free speech of persons acting through corporations, labor unions, or other groups.  Quoting from a press release from Chip Mellor of the Institute for Justice, which helped defend free speech in this case:
Justice Kennedy, writing for the Court, emphasized that the government’s ban on corporate speech was censorship, pure and simple:  “When Government seeks to use its full power, including the criminal law, to command where a person may get his or her information or what distrusted source he or she may not hear, it uses censorship to control thought.  This is unlawful.”  He added, “The First Amendment confirms the freedom to think for ourselves.”
The Court overturned Austin vs. Michigan Chamber of Commerce and parts of McConnell vs. FEC in the process.  Chip Mellor says:
In today’s opinion, the Court ruled that everyone, including corporations, has the right to speak out about issues and candidates.  The government may not restrict the marketplace of ideas:  “The civic discourse belongs to the people, and the Government may not prescribe the means used to conduct it.”  In other words, the First Amendment rejects government paternalism, instead “entrusting the people to judge what is true and what is false.”
The Institute for Justice also won a case in Arizona on 20 January on free speech and election issues when, according to the Institute for Justice website:

a federal district court judge declared unconstitutional the challenged “Matching Funds” provision of Arizona’s so-called “Clean Elections Act,” striking a blow for the rights of individuals and groups to speak freely during political campaigns.  The Institute for Justice is challenging Arizona’s scheme of publicly financing elections, which drowns out the voices of individuals and groups who wish to support privately financed candidates who run against taxpayer-funded candidates in a misguided effort to “level the playing filed.” If a group makes an independent expenditure in favor a privately funded candidate, the unelected bureaucrats at the Clean Elections Commission dole out dollar-for-dollar “matching funds” to the publicly funded candidate. That means that for every dollar an individual or group spends to support the candidate of their choice, over the publicly funded candidate’s initial government subsidy, the government pays an equal amount of money to the political competition. IJ also seeks to preserve the right of individuals to run for public office without having to accept taxpayer funds. Arizona’s public financing scheme punishes candidates who reject the political welfare of public funding by burying them in red tape and giving extra money to their publicly funded opponents. The case is being appealed to the Ninth Circuit, where IJ will again demonstrate that "clean Elections" doesn't level the playing field, it levels the players on the field.
A couple of other important freedom of speech court cases are being supported by the Institute for Justice.  SpeechNow.org vs. FEC will be argued before the D.C. Circuit Court of Appeals on 27 January, challenging a federal law that forces people to give up the right to associate in order to use the right to free speech!  Both rights are clearly stated in the First Amendment, so where were the minds of Congress when they passed this law?  The law limits each person who joins with others to advertise their beliefs to $5,000 in contributions to the cause, though acting individually they would face no such limit.  The Institute for Justice (IJ) is joining the Center for Competitive Politics to represent SpeechNow.org.

In Colorado, a group of neighbors joined together to oppose the annexation of their neighborhood by a nearby town and were sued under a Colorado law which forces any group spending as little as $200 to address a ballot issue to register with the state and disclose all donors making contributions of $20 or more.  IJ is representing these neighbors in the 10th Circuit Court of Appeals.

I have long been impressed by the work of the Institute for Justice and have been a contributor for many years.  I urge others to carefully consider supporting their critical work.  They played a big role in eminent domain reform and the Kelo case a few years ago.  They fight onerous professional licensing laws and many other petty attacks upon our freedoms.  They seem to be a very cost-effective organization as well.

20 January 2010

A Tea Party Once Again in Massachusetts

Scott Brown, a Republican who trailed the better-known State Attorney General, Martha Coakley, a month before the 19 January 2010 special election in Massachusetts, has beaten her 52% to 47%.  Most of the Democrats are blaming either her, as a bad candidate, or the Democratic National Committee for not supporting her strongly enough.  But Coakley was the perfect caricature of the out-of-touch, uncaring educated elitist socialist.  Given that many of the People have been awakened by the Tea Party movement in their alarm at losing their rights and individual choices to governments grown massive, paternalistic, corrupt, aloof, and uncaring, she confirmed their growing understanding of the Democrat party.

Scott Brown, on the other hand, had a pretty good sense of the changed mood of the People, which had changed far more in Massachusetts than one can easily comprehend, given the almost slavish loyalty the people of that state have shown the Democrat party in the past.  They had massively voted for such dim-wits as Teddy Kennedy and John Kerry many times.  But finally, even the people of Massachusetts have awakened to the fundamental and critical issue of our time.  Will Americans remain individualists with the independence to choose their own values and to manage their own lives, or will they simply become numbers in a group identity, as Mao wanted for his people?  Yes, the same Mao admired by some of Chairman Obama's advisers.

Massachusetts already has a very expensive "reformed" health insurance plan.  For a small increase in insurance coverage, all the covered are paying much more in premiums already.  The plan is not proving popular and ObamaCare is even less popular.  At this point, ObamaCare will only raise the costs of health care more in Massachusetts and will not increase the coverage further.  In other words, the bad deal for the rest of the country is a really, really bad deal for the people of Massachusetts.  So, this is a bit of a special case situation for Massachusetts.

Scott Brown campaigned as though he were virtually an independent in a state where most voters are not registered with a party.  There are 3 Democrats for every Republican, but added together these two parties account for a bit less than half of the voters.  Scott Brown campaigned as a supporter of the War on Terror, not as a first responder to man-made disasters.  Most importantly, he said he would vote against ObamaCare as the 41st vote.  He also said he was committed to tax reductions and a more limited role for government.  He proved a hard campaigner who was eager to get out and talk to real people.  Perhaps because he did so, he found a way to appeal to real people, many of whom have become Tea Party participants.

Tea Party participants have a renewed understanding that government has become too big and too expensive.  It is interfering with their personal lives too much.  They have a renewed appreciation for the Founders and for the Constitution.  They resent the pointless debts they, their children, and their grandchildren are now saddled with.  They are angry that their representatives will not listen to them, that they do not read the bills they vote for, and that they do not care whether those bills are constitutional.  They have come to understand that government does not create jobs, they do.  The source of jobs is the private sector, and it is Capitalism, not Socialism, that allows everyone the opportunity to improve their lives.  Socialism just picks out special interests to reward with grants of government powers and favors.  Corruption and vicious fights among huge numbers of special interest groups are a necessary result of choosing fascism and Marxism.  The people are becoming aware that that is what communitarianism and progressivism really are.  That is not the change they want.

The Tea Party people are voting for the party out of power.  They are turning to the Republican candidates, at least in Virginia, New Jersey, and now Massachusetts, not because they are happy with the Republican party, which has also pursued big government and too often proven corrupt, but simply as a counterbalance to the very out-of-control Democrats.  The Democrats have proven that they learned nothing from their loss of power for many years after the early Clinton years.  They are as arrogant and power-grubbing as ever.  They are worse than the Republicans in power lust and corruption.  Still, the People want a sense that the Republicans will not once again betray them.  They do not yet have much confidence that they will not.  The future of the Republic depends upon whether the Republicans can earn enough of the confidence of a People in a Tea Party mood to regain the greater measure of control of government.  It looks likely they will be tested after 2010 or 2012.  They had better prove worthy, or the Tea Party movement may well become a third, and a dominant, party.

UN IPCC AR4 Errors on Himalayan Glaciers Revealed

An article in the Seattle Times by Seth Borenstein notes that numerous claims made about the melting of glaciers in the Himalayan Mountains in the UN IPCC AR4 report of 2007 were wrong.  As usual, all of the errors made it appear that global warming was much more dire than it actually is.  Here are the errors in a half page of the report:
  • "Glaciers in the Himalayas are receding faster than in any other part of the world." Cogley and Michael Zemp of the World Glacier Monitoring System said Himalayan glaciers are melting at about the same rate as other glaciers.
  • It says that if the Earth continues to warm, the "likelihood of them disappearing by 2035 and perhaps sooner is very high." Nowhere in peer-reviewed science literature is 2035 mentioned. However, there is a study from Russia that says glaciers could come close to disappearing by 2350. Probably the numbers in the date were transposed, Cogley said. 
  • The paragraph says: "Its total area will likely shrink from the present 500,000 to 100,000 square kilometers by the year 2035." Cogley said there are only 33,000 square kilometers of glaciers in the Himalayas.
  • The entire paragraph is attributed to the World Wildlife Fund, when only one sentence came from the WWF, Cogley said. And further, the IPCC likes to brag that it is based on peer-reviewed science, not advocacy group reports. Cogley said the WWF cited the popular science press as its source.
  • A table says that between 1845 and 1965, the Pindari Glacier shrank by 2,840 meters. Then comes a math mistake: It says that's a rate of 135.2 meters a year, when it really is only 23.5 meters a year.
The Indian Environment Minister Jairam Ramesh says that some glaciers are expanding and others are receding more slowly than indicated in the UN IPCC report.  Note also that an Indian scientist did speculate, though not in a published paper, that the Himalayan glaciers might vanish by 2035.  He says his speculation should not have been used in the UN IPCC report.  Note that losing 400,000 square kilometers of glacier by 2035 would be a much greater loss of fresh water sources than the loss of some fraction of the 33,000 square kilometers of actual Himalayan glacier, and a much greater imagined contributor to sea level rise.  The error in calculating the rate of retreat of the Pindari Glacier is especially foolish, since it was particularly easy for any reader to check the calculation.  The article's rate of 23.5 meters/year is also actually wrong.  The rate is really 23.7 meters/year.
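Anyone can redo that last bit of arithmetic; here is the check, using only the figures quoted from the IPCC table above:

```python
# Pindari Glacier retreat as given in the AR4 table: 2,840 m between 1845 and 1965.
retreat_m = 2840
years = 1965 - 1845            # 120 years
print(retreat_m / years)       # 23.67 m/yr, i.e. about 23.7 -- not 135.2, and not 23.5
```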

This set of errors, all favoring alarmism, is simply another indicator of a massive scientific fraud orchestrated by the UN IPCC, the Hadley Centre and CRU of Great Britain, and NOAA and the NASA GISS of the U.S.  All of them have been aided and abetted by Al Gore, Tony Blair, Obama, Pelosi, many other socialist politicians, and much of the socialist press, which is most of the press.  The People have been massively betrayed here again by many scientists, almost all universities, many politicians, and most of the press.  The desire for money from governments and/or more control and power over other people's lives is just too great.  This is a great lesson in the perils of government grown beyond its rationally limited powers as defined in our Constitution, now ignored by our government and the fascist/Marxist/environmentalist intellectual pretenders and posers.

17 January 2010

NOAA is Fountainhead of Exaggerated Temperature Data

The National Oceanic and Atmospheric Administration (NOAA) operates the National Climatic Data Center (NCDC) in Asheville, North Carolina.  This center produces the Global Historical Climatology Network (GHCN), a land surface temperature record from around the world.  The data set is not just raw temperature data; it consists of station temperature measurements averaged into a grid across the Earth.  This data is supplied to the NASA Goddard Institute for Space Studies (GISS) at Columbia University, which performs further manipulations upon it.  The data was also supplied to the Climatic Research Unit of the University of East Anglia, whose temperature data sets were examined more closely following the dump of many documents that occurred in late November.  The fourth most significant set of temperatures comes from Japan, but that database is also based upon the NOAA GHCN temperature database.  Despite using the same basic set of manipulated data, NOAA, NASA, and the CRU used to, and often still do, pretend that each fully manipulated data set came to very similar conclusions independently.

Recent analysis by computer expert E. Mike Smith and the certified consulting meteorologist Joseph D'Aleo has shown manipulations of the GHCN which have had the effect of greatly exaggerating the rise of temperatures in the late 20th Century, as well as in the first decade of the present century.  The manipulation was so blatantly designed for that purpose that it is unavoidable to conclude that an agency of our government was knowingly lying to the American People and abusing the good name of the United States throughout the world.  How was this perfidy performed?  In a number of posts in November and December, I discussed aspects of this huge effort to distort the temperature data to support a theory of a sudden, highly unusual temperature rise in the late 20th Century and a sustained high in the first decade of this century.  I will add to that story here.

First, the GHCN was based on a few tens of reporting stations until about 1870; the number then grew rapidly until about 1920, when there were more than 1,300 stations.  The number grew again in the late 1940s, with more than 1,700 stations reporting temperatures.  About 1970 the numbers started to drop, but they remained above 1,500 until about 1990.  In the late 1990s, the number of temperature readings entered into the GHCN database dropped precipitously to a few hundred, though the number of stations still reporting data did not drop anywhere near so rapidly.  E. Mike Smith and Joseph D'Aleo show that the high altitude and high latitude station data was preferentially dropped from the data set, as was rural data.  The weight of the reporting stations also moved south in the northern hemisphere.  More coastal and more airport (hardly ever close to cities, right?) station data was included.  Colder stations were systematically not entered into the data set.  To make up for the missing data in many grid cells around the world, temperature values were made up for those cells based on stations as far as 1200 km away.  Thus mountain data might be based on the data of two cities with large urban heat island effects.  See the story as told by Joseph D'Aleo here and by E. Mike Smith here.
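To make the 1200 km fill-in concrete, here is a rough sketch of how an empty grid cell can be given a value from distant stations by distance-weighted averaging.  This is only an illustration of the general style of scheme described above (weights falling linearly to zero at a 1200 km radius); it is not NOAA's or GISS's actual code, and the station coordinates and anomalies in the example are made up.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(lon2 - lon1)
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return 2.0 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def fill_grid_cell(cell_lat, cell_lon, stations, radius_km=1200.0):
    """Estimate a grid-cell temperature anomaly from stations within radius_km,
    with weights falling linearly to zero at the radius.
    'stations' is a list of (lat, lon, anomaly_degC) tuples."""
    weights, values = [], []
    for lat, lon, anomaly in stations:
        d = great_circle_km(cell_lat, cell_lon, lat, lon)
        if d < radius_km:
            weights.append(1.0 - d / radius_km)
            values.append(anomaly)
    if not weights:
        return None                      # cell left empty
    return float(np.average(values, weights=weights))

# Toy example: a mountain cell with no remaining stations gets filled from two
# distant coastal-city stations, and so inherits their warm urban anomaly.
stations = [(34.05, -118.24, 1.2), (32.72, -117.16, 1.0)]   # hypothetical anomalies
print(fill_grid_cell(37.0, -119.0, stations))
```

The point of the sketch is simply that once the cold, high-altitude stations are gone, the filled-in value can reflect only whichever stations remain.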

The GHCN data shows warming in only a fraction of the reported grid areas, and those are mostly in the higher latitudes and in mountainous regions.  I earlier discussed the effect on Russia.  D'Aleo notes that:
In Canada the number of stations dropped from 600 to 35 in 2009. The percentage of stations in the lower elevations (below 300 feet) tripled and those at higher elevations above 3000 feet were reduced in half. Canada’s semi-permanent depicted warmth comes from interpolating from more southerly locations to fill northerly vacant grid boxes, even as a pure average of the available stations shows a COOLING. Just 1 thermometer remains for everything north of latitude 65N – that station is Eureka. Eureka according to Wikipedia has been described as “The Garden Spot of the Arctic” due to the flora and fauna abundant around the Eureka area, more so than anywhere else in the High Arctic. Winters are frigid but summers are slightly warmer than at other places in the Canadian Arctic.
China had 100 stations in 1950, over 400 in 1960 then only 35 by 1990. Temperatures reflected these station distribution changes. CRU’s own Phil Jones showed in 2008 peer review paper that contamination by urbanization in China was 1.8F per century. Neither NOAA nor CRU adjusts for this contamination. NASA to their credit, makes an attempt to adjust for urbanization, but outside the United States, the lack of updated population data has NASA adjusting cities with data from other cities with about as many stations warming as cooling.
High elevation stations have disappeared from the data base. Stations in the Andes and Bolivia have vanished. Temperatures for these areas are now determined by interpolation from stations hundreds of miles away on the coast or in the Amazon.
Of course, other adjustments have been applied to the measured temperature data, and only the adjusted data is shown to the public.  These adjustments are made frequently and are often undocumented.  I previously discussed a comparison of so-called unadjusted data for Central Park with data said to be adjusted; it turns out that even the unadjusted data is not raw temperature data.  From D'Aleo:

[Chart: adjusted versus unadjusted Central Park temperature data.]

The older temperatures were adjusted downward, as though fewer people were on Manhattan Island and less energy was used in earlier times than now!  Similar adjustments were made to some so-called rural stations also.  Here is the result for Davis, California, the nearest "rural" station to San Francisco, though it is a town of more than 60,000 people and a suburb of Sacramento.  The recent drop in reported stations has left only 4 stations for the entire state of California.  Three of those stations are along the coast in the Los Angeles area and the Davis station is the only one left in the mid-latitudes of the state.  So, what has NOAA done to the only reported station in the recent record outside of the coastal LA area?

[Chart: adjustments applied to the Davis, California temperature record.]

The pre-1960 data was adjusted downward to change a net cooling after about 1908 into a net warming.  This would possibly make sense if the population of Davis and nearby Sacramento had been much larger in 1880 and had then shrunk drastically until 1960, when the population became stable.  But, that certainly did not happen, now did it?  Now, if you are going to represent the large state of California with only 4 station temperature histories in recent times and you want to tell everyone that there has been no cooling since 1998, this is a great way to do it.  Drop all the station readings in the mountains of California, and drop those from the northern third of the state entirely, but only for those pesky recent times when the catastrophic AGW theory says temperatures must remain high.  Keep the many cooler station temperature inputs, of course, for earlier times so that the temperature record will show a recent rise.  Having stations which are really in urban areas is the way to go, and the manipulations are time consuming enough that you do not want to have to do them on too many stations' temperature data, so just keep a few urban results.

There is a huge pattern of falsifying the temperature record performed by our own government for the purpose of misleading the American People and the people of the rest of the world.  This is a blatant power grab by the government to tax, regulate, and limit the use of energy.  This data provides the justification for the EPA to condemn CO2 as a pollutant and to heavily regulate all energy users.  It is an excuse to offer subsidies and mandates to favor certain alternative energy industries and certain so-called green companies.  These industries and companies can be counted upon to offer frequent bribes, called campaign contributions, to the politicians who gain this power.  This data is also the basis for an increase in effective world governance of Americans by a UN dominated by governments which do not hold the rights of the individual as sovereign or even worthy of respect.  This fraud has been perpetrated for the purpose of advancing socialism and world government.

The scalawags and scoundrels of NOAA and NASA must be kicked out of the government for fraud and denied their pensions and benefits.  The EPA administrators who claimed CO2 was a pollutant based on their data showed sufficiently bad judgment that they should be demoted to GS9 level positions.  Those in the House of Representatives who voted for carbon cap and trade should all be turned out of office in the 2010 election for their awful judgment.  Any Congressman who was willing to do so much damage to the American economy and to destroy so many jobs on the basis of such flim-flam artistry has no business making our critical national legislation.  Those Senators who have supported the catastrophic anthropogenic global warming hypothesis as correct should also be retired.  Such fools should never be permitted such power.

13 January 2010

A Decade of Missing Jobs, A Decade of Government Growth

We often hear that the current recession is the cause of current unemployment and that in December of 2009, the unemployment rate was 10.0%, seasonally adjusted.  Given that back in January 2000, at the start of the first decade of the 21st Century, the unemployment rate was a very low 4.0%, this is a very sorry situation.  But, because it is so hard for many to find jobs, we know the situation is even worse than indicated by the 10.0% unemployment rate.  We know that people have given up on finding jobs, and those who are no longer able to collect unemployment insurance, or who never could, are not counted as looking for a job unless they make the effort to visit state unemployment agencies, which cannot help them.  The exercise I set myself here was to see how the job and unemployment statistics of the Bureau of Labor Statistics (BLS) of the U.S. Department of Labor can give insight into job creation and unemployment over this last decade.

The BLS provides the total number of people in the civilian noninstitutional work-age population, the total number of people employed, the number it considers to be unemployed and actively looking for jobs, and an unemployment rate.  The sum of the number of people with jobs and those considered unemployed is the workforce.  The unemployment rate is the number considered unemployed divided by the number in the workforce, times 100.

Back in January 2000, things were pretty good in the job market.  It was apparently easy to create jobs and the jobs were seen as desirable.  64.77% of the noninstitutional population held jobs and the workforce was 67.49% of the work-age population.  The unemployment rate was a very low 4.0%, so even marginally productive people could profitably be put to work.  At such a historically low unemployment rate, many of the unemployed were likely just changing jobs or careers, or were very marginally productive.  I am going to use this period of very healthy job creation as a benchmark against which to measure the rest of the decade.  The job and employment statistics for various months in the decade are given in the table below.

Jobs were at a maximum in number in December 2007 and the number of jobs has decreased since then.  Note that the numbers given in the table are in thousands, so the number of unemployed in January 2000 was 5,689,000.  Also, the unemployment rate numbers in this table are not seasonally adjusted, so they are readily calculated from the numbers given in the table.

Let's look at the % of the population of potential workers employed.  In January 2000, it starts at a high of 64.77% and as the decade progresses it keeps dropping.  By December 2005, the % working is down to 62.78%.  In December 2007, it is almost identical, but then it drops again as the recession progresses.  One can also look at the % of the workforce in relation to the total potential working population.  In January 2000, it starts at a high 67.49% with many jobs available that many people want.  But this number also drops as the decade unwinds.  In December 2007, even before the recession is usually said to have taken hold, the workforce % is down to 65.92%.  Strangely enough, this number decreases relatively little in the first year of the recession, since it is only down to 65.67% in December 2008.  But, by December 2009, the decade closes out with the % of the workforce at a low of 64.38%.  Jobs are not available and/or are not very desirable compared to January 2000, according to this figure.  In fact, we know many people want work now and if jobs, even rather undesirable ones, were available, they would mostly be scarfed up.  The problem is with job creation.

So, how many jobs are missing in our economy?  Let's take the 67.49% workforce of January 2000 as a criterion for how many jobs people would want if jobs were available and were reasonably enticing, as they apparently were in January 2000.  Multiplying the total noninstitutional civilian work-age population by 0.6749 gives us the number of jobs wanted and needed for the situation to be as good as January 2000 and to provide the unemployed of that time with jobs.  By this criterion, in December 2005, the job situation had already deteriorated to the extent that 10,710,000 jobs were missing.  The unemployment rate had only risen to 4.9% according to the usual method of calculation, but the economy was not creating jobs very well anymore.  The % of missing jobs is 100 x (Missing Jobs / Jobs Wanted), which was 6.98% in December 2005, much higher than the official unemployment rate of 4.9%.  From then until December 2007, this % of missing jobs changed very little.  But in December 2008, the number of missing jobs had risen to 15,287,000 and the % of missing jobs was 9.64%.  At the end of the decade, 22,108,000 jobs were missing and the % of missing jobs was 13.83%!
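Here is a minimal sketch of that bookkeeping.  The December 2009 population and employment inputs below are back-calculated from the percentages and totals quoted above rather than copied from the BLS table, so treat them as approximate illustrations of the method, not as the table's exact entries.

```python
def missing_jobs(population_thousands, employed_thousands, benchmark_participation=0.6749):
    """Compare a month's employment with the January 2000 benchmark, in which
    67.49% of the civilian noninstitutional work-age population was in the workforce."""
    jobs_wanted = benchmark_participation * population_thousands
    missing = jobs_wanted - employed_thousands
    pct_missing = 100.0 * missing / jobs_wanted
    return missing, pct_missing

# Approximate December 2009 figures, in thousands (not seasonally adjusted).
population = 236_900    # civilian noninstitutional work-age population
employed = 137_750      # total employed
missing, pct = missing_jobs(population, employed)
print(f"missing jobs: about {missing / 1000:.1f} million ({pct:.1f}% of jobs wanted)")
# -> roughly 22.1 million missing jobs, about 13.8%
```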

So, in the first decade of the 21st Century, our economy went from being able to provide jobs that people wanted, employing all but 4.04% of the workers who wanted jobs (some of whom were simply in transition between jobs), to one in which, by the decade's end, the fraction who could not find jobs and/or found the available jobs unenticing had risen to 13.83%.  This is a factor of 3.42 times worse, even as the official unemployment rate says the situation is only 2.42 times worse.  And we now see that we are short by about 22.1 million jobs.

Throughout the decade, local, state, and federal governments grew.  They drew money and manpower away from the private sector, which is the great engine of job creation.  Governments at all levels increased regulations on businesses and taxed them.  In the face of increasingly global trade, the U.S. and Japan kept high corporate taxes while most other nations decreased theirs.  This left the U.S. and Japan with the highest corporate taxes in the world, which was not good for business in either country.  The effects of these taxes and regulations were likely the cause of the 11,023,000 jobs already missing at the start of the recession; we are now missing twice the number of jobs we were missing only two years earlier.  The recession did not start the problem of job creation, but it did accelerate it.

A shortfall of 22.1 million jobs is a huge shortfall.  Adding threats of a government takeover of the medical services industry, with all the attendant inefficiencies of government and the grasping of special interests, will make this situation worse.  Treating CO2 absurdly as a pollutant and regulating the heck out of energy use will make this situation worse.  Further increases in taxes, now planned at all levels of government in most of the country, will make matters worse.  Further forced unionization will make job creation much harder.  The fact that taxes are being targeted on businesses and their owners specifically will make job creation harder.  The second round of bailouts and so-called government job creation bills will, like the first round, make jobs harder to create.  A 22.1 million job shortage is a very serious problem and should be a game-changer.

At least 9.79% of potential workers would be working if only our job creation situation were not now much worse than it was at the start of the decade.  This is a lot of people who should be blaming overgrown governments for the bulk of their unfortunate job situation.  There are many more workers yet who have a job, but it is a part-time job or it is one they would choose to leave, if they dared and could expect a better job.  Back in January 2000, many workers could leave a job for a better job.  No longer.

It is about time that some of our politicians started actually trying to learn something about how jobs are created and accepted that it is private industry, not government, that creates them.  Too many of them are hallucinating as Gore did when he thought he created the Internet.  Of course, Gore then hallucinated that he was going to save the planet from catastrophic man-made global warming and enlisted many of our brilliant politicians as his acolytes in the effort.  He then poured money into the pockets of a great many scientists to enlist them as proselytizers for the unsound hypothesis of catastrophic anthropogenic global warming.  The proposed government takeovers of the medical system and of energy use have proven fascinating toys for our child-like politicians.  It is a rare day when one can find a politician whose head is not buried in the sand.

12 January 2010

Levy: The Moral and Constitutional Case for a Right to Gay Marriage

Robert A. Levy is the chairman of the Cato Institute, a libertarian think tank in Washington, D.C.  He played a major role in strengthening the Second Amendment to the Constitution in the case of District of Columbia vs. Heller.  He recently wrote an article, appearing in the New York Daily News on 7 January 2010, called The Moral and Constitutional Case for a Right to Gay Marriage.  It is an excellent article, consistent with my oft-stated belief that we would be better served by getting government out of marriages, which I believe are a spiritual union of partners.  The government should only be offering an important legal contract for domestic partnerships, somewhat akin to a small business partnership.  The spiritual aspect of such a domestic partnership, the marriage aspect, is not something government can or should attempt to address.

Levy points out that New Hampshire and Washington, D.C. have just joined Connecticut, Iowa, Massachusetts, and Vermont in legalizing gay marriage after disappointing defeats in California, Maine, and New York.  Levy says:
The primary purpose of government is to safeguard individual rights and prevent some persons from harming others. Heterosexuals should not be treated preferentially when the state carries out that role. And no one is harmed by the union of two consenting gay people.
For most of Western history, marriage was a matter of private contract between the betrothed parties and perhaps their families. Following that tradition, marriage today should be a private arrangement, requiring minimal or no state intervention. Some religious or secular institutions would recognize gay marriages; others would not; still others would call them domestic partnerships or assign another label. Join whichever group you wish. The rights and responsibilities of partners would be governed by personally tailored contracts — consensual bargains like those that control most other interactions in a free society.
Levy notes that more than 1,000 federal laws dealing mostly with taxes and transfer payments have provisions for married people.  The states have many more laws with provisions for married people.  But Levy says:
Whenever government imposes obligations or dispenses benefits, it may not "deny to any person within its jurisdiction the equal protection of the laws." That provision is explicit in the 14th Amendment to the U.S. Constitution, applicable to the states, and implicit in the Fifth Amendment, applicable to the federal government.
Levy also observes that:
No compelling reason has been proffered for sanctioning heterosexual but not homosexual marriages. Nor is a ban on gay marriage a close fit for attaining the goals cited by proponents of such bans. If the goal, for example, is to strengthen the institution of marriage, a more effective step might be to bar no-fault divorce and premarital cohabitation. If the goal is to ensure procreation, then infertile and aged couples should be precluded from marriage.
More and more people are unable to find any substantial merit to the claim that gay domestic partnerships weaken or threaten heterosexual marriages.  As Levy notes, nearly 60% of Fortune 500 companies offer employee benefits to domestic partners.  The Senate Homeland Security and Governmental Affairs Committee has also voted to apply employee benefits to the gay partners of federal employees.  The clock is running out on this cruel act of discrimination.

10 January 2010

A Real Cuffy Meigs or a Real Imitation?

Real life is imitating Ayn Rand's great novel Atlas Shrugged in many ways.  More and more people have observed this, and the sales of her novel have shot up tremendously in the last year or more.  But did you know that among the imitations of real life is a character using the name Cuffy Meigs on a blog named Perfunction?  He has a special flair for misspellings and for calling the People teabaggers.

In Atlas Shrugged, Cuffy Meigs is a particularly crude and nasty guy, who dresses in a military uniform and carries a pistol.  He is a bully who is made Director of Unification for the Railroads, so that unhealthy railroads can suck the life's blood out of the last of the healthier railroads.  He eventually seizes the weapon of Project X while it is in trial and incompetently manages to destroy a large area of the United States, including the last railroad bridge over the Mississippi River.  He kills himself, his men, and Dr. Stadler in the process.

I have no idea whether the Cuffy Meigs of Perfunction is using his real name or whether he aspires to be like the Cuffy Meigs of Atlas Shrugged.

Postscript:  See the comment of Cuffy Meigs below.  It turns out he is actually rather a libertarian/conservative commentator with an inclination toward sarcasm, irony, and an occasionally slightly twisted way of expressing himself.  He sure managed to mislead me in the posting I had initially read.  Beware that since I am not given to sarcasm and irony, it can take me a while to understand that that is what is going on in some cases.  Perhaps here I was especially set up by the use of an awful name like Cuffy Meigs to hide one's identity behind.  It may be irony, but it is an irony I cannot conceive of using myself.

A Republican Senator from Massachusetts? Wow!!!

In the 19 January Special Election to fill the Massachusetts Senate seat vacated by Senator Kennedy's death, Republican Scott Brown leads Democrat Martha Coakley by 48% to 47% in a poll taken on Saturday.

Here are some key reasons:
  • 66% of Republicans who plan to vote are very excited about voting in the Special Election, but only 48% of the Democrats who plan to vote are very excited.
  • Independents view Brown favorably rather than unfavorably by a 70% to 16% split.  In the head-to-head matchup, independents favor Brown over Coakley 63% to 31%.  Clearly, large numbers of those who plan to vote for Coakley nonetheless view Brown favorably, which dampens their enthusiasm for her.
  • 59% of Massachusetts independents oppose Obamacare, while only 27% support it.
  • Obama won Massachusetts by a margin of 26%, but voters who plan to vote in the Special Election only gave him a 16% margin.
  • Many Massachusetts voters plan to vote for the party out of power.
  • The 20% of voters who dislike both major parties plan to give Brown a lead of 74% to 21%.
  • Among all likely voters, Brown is viewed favorably by 57% and unfavorably by 25%.  That 32-point spread compares with Republican Bob McDonnell's 20-point favorability spread in Virginia, where he went on to win by a 17% margin.
  • Surprisingly, given that registered Democrats outnumber Republicans by a 3 to 1 ratio, about half of the state's voters are independents.
  • Democrat Gov. Deval Patrick is not popular and faces a tough re-election campaign in 2010, despite winning easily in 2006.
Scott Brown has vowed to vote against Obamacare, and his vote would break the Democrats' 60-vote filibuster-proof bloc.  If he wins, he would be the first Republican Massachusetts Senator since Edward Brooke, who served 2 terms from 1967 through 1978.  This has Democrats very worried and talking about finding a way to delay seating him should he win the election.  The election fight may not be over, however, since Coakley, the state attorney general, has more money and more organizational support, such as from the labor unions!

Scott Brown is not exactly a great libertarian or a great conservative, but he is from Taxachusetts, so what can you expect?  At least he will break that Democrat 60-vote filibuster-proof bloc in the Senate, and he may be able to stop Obamacare.  These are such important causes that you should seriously consider donating money to his campaign, as I have.

Changes of State Representatives and Electoral Votes in 2012

We are close enough to the 2010 Census that pretty good predictions should be possible of which states will gain seats in the House of Representatives and which will lose them.  Since each state's number of Electoral College votes is the sum of its Representatives and Senators, this also has potentially important implications for the presidential race in 2012.

On 23 December 2009, the Census Bureau released its last state-by-state U.S. population estimate prior to the census, with figures as of 1 July 2009.  If these figures were translated into seats in the House of Representatives, the blog site 270toWin says the apportionment results would be:

Texas +3, RED
Arizona +1, RED
Florida +1, BLUE in 2008, now RED
Georgia +1, RED
Nevada +1, BLUE in 2008, now RED
South Carolina +1, RED
Utah +1, RED
Washington +1, BLUE
Illinois -1, BLUE
Iowa -1, BLUE
Louisiana -1, RED
Massachusetts -1, BLUE
Michigan -1, BLUE
New Jersey -1, BLUE
New York -1, BLUE
Pennsylvania -1, BLUE
Ohio -2, BLUE

The states marked RED above voted for McCain in 2008, and those marked BLUE voted for Obama.  Meanwhile, the voters in Florida and Nevada have become very unhappy with Obama, so Florida and Nevada now count as red.  So, among the now Red States, the net gain of projected 2012 Representatives and Electoral Votes is +8.  The loss among the Blue States is -8, for a net swing of 16 Electoral College votes and 16 Representatives toward the Republicans.  Of course, this is just a zeroth-order estimate with respect to the effect upon the party makeup of the House of Representatives.  Redistricting and the quality of candidates will have a major impact on the number of Republican Representatives a state winds up with.  The impact on the re-election chances of Obama is more straightforward.
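To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python of the tally just described.  It simply sums the projected seat changes listed above, with Florida and Nevada counted in the red column as in the text; the grouping and the script are my own illustration, not part of the 270toWin analysis.

```python
# Projected 2012 House seat changes from the 270toWin apportionment estimate
# above, grouped as in the text: Florida and Nevada are counted as red now,
# even though they voted for Obama in 2008.
red_now = {"Texas": 3, "Arizona": 1, "Florida": 1, "Georgia": 1, "Nevada": 1,
           "South Carolina": 1, "Utah": 1, "Louisiana": -1}
blue_now = {"Washington": 1, "Illinois": -1, "Iowa": -1, "Massachusetts": -1,
            "Michigan": -1, "New Jersey": -1, "New York": -1,
            "Pennsylvania": -1, "Ohio": -2}

red_net = sum(red_now.values())    # +8 seats and Electoral College votes
blue_net = sum(blue_now.values())  # -8
print(red_net, blue_net, red_net - blue_net)  # 8 -8 16 -> a 16-vote swing
```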

There is something else to observe here.  People go where they can find jobs.  The Red States are clearly creating many more jobs than the Blue States are.  There is a long-term downside to being the more highly socialist party.  The Party of Mass Destruction, the Socialist Democrat Party, is very effective in destroying the business climate and in destroying jobs.  When jobs are killed, as in Ohio, Michigan, Pennsylvania, New York, New Jersey, Massachusetts, and Illinois, people have to move away to states with jobs.

This is just a snapshot of the jobs issue, but if one weights each state's November 2009 unemployment rate by the absolute number of Representatives it is projected to gain or lose, the weighted average unemployment rate for the now Red States changing representation is 10.24%, while that for the now Blue States changing representation is 10.92%.  By November of 2009, the huge sums of money sent to Wall Street for the bailout had surely kept employment from falling as much in NY and NJ as it otherwise would have.  A huge amount of bailout money was transferred to the auto industry in MI as well.  Since the stimulus bill was mostly a long-time wish list of Democrat special-interest funding, more of it went to Blue States than to Red States.  Nonetheless, the Red States still had a lower weighted November 2009 unemployment rate!
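For clarity, the weighting is just an ordinary weighted average, as in this minimal Python sketch.  The per-state November 2009 unemployment rates are not reproduced in this post, so the figures below are placeholders of my own, to be replaced with the actual BLS values.

```python
def weighted_unemployment(states):
    """Average unemployment rate, weighting each state's rate by the absolute
    number of House seats it is projected to gain or lose."""
    total_seats = sum(abs(seats) for seats, _ in states)
    return sum(abs(seats) * rate for seats, rate in states) / total_seats

# Each entry is (projected seat change, November 2009 unemployment rate in %).
# The rates here are illustrative placeholders, not the actual BLS figures.
red_states = [(3, 8.0), (1, 9.0), (1, 11.0)]  # e.g. Texas, Arizona, Florida
print(weighted_unemployment(red_states))
```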

Harry Reid's Dire Re-Election Prospects

Senator Harry Reid of Nevada, the Senate Democrat Majority Leader, has to push the Democrat socialist agenda through the Senate.  This makes it impossible for him to pretend to Nevadans that he is a centrist or anything but a committed socialist.  The other Nevada Senator is the Republican John Ensign.  Obama beat McCain in Nevada in the 2008 election, but the voters are remorseful now.  In fact, they are very remorseful.  Obama had a favorable rating in Nevada of 55% in May, 44% in December, and by 5 - 7 January it was down to 34%.  His unfavorable rating is now 46%, up from 30% in May.

According to the Nevada Secretary of State's Voter Registration and Active/Inactive voter lists, the breakdown on Nevada's active voters is:

Democrat   494,316
Republican   410,198
Non-Partisan  174,663
Independent American  47,112

There are additional minor parties such as the Green Party (848) and the Libertarians (6,666).  The Independent American Party was until 1998 the Utah Independent American Party.  It is a Constitution-loving and conservative party, which endorsed the presidential candidate of the Constitution Party in the 2000 and 2004 presidential elections.  Some members of this conservative party may find it worthwhile to vote for a Republican in order to get rid of Reid.

The Las Vegas Review-Journal had a poll of Nevadans taken by Mason-Dixon pollsters by telephone from 5-7 January, and Reid is in deep trouble.  625 registered Nevada voters were interviewed, and the margin of error is 4%.  49% viewed Reid unfavorably, while only 38% thought favorably of this 4-term Senator.  According to the poll results, each of the three top-running Republicans would beat him in the 2010 election.  Sue Lowden, the Republican Party chairwoman of Nevada, would beat him 50% to 40%.  Danny Tarkanian, a businessman and former UNLV basketball star, would beat him 49% to 41%.  Sharron Angle, a former Reno assemblywoman, would beat him 45% to 40%.  Note that in each case, he gets about 40% of the vote, despite the fact that Sharron Angle is very little known in the state.  In other words, he gets the die-hard Democrat vote, but not much else.  In each of these three match-ups, Reid gets only about one-quarter of the independent vote.  Lowden gets 59%, Tarkanian gets 56%, and Angle, the unknown, gets 53%.  In other words, the independent voters want nothing to do with Harry Reid!  They do not even have to know the person running against him.

It may not be over for Harry Reid, however.  He is expected to have a $25 million war-chest.  But he will have to be very successful in buying all the votes that money can buy.  Such a war-chest is a lot more effective when the People have not already decided that they do not like the politician backed by the money.  So, let us hope that Reid has Tom Daschle disease.  Daschle was the Democrat Leader of the Senate whom the People of South Dakota threw out in 2004 after he proved more socialist in his leadership role than they could stand.

05 January 2010

UAH Satellite Temperature Readings in 9-Year Decline



As we have observed, the land surface temperature record is unreliable and biased upward by the elimination of rural stations, the bad siting of most stations, and the application of adjustments to the raw temperature readings which create an artificial bias toward rapid temperature increase.  The UAH satellite temperature record showed a much weaker late 20th Century temperature increase than did the unreliable land surface temperature record.  For the period from 2001 to 2009, the atmospheric CO2 concentration continued to increase, as shown by the black line in the plot above.  The UN IPCC AR4 climate models all predict a continuation of the late 20th Century temperature increase into this period based on this increasing concentration of CO2 in the atmosphere.  The satellite temperature record is the red-blue line and the linear trend line is the green line.  The trend line shows a temperature decrease of 0.84C per century.  Of course, the longer-term trend line of temperature since the end of the Little Ice Age has been positive.
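Readers who want to check such a trend figure themselves can reproduce it with an ordinary least-squares fit to the monthly anomalies.  This is only a minimal sketch under my own assumptions: it assumes the UAH anomalies have been saved to a plain-text file (the file name here is hypothetical) with two columns, decimal year and anomaly in C, covering 2001 through 2009; it is not necessarily the exact procedure used to generate the plot above.

```python
import numpy as np

# Minimal sketch: least-squares linear trend of monthly temperature anomalies.
# "uah_2001_2009.txt" is a hypothetical file with two columns:
# decimal year and temperature anomaly in C.
data = np.loadtxt("uah_2001_2009.txt")
years, anomalies = data[:, 0], data[:, 1]

slope, intercept = np.polyfit(years, anomalies, 1)  # slope in C per year
print(f"Trend: {slope * 100:+.2f} C per century")
```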

Dr. Syun Akasofu presented the decadal oscillations of global temperature superimposed upon a longer linear trend line, using global temperature data from 1880 to 2000.  The linear trend line for that period is projected back into the very late Little Ice Age and forward.  The graph shows the measured temperatures through 2008, along with the prediction of the UN IPCC AR4 report of 2007 out to the year 2100.  This graph was presented at the 2009 International Conference on Climate Change in New York, March 2009.  The pink area is the UN IPCC range of predictions, which are proving to be too high.  This is not surprising, given that those predictions were largely an extrapolation of the period from the bottom of the last cooling oscillation to the peak of the late 20th Century warming oscillation.  Such climate predictions are incredibly foolish.



The upper graph of temperature covers roughly the period from the end of the yellow box area to the present and shows a slight temperature decline.  This decline is consistent with the temperature oscillations superimposed on the long-term linear trend line since the end of the Little Ice Age in the Akasofu graph.  Of course, there is no certainty that the linear long-term warming rate from 1880 to 2000 will be maintained into the future, since the natural forces causing this warming trend are not well understood and they have frequently changed direction in the past.  But it is a more reasonable guess than that of the UN IPCC AR4 report of 2007, whose science we know to be wrong on steroids.

Saudi Arabia Sentences Lebanese TV Host to Death for Sorcery

A Lebanese television host, Ali Hussein Sibat, was recognized by the Saudi religious police while he was making a pilgrimage to Mecca.  He was charged with sorcery because, on his show, he made predictions about the future for people who called in.  He has now been sentenced to death.

Saudi Arabia is a center for Islamic extremism and runs a system of religious schools both in Saudi Arabia and in many other countries around the world based on the Wahhabi reform of Islam.  This, like many reforms, is in fact a reversion to primitiveness and violence.  Many of the terrorist attackers on 11 September 2001 came from this perverted religious belief.  The idea of killing people for sorcery is one straight out of a human era of the utmost superstition and ignorance.

04 January 2010

Britain's Met Office Predicted a Mild Winter -- For the Third Time In a Row

Christopher Booker reports in the Telegraph that "the serial inaccuracy" of the Met Office climate forecasts in Great Britain is no longer just a joke, but a national scandal.  He goes on to say:
The reason the Met Office so persistently gets its seasonal forecasts wrong is that it has been hi-jacked from the role for which we pay it nearly £200 million a year, to become one of the world's major propaganda engines for the belief in man-made global warming. Over the past three years, it has become a laughing stock for forecasts which are invariably wrong in the same direction.
The year 2007, it predicted, would be "the warmest ever" – just before global temperatures plunged by more than the entire net warming of the 20th century. Three years running it predicted warmer than average winters – as large parts of the northern hemisphere endured record cold and snowfalls. Last year's "barbecue summer" was the third time running that predictions of a summer drier and warmer than average prefaced weeks of rain and cold. Last week the Met Office was again predicting that 2010 will be the "warmest year" on record, while Europe and the US look to be facing further weeks of intense cold.
What is not generally realised is that the UK Met Office has been, since 1990, at the very centre of the campaign to convince the world that it faces catastrophe through global warming. (Its website now proclaims it to be "the Met Office for Weather and Climate Change".) Its then-director, Dr John Houghton, was the single most influential figure in setting up the UN's Intergovernmental Panel on Climate Change (IPCC) as the chief driver of climate alarmism. Its Hadley Centre for Climate Change, along with the East Anglia Climatic Research Unit (CRU), was put in charge of the most prestigious of the four official global temperature records. In line with IPCC theory, its computers were programmed to predict that, as CO2 levels rose, temperatures would inevitably follow. From 1990 to 2007, the Department of the Environment gave the Met Office no less than £146 million for its "climate predictions programme".

But in the past three years, with the Met Office chaired by Robert Napier, a former global warming activist and previously head of WWF UK, its pretensions have been exposed as never before. The "Climategate" leak of documents from the CRU, along with further revelations from Russian scientists, have shown the CRU/Met Office alliance systematically manipulating temperature data, past and present, to show the world growing warmer than the evidence justified. And those same computers used to predict temperatures 100 years ahead for the IPCC have also been used to produce those weather forecasts that prove so consistently wrong.

It will be equally instructive to take note of the weather of the last several years in the U.S. and worldwide. The climate models so highly touted by the UN IPCC AR4 report of 2007 and by the catastrophic man-made global warming alarmists have not predicted the climate of the period since 2001 well at all. The outcome of the predictions is always warmer than the actuality, even when the actual raw temperature measurements have been manipulated to make them look warmer than they really are. If the climate models are not good for any particular year and not good for any particular decade, then why should we have much confidence in them for 50- to 100-year predictions? We would be foolish to expect much of them, given that there are climate factors the models clearly do not have a grasp on. The climate prediction models of the Met Office, the Hadley Centre, and the CRU of the University of East Anglia are clearly clueless. I would suggest this is because they are biased to predict an excessive effect due to CO2 emissions and because they do not yet understand the natural forces on the climate.

02 January 2010

A Mayo Clinic in Arizona Stops Treating Most Medicare Patients

The Mayo Clinic facility in Glendale, Arizona has stopped treating many Medicare patients as part of a two-year trial program.  The Mayo Clinic is known as a high-quality, relatively low-cost provider of health care and was praised by Obama in June as a model for efficient health care.  Yet the Mayo Clinic, based in Rochester, Minnesota, with a staff of 3,700 physicians and scientists, treated 526,000 patients in 2008 and lost $840 million on Medicare patients.  Spread across all 526,000 patients, that works out to an average loss of about $1,600 per patient; since many of those patients were not on Medicare, the loss per Medicare patient was higher still.

According to the March report of the Medicare Payment Advisory Commission (MedPAC), doctors made 20% less treating Medicare patients than they did caring for privately insured patients in 2007.  Of course, this is like getting an estimate of future costs for a government medical program from the CBO.  As is well established historically, the CBO estimates are always many times less than the future costs actually turn out to be.  The underpayment estimates of MedPAC are just as surely underestimates.

Socialists like to claim that Medicare is a huge success.  It would in fact be an obvious total failure, except that many of its costs have been transferred to patients on private insurance and that demographics have so far allowed it to operate as a giant Ponzi scheme.  As the Baby Boomers reach the age of 65, the program will rapidly incur costs which will exceed the current income from the Medicare payroll tax.  This will happen by 2017, or as early as 2013 in some estimates.  Some combination of higher taxes, lower payments for medical services, and rationed medical care will have to make up the shortfall in the program's ability to pay for medical services.  The day of reckoning is very near, and the present socialist Democrat health insurance reform bills are not addressing any of the real problems, except perhaps on the pathway of rationing future health care.  In many ways, the present ObamaCare bills actually add to expenses directly with taxes on medical services, and will add further with an incredible overload of bureaucracy and restrictions on innovation.

The Glendale Mayo Clinic facility has more than 3,000 patients eligible for Medicare.  They will have to pay cash to continue to see their doctors there.  The facility lost $120 million on their treatment in 2008.  Only about 50% of the cost of treating elderly primary care patients at this facility was paid by Medicare.  Medicare payments will still be accepted there in a few areas, such as specialist cardiology and neurology services and laboratory services.

Medicare covered about 45 million Americans at the end of 2008.  About 92% of family doctors participate in Medicare, but only 73% of those, or about two-thirds of all family doctors, are accepting new patients under the program.  The increased numbers of patients over 65 in the next few years will result in a further reduction in the percentage of family doctors accepting new Medicare patients.  If payments are cut back, as they likely will have to be due to the coming funding crisis for Medicare, then the percentage of doctors accepting Medicare patients will drop still further.  There is already a severe shortage of family doctors at the present time, so the situation will deteriorate catastrophically.  Just about a week ago, Congress postponed a 21.5% cut in Medicare payments to doctors for two months.

Yes, this is the farce the socialists are telling us is a great and successful program.  In fact, it is as bad as the post office, Amtrak, the ethanol from corn subsidy program, and subsidies for wind and solar power generation.  Well, except that the failure of this program will have fairly immediate deadly consequences for very large numbers of elderly Americans.  At the same time, it will most likely add still more backbreaking taxes upon younger generations, who will also be struggling under the huge increases in health care insurance premiums which ObamaCare will require them to buy for themselves.  Some very nasty intergenerational warfare is going to result.