Flashback: NASA, NOAA create global warming trend with cooked data ( and continue today )

January 28, 2010 8:15 PM MST

Are NASA and NOAA cooking temperature data to promote the theory of man-made global warming?

The cooks – er, “scientists” – at NASA’s Goddard Institute for Space Studies (GISS) have released their latest sky-is-falling temperature findings, and they show 2009 as the second-warmest year for the planet since modern record-keeping began in 1880, and 2009 temperatures in the Southern Hemisphere as the warmest since 1880.

The head chefs at NOAA’s National Climatic Data Center (NCDC), working from a slightly different recipe, reveal a 2009 that was not as well done as NASA’s, ranking it only the fifth-warmest since 1880.

Other not-so-well-done findings:

NASA cooks:

  • January 2000 to December 2009 was the warmest decade on record.
  • Surface temperatures have been trending upward 0.38 degrees per decade during the last three decades.
  • In total, average global temperatures have increased by about 1.4 degrees since 1880 (“an important number to keep in mind,” says NASA climatologist Gavin Schmidt).

NOAA chefs:

  • Global land and ocean annual surface temperatures for 2009 tied with 2006 as the fifth-warmest on record, at 1.01 degrees above the 20th century average.
  • The 2000–2009 decade is the warmest on record, with an average global surface temperature of 0.96 degrees above the 20th century average.
  • Land surface temperatures in 2009 tied with 2003 as the seventh-warmest on record, at 1.39 degrees above the 20th century average.
  • The years 2001–2008 each rank among the 10 warmest years since 1880.

How do they arrive at the numbers?

There is a major problem with the NASA and NOAA numbers, according to skeptical researchers who have dissected the data: They are inaccurate, the result of cherry-picking, computer manipulation and “best guess” interpretation.

The agency kitchens have concocted warming global temperatures using a hard-to-follow recipe of thinning reporting stations, grid-box interpolation, temperature homogenization, and algorithmic ingredients blended into a tweak-on-the-fly computer program.

Veteran meteorologist Joe D’Aleo – a long-time critic of official global-warming statistics – says NASA and NOAA are manipulating the data, calling their actions the U.S. version of last year’s Climategate scandal.

The Climategate brouhaha ensued when thousands of hacked (or whistleblower-leaked) e-mails from Britain’s Climatic Research Unit (CRU) at the University of East Anglia were uploaded to the Web last November and seen by millions of fascinated snoops. The revealing missives exposed scientific misconduct by top climate scientists and researchers, several of whom are now under investigation in Britain and the United States.

“The CRU has been ground zero for alleged scientific misconduct, but other national weather centers, organizations, universities, and the U.S. global data centers at NOAA and NASA are complicit in the misrepresentation or manipulation of data to support the supposed [global warming] consensus,” says D’Aleo, who also heads ICECAP, the International Climate and Environmental Change Assessment Project.

What warming?

NASA and NOAA have several data-manipulation tricks in their global warming cookbook. But before eyeballing their recipe, here are a few inconvenient truths that failed to make their way into the NASA and NOAA press releases.

  • According to D’Aleo: “Global temperatures peaked in the late 1990s, leveled off, and have been declining since 2001. All four official databases – NOAA’s NCDC, Hadley CRU, the University of Alabama in Huntsville (UAH), and Remote Sensing Systems (RSS) – confirm the decline.”
  • Satellite data and land-ocean data sets are diverging. Last year, NOAA announced that June global temperatures ranked the second-warmest in 130 years, while the two satellite data sources, UAH and RSS, ranked the month’s temperatures the 15th and 14th coldest, respectively, in 31 years of record-keeping.
  • In 2009, all regions of the United States were normal or below normal except for the Southwest and Florida, according to the NCDC.
  • The annual temperature in 2008 was the coolest since 1997, according to NOAA.
  • Thirty-eight of 50 states set their all-time record temperatures in the decades prior to 1960.
  • According to the NCDC, 1936 experienced the hottest overall summer on record in the continental United States. In fact, out of 50 states, 22 recorded their all-time high temperature during the 1930s.
  • The 2000s had the most benign weather, in terms of records (heat and cold), of any decade since the 1880s.
  • According to the Danish Meteorological Institute, Arctic temperatures are currently below minus 31.27 degrees, more than five degrees below normal and the lowest since 2004.
  • Last year, Chicago experienced its coolest July 8 in 118 years, and only four days during the summer reached the 90s. Six states experienced their coldest July in 115 years of record-keeping, four their second-coldest and two their third-coldest. October was the third-coldest and December the 14th-coldest in the United States in 115 years.
  • Global warming and cooling are cyclical. Data show it warmed from 1920 to 1940 and again from 1979 to 1998. But temperatures cooled from 1940 to the late 1970s, and have been cooling since 2001. Since World War II, CO2 has risen, even as temperatures have cooled, warmed and then cooled again, undermining the theory that CO2 is the single most important cause of climate change.
  • Not a single official computer model predicted the recent decline (since 2001) in global temperatures. Yet extended projections from the same models are referenced by eco-alarmists demanding draconian CO2-emission controls and the imposition of carbon taxes and cap-and-trade restrictions.

The NASA/NOAA recipe

To cook temperature data and warm the earth artificially, NASA and NOAA have whipped up a nifty recipe. Here are the not-so-secret ingredients for global warming:

1) Reduce temperature reporting stations across the globe from nearly 6,000 in 1970 to 1,500 or fewer today.

2) Drop out reporting stations in higher latitudes (colder), higher elevations (colder) and mainly rural locations (colder).

3) Cool early temperature records through data “adjustments” to create the impression of a current warming trend.

4) Fail to compensate or under-compensate for urban growth and land-use changes that can produce localized warming known as the urban heat island (UHI) effect.

5) Cherry-pick thermometers from reporting stations sited at busy airports and other warm locales (e.g. near the coast or at lower elevations).

6) Fill gaps in the shrunk-down thermometer network by estimating temperatures using a system of global grid boxes. Then “populate” the grids with thermometers stationed at lower latitudes and altitudes, or near the coast and in other warm spots.

7) If there are no temperature stations inside the grid box, use the closest station in a nearby box (for example, at the bottom of a mountain plateau or on the coast).

8) Adjust the final temperature dataset using “homogenization,” a blending process that effectively spreads a warm bias to all surrounding stations.

9) Voila, global warming made easy!
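The grid-box mechanics described in steps 6 and 7 above can be made concrete with a toy sketch. This is an illustration only: the station names, coordinates, and temperatures below are invented, and the real GISS/NCDC algorithms are far more involved than this simple nearest-station fill.

```python
# Toy illustration of grid-box averaging with nearest-station fill.
# All station data below is hypothetical, for illustration only.

def grid_average(stations, grid_boxes):
    """Average station temperatures per grid box; a box with no
    stations borrows the nearest station from elsewhere on the grid
    (a crude stand-in for the interpolation in steps 6-7)."""
    averages = {}
    for box, (lat, lon) in grid_boxes.items():
        inside = [s for s in stations if s["box"] == box]
        if inside:
            averages[box] = sum(s["temp"] for s in inside) / len(inside)
        else:
            # No station inside this box: fall back to the closest
            # station anywhere (naive squared-distance metric).
            nearest = min(stations,
                          key=lambda s: (s["lat"] - lat) ** 2 + (s["lon"] - lon) ** 2)
            averages[box] = nearest["temp"]
    return averages

stations = [
    {"box": "A", "lat": 40.0, "lon": -105.0, "temp": 10.2},  # valley station
    {"box": "A", "lat": 40.5, "lon": -105.2, "temp": 11.0},  # airport station
    {"box": "C", "lat": 35.0, "lon": -90.0,  "temp": 15.5},  # coastal station
]
grid_boxes = {"A": (40.2, -105.1), "B": (40.2, -100.0), "C": (35.0, -90.0)}

result = grid_average(stations, grid_boxes)
print(result)  # box B has no station, so it inherits a neighbor's value
```

The point of the sketch is visible in box B: it reports a temperature even though no thermometer sits inside it, and the value it reports depends entirely on which stations survive in the surrounding network.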

Another bumper cherry-picking season

With global temperatures on the decline, NOAA and NASA are forced to cherry-pick temperature data to craft their annual “man is warming the earth” press releases warning of a coming climate catastrophe.

For example:

  • NOAA collects data from only 35 sites in Canada, down from 600 in the 1970s. And only one station – in relatively warm Eureka on Ellesmere Island – measures temperature for all Canadian territory above the Arctic Circle. (The Environment Canada weather service recently told the National Post that it has 1,400 stations, including 100 north of the Arctic Circle.) The remaining thermometers are sited in warmer, lower-latitude locations – close to airports and major cities, or near the coast.
  • After 1990, NOAA tripled the number of Canadian reporting stations at lower elevations while reducing by half the number at elevations above 300 feet.
  • According to D’Aleo, “High-elevation stations have disappeared from the database. Stations in the Andes and Bolivia have vanished. Temperatures for those areas are now determined by ‘interpolation’ from stations hundreds of miles away on the coast or in the Amazon.”
  • In Russia, Moscow’s Institute of Economic Analysis (IEA) has accused Hadley CRU of data tampering. Only 25 percent of the country’s reporting stations were included in the agency’s global temperature calculations. (The same pruned dataset was used by NOAA.) According to the IEA, the temperature stations that were removed often show no substantial warming in the late 20th century and the early 21st century.

Elevated temperatures in urban areas

Many of NOAA’s temperature stations are located in urban areas, particularly busy airports, where the surrounding infrastructure often contaminates readings. Asphalt parking lots, roads, rooftops, sidewalks, buildings, air-conditioning vents, and other radiant heat sources produce the well-documented UHI effect and can “bias thermometer readings upwards by as much as 7°C (12°F),” according to a study by SurfaceStations.org.

Dr. Phil Klotzbach, a research scientist in Colorado State University’s Department of Atmospheric Science, says the surface temperatures in many urban areas are “not representative of global temperatures as much as they are of the microclimate in those locations.”

There is a “divergence” between temperatures trends observed at surface stations and those recorded by satellite in the lower troposphere, he says.

“We’re seeing more of a warm bias in the higher latitudes and over land vs. oceans [in the surface station data]. My general impression is to trust the satellite data vs. the surface data because there are fewer variables that come into play.”
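The “divergence” Klotzbach describes is simply a difference between fitted trends in two series. As a minimal sketch, an ordinary least-squares slope over each series makes the comparison concrete; the anomaly values below are invented stand-ins, not real surface or UAH/RSS data.

```python
# Compare linear trends in two temperature-anomaly series.
# The anomaly values are hypothetical, for illustration only.

def ols_slope(years, values):
    """Ordinary least-squares slope (degrees per year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = list(range(1979, 2010))
# Hypothetical: surface series warms 0.020 deg/yr, satellite 0.012 deg/yr.
surface   = [0.020 * (y - 1979) for y in years]
satellite = [0.012 * (y - 1979) for y in years]

divergence = ols_slope(years, surface) - ols_slope(years, satellite)
print(f"trend divergence: {divergence:.3f} deg/yr")
```

A persistent positive divergence of this kind is exactly what the skeptics cited here point to when arguing that the surface network carries a warm bias the satellites do not.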

Improper siting

Many of the surface temperature stations are improperly sited. A group of volunteers, working with SurfaceStations.org, has surveyed stations across the United States and found that 87 percent of the 1,221 instrument stations are “poorly” or “very poorly” sited and only 10 percent meet NOAA’s Climate Reference Network (CRN) and Cooperative Observer Program (COOP) standards.

The improper siting and UHI effect result “in a warm bias of over 1 degree centigrade,” D’Aleo says. “A warm contamination of up to 50 percent has been shown by no fewer than a dozen peer-reviewed papers, including, ironically, one by Tom Karl (1988), director of NOAA’s NCDC, and another by CRU’s Phil Jones (2009).”

Anthony Watts of SurfaceStations.org says the U.S. temperature record can’t be trusted. “And since the U.S. record is thought to be ‘the best in the world,’ it follows that the global database is likely similarly compromised and unreliable,” he concludes.

Temperature database a mess

Michael Smith, a California-based software engineer who was instrumental, along with D’Aleo, in crunching the NOAA/NASA data and exposing the temperature tampering, says the historical climate data used by both agencies is 20 years out of date and a mess.

“The ongoing maintenance of the data has been botched. The result is a structural deficit that makes it wholly unsuited to use in climate analysis,” he says. “The warming isn’t global and isn’t from CO2. It’s because we’re using thermometers at airports and in the tropics. An extraordinary hatred of mountains and other cold locations shows up in the data.”

Smith, who conducts his own research and receives no outside funding, says he is interested only in determining the accuracy of the official data — before and after it has been adjusted and “homogenized.”

“I don’t bring my own fantasies to the table … I’m not trying to tease something out of the data.”

The same unbiased, scientific approach appears to be AWOL at NASA and NOAA, where both agencies clearly seem intent on fiddling temperature data to make it support pre-ordained conclusions. What could be the motivating impulse behind such sleight of hand?

“Bright people have an amazing capacity to deceive themselves,” answers Smith. “Maybe it’s not done out of malice. Maybe these people actually believe their own B.S.”

Global warming claims not credible

In their just-published research paper, “Surface Temperature Records: Policy Driven Deception,” authors D’Aleo and Watts claim that surface-temperature data has been so widely tampered with “that it cannot be credibly asserted there has been a significant ‘global warming’ in the 20th Century.”

“The global databases are seriously flawed and can no longer be trusted to assess climate trends or rankings or validate model forecasts. And, consequently, such data should be ignored for decision-making,” they conclude.

Kirk Myers’ Examiner column appears several times weekly. Upcoming topics: the carbon-credit game and melting ice caps/sea ice.
