GISS Blocking Access To Archived Data And Hansen’s Writings

Michael Hammer has found the original uncorrupted GISS US temperature data on John Daly’s web site. Prior to GISS/USHCN perverting the data set in the year 2000, the 1930s was the hottest decade.

Jennifer Marohasy » How the US Temperature Record is Adjusted

The data was originally here:

If you try to go to that page, you get this:

Not Found

The requested URL /gistemp/graphs_v3/FigD.txt was not found on this server

So I went to the web archive, to look for an archived copy of the data on the GISS web site.

This is what you get there:

Here is what robots.txt looks like. Hansen is blocking the web archive from crawling GISS data, selected meetings, and his publications.

User-agent: *
Disallow: /calendar/
Disallow: /cgi-bin/
Disallow: /data/
Disallow: /dontgohere/
Disallow: /gfx/
Disallow: /internal/
Disallow: /lunch/
Disallow: /meetings/arctic2007/pdf/
Disallow: /meetings/arctic2007/ppt/
Disallow: /meetings/pollution2002/present/
Disallow: /meetings/pollution2005/day1/
Disallow: /meetings/pollution2005/day2/
Disallow: /meetings/pollution2005/day3/
Disallow: /meetings/pollution2005/posters/
Disallow: /meetings/lunch/
Disallow: /rp/
Disallow: /tools/modelE/call_to/
Disallow: /tools/modelE/modelEsrc/
Disallow: /tools/panoply/docs/projections/
Disallow: /tools/panoply/help/projections/
Disallow: /~crmim/publications/
Disallow: /~jhansen/
Disallow: /staff/mmishchenko/publications/
Disallow: /staff/jhansen/
Disallow: /staff/img/

User-agent: discobot
Disallow: /
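For readers unfamiliar with how this works: a compliant crawler (the Wayback Machine's included) fetches robots.txt first and skips every path it disallows. A minimal sketch using Python's standard library, with a few entries copied from the file above (the `ia_archiver` user-agent string and the example URLs are illustrative):

```python
from urllib import robotparser

# A few entries copied from the GISS robots.txt shown above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /data/
Disallow: /~jhansen/
Disallow: /staff/jhansen/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant archiving crawler would therefore skip these paths...
print(rp.can_fetch("ia_archiver", "http://www.giss.nasa.gov/data/update/"))  # False
print(rp.can_fetch("ia_archiver", "http://www.giss.nasa.gov/~jhansen/"))     # False
# ...but could still fetch paths the file does not list:
print(rp.can_fetch("ia_archiver", "http://www.giss.nasa.gov/about/"))        # True
```

Because `User-agent: *` matches every crawler, those Disallow lines shut out not just search engines but every robots.txt-respecting archiver.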



27 Responses to GISS Blocking Access To Archived Data And Hansen’s Writings

  1. Ben says:

    I think you are misreading the word “disallow”. Instead of dis-allow, read it as di-sallow

    Sallow = “an unhealthy yellowish color”.

    Cowards. Double cowards

  2. neill says:


  3. Les Johnson says:

    I think a FOI request needs to be submitted. Some of the other folders look interesting too.

  4. That’s only for robots. Nothing is hidden when using a browser.

    • I think you are completely missing the point.

    • DirkH says:

      The wayback machine and all other internet archives can only archive what their crawlers are allowed to find. Hansen keeps them from archiving versions of his graphs, making it impossible to compare versions of his website later.

      You can of course, as a human, regularly go there and save snapshots manually. But you would have to do this over and over again before anything happens in order to be able to prove the manipulations.

      • Sean says:

        I should point out here that nothing prevents a web bot from archiving the entire GISS site except the _convention_ of complying with the restrictions laid out in robots.txt. If you wanted an archive of the site, you could take the source code of a web bot, remove or comment out either the code that reads robots.txt or the code that matches URLs against its list of disallowed folders, recompile it, and point it at the GISS website to obtain a complete copy of the site.
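Concretely, nothing but convention enforces any of this: a plain HTTP client never reads robots.txt at all. A minimal sketch (hypothetical code, not any archive's actual crawler) that fetches a URL directly and keeps a dated snapshot only when the content has changed, which is exactly the manual routine DirkH describes, automated:

```python
import datetime
import hashlib
import pathlib
import urllib.request

def snapshot(url, outdir="snapshots", fetch=None):
    """Fetch `url` (urllib never consults robots.txt) and save a dated
    copy, skipping the write if identical bytes were already saved."""
    data = fetch(url) if fetch else urllib.request.urlopen(url).read()
    out = pathlib.Path(outdir)
    out.mkdir(parents=True, exist_ok=True)
    # Unchanged since an earlier snapshot: nothing new to record.
    if any(p.read_bytes() == data for p in out.iterdir()):
        return None
    digest = hashlib.sha256(data).hexdigest()[:8]
    path = out / f"{datetime.date.today().isoformat()}-{digest}.txt"
    path.write_bytes(data)
    return path
```

Run daily from cron and you accumulate the version history that the robots.txt entries deny to the Wayback Machine.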

  5. daveburton says:

    Thanks for this! Last year I looked and looked for it, without success.

    However, the data that John Daly archived appears to be newer than the data in the 1999 graph. If you compare the 1934 and 1998 peaks in the 1999 graph, you can see that 1934 was about 0.6 C warmer than 1998.

    But in that copy of the data from John Daly’s site, 1934 is only 0.25 C warmer than 1998. (1998 is 4th-warmest, behind 1934, 1921, and 1931.) So, already, by the time that John Daly archived that file, about 0.35 C of cooling had been erased. If anyone knows where to find the earlier version, in which 1934 was about 0.6 C warmer than 1998, as shown in that 1999 graph, I’d be grateful for a copy!

    Here’s another version of the data from John Daly’s site. It appears to be from later in year 2000, or perhaps early 2001, and another 0.21 C of warming has been added for the 1934-to-1998 interval, leaving 1934 just 0.04 C warmer than 1998.

    Note: In the latest version of the data (and a recent graph), 1934 is 0.078 C cooler than 1998.

    • daveburton says:

      I wrote, “In the latest version of the data (and a recent graph), 1934 is 0.078 C cooler than 1998.”

      The trend continues: in the version of the U.S. 48-State Surface Air Temperature anomaly data that I downloaded from NASA GISS today, 1934 is 0.1231 °C cooler than 1998.

      All the versions I’ve found or reconstructed are here:

  6. tckev says:

    When so much social, political, and financial profit has been invested in these debased versions of reality, it is no surprise that earlier (honest and clean) versions are hidden and protected, or even erased.

  7. John Blake says:

    Let’s hope that more knowledgeable and far smarter sources than ourselves will expose Hansen in all his glorious detail.

  8. orson2 says:

    DirkH notes: “You can of course, as a human, regularly go there and save snapshots manually. But you would have to do this over and over again before anything happens in order to be able to prove the manipulations.”

    I believe Steven Goddard has given motivated readers here just the incentive needed to do so. Thank you Steve.

  9. LTJ says:

    I’m not sure which vintage this is. But if you compare it to , there are changes.

  10. B.C. says:

    If a government worker deliberately (or even “mistakenly”) tampers with public records, that person is a criminal. I seem to recall a certain WH occupant resigning over a mere 18 minutes of “mistakenly erased reel-to-reel tape”. James “Algore’s Sockpuppet” Hansen is light years ahead of that certain WH occupant when it comes to criminally destroying and/or altering public records. We can only hope, for the sake of our children’s futures, that Hansen and the rest of his criminal cohorts are stopped and imprisoned before they completely destroy not only our economy (and take away our individual liberties), but the rest of the world’s.

    Thank you for all you do, Mr. Goddard. You will go down in history as one of the great defenders of truth and science.

  11. James says:

    robots.txt only tells search bots what not to index. It does not set permissions on files… There’s no conspiracy there, mate. The rest is interesting though.

  12. daveburton says:

    The good news is that you, Steve, apparently embarrassed GISS into revising their robots.txt to be less restrictive.

    They’d had that restrictive robots.txt file there for years. But after you posted this article on 6/11/2012, it was less than 17 days before they fixed it. Here’s what it looked like 17 days later:

    Unfortunately, the old robots.txt prevented archiving a lot of material before they changed it. Even the old robots.txt itself wasn’t archived, which is presumably why you get this error:

    It stinks to high heaven that they ever created such a robots.txt file. The obvious question is: what were they hiding?

    • daveburton says:

      Well, I spoke too soon. It turns out they tricked me (briefly).

      I downloaded all of the old NASA GISS U.S. Surface Temperature Anomaly files I could find, compared them to eliminate the duplicates, organized them into a table, and put them on my server, here: (No warranty is expressed or implied.)

      There are two interesting things to notice.

      First, see the footnote at the bottom about “time travel.”

      Really! In both 2011 and 2012, GISS had average temperature anomalies reported for the full year in August of that same year! That takes “we don’t need no stinking data!” to a whole new level.

      Second, note that there have been no copies of this data saved at archive.org since October 3, 2012 (16 months ago).

      I wondered why not. There was a new version last January. So why didn’t archive.org save it?

      It turns out that sometime between 9:25am 1/14/2013 and 5:20am 1/15/2013, GISS configured their web server to prevent archive.org from archiving anything there.

      Sometime in March, 2013, they changed their server configuration; the error seen by archive.org is now different. On 3/14/2013, one successful archive of the main page snuck through (probably while they were in the process of changing their server configuration), but, unfortunately, not the data. Since then their server has blocked every access attempt from archive.org.

      You can still view the current version in a normal web browser, but archive.org, WebCitation, and the other archiving services all fail when trying to archive the file.

      Here’s what WebCitation reported when I tried to use it to save the current Fig.D.txt:

      Your recent WebCite request has completed. Following are the results from this request:


      The caching attempt failed for the following reason: No files could be downloaded for the given URL. This is likely because
      a) The URL is incorrect,
      b) The site in question refuses connections by crawling robots, or
      c) The site in question is inaccessible from the WebCite network

      In fact, even wget on my own computer fails, with a “403 Forbidden” error! Here’s what happens:

      => `Fig.D.txt'
      Resolving ... done.
      HTTP request sent, awaiting response... 403 Forbidden
      14:27:52 ERROR 403: Forbidden.

      To download an end-of-year 2013 copy of Fig.D.txt for my table, I had to manually save it from within a web browser.

      This behavior could not be accidental. GISS has intentionally configured their web server to prevent their (our!!!) data from being archived.

      I think it will be necessary to spoof a regular browser in order to automate downloading those files.
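Spoofing a browser is straightforward: servers that filter on the User-Agent string (wget announces itself as `Wget/x.y`) will often accept a request that presents a browser-like agent instead. A hedged sketch; the URL is the one discussed in the post, and the server's actual filtering rules are an assumption:

```python
import urllib.request

# Assumed URL, per the post; the server's 403 behavior is as reported above.
URL = "http://data.giss.nasa.gov/gistemp/graphs_v3/Fig.D.txt"

# Present ourselves as an ordinary desktop browser rather than
# wget/urllib, whose default agents the server answers with 403.
req = urllib.request.Request(URL, headers={
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
})
print(req.get_header("User-agent"))  # Mozilla/5.0 (Windows NT 10.0; Win64; x64)
# data = urllib.request.urlopen(req).read()  # the actual network fetch
```

The wget equivalent is its `--user-agent` option, e.g. `wget --user-agent="Mozilla/5.0 ..." <url>`.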

      Steve, my guess is that, as a result of your 6/11/2012 blog article about GISS using robots.txt to prevent archiving of their data, GISS was ordered by somebody higher-up to stop doing that. So they cheated: they changed their robots.txt to make it appear that they no longer block archiving, but they actually just configured their web server to do the blocking instead.

      Here’s their current robots.txt:

      User-agent: *
      Disallow: /pub/
      Disallow: /outgoing/
      Disallow: /cgi-bin/
      Disallow: /gfx/
      Disallow: /modelE/transient/
      Disallow: /work/
      User-agent: msnbot
      Crawl-delay: 480
      Disallow: /cgi-bin/
      Disallow: /gfx/
      Disallow: /modelE/transient/
      Disallow: /work/
      User-agent: Slurp
      Crawl-delay: 480
      Disallow: /cgi-bin/
      Disallow: /gfx/
      Disallow: /modelE/transient/
      Disallow: /work/
      User-agent: Scooter
      Crawl-delay: 480
      Disallow: /cgi-bin/
      Disallow: /gfx/
      Disallow: /modelE/transient/
      Disallow: /work/
      User-agent: YahooSeeker/CafeKelsa
      Disallow: /
      User-agent: discobot
      Disallow: /

      See, their robots.txt now says to block their /pub and /outgoing folders from being archived (why?), but it now looks like they allow archive.org to archive most of the rest of their site, including /gistemp/. I’d bet money that’s because of the stink you made, Steve. Somebody probably issued an edict: “stop using robots.txt to block archive.org!”

      So they pulled a dirty trick. They changed robots.txt so that it no longer shows archive.org being blocked, but they configured their server to do the blocking instead. The last successful archive of Fig.D.txt by archive.org was at 3:20am EDT on 10/3/2012. Since then their server has been blocking access with “403 Forbidden” errors.

      Here’s what archive.org sees now: “Access denied.”

      This amazes me. I really am surprised at how blatant their misbehavior is. They’re absolutely shameless. I’m becoming convinced that the guys running GISS are just plain crooks. If I’d given an order that they cease blocking archive.org with robots.txt, and I subsequently discovered this subterfuge, I’d fire somebody so fast there would be skid marks on the sidewalk outside the front door where their butt hit the concrete.

      • Shazaam says:

        That has me thinking of the old saw: “Never attribute to malice that which is adequately explained by stupidity”.

        One could speculate (borrowing CAGW logic to do so) that, based upon the epic levels of stupidity evidenced at NOAA and NASA, stupidity must be caused by government climatology grants.

  13. Eliza says:

    Told ya to keep ALL the data used previously. This is time for Federal Police intervention. All GISS records should be seized and impounded. BTW Hansen ain’t there anymore; it’s dear ol’ Gavin. They would be mightily Pxxxxd off with your site and trying to prevent ANY access to previous data.

  14. omanuel says:

    These talented folks at NASA Goddard Institute for Space Studies have just confirmed serious charges made on 17 July 2013 to the Congressional Space Science and Technology Committee.
