Mind Blowing Data Tampering At Addison, New York

Our friends at NCDC have done some pretty spectacular work to hide Addison, New York’s warm past. Thermometers tell us that Addison January afternoons were about three degrees warmer in the past than they are now, and that January 1932 was about four degrees warmer than January 2006. But through the magic of data tampering, NCDC has made the January cooling disappear, and made 2006 almost as warm as 1932.

[Graph: Addison, NY January temperatures, measured vs. NCDC adjusted]

In order to make the warm January of 1932 disappear, they knocked more than four degrees off of the measured temperature.

[Graph: NCDC adjustments to the measured Addison, NY January temperatures]

If we look at the actual daily temperatures in 1932 and 2006, we can see that January 1932 was indeed an incredibly warm month, with temperatures reaching over 70 degrees on two days.

[Graph: daily temperatures at Addison, NY, January 1932 and January 2006]

Afternoon temperatures in 1932 were as much as 35 degrees warmer than in 2006.

[Graph: daily afternoon temperature difference at Addison, NY, January 1932 vs. January 2006]

In January 1932, the New York Times reported the warmest January day on record at New York City.

[Image: New York Times report of the warmest January day on record, January 14, 1932]

TimesMachine: January 14, 1932 – NYTimes.com

We see the same pattern at thousands of stations. The past cooled and the present warmed. The public is being misled about global warming by a small group of individuals at NCDC and NASA. This is straight out of Orwell.
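
For anyone who wants to check the raw and adjusted numbers themselves, here is a minimal sketch of pulling the USHCN monthly files from the ftp directory quoted in the comments below. The tmax filenames follow the pattern of the raw tarball named in the comments; the ".FLs.52j" name for the adjusted set and the station inventory filename are assumptions to be confirmed against the directory listing.

# Fetch the raw and adjusted USHCN v2.5 monthly maximum temperatures.
# Filenames other than the directory are assumptions; check the listing
# at ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/ before running.
wget ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/ushcn.tmax.latest.raw.tar.gz
wget ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/ushcn.tmax.latest.FLs.52j.tar.gz
tar -xzf ushcn.tmax.latest.raw.tar.gz
tar -xzf ushcn.tmax.latest.FLs.52j.tar.gz
# The Addison station file can then be located by grepping the station
# inventory (ushcn-v2.5-stations.txt in the same directory) for ADDISON.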

28 Responses to Mind Blowing Data Tampering At Addison, New York

  1. Gail Combs says:

    Speaking of mind blowing data tampering….

    Bruce Jenner breaks his silence about fatal crash as cops suggest he’s in the clear despite claims he was texting while driving

    * Police are now suggesting that the vehicle to blame in the deadly Saturday car crash involving Bruce Jenner was the Toyota Prius at the head of the 4-car pileup, reports say

    * Jenner was spotted back behind the wheel and again driving down the Pacific Coast Highway in his Porsche 911 GT3 RS

    * Officials now examining phone records of all four drivers to help determine circumstances surrounding crash

  2. omanuel says:

    Thank you Steven for having the talent and the courage to identify and report official deceit that ultimately threatens the lives of all Earth’s inhabitants.

  3. Marsh says:

    They say the bigger the AGW lie, the more believable it becomes, but there is a point where reality bites back and the bubble bursts; that time is not far away…

    • omanuel says:

      Steven Goddard is the one blogger with the analytical mind, courage and communications skills to “break the bubble of consensus babble.”

      Thanks, Steven aka Tony Heller!

  4. A C Osborn says:

    Steve, can you do me a favour?
    Last year I was looking at NCDC Raw and Final data and all the Estimated values.
    Unfortunately I deleted the files; can you tell me where I can get them again?
    The current NCDC data from here
    http://www.ncdc.noaa.gov/cdo-web/confirmation
    does not seem to have any “E” flags.

      • A C Osborn says:

        Thanks

        • A C Osborn says:

          Steve, what do you read them with?

        • AC, here are some options:

          PowerArchiver 6.1, 7-zip and Winzip include the gzip compression code and can decompress .gz and tar.gz files. Win-GZ can compress and decompress files in gzip format. Please note that gzip, 7-zip, PowerArchiver 6.1 and Win-GZ are freeware but you must register Winzip and PowerArchiver > 6.1 if you use them regularly.

          http://www.gzip.org/#faq4

        • A C Osborn says:

          Thanks Colorado, I have JZip and that works fine; I had just forgotten how to get into them.

        • Neal S says:

          If you are using Linux, gunzip will get the job done. For example:

          gunzip ushcn.tmin.latest.raw.tar.gz

          will produce a ushcn.tmin.latest.raw.tar file

          Then you can use ‘tar’ to extract individual files.

          tar -xvf ushcn.tmin.latest.raw.tar

          will extract roughly 1200 text files.

          If you are stuck on a Windows box, then I strongly suggest that you download Cygwin and do all these steps in a command window from Cygwin. Visit Cygwin.com. Cygwin is basically a way of running Unix/Linux programs on a Windows box.

          The result of these steps is a multitude of plain text files which can be analyzed and manipulated in a number of ways. Another fine program to then use is gawk or awk (its name comes from its creators: Aho, Weinberger and Kernighan). Awk is great for coming up with software to deal with text files and do things limited only by your imagination and computing platform.

          Linux, gunzip, tar, Cygwin, awk and gawk are all free and open source software.
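
          As a concrete illustration of the kind of thing awk can do with these files, here is an untested sketch that counts how many monthly values carry an "E" (estimated) flag, which is what A C Osborn was asking about above. The column positions (first value at column 17, each month stored as a 6-character value followed by three 1-character flags) and the USH*.tmin filename pattern are assumptions to check against the readme.txt and the actual extracted files, and estimated values appear in the adjusted (final) files rather than the raw ones.

          # Count monthly values flagged "E" (estimated) across station files.
          # Assumed layout: station ID, year, then twelve groups of a 6-char
          # value plus three 1-char flags, first value at column 17. Verify
          # against readme.txt before trusting the counts.
          awk '{
                 for (pos = 17; pos + 6 <= length($0); pos += 9)
                   if (substr($0, pos + 6, 1) == "E") n++
               }
               END { print n+0, "estimated monthly values" }' USH*.tmin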

        • Sophie says:

          I always use 7-Zip; it opens most files and is open source, so it comes with no additions or licence limits. It also has a great command-line facility, for those like me who do scripting. You may download it here: http://www.7-zip.org/
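
          For the command line, unpacking one of the archives Neal S mentions should look something like this (an untested sketch; 7-Zip deals with the .gz layer and the inner .tar in two separate steps):

          7z x ushcn.tmin.latest.raw.tar.gz
          7z x ushcn.tmin.latest.raw.tar

          The first command produces ushcn.tmin.latest.raw.tar and the second unpacks the individual station files from it.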

      • hifast says:

        From the Read Me file at: ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/

        “The composition of the network remains unchanged at 1218 stations, but the data formats and reprocessing frequency have changed. USHCN version 2.5 is now produced using the same processing system used for GHCN-Monthly version 3. This reprocessing consists of a construction process that assembles the USHCN version 2.5 monthly data in a specific source priority order (one that favors monthly data calculated directly from the latest version of GHCN-Daily), quality controls the data, identifies inhomogeneities and performs adjustments where possible.”

        Shouldn’t adjustments be made only where NECESSARY?

        • Gail Combs says:

          The composition of the network remains unchanged at 1218 stations, but the data formats and reprocessing frequency have changed. USHCN version 2.5 is now produced using the same processing system used for GHCN-Monthly version 3. This reprocessing consists of a construction process that assembles the USHCN version 2.5 monthly data in a specific source priority order (one that favors monthly data calculated directly from the latest version of GHCN-Daily), quality controls the data, identifies inhomogeneities and performs adjustments where possible.

          Except the inhomogeneities are an integral part of the weather!

          Meteorology: A Text-book on the Weather, the Causes of Its Changes, and Weather Forecasting, by Willis Isbister Milham, 1918

          The observations of temperature taken at a regular station are the real air temperature at 8am and 8pm, the highest and lowest temperatures of the preceding 12 hours, and a continuous thermograph record…. (Richard Freres thermograph) ….these instruments are located in a thermometer shelter which is ordinarily placed 6 to 10 feet above the roof of some high building in the city. At a Cooperative station the highest and lowest temperatures during a day are determined, and also the reading of the maximum thermometer just after it has been set. The purpose of taking this observation is to make sure that the maximum thermometer has been set and also to give the real air temperature at the time of observation.

          If a good continuous thermograph record for at least twenty years is available, the normal hourly temperatures for the various days of the year can be computed….

          “the average temperature for a day is found by averaging the 24 values of hourly temperature observed during that day”

          If the normals are based on twenty years of observations, it will be found that there is not an even transition from day to day, but jumps of even two or three degrees occur….

          So, as with Zeke of BEST, ‘jumps in the data’ are identified as inhomogeneities, yet Milham says these jumps of even two or three degrees occur naturally.

          Or to put it more succinctly, “inhomogeneities” = WEATHER.
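
          To make that concrete, here is a toy sketch of what a naive “jump detector” amounts to: any year-to-year change in a column of annual means larger than a fixed threshold gets flagged, whether it came from a station move or simply from weather. This is emphatically not NCDC’s actual pairwise algorithm, which compares each station against its neighbours, and the one-value-per-line annual_means.txt input file is hypothetical.

          # Toy only: flag any year-to-year change larger than 2 degrees in a
          # one-column file of annual mean temperatures. It cannot tell a
          # station move from an unusually warm or cold year.
          awk 'NR > 1 && (prev - $1 > 2 || $1 - prev > 2) {
                 printf "possible break at record %d: %.1f -> %.1f\n", NR, prev, $1
               }
               { prev = $1 }' annual_means.txt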

  5. A C Osborn says:

    Steve, can you also remind me what the TOBS column actually signifies in the current NCDC downloads? Are they saying it definitely was the temperature at the time of observation, or are they saying that is what they think it should be?
    I am having trouble rationalising the values.

  6. Don B says:

    “Fiddling temperature data is the biggest science scandal ever,” says Christopher Booker, not pulling his punches. And I think he’s right not to do so. If – as Booker, myself, and few others suspect – the guardians of the world’s land-based temperature records have been adjusting the raw data in order to exaggerate “global warming” then this is indeed a crime against the scientific method unparalleled in history.

    http://www.breitbart.com/london/2015/02/09/global-warming-so-dishonest-it-makes-enron-look-like-a-paragon-of-integrity/

  7. gator69 says:

    Four degrees?! Why not five, or ten?

    I ran an alarmist off a Forbes thread yesterday when I started showing these gross acts of data tampering. When it was tenths of a degree, they would just wave their hands and claim that historic temperature data needs adjustments, but now their hand jobs just aren’t doing it for them anymore.

  8. Dave N says:

    Looks like they used the “it can’t possibly have been that hot” algorithm to adjust 1932

  9. gregole says:

    From the post: “In order to make the warm January of 1932 disappear, they knocked more than four degrees off of the measured temperature.”

    It simply makes no sense to make such drastic adjustments cooling the past. I do not care how lame and ignorant the MSM is; they are purposely ignoring this fraud. It has gone way, way too far. Somehow this has to see the light of day.

  10. AndyG55 says:

    SG, I have repeatedly asked you to comment on the coincidence of USCRN and USHCN in this link.

    http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&datasets%5B%5D=cmbushcn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2014&month=12

    Is USHCN actually ACCURATE?

    Please make a reply.
