Government climate scientists keep saying that they need larger supercomputers to improve their accuracy.
I just tested their claim by processing the entire 135-year USHCN daily temperature record on my $180 laptop with 2 GB of memory, in nine minutes and ten seconds – while simultaneously tweeting, blogging and generating spreadsheet graphs.
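For anyone who wants to see how little horsepower this kind of processing takes: the USHCN daily files are distributed in the fixed-width GHCN-Daily (.dly) layout – station ID in columns 1–11, year, month, element code, then 31 eight-character day slots (a five-character value in tenths of a degree C plus three flags), with -9999 marking missing days. Below is a minimal Python sketch of a parser for one such line; the function name and the decision to simply skip missing values are my own illustration, not a description of the exact program used here.

```python
def parse_dly_line(line):
    """Parse one fixed-width GHCN-Daily (.dly) record into usable fields.

    Layout (1-based columns): ID 1-11, year 12-15, month 16-17,
    element 18-21, then 31 groups of 8 chars (5-char value + 3 flags).
    Values are tenths of a degree C; -9999 means missing.
    """
    station = line[0:11]
    year = int(line[11:15])
    month = int(line[15:17])
    element = line[17:21]
    values = []
    for day in range(31):
        start = 21 + day * 8
        v = int(line[start:start + 5])
        if v != -9999:                 # skip missing days
            values.append(v / 10.0)    # convert tenths of deg C to deg C
    return station, year, month, element, values
```

Looping a function like this over every line of every station file, accumulating per-year sums and counts, is a few megabytes of working state at most – well within 2 GB, which is the point.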
And my temperature graph is far more accurate than any of the fraudulent graphs they make on their billion-dollar computers. The US is much cooler now than it was 80 years ago. Somehow our “top scientists” missed this.
BTW – this was done on Windows 10. I doubt it would have been possible on Windows 8 or earlier versions, because the memory management wasn’t adequate.