The US Federal Government, through its agencies NOAA and NASA, intentionally publishes false information to mislead the citizens!


NOAA and NASA manipulate data to support the political agenda behind the myth of manmade, or anthropogenic, climate change. To keep their jobs, employees across the various federal agencies have all been infected with the same incentive: publish data, tables, and charts purporting to show that we are in the hottest year ever and that if we don't tax carbon immediately we are all going to die. The national media dutifully promotes this cause, and those who believe in the cause of more taxes attack anyone who disputes the narrative, calling them names such as Flat Earther or Non-Believer, and nastier names as suits them at the time. However, there is a building problem: the citizens of the country don't see what the propaganda claims is happening, and as the disparity grows larger every year, the government and its minions in the national media get more and more desperate.

Over the past several years I have been downloading a table of global temperatures that NASA publishes each month to use in my research. This table, the Land-Ocean Temperature Index (LOTI), consists of numbers generated by a very complex computer program. To calculate it, NASA first set an arbitrary global base temperature by averaging the period from 1951 to 1980, arriving at 14.0 degrees Celsius (57.2 degrees Fahrenheit). They then determine an anomaly by taking the temperature they calculated, subtracting 14.0 from it, and multiplying the result by 100. This gives a plus or minus value around a base of 0.0, which represents 14.0 degrees Celsius (C), and produces a whole number that is apparently easier for the scientists to work with. For example, 14.5 would be an anomaly of 50 (0.5 × 100). The only problem with this method is that the base is strictly arbitrary and can be any number one wants to use; but that doesn't matter for what we're going to talk about here.
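For readers who want to check the arithmetic, the anomaly conversion just described can be sketched in a few lines of Python (the function names are mine, not NASA's; only the 14.0 degree base comes from their published method):

```python
BASE_TEMP_C = 14.0  # NASA's 1951-1980 global mean, degrees Celsius

def temp_to_anomaly(temp_c: float) -> int:
    """Convert an absolute global temperature (deg C) to a LOTI-style
    anomaly: hundredths of a degree above or below the 14.0 C base."""
    return round((temp_c - BASE_TEMP_C) * 100)

def anomaly_to_temp(anomaly: int) -> float:
    """Invert the conversion: anomaly in hundredths of a degree back to deg C."""
    return BASE_TEMP_C + anomaly / 100

# The example from the text: 14.5 C gives an anomaly of 50
print(temp_to_anomaly(14.5))  # 50
```

Running this on the example above returns 50, i.e. half a degree above the base.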

As the federal agencies try to support what the Obama administration wants to promote, they have had to resort to data manipulation so blatant that even non-technical people can see something is very wrong. One of the tricks these agencies use is to change history. They do this both in how they calculate their data and in what they show. For example, the following chart plots anomalies from three different issues of the LOTI monthly tables: the first from December 1998 (blue line), the second from October 2009 (green line), and the third from December 2014 (red line), which was the latest one available when this paper was written.

This plot was generated by breaking each series into ten-year blocks of values and then averaging each block. Doing this smooths out large changes in the monthly numbers. For example, the value for January 1980 was 30 in the LOTI issued in December 1998, but the value for the same month, January 1980, was 24 in the December 2014 report; which one was right? To my way of thinking some variances could be expected, but they should cancel out when looking at blocks of numbers; that is not the case here. One other issue of note: when I first looked at this subject, around 2005, the LOTI table went back to January 1980. Unfortunately the methodology used by NASA was not understood by me at the time, and the earlier tables were not saved. Each month, as new data was published, the current value was added to the table that had been developed. It never occurred to me that the published data was itself a variable. Why would numbers be changed constantly? For if they are, of what value are they? Now that we understand how the numbers were derived, let's analyze the chart.
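The block-averaging used for the chart works like this short Python sketch (the series here is made up purely to show the mechanics, not actual LOTI values):

```python
def block_averages(monthly_values, block_years=10):
    """Average a monthly anomaly series into consecutive ten-year blocks.

    monthly_values: flat list of monthly anomalies, oldest first.
    Returns one average per complete block of block_years * 12 months.
    """
    block_len = block_years * 12
    averages = []
    for start in range(0, len(monthly_values) - block_len + 1, block_len):
        block = monthly_values[start:start + block_len]
        averages.append(sum(block) / block_len)
    return averages

# Illustration with made-up data: two decades of flat values
series = [10] * 120 + [30] * 120
print(block_averages(series))  # [10.0, 30.0]
```

A single month that moves by a few points barely shifts a 120-month average, which is why persistent differences between block averages are the interesting signal here.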

Manipulation Chart

At this scale all three plots should lie one on top of the other, which they do not. Next, it is curious why NASA stopped publishing the anomalies prior to 1980. We can also see that the plots prior to 1950 have major swings in them, and that the plots after 1980 do as well. Lastly, why would the base temperature be calculated from the years 1951 to 1980 when there is a clear and large upward trend in the data?

The first things we'll look at are the blue and green plots, which follow each other reasonably well, the only difference being the dropping of the numbers prior to 1980. What comes to mind is that those promoting anthropogenic climate change did not want to show a decline in temperatures while carbon emissions were growing from 1860 through 1890. It would be interesting to know when this change was made; it wouldn't be a surprise if it came in the period when James E. Hansen was put in charge of the Goddard Institute for Space Studies (GISS) section of NASA.

Next, what happened before 2014 that made such a large change in the red plot? Could it be that, looking at the period from 1910 to 2000, there would be close to a full degree of upward movement in temperature? That change actually occurred sometime between October 2011 and September 2012, but LOTI tables for that period have not yet been found, so it lies somewhere in those eleven months. If we go back to the 1860 values, that almost-one-degree increase drops by a third; is this intentional?

Lastly we look at the period from 2000 to the present. Interestingly, the plot for October 2009 is higher than the plot for December 2014. The average for January 2000 to December 2009 was 62 on the October 2009 report; the same period was 55 on the December 2014 report, which is enough to make 2014 hotter. Was this done so the Obama administration could say that 2014 was the hottest year ever?

For reference, the following table is what was used to make the plot.

Manipulation Table

Analysis of Global Temperature Trends, December 2014: What's really going on with the Climate?


The analysis and plots shown here are based on the following: first, NASA-GISS temperature anomalies (converted to degrees Celsius so non-scientists can understand the plots) as shown in their LOTI table; second, James E. Hansen's Scenario B data, which is the very core of the IPCC global climate models (GCMs) and which was based on a CO2 sensitivity value of 3.0° Celsius; and lastly, a plot based on an alternative climate model designated 'PCM', based on a sensitivity value of 0.65° Celsius.

The next three paragraphs have been added to this monthly temperature plot to clear up confusion regarding the methods used in this work. That confusion is my fault for not properly explaining what is shown here.

An explanation of the alternative model designated PCM is in order, since many have interpreted it as a statistical least-squares projection of some kind, and nothing could be further from the truth. A decade ago, when I started this work, the first thing I did was look at geological temperature changes, since it is well known that the climate is not a constant; I learned that in my undergraduate climatology course in 1964. One quickly finds a clear movement in global temperatures with a roughly 1,000-year cycle going back at least 3,000 to 4,000 years. There are also well-documented 60-to-70-year cycles in the Pacific and Atlantic oceans. We also know that there are greenhouse gases such as Carbon Dioxide, and in 1979 the National Academy of Sciences (NAS) estimated that Carbon Dioxide had a sensitivity of 3.0° Celsius, plus or minus 1.5° Celsius, per doubling.

The IPCC still uses the NAS 3.0° Celsius as the sensitivity value of Carbon Dioxide, and a number in that range is required to make the IPCC GCMs work. The problem with using this value is that it leaves no room for other factors, hence the need for the infamous Hockey Stick plots of the IPCC from Mann, Bradley & Hughes in 1999. The PCM model is based on a much lower value for Carbon Dioxide, consistent with current research, which places the value between 0.65° and 1.5° Celsius per doubling of Carbon Dioxide. If the long and short movements in temperature and a lower value for Carbon Dioxide are properly analyzed and combined, a plot can be constructed that matches historical and current NASA temperature estimates very well. This is not curve fitting.

The PCM model is such a construct, and it is not based on statistical analyses of raw data. It is based on creating curves that match observations (which is real science), and those observations appear to be related to the movement of water in the world's oceans. The movements of ocean currents are well documented in the literature; all that was done here was to properly combine the separate variables into one curve, which had not been done before. Since this combined curve is an excellent predictor of global temperatures, unlike the IPCC GCMs, it appears to reflect reality a bit better than those convoluted models, which the past 19 years of no statistical warming have shown to be in error.

Now, continuing from the first paragraph: to smooth out monthly variations, a 12-month running average is used in all the plots. This information will be shown in four tables and updated each month as the new data comes in, around the middle of the month. Since a model or simulation that cannot reasonably predict what it was designed to predict is worth nothing, the information presented here definitively proves that NASA, NOAA and the IPCC just don't have a clue.

2014.12 PCM plot

The first plot, UL, shows the NASA temperature anomaly converted to degrees Celsius, in red, with a black trend line added. There has been a very clear reversal in the upward movement of global temperatures since about 2001, and neither the UN IPCC nor anyone else has an explanation for it 13 years later. Since CO2 has continued to increase, arguably at an increasing rate, this raises serious doubts about the logic programmed into all the IPCC global climate models.

The next plot, UR, also in red, shows the IPCC estimates of what the global temperature should be, based on Hansen's Scenario B, with the NASA actual temperatures subtracted from them. This plot therefore represents the deviation from what the climate "believers" KNOW the temperature should be, with a positive value indicating the IPCC values are higher than actual and a negative value indicating they are lower than actual, as measured by NASA. A black trend line is added, and we can clearly see that the deviation from expected is increasing at an increasing rate. This makes sense, since the IPCC models project increased temperatures based primarily on the increasing level of CO2 in the earth's atmosphere. Unfortunately for them, the actual temperatures from NASA are trending down (even as they try to hide the downward movement with data manipulation) because other factors are in play, so the gap between them widens each year. With 13 years of observations showing this pattern, it becomes hard to justify a continuing belief in the IPCC climate models; there is obviously something very wrong here.

The next plot, LL, shown in blue, is based on the equations in the PCM climate model described in previous papers and posts here; since it is generated by "equations", a trend line is not needed. As can be seen, the PCM (LL) and NASA (UL) trend plots are very similar. The reason is that the PCM model contains a 68.2-year cycle that moves the trend line up and then down by a total of 0.30° Celsius (currently minus 0.0070° Celsius per year), and we are now in the downward portion of that trend, which will continue until around 2035. This short cycle is clearly observed in the raw NASA data in the LOTI table going back to 1868. Then there is a long trend of 1,052.6 years, with an up-and-down swing of 1.36° Celsius (currently plus 0.0029° Celsius per year), also observed in the NASA data. Lastly there is CO2, adding about 0.005° Celsius per year, so the three basically wash out, matching the current holding pattern we are experiencing. Within a few years, however, the increasing downward pull of the short cycle will overpower the other two and we will see a drop of about 0.002° Celsius per year, increasing until around 2025 or so. After about 2035 the short cycle will have bottomed and turned up, and all three will be on the upswing again. These are all round numbers, shown here as representative values.
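To make the three-component structure concrete, here is a rough Python sketch of how such components combine. The cycle lengths, total swings, and per-year CO2 term are the figures quoted above; the sine shape and the phase years are my simplifying assumptions for illustration only, not the actual PCM equations:

```python
import math

def pcm_anomaly(year, short_phase=2001.0, long_phase=1475.0):
    """Illustrative sketch of a three-component model: a 68.2-year cycle
    with a 0.30 C peak-to-trough swing, a 1,052.6-year cycle with a
    1.36 C swing, and a small linear CO2 term (~0.005 C/yr).
    The phase years are placeholder assumptions, not fitted values."""
    short = (0.30 / 2) * math.sin(2 * math.pi * (year - short_phase) / 68.2)
    long_cycle = (1.36 / 2) * math.sin(2 * math.pi * (year - long_phase) / 1052.6)
    co2 = 0.005 * (year - 1950)  # assumed linear CO2 contribution
    return short + long_cycle + co2
```

The point of the sketch is simply that when the short cycle's downward slope roughly cancels the long cycle and CO2 terms, the sum goes flat, which is the "holding pattern" described above.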

The last plot, LR, in blue, uses the same logic as the UR plot: here we use the PCM estimates of what the global temperature should be, with the NASA actual temperatures subtracted from them. A positive value indicates the PCM values are higher than actual, and a negative value indicates they are lower. A black trend line was added, and it clearly shows that the PCM model tracks the NASA actual values very closely. In fact, since 1970 the PCM model has rarely been off by more than +/- 0.1 degrees Celsius and has an average trend of almost zero error, while the IPCC models are erratic and are now approaching an error of +0.5° Celsius above actual.

In summary, the IPCC models were designed before a true picture of the world's climate was understood. During the 1980s and 1990s, CO2 levels were going up and world temperature was also going up, so there appeared to be both correlation and causation. The mistake was looking at only a ~20-year period when the real variations in climate move in much longer cycles. Those other cycles can be observed in the NASA data, but they were ignored for some reason. By ignoring those trends and focusing only on CO2, the models will be unable to correctly plot global temperatures until they are fixed.

Lastly, the next chart shows what a plot of the PCM model looks like from the year 1000 to the year 2200. The plot matches history reasonably well and fits the current NASA-GISS LOTI data very closely. Again, this plot combines three factors: a long cycle, probably in ocean currents; a short cycle, probably related more to atmospheric effects from the ocean; and a factor for CO2 using a much smaller sensitivity value than the IPCC's. I understand that this model is not based on physics, but it is also not curve fitting; it is based on observed recurring patterns in the climate. These patterns can be modeled, and when they are, you get a plot that works better than the IPCC's GCMs. If the conditions that create these patterns do not change, then even if CO2 continues to increase to 800 ppm or 1000 ppm, this model will work into the foreseeable future. Two hundred years from now global temperatures will peak at around 15.5 to 15.7 degrees C and then be on the downside of the long cycle for the next 500 years. The overall effect of CO2 reaching levels of 1000 ppm or higher will be between 1.0 and 1.5 degrees C, about the same as that of the long cycle.

Carbon Dioxide is not capable of doing what Hansen and Gore claim!

2014-November-2
The purpose of this post is to make people aware of the errors inherent in the IPCC models so that they can be corrected.

Sir Karl Raimund Popper (28 July 1902 – 17 September 1994) was an Austrian and British philosopher and a professor at the London School of Economics. He is considered one of the most influential philosophers of science of the 20th century, and he also wrote extensively on social and political philosophy. The following quotes of his apply to this subject.

If we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories.

Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.

… (S)cience is one of the very few human activities — perhaps the only one — in which errors are systematically criticized and fairly often, in time, corrected

A Classic Example Of Climate Fraud By The Union Of Concerned Scientists


FRAUD is all they have; their theories have all been debunked!

Tony Heller, Real Climate Science

The United States is already experiencing more intense rain and snow storms.

As the Earth warms, the amount of rain or snow falling in the heaviest one percent of storms has risen nearly 20 percent on average in the United States—almost three times the rate of increase in total precipitation between 1958 and 2007.

Are Severe Rain storms, Snow storms, Drought, and Tornadoes Linked to Global Warming?

Look what these UCS crooks did. They cherry-picked the start of their graph at the minimum during the drought of the late 1950s, because the earlier years wrecked their claim. So they simply threw the other data out.

ScreenHunter_6371 Jan. 24 15.42

In fact, there is zero correlation between global temperatures and heavy US rainfall events, with all of the heaviest rain years occurring during years of below average global temperature.

ScreenHunter_6370 Jan. 24 15.35

This is blatant fraud by an organization calling themselves scientists. These crooks are worse than Bernie Madoff.


Gavin’s Hockey Stick of Data Tampering


Faking the numbers is the only thing they can do to try to save their failed climate models. History will not treat these scam artists very well.

Tony Heller, Real Climate Science

The hockey stick was created largely by altering historical data.

The graph below shows how Gavin has altered his own global surface temperatures by 0.4 C, just since 2003.

ScreenHunter_6363 Jan. 24 14.04

2003 : FigA.txt

2014 : Fig.A.txt

The shape is not accidental. By pushing most of the tampering to the beginning and the end of the interval, it makes the damage harder to detect visually, as seen in the overlay below.

ScreenHunter_6364 Jan. 24 14.18

The animation below provides a much better visualization of how much damage the tampering does.

2003-2014GISSTamperingAnimation


If the data don’t fit the model, adjust the data?


If there is one thing that climate scientists and agencies like NOAA and NASA have done very well, it is making the data fit the narrative that they want to promote. This group belongs in storytelling (propaganda), not science!

Bob Greene, JunkScience.com

New estimates of historical sea level rise rates seem to be adjusting the data to fit water input estimates rather than adjusting the input estimate to fit data. 


Super-Heated Air from Climate Science on NOAA’s “Hottest” Year


As NOAA plays games with the numbers, sometimes they get caught, at least by some of us!

Environmental Audit Committee wants to keep global temperature below 2C


Why would anyone base anything on a made-up 2C limit that no scientist will even take credit for developing!

Tallbloke's Talkshop

The level of ignorance is astonishing. These people aren’t fit to wield power. Vote them out in May.

eac-release


2014: The Most Dishonest Year on Record


With all the fudging they do with the data, who knows what the temperature really was? But in any case, the 2014 temperature is nowhere close to what the IPCC GCMs say it should be!

Tony Thomas: The Settled Science of Ignoring Facts


This matches the climate work I have been doing over the past 10 years.

Tallbloke's Talkshop

Guest post from Tony Thomas originally posted at Quadrant online.

Australia’s Academy of Science is overdue to clarify its position on global warming, but don’t expect that much-delayed document to be written in the ink of rational objectivity. Despite doubts creeping into the pronouncements of overseas counterparts, local warmists remain determined to defend the faith.

models-reality

The position of the Australian Academy of Science on global warming was last stated in August, 2010. It basically regurgitated the 2007 findings of the Intergovernmental Panel on Climate Change (IPCC) with some Australian temperature trends thrown in, from 1910, thus eliding the inconvenient 19th-century heatwaves.

But the Academy has had problems with its promised update for 2014. Kick-start funds for printing and production, undisclosed but modest, arrived from the Labor government in June, 2013.

By October, 2014, the text was finished and the project moved to the design phase, in…


Bill Gates on the need for World De-Population


This is an incredible story of how we got to this point: actually watching Bill Gates show, albeit indirectly, that there are way too many people in the world, which then implies de-population. At TED2010, Bill Gates unveiled his vision for the world's energy future, describing the need for "miracles" to avoid planetary catastrophe from CO2 and the necessary goal of zero carbon emissions globally by 2050.

Watch the first 12 minutes of this clip and listen to Gates talk about CO2 reductions and then read the rest of this review.

This perceived problem of CO2 had its beginnings in the United Nations (UN) Conference on the Human Environment, which met in Stockholm from June 5th to June 16th, 1972. What happened over the next thirty years can be directly traced to this conference! The previous video and the following discussion highlight only a few of the major events that have led many to believe that all life on earth is threatened by there being too many people, a principle first proposed by Thomas Malthus, an early English economist. In 1798 Malthus published An Essay on the Principle of Population, in which he proposed that sooner or later population growth would be checked by famine and disease, leading to what is known as a Malthusian catastrophe; later technology prevented this from happening.

The 1972 Stockholm conference led to European studies on the role of Carbon Dioxide in the environment, such as the SCOPE 13 paper, The Global Carbon Cycle, published in 1979 by the Scientific Committee On Problems of the Environment (SCOPE) in Paris. This paper projected very dire results from increased levels of Carbon Dioxide and reignited the old Malthusian catastrophe concept.

In conjunction with the European climate work, a request was made to the National Academy of Sciences (NAS) to study the issue. The completed 1979 study, now called the Charney Report, agreed that there was a problem and justified its conclusions by defining a key number needed in the science. The panel looked at the work of a young scientist at the National Aeronautics and Space Administration (NASA), James E. Hansen, took his high estimate of 4.0 C, and added 0.5 degrees C to it for uncertainty. Then they took the low estimate of 2.0 C from another scientist, Syukuro Manabe of the National Oceanic and Atmospheric Administration (NOAA), and subtracted 0.5 from it for uncertainty. Lastly they averaged the two, giving a 1.5 C low value, a 3.0 C expected value, and a 4.5 C high value as the CO2 sensitivity values, which are still used today, thirty-five years later. Hansen and Manabe were the only two whose climate models were reviewed in the Charney Report, and Hansen's paper had not been officially published at the time.
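The Charney Report arithmetic described above is simple enough to verify directly:

```python
# Charney Report sensitivity range, as described in the text
hansen_high = 4.0 + 0.5  # Hansen's high estimate plus 0.5 C uncertainty
manabe_low = 2.0 - 0.5   # Manabe's low estimate minus 0.5 C uncertainty
expected = (hansen_high + manabe_low) / 2

print(manabe_low, expected, hansen_high)  # 1.5 3.0 4.5
```

Those three numbers, 1.5, 3.0, and 4.5 degrees C, are exactly the low, expected, and high sensitivity values still in use today.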

James Edward Hansen, while at NASA, was the driver of the US government's push for control of energy. Hansen gave a presentation to the US Congress in 1988 where he showed what he thought would happen to the global climate if we did not stop putting Carbon Dioxide (CO2) into the earth's atmosphere. The original 1988 paper used three different scenarios: A, B, and C. They consisted of hypothesised future concentrations of the main greenhouse gases (CO2, CH4, CFCs, etc.) together with a few scattered volcanic eruptions: essentially a high and a low estimate bracketing the expected value (B), which Hansen specifically stated he thought the "most plausible". Hansen used the 1979 NAS report as justification for the logic used to build these three scenarios.

Shortly thereafter came the creation of the Intergovernmental Panel on Climate Change (IPCC), set up in 1988 by the United Nations (UN) at the request of two of its other organizations: the World Meteorological Organization (WMO), formed in 1950, and the United Nations Environment Programme (UNEP), set up after the Stockholm Declaration in 1972. The IPCC's mission is to provide comprehensive assessments of current scientific, technical and socio-economic information worldwide about the risk of climate change, specifically anthropogenic climate change. A key point here is that the IPCC was never charged with proving whether the anthropogenic assertion was true or not; it was only charged with determining how bad it would be, in essence assuming it was true.

The next major event was the UN Conference on Environment and Development (the Earth Summit), held in Rio de Janeiro on June 13, 1992, where 178 governments voted to adopt the program called UN Agenda 21. This was a comprehensive blueprint for creating a "sustainable" world, reaching from world governance down to local school boards and zoning boards, meaning that "every" aspect of a person's life was to be controlled by UN Agenda 21. With this program, premised on Carbon Dioxide raising world temperatures beyond the point where humans could maintain a civilization, everything needed for implementation was complete, and we were off on a quest to save the planet.

Enter Al Gore, who became interested in climate change while in Congress and was instrumental in getting Hansen funding from Congress to study the problem, known back then as global warming. Gore was very active in the environmental movement while he was Bill Clinton's VP. He continued to promote the movement after leaving office, and his documentary "An Inconvenient Truth" was released in 2006; it told a story about how the burning of fossil fuels was destroying the planet. It seemed targeted at young adults without the education to discern truth from fiction, and it was very successful in creating alarmed awareness of the subject. Unfortunately, the message in that documentary was not factually correct and appeared to be only an emotional appeal to support the regulation of carbon emissions (CO2) in some form of carbon tax.

An interesting fact: Al Gore was one of the investors who helped set up a carbon trading exchange in Chicago, the Chicago Climate Exchange (CCX), in 2003, along with a then-young Barack Obama (on the board of the major investor, the Joyce Foundation, located in Chicago). When the American Clean Energy and Security Act (HR 2454) was not passed by the US Senate in 2009, the CCX folded the following year, 2010. Gore had been very vocal on this subject, and if HR 2454 had been passed by the US Congress, Gore would have become very wealthy; so the question is whether his involvement in the movement was because he believed what he was promoting or because what he was promoting would have made him very wealthy.

This brings us to Bill Gates, whose Gates Foundation has, along with Al Gore, taken up the cause of stopping anthropogenic climate change, which they believe will cause the planet to overheat, create a mass extinction, and possibly even end human life. This movement has now taken on the look of a religion, and therefore no debate is allowed. A few years ago Bill Gates gave a presentation to a select group of supporters in which he described a simple equation showing what was needed to reduce Carbon Dioxide to save the planet.

Gates' equation is CO2 = P x S x E x C: the amount of CO2 emitted equals the number of people (P), times the services they use (S), times the energy per service (E), times the CO2 per unit of energy (C). After explaining this equation, Gates goes on to say that we have to get the CO2 value to near "zero", which means that some of these factors need to get close to zero. Gates doesn't use any numbers, and he then moves on to other subjects in the presentation.

The following numbers represent estimated 2014 values for the US, which we can plug into Gates' equation. GDP is around $18.0 trillion, there are about 320 million people in the country, we use almost 100 quads of energy, and we produce 0.0000000663 metric tons of CO2 per BTU used. Multiplying the values as shown in the following table gives about 6,525 million metric tons of CO2, which is about what the US emits at present. Now, looking at these numbers, how are we going to get 6,525 million metric tons of CO2 even close to zero?
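As a check, Gates' identity can be multiplied out with the round numbers quoted above. Note that the S and E terms cancel algebraically, so the result is just total energy times CO2 per BTU; with exactly 100 quads the product comes out near 6,600 million metric tons, in the same range as the table's figure, the small difference reflecting rounding in the energy and per-BTU estimates:

```python
# Rough 2014 US estimates quoted in the text
population = 320e6     # people (P)
gdp_dollars = 18.0e12  # total GDP, used as a proxy for services
energy_btu = 100e15    # ~100 quadrillion BTU ("quads")
co2_per_btu = 6.63e-8  # metric tons of CO2 per BTU

# Gates' identity: CO2 = P x S x E x C, where
#   S = services per person (here, GDP per person)
#   E = energy per unit of service
#   C = CO2 per unit of energy
services_per_person = gdp_dollars / population
energy_per_service = energy_btu / gdp_dollars
total_co2_tons = population * services_per_person * energy_per_service * co2_per_btu

# The intermediate terms cancel, so this is just energy_btu * co2_per_btu
print(round(total_co2_tons / 1e6), "million metric tons")
```

The cancellation is the key design point of the equation: P, S, and E are just three ways of slicing total energy use, and only the product matters.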

De-Population_Page_4

To properly look at Gates' equation we must look at how all the variables he identifies interact with each other to create Carbon Dioxide. Since this is a very simple equation, it was easy to make four tables, one for each variable, and then vary the values to see how they changed the result. Each of the following tables covers one variable of Gates' equation; for example, the first one is Population (P). We see in the first line that population is 320,090,073 (the second column), identified as 100% (the first column), which is today's number, and that using Gates' equation as shown in the table above this equals 6,526.0 million metric tons of CO2. The next line down is 98%, and each line down is reduced by 2% until we reach 80% in the last line. That represents a 20% reduction in the population, to 256,072,058 people and 5,221.8 million metric tons of CO2.
The next three tables, for S, E and C, follow the same logic, although the reduction percentage is different for each. What we have then are 20% fewer people, 30% less GDP, 40% less energy and 50% less Carbon Dioxide, if those levels can be attained. The fifth table at the bottom of the page summarizes the other four, showing that if all those objectives were achieved, Carbon Dioxide would be reduced by 83.2%, to 1,096.4 million metric tons. I think the reader can see that this draconian reduction would not be supported by the citizens.
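The combined 83.2% reduction is just the product of the four remaining fractions, which is easy to verify:

```python
# Fraction remaining after each proposed cut
factors = {
    "population": 0.80,  # 20% fewer people
    "services":   0.70,  # 30% less GDP
    "energy":     0.60,  # 40% less energy
    "co2":        0.50,  # 50% less CO2 per unit of energy
}

remaining = 1.0
for f in factors.values():
    remaining *= f  # 0.8 * 0.7 * 0.6 * 0.5 = 0.168

reduction_pct = (1 - remaining) * 100
print(round(reduction_pct, 1))  # 83.2

baseline_mt = 6526.0  # million metric tons, from the table above
print(round(baseline_mt * remaining, 1))  # 1096.4 million metric tons left
```

Because the factors multiply, even four separately painful cuts still leave 16.8% of today's emissions, nowhere near zero.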

De-Population_Page_5

The average person burns enough food in their body to release about 328.7 kg of CO2 per year; with 320 million people, that is about 105 million metric tons per year. Unfortunately, getting to zero emissions means, by definition, that there can be no people. Further, it is obvious that the number of people is the driving force in the equation. But even that level (with no economy and no energy) is far more than Gates would like, as we would still be emitting about 105 million metric tons of CO2 per year. So how does he propose to get to zero without getting rid of almost all the people?
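The breathing arithmetic is easy to verify. The 328.7 kg per person and 320 million people are the figures used above; rounding of the population accounts for small differences in the last decimal place:

```python
kg_co2_per_person_per_year = 328.7  # CO2 exhaled per person, per the text
population = 320e6                  # US population estimate used above

total_tons = kg_co2_per_person_per_year * population / 1000  # kg -> metric tons
print(round(total_tons / 1e6, 1), "million metric tons per year")
```

This is the irreducible floor of the equation: it remains even when the S, E, and C factors for the industrial economy are driven to zero.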

The globalists like Gore, Gates, Soros, and others know that it is not possible to get to zero human emissions, as these tables show. However, they do want to reduce the world population to something close to 500,000,000, a 92.9% reduction, which is not going to happen without a fight; not with 9 out of 10 people being eliminated!

The purpose of this paper and its tables is to show that it isn't possible to do what Gates and his friends say needs to be done; so what is their real motive, if not to get rid of lots of people? Or maybe, just like Jonathan Gruber, they simply think we are not smart enough to know they are trying to do something really bad?