Study Finds The CO2 Greenhouse Effect Is Real…But Dangerous Global Warming From Rising CO2 Is Not

Before the article, I reference another article from MIT News.

“…With no feedback effects at all, the change would be just 1 degree Celsius, climate scientists agree…”  So, for ECS there is no positive feedback, and dangerous global warming exists only in models, not reality.  So, what did this article find the ECS to be?  0.7°C, or less than 1 degree Celsius.


By Kenneth Richard on 24. November 2022


German physicists claim to have experimentally demonstrated that the greenhouse effect from greenhouse gases like CO2 and CH4 is a real phenomenon, but assess the climate sensitivity to a doubling of CO2 with feedbacks at “only ECS = 0.7°C … 5.4x lower than the mean value of CMIP6 with ECS = 3.78°C.”

“The derived forcing for CO2 is in quite good agreement with some theoretical studies in the literature, which to some degree is the result of calibrating the set-up to the spectral calculations, but independently it determines and also reproduces the whole progression as a function of the gas concentration. From this we deduce a basic equilibrium climate sensitivity (without feedbacks) of ECSB = 1.05°C. When additionally assuming a reduced wing absorption of the spectral lines due to a finite collision time of the molecules this further reduces the ECSB by 10% and, thus, is 20% smaller than recommended by CMIP6 with 1.22°C.”

“Detailed own investigations also show that in contrast to the assumptions of the IPCC water vapor only contributes to a marginal positive feedback and evaporation at the earth’s surface even leads to a significant further reduction of the climate sensitivity to only ECS = 0.7°C (Harde 2017 [15]). This is less than a quarter of the IPCC’s last specification with 3°C (see AR6 [1]) and even 5.4x lower than the mean value of CMIP6 with ECS = 3.78°C.”

“The presented measurements and calculations clearly confirm the existence of an atmospheric GHE, but they also demonstrate the only small impact on global warming, which apparently is much more dominated by natural impacts like solar radiative forcing (see, e.g., Connolly et al. 2021 [16]; Harde 2022 [17]). Therefore, it is high time to stop a further indoctrination of our society with one-sided information, fake experiments, videos or reports, only to generate panic.”

Image Source: Harde and Schnell, 2022

Key to me is “…Why is albedo change important?  Because the IPCC theory of the CO2 effect on GW assumes that the earth’s albedo has been constant (or not changed much) and that CO2 (and other greenhouse gases) affect GW through Radiative Forcing.  The recent satellite data says this is not true...”

CO2 is Innocent but Clouds are Guilty.  New Science has Created a “Black Swan Event”**

2022  November 23

Charles Rotter

By Charles Blaisdell PhD ChE  

**From web sources: “… in 1697 the Dutch explorer Willem de Vlamingh discovered black swans in Australia, upending the belief” (that all swans were white) “and transforming how we understand the natural world.  …the phrase “black swan event” came to refer to an event that suddenly proves something that was previously thought to be impossible.”

(This paper is a continuation of my previous paper (1), with new data that reaches the conclusion that “CO2 is Innocent but Clouds are Guilty”.)

Part I:  CO2 is Innocent but Clouds are Guilty.

     Our tax dollars have been at work with NASA for the last 20+ years putting satellites in orbit to detect and measure the “CO2 effect” on Global Warming, GW.  After 20 years, the CERES satellite (and others) has discovered that cloud reduction is the major effect on GW for those 20 years.  Two papers published in 2021 reach this conclusion: Dübal and Vahrenholt (2) and Loeb, Gregory et al (3).

These new papers do claim some sign of a CO2 effect (and other greenhouse gases) on GW; but the papers show the dominant effect on GW for those 20 years was the cloud reduction effect (albedo reduction = warming).  This paper will show that the observed cloud reduction will account for all the GW in those 20 years and back to 1975, leaving no GW left over for the CO2 effect on GW.  Cloud reduction is albedo reduction (albedo: the reflectivity of the earth; black, 0.0, is hot and white, 1.0, is cool).  Another recently published paper (2021) by Goode et al (4), measuring earth’s albedo from earthshine reflected off the moon, also reports the same reduction in albedo as the CERES data of both Dübal and Loeb: one can only conclude that for 20 years of data the albedo change is real.

Why is albedo change important?  Because the IPCC theory of the CO2 effect on GW assumes that the earth’s albedo has been constant (or not changed much) and that CO2 (and other greenhouse gases) affect GW through Radiative Forcing.  The recent satellite data says this is not true.  Cloud cover changes are best documented at “Climate and Clouds” (5), with links to the data source at “Climate Explorer” (6).  “Climate and Clouds” concludes that cloud change only accounts for 25% of the GW.  This paper will show that an improved analysis of the “Climate and Clouds” data agrees with the CERES data of Dübal and Loeb that cloud reduction accounts for most if not all of the warming over CERES’s 20 years.  Figures 1 and 2 show a graphic representation of what Dübal and Loeb observed in the CERES data and what was expected from IPCC Radiative Forcing, RF, theory.  The shapes (slopes) of the observed and expected curves are entirely different, but the increase in the missing energy (Earth’s Energy Imbalance, EEI) is the same.  The missing energy, EEI, warms the earth through the energy balance equation:

Energy In = Energy Out + Accumulation (EEI)    Eq 1.

If the accumulation (EEI) is positive, the earth warms; if negative, the earth cools.
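Eq 1 and its sign convention can be sketched in a few lines (a minimal illustration; the flux numbers below are made-up placeholders, not CERES values):

```python
# Minimal sketch of Eq 1: Energy In = Energy Out + Accumulation (EEI).
# All values in W/m^2; the numbers below are illustrative, not measured.

def eei(energy_in: float, energy_out: float) -> float:
    """Earth's Energy Imbalance: the energy left over to warm (or cool) the earth."""
    return energy_in - energy_out

def tendency(imbalance: float) -> str:
    """Sign convention from Eq 1: positive EEI warms, negative cools."""
    if imbalance > 0:
        return "warming"
    if imbalance < 0:
        return "cooling"
    return "steady"

# Illustrative: 240.8 W/m^2 absorbed vs 240.0 W/m^2 emitted -> +0.8 W/m^2 EEI
print(tendency(eei(240.8, 240.0)))  # warming
```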

     Cloud reduction affects GW by reducing the amount of highly reflective clouds covering the earth and letting in more sunlight to warm the earth: Cloud Reduction Global Warming, CRGW.

Is Cloud Cover Changing?

     Yes.  Cloud cover changes with seasons, hemisphere, altitude, and over time.  Figure 3 shows the satellite data for cloud cover for the whole earth vs time (about 36 years).  The sinusoidal nature of the graph is a seasonal variation shown in Figure 4.  Figure 5 shows the hemispherical differences in cloud cover.  The hemispherical and seasonal variation in cloud cover is related to the tilt of the axis (23.5°) of the rotating earth favoring the northern hemisphere with more sunlight, and to the larger land mass of the northern hemisphere (land covers about 29% of the earth’s surface; of that, 68% is in the northern hemisphere and 32% in the southern hemisphere).  It will be shown later that these variables change the relative humidity, which is responsible for the sinusoidal nature of the cloud cover.

     For global warming, the change in cloud cover over years is the variable of interest.  The whole earth’s cloud cover (least squares fit from “Climate Explorer” data) vs time in Figure 5 shows a 0.075% cloud change/year.  Note the high degree of variability in Figure 5; some of this variability is theorized by Dübal and Vahrenholt (2) to be due to the AMO (Atlantic Multi-decadal Oscillation) in the northern hemisphere, a natural oscillation in ocean temperature with a period of 60-80 years and an amplitude of ±0.2°C.  (An upswing of the AMO occurred in the 1985 to 2020 range and could be related to the peak in 1997 and the flattening after 2000 in Figure 5.)  There is also a periodic swing in ocean temperature in the Pacific, the PDO (Pacific Decadal Oscillation), in the southern hemisphere (commonly known as El Niño), with a period similar to the AMO of 60-80 years and a smaller amplitude of about ±0.1°C.  The amplitude of each of these oscillations is smaller than the overall change in temperature and is not increasing over time.  The periods of the AMO and PDO seem to be opposite and may have some canceling effect on a global basis.  Further explanation of these oscillations is best left to the experts; in this paper, they are just potential noise makers in the cloud reduction data and emphasize the importance of long term data (the 36 years of cloud data may not be enough).  The 36 year cloud cover decrease of 0.75% per decade will be used in calculations of cloud-affected energy changes.

     One more variable needs to be considered in temperature vs cloud cover: time delay.  When clouds decrease, part of the sunlight falls on land and the rest on water.  Land gives its energy back to the atmosphere quickly (over days); over water the energy is stored for years.  Some have calculated up to 80 years for a step change in energy into the ocean to come to full equilibrium (20) and (21).  This time delay is another reason to use long term slope data to analyze cloud change data.  Our current 36 years of cloud data is probably not enough to complete our understanding of cloud cover and GW.  It should be noted that sea surface temperature, SST, follows air temperature closely, questioning the significance of the time delay.

Cloud Cover Change vs Temperature Change

     An empirical way of relating cloud cover to temperature is to divide the least squares fit of the temperature change by that of the cloud cover change over the 36 years of data.  Figure 6 shows both least squares fits, with the result of the ratio being -0.27°C/% cloud change.  A scatter plot in “Climate and Clouds” (5) of monthly temperature and cloud cover from the same data showed a least squares fit of -0.066°C/% cloud cover, further emphasizing the need to use long term data to better understand cloud and temperature relationships.  [The “Climate4You” (5) web site is a product of the ISCCP: “Since July 1983, ongoing variations in the global cloud cover have been monitored by The International Satellite Cloud Climatology Project (ISCCP).  This project was established as part of the World Climate Research Program (WCRP) to collect weather satellite radiance measurements …”]  The “Climate4You” ratio only accounts for 25% of the observed 0.4°C over the 20 years of CERES data.  Figure 6’s -0.27°C/% cloud cover accounts for all of the observed temperature change.
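The slope-ratio calculation behind Figure 6 can be sketched as follows.  The series below are synthetic stand-ins with the slopes quoted in the text (assumed for illustration, not the actual “Climate Explorer” data):

```python
import numpy as np

# Synthetic stand-ins for the 36-year satellite series (illustrative values only):
# temperature rising ~0.020 degC/yr, cloud cover falling ~0.075 %/yr, as in the text.
years = np.arange(1984, 2020)
rng = np.random.default_rng(0)
temp = 0.020 * (years - 1984) + rng.normal(0, 0.02, years.size)         # degC anomaly
cloud = 66.0 - 0.075 * (years - 1984) + rng.normal(0, 0.1, years.size)  # % cover

# Least-squares slope of each series vs time
temp_slope = np.polyfit(years, temp, 1)[0]    # degC per year
cloud_slope = np.polyfit(years, cloud, 1)[0]  # % cloud per year

# Ratio of the two long-term trends: degC per % cloud change
ratio = temp_slope / cloud_slope
print(round(ratio, 2))  # close to -0.27 degC per % cloud cover
```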

     Although significant, this ratio of temperature and cloud cover change is not the best way to prove the significance of cloud cover change.  The CERES data is energy data, so cloud cover change must be related to CERES energy observations.  Table 1 converts the observed albedo from Dübal (2) to energy change (Short Wave, SW, in – SW out at the Top Of the Atmosphere, TOA), shown in Figure 7.  Table 2 uses the cloud cover from the “Climate Explorer” least squares fit in Figure 5 and the Dübal “cloudy area” and “clear sky” albedo data to calculate the energy to the earth; the results are shown in Figure 7.  The comparison of the two calculations is close enough to claim: the cloud cover change can account for all the temperature change and energy change observed in the 20 years of CERES data.
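A Table 1 style conversion from an albedo change to a TOA shortwave energy change can be sketched as below.  The solar constant is the standard value; the example albedo numbers are illustrative, not Dübal’s exact figures:

```python
S0 = 1361.0  # solar constant, W/m^2 (standard value)

def sw_absorbed(albedo: float) -> float:
    """Globally averaged shortwave absorbed at TOA: (SW in - SW out) = S0/4 * (1 - albedo)."""
    return S0 / 4.0 * (1.0 - albedo)

# Illustrative albedo drop of 0.003 (e.g. 0.300 -> 0.297) over the period:
delta = sw_absorbed(0.297) - sw_absorbed(0.300)
print(round(delta, 2))  # ~1.02 W/m^2 more shortwave retained
```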

     (Note: in Table 2, Dübal observed a small (but significant) decrease in the “clear sky” albedo.  The “clear sky” albedo is the ground (land + ocean) color of the earth.  Holding the cloud change constant shows this small “clear sky” albedo change can account for 15% of the observed energy in the 20 years of CERES data.  Cloud change is the major effect on GW.)

Why a 1975 Zero for the CERES data?

     Many researchers have noticed that the temperature vs time curve since 1880 is not linear; the data better fits an exponential or 2nd-degree polynomial.  One can also use two linear equations to fit the data, as shown in Figure 8.  The intersection of the two lines is about 1975.  The lower line has a poor R^2 and accounts for about 25% of the temperature rise.  The second line has a much higher R^2 and accounts for about 75% of the rise.  We have a lot more data in the 1975 to 2020 range, so we should have a better chance of explaining GW in that range.
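The two-segment fit with a 1975 break can be sketched on synthetic data (the slopes and noise level are assumed for illustration, not the real 1880–2020 record):

```python
import numpy as np

# Synthetic temperature anomaly: a shallow pre-1975 trend and a steeper post-1975
# trend (assumed slopes), plus noise.
years = np.arange(1880, 2021)
rng = np.random.default_rng(1)
true = np.where(years < 1975, 0.002 * (years - 1880),
                0.002 * 95 + 0.018 * (years - 1975))
temp = true + rng.normal(0, 0.05, years.size)

# Fit each segment separately, splitting at the 1975 breakpoint
pre, post = years < 1975, years >= 1975
slope_pre = np.polyfit(years[pre], temp[pre], 1)[0]
slope_post = np.polyfit(years[post], temp[post], 1)[0]

# The post-1975 segment is much steeper, as described for Figure 8.
print(slope_post > 3 * slope_pre)  # True
```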

     The extrapolated data and 20 years of CERES data in Figure 7 are overlaid on Figure 8 – a good fit.  Tables 1 and 2 show that the albedo change and cloud cover change from 2001 to 2020 and from 1975 to 2020 can account for all the temperature change in each period.  CRGW is a valid theory and should be considered by the IPCC.

How did this significant change in scientific understanding occur?

     The 2021 papers by Dübal, Loeb, and Goode (and some others) verifying a 20-year change in the earth’s albedo are like a scientific “Black Swan Event”**.  The earth’s albedo and cloud cover changing over time was totally unexpected (“all swans are white”).  Albedo change being caused by cloud cover reduction was also unexpected prior to 2021.  All previous methods of measuring albedo and cloud cover showed no change.  There were modelers like Walcek (7) who predicted that if cloud cover changed, it could be as significant as the predicted greenhouse gas GW.  The effect of greenhouse gases could be measured in the lower atmosphere and was known to be saturated (already enough; more would not change GW).  The IPCC needed a theory that could account for the observed GW with constant albedo and cloud cover – that theory was Radiative Forcing, RF.  RF is a plausible theory but needed to be measured in the upper atmosphere.  NASA sent up satellites to measure the RF (along with many other things).  NASA’s satellites changed the method of measurement and the accuracy, and with 20 years of data could see the small differences in big numbers needed.  And here we are today trying to get the IPCC to look at the “Black Swan”.

     Models are also a contributing factor.  There are climate models that use scientific laws and math (like the IPCC’s Global Circulation Models, GCMs) to calculate GW, like the simple models in Tables 1 and 2.  Other models use statistical multi-variable analysis, SMVA, to predict GW.  The use of SMVAs can lead to some inaccurate conclusions.  A good multi-variable analysis designs an experimental grid to avoid confounded variables; it is difficult to do this with natural data.  In the case of GW, cloud cover, relative humidity, albedo, specific humidity, CO2, and other GHGs are all confounded with the earth’s temperature change.  Variables with high accuracy in measurement and definite trends, like CO2, will dominate in SMVAs, even if they have nothing to do with GW.  Variables with poor measurement but good trends (but which are the real effect on GW), like cloud cover, will show significance in SMVAs but not eliminate variables like CO2.  Results from an SMVA are not a proof.  The IPCC’s SMVA model has a “dog’s breakfast” of variables in its AR6 model of GW; in AR6 cloud cover is not listed, but cloud density is, as a global cooling variable.  In all fairness, AR6 was issued in 2021, the same time as the Dübal and Loeb papers – they may be looking at them now.

What is causing the reduction in Cloud Cover?

     Cloud cover is part of the earth’s water cycle: the sun’s energy evaporates water, the water vapor makes clouds, and clouds make rain.  We are looking for a disturbance in this natural cycle.

The water cycle variables that are a signature of cloud cover change:

Long term Signature of Cloud Cover Reduction

1.     Temperature increasing (less cloud cover – more sun’s energy to the earth, see Figure 8)

2.     Specific Humidity increasing (a result of higher temperature and more evaporation; the warmer atmosphere can hold more water, see Figure 9)

3.     Rain fall increasing (more energy in evaporates more water; if not used for the specific humidity increase, the water has to come back down.  A statistical increase has been observed but with very low R^2 – graph not shown)

4.     Relative humidity decreasing (the main driver of fewer clouds, which leads to the other atmospheric variables, see Figure 10 and Figure 12)

     This is a unique set of atmospheric variables only associated with cloud reduction.

Relative Humidity and Cloud Reduction

     Relative Humidity, RH, has long been associated with clouds.  Figure 11 shows a page from Walcek’s (7) 1995 report showing the decline in cloud cover vs RH observed by him and other researchers.  The trend is there, but the noise level is high.  Satellites have improved the observation.  “Climate and Clouds” (5) shows that different types of clouds form at different levels and that their formation may be triggered by things other than RH.  Particulates (aerosols) and cosmic rays have been documented as sources of cloud formation.  Even at 100% RH, air can become supersaturated and not form clouds.  All these variables are probably responsible for the noise in Figure 11; but the general trend follows RH.  Of the three categories of clouds mentioned in “Climate and Clouds” (5), the only one that showed a significant reduction over time was the “low level” clouds, cumulus clouds.  Cumulus clouds are about 28% of the total 63% cloud cover of the earth.  The other clouds only create noise in the total cloud cover data.  In the “International Satellite Cloud Climatology Project” (8), cumulus clouds were the only one of nine cloud types that showed reduction over time, see Figure 13.

     Cumulus clouds are the ones most affected by changes in RH from the earth’s surface, in that they are the first in contact with low-RH air.  The data in Figures 4, 5, and 6 contain all cloud types, but the yearly oscillations are related to similar changes in “low level” clouds and RH with time.  These oscillations can be used to make a plot of RH vs cloud cover for all the monthly data in Figure 3, producing the scatter plot in Figure 14.  The data points used in the model in Table 2 are in red.  Note that these points are within the range of the natural variation of the data.

     The data in Figure 14 can be broken down into more detail to show the difference in monthly profiles between the Northern Hemisphere (NH), the Southern Hemisphere, and the time shift, see Figure 15.  Note the expected difference in shape of the NH and Southern Hemisphere plots; in some months they cancel each other and in others complement each other, giving the overall results in Figure 4.  In Figure 15 the cloud change in the Southern Hemisphere is greater than in the NH, and the Southern Hemisphere somewhat dominates the overall cloud change.  All the plots shift with time as the relative humidity decreases.

The Missing Energy in the Earth’s EEI, Eq 1

     The missing energy in Figure 1 can go to the following places; see Table 7 for details:

·      Warm the dry air in the atmosphere (small but significant)

·      Increase the moisture in the atmosphere – the major use of EEI energy (specific humidity, Figure 9 and Table 7)

·      Increase precipitation (small)

·      Warm the land (small)

·      Warm the oceans (small, with a time delay)

The bulk of the energy goes into the water increase in the atmosphere.

    The Dübal and Loeb data can be used to estimate the temperature response to shortwave energy change: about 0.3°C per W/m^2.
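This sensitivity can be applied directly; in the sketch below the 0.3°C per W/m² value is taken from the text, while the example forcing of 1.3 W/m² is an assumed illustrative number, not a measured one:

```python
# Sketch of the sensitivity derived from the Dübal and Loeb data (~0.3 degC per
# W/m^2 of shortwave change). The forcing value below is illustrative only.
SENSITIVITY = 0.3  # degC per W/m^2, from the text

def warming_from_sw(delta_sw: float) -> float:
    """Temperature change implied by a TOA shortwave energy change."""
    return SENSITIVITY * delta_sw

# An assumed ~1.3 W/m^2 shortwave increase over 20 years would imply:
print(round(warming_from_sw(1.3), 2))  # ~0.39 degC
```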

Conclusion So Far

     There is no doubt that the albedo of the earth has changed over the last 20 years (and longer) and that this albedo change is due to cloud cover reduction (and a little “clear sky” albedo change).  The cloud cover reduction is related to relative humidity reduction.  Relative humidity reduction has been going on since 1948 (possibly longer).  The cloud reduction data (starting in 1984) has been extrapolated back to 1975.  Cloud reduction has been around for a while.  CO2 is innocent but cloud cover reduction is guilty.  Leaving the question:

Part II.  Cloud reduction affects GW but “Man” is still Guilty.

What is affecting the Relative Humidity reduction?

 The observation of relative humidity decreasing (see Figures 10 and 12) has long puzzled climate scientists.  Most climate models show specific humidity increasing (which it does) and relative humidity staying the same.  Papers by J. Taylor (9) and K. Willett (10) both note that increasing SH and decreasing RH is inconsistent with the CO2 (and other GHGs) effect on GW, with no explanation as to why.  This paper gives an explanation.

     The theory of Cloud Reduction Global Warming, CRGW, has been proposed (1): “Man’s changes to land use affect the production of low relative humidity, RH, hot air rising to where clouds could be prevented (or destroyed), thus reducing the albedo of the earth”.  This reduction in RH is triggered by a localized reduction (not an increase) in Specific Humidity, SH.  This reduction in SH occurs only on land and is overwhelmed by the increase in SH from evaporation (from oceans) due to the lower Cloud Cover, CC.  The relationship between SH, RH, and CC has a very large natural amplification factor.

  The key to CRGW is water evaporation, transpiration, or run off on land.  When water (rain or snow) falls on the land it can soak into the ground or run off.  On land, when ground water is not available the relative humidity drops.  Any man-made structure that covers the virgin land prevents water from soaking in and increases the Run Off, RO.  When water is not available for Evaporation or Transpiration, ET, the relative humidity drops.  (ET is sometimes called Evapotranspiration.)  Some man-made (anthropogenic global warming, AGW) sources of relative humidity reduction are:

·      Cities

·      Any man-made structure that covers the natural ground

·      Forest to farm land or pasture land

·      Pumping water from aquifers

·      Forest fire land change.

·      Flood water prevention like dams and levees.

·      I am sure there are others

     Figure 16 shows a very good depiction of the water cycle on earth.  Of interest to the CRGW theory is the land part showing rain fall, evaporation, transpiration, and run off, RO.  Note that the rain fall is the sum of the evaporation, transpiration, and run off.  The evaporation is from water that has soaked into the ground.  Transpiration is water that evaporates through any kind of vegetation; trees have the highest.  In the land water balance, if any one of these changes it affects the others.  An example: if the virgin land is covered with asphalt or concrete that prevents water from soaking into the ground where vegetation can evaporate it, then the water will run off and the ET will decrease.  Another example: if a forest is replaced with farm land or pasture, the forest’s ground cover no longer holds water.  The crops that replaced the forest only grow part of the year and do not have the leaf area of a tree’s many leaves and deep roots; all this decreases the ET.  A decreasing ET increases the Run Off, RO.  This change in ET sets in motion a series of events on land where ET has been restricted:

1.     RO increases, making ET decrease; this lowers the local specific humidity, SH (SH % change is another measure of ET % change).  The land-based change in RH vs SH is shown in Figure 17, showing a 21:1 ratio.  Table 6 adjusts this relationship for the whole earth, down to 6.2:1.

2.     As the ET decreases, this creates low relative humidity, RH, air with an SH change equal to the change in -RO(%) and +ET(%).  In this step the RH and SH both decrease.  Note at this point the local SH decrease is opposite the observed global increase.

3.     As the low humidity air rises (to where clouds form), the relative humidity, RH, drops even further.  Another amplification occurs: 4.58:1, the ratio of slopes in Figure 10, SH 850 mb / SH 1000 mb.

4.     The low RH air spreads around the world (mainly off shore) and reduces cloud cover, CC; this process has the smallest of the amplifications, 1.2:1, in Table 3 (dif. CC / dif. RH 850 mb).

5.     Less CC lets more of the sun’s radiation in.  Table 6 shows the product of all these amplifications to be a 34:1 change in CC per change in SH through this series of RH-change steps.

6.     More radiation warms all earth’s surfaces.  On land more radiation makes the relative humidity even lower.

7.     On the oceans the radiation increase warms the water and evaporates more water increasing the global specific humidity, SH.  This increase is greater than the local land decrease in SH resulting in the observed increase in SH.

8.     The result is the rising SH and dropping RH.  The localized short term ET effects are not seen on a yearly basis.  (Figure 1 in (1) shows city examples)
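The amplification chain in steps 1–5 can be checked arithmetically with the factors quoted above (6.2:1 from Table 6, 4.58:1 from Figure 10, 1.2:1 from Table 3); a quick sketch:

```python
# Amplification factors quoted in the numbered steps above:
rh_per_sh  = 6.2   # RH change per SH change, whole-earth adjusted (Table 6)
lift_ratio = 4.58  # SH 850 mb / SH 1000 mb slope ratio (Figure 10)
cc_per_rh  = 1.2   # cloud cover change per RH change at 850 mb (Table 3)

# Chaining the three steps multiplies their amplifications together
total = rh_per_sh * lift_ratio * cc_per_rh
print(round(total))  # ~34:1 change in CC per change in SH, matching Table 6
```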

     This list of events is better seen in Figure 18.  Figure 18 is a blowup of a small part of a Psychrometric Chart, PC, that best describes the earth’s atmosphere.  Table 3 is a list of all the least squares fit data (from Figures in this paper) that are used to show the CRGW theory is valid.  Table 4 puts this data into a “Free On-Line PC” (11) to test the fit of the actual 1975 and 2020 data to data calculated from (11).  The low difference in Table 4 shows a good fit.  The “Free On-Line Psychrometric Chart” is a good calculator for atmospheric changes.

Estimating ET changes

     ET is somewhat like cloud cover; it varies a lot from season to season, hemisphere to hemisphere, and with land mass.  What we are looking for is small changes over time (smaller than the cloud changes – remember the amplification factor).

     The easiest change in ET to explain is cities, better known as Urban Heat Islands, UHIs.  UHIs got their reputation as heat islands due to the higher temperature from lower albedo and lower water, SH.  The effect on RH was not appreciated until this paper and the previous paper.  UHI temperature, SH, and RH behavior is predicted by a PC in (1).  The UHI change in ET is related to run off, RO, increasing (if precipitation cannot soak into the ground, it runs off and is not available for ET).  The earth’s land surface is covered by 3% urban development, and about half of the population lives there.  The structures that the other half lives in also cover the earth with roof tops and driveways that do not allow water to soak in, also increasing the RO.  That gives 6% of the earth’s land mass having an effect on the RO and ET.  The amount of RO change for UHIs is hard to find data on, but a lab experiment by the U of Colorado College of Eng. (16) shows a 20%-30% increase in RO.  Table 5 uses 6% land coverage and a 25% change in RO.

     RO changes from land use changes are also hard to find.  One of the best reports on land change from satellite data is by Winkler K. et al (16), with data claiming 32% of the land has been changed by man.  Most changes were virgin land to crop or pasture; some were reclaimed pasture back to forest.  Run Off in Mississippi river basin computer simulations is documented by Tracy E. et al (18), showing a range of RO from +45% to -25% depending on what was being converted to what.  A paper by McMenemie C. (19) singles out damming rivers and putting in levees to prevent flooding as having a significant effect on ET.

     Depleting aquifers lowers the water table and reduces the water available for ET, thus making more low-RH air.  Most ground water from aquifers is recycled back as recharge, yet the earth’s aquifers are decreasing.  According to the web, “Typically, 10 to 20 percent of the precipitation that falls to the Earth enters water-bearing strata, which are known as aquifers.”  The 10 to 20 percent may not be shown accurately in Figure 16 (it is possibly part of the RO).  No data could be found on the total earth effect of lower ground water on ET or RH – it should not be insignificant.

     The study of ET is a relatively new monitoring field.  Most papers on the subject have less than 10 years of data, and the emphasis of the studies is usually water management or carbon sequestration; very little atmospheric specific humidity or relative humidity data is reported.  Some satellite data has just started (< 5 yrs).  The results of current papers are not consistent.  Some examples: Baolin Xue et al (23) show no change in 20 years from rural land-based stations in the FLUXNET dataset (as expected for non-land-change areas and non-urban data).  Samuel Zipper et al (24) have a short (4 year) but very good study of the Madison, WI, USA UHI showing a 5%/year drop in ET within a 20 km^2 area of central Madison (not a statistically significant time span).  Qingzhou Zheng et al (25) studied a whole watershed (110 km^2) in China that included forest, cropland, barren land, rivers, wetlands, and cities and showed a 7% reduction in ET in 13 years, with the main factor being urban expansion.

     This paper will use estimates of land change RO that are in the range of the published data, as shown in Table 5: 30% land coverage and a 10% RO change.  The man-made-structure RO (% of global) added to the land change RO (% of global) totals 1.3% (% of global) (or a -1.3% ET change).  It is not the intention of this paper to be an expert on ET change, only to show this change can account for all the GW at 1.3% (% of global) from 1975 to 2020.
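One possible reading of this bookkeeping is sketched below.  The structure of the calculation (multiplying each land fraction by its RO change and scaling by land’s share of the globe) is my reconstruction, not the author’s published spreadsheet, and the ~29% land fraction is an assumption:

```python
# Reconstruction (assumed, not the author's exact Table 5 arithmetic): each
# contribution is (fraction of land affected) x (RO change there), then scaled
# by land's ~29% share of the earth's surface to express it as "% of global".
LAND_FRACTION = 0.29      # land share of earth's surface (assumption)

structures = 0.06 * 0.25  # 6% of land covered by structures, ~25% RO increase
land_change = 0.30 * 0.10 # 30% of land changed by man, ~10% RO change

total_global = (structures + land_change) * LAND_FRACTION * 100  # % of global
print(round(total_global, 1))  # ~1.3 (% of global), i.e. a -1.3% ET change
```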

Explanation of the Figure 18 model of the path taken by the 1975 to 2020 climate change.

      Figure 18 is a very blown-up Psychrometric Chart showing the chain of effects (the path) to the final 1975 to 2020 observed climate change.  The parallel energy lines (all atmospheric changes occur at constant energy or a shift in energy) are established from the observed data: 33.54 kJ/kg(da) for 1975 and 35.72 kJ/kg(da) for 2020.  The starting point is the 1975 SH at 7.7 g/kg(da) on the 33.54 kJ/kg(da) energy line.  The ET change of -1.3% (above) is about a -0.1 g/kg(da) SH change, shown at 7.6 g/kg(da) in the Figure 18 model.  The 34:1 natural amplification of this SH change (through RH changes) results in the CC reduction (see Table 6) and the energy shift to the 2020 parallel energy line of 35.72 kJ/kg(da).  The hot low-RH air evaporates water (increasing SH), cools the air, and increases the RH to the 2020 end point.  This technique of tracing an energy, temperature, SH, and RH path is standard in engineering heating/cooling design.
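The energy lines in this path can be reproduced with the standard moist-air enthalpy formula h = 1.006·T + w·(2501 + 1.86·T), in kJ per kg of dry air with T in °C and w the specific humidity in kg/kg.  The 1975 point uses the values from the text; the 2020 end-point SH of ~8.3 g/kg is my assumption, since the text does not state it:

```python
# Solve the standard moist-air enthalpy formula for temperature along a given
# constant-enthalpy line: h = 1.006*T + w*(2501 + 1.86*T), kJ/kg dry air.

def temp_on_enthalpy_line(h: float, sh_g_per_kg: float) -> float:
    """Air temperature (degC) at enthalpy h (kJ/kg da) and specific humidity (g/kg da)."""
    w = sh_g_per_kg / 1000.0
    return (h - 2501.0 * w) / (1.006 + 1.86 * w)

# 1975 point from the text: h = 33.54 kJ/kg(da) at SH = 7.7 g/kg(da)
t_1975 = temp_on_enthalpy_line(33.54, 7.7)
print(round(t_1975, 1))  # ~14.0 degC, a plausible global mean surface temperature

# 2020 line: h = 35.72 kJ/kg(da); the end-point SH of ~8.3 g/kg is an assumption
t_2020 = temp_on_enthalpy_line(35.72, 8.3)
print(round(t_2020 - t_1975, 2))  # implied warming along the path
```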

Conclusions on the Effects of Relative Humidity Reduction Over Time.

     The high sensitivity of SH to RH to CC is a natural phenomenon.  The CC variation in Figure 3 is natural (affected by the tilt of the earth and the larger land mass in the NH), tracked by RH and SH.  These natural CC variations are greater than the CC reduction observed.  The natural laws used in the Psychrometric Chart show the high sensitivity of RH to SH.  Using an estimate of RO in the range of published data shows a good fit to the observed data.  This explains the rising SH and decreasing RH.

     The modelers of the 1990’s were on the right track – if clouds change, the result would be as strong as that expected from CO2.  The IPCC should evaluate the CRGW theory.

Figures and Tables

Figure 1,  Graph of what Dübal and Loeb both observed (all energy is TOA).

Figure 2,  Graph of what was expected in the 20 years of CERES data based on IPCC Radiative Forcing theory of greenhouse gases.

Figure 3, Cloud Cover over 45 years from “Climate and Clouds”(5) shows the seasonal and time reduction in global cloud cover.

Figure 4, Breaking down the Figure 3 data by average/month shows a low in cloud cover in the summer months (NH) and a high in the winter months (NH) due to the larger land mass and axis tilt in the NH vs the SH.  Later Figures show the hemispherical contributions to this Figure.  Relative Humidity from the “NOAA Physical Science Laboratory” was added to show the good fit.

Figure 5, From “Climate Explorer” (6).  Showing the difference in hemispherical cloud cover due to northern hemisphere getting more sun than the southern hemisphere.  All data is 3 year smoothed.

Figure 6, Combined satellite data for cloud cover and temperature on one graph to get a -0.27°C/% cloud cover ratio.  All data is 3 year smoothed.

Figure 7,  Observed from Albedo change and Calculated from Cloud Cover Change in the 20 years of CERES data.  A very good match.

Table 1,  Albedo Change Model from Dübal (2) data.  Extrapolation to 1975.

Table 2, Calculated energy change using Cloud change data from “Climate Explorer” and Dübal data for “cloudy area” albedo and “clear sky” albedo.

Figure 8, Dividing Temperature vs Time into two parts and overlaying SW energy change from albedo and clouds.  Good fit to 1975 to 2020 data. All data is 3 year smoothed.

Figure 8a,  The 1975 to 2020 part of Figure 8 with actual data; a least squares fit was used for calculation.  All data is 3 year smoothed.

Figure 9, Specific Humidity vs time, note the break point at 1975.  All data is 3 year smoothed

Figure 10,  Relative Humidity vs year for ground level and cloud level.  Cloud level RH much more sensitive than ground level RH.  Cloud level RH will be used.

Figure 11,  Copy of page from Walcek (7) showing the 1995 correlation of clouds and RH.

Figure 12,  Relative Humidity at 850 mb (cloud level) vs time.  Note difference between NH and SH.  Good correlation, RH has been changing for a long time.

Figure 13,  “International Satellite Cloud Climatology Project” (8) of just Cumulus cloud cover over 27 years.  Cumulus clouds were the only ones changing of 9 cloud types studied.

Figure 14,  Scatter plot of all the monthly data in Figures 3 and 12 to obtain a correlation between RH and Cloud Cover.  Red dots are data used in the Model in Table 2.

Figure 15,  Monthly plot of cloud cover in both northern hemisphere and southern hemisphere.

Figure 16.  A good diagram of the water cycle on earth from Trenberth et al (13)

Figure 17.  Relative Humidity vs Specific Humidity; the average slope, 21, will be used in Table 6 to show the natural change in RH/SH.

Figure 18,  Path of energy and Specific Humidity, SH, change that accounts for the observed 1975 to 2020 change.


1.     “Where have all the Clouds gone and why care?” Watts Up With That?

2.     “Radiative Energy Flux Variation from 2001–2020” by Hans-Rolf Dübal and Fritz Vahrenholt, Atmosphere.

3.     “Satellite and Ocean Data Reveal Marked Increase in Earth’s Heating Rate” by Norman G. Loeb, Gregory C. Johnson, Tyler J. Thorsen, John M. Lyman, Fred G. Rose and Seiji Kato, Geophysical Research Letters, 2021.

4.     “Earth’s Albedo 1998–2017 as Measured From Earthshine” by P. R. Goode, E. Pallé, A. Shoumko, S. Shoumko, P. Montañes-Rodriguez and S. E. Koonin, Geophysical Research Letters, first published 29 August 2021.

5.     “Climate and clouds”, climate4you web site, ClimateAndClouds page.

6.     Climate Explorer web site: select a monthly field, go to “Cloud Cover”, click “EUMETSAT CM-SAF 0.25° cloud fraction”, click “select field” at the top of the page, then on the next page enter latitude (-90 to 90) and longitude (-180 to 180) for the whole earth.

7.     “Clouds and relative humidity in climate models; or what really regulates cloud cover?” by Walcek, C., Technical Report, OSTI.GOV.

8.     “International Satellite Cloud Climatology Project” web page: ISCCP Climate Analysis – Part 7.

9.     “Declining Humidity Is Defying Global Warming Models” by James Taylor.

10.  “Investigating climate change’s ‘humidity paradox’” by Dr Kate Willett, World Economic Forum.

11.  “Free Online Interactive Psychrometric Chart” web site.

12.  “NOAA Physical Sciences Laboratory” web site: Monthly Mean Timeseries.

13.  “Atmospheric Moisture Transports from Ocean to Land and Global Energy Flows in Reanalyses” by Kevin E. Trenberth, John T. Fasullo and Jessica Mackaro, Journal of Climate, Volume 24, Issue 18 (2011).

14.  “Met Office Climate Dashboard”: Humidity.

15.  “Vital Signs”: Global Temperature – Climate Change: Vital Signs of the Planet.

16.  “Natural and Urban “Stormwater” Water Cycle Models” by the University of Colorado College of Engineering, TeachEngineering.

17.  “Global land use changes are four times greater than previously estimated” by Karina Winkler, Richard Fuchs, Mark Rounsevell and Martin Herold, Nature Communications.

18.  “Effects of Land Cover Change on the Energy and Water Balance of the Mississippi River Basin” by Tracy E. Twine, Christopher J. Kucharik and Jonathan A. Foley, Journal of Hydrometeorology, Volume 5, Issue 4 (2004).

19.  “Reasons for Increase in Global Mean Temperature and Climate Change” by Conor McMenemie.

20.  “How to Heat a Planet? Impact of Anthropogenic Landscapes on Earth’s Albedo and Temperature” by Mark Healey Lindfield.

21.  “Analogy 04 Ocean Time Lag” by Skeptical Science: SkS Analogy 4 – Ocean Time Lag.

22.  “Effect of groundwater pumping on the health of arid vegetative ecosystems” by Victor M. Ponce.

23.  “Global evapotranspiration hiatus explained by vegetation structural and physiological controls” by Baolin Xue et al., ScienceDirect.

24.  “Urban heat island-induced increases in evapotranspirative demand” by Samuel Zipper et al.

25.  “Effects of Urbanization on Watershed Evapotranspiration and Its Components in Southern China” by Qingzhou Zheng.

Willie Soon speaks at the University of Chicago

The article has links to both his presentation and to the slides. It is hard to see the slides in the presentation.

By Andy May

Dr. Willie Soon gave a great presentation at the Federalist Society Chapter at the University of Chicago Law School on November 18, 2022. The title of his talk is:

“The Corruption of Environmental Rulemakings at the US EPA: Climate Change, Mercury Emissions, and Air Quality” (Willie Soon, 2022)

Dr. Soon’s slide deck is excellent reading and he has kindly sent it to me, you can download it here. If you prefer to watch his presentation, you can do so on YouTube here. Soon’s presentation starts about 22:46 minutes into the video.

Soon’s key points:

  • Given the daily, seasonal, and annual range of temperatures around the Earth, the warming of the past 125 years is trivial.
  • Except for ENSO variations, the global average surface temperature has hardly changed in over 20 years.
  • Willie humorously dismantles the article on him in Wikipedia and Gavin Schmidt’s criticisms; these slides are worth the download!
  • Willie plugs the article he wrote with 23 co-authors entitled: “How much has the Sun influenced Northern Hemisphere Temperature trends? An ongoing debate.” Seriously, this is probably the best climate change article written in the last thirty years, in my humble opinion; I refer to it all the time. The bibliography alone is worth it. If you never read another climate article in your life, you should read this one. Download it here.
  • He destroys the Mercury pollution nonsense that is permeating the media. Possible spoiler, don’t drink Coca Cola!
  • Is it air pollution or weather?

Finally, President Dwight Eisenhower’s warning about “public policy [becoming] the captive of a scientific-technological elite” was correct:

“It is time to face a hard truth: the seventy-year experiment to federalize the sciences has been a failure. The task now is to prevent the Big Science cartel from further dehumanizing society and delegitimizing science. There is a second hard truth: the necessary reforms will not come from within. Rather, it will be the people and their representatives that will have to impose them. To restore science to its rightful and valuable place, break up the Big Science cartel.”(J. Scott Turner, Professor of Biology (emeritus), SUNY College of Environmental Science and Forestry, December 10, 2021)

I wish I had said that.

They Are Testing a Super Creepy “Digital Dollar” That They Plan to Introduce Soon

Monday, November 21, 2022


by Michael Snyder

November 17, 2022


Are you ready for the government to monitor what you buy and sell on a daily basis?  Because that is what could happen if you start using the new “digital dollar” that they are now testing.  Of course using the new “digital dollar” would be voluntary at first, but what if it eventually becomes mandatory?  The use of physical currency continues to decline year after year, and some governments in Europe have already taken radical measures to phase out the use of cash.  Many among the elite consider digital currencies to be the key to a whole new era of  strict governmental control over the way that we live our lives, and there would be so much potential for abuse.

On Tuesday, an extremely ambitious 12 week test of the “digital dollar” was publicly announced.  As you can see, some of the biggest companies in the financial world are participating:

Global banking giants are starting a 12-week digital dollar pilot with the Federal Reserve Bank of New York, the participants announced on Tuesday.

Citigroup Inc, HSBC Holdings Plc, Mastercard Inc and Wells Fargo & Co are among the financial companies participating in the experiment alongside the New York Fed’s innovation center, they said in a statement. The project, which is called the regulated liability network, will be conducted in a test environment and use simulated data, the New York Fed said.

When asked about his firm’s participation in the project, a Citigroup executive sounded very enthusiastic:

“Programmable US dollars may be necessary to support new business models and provide a foundation to much-needed innovations in financial settlements and infrastructure,” Tony McLaughlin, managing director for emerging payments and business development at Citigroup’s treasury and trade solutions division, said in a statement. “Projects like this, that focus on the digitization of central bank money and individual bank deposits, could be expanded to take a broader view of the opportunity.”

This is something that the Federal Reserve has been working on for a long time.


Back in January, the Fed released a “much-anticipated discussion paper” on the possibility of a “digital dollar”, and they invited the public to comment on the paper for four months…

In January, the Fed took a first step toward weighing the use of a central bank digital currency when it released its much-anticipated discussion paper and opened a four-month public comment period to receive input.

The paper said that a CBDC could streamline cross-border payments and could further enshrine and preserve the dominance of the dollar’s international role, including as the world’s reserve currency.

Obviously the Fed did not meet with too much resistance during that stage, and so now they are moving on to fully testing the “digital dollar” that they have come up with.

The fact that they are going to spend an enormous amount of time, money and energy testing this new “digital dollar” strongly indicates that they already have plans to introduce it.

The “digital dollar” would be very similar to Bitcoin and other popular cryptocurrencies.

But instead of a decentralized system, the government would control the currency and would have the ability to track every single transaction.

And as Michael Maharrey of Schiff Gold has noted, there would even be the potential for the government to “turn off” the ability of certain individuals to make purchases…

Imagine if there was no cash. It would be impossible to hide even the smallest transaction from government eyes. Something as simple as your morning trip to Starbucks wouldn’t be a secret from government officials. As Bloomberg put it in an article published when China launched its digital yuan pilot program, digital currency “offers China’s authorities a degree of control never possible with physical money.”

The government could even “turn off” an individual’s ability to make purchases.

We don’t want the government to have that much power over our lives.

Thankfully, some members of Congress are sounding the alarm.  In fact, Senator James Lankford has actually introduced a bill “which would require the U.S. Treasury to keep printing and coining money if the government issues an official digital currency”…

On September 29, Republican Senator James Lankford introduced the No Digital Dollar Act, which would require the U.S. Treasury to keep printing and coining money if the government issues an official digital currency.

Lankford said in a news release: “While some Oklahomans are open to digital currencies, many still prefer hard currency or at least the option of hard currency. There are still questions, cyber concerns, and security risks for digital money. There is no reason we can’t continue to have paper and digital money in our nation and allow the American people to decide how to carry and spend their own money. As technology advances, Americans should not have to worry about every transaction in their financial life being tracked or their money being deleted.”

Unfortunately, that sort of a bill is extremely unlikely to get through Congress.

Most of our leaders seem quite eager to explore the “possibilities” of implementing such a system.

And as we have seen over the past few years, those with authoritarian tendencies are not afraid to push the envelope to frightening extremes.

In an article for MaineWire, Steve Robinson listed several hypothetical scenarios that we could potentially see if a “digital dollar” starts being used on a widespread basis…

1.) To protest governmental limits on personal freedom, liberty activists stage a peaceful protest around the nation’s capital. That nation’s leader, wanting to quell the protest and protect his power, instructs his Minister of Economic Control to reduce the protesters’ CBDC balances by 50 percent everyday until the protest ends. The protest ends shortly after the message pings on the CBDC smartphone app.

2.) Economic growth is lagging, and the economists in the federal government suspect it is because consumer spending isn’t strong enough. People are saving their money, rather than spending it. To fix this problem, the Ministry of Economic Control announces a new year-long negative interest rate for all CBDC accounts. Unspent balances of CBDC will be reduced by 10 percent every month. As a result, no one saves, every one spends, and the economists have saved the economy.

3.) You’re at the grocery store picking up some ribeye steaks because some friends are coming over for a barbecue. When you get up to the counter, there’s a problem. The cashier says the payment isn’t going through. You check the CBDC app on your smartphone. There is an alert: “You have exceeded your monthly carbon credit usage; please remove the following items from your grocery cart in order to proceed…”

4.) You want to pick up a new firearm for hunting season, so you swing by the local sporting goods store. But when you go to transfer CBDC credits for the purchase, you’re denied. The trusty CBDC app explains: “We’ve detected activity on your social media accounts that suggests you are at risk of causing harm to yourself or others. You are prohibited from purchasing a firearm for one year.”

Once we open the door to this sort of tyranny, there is no telling where it could potentially end.

So we should strongly denounce all efforts to introduce a “digital dollar” while we still have the opportunity to do so.

Unfortunately, most of the population is still deep in a state of sleep, and so the elite are moving their agenda forward very rapidly.

About the Author: My name is Michael and my brand new book entitled “End Times” is now available on  In addition to my new book I have written six other books that are available on including “7 Year Apocalypse”, “Lost Prophecies Of The Future Of America”, “The Beginning Of The End”, and “Living A Life That Really Matters”. (#CommissionsEarned)  When you purchase any of these books you help to support the work that I am doing, and one way that you can really help is by sending copies as gifts to family and friends.  Time is short, and I need help getting these warnings into the hands of as many people as possible.

I have published thousands of articles on The Economic Collapse Blog, End Of The American Dream and The Most Important News, and the articles that I publish on those sites are republished on dozens of other prominent websites all over the globe.  I always freely and happily allow others to republish my articles on their own websites, but I also ask that they include this “About the Author” section with each article.  The material contained in this article is for general information purposes only, and readers should consult licensed professionals before making any legal, business, financial or health decisions.

I encourage you to follow me on social media on Facebook and Twitter, and any way that you can share these articles with others is definitely a great help.  These are such troubled times, and people need hope.  John 3:16 tells us about the hope that God has given us through Jesus Christ: “For God so loved the world, that he gave his only begotten Son, that whosoever believeth in him should not perish, but have everlasting life.”  If you have not already done so, I strongly urge you to invite Jesus Christ to be your Lord and Savior today.

Article cross-posted from End of the American Dream.

© 2022 NOQ Report

HOW THEY STEAL: We Can See The Phantoms!  Get Them Off The Voter Rolls!

By Joe Hoft
Published November 19, 2022 at 7:00pm

We Can See The Phantoms!  Get Them Off The Voter Rolls!

Guest post by Jay Valentine

Republicans, always late to how their voters are marginalized, now exclaim “we get it!”

“Let’s place mail-in ballot boxes at gun shows, churches and Kiwanis breakfasts.”

This is electioneering theater.

Our Leftist pals respond with two strategies:  they monitor the election apparatus in every district – knowing how many ballots they need.  They use their phantom ballot inventory for the votes to cover any Republican votes gathered at that gun show.

Republicans don’t monitor the election apparatus.  They are too honest to do it, whereas Democrats are happy to do it.

We are left with the Republicans’ new strategy of encouraging voters to vote early via mail-in ballots.  The Democrats vote early too, and they also have a slush bucket of around 4% – 6% of the ballots, ready when needed.

With 4% to 6%, perhaps as high as 15%, of voter rolls holding fake names, voters with 1900 birth dates, undeliverable addresses, people who moved out of state yet still vote, apartment buildings under construction housing scores of registered voters, jails and hotels with registered voters, Air B&Bs, and U.P.S. boxes crowded with phantoms – it comes in pretty handy in a tight race.

Here’s how it’s done.

Go to a homeless shelter, the Salvation Army or the Lefty church soup kitchen.  Register less fortunate neighbors, perhaps with a gift card.  Fill in their form, they just need to sign.  100 fellow citizens now get mail-in ballots – every election!

Need more?  Hit those colleges.

Register students.

When submitting registrations, leave out their box number.

You have created an inventory of floating ballots which, when shipped to the university dorm, are undeliverable.  The kids may not vote but, who cares, vote them yourself.  You have ballots; who cares about voters?

Wait you say!  Those ballots are returned to the Post Office!

Perhaps.  But if you are a clever entrepreneur, knowing someone will pay you $25 for every ballot, you hang around the mail room.  What do you think happens to those ballots?

Do you think the Republican Kiwanis guy in Dockers is there with gift cards for every floating ballot?  Probably not.  Who gets them?  Guess!

With almost no effort, bad guys create thousands of floating ballots – which will never be voted by their applicant.  And this is only part of one county!  These “identities” are around to vote every year – in every election.

It gets much easier.

Run a report against the state voter roll: “show all registered, active voters who never voted before 2022!”  We explain this at

These are real people, usually.  They just aren’t into civics.  Track them via the state’s voter registration system during early voting.  They probably won’t vote, so start slowly adding a few votes from your ballot inventory as you need them.

If they show up at the polls, no problem.


Tell them “sorry there Jimbo you already voted!” They may complain.

Give them a provisional ballot, then toss it in Bin 16.  When the cameras are turned off – do you notice they always seem to get turned off? – those ballots get recycled.  Not counted!

Never be greedy.  Win by a fraction of a percentage – enough so Fox News won’t let Tucker Carlson question obvious voter fraud.

It’s important to control the clock – like in a football game.  Sometimes things get out of whack as when that Trump guy ran.  He flooded the system with ballots beyond expectation.

Stop the clock!

Tell everyone to chill!  Shut the place down and don’t be rushed by citizens demanding to know who won.  Who cares?  They don’t count.  You count! Or one should say – “you are counting, thus you count.”

Take your time, like a week, maybe two, bring in extra ballots if needed.  Tie them to that guy who did not vote.  Remember that batch from the shelter – they’re good for 50 or more votes.  Add them as you need them, always tying them to a person somewhere on the voter list – a phantom.

Hapless Republicans set up mail ballot boxes at gun shows but the bad guys have an inventory of ballots they control – it makes electioneering so much simpler.  4% – 6%.  Every state!

See, anyone can do it!  And they do.  For 40 years.

This is electioneering, and the Leftists own it.  Putting mail in boxes at the Kiwanis breakfast is theater, not strategy. It’s RNC-speak for “fighting back.”

Republicans may not control that ballot inventory, but now they can see the phantoms that are tied to every one of those ballots – with technology.

If you can identify them, you can force them to be removed.  Here’s how:

Take every voter registration roll from every state and ingest the property tax database for every county – they’re on-line already.

You instantly know, with 100% fidelity, every address that is a business, vacant land, a multi-family unit, rental property, or single-family home.

You know the number of square feet and number of baths.  Who cares?

When you do “density analysis” showing an 870 square foot house, with 18 voters claiming to live in it in Harris County, Texas, with one bath, you challenge the certificate of occupancy.  Wow, Republicans would never think of that!
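The density analysis described above amounts to a join between the voter roll and the parcel records, followed by a per-address count. A minimal sketch with hypothetical records (the addresses, field layout, and plausibility threshold are all assumptions for illustration, not the actual Fractal system):

```python
from collections import Counter

# Hypothetical registration records: (voter_id, registered_address)
registrations = [
    ("V001", "101 OAK ST"), ("V002", "101 OAK ST"),
    *[(f"V9{i:02d}", "870 ELM ST") for i in range(18)],  # 18 voters at one address
]

# Hypothetical parcel records keyed by address: (property type, square feet)
parcels = {
    "101 OAK ST": ("single_family", 2100),
    "870 ELM ST": ("single_family", 870),
}

def flag_high_density(registrations, parcels, max_per_single_family=8):
    """Count registered voters per address and flag single-family parcels
    whose registration count exceeds a plausibility threshold."""
    counts = Counter(addr for _, addr in registrations)
    flags = []
    for addr, n in counts.items():
        ptype, sqft = parcels.get(addr, ("unknown", None))
        if ptype == "single_family" and n > max_per_single_family:
            flags.append((addr, n, sqft))
    return flags

print(flag_high_density(registrations, parcels))  # flags the 870 sq ft house
```

The real work is in normalizing addresses so the two datasets join cleanly; the counting itself is trivial once they do.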

You can ID that apartment building in Georgia, under construction with 25 registered voters in it – and doesn’t have a roof yet!

When the local judge tosses your affidavits, as he did, with the photo of the unbuilt building with live voters registered in it, grab a crowd and a bullhorn, stand near the cement mixers and make a scene.

Think what happens when 10,000 Americans do this at scale in 8 swing states on the same day!

Fractal technology, operating 1,000 to a million times faster than any current technology, means you can tie every address to every voter from a phone – instantly.  When your Lefty pals question you, it’s their county tax records and county voter registration records showing 120 people living at a convenience store.

Total visibility to voter rolls means you can take action.

People, we are in a political streetfight to save America.

Our Fractal Programming team is delivering disruptive technology in 15 – 20 states and voter integrity teams find phantoms by the cargo vessel load.

People file affidavits, judges toss them, Kari Lake gets screwed.

We know who they are!  We can show phantoms on your phone!

The question for the Republicans is – do you want to have a drop box at the church or do you want to measure every voter against every property tax record and when you find clearly fake people living in a construction site – you show up with a bullhorn?

If you choose the former, you choose to lose.

Perhaps it’s time for the bullhorn!

Jay Valentine can be reached at  The Fractal Election website is  The Fractal team runs the voter registration data for 15-20 states for voter integrity teams.

Joe Hoft



Joe Hoft is the twin brother of TGP’s founder, Jim Hoft, and a contributing editor at TGP. Joe’s reporting is often months ahead of the Mainstream media as was observed in his reporting on the Mueller sham investigation, the origins of COVID-19, and 2020 Election fraud. Joe was a corporate executive in Hong Kong for a decade and has years of experience in finance, IT, operations, and auditing around the world. The knowledge gained in his career gives him a unique perspective of current events in the US and globally. Joe is the author of five books. His new bestseller, ‘The Steal: Volume II – The Impossible Occurs’ is out now. It addresses the stolen 2020 Election and provides an inventory of activities that prove the 2020 Election never should have been certified for Joe Biden. It’s available at major retailers now – Please take a look and buy a copy.


Methane is saturated

Methane: Much Ado About Nothing

David Archibald

Thanks to Modtran, an online program maintained by the University of Chicago, we know that carbon dioxide’s heating effect is logarithmic.  The first 20 ppm of carbon dioxide heats the atmosphere by 1.5°C; at the current concentration of 412 ppm, each extra 100 ppm is only good for 0.1°C. Carbon dioxide is tuckered out as a greenhouse gas.
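The diminishing effect of extra CO2 can be illustrated with the widely used simplified logarithmic forcing expression, 5.35 ln(C/C0) W/m² (a textbook approximation from Myhre et al. 1998, standing in here for the article's Modtran output):

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified logarithmic CO2 radiative forcing (Myhre et al. 1998 form), W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each additional 100 ppm adds less forcing than the last:
for c in (200, 300, 412, 512):
    print(f"{c} -> {c + 100} ppm: {co2_forcing(c + 100, c):.2f} W/m^2")
```

Going from 412 to 512 ppm yields only about half the forcing of going from 200 to 300 ppm, which is the logarithmic drop-off the text describes.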

But what of methane, which is the excuse du jour for wrecking livelihoods, towns, industries and whole economies? Methane, with a half-life of nine years in the atmosphere, is carbon dioxide’s little brother in the pantheon of the satanic gases.

Witness this headline about antics in New Zealand:

We return to Modtran to see what that oracle will tell us about methane’s heating effect. This is the model output converted to degrees C:

While not as pronounced as carbon dioxide’s drop off in heating effect with concentration, the effect is still there such that at the current concentration of 1.9 ppm, each extra 0.1 ppm heats the atmosphere by 0.05°C. With the methane concentration currently rising by 0.1 ppm every 20 years, the atmosphere will get an extra 0.2°C of heating by 2100. The reader can decide whether or not he/she/it need be worried by this projection.
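The projection above is simple arithmetic, and can be reproduced directly (the 2022 start year is an assumption):

```python
# Reproduce the article's back-of-envelope methane projection:
sensitivity_per_0p1_ppm = 0.05   # degrees C per extra 0.1 ppm, from the Modtran output
rise_per_year = 0.1 / 20.0       # ppm/year (0.1 ppm every 20 years)
years = 2100 - 2022              # assumed start year for the projection

extra_ppm = rise_per_year * years                        # roughly 0.39 ppm by 2100
warming = (extra_ppm / 0.1) * sensitivity_per_0p1_ppm    # roughly 0.2 C by 2100
print(f"extra CH4: {extra_ppm:.2f} ppm, extra warming: {warming:.2f} C")
```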

But methane has only been going up at that rate for a few years. The atmospheric concentration of carbon dioxide has been measured since 1958; methane measurements only started in the mid-1980s, and this is what the data looks like:

There is a steep rise at the beginning but then from the early 1990s to 2010 the concentration went sideways for nigh on 20 years. The Cape Grim concentration is particularly flat. NASA has helpfully provided a graph of rate-of-change:

There are three years – 2000, 2001 and 2004 – in which the methane level went down. Let’s disregard the noise and look at the bigger picture: the rate of increase declined for 20 years and then went up for 20 years. A few more decades of observations might show whether or not this is cyclic.

But farms that have been going for generations might be wiped out by unnecessary concern about methane while we are waiting for that data.  So we will make a stab at the underlying science. Two factors are likely involved.

Firstly, plant productivity has been going up with the increase in the atmospheric carbon dioxide concentration. Parts of the West Australian desert now have 30% more plant matter than a scant 30 years ago. The same is true of the vast stretch of forest and tundra across northern Russia. Unless this vegetation is consumed by fire, its fate is to be a source of methane via termites or rotting. So the hand of Man is not necessarily involved in a rising methane level.

Secondly, the Sun was more active in the second half of the 20th century than it had been in the previous eleven thousand years. That stopped in 2006 with the end of the Modern Warm Period. The Sun has become less active, as shown by this graph of solar extreme ultraviolet produced by the University of Bremen:

Our current solar cycle, 25, is tracking lower than any of the previous four. The natural enemy of methane is ozone, the most reactive gas in nature. Ozone is produced in the upper atmosphere by radiation with wavelengths of 242 nanometres or less acting on oxygen. So less ozone has been produced since 2006, and this is when the atmospheric methane level stopped falling and started rising again.

Case closed. Nothing to see here. Move along. Only idiots would get hung up on such a minuscule effect that we can’t change anyway. There are real problems coming at humanity that will take all our attention. Destroying the production base in the interim will only make our situation worse.

David Archibald is the author of The Anticancer Garden in Australia

Nitrous Oxide and Climate

From the CO2 Coalition

Download the entire PDF: Nitrous Oxide

Gregory R. Wrightstone

Nitrous oxide (N2O) has now joined carbon dioxide (CO2) and methane (CH4) in the climate alarm proponents’ pantheon of anthropogenic “demon” gases. In their view, increasing concentrations of these molecules are leading to unusual and unprecedented warming and will, in turn, lead to catastrophic consequences for both our ecosystems and humanity.

Countries around the world are in the process of greatly reducing or eliminating the use of nitrogen fertilizers based on heretofore poorly understood properties of nitrous oxide. Reductions of N2O emissions are being proposed in Canada by 40 to 45 percent and in the Netherlands by up to 50 percent. Sri Lanka’s complete ban on fertilizer in 2021 led to the total collapse of their primarily agricultural economy.

To provide critically needed information on N2O, the CO2 Coalition has published an important and timely paper evaluating the warming effect of the gas and its role in the nitrogen cycle. Armed with this vital information, policymakers can now proceed to make informed decisions about the costs and benefits of mandated reductions of this beneficial molecule.

This new paper joins previous CO2 Coalition reports on other greenhouse gases, carbon dioxide and methane.

Key takeaways from the paper:

  • At current rates, a doubling of N2O would occur in more than 400 years.
  • Atmospheric warming by N2O is estimated to be 0.064°C per century.
  • Increasing crop production requires continued application of synthetic nitrogen fertilizer in order to feed a growing population.

N2O and its warming potential

The first portion of the paper is highly technical and reviews the greenhouse warming potential of N2O. Like CO2, nitrous oxide is a linear, chemically inert molecule that absorbs infrared radiation. However, N2O has a longer lifetime in the atmosphere than CH4 because it is more resistant to chemical or physical breakdown. Increasing atmospheric concentrations of N2O likely contribute some amount of warming to the Earth’s atmosphere. To assess how much warming is likely, the authors consider well-validated radiation transfer theory and available experimental evidence rather than very complex general circulation climate models, which have proven unreliable.

The current N2O concentration at sea level is 0.34 parts per million (ppm) and is increasing at a rate of about 0.00085 ppm/year. This rate of increase has been steady since 1985 with no indication of acceleration. A comparison with CO2, at a present concentration of approximately 420 ppm, is in order. For current concentrations of greenhouse gases, the radiative forcing per added N2O molecule is about 230 times larger than the forcing per added CO2 molecule. This sounds bad, but what are the facts?

The rate of increase of CO2 molecules is approximately 2.5 ppm/year, or about 3,000 times larger than the rate of increase of N2O molecules. So, the contribution of nitrous oxide to the annual increase in forcing is 230/3,000, or about 1/13 that of CO2. If the main greenhouse gases CO2, CH4 and N2O have contributed about 0.1°C/decade of the warming of the Earth observed over the past few decades, this would correspond to about 0.00064°C per year, or 0.064°C per century, of warming from N2O, an amount that is barely observable. At the present rate of increase, a doubling of the N2O concentration would take more than four centuries and, according to Figure 5 of the paper, the increase in warming would be imperceptibly small.
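The arithmetic in this paragraph is easy to check. A minimal sketch using only the figures quoted above (the per-molecule forcing ratio, the two growth rates and the current concentration); nothing here is independent data:

```python
# Back-of-the-envelope check of the N2O figures quoted above.
n2o_rate = 0.00085   # ppm/year, current N2O increase
co2_rate = 2.5       # ppm/year, current CO2 increase
forcing_ratio = 230  # forcing per added N2O molecule vs per added CO2 molecule

# CO2 molecules are being added roughly 3,000x faster than N2O molecules
rate_ratio = co2_rate / n2o_rate
print(f"CO2/N2O rate ratio: {rate_ratio:.0f}")           # ~2,900

# N2O's share of the annual forcing increase relative to CO2
n2o_share = forcing_ratio / rate_ratio
print(f"N2O forcing share vs CO2: 1/{1/n2o_share:.0f}")  # ~1/13

# Years to double N2O from 0.34 ppm at the current linear rate
years_to_double = 0.34 / n2o_rate
print(f"Years to double N2O: {years_to_double:.0f}")     # ~400
```

All three headline numbers (about 3,000, about 1/13, more than four centuries) fall straight out of the quoted inputs.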

The nitrogen cycle

Along with water and carbon, nitrogen is of key importance to plant life and the right proportion of it is critical for optimal growth. Carbon is available to plants from CO2 in the atmosphere; nitrogen must be made available in the soil. To this end various microorganisms and plant species, with the aid of symbiotic microorganisms, fix diatomic nitrogen (N2) from the atmosphere into the soil, where it enters complicated cycles of nitrogen-containing compounds that can move more or less freely in soil and serve many plants. Through the activity of microorganisms (recent work shows that archaea are of comparable importance to bacteria) the nitrogen cycle ends by releasing N2, and to a much lesser extent N2O, back into the atmosphere. Because of losses to the atmosphere and leaching to waterways, soil nitrogen needs to be replenished continuously to optimize plant growth.

Agricultural and natural vegetative growth contribute comparable amounts to the nitrogen cycle. Optimum crop growth requires large amounts of nitrogen. Some nitrogen is provided by animal manure and decaying plants. However, these sources of nitrogen are insufficient for the needs of agriculture to feed a growing world population.

Figure 14 from the paper compares the relationship between the increasing use of artificial nitrogen fertilizer and the increasing yields of various crops in the U.S. from 1866 onward. The strong correlation between nitrogen fertilization and crop yields is striking. Figure 13 shows a similar correspondence worldwide between the use of nitrogen fertilizer and the yield of cereal crops. Of course, changes in complicated processes cannot be ascribed to a single cause. Also of considerable importance in crop production are other mineral fertilizers like phosphorus and potassium, better plant varieties like hybrid corn and increasing concentrations of atmospheric CO2. However, the crucial role of nitrogen fertilizers in tremendously increasing crop yields is unmistakable.

Figure 14 – Crop yields for corn, wheat, barley, grass hay, oats and rye in the United States.

Figure 13 – Annual world production of nitrogen fertilizer used in agriculture (blue, in Tg)
and world production of all cereal crops (orange, in Gigatonnes) from 1961 to 2019

Feeding a world population that is growing at a rate of 1.1 percent per year is no trivial matter. Devastating famines from the past have been kept at bay during the last century by the fundamental scientific developments noted above. At the moment many governments, under the influence of “green” pressure groups, exhibit a dangerous inclination to limit the use of nitrogen fertilizers to move farmers “back to nature” in order to save the world from “climate disaster.” In the Netherlands, the government is considering forcing large numbers of farmers out of business to supposedly prevent catastrophic warming from N2O emissions. As this new paper shows, N2O emissions will have a trivial effect on temperature increases. Farmers themselves, not government bureaucrats, should determine the optimum amounts of nitrogen fertilizer to maximize crop yields.

Agriculture free of artificial fertilizers, despite being highly labor-intensive and producing very low yields, may be feasible for a small niche of the world population willing and able to pay for it. However, it is inconceivable that the growing masses, or even the current world population, can be fed without the intelligent, science-based use of nitrogen and other fertilizers.

“Green” illusions cannot feed billions of people.

Wheat with and without nitrogen fertilizer – Deli Chen – University of Melbourne

Ukraine, a Ponzi Scheme, and a Top Democrat Donor Raise Serious Questions



By Bonchie | 4:00 PM on November 13, 2022


As RedState reported, crypto-exchange FTX collapsed after its much-lauded founder, Sam Bankman-Fried, appeared to make improper transfers of customer money. Somewhere between $1 billion and $2 billion of that money has now gone missing, and Bankman-Fried has also disappeared.

What makes this so interesting, though, isn’t just that a lot of really wealthy people got scammed. It’s that Bankman-Fried also happens to be one of the top donors to the Democratic Party. In fact, outside of George Soros, no one has done more to bankroll Democrat efforts since the 2020 election. Joe Biden alone received a whopping $5.2 million.

But here’s where things get even weirder. Apparently, while the United States was bankrolling Ukraine and its war effort, that country’s leaders were investing money into FTX.

It was also revealed that FTX had partnered with Ukraine to process donations to their war efforts within days of Joe Biden pledging billions of American taxpayer dollars to the country. Ukraine invested into FTX as the Biden administration funneled funds to the invaded nation, and FTX then made massive donations to Democrats in the US.

There are so many questions that arise from this. For example, why is Ukraine, which we are all assured is broke and needs US taxpayer money, playing around with a Democrat-linked crypto company? This wasn’t just about accepting donations through the portal. The report specifically says that Ukraine actively invested money in FTX.

While that was happening, FTX’s founder was handing out tens of millions of dollars, from the Bahamas, to help elect Democrats back in the United States. That is one of the shadiest things I’ve ever witnessed in politics.

Yes, the chain of custody regarding the funds involved is tough to know. When and where money was sent is something only an investigation of FTX’s internal operation can ascertain. Still, the appearances here are just horrific. Were Democrats funneling taxpayer money to Ukraine, only for some of it to be sent to FTX so it could be funneled back to Democrat campaigns? That’s a question that must be answered, and any attempt to gloss over it will raise major red flags.

I don’t think I’m going out on a limb by suggesting that if another company had been scamming people while bankrolling the Republican Party, it would be major news. There would be calls for investigations as far as the eye could see to figure out whether Republican politicians were using that company as a passthrough to avoid campaign finance laws. Never mind that simply receiving funds from a Ponzi scheme, even without ill intent, is really bad on its own.

This entire situation stinks to high heaven. It appears that Republicans will end up taking the House of Representatives. When that becomes official, GOP members need to dive headfirst into this and figure out what in the world happened. Because having a Democrat mega-donor get exposed like this while also having Ukraine tied up in the mix is too much to ignore.

Front-page contributor for RedState. Visit my archives for more of my latest articles and help out by following me on Twitter @bonchieredstate.


The Dirty Secrets inside the Black Box Climate Models

By Greg Chapman
“The world has less than a decade to change course to avoid irreversible ecological catastrophe, the UN warned today.” – The Guardian, Nov 28, 2007
“It’s tough to make predictions, especially about the future.” – Yogi Berra
Global extinction due to global warming has been predicted more times than climate activist Leo DiCaprio has traveled by private jet. But where do these predictions come from? If you thought they were just calculated from the simple, well-known relationship between CO2 and solar energy spectrum absorption, you would expect to see only about a 0.5°C increase from pre-industrial temperatures as a result of CO2 doubling, due to the logarithmic nature of the relationship.
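The logarithmic relationship referred to here can be sketched with the widely quoted simplified forcing expression dF = 5.35·ln(C/C0); the coefficient and formula are a standard illustration, an assumption here rather than the article’s own absorption model, and the point is only that each doubling adds the same increment:

```python
import math

# Illustrative sketch of the logarithmic CO2-forcing relationship.
# dF = 5.35 * ln(C/C0) W/m^2 is the commonly quoted simplified expression,
# used here purely to show the shape of the curve (an assumption, not the
# article's own calculation).

def co2_forcing(c, c0=280.0):
    """Radiative forcing (W/m^2) relative to a pre-industrial baseline c0 (ppm)."""
    return 5.35 * math.log(c / c0)

for c in (280, 420, 560, 1120):
    print(f"{c:>5} ppm -> {co2_forcing(c):5.2f} W/m^2")
# Each successive doubling (280->560, 560->1120) adds the same ~3.7 W/m^2,
# so every extra ppm of CO2 matters less than the one before it.
```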
Figure 1: Incremental warming effect of CO2 alone [1]
The runaway 3–6°C and higher temperature increase model predictions depend on coupled feedbacks from many other factors, including water vapour (the most important greenhouse gas), albedo (the proportion of energy reflected from the surface: more or less ice or cloud means more or less reflection) and aerosols, to mention just a few, which theoretically may amplify the small incremental CO2 heating effect. These interrelationships are too complex to calculate directly, so the only way to make predictions is with climate models.
The purpose of this article is to explain to the non-expert how climate models work, rather than to focus on the issues underlying the actual climate science, since the models are the primary ‘evidence’ used by those claiming a climate crisis. The first problem, of course, is that no model forecast is evidence of anything. It’s just a forecast, so it’s important to understand how the forecasts are made, the assumptions behind them and their reliability.
How do Climate Models Work?
In order to represent the earth in a computer model, a grid of cells is constructed from the bottom of the ocean to the top of the atmosphere. Within each cell, the component properties, such as temperature, pressure, solids, liquids and vapour, are uniform.
The size of the cells varies between models and within models. Ideally, they should be as small as possible, as properties vary continuously in the real world, but the resolution is constrained by computing power. Typically, the cell footprint is around 100×100 km, even though there is considerable atmospheric variation over such distances, requiring each of the physical properties within the cell to be averaged to a single value. This introduces an unavoidable error into the models even before they start to run.
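One way to see why averaging a cell to a single value introduces error before the model even runs: outgoing radiation scales as T⁴ (Stefan–Boltzmann), so the flux from an averaged cell is not the average of the fluxes. A toy illustration with two invented temperatures inside one cell:

```python
# Averaging error in miniature: the mean of T^4 is not (mean T)^4.
# The two temperatures are invented for illustration.
sigma = 5.67e-8                      # Stefan-Boltzmann constant, W/m^2/K^4
temps = [260.0, 300.0]               # two halves of one 100x100 km cell, K

true_flux = sum(sigma * T**4 for T in temps) / 2   # average of the fluxes
avg_T = sum(temps) / 2
model_flux = sigma * avg_T**4                      # flux of the averaged cell

print(f"true {true_flux:.1f} vs averaged {model_flux:.1f} W/m^2")
# The averaged cell under-radiates. Small per cell, but it is in every cell.
```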
The number of cells in a model varies, but the typical order of magnitude is around 2 million.
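That order of magnitude is easy to reproduce; a back-of-the-envelope sketch, where the vertical layer count is an assumed illustrative figure rather than any particular model’s value:

```python
# Rough check of the "~2 million cells" figure: Earth's surface divided
# into 100 km x 100 km columns, times an assumed ~40 vertical levels.
earth_surface_km2 = 510e6          # total surface area of the Earth
cell_area_km2 = 100 * 100          # one grid column's footprint
layers = 40                        # atmosphere + ocean levels (assumed)

columns = earth_surface_km2 / cell_area_km2
cells = columns * layers
print(f"{columns:,.0f} columns x {layers} layers = {cells:,.0f} cells")
# -> 51,000 columns x 40 layers, i.e. about 2 million cells
```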
Figure 2: Typical grid used in climate models [2]

Once the grid has been constructed, the component properties of each of these cells must be determined. There aren’t, of course, 2 million data stations in the atmosphere and ocean. The current number of data points is around 10,000 (ground weather stations, balloons and ocean buoys), plus we have satellite data since 1978, but historically the coverage is poor. As a result, when initialising a climate model starting 150 years ago, there is almost no data available for most of the land surface, poles and oceans, and nothing above the surface or in the ocean depths. This should be understood to be a major concern.
Figure 3: Global weather stations circa 1885 [3]

Once initialised, the model goes through a series of timesteps. At each step, for each cell, the properties of the adjacent cells are compared. If one such cell is at a higher pressure, fluid will flow from that cell to the next. If it is at a higher temperature, it warms the next cell (whilst cooling itself). This might cause ice to melt or water to evaporate, but evaporation has a cooling effect. If polar ice melts, less energy is reflected, which causes further heating. Aerosols in the cell can result in heating or cooling and an increase or decrease in precipitation, depending on the type.
Increased precipitation can increase plant growth, as does increased CO2. This will change the albedo of the surface as well as the humidity. Higher temperatures cause greater evaporation from oceans, which cools the oceans and increases cloud cover. Climate models can’t model clouds due to the low resolution of the grid, and whether clouds increase surface temperature or reduce it depends on the type of cloud.
It’s complicated! Of course, this all happens in 3 dimensions and to every cell resulting in considerable feedback to be calculated at each timestep.
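The cell-to-cell mechanics described above can be caricatured in a few lines. This is a deliberately bare 1-D diffusion toy, nothing like a real general circulation model, but it shows the compare-neighbours, transfer, repeat loop that every timestep performs:

```python
import numpy as np

# Toy version of cell-to-cell exchange: at each timestep every cell relaxes
# toward its neighbours' temperatures. Real models solve full fluid dynamics;
# this only illustrates the mechanics of the timestep loop.

def step(temps, k=0.1):
    """One explicit diffusion timestep over a 1-D ring of cells."""
    left = np.roll(temps, 1)
    right = np.roll(temps, -1)
    return temps + k * (left + right - 2 * temps)  # heat flows hot -> cold

cells = np.array([30.0, 10.0, 10.0, 10.0, 10.0])   # one hot cell
for _ in range(100):
    cells = step(cells)
print(cells.round(2))  # the heat has spread out; the total is conserved
```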
The timesteps can be as short as half an hour. Remember, the terminator, the point at which day turns into night, travels across the earth’s surface at about 1700 km/hr at the equator, so even half hourly timesteps introduce further error into the calculation, but again, computing power is a constraint.
While the changes in temperatures and pressures between cells are calculated according to the laws of thermodynamics and fluid mechanics, many other changes aren’t calculated; they rely on parameterisation. For example, the albedo forcing varies from icecaps to Amazon jungle to Sahara desert to oceans to cloud cover and all the reflectivity types in between. These properties are simply assigned, and their impacts on other properties are determined from lookup tables, not calculated. Parameterisation is also used for cloud and aerosol impacts on temperature and precipitation. Any important factor that occurs on a subgrid scale, such as storms and ocean eddy currents, must also be parameterised, with an averaged impact used for the whole grid cell. Whilst the effects of these factors are based on observations, the parameterisation is far more a qualitative than a quantitative process, often described by modelers themselves as an art, and it introduces further error. Direct measurement of these effects, and of how they are coupled to other factors, is extremely difficult and poorly understood.
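What “parameterisation by lookup table” means in practice can be sketched in a few lines; the surface types and albedo values below are illustrative inventions, not taken from any actual model:

```python
# Sketch of parameterisation: a sub-grid effect is assigned from a table,
# not computed from physics. All values here are illustrative only.
ALBEDO = {            # fraction of incoming shortwave reflected
    "ice": 0.60,
    "desert": 0.40,
    "ocean": 0.06,
    "forest": 0.15,
}

def absorbed_fraction(surface_type):
    """Energy retained by a cell: looked up, not calculated."""
    return 1.0 - ALBEDO[surface_type]

print(absorbed_fraction("ice"))   # 0.4
# Tuning entries like these, within their uncertainty ranges, is the
# "art" the modelers describe.
```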
Within the atmosphere in particular, there can be sharp boundary layers that cause the models to crash. These sharp variations have to be smoothed.
Energy transfers between atmosphere and ocean are also problematic. The most energetic heat transfers occur at subgrid scales that must be averaged over much larger areas.
Cloud formation depends on processes at the millimetre scale and is simply impossible to model at grid resolution. Clouds can warm as well as cool. Any warming increases evaporation (which cools the surface), resulting in an increase in cloud particulates. Aerosols also affect cloud formation at a micro level. All these effects must be averaged in the models.
When the grid approximations are combined with every timestep, further errors are introduced, and with half-hour timesteps over 150 years, that’s over 2.6 million timesteps! Unfortunately, these errors aren’t self-correcting. Instead, this numerical dispersion accumulates over the model run. There is a technique climate modelers use to try to overcome this, which I describe shortly.
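The timestep count quoted above checks out, and the accumulation point is easy to illustrate; the per-step bias used below is a purely made-up number, there only to show that a systematic error does not average away over millions of steps:

```python
# 150 years of half-hour timesteps:
steps = 150 * 365.25 * 48
print(f"{steps:,.0f} timesteps")        # about 2.6 million

# If each step contributes even a tiny *biased* error (size invented
# for illustration), the drift grows linearly with the step count:
per_step_bias = 1e-7
print(f"accumulated drift: {steps * per_step_bias:.2f}")
```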
Figure 4: How grid cells interact with adjacent cells [4]

Model Initialisation
After the construction of any type of computer model, there is an initialisation process whereby the model is checked to see whether the starting values in each of the cells are physically consistent with one another. For example, if you are modelling a bridge to see whether the design will withstand high winds and earthquakes, you make sure that, before you impose any external forces onto the model structure other than gravity, it meets all the expected stresses and strains of a static structure. After all, if the initial conditions of your model are incorrect, how can you rely on it to predict what will happen when external forces are imposed in the model?
Fortunately, for most computer models, the properties of the components are quite well known and the initial condition is static, the only external force being gravity. If your bridge doesn’t stay up on initialisation, there is something seriously wrong with either your model or design!
With climate models, we have two problems with initialisation. Firstly, as previously mentioned, we have very little data for time zero, whenever we choose that to be. Secondly, at time zero, the model is not in a static steady state, as is the case for pretty much every other computer model that has been developed. At time zero, there could be a blizzard in Siberia, a typhoon in Japan, monsoons in Mumbai and a heatwave in southern Australia, not to mention the odd volcanic explosion, which could all be gone in a day or so.
There is never a steady state point in time for the climate, so it’s impossible to validate climate models on initialisation.
The best climate modelers can hope for is that their bright shiny new model doesn’t crash in the first few timesteps.
The climate system is chaotic, which essentially means any model will be a poor predictor of the future: you can’t even make a model of a lottery ball machine (a comparatively much simpler and smaller interacting system) and use it to predict the outcome of the next draw.
So, if climate models are populated with little more than educated guesses instead of actual observational data at time zero, and errors accumulate with every timestep, how do climate modelers address this problem?
History matching
If the system that’s being computer modelled has been in operation for some time, you can use that data to tune the model and then start the forecast before that period finishes to see how well it matches before making predictions. Unlike other computer modelers, climate modelers call this ‘hindcasting’ because it doesn’t sound like they are manipulating the model parameters to fit the data.
The theory is that, even though climate model construction has many flaws, such as large grid sizes, patchy data of dubious quality in the early years, and poorly understood physical phenomena that have had to be parameterised, you can tune the model during hindcasting, within the parameter uncertainties, to overcome all these deficiencies.
While it’s true that you can tune the model to get a reasonable match with at least some components of history, the match isn’t unique.
When computer models were first being used last century, the famous mathematician, John Von Neumann, said:
“with four parameters I can fit an elephant, with five I can make him wiggle his trunk”
In climate models there are hundreds of parameters that can be tuned to match history. What this means is there is an almost infinite number of ways to achieve a match. Yes, many of these are non-physical and are discarded, but there is no unique solution as the uncertainty on many of the parameters is large and as long as you tune within the uncertainty limits, innumerable matches can still be found.
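Von Neumann’s point can be shown in miniature. In the sketch below (synthetic data, invented purely for illustration), a term that vanishes at every observation point can be added with any coefficient, so infinitely many “tuned” models share an identical history match and then disagree outside it:

```python
import numpy as np
from numpy.polynomial import Polynomial

# Non-uniqueness of history matching in miniature. Six synthetic
# "observations" stand in for the historical record.
x_obs = np.linspace(0, 5, 6)
y_obs = np.sin(x_obs)

base = Polynomial.fit(x_obs, y_obs, 5)          # one tuned model

def alternative(c):
    """Another 'tuned' model with an identical history match."""
    def model(x):
        # q vanishes at every observation point, so adding any multiple
        # of it leaves the match to history completely untouched.
        q = np.prod([x - xi for xi in x_obs], axis=0)
        return base(x) + c * q
    return model

m1, m2 = alternative(0.0), alternative(0.05)
print(np.allclose(m1(x_obs), m2(x_obs)))        # True: same history match
print(m1(6.0), m2(6.0))                         # very different forecasts
```

Every value of `c` is a different “parameterisation” that matches history perfectly, which is exactly why a good hindcast constrains the forecast so weakly.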
An additional flaw in the history matching process is the length of some of the natural cycles. For example, ocean circulation takes place over hundreds of years, and we don’t even have 100 years of data with which to match it.
In addition, it’s difficult to history match to all climate variables. While global average surface temperature is the primary objective of the history matching process, other data, such a tropospheric temperatures, regional temperatures and precipitation, diurnal minimums and maximums are poorly matched.
Even so, can the history matching of the primary variable, average global surface temperature, constrain the accumulating errors that inevitably occur with each model timestep?
Consider a shotgun. When the trigger is pulled, the pellets from the cartridge travel down the barrel, but there is also lateral movement of the pellets. The purpose of the shotgun barrel is to dampen the lateral movements and to narrow the spread when the pellets leave the barrel. It’s well known that shotguns have limited accuracy over long distances and there will be a shot pattern that grows with distance.  The history match period for a climate model is like the barrel of the shotgun. So what happens when the model moves from matching to forecasting mode?
Figure 5: IPCC models in forecast mode for the Mid-Troposphere vs Balloon and Satellite observations [5]
Like the shotgun pellets leaving the barrel, numerical dispersion takes over in the forecasting phase. Each of the 73 models in Figure 5 has been history matched, but outside the constraints of the matching period, they quickly diverge.
Now, at most one of these models can be correct, though more likely none of them are. If this were a real scientific process, the hottest two-thirds of the models would be rejected by the Intergovernmental Panel on Climate Change (IPCC), and further study would focus on the models closest to the observations. But they don’t do that, for a number of reasons.
Firstly, if they reject most of the models, there would be outrage amongst the climate scientist community, especially from the rejected teams due to their subsequent loss of funding. More importantly, the so called 97% consensus would instantly evaporate.
Secondly, once the hottest models were rejected, the forecast for 2100 would be about a 1.5°C increase (due predominantly to natural warming); there would be no panic, and the gravy train would end.
So how should the IPCC reconcile this wide range of forecasts?
Imagine you wanted to know the value of bitcoin 10 years from now so you can make an investment decision today. You could consult an economist, but we all know how useless their predictions are. So instead, you consult an astrologer, but you worry whether you should bet all your money on a single prediction. Just to be safe, you consult 100 astrologers, but they give you a very wide range of predictions. Well, what should you do now? You could do what the IPCC does, and just average all the predictions.
You can’t improve the accuracy of garbage by averaging it.
An Alternative Approach
Climate modelers claim that a history match isn’t possible without including CO2 forcing. This may be true using the approach described here, with its many approximations, tuning the model to a single benchmark (surface temperature) and ignoring deviations from others (such as tropospheric temperature); but analytic (as opposed to numerical) models have achieved matches without CO2 forcing. These are models based purely on historic climate cycles, which identify the harmonics using a mathematical technique of signal analysis that deconstructs long- and short-term natural cycles of different periods and amplitudes without considering changes in CO2 concentration.
In Figure 6, a comparison is made between the IPCC predictions and a prediction from just one analytic harmonic model that doesn’t depend on CO2 warming. A match to history can be achieved through harmonic analysis, and it provides a much more conservative prediction that correctly forecasts the current pause in temperature increase, unlike the IPCC models. The purpose of this example isn’t to claim that this model is more accurate (it’s just another model), but to dispel the myth that there is no way history can be explained without anthropogenic CO2 forcing, and to show that it’s possible to explain the changes in temperature with natural variation as the predominant driver.
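The harmonic-decomposition idea can be sketched with an FFT: split a series into sinusoidal components and rebuild it from the dominant cycles. The series below is synthetic, with invented cycle periods chosen for illustration, not real temperature data:

```python
import numpy as np

# Harmonic analysis in miniature: find the dominant cycles in a series
# and reconstruct it from them alone. Synthetic data, invented periods.
t = np.arange(160)                               # e.g. years of record
series = (0.3 * np.sin(2 * np.pi * t / 40)       # a "40-year" cycle
          + 0.1 * np.sin(2 * np.pi * t / 16))    # a "16-year" cycle

spectrum = np.fft.rfft(series)
keep = np.argsort(np.abs(spectrum))[-2:]         # the two strongest harmonics
filtered = np.zeros_like(spectrum)
filtered[keep] = spectrum[keep]
reconstruction = np.fft.irfft(filtered, n=len(series))

print(np.abs(series - reconstruction).max() < 1e-9)  # True: cycles recovered
```

With real data the cycles are neither exact sinusoids nor exact divisors of the record length, so the decomposition is messier, but the principle (and the CO2-free fit it enables) is the same.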
Figure 6: Comparison of the IPCC model predictions with those from a harmonic analytical model [6]

In summary:

  • Climate models can’t be validated on initialisation due to lack of data and a chaotic initial state.
  • Model resolutions are too low to represent many climate factors.
  • Many of the forcing factors are parameterised, as they can’t be calculated by the models.
  • Uncertainties in the parameterisation process mean that there is no unique solution to the history matching.
  • Numerical dispersion beyond the history-matching phase results in a large divergence of the models.
  • The IPCC refuses to discard models that don’t match the observed data in the prediction phase – which is almost all of them.
The question now is, do you have the confidence to invest trillions of dollars and reduce standards of living for billions of people, to stop climate model predicted global warming or should we just adapt to the natural changes as we always have?
Greg Chapman  is a former (non-climate) computer modeler.
Whilst climate models are tuned to surface temperatures, they predict a tropospheric hotspot that doesn’t exist. This on its own should invalidate the models.