Posted originally on Mar 13, 2024 By Martin Armstrong
The coronavirus lockdowns gave the world a glimpse of what could happen if our economy were suddenly shut down. NASA has a space program called the Parker Solar Probe (PSP) that aims to fly “into the outermost part of the Sun’s atmosphere, the corona, for the first time.” Why? Scientists at NASA predict an “internet apocalypse” may occur within the next year due to the uptick in the sun’s solar activity.
I’ve long touched on sunspots and solar activity in relation to weather patterns and a predictable chain of events. A giant dark hole opened on the sun’s surface in December 2023, emitting streams of unusually fast radiation, known as solar wind, directly at Earth. Since December 4, the solar void has been pointing straight at Earth. Experts initially predicted this most recent hole could spark a moderate geomagnetic storm capable of triggering radio blackouts and strong auroral displays.
For most of its history, science believed the sun’s output was constant. Scientists finally realized that a thermodynamic cycle beats like a heart: the sun could not exist without a steady outflow of energy. One degree less, and it would blow itself out. Hence, it is cyclical, rising and falling in intensity.
The eleven-year cycle in sunspots itself builds in intensity like the Economic Confidence Model (ECM), reaching “grand maxima” and “grand minima” over the course of 300 years. The last grand maximum peaked in 1958, after which the sun has been steadily quieting down. We have seen the steepest decline in sunspot activity in 9,300 years, but the climate change zealots refuse to acknowledge naturally occurring cyclical weather patterns.
You can read more about this phenomenon here, but we are in the midst of a solar minimum. The media has been promoting the idea that we are amid a solar maximum despite it being extremely rare for a coronal hole to form during a maximum period. There are no publicly published scientific papers showing data before 1850, as it would uproot their agenda.
Now, these scientists, funded by the government, are predicting that solar storms could cause an internet blackout or an “internet apocalypse.” “We’ve never experienced one of the extreme case events, and we don’t know how our infrastructure would respond to it,” Sangeetha Abdu Jyothi, a computer science professor at the University of California at Irvine, told The Washington Post. “Our failure testing doesn’t even include such scenarios.”
Did I mention Robert Kegan is an editor at the Washington Post, which Jeff Bezos owns? Well, we can come back to that one.
Now there is absolutely nothing scientists can do to alter solar activity. They can spend millions or billions studying solar activity to prepare, but that is not what is occurring. They believe that our satellites in space will be taken down due to solar storms, resulting in widespread internet outages. Simultaneously, these same governments that are funding these very studies are pushing for the elimination of hard cash in favor of digital currency.
They can immediately eliminate our access to currency with CBDC. If they needed an excuse, which they do not, blaming the weather is a favorite tactic. Are the people unruly? Is civil unrest rising with a revolutionary wave approaching? Starve them out by eliminating their ability to access digital money, as alternative sources will not be available. Eliminate their ability to communicate with one another. Eliminate everything at the click of a button.
Again, these are theories, but these agencies always tell us what they plan to do in advance. They told us about plans for the coronavirus before it occurred, and they are telling us about Agenda 2030 and the Great Reset. No one wants to accept that there is a concrete plan that results in YOU WILL OWN NOTHING AND BE HAPPY. The computer indicates we will be in a solar minimum until the 2050s, and who knows what the world will look like by then. Rest assured that any blame on the sun is a lie. The computer also indicates that these sinister plans WILL FAIL, but the road to 2032, when the situation turns, will come at a cost.
Posted originally on Mar 8, 2024 By Martin Armstrong
QUESTION: The rumor going around is that Bill Gates is trying to buy your company for $10 billion. My reply was if so, I doubt Marty would ever sell to him.
Would you care to comment?
Harold
REPLY: The answer is NO. First of all, you cannot spend even one billion, let alone ten billion. All $10 billion is good for is bribing others to do your will. I would have to give Biden probably $7 billion, so that is certainly an incentive not to sell -LOL. You would have to create a foundation, like Gates, Hillary, and John McCain, to hold on to it. I would need someone who shares my goals. This is not about money. I would not move, and I have four cars. If I were married, there would have to be eight. A business partner is the same as a marriage without sex. You have to be on the same page.
My definition of “rich” is having enough money to do what you want when you want. Some think having more money than the other guy makes you more successful or important. I believe success is measured by accomplishment. We all know Julius Caesar. He is the father of Europe. They still refer to birth by operation as a Caesarean birth. I grew up in New Jersey, which in Latin was Nova Caesarea – land of the new Caesar. The month of July is named after him. We remember him for his accomplishments – not that he was vastly rich.
There is no amount of money someone could offer me to make my decision based on material wealth. I am much more interested in helping the world understand the economics at work, and if so, we may reduce the amplitude of wars. If I can leave that behind, I have served my purpose in life.
I will write about the whole AI craze, for there are a lot of charlatans out there calling look-up programs AI when all they do is search a list for the answer – no decision-making. I know Gates says it will change everyone’s life in 5 years, and the IMF warns it will wipe out 40% of all jobs. There is far more to REAL AI than all of this speculation by people who are NOT accomplished programmers and do not even understand the basic line of code – IF THEN ELSE.
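To illustrate the distinction being drawn here, consider a minimal sketch contrasting a look-up program with even the simplest decision logic. This is a generic, hypothetical illustration; the questions, answers, and thresholds are invented for the example, and neither function represents any vendor’s actual product:

```python
# A "look-up program": it only searches a list for a canned answer.
# Nothing is decided or inferred.
CANNED_ANSWERS = {
    "what is gold?": "A precious metal.",
    "what is the ecm?": "The Economic Confidence Model.",
}

def lookup_bot(question: str) -> str:
    """Return a stored answer if the question is in the list."""
    return CANNED_ANSWERS.get(question.lower(), "I don't know.")

def decision_bot(price: float, moving_average: float) -> str:
    """Even the most basic decision rests on IF/THEN/ELSE."""
    if price > moving_average:
        return "uptrend"
    else:
        return "downtrend"
```

The first function can only repeat what it was given; the second, trivial as it is, evaluates a condition and chooses between outcomes, which is the seed of everything a decision-making system does.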
Someone asked ChatGPT about it vs. Socrates, and here is its response.
How Does ChatGPT Differ From Socrates? A reader asked ChatGPT this exact question, and here is the response:
As an AI language model, I differ from the Socrates Platform in several ways.
Firstly, the Socrates Platform is a web-based platform developed by Martin Armstrong and his team at Armstrong Economics, whereas I am a language model developed by OpenAI. The Socrates Platform is designed to analyze economic and financial data in real-time, predict market trends, and provide actionable insights to investors and traders. In contrast, I am designed to assist with a wide range of tasks, including language translation, question answering, and text generation.
Secondly, the Socrates Platform uses artificial intelligence, machine learning, and natural language processing to analyze data, whereas I am primarily focused on natural language processing. The Socrates Platform uses sophisticated algorithms to analyze historical and cultural data to inform its predictions, whereas my responses are based on the input provided to me by the user.
Lastly, the Socrates Platform is primarily designed for financial professionals, investors, and traders, whereas I am designed for a broader range of users, including students, researchers, and general knowledge seekers. While both the Socrates Platform and I use artificial intelligence to analyze data and provide insights, our intended audiences and applications differ significantly.
Yes, the two platforms are completely unrelated aside from being driven by AI. ChatGPT does not have the computing capabilities of Socrates. It does not have the monetary history data, for starters. ChatGPT has no involvement in the financial world and was not designed for financial purposes. It cannot even provide basic market closings as the current model stopped learning in September 2021 and cannot answer any questions after that date. Most importantly, ChatGPT does not have the source code that is the key to Socrates’ successful forecasts.
ChatGPT is a great achievement and a step in the right direction for AI use on a larger scale. However, it is not possible to compete where you do not compare. Socrates is in a league of its own.
Posted originally on the CTH on March 6, 2024 | Sundance
“It’s a big club, and you ain’t in it.”
There’s a certain type of psychology to people who insist on telling you what they are not going to do. Ordinary, well, stable people, just don’t do things, and they are quiet while they don’t do things.
In related news…. It has been brought to my attention that stable Treehouse people do not announce what they are not doing. They just live their best life. Huh, go figure.
Posted originally on Mar 5, 2024 By Martin Armstrong
California is raising the minimum wage of fast food employees from $16 to $20 under the FAST Act. I have explained that raises to California’s minimum wage were previously restricted due to successful lobbying efforts on behalf of fast food establishments. The new law covers commercial fast food restaurants but excludes Panera Bread. Why?
Fast food lobbyists spent $4 million in the first six months of this year to prevent the California Accountability Bill from passing. The bill would hold franchisees and their parent companies jointly liable for the treatment of workers. California Governor Gavin Newsom claimed that minimum wage jobs were not merely for students or people looking to gain employment experience. “That’s a romanticized version of a world that doesn’t exist,” Newsom said. “We have the opportunity to reward that contribution, reward that sacrifice and stabilize an industry.”
A fast-food bill was passed in September 2022 that set the minimum wage at $22 per hour for select chains with over 100 locations, later revised to a $20 minimum wage for restaurants with 60 locations. Now California is specifically exempting fast food establishments that contain bakeries, such as Panera Bread. Why are restaurants that sell bread above the law?
The answer is that politicians are always up for sale to the highest bidder. Greg Flynn is the largest fast food franchise owner in America, with an empire of 2,600 restaurant locations that produce around $4.5 billion in sales. Flynn, estimated to be worth around $1.1 billion, has strongly supported Gavin Newsom publicly since 2014, when Newsom was lieutenant governor. Both men attended the same high school and have longstanding ties.
Bloomberg reported that Flynn donated $64,800 to Newsom’s personal re-election campaign and an additional $100,000 to fight the conservative-led recall effort. Flynn’s holdings in California only include two establishments – Applebee’s and Panera Bread. Applebee’s is exempt from the law despite its pre-frozen dishes since it is a sit-down restaurant chain. Panera Bread, on the other hand, is exempt due to this specific loophole that only excludes establishments that bake bread. This is what happens when lobbying is permitted and politicians are for sale.
Posted originally on Mar 4, 2024 By Martin Armstrong
Some critics, who are obviously anti-Trump, have come out crying that it is really unusual for the Court to give so little notice that it would release a decision. They are indeed crying that this was because of Super Tuesday and that the Court was therefore trying to support Trump.
Let me explain something here to these absolutely biased idiots. Let’s say the Supreme Court waited until May or June to release its decision, with Colorado, Maine, and Illinois having blocked Trump. Then what would happen? The entire 2024 election might be seriously impacted and then challenged in court in another array of suits. They had to rule before Super Tuesday, and it was UNANIMOUS, to avoid a constitutional crisis. EVERYONE on the court agreed – Colorado had no such jurisdiction – PERIOD!
As I wrote in my Amicus Brief to the Supreme Court, the last time any state refused to allow a candidate on the ballot was 1860 when Abraham Lincoln did not appear on the ballots in the South. What followed? The Civil War.
The Founding Fathers never intended to allow a rogue state to interfere in either national commerce or national federal elections, as implied in the Commerce Clause, Article 1, Section 8, Clause 3 of the U.S. Constitution. Could Colorado remove a senator or congressman of another state claiming that they were part of the January 6th event they called an unarmed insurrection? There is no jurisdiction for a single state to remove a national candidate from the ballot – PERIOD! Assuming such power would mean they too could interfere in the commerce of other states. That would lead to complete chaos. Even Sotomayor, the favorite of the Democrats, wrote clearly:
“Allowing Colorado to do so would, we agree, create a chaotic state-by-state patchwork, at odds with our Nation’s federalism principles. That is enough to resolve this case.”
Anyone who cannot see that is so biased that they no longer have a functioning brain cell. They have been so brainwashed that they are no longer competent even to vote. We are either a nation with a national identity or a patchwork of states that should no longer pretend to be united.
There are three types of scientific models, as shown in figure 1. In this series of seven posts on climate model bias we are only concerned with two of them. The first are mathematical models that utilize well-established physical and chemical processes and principles to model some part of our reality, especially the climate and the economy. The second are conceptual models that utilize scientific hypotheses and assumptions to propose an idea of how something, such as the climate, works. Conceptual models are generally tested, and hopefully validated, by creating a mathematical model. The output from the mathematical model is compared to observations, and if the output matches the observations closely, the model is validated. It isn’t proven, but it is shown to be useful, and the conceptual model gains credibility.
Figure 1. The three types of scientific models.
Models are useful when used to decompose some complex natural system, such as Earth’s climate, or some portion of the system, into its underlying components and drivers. Models can be used to try and determine which of the system components and drivers are the most important under various model scenarios.
Besides being used to predict the future, or a possible future, good models should also tell us what should not happen in the future. If these events do not occur, it adds support to the hypothesis. These are the tasks that the climate models created by the Coupled Model Intercomparison Project (CMIP)[1] are designed to do. The Intergovernmental Panel on Climate Change (IPCC)[2] analyzes the CMIP model results, along with other peer-reviewed research, and attempts to explain modern global warming in their reports. The most recent IPCC report is called AR6.[3]
In the context of climate change, especially regarding the AR6 IPCC[4] report, the term “model,” is often used as an abbreviation for a general circulation climate model.[5] Modern computer general circulation models have been around since the 1960s, and now are huge computer programs that can run for days or longer on powerful computers. However, climate modeling has been around for more than a century, well before computers were invented. Later in this report I will briefly discuss a 19th century greenhouse gas climate model developed and published by Svante Arrhenius.
Besides modeling climate change, AR6 contains descriptions of socio-economic models that attempt to predict the impact of selected climate changes on society and the economy. In a sense, AR6, just like the previous assessment reports, is a presentation of the results of the latest iteration of their scientific models of future climate and their models of the impact of possible future climates on humanity.
Introduction
Modern atmospheric general circulation computerized climate models were first introduced in the 1960s by Syukuro Manabe and colleagues.[6] These models, and their descendants can be useful, even though they are clearly oversimplifications of nature, and they are wrong[7] in many respects like all models.[8] It is a shame, but climate model results are often conflated with observations by the media and the public, when they are anything but.
I began writing scientific models of rocks[9] and programming them for computers in the 1970s, and like all modelers of that era I was heavily influenced by George Box, the famous University of Wisconsin statistician. Box teaches us that all models are developed iteratively.[10] First, we make assumptions and build a conceptual model about how some natural, economic, or other system works and what influences it; then we model some part of it, or the whole system. The model results are then compared to observations. There will typically be a difference between the model results and the observations; these differences are assumed to be due to model error, since we necessarily assume our observations have no error, at least initially. We examine the errors, adjust the model parameters or the model assumptions, or both, run the model again, and again examine the errors. This “learning” process is the main benefit of models. Box tells us that good scientists must have the flexibility and courage to seek out, recognize, and exploit such errors, especially any errors in the conceptual model assumptions. Modeling nature is how we learn how nature works.
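Box’s iterative cycle can be sketched schematically. The following is a generic illustration of the loop described above, not any actual climate or petrophysical model; the toy one-parameter model, the observations, and the tolerances are all invented for the example:

```python
# Hypothetical observations (x, y) and a one-parameter model y = a * x.
observations = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

def model(a: float, x: float) -> float:
    """The mathematical model built from our conceptual assumptions."""
    return a * x

def total_error(a: float) -> float:
    """Compare model output to observations (sum of squared differences)."""
    return sum((model(a, x) - y) ** 2 for x, y in observations)

# The iterative loop: run the model, examine the errors, adjust the
# parameter, and run it again until the errors stop improving.
a, step = 1.0, 0.01
for _ in range(1000):
    # Numerical gradient of the error with respect to the parameter.
    grad = (total_error(a + 1e-6) - total_error(a - 1e-6)) / 2e-6
    a -= step * grad
    if abs(grad) < 1e-9:
        break

# The fitted parameter (and the errors that remain) is what we "learned";
# the model itself is still an oversimplification, per Box.
```

The point of the sketch is that the learning lives in the loop: each pass exposes a mismatch between model and observation, and it is examining that mismatch, not the final parameter value, that teaches us about the system.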
Box next advises us that “we should not fall in love with our models,” and “since all models are wrong the scientists cannot obtain a ‘correct’ one by excessive elaboration.” I used to explain this principle to other modelers more crudely by pointing out that if you polish a turd, it is still a turd. One must recognize when a model has gone as far as it can go. At some point it is done, more data, more elaborate programming, more complicated assumptions cannot save it. The benefit of the model is what you learned building it, not the model itself. When the inevitable endpoint is reached, you must trash the model and start over by building a new conceptual model. A new model will have a new set of assumptions based on the “learnings” from the old model, and other new data and observations gathered in the meantime.
Each IPCC report, since the first one was published in 1990,[11] is a single iteration of the same overall conceptual model. In this case, the “conceptual model” is the idea or hypothesis that humans control the climate (or perhaps just the rate of global warming) with our greenhouse gas emissions.[12] Various and more detailed computerized models are built to attempt to measure the impact of human emissions on Earth’s climate.
Another key assumption in the IPCC model is that climate change is dangerous, and, as a result, we must mitigate (reduce) fossil fuel use to reduce or prevent damage to society from climate change. Finally, they assume a key metric of this global climate change or warming is the climate sensitivity to human-caused increases in CO2. This sensitivity can be computed with models or using measurements of changes in atmospheric CO2 and global average surface temperature. The IPCC equates changes in global average surface temperature to “climate change.”
This climate sensitivity metric is often called “ECS,” which stands for equilibrium climate sensitivity to a doubling of CO2, often abbreviated as “2xCO2.”[13] Modern climate models, ever since those used for the famous Charney report in 1979,[14] except for AR6, have generated a range of ECS values from 1.5 to 4.5°C per 2xCO2. AR6 uses a rather unique and complex subjective model that results in a range of 2.5 to 4°C/2xCO2. More about this later in the report.
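Because the modeled temperature response to CO2 is logarithmic, the warming implied by any concentration change can be scaled directly from the ECS. A minimal sketch, assuming the standard logarithmic relationship (the ppm values below are illustrative, not measurements):

```python
import math

def warming(ecs: float, co2_new: float, co2_old: float) -> float:
    """Equilibrium warming (deg C) for a CO2 concentration change,
    given an ECS in deg C per doubling of CO2 (2xCO2)."""
    return ecs * math.log2(co2_new / co2_old)

# By definition, a doubling (e.g. 280 -> 560 ppm) yields exactly the ECS.
mid_case = warming(3.0, 560.0, 280.0)

# The AR6 range of 2.5 to 4 deg C/2xCO2 applied to an illustrative
# rise from 280 ppm to 420 ppm (a 50% increase, i.e. log2(1.5) doublings):
low = warming(2.5, 420.0, 280.0)
high = warming(4.0, 420.0, 280.0)
```

Note how wide the resulting range is even for a fixed concentration change: the 1.5 deg C spread in assumed ECS translates directly into the spread of projected warming, which is why the ECS value dominates the model output.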
George Box warns modelers that:
“Just as the ability to devise simple but evocative models is the signature of the great scientist so overelaboration and overparameterization is often the mark of mediocrity.”[15]
Box, 1976
The Intergovernmental Panel on Climate Change or IPCC has published six major reports and numerous minor reports since 1990.[16] Here we will argue that they have spent more than thirty years polishing the turd to little effect. They have come up with more and more elaborate processes to try and save their hypothesis that human-generated greenhouse gases have caused recent climate changes and that the Sun and internal variations within Earth’s climate system have had little to no effect. As we will show, new climate science discoveries, since 1990, are not explained by the IPCC models, do not show up in the model output, and newly discovered climate processes, especially important ocean oscillations, are not incorporated into them.
Just one example: Eade et al. report that the modern general circulation climate models used for the AR5 and AR6 reports[17] do not reproduce the important North Atlantic Oscillation (“NAO”). The NAO-like signal that the models produce in their simulation runs[18] is indistinguishable from random white noise. Eade et al. report:
“This suggests that current climate models do not fully represent important aspects of the mechanism for low frequency variability of the NAO.”[19]
Eade, et al., 2022
All the models in AR6, both climate and socio-economic, have important model/observation mismatches. As time has gone on, the modelers and authors have continued to ignore new developments in climate science and climate change economics, as their “overelaboration and overparameterization” has become more extreme. As they make their models more elaborate, they progressively ignore more new data and discoveries to decrease their apparent “uncertainty” and increase their reported “confidence” that humans drive climate change. It is a false confidence that is due to the confirmation and reporting bias in both the models and the reports.
As I reviewed all six of the major IPCC reports, I became convinced that AR6 is the most biased of all of them.[20] In a major new book twelve colleagues and I, working under the Clintel[21] umbrella, examined AR6 and detailed considerable evidence of bias.
“AR6 states that “there has been negligible long-term influence from solar activity and volcanoes,”[23] and acknowledges no other natural influence on multidecadal climate change despite … recent discoveries, a true case of tunnel vision.”
“We were promised IPCC reports that would objectively report on the peer-reviewed scientific literature, yet we find numerous examples where important research was ignored. In Ross McKitrick’s chapter[24] on the “hot spot,” he lists many important papers that are not even mentioned in AR6. Marcel [Crok] gives examples where unreasonable emissions scenarios are used to frighten the public in his chapter on scenarios,[25] and examples of hiding good news in his chapter on extreme weather events.[26] Numerous other examples are documented in other chapters. These deliberate omissions and distortions of the truth do not speak well for the IPCC, reform of the institution is desperately needed.”
Crok and May, 2023
Confirmation[27] and reporting bias[28] are very common in AR6. We also find examples of the Dunning-Kruger effect,[29] in-group bias,[30] and anchoring bias.[31]
In 2010, the InterAcademy Council, at the request of the United Nations, reviewed the processes and procedures of the IPCC and found many problems.[32] In particular, they criticized the subjective way that uncertainty is handled. They also criticized the obvious confirmation bias in the IPCC reports.[33] They pointed out that the Lead Authors too often leave out dissenting views or references to papers they disagree with. The Council recommended that alternative views should be mentioned and cited in the report. Even though these criticisms were voiced in 2010, my colleagues and I found numerous examples of these problems in AR6, published eleven years later in 2021 and 2022.[34]
Although bias pervades AR6, this series will focus mainly on bias in the AR6 volume 1 (WGI) CMIP6[35] climate models that are used to predict future climate. However, we will also look at the models used to identify and quantify climate change impacts in volume 2 (WGII), and to compute the cost/benefit analysis of their recommended mitigation (fossil fuel reduction) measures in volume 3 (WGIII). As a former petrophysical modeler, I am aware of how bias can sneak into a computer model; sometimes the modeler is aware he is introducing bias into the results, and sometimes he is not. Bias exists in all models, since they are all built from assumptions and ideas (the “conceptual model”), but a good modeler will do his best to minimize it.
In the next six posts I will take you through some of the evidence of bias I found in the CMIP6 models and the AR6 report. A 30,000-foot look at the history of human-caused climate change modeling is given in part 2. Evidence that the IPCC has ignored possible solar influence on climate is presented in part 3. The IPCC ignores evidence that changes in convection and circulation patterns in the oceans and atmosphere affect climate change on multidecadal time scales, and this is examined in part 4.
Contrary to the common narrative, there is considerable evidence that storminess (extreme weather) was higher in the Little Ice Age, aka the “pre-industrial” (part 5). Next, we move on to examine bias in the IPCC AR6 WGII report[36] on the impact, adaptation, and vulnerability to climate change in part 6 and in their report[37] on how to mitigate climate change in part 7.
IPCC is an abbreviation for the Intergovernmental Panel on Climate Change, a U.N. agency. AR6 is their sixth major report on climate change, “Assessment Report 6.” ↑
There are several names for climate models, including atmosphere-ocean general circulation model (AOGCM, used in AR5), or Earth system model (ESM, used in AR6). Besides these complicated computer climate models there are other models used in AR6, some model energy flows, the impact of climate change on society or the global economy, or the impact of various greenhouse gas mitigation efforts. We only discuss some of these models in this report. (IPCC, 2021, p. 2223) ↑
(Manabe & Bryan, Climate Calculations with a Combined Ocean-Atmosphere Model, 1969), (Manabe & Wetherald, The Effects of Doubling the CO2 Concentration on the Climate of a General Circulation Model, 1975) ↑
(McKitrick & Christy, A Test of the Tropical 200- to 300-hPa Warming Rate in Climate Models, Earth and Space Science, 2018) and (McKitrick & Christy, 2020) ↑
“The Intergovernmental Panel on Climate Change (IPCC) assesses the scientific, technical and socioeconomic information relevant for the understanding of the risk of human-induced climate change.” (UNFCCC, 2020). ↑
Usually, ECS means equilibrium climate sensitivity, or the ultimate change in surface temperature due to a doubling of CO2, but in AR6 they sometimes refer to “Effective Climate Sensitivity,” or the “effective ECS,” which is defined as the warming after a specified number of years (IPCC, 2021, pp. 931-933). AR6, WGI, page 933 has a more complete definition. ↑
Confirmation bias: The tendency to look only for data that supports a previously held belief. It also means all new data is interpreted in a way that supports a prior belief. Wikipedia has a fairly good article on common cognitive biases. ↑
Reporting bias: In this context it means only reporting or publishing results that favor a previously held belief and censoring or ignoring results that show the belief is questionable. ↑
The Dunning-Kruger effect is the tendency to overestimate one’s abilities in a particular subject. In this context we see climate modelers, who call themselves “climate scientists,” overestimate their knowledge of paleoclimatology, atmospheric sciences, and atomic physics. ↑
In-group bias causes lead authors and editors to choose their authors and research papers from their associates and friends who share their beliefs. ↑
Anchoring bias occurs when an early result or calculation, for example Svante Arrhenius’ ECS (climate sensitivity to CO2) of 4°C, discussed below, gets fixed in a researcher’s mind and then he “adjusts” his thinking and data interpretation to always come close to that value, while ignoring contrary data. ↑