The Cover-Up Of Problems In The IPCC Report (by Prof. Antero Ollila) + The IPCC’s Methodology Is Fundamentally Flawed (by Ross McKitrick)

There is now a substantial number of experts casting serious shade over the “flawed methodology” used by the IPCC in their latest analysis of the climate.

Below is a one-two punch to the IPCC’s Sixth Assessment Report (AR6).

Scientific flaws in the climate models, and the cover-up of problems in the IPCC AR6 report

[by Antero Ollila, Adj. Prof. Aalto University (Emer.) — below is an abridged version, you can find the original at]

The amount of carbon dioxide in the atmosphere has increased by 32% since 1750. According to the IPCC’s latest climate report (AR6), this is due solely to man-made emissions, of which an average of 44% has accumulated in the atmosphere each year, with the rest absorbed by oceans and vegetation.

Approximately 25% of the atmospheric carbon dioxide is exchanged annually with the oceans and vegetation. As a result, less than 6% of the initial amount of carbon dioxide in the atmosphere remains after 10 years, and therefore the increased amount of carbon dioxide in the atmosphere cannot be entirely of anthropogenic origin, which has a permille value of -28‰. The IPCC remains silent on permille values (a measure of the ratio of carbon isotopes, used to trace the origin of carbon dioxide and thus suitable for validating carbon cycle models).
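
Ollila’s “less than 6% after 10 years” figure follows from simple compound removal. A minimal sketch, treating the quoted 25% annual exchange as a constant removal fraction (an assumption of this illustration, not a carbon-cycle model):

```python
# If ~25% of atmospheric CO2 is exchanged with oceans and vegetation each
# year, the fraction of an initial parcel still airborne after t years is
# (1 - 0.25)**t under this simplified constant-rate assumption.
exchange_rate = 0.25          # annual exchange fraction quoted in the article
years = 10
remaining = (1 - exchange_rate) ** years
print(f"Fraction remaining after {years} years: {remaining:.3f}")  # 0.056, i.e. <6%
```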

The cover-up of this issue continues with the lifetime of anthropogenic carbon dioxide in the atmosphere, which is now vaguely stated as ranging from hundreds to thousands of years. Yet the removal time of radioactive carbon from the atmosphere after 1964 (a perfect tracer test for anthropogenic carbon dioxide) is only about 64 years. The recovery time of the total atmospheric amount of carbon dioxide to the level of 1750 can be estimated to be similar to that of its accumulation period, i.e. just under 300 years.

Furthermore, the AR6 report no longer presents the IPCC’s own definition of the greenhouse effect, apart from a brief mention in the glossary. Missing is the explanation of how a greenhouse gas absorption of 158 Wm-2 can create downward infrared radiation of 342 Wm-2 at the ground; this is against fundamental physical laws, because it assumes that the excess energy came from nothing.

The radiation to the surface consists of four energy fluxes, which, according to the IPCC’s energy balance, are: 1) greenhouse gas absorption, 158 Wm-2; 2) latent heat of water, 82 Wm-2; 3) sensible heat (warm air), 21 Wm-2; and 4) solar radiation absorbed in the atmosphere, 80 Wm-2. The first three energy fluxes total 261 Wm-2 and maintain the greenhouse effect. So, by attributing the greenhouse effect to greenhouse gas absorption alone, the IPCC is able to increase the contribution of carbon dioxide in the greenhouse effect from approximately 7.5% to 19%, and its temperature effect from 2.5 °C to 6.3 °C.
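
The flux bookkeeping in the paragraph above can be checked directly (values as quoted from the IPCC energy balance in the article):

```python
# Energy fluxes to the surface (W/m^2), as quoted in the article
ghg_absorption = 158
latent_heat = 82
sensible_heat = 21
solar_absorbed_in_atmosphere = 80

# The three fluxes Ollila says maintain the greenhouse effect
greenhouse_effect_total = ghg_absorption + latent_heat + sensible_heat
# All four fluxes together
all_fluxes = greenhouse_effect_total + solar_absorbed_in_atmosphere

print(greenhouse_effect_total)  # 261
print(all_fluxes)               # 341, close to the ~342 W/m^2 downward infrared figure
```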

This also means that the equations used by the IPCC to calculate the radiation forcing values and the global warming impacts of carbon dioxide for increasing carbon dioxide concentrations are not in line with the contribution of carbon dioxide in the greenhouse effect.

The IPCC’s science, aka the definition of “climate change” in the Paris Agreement, gives a strongly exaggerated warming capability to carbon dioxide.

Moreover, AR6 shows a strong positive trend in solar shortwave radiation from 9/2000 to 6/2017, but its impact has been omitted from post-2000 warming calculations [for more, see].

The IPCC’s attribution methodology is fundamentally flawed

[by Ross McKitrick — originally posted on]

One day after the IPCC released the AR6 I published a paper in Climate Dynamics showing that their “Optimal Fingerprinting” methodology on which they have long relied for attributing climate change to greenhouse gases is seriously flawed and its results are unreliable and largely meaningless. Some of the errors would be obvious to anyone trained in regression analysis, and the fact that they went unnoticed for 20 years despite the method being so heavily used does not reflect well on climatology as an empirical discipline.

My paper is a critique of “Checking for model consistency in optimal fingerprinting” by Myles Allen and Simon Tett, which was published in Climate Dynamics in 1999 and to which I refer as AT99. Their attribution methodology was instantly embraced and promoted by the IPCC in the 2001 Third Assessment Report (coincident with their embrace and promotion of the Mann hockey stick). The IPCC promotion continues today: see AR6 Section 3.2.1. It has been used in dozens and possibly hundreds of studies over the years. Wherever you begin in the Optimal Fingerprinting literature (example), all paths lead back to AT99, often via Allen and Stott (2003). So its errors and deficiencies matter acutely.

The abstract of my paper reads as follows:

“Allen and Tett (1999, herein AT99) introduced a Generalized Least Squares (GLS) regression methodology for decomposing patterns of climate change for attribution purposes and proposed the “Residual Consistency Test” (RCT) to check the GLS specification. Their methodology has been widely used and highly influential ever since, in part because subsequent authors have relied upon their claim that their GLS model satisfies the conditions of the Gauss-Markov (GM) Theorem, thereby yielding unbiased and efficient estimators. But AT99 stated the GM Theorem incorrectly, omitting a critical condition altogether, their GLS method cannot satisfy the GM conditions, and their variance estimator is inconsistent by construction. Additionally, they did not formally state the null hypothesis of the RCT nor identify which of the GM conditions it tests, nor did they prove its distribution and critical values, rendering it uninformative as a specification test. The continuing influence of AT99 two decades later means these issues should be corrected.  I identify 6 conditions needing to be shown for the AT99 method to be valid.”

The Allen and Tett paper had merit as an attempt to make operational some ideas emerging from an engineering (signal processing) paradigm for the purpose of analyzing climate data. The errors they made come from being experts in one thing but not another, and the review process in both climate journals and IPCC reports is notorious for not involving people with relevant statistical expertise (despite the reliance on statistical methods). If someone trained in econometrics had refereed their paper 20 years ago the problems would have immediately been spotted, the methodology would have been heavily modified or abandoned and a lot of papers since then would probably never have been published (or would have, but with different conclusions—I suspect most would have failed to report “attribution”).

Optimal Fingerprinting

AT99 made a number of contributions. They took note of previous proposals for estimating the greenhouse “signal” in observed climate data and showed that they were equivalent to a statistical technique called Generalized Least Squares (GLS). They then argued that, by construction, their GLS model satisfies the Gauss-Markov (GM) conditions, which according to an important theorem in statistics means it yields unbiased and efficient parameter estimates. (“Unbiased” means the expected value of an estimator equals the true value. “Efficient” means all the available sample information is used, so the estimator has the minimum variance possible.) If an estimator satisfies the GM conditions, it is said to be “BLUE”—the Best (minimum variance) Linear Unbiased Estimator; or the best option out of the entire class of estimators that can be expressed as a linear function of the dependent variable. AT99 claimed that their estimator satisfies the GM conditions and therefore is BLUE, a claim repeated and relied upon subsequently by other authors in the field. They also introduced a “Residual Consistency” (RC) test which they said could be used to assess the validity of the fingerprinting regression model.

Unfortunately these claims are untrue. Their method is not a conventional GLS model. It does not, and cannot, satisfy the GM conditions and in particular it violates an important condition for unbiasedness. And rejection or non-rejection of the RC test tells us nothing about whether the results of an optimal fingerprinting regression are valid.

AT99 and the IPCC

AT99 was heavily promoted in the 2001 IPCC Third Assessment Report (TAR Chapter 12, Box 12.1, Section 12.4.3 and Appendix 12.1) and has been referenced in every IPCC Assessment Report since. TAR Appendix 12.1 was headlined “Optimal Detection is Regression” and began:

The detection technique that has been used in most “optimal detection” studies performed to date has several equivalent representations (Hegerl and North, 1997; Zwiers, 1999). It has recently been recognised that it can be cast as a multiple regression problem with respect to generalised least squares (Allen and Tett, 1999; see also Hasselmann, 1993, 1997)

The growing level of confidence regarding attribution of climate change to GHG’s expressed by the IPCC and others over the past two decades rests principally on the many studies that employ the AT99 method, including the RC test. The methodology is still in wide use, albeit with a couple of minor changes that don’t address the flaws identified in my critique. (Total Least Squares or TLS, for instance, introduces new biases and problems which I analyze elsewhere; and regularization methods to obtain a matrix inverse do not fix the underlying theoretical flaws). There have been a small number of attribution papers using other methods, including ones which the TAR mentioned. “Temporal” or time series analyses have their own flaws which I will address separately (put briefly, regressing I(0) temperatures on I(1) forcings creates obvious problems of interpretation).

The Gauss-Markov (GM) Theorem

As with regression methods generally, everything in this discussion centres on the GM Theorem. There are two GM conditions that a regression model needs to satisfy to be BLUE. The first, called homoskedasticity, is that the error variances must be constant across the sample. The second, called conditional independence, is that the expected values of the error terms must be independent of the explanatory variables. If homoskedasticity fails, least squares coefficients will still be unbiased but their variance estimates will be biased. If conditional independence fails, least squares coefficients and their variances will be biased and inconsistent, and the regression model output is unreliable. (“Inconsistent” means the coefficient distribution does not converge on the right answer even as the sample size goes to infinity.)
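
The practical consequence of a conditional-independence failure is easy to demonstrate by simulation. The sketch below is a generic regression illustration (not the AT99 model itself): the error term is built to contain a component correlated with the regressor, so the least-squares slope converges to the wrong value.

```python
import numpy as np

# When E[error | x] != 0 (conditional independence fails), OLS is biased:
# here the error contains a 0.5*x component, so the estimated slope
# converges to true_beta + 0.5 instead of true_beta.
rng = np.random.default_rng(42)
true_beta, n, reps = 1.0, 500, 200

slopes = []
for _ in range(reps):
    x = rng.standard_normal(n)
    e = 0.5 * x + rng.standard_normal(n)   # violates E[e | x] = 0
    y = true_beta * x + e
    slopes.append(np.cov(x, y, bias=True)[0, 1] / np.var(x))

print(float(np.mean(slopes)))  # close to 1.5, not the true value 1.0
```

No amount of extra data fixes this: the bias is in the estimator itself, which is why the omission of this condition from AT99 matters.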

I teach the GM theorem every year in introductory econometrics. (As an aside, that means I am aware of the ways I have oversimplified the presentation, but you can refer to the paper and its sources for the formal version). It comes up near the beginning of an introductory course in regression analysis. It is not an obscure or advanced concept, it is the foundation of regression modeling techniques. Much of econometrics consists of testing for and remedying violations of the GM conditions.

The AT99 Method

(It is not essential to understand this paragraph, but it helps for what follows.) Optimal Fingerprinting works by regressing observed climate data onto simulated analogues from climate models which are constructed to include or omit specific forcings. The regression coefficients thus provide the basis for causal inference regarding the forcing, and estimation of the magnitude of each factor’s influence. Authors prior to AT99 argued that failure of the homoskedasticity condition might thwart signal detection, so they proposed transforming the observations by premultiplying them by a matrix P which is constructed as the matrix root of the inverse of a “climate noise” matrix C, itself computed using the covariances from preindustrial control runs of climate models. But because C is not of full rank its inverse does not exist, so P can instead be computed using a Moore-Penrose pseudo inverse, selecting a rank which in practice is far smaller than the number of observations in the regression model itself.
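
A minimal numerical sketch of the weighting step described above (all sizes and data are invented for illustration; this is not any published fingerprinting code):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 60, 20    # n observations; the "control run" supplies only k samples, so C is rank-deficient

# "Climate noise" covariance C estimated from k control-run realizations (rank <= k < n)
ctrl = rng.standard_normal((k, n))
C = ctrl.T @ ctrl / k

# P = matrix root of the Moore-Penrose pseudo-inverse of C, keeping only nonzero eigenvalues
lam, V = np.linalg.eigh(C)
keep = lam > 1e-10 * lam.max()
P = V[:, keep] @ np.diag(lam[keep] ** -0.5) @ V[:, keep].T

# Fingerprint-style regression: observations y on a model-simulated signal pattern x,
# both premultiplied by P before least squares
x = rng.standard_normal(n)
betas = []
for _ in range(300):
    y = 0.8 * x + rng.standard_normal(n)          # true scaling factor 0.8
    Py, Px = P @ y, P @ x[:, None]
    betas.append(np.linalg.lstsq(Px, Py, rcond=None)[0][0])

print(float(np.mean(betas)))  # near 0.8 in this well-behaved toy setup
```

Note how the rank truncation is forced on the method: with only k control-run samples, C has no ordinary inverse, which is the feature McKitrick argues undermines the claimed GM properties.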

The Main Error in AT99

AT99 asserted that the signal detection regression model applying the P matrix weights is homoscedastic by construction, therefore it satisfies the GM conditions, therefore its estimates are unbiased and efficient (BLUE). Even if their model yields homoscedastic errors (which is not guaranteed) their statement is obviously incorrect: they left out the conditional independence assumption. Neither AT99 nor—as far as I have seen—anyone in the climate detection field has ever mentioned the conditional independence assumption nor discussed how to test it nor the consequences should it fail.

And fail it does—routinely in regression modeling; and when it fails the results can be spectacularly wrong, including wrong signs and meaningless magnitudes. But you won’t know that unless you test for specific violations. In the first version of my paper (written in summer 2019) I criticized the AT99 derivation and then ran a suite of AT99-style optimal fingerprinting regressions using 9 different climate models and showed they routinely fail standard conditional independence tests. And when I implemented some standard remedies, the greenhouse gas signal was no longer detectable. I sent that draft to Allen and Tett in late summer 2019 and asked for their comments, which they undertook to provide. But hearing none after several months I submitted it to the Journal of Climate, requesting Allen and Tett be asked to review it. Tett provided a constructive (signed) review, as did two other anonymous reviewers, one of whom was clearly an econometrician (another might have been Allen but it was anonymous so I don’t know). After several rounds the paper was rejected. Although Tett and the econometrician supported publication the other reviewer and the editor did not like my proposed alternative methodology. But none of the reviewers disputed my critique of AT99’s handling of the GM theorem. So I carved that part out and sent it in winter 2021 to Climate Dynamics, which accepted it after 3 rounds of review.

Other Problems

In my paper I list five assumptions which are necessary for the AT99 model to yield BLUE coefficients, not all of which AT99 stated. All five fail by construction. I also list six conditions that need to be proven for the AT99 method to be valid. In the absence of such proofs there is no basis for claiming the results of the AT99 method are unbiased or consistent, and the results of the AT99 method (including use of the RC test) should not be considered reliable as regards the effect of GHGs on the climate.

One point I make is that the assumption that an estimator of C provides a valid estimate of the error covariances means the AT99 method cannot be used to test a null hypothesis that greenhouse gases have no effect on the climate. Why not? Because an elementary principle of hypothesis testing is that the distribution of a test statistic under the assumption that the null hypothesis is true cannot be conditional on the null hypothesis being false. The use of a climate model to generate the homoscedasticity weights requires the researcher to assume the weights are a true representation of climate processes and dynamics. The climate model embeds the assumption that greenhouse gases have a significant climate impact. Or, equivalently, that natural processes alone cannot generate a large class of observed events in the climate, whereas greenhouse gases can. It is therefore not possible to use the climate model-generated weights to construct a test of the assumption that natural processes alone could generate the class of observed events in the climate.

Another less-obvious problem is the assumption that use of the Moore-Penrose pseudo inverse has no implications for claiming the result satisfies the GM conditions. But the reduction of rank of the resulting covariance matrix estimator means it is biased and inconsistent and the GM conditions automatically fail. As I explain in the paper, there is a simple and well-known alternative to using P matrix weights—use of White’s (1980) heteroskedasticity-consistent covariance matrix estimator, which has long been known to yield consistent variance estimates. It was already 20 years old and in use everywhere (other than climatology apparently) by the time of AT99, yet they opted instead for a method that is much harder to use and yields biased and inconsistent results.
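
For readers unfamiliar with White’s estimator, it replaces the assumed constant error variance with squared residuals inside a “sandwich” formula. A minimal single-regressor sketch on simulated (not climate) data:

```python
import numpy as np

# White's (1980) heteroskedasticity-consistent (HC0) variance estimate for an
# OLS slope, compared with the naive formula that assumes constant error variance.
rng = np.random.default_rng(7)
n = 2000
x = rng.standard_normal(n)
e = x * rng.standard_normal(n)        # error variance grows with x^2: heteroskedastic
y = 2.0 * x + e

xc = x - x.mean()
beta = (xc @ y) / (xc @ xc)
resid = y - y.mean() - beta * xc

naive_var = (resid @ resid / (n - 2)) / (xc @ xc)    # assumes homoskedastic errors
hc0_var = (xc**2 @ resid**2) / (xc @ xc) ** 2        # White's sandwich estimator

print(bool(hc0_var > naive_var))  # True: the naive SE understates the uncertainty here
```

The point of the comparison: the robust estimator requires no estimate of a full noise covariance matrix at all, which is why McKitrick calls it the simpler, long-available alternative.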

The RC Test

AT99 claimed that a test statistic formed using the signal detection regression residuals and the C matrix from an independent climate model follows a centered chi-squared distribution, and if such a test score is small relative to the 95% chi-squared critical value, the model is validated. More specifically, the null hypothesis is not rejected.

But what is the null hypothesis? Astonishingly it was never written out mathematically in the paper. All AT99 provided was a vague group of statements about noise patterns, ending with a far-reaching claim that if the test doesn’t reject, “then we have no explicit reason to distrust uncertainty estimates based on our analysis.” As a result, researchers have treated the RC test as encompassing every possible specification error, including ones that have no rational connection to it, erroneously treating non-rejection as comprehensive validation of the signal detection regression model specification.

This is incomprehensible to me. If in 1999 someone had submitted a paper to even a low-rank economics journal proposing a specification test in the way that AT99 did, it would have been annihilated at review. They didn’t state the null hypothesis mathematically or list the assumptions necessary to prove its distribution (even asymptotically, let alone exactly), they provided no analysis of its power against alternatives nor did they state any alternative hypotheses in any form so readers have no idea what rejection or non-rejection implies. Specifically, they established no link between the RC test and the GM conditions. I provide in the paper a simple description of a case in which the AT99 model might be biased and inconsistent by construction, yet the RC test would never reject. And supposing that the RC test does reject, which GM condition therefore fails? Nothing in their paper explains that. It’s the only specification test used in the fingerprinting literature and it is utterly meaningless.
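
The claim that a residual-consistency check can pass while the coefficients are badly biased is easy to illustrate with a generic analogue. The sketch below uses a toy chi-squared test on the residual sum of squares (not the actual RC test): the error is correlated with the regressor, so the slope is biased by +0.5, yet the residual variance matches the assumed value and the test “validates” the broken model.

```python
import numpy as np

rng = np.random.default_rng(3)
n, true_beta = 1000, 1.0

x = rng.standard_normal(n)
e = 0.5 * x + np.sqrt(0.75) * rng.standard_normal(n)   # var(e) = 1, but E[e | x] != 0
y = true_beta * x + e

beta = (x @ y) / (x @ x)          # biased: converges to 1.5, not 1.0
resid = y - beta * x
stat = resid @ resid              # compared against chi2 assuming unit error variance

dof = n - 1
crit_95 = dof + 1.645 * np.sqrt(2 * dof)   # normal approximation to the chi2 95% point
print(bool(stat < crit_95))       # True: the test does not reject, despite the biased slope
```

A residual-variance check simply has no power against this violation, which is the sense in which non-rejection tells us nothing about the GM conditions.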

The Review Process

When I submitted my paper to CD I asked that Allen and Tett be given a chance to provide a reply which would be reviewed along with it. As far as I know this did not happen, instead my paper was reviewed in isolation. When I was notified of its acceptance in late July I sent them a copy with an offer to delay publication until they had a chance to prepare a response, if they wished to do so. I did not hear back from either of them so I proceeded to edit and approve the proofs. I then wrote them again, offering to delay further if they wanted to produce a reply. This time Tett wrote back with some supportive comments about my earlier paper and he encouraged me just to go ahead and publish my comment. I hope they will provide a response at some point, but in the meantime my critique has passed peer review and is unchallenged.

Guessing at Potential Objections

1. Yes but look at all the papers over the years that have successfully applied the AT99 method and detected a role for GHGs. Answer: the fact that a flawed methodology is used hundreds of times does not make the methodology reliable, it just means a lot of flawed results have been published. And the failure to spot the problems means that the people working in the signal detection/Optimal Fingerprinting literature aren’t well-trained in GLS methods. People have assumed, falsely, that the AT99 method yields “BLUE” – i.e. unbiased and efficient – estimates. Maybe some of the past results were correct. The problem is that the basis on which people said so is invalid, so no one knows.

2. Yes but people have used other methods that also detect a causal role for greenhouse gases. Answer: I know. But in past IPCC reports they have acknowledged those methods are weaker as regards proving causality, and they rely even more explicitly on the assumption that climate models are perfect. And the methods based on time series analysis have not adequately grappled with the problem of mismatched integration orders between forcings and observed temperatures. I have some new coauthored work on this in process.

3. Yes but this is just theoretical nitpicking, and I haven’t proven the previously-published results are false. Answer: What I have proven is that the basis for confidence in them is non-existent. AT99 correctly highlighted the importance of the GM theorem but messed up its application. In other work (which will appear in due course) I have found that common signal detection results, even in recent data sets, don’t survive remedying the failures of the GM conditions. If anyone thinks my arguments are mere nitpicking and believes the AT99 method is fundamentally sound, I have listed the six conditions needing to be proven to support such a claim. Good luck.

I am aware that AT99 was followed by Allen and Stott (2003) which proposed TLS for handling errors-in-variables. This doesn’t alleviate any of the problems I have raised herein. And in a separate paper I argue that TLS over-corrects, imparting an upward bias as well as causing severe inefficiency. I am presenting a paper at this year’s climate econometrics conference discussing these results.


The AR6 Summary paragraph A.1 upgrades IPCC confidence in attribution to “Unequivocal” and the press release boasts of “major advances in the science of attribution.” In reality, for the past 20 years, the climatology profession has been oblivious to the errors in AT99, and untroubled by the complete absence of specification testing in the subsequent fingerprinting literature. These problems mean there is no basis for treating past attribution results based on the AT99 method as robust or valid. The conclusions might by chance have been correct, or totally inaccurate; but without correcting the methodology and applying standard tests for failures of the GM conditions it is mere conjecture to say more than that.



29 Thoughts to “The Cover-Up Of Problems In The IPCC Report (by Prof. Antero Ollila) + The IPCC’s Methodology Is Fundamentally Flawed (by Ross McKitrick)”

  1. Charles Hazell

    Carbon dioxide has been measured regularly since 1958 at the Mauna Loa facility on the Hawaiian Islands. CO2 atmospheric content was 315 ppm in 1958 and is today 415 ppm. This means the CO2 content of the atmosphere was 0.0315% in 1958 and is today 0.0415%. Earlier measurements published in 2007 gave a previous high of 414 ppm in 1943, 370 ppm in 1858, and 450 ppm in 1812.
    Despite the increased use of fossil fuels in the 20th century, CO2 content has remained steady at between 0.03% and 0.04% for the last 200 years.
    There is no manmade climate warming emergency.

    1. P. J. Flanders

      If you have ever tried to convince someone that the MSM is lying to us, the following video is a powerful tool to accomplish that:

  2. robertL

    Thank you for these refutations of the UN IPCC frauds.

    Most appreciated.

    To all those AGW sycophants out there – you are supporting a hidden agenda.

    Shame on You.

  3. Ed taster

    Very interesting. I’d love to see an expert debate on the subject

    1. ED – re your comment: I’d love to see an expert debate on the subject.

      The warmists won’t debate anymore because they got slaughtered every time.

  4. Ice Age Eugenics Tipping Point Now.inf0... get stuff

    People are waking up to the scam, so they’re hurrying their push for booster shots to kill off the masses before they revolt

    Because of all the deaths, injuries and outright fraud committed via PCR-diagnosed “cases,” the people of the world are rapidly awakening to the astonishing truth of all this: It’s a scam to achieve global depopulation and authoritarian control over humanity.

    As the lockdowns intensify, the people are taking to the streets and demanding an end to the scam. With hundreds of thousands now protesting in France, the establishment there is locking unvaccinated people out of grocery stores, seeking to literally starve out the unvaccinated. This will only serve to wake up the people even faster, adding to the levels of discontent that are now exploding across the planet.

    We have arrived at a tipping point. Humanity is awakening at an accelerating rate, so now the globalists are trying to exterminate people as rapidly as possible to stop the spread of truth.

    1. prioris

      You have to be careful of naturalnews and the guy that runs it. There can be good info there, but you should be aware that he is fake opposition and likely works for intel agencies. Remember, he promoted Trump.

      1. Homo Erectus

        August 21, 2021 at 9:30 pm
        You have to be careful of the naturalnews and guy [Mike Adams] that runs it.

        Situation Update, August 24th, 2021 – Governments unleash TOTAL WAR against the human population with vaccines, starvation and death camps
        Urgent!: the Spike-Protein is Graphene Oxide inside your body “spiking” you’

        Call me anytime Prioris Priapism.



    Excerpt from my paper
    “Rode and Fischbeck, professor of Social & Decision Sciences and Engineering & Public Policy, collected 79 predictions of climate-caused apocalypse going back to the first Earth Day in 1970. With the passage of time, many of these forecasts have since expired; the dates have come and gone uneventfully. In fact, 48 (61%) of the predictions have already expired as of the end of 2020.”

    Climate doomsters have a perfect NEGATIVE predictive track record – every very-scary climate prediction, of the ~80 they have made since 1970, has FAILED TO HAPPEN.

    Fully 48 of these predictions expired at the end of 2020. Never happened! Never will!
    What are the odds at 50:50 per prediction?
    3.6*10^-15 = 0.0000000000000036
    That is one in 281 Trillion!

    There is a powerful logic that says no rational person or group could be this wrong, this utterly obtuse, for this long; they followed a corrupt agenda, and they lied again and again.

    The ability to predict is the best objective means of assessing scientific competence, and the global warming alarmists have NO predictive track record – they have been 100% wrong about everything and nobody should believe these fraudsters – about anything!

    1. Jack


      I predicted that the 2021 monsoon season would be wetter than average back in early June, before the fun started, based only on looking at the ground. I don’t need a medal, but I’ll have a banana.

      Let’s look at the facts and numbers.

      According to the USGS web pages about the water cycle (and probably many, many other sources, if you don’t trust the gubmint):

      90% of atmospheric moisture comes from evaporation of surface water (oceans, rivers, lakes and streams);

      the remaining 10% of atmospheric moisture comes from transpiration of water by plant life.

      Bring on the extended, exceptional southwest drought.

      Hey! Why did you turn off the water?

      Plant life doesn’t have sufficient water to keep pumping it out, so it scales way back to conserve water for basic biological function. Transpiration is inhibited. Photosynthesis is scaled way down, so the carbon cycle is messed up, for all you carbon dioxide level fanatics out there.

      A lot of plant life dies, especially if they are less drought tolerant.

      Plants are dehydrated, so they’re more susceptible to pests and disease, and many perish.

      Plant life can’t afford to waste water with unnecessary cooling via transpiration, but the sun is still hot, so the color of the foliage changes to a lighter green, to reflect more light, and depending on the angle of incidence light is reflected to the ground, which heats it up.

      There isn’t any water for growth, so new growth is inhibited.

      Everyone is so parched, that they’re more susceptible to wildfires, and wildfires burn out of control.

      The wind still blows occasionally and knocks off foliage.

      Water is scarce, so the foliage isn’t replaced, resulting in less shade.

      The sun still shines, and with less shade the ground absorbs more heat.

      With all of these trees and plants dying and little new growth, there is a lot more bare ground, which absorbs more heat.

      Water is drawn down by what has survived.

      Water has a high heat capacity. Wet soil changes temperature more slowly than dry soil, given the same amount of energy, so the dry soil heats up more. Granted, the soil color gets lighter and reflects more light, but it doesn’t reflect all of it. If it doesn’t get water soon, it will keep getting warmer.

      All of these factors make the ground warmer, and the heat is conducted to the air, which heats up the atmosphere.

      Evaporation requires latent heat, which comes from the environment. Transpiration is evaporation.

      Cloud formation is dependent on a balance of atmospheric moisture, temperature and pressure. It’s warmer. Plant life isn’t producing as much moisture. Thus, there are fewer clouds, and it’s sunnier.

      Lakes, ponds and other natural and manmade reservoirs are getting lower, and due to the shapes of all of these reservoirs there is less surface area from which water is evaporated. Granted, with less thermal mass the reservoirs heat up, which helps evaporation but kills aquatic life, including aquatic plant life. See above.

      And humans keep stripping trees and vegetation, covering the ground with asphalt, concrete and everything else to build roads and mini-malls. Plant life is reduced, and the artificial groundcover heats up and reduces surface area from which water can get into the atmosphere via evapotranspiration from natural growth.

      Will evaporation from the ocean make up for the reduced atmospheric moisture? Not a chance.

      Evaporation is dependent on relative humidity, wind speed, temperature, surface area and sunlight. Well, a portion of the moisture from plant life is missing, so it is drier, so that helps. The water temperature might be slightly higher, so that helps. The air temperature might be higher, so that helps. That’s a lot of might. The air temperature over the ocean probably won’t be enough to prevent cloud formation, so you’re not going to get more sunlight. We can’t do the math without knowing all of the variables, but if it automatically balanced there wouldn’t be any drought.

      And you still have to deal with the land temperature. When the clouds come inland, where the temperature is too high the clouds evaporate away, so it won’t rain to cool things down.

      There’s your global warming in a nutshell.

      The plant life is essential to the water cycle, and water is essential to plant life.

      Water in the ground is essential to keeping it cool.

  6. Jack

    I just looked at the 24-hour snow radar.

    It shows snow from north-central to southwest Colorado, across more than a third of the width of the state.

    Wyoming had scattered snow over a large area, too.

    It must be those pesky carbon dioxide levels, because there’s no other way to explain it.

    We’re all just frogs in a pot of water, and they’re cranking up the heat to maximum as quickly as possible now.

    end sarcasm

  7. Jack

    I just tried to have a conversation with an online meteorology forum about my theory pointing to an ultimate cause of extreme weather, with plenty of supporting observations. They were quite rude and refused to answer my specific questions with any honesty. Their only explanation was synoptic variability or some other meteorological *effect*. I kept pointing out that their answers were all effects that had some cause.

    They kept insisting that the scale of my observations was too small, when they had no way of knowing what the scale of the effects was.

    I also offered a published article about the scale of the phenomenon upfront, as independent verification of my observations about the root of the ultimate cause.

    They devolved into logical fallacies, including Appeal to Authority in the form of, to paraphrase, “Listen to me, because we’re meteorologists and have diplomas, so it is because we say so, even though we haven’t answered anything. So show a little respect, dumbass.”

    Complete hubris.

  8. Jack

    One of the aforementioned meteorologists dismissed my observation of snow at an ambient temperature of 75°F, saying it was thermodynamically impossible.

    When I asked why exactly it was thermodynamically impossible, he said he wasn’t going to discuss the impossible.

    1. Bible Scholar

      … also, about Tom’s weenus probs and also noting the bible behind your cat in the vid looked pretty worn out…so did ja ever figure out the thermodevoutly odd physics of this phenomenon with your sassy and scientifically devout and analytical mind? Was just wonderin’ after the aforementioned and recommended video that you made.

      1. P. J. Flanders

        I like point 11. Any Catholic who mentions Jesus’ foreskin now will be excommunicated from the Church.
        Is that like a get out of jail free card?

      2. Jack

        Bible Scholar,

        Mind if I just call you BS?

        I really don’t care. Nobody wants to back up their conclusions that refute all of the evidence, because they think their doctorates give them the final say.

        There are more interesting problems.

        Are we really supposed to believe that clouds from wimpy Tropical Storm Marty are overwhelming western New Mexico and Colorado, after climbing from sea level to elevations of 8,000 to over 10,000 feet in southeast Arizona and southwest New Mexico and 13,000 feet in Colorado or squeezing through narrow hallways and then taking the escalator?

        Okay, the trek across the hot desert might lift up the clouds from the Pacific some, but the low temperatures and pressures up there are dumping buckets of rain and drying out the air well before any moisture gets much past Silver City.

        How is all of this weather growing to the west in northeast Arizona when the wind from the west on one side is fighting the wind from the east on the other?

        And the winds that are supposed to be pushing all of this moisture up from across the border are so disorganized because the winds are all bouncing off of mountains in Mexico, southeast Arizona and southwest New Mexico and being deflected away from the moisture.

        And these winds are picking fights with high pressure systems up at 5000 to 13000 feet in the mountains that are at a higher pressure than Phoenix at 1450 feet.

        The moisture is coming from the forests in the Rockies, Coconino, Tonto, Apache-Sitgreaves, White Mountains, etc.

        Sometimes old wives’ tales have a grain of truth. Forests do bring the rain.

        Forests bring the rain by adding enough additional moisture to what is already in the atmosphere to make the rain fall.

  9. Jack

    Mead and Powell both had two surplus days in a row!

    Powell was up roughly 4 inches in two days!

    Powell hasn’t had a surplus day in many weeks.

  10. Jack

    The other day when I was reading about atmospheric electricity, I came across an interesting fact that I recall reading roughly two decades ago:

    Natural growth strengthens the electric field at the surface.

    Recently I have been delving into the effect of evapotranspiration on cloud formation and weather. There are several recent studies that assert a correlation, i.e. a feedback mechanism between vegetation and weather/climate.

    I suggest that in addition to the movement of moisture from the ground to the atmosphere a correlation between evapotranspiration and weather is due to the alteration of the electric field by natural growth. In other words, the vegetation must have adequate water to enable the feedback mechanism via the enhancement of the electric field.

    Now just look at recent weather.

    There was a 20+ year drought in the southwest. I’m not saying that it is over, but conditions have improved dramatically.

    The west, northwest and other regions have had exceptional and extreme drought.

    In 2020 we saw the start of a global cooling trend.

    Now, I will say it again. Since 2017 there have been a lot of positive changes to the ecology in the White Mountains of Arizona. A majority of those improvements were accomplished in November 2019, when the runoff was reduced substantially, putting water back into a lot of very thirsty ground. I have noted amazing improvements in the area from Show Low to Taylor-Snowflake.

    In February 2021, the local newspaper White Mountain Independent reported on high water tables in the White Mountain region and how well positioned the region was in the event of a water crisis.

    Summer 2021 has been one of the wettest monsoon seasons on record, from Arizona to Texas and other parts, one year after Arizona’s driest monsoon season on record.

    There has been record-smashing precipitation in many areas all over the country.

    Winter 2020-21 saw extreme record-breaking cold and snow across the country.

    We’re seeing changes in the jet stream, which the MSM wants to tell us could collapse and kill us all.

    The White Mountains region was directly in the path of major weather systems that stretched clear across the country in every direction on different days.

    Can you see the causation now?

    The improvements in the ecology, higher water tables and resulting enhanced evapotranspiration affected the electric field, resulting in restoration of the water cycle.



    1. Homo Erectus

      Ooooh… I jus loves it when you gets all angry and fired up there DENISS or DENNIS or DENICE or SASSY… keep going!

      1. P. J. Flanders

        I proofread better.

  13. From For Life on Earth

    The FLOE Show 1: Who’s pulling the strings on the Great Reset global putsch?

    Highly informed discussion — Strongly recommended video!

  14. Ice Age Eugenics and Biodigital-Convergence Now.Info

    They are committing mass murder notes Lee Merritt, MD

    Not caused by a virus argues Lee Merritt, MD


    MAD COW Vaccine – mRNA outside Cells are PRIONS

  15. Michal Krawczynski

    Just heard the summer here will basically end tomorrow (already last week was bad, a week ago my tomato plants were healthy, now they’re all rotten from the cold and rain…) and it’s still August. We used to get 30+ in September and October. Now we don’t get much of that even in August. From the very start in March 2020 I’ve been thinking this is the ultimate distraction and controlled demolition. Perhaps that’s why the global “government” doesn’t care at all about Sweden. Perhaps over there they won’t even get a chance to flee, so no need for lockdowns and movement restrictions over there…

  16. Ice Age Deca-Mega-Self-dissemicide Vaxx(s) Now.Info


    Self-disseminating shedding vaccines to advance genocide/sterilization.

    Self-disseminating vaccines have their roots in the Australian effort to create sterilizing vaccines for [human and] small mammal control15,16, and have also been developed and tested experimentally as a tool for vaccinating rabbits [humans] against [reproductive capacity] myxomatosis and rabbit haemorrhagic fever17,18,19. Their obvious advantage, of course, is that for each animal you vaccinate directly, additional animals are vaccinated for ‘free’ either through behavioural transmission of a conventional vaccine or through the contagious spread of a transmissible vaccine.

    A major distinction among self-disseminating vaccines with considerable epidemiological consequences is whether the vaccine is ‘transmissible’ and capable of indefinite [or infinite] transmission or is ‘transferable’ and restricted to a single round of transmission (Fig. 1). In the next sections, we review the basic epidemiological theory quantifying the gains provided by each type of self-disseminating vaccine. These theoretical results use the classical epidemiological concept of a basic reproductive number, R0, that quantifies the number of secondary ‘vaccine infections’ created by the first vaccinated individual in the population.

    One-step, [self-disseminating] ‘transferable’ vaccines reduce vaccination effort despite being dead ends… so… multi-multi-booster shots to the rescue.

    We now have the technology to develop vaccines that spread themselves

    Future Zombies. Rudolf Steiner 1923 Prophecy: A Vaccine to Sever the Spiritual Connection in Mankind | The Truthseeker
    About ICN2 – ICN2 Graphene Vaccine Neuralink Zombie
    SV40 Another antivaccine zombie meme: polio vaccine and SV40 and cancer, oh, my! – Science-Based Medicine

  17. Current-Lee Merritt, MD, stay-Current-Lee

    Lee Merritt, MD – Brighteon [maybe the brightest best connected/credentials/logos on the brighter side of genocide]

  19. Lucy

    I read this after my walk with Jeffy today… kisshes for you Cap’n;).
