Overwhelmed by the effects of the COVID-19 pandemic, citizens and governments have had to push aside their concerns over the more threatening yet distant matter of climate change. Despite global lockdowns unprecedented in human history, CO2 emissions were barely impacted and are expected to continue increasing in the near future as economic activity resumes. The negative shock to the global economy in 2020 pales in comparison to the long-term costs of climate change, which I will try to estimate in this paper using Bayesian econometric methods.
Using CO2 and temperature anomaly data collected by the National Oceanic and Atmospheric Administration (NOAA), I will examine trends from 1995-2020, build an estimate of economic costs based on temperature anomalies, and lastly create a Bayesian estimate of the economic costs of climate change in 2040. This paper has a number of limitations. First, the raw data provided by NOAA requires a high degree of cleaning and contains some imperfections. Second, costs from natural disaster events vary wildly depending on a large number of factors (insurance costs, labor hours lost, human toll) that are too complex to be fully addressed here and take years to calculate and understand, so the focus will mainly be on material damages. Third, this paper can be considered a best-case scenario: numerous feedback loop events are not included in the model, so the estimates are limited to a function of the current trends in temperature anomalies and carbon emissions. Despite these constraints, the estimates are still significant and far from trivial in their potential impact.
The author of “The Uninhabitable Earth”, David Wallace-Wells, describes the effects of climate change as a “cascade”: droughts in 2011 fueled civil unrest in Syria, which escalated into a full-blown civil war against the Assad regime; a refugee crisis in Europe ensued, feeding a political shift towards nationalist populism. The costs of climate change are human, economic, and extremely hard to calculate accurately because of these interconnections.[1]
At this point in history we have moved past questioning the existence of this phenomenon and now shift towards calculating its costs as a way to encourage productive policies that mitigate the dangers. These policies have so far proven quite unpopular despite the risks we face as a species. In France a gasoline tax led to some of the largest riots in decades in late 2018, shutting down coal plants continues to cause resentment in parts of the United States, and carbon taxes are still not widely implemented, despite the broad consensus about their effectiveness, due to pushback from the energy lobby. The societal hurdles remain steep and numerous as the situation deteriorates, with 2020 on track to be one of the three hottest years on record.[2]
To summarize the context of this paper, I display below the increase in CO2 around the world and the national trend in temperature anomalies in the US. The threshold of 400 ppm, which scientists warned was a significant and dangerous milestone, was cleared last decade as emissions continued unabated. In the US we also see that the temperature anomaly now averages around one degree Celsius over pre-industrial levels. The relationship between CO2 and temperatures is also demonstrated below by a Markov Chain Monte Carlo (MCMC) regression, which will be explained later in the paper.
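For reference, a minimal sketch of the call behind the output below, assuming a data frame climate_df with columns anomaly (US temperature anomaly, in degrees Celsius) and CO2 (ppm); the data frame and variable names are illustrative:

library(MCMCpack)  # provides MCMCregress(), a Gibbs sampler for linear regression

# Regress the US temperature anomaly on atmospheric CO2 using the default vague
# priors: 1,000 burn-in iterations followed by 100,000 retained draws
temp.MCMC <- MCMCregress(anomaly ~ CO2, data = climate_df,
                         burnin = 1000, mcmc = 100000)
summary(temp.MCMC)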
##
## Iterations = 1001:101000
## Thinning interval = 1
## Number of chains = 1
## Sample size per chain = 100000
##
## 1. Empirical mean and standard deviation for each variable,
##    plus standard error of the mean:
##
##                   Mean         SD     Naive SE Time-series SE
## (Intercept) -2.8594818 1.41666694 0.0044798942   0.0044798942
## CO2          0.0098461 0.00366398 0.0000115865   0.0000115865
## sigma2       0.9352993 0.07762145 0.0002454606   0.0002473341
##
## 2. Quantiles for each variable:
##
##                    2.5%         25%         50%        75%      97.5%
## (Intercept) -5.63373592 -3.81452552 -2.85818690 -1.9104328 -0.0735971
## CO2          0.00264716  0.00739355  0.00984075  0.0123166  0.0170147
## sigma2       0.79548701  0.88096068  0.93106032  0.9849581  1.0986190
As shown above both graphically and in the MCMC results, temperature anomalies rise in tandem with CO2 emissions. This relationship has been understood since the nineteenth century and is easy to confirm: as CO2 concentrations (in parts per million) increase, temperatures move in the same direction.
Based on the posterior mean, each additional part per million of CO2 in the atmosphere leads to an increase in US temperatures of around 0.0098 degrees Celsius on average. Studies show that increased heat in the atmosphere exacerbates extreme climatological events, since heat provides additional energy to global weather systems. We now ask ourselves: based on this pattern, what does the future hold twenty years from now?
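The model reported below can be reproduced with something like the following (a sketch: co2.ts is assumed to be the annual CO2 series as a ts object, and auto.arima from the forecast package is assumed to select the specification shown):

library(forecast)  # auto.arima() and forecast()

# Fit an ARIMA model to the annual CO2 series and project 20 years ahead (2020-2040)
co2.arima <- auto.arima(co2.ts)           # selects ARIMA(2,2,3) on this series
co2.fc    <- forecast(co2.arima, h = 20)  # point forecasts reach ~465 ppm by 2040
co2.arima                                 # printing the fit shows the output below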
## Series: co2.ts
## ARIMA(2,2,3)
##
## Coefficients:
##              ar1       ar2        ma1       ma2       ma3
##        -0.082906  0.093546  -1.344201  0.229450  0.124136
## s.e.    0.375026  0.074542   0.375158  0.555861  0.190757
##
## sigma^2 estimated as 0.0802071:  log likelihood=-116.7
## AIC=245.39   AICc=245.5   BIC=273.11
Looking at this twenty-year timeframe (2020-2040), I created an ARIMA(2,2,3) model to forecast CO2 concentration levels in the atmosphere. Based on the results, the forecast fitted values reach 464.797 ppm in 2040, compared to 414.72 ppm in 2020. This difference of around 50 ppm, assuming a “business-as-usual” trend, would translate into a 0.49 degree Celsius warming of US temperatures based on the previous results. That puts the overall temperature anomaly at roughly 1.4 degrees, in line with the expectations of the United Nations' Intergovernmental Panel on Climate Change (IPCC) Sixth Assessment Report.[3]
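Putting the two estimates together, the implied warming is:
\[ (464.797 - 414.72)\ \text{ppm} \times 0.0098\ ^{\circ}\text{C/ppm} \approx 0.49\ ^{\circ}\text{C} \]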
A CO2 level of 465 ppm is not a trivial amount. When we consider that half of all anthropogenic CO2 emissions occurred between 1990 and 2020, an additional 50 ppm is quite considerable. For context, pre-industrial CO2 levels were around 275 ppm, meaning we are on pace to reach 1.7 times pre-industrial carbon levels by 2040, with fairly devastating consequences. As we can see in the chart below, even at current CO2 levels we are already experiencing increasing economic damage from severe weather events in the United States.
In order to estimate the relationship between increasing temperature anomalies and damages from natural disaster events, I used a Bayesian Standard Normal Linear Regression Model. This method stands in contrast to the more common frequentist methods, which rely on hypothesis testing and sampling to reach conclusions.
The Standard Normal Linear model is the simplest and most straightforward of the Bayesian econometric models. It takes the form:
\[ y_t = X_t \beta + \epsilon_t, \quad t = 1, \dots, n, \qquad \epsilon_t \mid X \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2) \]
Where \(y_t\) is the economic damage in a given year \(t\), \(X_t\) is the temperature anomaly in that year, and \(\epsilon_t\) is the error term. This resembles the frequentist linear regression model with normally distributed errors, but by adding a prior and simulating the posterior with Markov chains it incorporates Bayes' theorem into the estimation.
Markov Chain Monte Carlo (MCMC) is a simulation method that relies on repeated sampling and the law of large numbers: by drawing many samples, the sample mean converges to the expected value \(\mu\). The closer we get to infinite simulations, the closer we get to \(\mu\).
\[ \bar X_n \to \mu \quad \text{as} \quad n \to \infty \]
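A quick base-R illustration of this convergence (purely illustrative, not part of the model):

set.seed(42)
draws <- rnorm(100000, mean = 2, sd = 1)      # simulate draws with known mean 2
running_mean <- cumsum(draws) / seq_along(draws)
running_mean[c(10, 1000, 100000)]             # approaches 2 as n grows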
The distribution of the estimates we are seeking is called the “posterior”, given by:
\[ P(\beta,\tau|y,x) \propto P(y,x|\beta, \tau)P(\beta,\tau)\]
As in the frequentist setting, we assume independence, \(P(\epsilon,x|\beta,\tau) = P(\epsilon|\beta,\tau)P(x|\beta,\tau)\); exogeneity, \(P(x|\beta,\tau) = P(x)\); and i.i.d. normal errors, \(\epsilon|\beta,\tau \sim N(0_n, \tau^{-1}I_n)\).
The posterior distribution is in turn derived from two components: the “likelihood” function, which measures the goodness of fit of the model, and the “prior”, which represents a priori assumptions and beliefs about the parameters and does not depend on the data.
The likelihood, representing the density of the data conditional on the parameters of the model, is given by:
\[ P(y,x|\beta,\tau) = L(\beta,\tau;y,x) \]
which expands to:
\[ L(\beta,\tau;y,x) \propto \tau^{n/2} \exp\left[-\frac{\tau}{2}(y-x\beta)'(y-x\beta)\right] \]
The prior used in this model is the standard conjugate vague prior, which factors as:
\[ P(\beta,\tau) = P(\beta)P(\tau) \]
With a flat prior on \(\beta\) and the Jeffreys prior \(P(\tau) \propto 1/\tau\), this implies:
\[ P(\beta,\tau) \propto \frac{1}{\tau} \]
To implement all this in practice, I used the MCMCregress function from the MCMCpack package in R. This function uses Gibbs sampling with a multivariate normal prior on the coefficients and an inverse-gamma prior on the error variance to produce the posterior sample.
With enough MCMC simulations, the posterior mean approaches the true coefficient linking the variables. Since we are analyzing country-wide temperature data driven by planetary-level phenomena, I use the default vague priors; attempting to provide more informative priors produced no noticeable improvement in the model.
# Gibbs sampler: annual damage ($B) regressed on mean temperature anomaly.
# b0 = 0, B0 = 0 sets a flat (vague) normal prior on the coefficients;
# c0 = d0 = 0.001 sets a vague inverse-gamma prior on the error variance.
lm.MCMCx <- MCMCregress(damage ~ mean_temp, data = short_temp,
                        burnin = 1000, mcmc = 100000,
                        b0 = 0, B0 = 0,
                        c0 = 0.001, d0 = 0.001)
summary(lm.MCMCx)
##
## Iterations = 1001:101000
## Thinning interval = 1
## Number of chains = 1
## Sample size per chain = 100000
##
## 1. Empirical mean and standard deviation for each variable,
##    plus standard error of the mean:
##
##                  Mean        SD  Naive SE Time-series SE
## (Intercept)   157.175   111.081  0.351270       0.351270
## mean_temp     150.290   104.200  0.329509       0.329509
## sigma2      63514.272 21125.008 66.803139      72.538607
##
## 2. Quantiles for each variable:
##
##                   2.5%        25%       50%       75%      97.5%
## (Intercept)   -61.6879    84.7004   157.171   229.407    377.097
## mean_temp     -56.4154    81.9734   150.598   218.427    356.163
## sigma2      34467.8263 48746.3403 59515.171 73806.688 115568.213
plot(lm.MCMCx)  # trace and density plots for each posterior parameter
From 100,000 simulations we can see that there is a clear positive relationship between economic damage from climate events and temperature anomalies. As temperatures get warmer in the US we can expect more disasters, which will incur increasingly higher costs.
From the posterior mean we determine that each additional degree Celsius of temperature anomaly translates into almost $150B in damages in a given year, on average. This lies in stark contrast to the $43.9B average of the past 40 years. It is worth noting that this value captures only a fraction of all the potential costs incurred from a warming planet, leaving out factors such as health costs, insurance costs, labor hours lost, and the human toll.
Because we expect temperatures in 2040 to be roughly 0.5 degrees Celsius over current levels, this translates into an additional $75B in damages from extreme weather events every year compared to 2020.
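The back-of-the-envelope arithmetic, using the posterior mean slope:
\[ 0.49\ ^{\circ}\text{C} \times 150.290\ \$\text{B}/^{\circ}\text{C} \approx \$73.6\text{B} \approx \$75\text{B} \]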
Lastly, I ran a Heidelberger and Welch convergence diagnostic to confirm that the number of simulations is sufficient for the estimates to converge. As shown below, all tests passed.
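The diagnostic is produced by the heidel.diag function from the coda package, which MCMCpack loads as a dependency:

heidel.diag(lm.MCMCx)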
##
##             Stationarity start     p-value
##             test         iteration
## (Intercept) passed       1         0.655
## mean_temp   passed       1         0.237
## sigma2      passed       1         0.288
##
##             Halfwidth Mean  Halfwidth
##             test
## (Intercept) passed    157   0.688
## mean_temp   passed    150   0.646
## sigma2      passed    63514 142.176
Looking forward to 2040 and a hypothetical 1.4 degree Celsius increase in temperature anomalies, annual economic damages rise to roughly $367.58B:
\[ \text{Estimated damages}_{2040} = 157.175 + 150.290 \times 1.4 = 367.581 \]
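The same point estimate can be computed directly from the posterior draws, which also yields a credible interval (a sketch; the column names follow MCMCregress's output):

damages_2040 <- lm.MCMCx[, "(Intercept)"] + lm.MCMCx[, "mean_temp"] * 1.4
mean(damages_2040)                        # ~367.6 ($B)
quantile(damages_2040, c(0.025, 0.975))   # 95% credible interval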
This value is close to an estimate by The Universal Ecological Fund in 2017, which expects $360B a year on average in damages by the next decade.[4] Their estimate includes health costs and other economic losses, categories left out of the calculation in this paper. Even so, the simple model presented here lands within a reasonable range of probable costs.
As almost everyone acknowledges at this point, severe climate events are occurring more often and getting worse: wildfires in California in December, hurricanes named Eta and Iota devastating Central America, Arctic heat waves, and so on. The impacts are far from theoretical, even if they are not easily quantifiable.
Estimating the economic consequences of all this has so far produced models and forecasts with extremely wide margins of error, varying by tens of billions of dollars each year. The results from my model lie within that range of possibilities, not necessarily because of the model's strength, but because of the wide variance in predicting what economic and atmospheric conditions will look like twenty years from now. Much of the uncertainty also depends on how governments around the world react.
As mentioned, the results obtained here are likely best-case scenarios. With every new UN report on the condition of the planet we are reminded that time is running out and that the price tag will keep growing for every year we delay.[5] As Andreas Malm writes in “The Progress of This Storm”: “Man-made weather is never made in the present… Global warming is a result of actions in the past.”[6] In other words, prior inaction is coming back to haunt us, and it is for the sake of future generations that we must enact substantive changes immediately, for the costs of doing nothing will be much higher.
[1] David Wallace-Wells, The Uninhabitable Earth, New York, New York: Tim Duggan Books, 2019.
[2] Andrew Freedman, Earth just notched its warmest November, as 2020 closes in on record for hottest year, Washington Post https://www.washingtonpost.com/weather/2020/12/07/warmest-november-hottest-year/
[3] Intergovernmental Panel on Climate Change, Sixth Assessment Report - United Nations, 2020. https://www.ipcc.ch/assessment-report/ar6/
[4] The Universal Ecological Fund, The Economic Case for Climate Action in the United States Report, 2017. https://feu-us.org/case-for-climate-action-us/
[5] U.S. Global Change Research Program, Fourth National Climate Assessment Report, 2018. https://nca2018.globalchange.gov/
[6] Andreas Malm, The Progress of This Storm, Verso, 2018.
Adam B. Smith, 2010-2019: A landmark decade of U.S. billion-dollar weather and climate disasters, NOAA, 2020. https://www.climate.gov/news-features/blogs/beyond-data/2010-2019-landmark-decade-us-billion-dollar-weather-and-climate
Adam B. Smith & Jessica L. Matthews, Quantifying Uncertainty and Variable Sensitivity within the U.S. Billion-dollar Weather and Climate Disaster Cost Estimates, National Oceanic and Atmospheric Administration, 2015. https://www.ncdc.noaa.gov/monitoring-content/billions/docs/smith-and-matthews-2015.pdf
Stephen Leahy, Hidden Costs of Climate Change Running Hundreds of Billions a Year, National Geographic, 2017. https://www.nationalgeographic.com/news/2017/09/climate-change-costs-us-economy-billions-report/
National Oceanic and Atmospheric Administration - Data Access, https://www.ncdc.noaa.gov/data-access