The Social Cost of Carbon: Advances in Long-Term Probabilistic Projections of Population, GDP, Emissions, and Discount Rates

ABSTRACT: The social cost of carbon (SCC) is a crucial metric for informing climate policy, most notably for guiding climate regulations issued by the US government. Characterization of uncertainty and transparency of assumptions are critical for supporting such an influential metric. Challenges inherent to SCC estimation push the boundaries of typical analytical techniques and require augmented approaches to assess uncertainty, raising important considerations for discounting. This paper addresses the challenges of projecting very long-term economic growth, population, and greenhouse gas emissions, as well as calibration of discounting parameters for consistency with those projections. Our work improves on alternative approaches, such as nonprobabilistic scenarios and constant discounting, that have been used by the government but do not fully characterize the uncertainty distribution of fully probabilistic model input data or corresponding SCC estimate outputs. Incorporating the full range of economic uncertainty in the social cost of carbon underscores the importance of adopting a stochastic discounting approach to account for uncertainty in an integrated manner.

As the primary economic measure of the benefits of mitigating climate change, the social cost of carbon (SCC) has been called "the most important number you've never heard of" (Economist 2017; Roston 2021). Put simply, the SCC is an estimate, in dollars, of the economic cost (i.e., damages) resulting from emitting one additional ton of carbon dioxide (CO2) into the atmosphere. Conversely, it represents the benefit to society of reducing CO2 emissions by one ton, a number that can then be compared with the mitigation costs of reducing emissions. There are analogous metrics for methane (CH4) and nitrous oxide (N2O). The SCC has deep roots in economics. Indeed, many textbooks use carbon emissions and the resulting climate change as the canonical example of an externality that must be addressed through Pigouvian taxation or other means to maximize human welfare. In particular, basic economic theory recommends that an optimal tax on CO2 emissions (a carbon tax) be set equal to the SCC, with marginal damages measured along an optimal emissions trajectory (Pigou 1920). 1 But the relevance and application of the SCC go well beyond its role in determining an optimal Pigouvian tax. As political leaders and stakeholders debate both the broad outlines and the fine details of policies to reduce carbon dioxide emissions, the SCC lies in the background as a remarkably important calculation, used by the US federal government for more than a decade in developing vehicle fuel economy standards and power plant emissions rules. Such analyses have been a mainstay of the regulatory rule-making process since Executive Order 12291 was issued more than forty years ago. 2 The SCC has also been the basis for the value of federal tax credits for carbon capture technologies, beginning in 2018 (Rodgers and Dubov 2021), and zero-emissions credits for nuclear power in New York State. 3

The power grid operator for New York is working to include the SCC as a cost adder on top of energy supply bids submitted by power plants, thereby reflecting social costs in market prices and plant dispatch. 4 Many other states have used the SCC as the basis for climate policies and as a benchmark against which proposed carbon prices are compared. 5 Proposed applications include federal procurement decisions and royalties on oil and gas leases on federal land (Prest and Stock 2021; White House 2021b, sec. 5[b][ii]). 6 The construction of the SCC as an aggregate measure of benefits is also somewhat distinct from the distribution of those benefits. That is, because the consequences of climate change will differ across communities (by country, region, income, and social identity), the benefits of mitigating climate change will vary similarly. For example, rising temperatures are likely to impose heavier burdens on already hot (and often poor) countries like Bangladesh than on cold (and often rich) countries like Norway. Putting greater weight on dollar-value effects in poorer communities, that is, equity weighting (Errickson and others 2021), is not current standard practice, however. Rather, the distribution of effects (when available) is presented alongside the aggregate, unweighted summary. Weighting will become more important as our understanding of the distribution of effects improves.
Estimation of the SCC goes back to William Nordhaus and has recently seen increasing prominence. In 2018, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was awarded to Nordhaus (alongside Paul Romer) for his work incorporating climate change into economic analysis, including the role of the SCC in informing policy. 7 The SCC is typically estimated using integrated assessment models, such as the Dynamic Integrated Climate Change (DICE) model developed by Nordhaus. Integrated assessment models couple climate and economic models to estimate the economic effect of an incremental pulse of CO2 emissions (in tons) on climate and economic outcomes.

3. State of New York Public Service Commission, Case 15-E-0302 and Case 16-E-0270, "Order Adopting a Clean Energy Standard," https://documents.dps.ny.gov/search/Home/ViewDoc/Find?id=<44C5D5B8-14C3-4F32-8399-F5487D6D8FE8>&ext=pdf, page 131.

4. New York ISO, "Carbon Pricing," https://www.nyiso.com/carbonpricing.

5. Institute for Policy Integrity, The Cost of Carbon Pollution, "States Using the SCC," https://costofcarbon.org/states; Resources for the Future, "Carbon Pricing 101," https://www.rff.org/publications/data-tools/carbon-pricing-bill-tracker/; see also Johnson (2009).

6. Many aspects of climate policy decisions are not necessarily tied to the SCC. Essentially, these include all policy design issues beyond measuring benefits and balancing with costs, such as optimal R&D spending amid knowledge spillovers, cost-effective policy design (e.g., uniform standards versus flexible incentive-based policies), interactions between policies (Goulder 1995; Barrage 2020a, 2020b; Borenstein and others 2019), and differences in the distribution of the costs (and in certain cases government revenues) associated with different policy approaches. These are distinct from the question of estimating the marginal benefits of reducing emissions.
The net present value of changes in economic outcomes, divided by the number of tons in the pulse, delivers the SCC. However, many integrated assessment models used in SCC estimates have not kept up with rapidly evolving climate, economic, and demographic science. Moreover, as has been noted, many of the factors underlying the SCC are deeply uncertain: notably, our understanding of Earth's climate, the effect of climate change on economic outcomes, and future socioeconomic conditions that capture the discounted consequences of changes in emissions today. The need for robust policy decisions implies we should update the SCC over time to refine central estimates and the range of uncertainty as our scientific understanding progresses.
In this paper, we review efforts to update determinants of the SCC to reflect the best available science, based on the recommendations of a 2017 committee report by the National Academies of Sciences, Engineering, and Medicine (NASEM 2017). This updating is particularly relevant in light of Executive Order 13990 (January 20, 2021), which reestablished the Obama-era Interagency Working Group (IWG) on the Social Cost of Greenhouse Gases and directed it to update the SCC. We also note other research efforts on updating the SCC.
The NASEM report recommended creating an integrated framework comprising four components ("modules") underlying the SCC calculation: socioeconomics, providing probabilistic projections of population, gross domestic product (GDP), and emissions over multiple centuries; climate, an improved model of Earth's climate system and climate change; damages, the economic consequences of climate change, based on recent studies; and discounting, which aggregates present-value marginal damages using stochastic discount factors that correctly reflect the uncertain socioeconomic drivers. Figure 1 shows how the modules fit together, including how socioeconomics affects emissions trajectories, which are input into the climate module to project future temperatures. These temperatures are converted into a stream of future economic losses in the damages module (also influenced by socioeconomic trajectories), which are then discounted to a present value in the discounting module.
Because the SCC represents the marginal effect of an incremental ton of emissions, this entire model is run twice: once as a baseline and once with a small pulse of additional emissions (figure 2). The resulting change in the stream of economic damages per ton from this emissions pulse, in present value, is the SCC. More generally, when inputs to a module are uncertain (e.g., because of uncertainty about the climate's response to emissions or about future economic growth), modelers have incorporated that uncertainty through Monte Carlo analyses by taking draws of (potentially correlated) probability distributions of each random variable. The result is a distribution of SCCs, often summarized by its expected value. For example, the federal government's current interim value of $51/ton CO2 (IWG 2021) reflects the expected value of the SCC over uncertainty in the climate's warming response and scenarios of economic growth and population, at a 3 percent constant discount rate. The NASEM report noted that the IWG's estimates of the SCC, including the current interim $51/ton SCC value, used somewhat dated and often simplistic modules. For example, five socioeconomic scenarios were not developed with formal probabilities attached but were treated as equally likely. The scenarios did not incorporate the work done by economists, demographers, and statisticians to estimate and quantify uncertainty around long-term economic and population growth. Also, the discounting approach used a constant discount rate rather than treating the discount rate as stochastic; that choice becomes increasingly important as the decision horizon extends into the future. The IWG noted the potential for a declining term structure and correlation between the discount rate and damage outcomes but did not consider an explicit stochastic discount factor that accounts for both future discount rate uncertainty and, through uncertain socioeconomic outcomes, correlation with the damages being discounted.
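The two-run pulse calculation and the Monte Carlo loop described above can be sketched compactly. Everything in the sketch below (the damage function, the warming response, the growth distribution, and every parameter value) is an illustrative assumption, not the IWG's or any integrated assessment model's actual specification:

```python
import random

def damages(gdp_path, temp_path, frac_per_degree=0.005):
    """Stylized damages: a fixed fraction of GDP per degree of warming."""
    return [frac_per_degree * temp * gdp for gdp, temp in zip(gdp_path, temp_path)]

def scc_estimate(n_draws=200, horizon=300, pulse_tons=1e9, discount_rate=0.03, seed=0):
    """Monte Carlo sketch of the SCC pulse experiment (baseline run vs.
    pulse run, damages differenced, discounted, and divided by the pulse)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        warming_per_ton = rng.lognormvariate(-0.2, 0.3) * 1e-12  # degrees per ton (stylized)
        g = rng.gauss(0.02, 0.01)                                # uncertain annual growth rate
        gdp = [100e12 * (1 + g) ** t for t in range(horizon)]    # world GDP path, dollars
        base_temp = [1.2 + 0.01 * t for t in range(horizon)]     # baseline warming path
        pulse_temp = [temp + warming_per_ton * pulse_tons for temp in base_temp]
        # difference the two damage streams and discount to present value
        extra = [dp - db for db, dp in zip(damages(gdp, base_temp),
                                           damages(gdp, pulse_temp))]
        npv = sum(d / (1 + discount_rate) ** t for t, d in enumerate(extra))
        draws.append(npv / pulse_tons)                           # dollars per ton
    return sum(draws) / n_draws                                  # expected value of the SCC
```

The expected value over draws mirrors how the distribution of SCCs is summarized in the text; replacing the constant `discount_rate` with a draw-specific rate is what the stochastic discounting discussion later in the paper is about.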
To address such shortcomings, the NASEM report issued recommendations for improvement, which Executive Order 13990 specifically directed the IWG to consider. This paper documents recent work that has improved the scientific basis for the modules so that the IWG can update the SCC to reflect the best available science. Section I discusses the improved socioeconomic module, with long-term probabilistic projections of population, economic growth, and emissions. Section II illustrates how an incremental ton of emissions translates into climate and economic effects (damages). Section III discusses the crucial role of the discount rate, given recent research on declining equilibrium interest rates, plus the importance of using stochastic discount factors and the shadow price of capital for valuing effects on investment. Section IV then combines these elements into a simplified model of the SCC, with associated uncertainty bounds for the socioeconomic, climate, damages, and discounting components. Finally, section V concludes and raises issues that await future research.

I. Economic and Demographic Drivers of Climate Effects
Assessments of damages from climate change are influenced by projections of population, economic growth, and emissions. Population growth can drive emissions and increase or decrease total economic exposure to the health effects of climate change. Economic growth similarly affects both the level of expected emissions and the resulting damages, which are often estimated to scale with economic activity. For example, the monetization of mortality consequences typically depends on per capita income (Robinson, Hammitt, and O'Keeffe 2019). Economic growth projections can also influence the SCC through the discount rate if estimates are calculated using Ramsey-like discounting, where the discount rate is a function of the rate of economic growth: higher (lower) growth scenarios will yield a higher (lower) discount rate. Finally, projections of global emissions determine the background state of the climate system against which damages from an additional pulse of emissions are measured.
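The Ramsey-like discounting mentioned here has a standard textbook form (the notation below is the conventional one, not this paper's calibration):

```latex
r_t = \rho + \eta \, g_t
```

where $\rho$ is the pure rate of time preference, $\eta$ is the elasticity of marginal utility of consumption, and $g_t$ is the growth rate of per capita consumption, so higher-growth scenarios imply a higher discount rate, as described above.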
Estimates of the SCC are highly sensitive to socioeconomic and physical projections (Rose, Diaz, and Blanford 2017), but revised estimates have been based primarily on changes in socioeconomic projections, not on improved understanding of the climate system (Nordhaus 2017b). Explicitly considering realistic, probabilistic socioeconomic projections is thus important for improving the characterization of both the central tendency and the uncertainty in the SCC.
A robust characterization of socioeconomic contributions to SCC estimates would ideally incorporate probabilistic projections of population, economic growth, and emissions. The particular requirements of SCC estimation, however, pose significant challenges for generating such projections. One is the time horizon: given the long-lived nature of greenhouse gases in the atmosphere, the SCC needs to account for discounted damages two hundred to three hundred years into the future (NASEM 2017). Yet nearly all projections, such as the scenarios previously used by the IWG (2010) and the shared socioeconomic pathways used by the IPCC (Riahi and others 2017), end at year 2100 and are often scenario-based rather than probabilistic. New probabilistic projections that extend well into the future are required.
Another challenge is that although climate change can be projected from emissions scenarios consistent with globally aggregated projections of economic activity and population growth, the resulting climate damages are most appropriately estimated at a regional (or even local) scale. Thus, they require geographically disaggregated estimates of GDP and population.
A third challenge is that the future path of emissions likely depends on uncertain improvements in technology and on the scale and success of policy interventions outside the range of the historical record. That is, whereas historical data may be a reasonable guide to forecast population and economic activity, the same is not true for emissions. The SCC should be measured against our best estimate of future emissions, inclusive of future mitigation policies except the one under analysis.
A fourth challenge is the interrelated nature of these variables: the projections for each variable must be consistent with one another. For example, emissions intensity might be lower with higher economic growth (and its associated wealth and technological improvements).

I.A. Past Approaches to Socioeconomic Projections
In lieu of using fully probabilistic socioeconomic projections, researchers have typically turned to socioeconomic scenarios, which can provide consistency across analyses and still incorporate specific narratives. The IWG adopted a scenario approach in its initial estimates (IWG 2016), and these same scenarios support the interim estimates put forward by the Biden administration in January 2021 (IWG 2021). The five socioeconomic scenarios were drawn from the Energy Modeling Forum 22 exercise (Clarke and Weyant 2009), selected to span roughly the range of emissions outcomes in the full set of the forum's scenarios and thus represent uncertainty across potential socioeconomic projections. Only one of the scenarios represented future climate policy. The IWG extended the five scenarios to 2300 by assuming that GDP and population growth each decreased linearly to zero in 2300. The five scenarios were assigned equal probability for computing an expected value for the SCC (no such probabilistic interpretation existed for the work by the Energy Modeling Forum 22).
The IWG scenarios were critiqued for not spanning the uncertainty in a full set of relevant socioeconomic variables (e.g., GDP, population) or reflecting the broader scenario literature overall (Rose and others 2014; Kopp and Mignone 2012). The resulting SCC estimates, then, may not reflect damage calculations based on the full range of expected variation. The NASEM panel noted that the IWG did not provide a rationale for its scenario weighting or the choice to extend the scenarios from 2100 to 2300 by assuming that GDP and population growth each decreased linearly to zero. The panel recommended using a combination of statistical methods and expert elicitation to generate a set of probabilistic long-term projections for each variable.
Subsequently, a multidisciplinary research effort developed the shared socioeconomic pathways (SSPs) (Riahi and others 2017), scenarios intended primarily to support the assessment efforts of the Intergovernmental Panel on Climate Change (IPCC). Each of the five SSPs consists of quantified measures of development and an associated narrative describing plausible future conditions that drive the quantitative elements. The SSPs end in 2100, but researchers have offered extensions to 2300 (Nicholls and others 2020; Kikstra and others 2021). The SSPs are freely available and comprehensive, have an extensive publication record, and are expected to be used in the IPCC's Sixth Assessment Report. For these reasons, we use the SSPs as our primary point of comparison.
Scenarios in general, and the SSPs in particular, do not come (as the IWG assumed) with associated probabilities. That limits their utility in evaluating uncertainty. Although the SSP authors have themselves cautioned against using the SSPs in a probabilistic fashion, Ho and others (2019) sought to address this limitation through an expert survey assessing the likelihood of each SSP. Others have sought to guide scenario usage by characterizing the plausibility of various scenarios (Stammer and others 2021). Even without formal probabilities, in practice, the SSPs are often interpreted in modeling exercises as representing the uncertainty between high-emissions (SSP5) and low-emissions (SSP1) futures, at times with the implication that the difference represents a "no policy" counterfactual versus a "likely policy" scenario. This has led to a recent debate over the viability of the high-emissions scenario, given the current pace of technology evolution, among other factors (Hausfather and Peters 2020).
Previous efforts to quantify the uncertainty of socioeconomic projections over a century are limited. Raftery and others (2017) used a statistical approach to generate density functions of country-level economic growth per capita, population, and carbon intensity (CO2/GDP) to project a density of future emissions trajectories via the IPAT equation (Commoner 1972), similar to our socioeconomic approach. 8 Müller, Stock, and Watson (2020) employed a Bayesian latent factor model that projects long-run economic growth based on low-frequency variation in the historical data of country-level GDP per capita. 9 Christensen, Gillingham, and Nordhaus (2018) conducted an expert survey of economists to quantify the 10th, 50th, and 90th percentile ranges of economic growth for six groupings of countries. Comparing results with the SSP ranges, they found that the SSPs underestimated the range of uncertainty expected by the experts and that using the increased range for economic growth with the DICE model suggested that emissions were also underrepresented by the SSPs.
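The IPAT-style accounting used by Raftery and others (2017) multiplies population, affluence, and a technology (carbon intensity) term to obtain emissions. A minimal sketch, in which every numeric input is an illustrative assumption rather than a value from that paper:

```python
def project_emissions(population, gdp_per_capita, co2_per_gdp):
    """IPAT/Kaya identity: emissions = Population x Affluence x Technology,
    with technology expressed as carbon intensity (tons CO2 per dollar of GDP)."""
    return population * gdp_per_capita * co2_per_gdp

# A stylized 80-year trajectory: population grows slowly, income grows,
# and carbon intensity declines; all growth rates are assumptions.
trajectory = [
    project_emissions(8e9 * 1.004 ** t, 12_000 * 1.015 ** t, 4.5e-4 * 0.99 ** t)
    for t in range(81)
]
```

In the probabilistic setting described in the text, each factor would be a draw from its own projected distribution rather than a deterministic path.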
The NASEM (2017) report noted that statistical models based solely on historical data are unlikely to fully inform the variability of future projections over centuries, suggesting caution in using raw outputs from statistical models over long time scales. This concern led the NASEM panel to recommend using formal expert elicitation to quantify the uncertainty around future long-run projections, which can then be used to augment projections from statistical models.

8. The IPAT equation is Impact = Population × Affluence × Technology, a heuristic for thinking about the impact of humans on the environment.

9. The method used by Müller, Stock, and Watson (2020) extends the approach provided in Müller and Watson (2016), which was suitable only for global estimates of economic growth, to generate internally consistent growth projections at the country level.
We next describe efforts undertaken by the Resources for the Future's (RFF) Social Cost of Carbon Initiative and collaborators to build on both statistical and expert-based approaches to generate distributions of projections of population and GDP per capita at the country level, plus distributions of the three primary greenhouse gases (CO 2 , CH 4 , and N 2 O) at the global level. The resulting probabilistic distributions, collectively referred to as the RFF Socioeconomic Projections (RFF-SPs), fully incorporate the NASEM recommendations for generating an improved socioeconomic module for SCC estimation.

I.B. Probabilistic Population Projections to 2300
METHODS To develop probabilistic, country-level population projections through 2300, we start with the fully probabilistic statistical approach that has been used since 2015 by the United Nations (UN) for its official population forecasts to 2100 (United Nations 2019). We then extend the statistical model to 2300, incorporating feedback and improvements suggested by a panel of nine leading demographic experts that we convened to review preliminary results. This work is detailed in Raftery and Ševčíková (2021).
The UN uses a probabilistic method built on the standard deterministic cohort-component method of population forecasting (Preston, Heuveline, and Guillot 2001). This method projects forward the three components of population change, fertility, mortality, and migration, broken down by age and sex. The probabilistic method builds Bayesian hierarchical models for each of the three components and projects them forward probabilistically using a Markov chain Monte Carlo method, which produces a large number of trajectories (typically 1,000-2,000) of future numbers of births, deaths, and migration events in each country by age and sex. The trajectories of fertility, mortality, and migration are then combined to give trajectories of future population by age and sex in each country. These trajectories of population numbers in turn approximate a probability distribution for any population quantity of interest (Raftery and others 2012; Raftery, Alkema, and Gerland 2014; Gerland and others 2014).
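The deterministic cohort-component core of this machinery can be sketched as a single projection step. The UN method applies such steps to both sexes and all countries inside the Bayesian Monte Carlo loop; the single-sex toy version below uses hypothetical rates purely for illustration:

```python
def cohort_component_step(pop_by_age, fertility_by_age, survival_by_age,
                          net_migration_by_age):
    """One step of the cohort-component method (single-sex toy version).

    pop_by_age[i] is the population in age group i. Births enter the first
    age group, survivors advance one group, and net migration is added.
    """
    births = sum(p * f for p, f in zip(pop_by_age, fertility_by_age))
    aged = [births] + [p * s for p, s in zip(pop_by_age[:-1], survival_by_age[:-1])]
    return [p + m for p, m in zip(aged, net_migration_by_age)]
```

In the probabilistic version described in the text, the fertility, survival, and migration inputs would themselves be draws from the Bayesian hierarchical models rather than fixed rates.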
Fertility is projected by focusing on each country's total fertility rate (TFR), which is the expected number of children a woman would have in a given period if she survived the reproductive period (typically to age 50) and at each age experienced the age-specific fertility rates of that period. The UN models the evolution of fertility in all countries using a Bayesian hierarchical model that divides it into three phases depending on where it lies in the fertility transition from high to low fertility (pre-transition, transition, post-transition). It then fits a time series model to each phase, accounting for spatial correlation between countries (Alkema and others 2011;Raftery, Alkema, and Gerland 2014;Fosdick and Raftery 2014;United Nations 2019;Liu and Raftery 2020). 10 Mortality is similarly projected by focusing on life expectancy at birth. 11 This is projected by another Bayesian hierarchical model for all countries for both sexes (Raftery and others 2013;Raftery, Lalic, and Gerland 2014). The UN has traditionally projected net international migration for each country deterministically by assuming that it would continue in the future at the same rate as currently (United Nations 2019).
We extended the UN's method, designed for projections to 2100, out to 2300, and preliminary results were reviewed by a panel of nine expert demographers that we convened. 12 While broadly supportive, the panelists agreed that the resulting uncertainty bounds for TFR in 2300 were too narrow and, in particular, that the lower bound of the 95 percent prediction interval for world TFR in 2300 (1.66) was too high. They suggested 1.2 children per woman as a more plausible lower bound for world TFR in 2300. We incorporated this recommendation by adding a worldwide random walk component to the TFR model.
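The effect of adding a random walk component is to widen long-horizon uncertainty so that low fertility outcomes receive nonzero probability. A sketch of the idea, with illustrative parameters rather than the calibrated values in Raftery and Ševčíková (2021):

```python
import random

def tfr_trajectory(start=1.8, years=280, shock_sd=0.005, floor=0.5, seed=1):
    """Sketch of a worldwide random-walk component layered onto a TFR
    projection: each year the level takes a small Gaussian step, so the
    spread of possible 2300 outcomes grows with the horizon."""
    rng = random.Random(seed)
    tfr, path = start, []
    for _ in range(years):
        tfr = max(floor, tfr + rng.gauss(0, shock_sd))  # floor prevents absurd values
        path.append(tfr)
    return path
```

Simulating many such trajectories and taking quantiles across them yields prediction intervals whose width increases with the forecast horizon, which is the behavior the expert panel asked for.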
Experts on the panel also suggested that international migration should be projected probabilistically, in line with the general approach, rather than deterministically as done by the UN. We implemented this by projecting net international migration using a Bayesian hierarchical model (Azose and Raftery 2015; Azose, Ševčíková, and Raftery 2016). We additionally implemented the final panel recommendation to impose constraints on population density to prevent unrealistically high or low population numbers in some age groups in some countries.

10. The TFR has evolved in a similar way in all countries. In preindustrial times, the TFR for a typical country was high (in the range of 4-8 children per woman). Then, usually after the onset of industrialization, it started to decrease. After a bumpy decline lasting several decades to a century, the TFR flattened out at a level below the replacement rate of about 2.1 children per woman. This decline is called the fertility transition. After the end of the fertility transition, the TFR fluctuated without a clear trend, mostly staying below the replacement rate. For example, in the United States, the TFR was around 7 children per woman in 1800 and then declined, reaching 1.74 in 1976 and thereafter fluctuating up and down; it is now 1.64, close to its 1976 level.

11. The general trend since 1840 has been that life expectancy has increased steadily (Oeppen and Vaupel 2002), with slower increases for countries with the lowest and highest life expectancy and the fastest increases for countries in the middle.

12. Each panelist provided written reviews of the preliminary projections and methodology, and all except Tomáš Sobotka presented them as part of a virtual workshop convened by Resources for the Future on October 4, 2018. Panelists are listed in the acknowledgments.

RESULTS The resulting population projections for 2300 for the world as a whole and for the continents are shown in figure 3. They show that total world population is likely to continue to increase for the rest of the twenty-first century, albeit at a decreasing rate, to level off in the twenty-second century, and to decline slightly in the twenty-third century. Uncertainty for 2300 is, appropriately, considerable, reflecting the very long forecast horizon: the median forecast is 7.5 billion, with a 90 percent interval from 2.8 to 20.5 billion. The results agree closely with the UN forecasts for the period to 2100 (United Nations 2019). Figure 3 also shows the results for each major continental region. The populations of Asia, Europe, and Latin America are likely to peak well before the end of this century and then decline substantially. The populations of Africa and North America are also likely to peak and then decline, but much later, in the twenty-second century. In the case of Africa this is due to population momentum (a high fraction of the population is currently of reproductive age) and current high fertility. In the case of North America it is due to a combination of modest population momentum, fertility closer to replacement level than in other continents, and immigration. Uncertainty for each region in 2300 is high.
In comparison with the population projections from the SSPs, ours are centered around a global peak of slightly over 10 billion people reached late this century, lying closest to SSP2, although SSP2 levels off at a higher level than our median projection after 2200. Through 2300, the 90 percent interval around our median is narrower than the range indicated by the SSPs, and considerably narrower through 2200. SSP1 and SSP5 lie below the 5th percentile of our distribution through almost the entire horizon to 2300. SSP3 features a very aggressive population projection in the top tail of the distribution, at about the 99th percentile in 2300. In sum, none of the SSPs has a central tendency for population in line with our fully probabilistic projections, and the range of population given by SSP1-SSP5 is wide relative to ours.
Figure 3 source: Authors' calculations based on Raftery and Ševčíková (2021).

We are aware of only three other detailed efforts to project world population to 2300, all of them deterministic, in contrast with the probabilistic method described here. One was carried out by the United Nations (2004), with several scenarios. The range of these projections for 2300 across the different scenarios was 2.3 to 36.4 billion, compared with our 98 percent prediction interval of 1.7 to 33.9 billion. Although the two sets of projections used different methodologies and were carried out more than fifteen years apart, they give results that are compatible with one another, perhaps to a surprising extent. 13 Another such exercise was carried out by Vallin and Caselli (1997), also deterministic, with three scenarios corresponding to different long-term trajectories of world TFR. Two of the scenarios led to world population stabilizing at around 9 billion, while the third resulted in 4.3 billion people in 2300. All three scenarios give world population in 2300 well within our 80 percent interval, though with a range that is much narrower than either ours or that of the United Nations (2004). Gietel-Basten, Lutz, and Scherbov (2013) also performed a projection exercise to 2300, with a very wide range of scenarios for long-term world TFR, obtaining projections of global population ranging from zero to 86 billion in 2300. 14

I.C. Probabilistic Economic Growth Projections to 2300 and Economic Growth Survey
METHODS The probabilistic projections of economic growth often used in analyses by governments and the private sector have not covered the time scale of centuries needed to support SCC estimates and other economic analyses of climate change. Müller, Stock, and Watson (2020) took a significant step forward by providing probabilistic econometric projections over long periods. Their methodology involves a multifactor Bayesian dynamic model in which each country's GDP per capita is based on a global frontier of developed economies (countries in the OECD) and country-specific deviations from that frontier. Correlations between countries are also captured in a hierarchical structure that models countries in "covariance clubs," in which country-level deviations from the frontier vary together. The hierarchical structure also permits pooling information across countries, an approach that tightens prediction intervals. The model is estimated on data for 113 countries over 118 years (1900 to 2017). It yields 2,000 sets of trajectories of country-level GDP per capita from 2018 to 2300, each of which can be considered an equally likely uncertain future and is characterized by a path for the global factor and 113 country-specific deviations from that path. The results are described more fully below; for more information about the model, see Müller, Stock, and Watson (2020).

13. The very high upper bound for the UN (2004) projections is likely an artifact of the perfect correlation implied by the deterministic scenarios and the aggregation of such results.

14. As in the UN (2004) projections, these very extreme outcomes are likely due in part to the perfect correlation between countries implied by the deterministic scenarios and the aggregation of such results.
As noted earlier, however, NASEM (2017) recommended augmenting statistical models with formal expert elicitation to quantify uncertainty, especially for long-term projections. But surveying experts on long-term uncertainty of economic growth at the country level is impractical because of time constraints and the difficulty of accounting for intercountry correlations. Consequently, our study was designed to work in tandem with an econometric model that provides country-level projections and represents the intercountry dynamics. The RFF Economic Growth Survey focused on quantifying uncertainty for a representative frontier of economic growth in the OECD countries. The results informed econometric projections based on the model by Müller, Stock, and Watson (2020) of an evolving frontier (also based on the OECD), in turn providing country-level, long-run probabilistic projections.
The methodology we applied is the "classical model" (Cooke 1991, 2013) of structured expert judgment, analogous to classical hypothesis testing. In essence, the experts are treated as statistical hypotheses: they are scored on their ability to assess uncertainty based on their responses to calibration questions whose true values are known to us but unknown to the experts. This scoring allows us to weight the experts' judgments, and the scores of combinations of experts serve to gauge and validate the combination that is adopted. Performance-weighting experts' combined judgments has generally been shown to provide narrower overall uncertainty distributions with greater statistical accuracy and improved performance both in and out of sample (Colson and Cooke 2017, 2018; Cooke, Marti, and Mazzuchi 2021).
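The calibration scoring at the heart of the classical model can be illustrated in a few lines. The sketch below uses the standard likelihood-ratio calibration statistic for three elicited quantiles (5th, 50th, 95th); the hit counts are hypothetical, and this is a simplified illustration rather than the exact scoring used in the study.

```python
import numpy as np
from scipy.stats import chi2

def calibration_score(hit_counts):
    """Cooke-style calibration: compare an expert's empirical inter-quantile
    hit rates against the theoretical bin probabilities.

    hit_counts: number of calibration-question realizations falling below
    the 5th percentile, between the 5th and 50th, between the 50th and 95th,
    and above the 95th percentile of the expert's stated distributions.
    """
    p = np.array([0.05, 0.45, 0.45, 0.05])  # theoretical bin probabilities
    n = hit_counts.sum()
    s = hit_counts / n                       # empirical bin frequencies
    # Relative entropy I(s; p); 2n*I is asymptotically chi-squared (df = 3)
    mask = s > 0
    info = np.sum(s[mask] * np.log(s[mask] / p[mask]))
    return chi2.sf(2 * n * info, df=len(p) - 1)

# Hypothetical hit counts for 11 calibration questions
well_calibrated = calibration_score(np.array([1, 5, 4, 1]))  # near (.05,.45,.45,.05)
overconfident = calibration_score(np.array([4, 2, 1, 4]))    # too many tail surprises
```

An expert whose realizations land in each inter-quantile bin at roughly the theoretical rates earns a high calibration score; an overconfident expert, whose true values often fall outside the 5th-95th range, is penalized sharply.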
Ten experts, selected for their expertise in macroeconomics and economic growth and recommended by their peers, were elicited individually by videoconference in roughly two-hour interviews in 2019-2020. They received an honorarium where appropriate. The full elicitation protocol is available in the online appendix; the general process was as follows. First, experts quantified their uncertainty for several initial questions, after which answers were provided for self-assessment; this step was intended to familiarize them with the process and alert them to potential biases. The experts then provided a median and 90 percent confidence range for eleven calibration questions for which the true values were known to us.
Experts next provided their 1st, 5th, 50th, 95th, and 99th quantiles for the variables of interest: levels of OECD GDP per capita for 2050, 2100, 2200, and 2300. For experts more comfortable working with growth rates (rather than levels), we provided a spreadsheet tool that translated average growth rates into GDP per capita levels. The experts were informed that their combined quantiles of GDP levels would be further combined with country-level econometric projections, as described below, but they were not shown the results. They were given historical data on economic growth to provide a consistent baseline of information across the panel, and they were permitted to consult outside sources if desired. The experts provided additional rationale for their quantiles verbally throughout the elicitation and concluded the survey by formally identifying the primary factors driving their low and high future growth scenarios.
Given that the projections were being used as an input to the estimation of climate change damages, which would reduce economic activity below the projected level, the experts were specifically asked to provide quantiles of economic growth absent the effects of further climate change and absent further policy efforts to reduce emissions. Two of the ten experts provided a pair of modified base quantiles reflecting the absence of effects from climate damages and climate policy, and those modified quantiles are utilized here; in general, however, the proposed modifications to their original distributions were minor. Moreover, several experts noted that although climate change was a primary factor underlying their probability of low growth projections, the complexity of the multiple uncertain factors represented in their base quantiles precluded systematic removal, and they deemed their base quantiles appropriate for assessing uncertainty in the SCC and other analyses of the economic damages from climate change.
The results of the expert elicitations were combined by first fitting each expert's five quantiles for each year, in log GDP per capita, with a Johnson S_U distribution (Johnson 1949) to generate a continuous cumulative distribution function specific to each expert. We next combined the cumulative distribution functions in two ways: averaging across the set of expert functions with equal weight, and performance-weighting the experts according to their performance on the calibration questions. This process yielded a pair of final combined elicited values of OECD GDP per capita for each elicited year and quantile.15

RESULTS OF ECONOMIC GROWTH SURVEY On the calibration questions (see online appendix), the experts demonstrated an overall high level of statistical accuracy compared with other structured expert judgment studies, and the results are robust to the removal of individual experts. As shown by their individual quantiles (figure 4) and as expressed in comments during the videoconferences, most participants' median forecast was that long-term growth would be lower than the growth rate of the past one hundred years. The responses show considerable diversity in the characterization of uncertainty around the median, however, with some of the widest ranges being driven by experts' explicit inclusion of events that are not present or fully realized in the historical record of economic growth on which statistical growth projections are based.16 When asked to identify the primary drivers of the low-growth quantiles, the experts most commonly responded with climate change, followed by world conflict, natural catastrophes, and global health crises. Rapid advancement of technology was cited most often as the primary driver of high growth, followed by regional cooperation and advances in medical science. Many experts expected that technology breakthroughs in clean energy would dramatically lower global emissions. Implicit in this narrative is a negative correlation between economic growth and carbon dioxide emissions.

[Figure 4. Average percentage OECD growth rate (2020 to year): Müller, Stock, and Watson (2020), individual experts, performance-weight combination, and equal-weight combination. For each bar, the circle shows the median and the lines show the 1st, 5th, 95th, and 99th percentiles of the relevant distribution. Sources: RFF Expert Growth Survey; Müller, Stock, and Watson (2020); and authors' calculations.]
As shown in figure 4, both the performance-weighted and the equal-weighted combinations of the experts' distributions yield narrower ranges as well as lower medians than do the statistical trajectories for all four years (2050, 2100, 2200, and 2300). The median of the equal-weighted combination is consistently higher than the median based on performance weighting, but the difference shrinks throughout the period until the medians nearly converge in 2300. Overall, the experts viewed sustained long-term growth rates above 4 percent, or even slightly below zero percent, as highly unlikely but not impossible.
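The fitting-and-combining step described in the methods above can be sketched as follows. The expert quantiles here are hypothetical, and a least-squares match of the Johnson S_U quantile function to the five elicited quantiles is one simple way to perform the fit; it is a sketch of the idea, not the study's exact implementation.

```python
import numpy as np
from scipy.stats import johnsonsu
from scipy.optimize import least_squares

QS = np.array([0.01, 0.05, 0.50, 0.95, 0.99])  # elicited quantile levels

def fit_johnson_su(quantile_values):
    """Fit the four Johnson S_U parameters so that the distribution's
    quantiles match an expert's elicited quantiles (least squares)."""
    def resid(theta):
        a, log_b, loc, log_scale = theta
        return johnsonsu.ppf(QS, a, np.exp(log_b), loc=loc,
                             scale=np.exp(log_scale)) - quantile_values
    theta0 = np.array([0.0, 0.0, quantile_values[2], 0.0])  # start at the median
    sol = least_squares(resid, theta0)
    a, log_b, loc, log_scale = sol.x
    return a, np.exp(log_b), loc, np.exp(log_scale)

# Two hypothetical experts' quantiles of log GDP per capita in 2300
expert_quantiles = [np.array([10.2, 10.8, 11.9, 13.0, 13.5]),
                    np.array([10.5, 11.0, 12.3, 13.6, 14.4])]
params = [fit_johnson_su(q) for q in expert_quantiles]

def combined_cdf(x):
    """Equal-weight combination: average the experts' fitted CDFs."""
    return np.mean([johnsonsu.cdf(x, a, b, loc=l, scale=s)
                    for a, b, l, s in params], axis=0)

grid = np.linspace(9, 15, 200)
vals = combined_cdf(grid)
```

The performance-weighted combination replaces the simple mean with a weighted average using the calibration-based weights.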

RESULTS OF ECONOMETRIC GROWTH PROJECTIONS AUGMENTED WITH EXPERT JUDGMENT We used the survey results to modify econometric projections of GDP per capita based on the methodology of Müller, Stock, and Watson (2020) and to generate density functions of internally consistent projections of economic growth at the country level. As indicated in Müller, Stock, and Watson (2020), economic growth 100 to 300 years into the future is highly uncertain, well beyond the uncertainty captured in typical scenario projections (see figure 5).
The tails of the Müller, Stock, and Watson (2020) distribution are quite wide, leading to some implausibly low or implausibly high long-term average growth rates in the extreme tails (e.g., below the 1st percentile or above the 99th percentile). These extreme tails correspond to extremes of persistent economic growth beyond what has been observed historically over long periods (e.g., below -1 percent or above +5 percent annually on average through 2300). Specifically, according to the Maddison Project data set (one of two data sets used by Müller, Stock, and Watson 2020), which includes country-level GDP per capita data as far back as 1500 for some countries, no country has experienced such extreme growth over such long periods.17 In their model, those extreme tail simulated outcomes are driven by the structure of the Bayesian model, with its embedded distributional assumptions, rather than by the historical data used to estimate the model. Further, the 1st and 99th percentiles of the combined distribution of long-run growth rates based on our economic growth survey are -0.6 percent and +4.4 percent, indicating that long-run growth rates are unlikely to fall outside this range. For these reasons, and in consultation with James Stock, we omit some projections in the extreme tails of Müller, Stock, and Watson's (2020) distribution that are outside the range of historical experience and also outside the long-run range implied by our survey (see online appendix for our approach).

17. For example, no country in the Maddison Project data has observed 100-year growth rates below -1 percent or above +3 percent. Maddison Project data are available at Clio Infra, "GDP per Capita," https://clio-infra.eu/Indicators/GDPperCapita.html.
Our survey provides quantiles of economic growth for the OECD for four discrete years. To maintain the rich country-level information of the econometric model while incorporating the information from the experts, we reweight the probability of occurrence of each of the 2,000 draws from Müller, Stock, and Watson (2020) to satisfy the experts' combined distribution over the long run. The underlying projections from Müller, Stock, and Watson (2020) remain unchanged (aside from the omission of extreme tails described above), but the likelihood of drawing a given trajectory is modified such that the quantiles of OECD growth reflect the distribution produced by the survey.
Figure 5. Average Projected Growth Rates of GDP per Capita in OECD Countries
[Vertical axis: average percentage OECD growth rate (2020 to year); series shown: Müller, Stock, and Watson (2020) and the survey-reweighted distributions.]
We accomplish this reweighting in two steps. First, we generate a set of target quantiles for the years 2030, 2050, 2100, 2200, and 2300 by calculating weighted averages of the combined cumulative distribution functions from the experts and the corresponding functions from the raw data in Müller, Stock, and Watson (2020). NASEM (2017) recommended giving expert judgment increasing weight at longer horizons, so the near-term weighting is governed more by historical evidence and the long-term weighting more by the experts. For this reason, we increase the weight of the survey quantiles relative to Müller, Stock, and Watson's (2020) quantiles linearly over time, from zero percent in 2030 to 100 percent in 2200 and thereafter.
We then use iterative proportional fitting (Csiszár 1975) to impose the target quantiles for OECD growth on the 2,000 trajectories of the frontier from Müller, Stock, and Watson (2020) for each of the four benchmark years. For each range of values between consecutive elicited quantiles, the algorithm reassigns probabilities to the trajectories whose values fall within that range, minimizing a penalty for unequal weights subject to matching the target quantiles. Because there are four years for which we have a combined expert distribution to satisfy, the algorithm iterates across the years until all years' distributions are satisfied. Figure 5 compares the resulting distributions from Müller, Stock, and Watson (2020) with those reweighted according to our economic growth survey.
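The reweighting idea can be sketched with simple bin-raking: within each bin defined by consecutive target quantiles, trajectory weights are rescaled so the bin holds exactly its target probability mass, iterating across years until all constraints hold. This is a minimal illustration (the sample values and target quantiles below are hypothetical), not the study's penalty-minimization implementation.

```python
import numpy as np

def reweight_to_quantiles(samples, targets, iters=200):
    """Iterative proportional fitting over trajectory weights: rescale each
    trajectory's probability so weighted quantiles match the target
    quantiles in every benchmark year.

    samples: (n_years, n_traj) trajectory values at the benchmark years
    targets: per year, a pair (quantile_levels, quantile_values)
    """
    n_years, n = samples.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        for y in range(n_years):
            levels, cuts = targets[y]
            # Mass required in each bin between consecutive target quantiles
            probs = np.diff(np.concatenate(([0.0], levels, [1.0])))
            bins = np.digitize(samples[y], cuts)
            for b, p in enumerate(probs):
                mask = bins == b
                tot = w[mask].sum()
                if tot > 0:
                    w[mask] *= p / tot  # rescale this bin's mass to its target
        w /= w.sum()
    return w

rng = np.random.default_rng(0)
raw = rng.normal(0.0, 1.0, size=5000)                       # year-1 values
samples = np.vstack([raw, raw + rng.normal(0.0, 1.0, 5000)])  # correlated year 2
targets = [(np.array([0.5]), np.array([0.7])),   # hypothetical target medians
           (np.array([0.5]), np.array([1.0]))]
w = reweight_to_quantiles(samples, targets)
```

After reweighting, the weighted median of each year's values sits at its target, while the trajectories themselves (and their cross-year correlations) are untouched.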
We next generate a distribution of projected global GDP per capita by taking 10,000 independent samples from the population and survey-reweighted growth projections, taking the product of population and GDP per capita at the country level, summing to yield global GDP, and dividing by the global population for that draw.18 Figure 6 shows that the resulting median global GDP growth rates from the RFF-SPs track slightly higher than SSP3, with SSP1, SSP2, and SSP5 also falling within the 90th percentile range. The SSPs do not span the full range of potential growth paths, especially below the median of the RFF-SP growth trajectories. As will be discussed in section IV, these relatively low-growth potential paths contribute substantially to the SCC.

18. The raw data set from Müller, Stock, and Watson (2020) provides growth projections for 113 countries. Here we expand that coverage to include all 184 countries represented in the SSPs by undertaking the following steps to impute each country omitted in Müller, Stock, and Watson (2020): (1) identify the country within the same continent and within 30 degrees latitude with the closest matching log(GDP/capita) for the year 2020 (or, for eleven countries missing data for 2020, the most recent year available, typically 2019); (2) calculate a scaling factor based on the ratio between the respective 2020 GDP/capita values; and (3) apply the scaling factor to each trajectory for the matched country to generate corresponding trajectories for the omitted country. Matches for omitted countries from Oceania were identified from within Asia. The countries imputed represent a total of 3 percent of global GDP for the year used for the match.
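The aggregation step just described (country GDP = population times GDP per capita; sum across countries; divide by world population) can be sketched with hypothetical arrays. Three countries and round illustrative numbers stand in for the full 184-country data set.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws, n_countries = 10_000, 3   # illustrative; the paper covers 184 countries

# Hypothetical sampled projections for one future year
population = rng.lognormal(mean=np.log([1.4e9, 3.3e8, 6.0e7]), sigma=0.1,
                           size=(n_draws, n_countries))
gdp_per_capita = rng.lognormal(mean=np.log([25e3, 80e3, 55e3]), sigma=0.3,
                               size=(n_draws, n_countries))

# Country GDP = population x GDP per capita; sum to global GDP;
# divide by world population within the same draw
global_gdp = (population * gdp_per_capita).sum(axis=1)
global_gdp_per_capita = global_gdp / population.sum(axis=1)
```

Because population and GDP per capita are combined within each draw, the resulting global distribution preserves the joint uncertainty rather than mixing quantiles across draws.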

I.D. Projected Emissions to 2300 Based on Economic Growth: Future Emissions Survey
METHODS To generate very long-run distributions of global emissions of CO 2 , CH 4 , and N 2 O, the RFF Future Emissions Survey elicited ten experts in socioeconomic projections and climate policy, nominated by their peers or by members of the RFF Scientific Advisory Board. The experts surveyed were based at universities, nonprofit research institutions, and multilateral international organizations. They have expertise in, and have undertaken, long-term projections of the energy-economic system under a substantial range of climate change mitigation scenarios.
Like our economic growth survey, the future emissions survey employed the classical model of structured expert judgment: experts first quantified their uncertainty about variables for which true values were known, for calibration and performance weighting. Experts next provided quantiles of uncertainty (minimum, 5th, 50th, 95th, maximum, as well as additional percentiles at the expert's discretion) for four variables for a case we called Evolving Policies, which incorporates views about changes in technology, fuel use, and other conditions and is consistent with the expert's views on the evolution of future policy. The Evolving Policies case corresponds to the US federal government's approach to benefit-cost analysis, which evaluates US regulations as incremental against a more expansive backdrop of other policies and conditions, and is responsive to NASEM recommendations for including future background policy in the uncertain distributions of socioeconomic projections. Experts provided quantiles of uncertainty for (1) fossil fuel and process-related CO 2 emissions, (2) changes in natural CO 2 stocks and negative emissions technologies, (3) CH 4 , and (4) N 2 O for five benchmark years: 2050, 2100, 2150, 2200, and 2300. For category 1, they were also asked to indicate the sensitivity of emissions to five GDP per capita trajectories.19 For each expert we generate a set of cumulative distribution functions, one for each benchmark year, emissions source, and economic growth trajectory, by piecewise linear interpolation between the quantiles provided.

[Figure 6 note: The solid line represents the median value, and dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges of the RFF-SPs. Source: Authors' calculations.]
Then, as in the economic growth survey, we generate a corresponding set of combined equal-weight cumulative distribution functions by averaging the functions in equal measure, and a set of performance-weighted cumulative distribution functions by averaging in accordance with the experts' relative performance on the calibration questions. Quantile values from the combined functions were linearly interpolated in time between each of the benchmark years to yield a distribution of piecewise linear, nonoverlapping trajectories for each emissions source and sink.
Based on the future emissions survey, we developed a distribution of emissions scenarios to pair, one to one, with our economic growth scenarios. First, we sampled one of the 10,000 economic growth trajectories described above. Second, we sampled a value (q) on the continuous interval [0,1] to determine the percentile of the experts' emissions distributions to evaluate. Third, at five-year intervals from 2025 to 2300, we generated an interpolated value of the qth percentile of emissions based on the realized GDP level corresponding to that growth trajectory in that year and the qth percentile of the experts' emissions distributions for the bounding GDP values elicited. Net emissions of CO 2 were generated by sampling independent q values for direct emissions (category 1) and natural carbon stocks and negative emissions technologies (category 2) and summing the resulting trajectories, thereby including the possibility of net negative emissions.20

19. See online appendix for a more detailed discussion of the survey methodology and the full elicitation protocol.

RESULTS OF THE FUTURE EMISSIONS SURVEY Experts' performance on the calibration questions was high, as measured by statistical accuracy, informativeness, and robustness of results (see online appendix). Experts described their rationale and the conditions supporting their distributions of emissions, often citing the same factors. For direct CO 2 emissions (category 1), experts viewed low economic growth as likely to reduce emissions overall but also to lead to reduced global ambition in climate policy and slower progress toward decarbonization. For median economic growth conditions, experts generally viewed policy and technology evolution as the primary driver of their emissions distributions, often offering a median estimate indicating reductions from current levels but with a wide range of uncertainty.
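The percentile-interpolation at the core of the sampling procedure described in the methods above can be sketched as follows. The elicited quantile values, benchmark years, and bounding GDP levels below are hypothetical, and the example uses two benchmark years and three quantiles for brevity.

```python
import numpy as np

# Hypothetical combined expert quantiles of CO2 emissions (Gt CO2):
# dimensions are (benchmark year, bounding GDP-per-capita level, quantile level)
bench_years = np.array([2050.0, 2100.0])
gdp_levels = np.array([20e3, 60e3])
q_levels = np.array([0.05, 0.50, 0.95])
emis = np.array([[[10., 30., 55.], [15., 40., 70.]],   # 2050
                 [[-5., 10., 40.], [0., 20., 60.]]])   # 2100

def emissions_at(year, gdp_pc, q):
    """Interpolate the q-th emissions percentile at a given year and
    realized GDP per capita (steps 2 and 3 of the sampling procedure)."""
    # quantile-level interpolation at each (year, GDP) corner
    corner = np.array([[np.interp(q, q_levels, emis[i, j])
                        for j in range(len(gdp_levels))]
                       for i in range(len(bench_years))])
    # then interpolate across bounding GDP levels, then across years
    by_year = [np.interp(gdp_pc, gdp_levels, corner[i])
               for i in range(len(bench_years))]
    return np.interp(year, bench_years, by_year)

rng = np.random.default_rng(1)
q = rng.uniform()  # one sampled percentile, held fixed along the trajectory
path = [emissions_at(y, 40e3, q) for y in (2050, 2075, 2100)]
```

Holding q fixed over time yields nonoverlapping emissions trajectories; sampling independent q values for direct emissions and for the natural-sinks/negative-emissions category, then summing, allows net emissions to go negative.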
Several experts said high economic growth would increase emissions through at least 2050, most likely followed by rapid and complete decarbonization, but with a small chance of substantial continued increases in emissions. In general, the distributions were inconsistent with keeping global temperature increases below 1.5 degrees Celsius, even when considering the potential for negative emissions.
Though their rationales were often similar, experts' interpretation of those narratives, as shown in their quantiles of emissions, differed substantially (figure 7). For example, for the median growth trajectory to 2050, the median emissions ranged from 15 to 45 Gt CO 2 , a span encompassing a decrease of more than 50 percent to an increase of more than 30 percent from today's levels. Experts often provided highly skewed distributions, with significant chances that direct CO 2 emissions (category 1) would be exactly or near zero while allowing for much higher emissions in the middle and upper quantiles of their distribution.
The experts' narratives support an evolution of the combined distributions. Over time, emissions distributions for all growth trajectories exhibit a downward shift, particularly evident for the median and high-growth trajectories, with median emissions approaching zero in and after 2150. Emissions distributions for the lower-growth trajectory show a decreased range of emissions overall compared with the higher-growth trajectories, but the temporal trend toward lower emissions is not as strong. Higher-growth trajectories show relatively greater probabilities of increased emissions in the near term, followed by greater chances of full decarbonization in the next century, while also allowing for the possibility of much higher emissions over the long term.21

RESULTING GLOBAL GREENHOUSE GAS EMISSIONS PROJECTIONS Figure 8 shows the resulting distribution of projected net CO 2 emissions based on the future emissions survey. The median emissions trajectory is a roughly 50 percent decrease from today's levels by 2100, followed by slowly decreasing levels that approach but do not reach net zero. The median of our CO 2 emissions and concentrations paths is similar to SSP2, and the 98 percent confidence interval spans a range similar to that of SSP1 through SSP3, at least through 2140.22 The magnitude of CO 2 emissions associated with SSP5, however, is considerably higher than the upper end (99th percentile) of our distribution through the middle of the next century, consistent with the findings of Raftery and others (2017) and Liu and Raftery (2021). Beyond the middle of the next century, all the SSP emissions trajectories increasingly lie well within our distribution because their extension beyond 2100 is constructed to achieve zero emissions by 2250. This is a weakness of the SSPs as a basis for SCC estimation, even if a subset of the SSPs spans a "reasonable range" during this century.

[Figure 8 note: Lines represent median values, and dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges of the RFF-SPs. Source: Authors' calculations.]
For CH 4 (figure OA-9 in the online appendix), the emissions distribution resulting from the future emissions survey is centered between SSP2 and SSP5 and spans a range similar to that of SSP1-SSP5, at least through 2100. After that point, as with CO 2 , the emissions range spanned by the SSPs narrows, whereas the CH 4 emissions from the survey maintain a relatively wide distribution, similar to that in 2100. For N 2 O (online appendix figure OA-10), the median of the emissions paths is between SSP2 and SSP5 through roughly 2200, and the full distribution from the survey spans a range wider than all the SSPs.
In sum, no single SSP is centered similarly to the median emissions paths across all three major greenhouse gases. The full range of emissions represented by the SSPs is higher than for the future emissions survey for CO 2 through 2140; by construction the range narrows to zero for CO 2 after that point and is narrower than the survey results for both CH 4 and N 2 O for nearly the full period.

II.A. Climate System Methods
The second step in estimating the SCC is using a climate model to calculate changes in the climate system corresponding to changes in greenhouse gas emissions. Climate models vary in their representation of the underlying physics, in their spatial and temporal resolution, and in their computational requirements. Earth system models, such as those used for IPCC analyses, require supercomputers; SCC calculations, which typically require tens to hundreds of thousands of samples to characterize uncertainty, therefore preclude the use of full-scale earth system models. Instead, SCC models are designed to emulate the response of full earth system models across a subset of relevant climate outputs, such as globally averaged surface temperature.
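To illustrate what a reduced-form emulator does, here is a deliberately simplified one: a one-box carbon cycle coupled to a two-layer energy balance. These are not FaIR's equations, and every parameter value below is a round illustrative number, not a calibrated one.

```python
import numpy as np

def toy_emulator(emissions_gtc, dt=1.0):
    """Toy reduced-form climate model: one-box carbon cycle plus a
    two-layer energy balance. Illustrative only; NOT FaIR's equations."""
    C0 = 278.0              # preindustrial CO2 concentration (ppm)
    gtc_to_ppm = 1 / 2.13   # rough conversion of GtC to atmospheric ppm
    tau = 150.0             # crude single decay time scale (years)
    F2x = 3.7               # forcing from a CO2 doubling (W/m2)
    lam = F2x / 3.0         # feedback for a 3 C equilibrium sensitivity
    c_s, c_d, gamma = 8.0, 100.0, 0.7  # layer heat capacities, exchange rate

    C, T_s, T_d = C0, 0.0, 0.0
    temps = []
    for E in emissions_gtc:
        # carbon cycle: emissions add CO2; excess decays back toward C0
        C += (E * gtc_to_ppm - (C - C0) / tau) * dt
        # radiative forcing is logarithmic in concentration
        F = F2x * np.log2(C / C0)
        # surface layer warms with forcing, loses heat to space and the deep ocean
        T_s += dt / c_s * (F - lam * T_s - gamma * (T_s - T_d))
        T_d += dt / c_d * gamma * (T_s - T_d)
        temps.append(T_s)
    return np.array(temps)

temps = toy_emulator(np.full(280, 10.0))  # 280 years of constant 10 GtC/yr
```

A model of this form runs in microseconds per trajectory, which is what makes tens of thousands of Monte Carlo samples feasible; FaIR adds, among other things, a multi-timescale carbon cycle with state-dependent uptake and calibrated parameter distributions.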
Previous SCC calculations from the federal government used three integrated assessment models: DICE, the Climate Framework for Uncertainty, Negotiation and Distribution (FUND), and Policy Analysis of the Greenhouse Effect (PAGE), each of which employs its own reduced-form climate model. These integrated assessment models can deliver substantially different temperature increases for the same pulse of emissions (Rose and others 2014), leading to inconsistency when results are averaged to calculate the SCC. The NASEM report therefore recommended adopting a uniform climate model that met certain criteria, including that it generates a distribution of outputs across key climate metrics comparable to distributions of outputs from the full earth system models.
The Finite Amplitude Impulse Response (FaIR) model (Millar and others 2017) was highlighted in the NASEM report as a reduced-form model that met the criteria. To assess the changes in global mean surface temperatures resulting from the RFF-SPs, we ran the latest version, FaIR 2.0 (Leach and others 2021), using 10,000 draws from the emissions trajectories of CO 2 , CH 4 , and N 2 O while also sampling across FaIR's native uncertainty in climate variables. 23 Figure 9 shows the median temperature trajectory associated with the RFF-SPs: increases reaching nearly 2.6 degrees Celsius above the average global temperature for 1850-1900 (the standard IPCC preindustrial benchmark) through 2100 and continued increases through 2300. The low end of the distribution indicates a roughly 20 percent chance that the increase will remain below 2 degrees Celsius through 2100. Our experts' expectations for negative emissions technologies lead to an increasing chance of drawing down atmospheric CO 2 to yield temperatures at current levels and below by the late 2100s.

II.B. Resulting Temperature Change from RFF-SPs
The RFF-SP median temperature trajectory tracks closely with SSP2 through 2150, thereafter continuing to increase slightly. SSP1 is largely consistent with the 5th percentile results throughout the period. Temperatures resulting from SSP3 emissions are consistent with the 95th percentile of the RFF-SPs through the middle of the next century, at which point temperatures stop increasing, by construction. The median temperature from SSP5 is roughly consistent with the 99th percentile of temperatures from the RFF-SPs through 2100, at which point it begins to level off to meet the imposed requirement for net zero emissions by 2250.
In this comparison, uncertainty in the climate system itself, as represented by the uncertain distributions of climate parameters in the FaIR model, contributes significant uncertainty to the range of projected temperatures. The temperature distributions for the RFF-SPs include climate uncertainty from FaIR, but for clarity we omit climate system uncertainty in presenting projected temperatures from the SSPs. For a sense of scale, the 90th percentile range in temperatures from FaIR in 2300 for SSP5 is about -2.5 to +7 degrees Celsius around the median.

23. Trajectories for greenhouse gases other than CO 2 , CH 4 , and N 2 O were drawn from SSP2.
METHODS FOR CLIMATE DAMAGE ESTIMATION The third step in estimating the SCC is translating changes in the climate system, such as temperature, into total economic damages over time. Damages can be calculated by estimating costs for various sectors (e.g., human health and mortality, agriculture, energy usage, coastal flooding) and summing them, or by taking an aggregate approach to estimate damages across the economy as a whole.
Recent advances in methodologies for damage estimation are not reflected in the integrated assessment models used by the federal government to calculate the SCC (NASEM 2017; Diaz and Moore 2017). The NASEM report made recommendations on improving sectoral damage estimation, finding sufficient peer-reviewed research to support updates on human health and mortality, agriculture, coastal inundation, and energy demand. Since the report was issued, the literature addressing specific sectors has grown. Nevertheless, few studies meet the full requirements (e.g., global coverage with regional detail, translation into economic damages) put forward by Bressler (2021) or Raimi (2021) to serve as the basis for an updated damage function for the SCC. For example, two independent, comprehensive reviews (Bressler 2021; Raimi 2021) found just three suitable studies (World Health Organization 2014; Gasparrini and others 2017; Carleton and others 2018). Our own further assessment of the damages literature found two candidates for agricultural damages (Moore and others 2017; Calvin and others 2020), two for energy demand (Clarke and others 2018; Ashwin and others 2021), and one for coastal damages (Diaz 2016).

[Figure 9 note: Temperature change is relative to the standard 1850-1900 preindustrial average. Solid lines represent median values. Dark and light shading represent the 5th to 95th (darker) and 1st to 99th (lighter) percentile ranges based on the RFF-SPs. For clarity of presentation, uncertainty in the climate system is reflected in the uncertainty range only for the RFF-SPs (and not the SSPs). Source: Authors' calculations.]
Among the notable additions, the Climate Impact Lab has developed a methodology to generate empirically derived, hyperlocalized damage functions that account for adaptation. The Climate Impact Lab has been applying this methodology across a comprehensive set of sectors, including health, agriculture, labor, energy, conflict, coastal impacts, and migration (Carleton and others 2018). Upon completion, this full set of sectors is intended to support fully empirically based climate damage estimates.
Much of the new sectoral damages research identified here is currently under peer review for publication, and efforts to implement the existing peer-reviewed studies will similarly be completed on a timeline that is compatible with the IWG process to update the SCC. As described below, for the purposes of this paper we have deployed the aggregate global climate damage function from the widely used DICE model (Nordhaus 2017b) to develop illustrative SCC estimates, coupled with the RFF-SPs, the FaIR climate model, and the stochastic discounting approach described in the next section.
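The DICE aggregate damage function used for the illustrative estimates is quadratic in warming. The sketch below uses the commonly reported DICE-2016R coefficient (roughly 0.236 percent of gross output per degree Celsius squared); treat the exact value as illustrative rather than definitive.

```python
def dice_damage_fraction(temp_c, a2=0.00236):
    """DICE-style aggregate damages as a fraction of gross world output,
    quadratic in warming above preindustrial (DICE-2016R coefficient)."""
    return a2 * temp_c ** 2

# At ~2.6 C of warming (the RFF-SP median in 2100), damages are ~1.6% of output;
# at 4 C they roughly scale to ~3.8%, reflecting the quadratic form.
d_26 = dice_damage_fraction(2.6)
d_40 = dice_damage_fraction(4.0)
```

Multiplying the damage fraction by projected gross output in each year, for each Monte Carlo draw, yields the damage time series that is then discounted to form the SCC.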

III. Discounting Approaches for the Social Cost of Greenhouse Gases
The long residence time of CO 2 in the atmosphere implies that today's emissions will have consequences for centuries. This time horizon makes the discount rate a major factor for the SCC. For example, the IWG's 2021 interim SCC estimate is $51/ton with a 3 percent discount rate (IWG 2021) but would be about $121/ton at a 2 percent discount rate (RFF and NYSERDA 2021). That 1 percentage point difference alone would more than double the SCC and, by implication, greatly strengthen the economic rationale for substantial emissions reductions.
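The sensitivity to the discount rate is simple compounding arithmetic: because SCC damages accrue over centuries, lowering the rate by 1 percentage point scales distant damages up by a large factor. The horizons below are illustrative.

```python
# Present value of $1 of climate damages occurring t years in the future
def pv(rate, t):
    return 1.0 / (1.0 + rate) ** t

# Scaling factor on far-future damages when moving from 3 percent to 2 percent
ratio_100 = pv(0.02, 100) / pv(0.03, 100)  # damages 100 years out
ratio_200 = pv(0.02, 200) / pv(0.03, 200)  # damages 200 years out
```

Damages a century out carry roughly 2.7 times more weight at 2 percent than at 3 percent, and damages two centuries out roughly 7 times more, which is why the overall SCC more than doubles.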
The discount rates used in federal regulatory analysis are guided by Circular A-4, issued by the Office of Management and Budget (OMB) in 2003, which endorses rates of 3 percent and 7 percent, reflecting, respectively, consumption and investment rates of return (White House 2003). OMB guidance also allows for additional sensitivity analysis in cases with intergenerational consequences, such as climate change. However, this guidance runs counter to current economic thought and evidence, for three reasons: (1) a constant deterministic discount rate becomes increasingly problematic for long-horizon problems; (2) benchmarks for the consumption rate of interest (currently 3 percent) have declined substantially over the past two decades (CEA 2017; Rudebusch 2020, 2021); and (3) the rationale for 7 percent, meant to address possible policy effects on capital, is flawed in ways that are magnified for very long-term decisions (Li and Pizer 2021).
The NASEM (2017) report and recent technical guidance on the SCC (IWG 2021) acknowledged those concerns. A 2021 executive order directed the OMB to reassess existing practice and consider "the interests of future generations" in revisions to Circular A-4 (White House 2021a, sec. 2). Alongside issues related to empirical discount rate uncertainty over long time horizons, the comparison of welfare across generations raises an ethical concern dating back at least as far as Ramsey (1928): Do we discount the welfare of future generations simply because they are born later?
One rationale for changing the government's discounting approach is the systemic decline in observed interest rates over at least the past two decades (Kiley 2020; Del Negro and others 2017; Johannsen and Mertens 2016; Laubach and Williams 2016; Caballero, Farhi, and Gourinchas 2017; Christensen and Rudebusch 2019; CEA 2017; Rachel and Summers 2019; Rudebusch 2020, 2021), which, along with other research on discount rates for very long-run horizons (Giglio, Maggiori, and Stroebel 2015; Giglio and others 2021; Drupp and others 2018), has led to calls for using a lower discount rate; 2 percent is often suggested. The second argument for a modified discounting approach stems from uncertainty in the discount rate, which tends to lead to declining future discount rates. If one is uncertain about the future trajectory of (risk-free) discount rates, and uncertain shocks to the discount rate are persistent, the certainty-equivalent (risk-free) discount rate declines with the time horizon toward the lowest possible rate. This result stems from a straightforward application of Jensen's inequality to a stochastic discount factor, leading to declining (risk-free) discount rates (Arrow and others 2014). At the same time, if the payoffs to investments in emissions reductions are correlated with future income, the effective risk-adjusted rate could be higher if the correlation is positive or lower if it is negative (Gollier 2014). This correlation is often termed the "climate beta," but it is not clear ex ante whether the beta is positive, as in Nordhaus's work, or negative, as in Lemoine (2021).
The third issue is the need, in light of recent research (Li and Pizer 2021), to rethink the use of the higher discount rate (7 percent) reflecting the return to capital. Several decades ago, researchers suggested that when taxes create a wedge between consumption and investment interest rates, the alternative rates could be used to bound a benefit-cost analysis, as a shorthand version of the shadow price of capital (SPC) approach (Harberger 1972; Sandmo and Drèze 1971; Marglin 1963a, 1963b; Drèze 1974; Sjaastad and Wisecarver 1977). However, the assumptions underlying the soundness of that approach are quite restrictive: costs are assumed to occur entirely in the first period; benefits are constant and occur either in a single period or in perpetuity; and benefits displace only consumption while costs displace either investment or consumption. Li and Pizer (2021) extend Bradford (1975), showing that the traditional approach of using 7 percent as a shorthand means of representing investment impacts of regulatory costs becomes increasingly inaccurate the farther one looks into the future.
The NASEM (2017) report foreshadowed those results and recommended using a central consumption rate estimate along with sensitivity cases. Subsequent work provides some guidance, examining central values of 2 percent and 3 percent and a range of values between 1.5 percent and 5 percent (though it does not recommend those particular values). That discussion of discount rates is based primarily on questions about the most appropriate near-term consumption rate and does not address the SPC approach. Pizer (2021) details how the SPC approach could be implemented, suggesting sensitivity cases that employ the consumption discount rate, with costs and benefits alternately multiplied by the SPC to reflect the possibility that the entirety of each of these impact streams falls on investment; an SPC of 1.2 is proposed as a conservative value. Alternatively, simply multiplying regulatory costs by the SPC provides a sensitivity case consistent with an (extreme) scenario in which all costs fall on investment. Conceptually, this is equivalent to what is sought with the traditional approach of discounting benefits at the higher 7 percent rate, but it has the advantage of being analytically correct while allowing for a consistent discounting approach across different elements of benefit-cost analysis. The consumption discount rate would be employed in all cases, and the SPC approach would apply generally, not just in the context of the SCC.
Each of these discounting ideas (including stochastic growth discounting, discussed below) could be incorporated in a revision to Circular A-4, with relevance to both SCC estimation and other contexts. This would harmonize SCC discounting and broader US government guidance on benefit-cost analysis.

III.A. Stochastic Growth Discounting with Economic Uncertainty
One rationale for discounting, generally, is the concept of declining marginal utility of consumption. Intuitively, a $100 cost in a future in which society has grown dramatically wealthier should be valued less, from today's perspective, than the same $100 cost in a relatively poor future with stagnant economic growth. This result is often embodied by the classic Ramsey equation relating the consumption discount rate (r_t) to the rate of consumption growth (g_t) over time:

(1) r_t = ρ + η g_t.
In equation (1), ρ represents the rate of pure time preference (how much utility is discounted over time) and η represents the curvature of an isoelastic utility function.24 We use time subscripts to refer to the compound average value of the indicated variable from today (time 0) to year t. If average consumption growth to year t, g_t, is uncertain, as it is given the probabilistic socioeconomic scenarios discussed earlier, then the average discount rate to year t, r_t, is also uncertain. This leads to a stochastic discount factor, which is used to discount stochastic marginal damages from an incremental ton of emissions (MD_t) to a present value (PV) equivalent:

(2) PV = E[e^(−r_t t) MD_t],

where r_t is determined by equation (1) based on the uncertain growth rate g_t. An alternative is to base the discount rate on some market proxy for the discount rate. Either way, the discount rate is considered uncertain, and the first term inside the expectation, e^(−r_t t), represents a stochastic discount factor. In our treatment, the discount factor and rate are uncertain due to the stochastic growth rate. The importance of a stochastic discount factor is well established in the finance literature, and it is increasingly recognized in the literature at the nexus of macro and climate economics (Cai and Lontzek 2019; Hansen 2020, 2021). A stochastic discount rate leads to a declining certainty-equivalent risk-free rate. To see the derivation of this result clearly, suppose for the moment that the discount rate is normally distributed, r_t ∼ N(µ_t, σ²), and that it is uncorrelated with marginal damages, corr(e^(−r_t t), MD_t) = 0, which corresponds to a climate beta of zero.
Then it is easy to show that the certainty-equivalent rate, denoted r_t^ce, which represents the rate at which to discount expected marginal damages (as in e^(−r_t^ce t) E[MD_t]), declines with the time horizon of the impacts being discounted, t:25

(3) r_t^ce = µ_t − (1/2)σ²t.

Of course, equation (3) represents a special case. More generally, absent these two specific assumptions, the risk-free rate given by this equation does not account for the risk profile of the benefits of emissions reductions, namely, through the climate beta, which reflects the potential correlation of the stochastic discount rate with marginal damages. If one wants to retain the certainty-equivalent approach to discounting, Gollier (2014) shows that a risk adjustment is necessary to account for any such correlation, but the form of this adjustment depends on the potentially complex nature of the joint uncertainties. We instead take a more general approach to account for these issues by directly using the more general equations (1) and (2) to implement stochastic discounting as part of the Monte Carlo estimation of the SCC, which explicitly accounts for any such correlation. Accounting for this correlation is important in theory and also, as our results show, matters greatly in practice when the climate beta is not zero. Indeed, the climate beta in most integrated assessment models is implicitly taken to be close to one.

25. A version of this result is shown in Newell and Pizer (2003), but for clarity of exposition we explain it briefly here. Starting with the definition that the certainty-equivalent rate yields the same present value as equation (2), we have e^(−r_t^ce t) E[MD_t] = E[e^(−r_t t) MD_t] = E[e^(−r_t t)] E[MD_t], where the last equality follows by the assumption of zero correlation. Solving for r_t^ce yields r_t^ce = −(1/t) log(E[e^(−r_t t)]). A well-known property of the exponential function, e^x, applied to a normally distributed variable, x ∼ N(µ, σ²), is that E[e^(ax)] = e^(aµ + (1/2)a²σ²). Applying this formula with x = r_t and a = −t yields the result.
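The derivation above can be checked numerically. The following is a minimal sketch (using illustrative values of µ and σ, not values from the paper) comparing the Monte Carlo certainty-equivalent rate with the closed form in equation (3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mean and standard deviation of the uncertain discount rate
mu, sigma = 0.03, 0.01
r = rng.normal(mu, sigma, 1_000_000)   # draws of r_t ~ N(mu, sigma^2)

def r_ce(t):
    # Certainty-equivalent rate: r_ce = -(1/t) * log(E[exp(-r t)]),
    # as derived in footnote 25
    return -np.log(np.mean(np.exp(-r * t))) / t

for t in (10.0, 100.0, 200.0):
    # Monte Carlo estimate versus the analytic special case of equation (3):
    # r_ce = mu - (1/2) * sigma^2 * t, which declines linearly in t
    print(t, round(r_ce(t), 4), round(mu - 0.5 * sigma**2 * t, 4))
```

The rate declines with the horizon because high-rate draws contribute almost nothing to the expected discount factor at long horizons, an application of Jensen's inequality.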
For example, in the DICE model, damages are assumed to be a percentage of GDP (where that percentage depends on global temperature), and the discount rate is a linear function of economic growth, as in a Ramsey-like framework (Nordhaus and Sztorc 2013). This implies a beta of essentially one, since higher income (and, in turn, greater discounting) is perfectly correlated with higher undiscounted damages. That is, a positive beta implies that undiscounted damages are largest when economic growth is largest, and smallest when growth is smallest. Mirroring this, with η > 0, the discount factor is smallest when growth is largest and largest when growth is smallest. Using a stochastic discount factor as in equation (2) will therefore discount damages most in states of the world where they accrue to rich future generations and correspondingly discount them least in states where the future is poor. Amid uncertainty about socioeconomic trajectories, ignoring this stochastic discount factor (and its correlation with climate impacts) could severely bias estimates of the SCC. The magnitude of this bias depends on the climate beta and on the nature of the uncertainty in socioeconomic and emissions trajectories; our illustrative results (below) show that this bias could change the SCC by a factor of two or more.
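To illustrate the size of this bias, the following sketch (with purely illustrative parameters, not the paper's calibration) compares the mean present value under equation (2) with the value obtained by discounting expected damages at a constant rate, when damages are proportional to income (a climate beta near one):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values only; the paper's calibrated parameters differ
rho, eta = 0.01, 1.5
t = 280.0                                # horizon in years, roughly 2020-2300
g = rng.normal(0.015, 0.005, 200_000)    # uncertain average growth to year t

md = 0.01 * np.exp(g * t)                # damages proportional to income (beta ~ 1)
r = rho + eta * g                        # equation (1)

# Equation (2): expectation over the product, so the correlation between
# the discount factor and damages is captured draw by draw
pv_stochastic = np.mean(np.exp(-r * t) * md)

# Constant discounting at the expected Ramsey rate ignores that correlation
pv_constant = np.exp(-(rho + eta * 0.015) * t) * np.mean(md)

print(pv_constant / pv_stochastic)  # ratio well above 1: upward bias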
Despite the importance of stochastic discounting, federal government benefit-cost analysis has historically not treated the discount rate as explicitly uncertain, nor has the discount rate been connected to growth as in the Ramsey framework. Instead, the consumption discount rate used in past government estimates of the SCC has been a constant rate of 3 percent.26 This is equivalent to implicitly choosing the discounting parameters ρ = 3 percent and η = 0, corresponding to a linear utility function. Yet this approach effectively eliminates any consideration of declining discount rates and of risk premia, as in Gollier (2014). More intuitively, it also treats a $100 cost to a member of a wildly rich future generation the same as a $100 cost to a poor one, which is incorrect from a welfare perspective. Correspondingly, such parameter values receive little support from economists working in this field (Drupp and others 2018).

26. Although 3 percent was the central rate, the IWG also previously used constant rates of 2.5 and 5 percent as sensitivity cases. Because those values were estimated to roughly approximate the effects of explicitly accounting for uncertainty in risk-free and risk-adjusted rates, those motivations are no longer appropriate when stochastic discounting can be captured explicitly in integrated assessment models, as we propose.
Although the case for using stochastic discounting as in equation (1) is strong, the choice of the parameters in that equation is not a simple matter: their values govern the connection of discounting to economic growth and climate damages and can lead to very different effective discount rates (Stern 2007; Nordhaus 2017a). One recent paper surveyed economists about their preferred values of ρ and η (Drupp and others 2018). This is valuable, but the federal government has a long tradition of relying on descriptive, empirical approaches to informing discounting guidance, as in other aspects of benefit-cost analysis. In particular, Circular A-4 refers to observed interest rates in selecting 3 and 7 percent (White House 2003). A choice of ρ and η might therefore sensibly start with the constraint that the associated near-term rate match the consumption rate used elsewhere in benefit-cost analysis, as recommended in NASEM (2017). However, a continuum of (ρ, η) combinations can match any particular near-term rate, so another constraint is needed. Newell, Pizer, and Prest (2021) provide such an approach. They calibrate the values of (ρ, η) such that, when applied to the Müller, Stock, and Watson (2020) growth distribution, the implied discount rate term structure starts at a specified rate in the near term (say, 3 or 2 percent) before declining with the time horizon in a manner consistent with evidence from the empirical literature on future interest rate term structures (Bauer and Rudebusch 2020, 2021).27 Figure 10 illustrates the calibrated combinations of (ρ, η) yielding implied (fitted) term structures when applied to the RFF-SPs (dashed lines). These parameters were calibrated to be as consistent as possible with those implied by the Bauer and Rudebusch (2021) model initialized to targeted near-term rates of 1.5, 2, or 3 percent (solid lines). For example, using the estimated model from Bauer and Rudebusch (2021) and starting with a near-term rate of 2 percent, we construct a target term structure (solid black curve).
We then find the combination, (ρ, η) = (0.2 percent, 1.24), that best fits the target term structure.
27. The parameters shown here differ slightly from those in Newell, Pizer, and Prest (2021) because we calibrate them to the full RFF-SPs, corresponding to the Müller, Stock, and Watson (2020) distribution weighted based on our economic growth survey. The methodology developed in that paper was demonstrated on the raw distribution, before the weights were applied.
The calibration procedure in Newell, Pizer, and Prest (2021) can be implemented for any specified near-term rate. Here we present three cases, with target near-term rates of 1.5, 2, and 3 percent.28 These ρ and η parameters lie in the middle of the range often used in the literature, particularly for target near-term rates of 3 percent and 2 percent. Implementing them simultaneously with the socioeconomic trajectories discussed in section I produces a declining term structure of certainty-equivalent, risk-free rates consistent with the empirical literature (Bauer and Rudebusch 2020, 2021). Importantly, implementing the stochastic discount rate alongside stochastic damages via equation (2) explicitly captures risk aversion and the correlation between the discount rate and climate damages, meaning no ex post risk adjustment to the discount rate is necessary.

28. See Newell, Pizer, and Prest (2021, sec. 3.2) for the rationale behind each rate, and that paper's appendix for additional alternative near-term rates.
This calibrated stochastic discounting rule can now be used with the undiscounted damage estimates (discussed above) to estimate the SCC in an internally consistent manner.
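The calibration logic can be sketched as follows. The growth distribution, target curve, and parameter grid below are hypothetical stand-ins (the paper instead fits to the Bauer and Rudebusch term structure using the RFF-SPs); the sketch only shows the idea of choosing (ρ, η) so that the implied certainty-equivalent term structure matches a declining target:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for probabilistic growth projections: one compound-average
# growth draw per path (illustrative, not the RFF-SPs)
g = rng.normal(0.015, 0.012, 20_000)
horizons = np.arange(20, 301, 20)

def term_structure(rho, eta):
    # Certainty-equivalent rate at each horizon implied by r = rho + eta * g
    r = rho + eta * g
    return np.array([-np.log(np.mean(np.exp(-r * t))) / t for t in horizons])

# Hypothetical declining target starting near 2 percent (the paper's target
# comes from the Bauer and Rudebusch model instead)
target = 0.02 - 0.00002 * horizons

def loss(rho, eta):
    # Sum of squared deviations between implied and target term structures
    return float(np.sum((term_structure(rho, eta) - target) ** 2))

# Coarse grid search over (rho, eta); the paper uses a formal fit instead
grid = [(rho, eta)
        for rho in np.arange(0.0, 0.0101, 0.001)
        for eta in np.arange(0.5, 2.01, 0.05)]
rho_star, eta_star = min(grid, key=lambda p: loss(*p))
print(rho_star, eta_star)
```

Any continuum of (ρ, η) pairs can match the near-term rate alone; it is the full declining target curve that pins down a single best-fitting pair.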

IV. Illustrative Calculations of the Social Cost of Carbon
We present illustrative estimates of the SCC based on our socioeconomic projections (the RFF-SPs), the FaIR climate model, and our discounting methodology-all of which speak directly to the NASEM (2017) recommendations-and apply them using the DICE damage function (Nordhaus 2017a). This approach is directly responsive to three of the four NASEM recommendations. The fourth recommendation is to update the damage functions with the best available science on sectoral damages, rather than using an aggregate damage function such as that in DICE. We will include more recent sector-specific damage estimates, reflecting the best available science, in future work, but for the moment we use the DICE damage function to produce illustrative SCC estimates. Although the values we present here should be considered illustrative, they highlight the importance of socioeconomic uncertainty and stochastic growth discounting, and the interaction of these two important drivers of the SCC.
We also compare our SCC estimates with those from SSPs 1, 2, 3, and 5. Because of the lack of socioeconomic uncertainty in each SSP, and the lack of relative probabilities across them, we cannot meaningfully calibrate ρ and η parameters for those scenarios to deliver comparable near-term rates. We therefore apply constant discount rates of 2 percent and 3 percent to the SSPs.29 The results are shown in figure 11, leading to our first major conclusion: a quantitative probabilistic accounting of socioeconomic uncertainty matters greatly for the SCC. Panel A of figure 11 shows the distributions of our illustrative SCC values calibrated to 2 percent and 3 percent discount rates in the near term (means are also in the left columns of table 1). The other two panels show the SCC distributions under each SSP at 3 percent (panel B) and 2 percent (panel C) discount rates. Panel A reflects socioeconomic uncertainty implicitly, leading to central SCC estimates of $61 and $168/ton CO2 under 3 percent and 2 percent near-term stochastic discounting, respectively. The distribution underlying those means reflects both socioeconomic and climate uncertainty. We disaggregate that distribution below, but the bottom panel shows the importance of socioeconomic uncertainty explicitly by comparing across the SSPs. SSP5 (high income growth) produces mean SCC values three to six times higher than the other SSPs.

29. As an additional comparison, figure OA-13 in the online appendix presents a figure analogous to figure 11 but applying our RFF-SP-calibrated discounting parameters (ρ, η) to the SSPs. This is purely for presentational purposes, and we caution that our discounting parameters were not calibrated to the SSPs. Because the SSPs have no uncertainty within them, it is not possible to calibrate discounting parameters to them as we can do to the socioeconomic distributions.
Hence, if one were to use a weighted combination of the SSPs, the resulting average SCC would reflect the relative weight given to each SSP, especially SSP5, a choice with no clear empirical basis. This result highlights the importance of incorporating a quantitative accounting of economic uncertainty, as in the RFF-SPs.
Next, figure 12 demonstrates the effect of stochastic versus constant discounting on the mean SCC, leading to our second major conclusion: stochastic growth discounting is crucially important to SCC estimation in the context of socioeconomic uncertainty. In table 1, the first two columns of the first row show mean SCCs under the RFF-SPs for stochastic growth discounting approaches consistent with 3 percent and 2 percent near-term rates (both in 2020 dollars), producing mean SCC estimates of $61.4 and $168.4/ton CO2, respectively. These estimates, reflecting the updated socioeconomic, emissions, climate, and discounting modules (three of the four NASEM recommendations), are 33 percent and 50 percent higher than the corresponding DICE-only SCC estimates from the 2016 IWG ($46 and $112/ton CO2 in 2020 dollars; RFF and NYSERDA 2021).
We also present the results from the RFF-SPs with constant discounting to illustrate the importance of stochastic discounting, but as previously discussed, constant discounting is inappropriate when uncertainty in economic growth is considered, as here. When constant discounting is coupled with uncertain growth, the mean SCC is higher than is appropriate by a factor of three to nine ($194 and $1,557/ton CO2 for 3 percent and 2 percent discount rates, respectively) because it ignores the correlation between damages and growth (the climate beta) and hence the discount rate. In other words, ignoring the risk profile of the SCC threatens to overstate the mean SCC in this example by a factor of three or more ($194 versus $61/ton with 3 percent discounting, and $1,557 versus $168/ton with 2 percent discounting). More specifically, the high expected values reflect a right-skewed distribution of damages. It is well known that skewed distributions and tail events can influence the expected value of the benefits of mitigating climate change (Gollier 2008; Weitzman 2011, 2014). Under constant discounting, such tail events are very rich futures with large amounts of consumption at risk from climate change. Yet constant discounting treats each dollar of cost to those wealthy future generations the same as a dollar of cost to a relatively poor future. Hence, with constant discounting, the effects on the future rich inappropriately dominate the expected value of the SCC, leading to a strong upward bias in the SCC estimate.
This problem is recognized in the finance literature as the result of ignoring the risk properties of an investment, namely, the correlation of an uncertain payoff with the stochastic discount rate. Stochastic growth discounting addresses this by discounting the high-growth, high-damage states at a higher rate. By discounting high-growth states more, stochastic discounting stabilizes the mean and variance of the SCC, as shown below.
The second row of table 1 highlights this greater stability under stochastic discounting by showing a sensitivity case in which we drop the top and bottom 1 percent of the global average income trajectories. 30 Under constant discounting, the mean SCC is quite sensitive to dropping these 1 percent extremes, falling from $194 to $96/ton at a 3 percent discount rate and from $1,557 to $450/ton at a 2 percent discount rate. By contrast, the mean SCC is virtually unchanged under stochastic discounting, changing by less than 1.5 percent for each of the stochastic rates that are consistent with 2 percent and 3 percent near-term rates. More generally, the SCCs with stochastic discounting change only negligibly even when much larger percentiles are dropped from the tails. For example, with stochastic discounting, the mean SCCs also change by less than 1.5 percent even when the top and bottom 10 percent of draws of global average income trajectories are dropped, whereas under constant discounting, the mean SCCs fluctuate by factors of 3 to 11. This result highlights the stabilizing effect of properly incorporating stochastic growth discounting, as anticipated in the NASEM (2017) report.
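This trimming exercise is easy to reproduce in stylized form. The sketch below uses an illustrative growth distribution (not the RFF-SPs) with the paper's 2 percent near-term parameters (ρ = 0.2 percent, η = 1.24), and compares the sensitivity of the mean to dropping the 1 percent growth tails under stochastic versus constant discounting:

```python
import numpy as np

rng = np.random.default_rng(0)

t = 280.0
g = rng.normal(0.015, 0.005, 100_000)     # illustrative growth uncertainty
md = 0.01 * np.exp(g * t)                 # damages scale with income

rho, eta = 0.002, 1.24                    # the paper's 2 percent near-term values
per_draw = {
    "stochastic": np.exp(-(rho + eta * g) * t) * md,
    "constant": np.exp(-0.02 * t) * md,
}

def trim_change(x):
    # Relative change in the mean after dropping draws whose growth is in
    # the top or bottom 1 percent, mimicking the paper's sensitivity case
    lo, hi = np.quantile(g, [0.01, 0.99])
    trimmed = np.mean(x[(g >= lo) & (g <= hi)])
    return abs(trimmed - np.mean(x)) / np.mean(x)

for name, x in per_draw.items():
    print(name, trim_change(x))
```

In this stylized setup the constant-discounting mean moves by an order of magnitude more than the stochastic-discounting mean when the tails are dropped, mirroring the pattern in table 1.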
30. Specifically, we drop the draws with global average GDP per capita in 2300 in the top 1 percent and bottom 1 percent of draws, before taking the average SCC.
This stability with stochastic discounting is apparent in figure 12, which plots the individual Monte Carlo SCC draws against each draw's long-run global GDP per capita growth rate, under the 3 percent near-term stochastic discounting parameters (ρ = 0.8%, η = 1.57).31 Roughly speaking, the vertical spread of SCC values in the figure largely reflects climate uncertainty for each given level of growth in GDP per capita, whereas the horizontal spread of SCC values reflects uncertainty in long-run income growth. Because the DICE damage function is proportional to GDP, undiscounted marginal damages scale roughly one-for-one with income growth, but with η > 1 they are discounted somewhat more than one-for-one under stochastic discounting, leading to a modest negative relationship between the SCC and GDP per capita growth. In other words, the SCC is higher when income growth is lower, and vice versa.
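The sign of this SCC-growth relationship can be verified in a stylized setting. In the sketch below, ρ = 0.8 percent and η = 1.57 are the paper's 3 percent near-term parameters, while the η = 0.8 case and the growth distribution are purely hypothetical; the correlation flips sign as η crosses one:

```python
import numpy as np

rng = np.random.default_rng(0)

t = 280.0
g = rng.normal(0.015, 0.005, 100_000)   # illustrative long-run growth draws
md = np.exp(g * t)                      # undiscounted damages proportional to GDP

# With eta > 1, damages are discounted more than one-for-one with growth,
# so discounted damages fall as growth rises; with eta < 1 they rise
for rho, eta in ((0.008, 1.57), (0.0, 0.8)):
    scc = np.exp(-(rho + eta * g) * t) * md
    print(eta, np.corrcoef(g, scc)[0, 1])
```

Discounted damages scale as e^((1 − η)gt), so the η = 1.57 case yields a negative correlation with growth and the η = 0.8 case a positive one.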

V. Conclusion
Since the SCC is a vitally important metric guiding climate policy, its calculation must be supported by the best available science, including an explicit incorporation of uncertainty. Our results demonstrate that socioeconomic uncertainty and stochastic discounting are important drivers of the SCC, and our work presents an opportunity to incorporate those uncertainties into ongoing updates.
Although the SCC estimates presented here are meant to be illustrative and use a highly simplified estimate of climate damages, they nonetheless highlight two major conclusions. First, socioeconomics matter significantly to the SCC, highlighting the importance of a quantitative accounting of socioeconomic uncertainty. Whereas scenario-based socioeconomic projections like the SSPs have no formal probabilities attached to them, our approach to quantifying uncertainties in future trajectories of population, GDP, and emissions helps account for these uncertainties in the SCC. Second, when incorporating socioeconomic uncertainty, stochastic growth discounting is crucial to account for the correlation of climate damages and the discount rate; ignoring it leads to a large upward bias in the SCC estimate. Our work represents an advance in modeling uncertain socioeconomic trajectories and discounting approaches based on empirically grounded, explicitly probabilistic methods. Nevertheless, potentially important components have not yet been fully incorporated into officially adopted SCC values. Recent work has begun to account for how the risk of tipping points influences the SCC (Dietz and others 2021). Other important factors include climate-related migration, conflict, and loss of at-risk species. Another conceptual issue is equity weighting, wherein effects on poorer regions of the world could be weighted more heavily than equivalently sized dollar effects on rich regions (Errickson and others 2021). Future research in these areas could be incorporated into official SCC values over time.

31. The shape of the curve is similar under the 2 percent near-term parameters (ρ = 0.2%, η = 1.24) but shifted up to a higher level.
More generally, the SCC should be continually updated as the scientific frontier advances, as recommended by NASEM (2017). Our work speaks directly to those NASEM recommendations and presents an opportunity for the US government to improve on simple, deterministic approaches to socioeconomic projections and discounting methodologies to better reflect the interrelated uncertainties about future population, income, emissions, climate, and discount rates.

COMMENT BY
MICHAEL GREENSTONE Over the past century, large temperature increases have been observed, with significant impacts on the overall climate (IPCC 2021). Greenhouse gas emissions, largely CO2, play a determining role in rising temperatures. While reducing greenhouse gas emissions can be costly, the associated mitigation of rising temperatures can lead to significant reductions in climate damages and net improvements in welfare.
The social cost of carbon (SCC) is a critical input into assessing whether potential climate policies have benefits that exceed their costs. The SCC is the monetized value of all future net damages associated with the release of an additional ton of CO2. The SCC therefore provides a measure of how much society should be willing to pay for a one-ton reduction in CO2 emissions and allows policymakers to compare a regulation's benefits and costs, both measured in dollars. The SCC became a key part of US climate policy in 2010 (Greenstone, Kopits, and Wolverton 2013; IWG 2013) and has been used extensively in the United States and internationally since then. The US government's estimate of the SCC derives from William Nordhaus's seminal research estimating the costs of climate damages and the SCC, as well as the Climate Framework for Uncertainty, Negotiation and Distribution (FUND) and Policy Analysis of the Greenhouse Effect (PAGE) integrated assessment models (Nordhaus 1992; Anthoff and Tol 2014; Hope 2011). To bastardize Winston Churchill's famous quote about democracy: as of 2010, the integrated assessment models were the worst approach to estimating climate damages, except for all the others that have been tried. (Churchill's quote is "Democracy is the worst form of Government except for all those other forms that have been tried from time to time.")1 It is my great pleasure to discuss this paper by Rennert and others, which suggests a new approach for the US government to update the SCC. The authors' approach emphasizes socioeconomic uncertainty and its correlation with damages. My goal in this comment is to situate their contribution in the broader context of a holistic approach to updating the US government's SCC, underline drawbacks of the previous approach, and suggest criteria for an SCC calculation consistent with advances in the literature, economic theory, and policy objectives.
Overall, my conclusion is that the authors have taken an important step in fixing what ails the SCC, but their improvements need to be digested and examined by the scientific community. Further, to this point, their solutions fail to exploit the advances in damage estimation, which many believe to be the area where the most progress has been made in the last ten to fifteen years. Overall, this is an important contribution but more is needed to return the US government's SCC to the frontier of scientific understanding about climate damages.

BACKGROUND
The SCC in climate policy. In the United States, most major legislation requires agencies to conduct cost-benefit analyses. For policies aimed at reducing CO2 emissions, such analysis relies heavily on the SCC. During the Obama administration, the SCC was set at $51 by the Interagency Working Group (IWG 2013). Setting out to roll back environmental regulations, the Trump administration lowered this number to $1-$8 by restricting damages to domestic ones and applying higher discount rates (Plumer 2018). Currently, the Biden administration has returned the SCC to $51 as an "interim" value and is actively working to update it.
The SCC extensively influences public policy. Through 2017, it had been used in analyzing the value of more than eighty regulations with gross benefits exceeding a trillion dollars. Moreover, at least eleven state governments, including Illinois and New York, use the SCC to value zero-emissions credits paid to clean energy producers (Rennert and Kingdon 2019). The SCC has also been implemented internationally: countries including Canada, France, Germany, Mexico, Norway, and the United Kingdom have all applied the SCC to some extent (Institute for Policy Integrity 2014). Finally, meaningful US mitigation efforts facilitate international climate negotiations and lead to significant reductions in emissions from other countries (Houser and Larsen 2021).
The US government's SCC is no longer on the frontier of understanding. I co-led the IWG in 2009-2010 along with Cass Sunstein. Nordhaus (1992) was incredibly influential in shaping the economics profession's thinking and giving us a framework for thinking about the SCC and climate damages. This was achieved through the Dynamic Integrated Climate Change (DICE) model, which is generically referred to as an integrated assessment model along with PAGE and FUND. These three models all date back to the 1990s and formed the basis of the SCC. Even though all three models were somewhat dated by 2010, it was reasonable to conclude that the SCC reflected the frontier of understanding because the economics profession had not done much to update them since their creation.
In the intervening dozen years, these three integrated assessment models, and the resulting SCC, have fallen behind this frontier in several key areas. First, the set of models is highly reliant on expert judgment; only a limited set of people know what goes on inside the models, making replication, validation, and improvement challenging. This kind of small-club approach is not the best way to make scientific progress. Second, when these three models were created, computational resources were relatively limited, which forced researchers to make several simplifying assumptions. For example, the estimated relationship between human well-being and changes in temperature in these models is based on quite limited data and a heavy reliance on functional form assumptions. In this respect, they generally have not taken advantage of the robust and rapidly growing climate damages literature (Deschênes and Greenstone 2011) that has emerged in the past two decades (see figure 1). Similarly, the underlying climate models are dated and fail to capture many climate dynamics. Third, the integrated assessment models were largely deterministic and hence incompletely accounted for uncertainty in their estimates. Finally, these models produced highly aggregated estimates of climate impacts. Even the most disaggregated model, FUND, has only sixteen regions, assuming, for instance, that climate change will affect Miami and Minneapolis identically. Recent research has established that the impacts of climate change are highly heterogeneous, and this heterogeneity matters for the SCC calculation. For instance, Hsiang and others (2017) demonstrate that projected end-of-century climate damages are nine times greater in the poorest 5 percent of US counties than in the richest 5 percent. Consistent with these findings, Carleton and others (2020) find significant differences in the projected change in mortality risk both across and within countries.
Given the amount of time that has passed, it is not surprising that the SCC is due for an update. Indeed, the original IWG suggested that the SCC should be updated regularly to reflect advances in the understanding of key components of the calculation (IWG 2010). The need for an update was pointed out again seven years later by the National Academies of Sciences, Engineering, and Medicine (NASEM 2017). Further, subsequent work lays out a detailed plan for updating it. Finally, the SCC's legal durability depends on estimates grounded in frontier science and economics.
UPDATING THE US GOVERNMENT'S APPROACH TO SCC ESTIMATION

NASEM (2017) and subsequent work explain that there are four key modules, or ingredients, in constructing the SCC. This section describes the authors' efforts in each of these areas, providing some context for their contributions.
Socioeconomic and emissions pathways. The current (as of March 2022) and past SCC calculations, which were developed using Energy Modeling Forum (EMF 22) scenarios (Clarke and others 2009), do not reflect the last decade of work in probabilistic scenario development. A specific concern has been that the SCC relied on five equally weighted socioeconomic scenarios that did not span the full uncertainty about economic growth, greenhouse gas emissions, and population growth (NASEM 2017). Each of these socioeconomic variables is a key input into the SCC, so the SCC may not reflect the full range of expected variation in these variables.
For all three variables, the authors combined statistical projections of these variables with expert elicitation to generate probabilistic projections through 2300. These probabilistic projections are referred to as the RFF Socioeconomic Projections (RFF-SPs) and allow for substantially more socioeconomic uncertainty than was being captured previously.
In the case of economic growth, the authors rely on Müller, Stock, and Watson's (2019) statistical model of economic growth, extended out to 2300. The model is derived from a data set that covers 113 countries from 1900 to 2017. Müller, Stock, and Watson's (2019) projections were coupled with formal expert elicitation about the "frontier of economic growth." The ten experts (Daron Acemoglu, Erik Brynjolfsson, Jean Chateau, Robert Gordon, Lant Pritchett, Melissa Dell, Mun Ho, Chad Jones, Dominique van der Mensbrugghe, and Pietro Peretto) were interviewed separately for roughly two hours each. It is worth noting that the authors "omit some projections in the extreme tails of Müller, Stock, and Watson's (2019) distribution that are outside the range of historical experience" and outside the range specified by the experts. This choice is motivated by the fact, as the authors note, that "such low or high sustained growth rates would lead to global GDP/capita either falling by more than 90% between 2021 and 2300 (e.g., 0.99^279) or rising by a factor of more than 800,000 (1.05^279), implying a global average income of more than $10 billion per person" (online appendix).
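The trimming rationale in that quotation is easy to verify with simple compounding arithmetic over the 279 years from 2021 to 2300 (the growth rates and horizon are those quoted above):

```python
# Compound global GDP-per-capita growth over 2021-2300 (279 years)
# at sustained rates of -1% and +5% per year, as in the quoted passage.
YEARS = 279

def compound(rate_per_year: float, years: int = YEARS) -> float:
    """Ratio of final to initial GDP per capita under a constant growth rate."""
    return (1.0 + rate_per_year) ** years

shrink = compound(-0.01)  # 0.99**279: GDP per capita falls by more than 90 percent
grow = compound(0.05)     # 1.05**279: GDP per capita rises by a factor above 800,000

print(f"-1%/yr: x{shrink:.3f} (a decline of more than 90 percent)")
print(f"+5%/yr: x{grow:,.0f}")
```

Starting from a global average income on the order of $12,000, the upper bound multiplies income into the billions of dollars per person, matching the authors' reductio.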
With respect to population, the authors replaced the EMF 22 projections with a probabilistic UN statistical model extended to 2300. As with economic growth, the model was further altered to account for population experts' views, owing to expert disagreement about the projected lower bound on the total fertility rate. Finally, to incorporate emissions uncertainty, greenhouse gas emissions projections elicited from experts were paired with the economic growth scenarios. Nine experts were surveyed on population and ten on emissions trends.
Climate model. The integrated assessment models used in the SCC calculation represented economists' interpretation of climate change and did not reflect the last decade of climate modeling. The primary input into each model is equilibrium climate sensitivity, the total warming eventually realized from a doubling of CO2 concentrations in the atmosphere. While equilibrium climate sensitivity has a tremendous impact on the interim SCC calculation, its actual value is not known with scientific precision.
Accounting for the best science available at the time, the IWG combined the equilibrium climate sensitivity estimates across all models by employing a probability distribution, adapted from the Intergovernmental Panel on Climate Change's fourth assessment report (IPCC 2007), reflecting the likelihood of different possible climate outcomes at the end of the century. Even so, integrated assessment models fail to precisely measure multiple links in the causal chain from CO2 emissions to temperature change (Dietz and others 2021; Hänsel and others 2020; Montamat and Stock 2020; NASEM 2017). In particular, these models significantly understated the speed of warming (Montamat and Stock 2020). For instance, increased carbon concentrations lead to warmer and more acidic oceans, which in turn become less effective at removing CO2 from the atmosphere. The resulting positive feedback loop is missing from both the DICE and PAGE models (Dietz and others 2021). Furthermore, the delayed warming projected in these models likely results in a downward-biased estimate of the SCC, since warming further into the future is discounted more heavily.
The authors address this problem by incorporating the Finite Amplitude Impulse Response (FaIR) climate model (Millar and others 2017). This choice is consistent with NASEM's key criteria for the climate module of the SCC calculation (NASEM 2017). The model generates climate projections consistent with comprehensive, frontier science models, such as those composing the CMIP6 ensemble (Eyring and others 2016), and can be used to quantify uncertainty surrounding the impact of an additional metric ton of CO2 on global mean surface temperature. Moreover, the FaIR model is computationally feasible, transparently documented, and commonly used in SCC updates (Carleton and others 2020; Dietz and others 2021; Hänsel and others 2020; Rode, Baker, and others 2021; Rode, Carleton, and others 2021). FaIR's main limitation is that it does not capture changes in global mean sea level. One promising, but imperfect, way to overcome this limitation is to use semiempirical models that enable the inclusion of damages due to projected sea level changes (Kopp and others 2016).
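To give a sense of what a reduced-form climate module of this kind computes, the sketch below is a deliberately stylized impulse-response model in the spirit of FaIR, not FaIR itself: a pulse of CO2 decays across pools with different timescales, radiative forcing is logarithmic in concentration, and temperature adjusts with fast and slow components. Every parameter value is an illustrative round number, not FaIR's calibration.

```python
import math

# Illustrative parameters (round numbers, not FaIR's calibrated values).
POOL_FRACTIONS = [0.2, 0.3, 0.3, 0.2]      # share of a pulse entering each carbon pool
POOL_TIMESCALES = [1e9, 400.0, 40.0, 4.0]  # decay timescales in years; first ~permanent
F2X = 3.7                                  # W/m^2 of forcing per CO2 doubling
C0 = 278.0                                 # preindustrial CO2 concentration, ppm
THERM_WEIGHTS = [0.6, 0.4]                 # fast/slow thermal response shares
THERM_TIMESCALES = [4.0, 250.0]            # thermal adjustment timescales, years
ECS = 3.0                                  # equilibrium warming per doubling, K

def pulse_response(pulse_ppm: float, years: int):
    """Temperature path (K) after a one-time CO2 pulse, via discrete annual steps."""
    pools = [f * pulse_ppm for f in POOL_FRACTIONS]
    boxes = [0.0, 0.0]
    temps = []
    for _ in range(years):
        conc = C0 + sum(pools)
        forcing = F2X * math.log(conc / C0) / math.log(2.0)
        eq_temp = ECS * forcing / F2X  # equilibrium warming for this forcing
        for i, (w, tau) in enumerate(zip(THERM_WEIGHTS, THERM_TIMESCALES)):
            boxes[i] += (w * eq_temp - boxes[i]) / tau  # relax toward equilibrium
        temps.append(sum(boxes))
        pools = [p * math.exp(-1.0 / tau) for p, tau in zip(pools, POOL_TIMESCALES)]
    return temps

temps = pulse_response(pulse_ppm=10.0, years=300)
```

The point of models in this class is exactly what the text describes: they are small enough to run thousands of times, so the uncertainty in an emissions pulse's temperature effect can be quantified by perturbing the parameters.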
Damage functions. The next step in calculating the SCC is to take changes in the physical climate (e.g., temperature) and determine their impact on net economic damages. The relationship between economic damages and temperature change is known as a damage function. The previous generation of damage functions, embedded in the FUND, DICE, and PAGE models, was developed in the 1990s and hence omits a rapidly growing literature (see figure 1). Indeed, my judgment is that this is the area with the greatest advances in understanding in the last few decades.
There are several shortcomings in the damage function used for the SCC calculation. First, the older models rely heavily on data from wealthy countries with temperate climates. This means that these models had to rely on ad hoc assumptions to create global damage functions because there simply was no support in the available data for the hot, poor, and hot and poor places where much of the world's population lives. Given the tremendous progress in data availability over the last few decades, there is no need to rely on ad hoc assumptions any longer; instead, there are now opportunities to rely on large-scale and globally representative data.
Second, there is substantial heterogeneity around the planet: what happens in Accra when hot temperatures arrive is vastly different from the effect of those temperatures in Oslo, for example (figure 2). This heterogeneity can in part be explained by nonlinear relationships among temperature, mortality, and adaptation.

Figure 2. Panel A: Temperature and mortality. Panel B: Change in temperature, 2020 to 2099 (RCP8.5). Source: Adapted with permission; also available at SSRN: https://ssrn.com/abstract=3764255. Note: The figure shows estimated mortality-temperature relationships for ages 65 and older (panel A), as well as projected changes in the temperature distribution (panel B), for Oslo, Norway, and Accra, Ghana.

The DICE model currently used by the authors ignores this distributional impact by dividing the planet into no more than sixteen regions. To capture local nonlinearities, updated damage functions should be more granular; for example, the Climate Impact Lab employs distributed computing across 24,378 regions. Third, even within a given region, economic and climate uncertainty is substantial in every aspect of damage function estimation (mortality, coastal, labor, agriculture, electricity, and other fuels), as can be seen in figure 3. Thus, updated damage functions should account for heterogeneous effects of temperature across sectors, as well as econometric uncertainty.
Damage functions are the engine that drives the determination of the SCC, and the authors' reliance on older DICE damage functions means that their approach is behind the frontier in three specific ways. First, it is now possible to rely on damage functions that are empirically founded and represent plausibly causal impacts of climate change on socioeconomic outcomes. Second, recent work has demonstrated that data representative of the global population, not just rich or temperate regions, are now available and can be used to estimate damage functions. Finally, damage functions should account for both the estimated benefits and the costs of future adaptive investments, a hallmark of the Climate Impact Lab's estimation of damage functions represented in figure 3. These issues are discussed more fully elsewhere.
Discounting. CO2 added to the atmosphere causes a stream of damages associated with a given trajectory of warming that spans centuries. The choice of a discount rate is therefore highly consequential for determining the SCC. To date, the SCC has relied on a central constant discount rate of 3 percent, following the US government's guidance on the conduct of cost-benefit analysis.
This approach fails to account for several features of current economic thinking about discounting that are especially important in the climate context, where greenhouse gases can influence the climate for centuries after their release. These features include: (1) the 3 percent figure is intended to reflect the riskless rate, but that rate is now likely 2 percent or lower (Bauer and Rudebusch 2020, 2021); (2) uncertainty in the riskless discount rate would lead the discount rate to decline with the time horizon, which constant discount rates do not capture; and (3) payoffs to emissions mitigation could be correlated with future income realizations, in which case there is effectively a "climate beta" and riskless rates are inappropriate (Gollier and Hammitt 2014).

Figure 3. Economic and Climate Uncertainty in Examined Sectors. Sources: Panel A adapted with permission from Carleton and others (2020); panel B adapted with permission from Depsky and others (forthcoming); panel C adapted with permission from Rode, Baker, and others (2021); panel D adapted with permission from Hultgren and others (forthcoming); panels E and F from Rode, Carleton, and others (2021), adapted by permission from Springer Nature.

The Ramsey equation provides a standard way to think about the intertemporal problem of discounting that can accommodate these limitations of constant discounting. It is

r_t = ρ + η g_t,

where r_t represents the discount rate at time t, ρ is the pure rate of time preference, η is the coefficient of relative risk aversion, and g_t is the average growth rate of per capita consumption through time t. When the growth rate is uncertain, as is the case with the RFF-SP probabilistic growth scenarios, then the average discount rate in year t, r_t, is also uncertain. The present value of damages from an additional ton of emissions, MD_t, is then given by

PV = E[exp(-r_t t) MD_t] = E[exp(-(ρ + η g_t) t) MD_t],

where the expectation is taken over the uncertain growth path. This means that there is a stochastic discount factor (due to g_t), and that produces a declining certainty-equivalent risk-free rate.
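The declining certainty-equivalent rate can be illustrated with a small Monte Carlo sketch. With ρ and η fixed and the average growth rate g drawn from a bell-shaped distribution (the values below are illustrative stand-ins, not the paper's calibration), the rate r that solves exp(-rt) = E[exp(-(ρ + ηg)t)] falls as the horizon t lengthens, by Jensen's inequality:

```python
import math
import random
import statistics

random.seed(0)
RHO, ETA = 0.02, 1.5        # illustrative Ramsey parameters
G_MEAN, G_SD = 0.02, 0.012  # hypothetical uncertainty in long-run growth

draws = [random.gauss(G_MEAN, G_SD) for _ in range(20_000)]

def certainty_equivalent_rate(t: float) -> float:
    """Rate r with exp(-r t) = E[exp(-(rho + eta g) t)] under uncertain growth g."""
    mean_factor = statistics.fmean(math.exp(-(RHO + ETA * g) * t) for g in draws)
    return -math.log(mean_factor) / t

r_short = certainty_equivalent_rate(10)   # near rho + eta * mean growth
r_long = certainty_equivalent_rate(200)   # substantially lower at long horizons
```

Because the expectation averages discount factors rather than rates, the low-growth draws dominate at long horizons, pulling the certainty-equivalent rate down over time.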
An appealing feature of this approach is that it incorporates the climate beta: when η > 0, the discount factor is smallest when growth is largest and largest when growth is smallest. This captures the idea that a dollar of damages is more meaningful when we are relatively poor. To apply these insights and connect them to current riskless rates, the authors choose values of (ρ, η) that are disciplined by the current riskless rate. In so doing, they ignore the available evidence on the values of these parameters and instead use observed interest rates to govern the choice. This creates an inconsistency between their approach and the large body of literature that has, for example, estimated values for η that range from 1 to 4 but are generally centered around 2 (Gollier and Hammitt 2014). So this aspect of their approach has practical appeal, but it is not built on an especially solid foundation of evidence.
The authors apply this approach and demonstrate its value. An especially important finding is that when uncertainty in economic growth is incorporated (as the RFF-SPs do), then constant discounting produces values of the SCC that appear inappropriately high. This is because it places a relatively greater weight on damages that occur in good times, which does not fit the widespread evidence on the declining marginal utility of consumption. Put another way, it ignores the climate beta and leads to an upward bias in the SCC.
COMMENTS AND CONCLUSIONS

The SCC has been overdue for a revision, probably for almost a decade but certainly for at least five years since the 2017 NASEM report was issued. In many respects, this paper is a response to the near-term items that NASEM outlined. It updates the climate model, characterizes the uncertainty in projections of economic growth, population, and emissions, and implements a Ramsey-style approach to discounting that takes advantage of both that characterization of socioeconomic uncertainty and the possibility that climate damages are correlated with the overall economy. These are important accomplishments.
The straw that stirs this drink, and this work's primary contribution, is the set of socioeconomic projections. As a reminder, these boil down to projections of how these variables will evolve over the next three hundred years. This is a terribly difficult task but nevertheless a critical one for getting climate economics and policy right. The options for developing multicentury estimates of how growth, population, and emissions will evolve essentially reduce to relying on expert judgment in one form or another (e.g., the probabilistic scenarios or the deterministic shared socioeconomic pathways), using statistical models to make projections, or some combination of the two, as the authors do in this paper.
I will confess to skepticism about the value of relying on responses from prominent researchers to a two-hour survey that in many instances does not relate to the core of their scientific work. The penalty of being wrong is essentially zero and internal consistency in answers across questions is not assured. Further, I think there is little disciplining the replies besides personal opinions, prejudices, and incomplete recollections of statistical models. And yet, the academic reputation of the respondents provides credibility to the entire exercise-credibility that I think is unwarranted given the challenges I have outlined here. In contrast, good statistical projections are disciplined in transparent ways. We may argue about the statistical approach, but it is at least clear what it was.
In this vein, the following figures plot statistical features of the distribution of future global CO2 emissions from 10,000 joint population-GDP-emissions trajectories up to 2300 simulated in the paper. These trajectories are derived by sampling from the Resources for the Future distributions of future population, GDP, and emissions, which are constructed using a combination of prior studies and expert elicitation. In particular, emissions are paired one-to-one with each of the 10,000 population-GDP trajectories based on a distribution constructed through a survey of experts. I note that separate distributions were specified for direct emissions and for CO2 removal through negative emissions technologies, and that the authors generated net emissions by independently sampling from these distributions and summing. These figures are designed to show the basic statistical properties of the authors' projections, and they highlight some surprising features that at least partially arise from the use of expert elicitation to shape the probability distributions.
In figure 4, the solid line is the median across these 10,000 simulations, dark gray shading shows the 5th-95th percentile range, and light gray shading shows the 1st-99th percentile range. For comparison, the plots also show projections from the RCP/SSPs (dashed lines), which are the deterministic scenarios of socioeconomics and emissions used in the IPCC's Sixth Assessment Report (IPCC 2021). 3 There are several noteworthy features of this figure. It is certainly interesting, and perhaps reassuring, that the median falls in the middle of the SSP-RCP combinations. However, I was especially struck by some of the patterns in the tails. For example, 4 percent of the projections have cumulative emissions that are negative by 2300. This can only be the case if economically and technically feasible carbon dioxide removal technologies arrive and are used so extensively that more CO2 is removed from the atmosphere than was ever emitted since 1850. In effect, these projections assume that the optimal global temperature must be colder than preindustrial temperatures and that societies continue to fund the operation of carbon dioxide removal machines until this colder optimum is reached. A not quite as astounding, but still surprising, feature of the projections is that 11.4 percent of them have cumulative emissions that are lower than current cumulative emissions. This too can only be explained by massive use of carbon dioxide removal technologies. As a point of comparison, there are approximately zero technically and economically scalable examples of these technologies currently.

Figure 4. Source: Author's calculations. Note: For each year indicated on the horizontal axis, the solid line represents the median value of cumulative emissions across 10,000 simulations given in the paper; dark gray and light gray shaded areas respectively show the 5th-95th and 1st-99th percentile ranges across these simulations.
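This kind of tail accounting is straightforward to replicate once trajectories are in hand. The sketch below runs on synthetic stand-in values (hypothetical, not the paper's simulations), with the distribution chosen roughly to mimic the tail shares discussed above:

```python
import random

random.seed(1)

def percentile(values, q):
    """q-th percentile by linear interpolation over sorted values (0 <= q <= 100)."""
    s = sorted(values)
    pos = (len(s) - 1) * q / 100.0
    lo, hi = int(pos), min(int(pos) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

# Synthetic stand-in for the year-2300 cumulative emissions of 10,000
# trajectories, in GtCO2; purely illustrative, not the paper's data.
final_2300 = [random.gauss(5000.0, 3000.0) for _ in range(10_000)]
CURRENT_CUMULATIVE = 1700.0  # rough stand-in for cumulative emissions to date

median = percentile(final_2300, 50)
band_90 = (percentile(final_2300, 5), percentile(final_2300, 95))
share_negative = sum(v < 0 for v in final_2300) / len(final_2300)
share_below_current = sum(v < CURRENT_CUMULATIVE for v in final_2300) / len(final_2300)
```

With the paper's actual trajectories substituted for `final_2300`, the same three lines at the end reproduce the median, the percentile fan, and the tail shares reported in the text.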
Figures 5 and 6 report histograms of the year in which annual CO2 emissions peak and the year in which cumulative emissions peak (i.e., the year after which annual net emissions are negative, which is also the year net-zero CO2 emissions are achieved), respectively. Only a small share of the projections reach peak cumulative emissions before 2050. By 2100, 13.7 percent of the projections have reached peak cumulative emissions. Without revealing my own expert judgments, I will note that this is greatly at odds with the Paris Climate Accords, which set a target of achieving net zero by the middle of the twenty-first century. It is especially striking that 71.8 percent of the projections have not reached peak cumulative emissions by 2300, which would put any of the frequently discussed temperature change targets (e.g., 1.5 or 2.0 degrees Celsius) far out of reach. Personally, I am not quite sure what to make of these findings, but it would be instructive to have them interrogated by the academic community and to be explicit about the roles of the underlying statistical models and of expert judgment in producing them.
Overall, several of the findings from these figures surprised me. Does that mean that they are wrong? No. However, I think there is a strong case for opening these projections up to the research community so that they can be analyzed carefully. It would be especially interesting to compare them with projections based entirely on statistical models. Regardless of what conclusions are reached, the seriousness of the climate problem demands that such peer review be part of the process of inserting expert-elicitation-based projections into policy-relevant models of climate change.
Ultimately, my judgment is that all approaches to developing long-run socioeconomic projections are going to be unsatisfactory, but they are nevertheless necessary for devising a socially desirable policy to confront climate change. The authors have made a careful effort to develop these projections. They should be carefully scrutinized with an eye toward how much weight, if any, to place on expert judgment. Regardless of what is chosen, the paper also deserves credit for demonstrating how to integrate into Ramsey-style discounting both the uncertainty about these projections and the recognition that climate damages may be correlated with the overall economy. Both of these contributions align with key near-term recommendations from the 2017 NASEM report.
I will close by noting that NASEM's 2017 medium- and long-run recommendations extended beyond the contributions in this paper and involved the incorporation of empirically founded damage functions into the calculation of the SCC. This recommendation reflected the rapid advances in estimation techniques, data access, and computing that have made it possible to ground damage function estimation in data rather than assumptions.
It is now possible to achieve these long-run NASEM goals. Trevor Houser, Solomon Hsiang, Robert Kopp, and I cofounded the Climate Impact Lab in 2014 to build climate damage functions empirically and use them to calculate the SCC. The guiding principles were a ruthless belief that the SCC should be based on the best available econometric evidence and that it must account for adaptation costs and benefits, be globally representative, rely on the best available climate models, and value uncertainty and unequal impacts. The result of this work is the Data-driven Spatial Climate Impact Model (DSCIM), a modular system for using data to compute the SCC and the global impacts of climate change at the level of 25,000 regions (e.g., a US county) around the world. DSCIM is built to be very flexible: for example, it can incorporate characterizations of econometric, climate, and socioeconomic uncertainty (including the authors' projections), value this uncertainty, implement essentially any approach to discounting, including the one the authors outline, and deliver estimates of climate damages where people live rather than at the global or country level.
What is ahead for SCC research? The to-do list is long but it certainly includes building out damage functions for more sectors (e.g., damages from altered labor productivity, alterations in ecosystem services, migration, etc.), improving understanding of the costs and benefits of adaptation, understanding the interaction of impacts in sectors (e.g., agriculture and migration), and so much more. This is an exciting area of research with enormous implications for policy.

COMMENT BY MAR REGUANT

This paper makes significant contributions to updating the social cost of carbon (SCC) calculations, which estimate the marginal damages from climate change and are vital to climate policy. As explained by the authors, the SCC is often used to inform the design of important legislation, and it has influenced over sixty federal regulatory analyses (Aldy and others 2021).
The authors' goal is to present a framework with several steps to improve current estimations of the SCC. The proposed methodology corrects for the intertemporal changes in the utility of income due to income effects by introducing a stochastic discount factor. The authors also propose eliciting experts' beliefs on uncertain processes, such as emissions, growth, and population trends.
The work by Rennert and colleagues is part of a larger agenda conducted by Resources for the Future's (RFF) Social Cost of Carbon Initiative and collaborators to improve and expand the tools that inform the metrics surrounding the SCC. This agenda follows the recommendations of a 2017 committee report by the National Academies of Sciences, Engineering, and Medicine (NASEM 2017). 1 The initiative plans to further expand the treatment of uncertainty surrounding economic damages, another recommendation not yet addressed in this contribution, which remains limited in that respect. 2 In this comment, I focus first on the mechanics behind the correction introduced by using a stochastic discount factor. I then discuss the challenges of eliciting beliefs about future outcomes for scenarios that have never happened, a difficulty shared when estimating the SCC with micro data. I also point out the benefits of complementing the SCC calculation with more detailed sectoral analyses. Finally, I conclude by discussing the importance of understanding what the estimates of the SCC imply for our abatement strategies, highlighting the necessity of verifying that the recommended SCC is consistent, under reasonable assumptions, with the emissions trajectory that informs it.

CORRECTING FOR THE DECLINING MARGINAL UTILITY OF INCOME
A key finding of the authors is that considering the declining marginal utility of income, in an intertemporal sense, leads to a lower SCC estimate. 3 Table 1 in the article shows that accounting for stochastic growth discounting (declining marginal utility of income in future events) reduces the SCC by a factor of two to four. The authors find that outliers drive part of the magnitude of the correction. However, even muting the role of outliers, the SCC still falls from $96 to $61 when the correction is introduced. In sum, considering the declining marginal utility of income leads to less emphasis on climate mitigation efforts.
It is essential to understand the factors driving this finding, which leads to a downward correction of the SCC. This finding clashed with my initial economic intuition, as I had expected the SCC to become more prominent after considering the declining marginal utility of income. My instinct was based on two premises. From a cross-sectional point of view, marginal damages from climate change tend to be largest for poor agents (countries or households), and these agents also have the largest marginal utility of income. From an intertemporal point of view, I expected bad news from climate (severe marginal damages) to correlate with poor growth outcomes, leading to low income and high marginal utility of income as well.
The fact that the SCC is larger for better outcomes is in part driven by the model's assumptions. As explained by the authors, the role of the correction can depend critically on the "climate beta," a parameter summarizing the relationship between damages and aggregate consumption: "The magnitude of this bias [of the SCC that ignores the stochastic discount factor] depends on the climate beta and on the nature of the uncertainty in socioeconomic and emissions trajectories." The methodology takes a climate beta of essentially one in the short run, partly driven by damages being proportional to GDP. In good states, marginal damages are larger, potentially much larger, given that growth is exponential. However, states with severe damages are also the richest, so they matter less in relative terms under Ramsey-like discounting. Therefore, there is a substantial downward correction of the SCC precisely for those states in which the uncorrected SCC is highest.
An additional aspect contributing to a significant downward correction of the SCC is that expert beliefs about growth have a relatively symmetric bell shape centered around a roughly 2 percent growth rate. Because damages are exponential in growth via the previous assumption, marginal damages in high-growth states are corrected downward more heavily than those in low-growth states. Still, these states are similarly likely, leading to an overall downward correction of the central estimates of the SCC.

3. Contemporaneous differences in the marginal utility of income are not considered in the paper.
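The direction of this correction can be reproduced in a few lines. Under the stated assumptions (damages proportional to GDP, so a climate beta of one, and bell-shaped growth uncertainty), Ramsey-discounted expected damages fall below what constant discounting at the rate implied by mean growth would deliver. All numbers below are illustrative, not the paper's calibration.

```python
import math
import random
import statistics

random.seed(2)
RHO, ETA, T = 0.02, 1.5, 100   # illustrative Ramsey parameters and horizon (years)
G_MEAN, G_SD = 0.02, 0.012     # bell-shaped growth uncertainty (hypothetical)
DAMAGE_SHARE = 0.01            # marginal damages as a share of GDP (climate beta = 1)

growth = [random.gauss(G_MEAN, G_SD) for _ in range(50_000)]

# Stochastic discounting: damages scale with GDP, the discount factor with GDP^-eta,
# so high-growth (rich) states are discounted most heavily.
pv_stochastic = statistics.fmean(
    DAMAGE_SHARE * math.exp(g * T) * math.exp(-(RHO + ETA * g) * T) for g in growth
)

# Constant discounting at the rate implied by mean growth, applied to mean damages.
r_const = RHO + ETA * G_MEAN
pv_constant = math.exp(-r_const * T) * statistics.fmean(
    DAMAGE_SHARE * math.exp(g * T) for g in growth
)
```

The constant-rate calculation keeps full weight on the high-growth states where damages are exponentially larger, while the stochastic discount factor shrinks exactly those states, producing the downward correction discussed in the text.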
It would be helpful to consider departures from the baseline assumptions of the model more explicitly, for example, by assessing the magnitude of the bias correction for a range of climate betas.
ON THE DIFFICULTIES OF FORECASTING

In the proposed calibration of the SCC, expert beliefs determine the trajectories of population, emissions, and growth. These are challenging objects to forecast, even for experts, because they concern processes we have never observed.
In truth, forecasting the socioeconomic impacts of climate change is difficult for any methodological approach taken, be it expert elicitation, data estimation, or mathematical modeling. 4 Climate change is an extrapolation of a process that we have never witnessed. The socioeconomic processes behind population growth, emissions, and economic outcomes are extremely difficult to inform in uncharted territory. As soon as we move away from physical processes that follow well-understood laws of physics, nonlinearities in the impacts of climate change are hard to foresee.
The highly uncertain nature of growth, emissions, population, and climate damages affects the precision of the distribution of the SCC. Even without incorporating uncertainty about economic damages, figure 11 shows that there is a wide distribution of implied marginal damages for any given approach or set of assumptions. Therefore, the calculation should be interpreted as an improvement to a "scenario" approach to calibrating the SCC but of a highly uncertain nature.
While this might be the best possible strategy, I believe there is room for improvement in future work. In particular, the modeling of interactions among these three processes is limited, given that they are elicited from different experts. The paper allows for some consideration of the correlations between the distributions, but it is not very clear in the current contribution to what extent they play a significant role. Additionally, it would be interesting to include the elicitation of climate betas in future iterations.
The elicitation of independent experts for these categories can at times lead to seemingly inconsistent beliefs. For example, when describing the correlation between growth and emissions, "many [growth] experts expected that technology breakthroughs in clean energy would dramatically lower global emissions. Implicit in this narrative is a negative correlation between economic growth and carbon dioxide emissions." However, the beliefs of the emissions experts, as implied by figure 7, seem to associate higher emissions with higher growth, at least in the medium run (2050). Some further discussion of how to handle these relationships could be a fruitful area of improvement.
BRIDGING THE GAP BETWEEN SHADOW VALUES

As countries announce pledges to reduce their emissions by 2030 and to achieve net zero by 2050, there has been a focus on how to reduce emissions most effectively for a given goal. This alternative approach calculates the price of carbon that would be consistent with a specific reduction in emissions by solving for the optimal portfolio of climate policies under a given target. The target is taken as given, and therefore this approach is not explicit about what the damages from climate change are. Examples of such an optimization approach are the models solving for how to reach net zero in the United States or the policy framing of the Fit for 55 goals in the European Union. 5 There has been recent discussion of the merits and pitfalls of each approach (Stern and Stiglitz 2021; Aldy and others 2021). While the SCC approach has been preferred in the US context, the target approach is much more common in Europe. This is in part due to the greater agreement on politically established emissions reduction goals across a broad range of the political spectrum. Europe, indeed, has had a cap-and-trade mechanism with explicit cross-sectoral goals since 2005 and has explicit 2030 and 2050 reduction goals that are being built into legislation.
While it seems natural to stick with the SCC for US regulatory purposes, I believe there could be fruitful interactions between these two methods. As in the traditional prices-versus-quantities trade-off, both approaches suffer from limitations in the presence of uncertainty about the costs and benefits of fighting climate change (Weitzman 1974). Because the target approach focuses on cost-efficient abatement, it tends to have a much more detailed accounting of uncertainties on the technological front, something missing in the highly aggregated integrated assessment models (e.g., Dynamic Integrated Climate Change [DICE], Climate Framework for Uncertainty, Negotiation and Distribution [FUND], Policy Analysis of the Greenhouse Effect [PAGE]). In the approach proposed by the authors, and given the focus of SCC calculations on damages, such uncertainty is only indirectly present in the trajectories forecasted during expert elicitation.6 More accurately describing the expected transformation at a given SCC would add credibility to the forecasted emissions trajectories. Using detailed modeling of technology, even if only for specific sectors in which technology is well understood, such as the electricity sector, could help bound the expected impacts of using the SCC as a tool for cost-benefit analysis. Considering the implications for abatement of a given SCC is particularly important for the proposed methodology, which takes the SCC as an output rather than an equilibrium object, as I discuss next.

5. See, for example, Princeton University Andlinger Center for Energy and the Environment, "The Net-Zero America Project," https://acee.princeton.edu/rapidswitch/projects/net-zero-america-project/; or European Council, "Fit for 55," https://www.consilium.europa.eu/en/policies/green-deal/eu-plan-for-a-green-transition/.
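The prices-versus-quantities logic invoked above can be summarized with the standard statement of Weitzman's (1974) result: under quadratic approximations of the benefit function B(q) and cost function C(q), and additive uncertainty about costs with variance \(\sigma^2\), the comparative advantage of a price instrument over a quantity instrument is

```latex
% Weitzman (1974): comparative advantage of prices over quantities
% under quadratic B(q), C(q) and additive cost uncertainty \sigma^2.
\[
  \Delta \;=\; \frac{\sigma^2}{2\,C''^{\,2}}\left(B'' + C''\right),
  \qquad B'' \le 0,\quad C'' > 0.
\]
% Prices (an SCC-style benchmark) dominate when the marginal damage
% curve is relatively flat, |B''| < C''; quantities (a target) dominate
% when marginal damages are steep, |B''| > C''.
```

In this framing, the US preference for the SCC and the European preference for targets amount to different implicit bets on the relative slopes of marginal damages and marginal abatement costs under uncertainty.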
A CRITICAL FEEDBACK LOOP

In contrast with other attempts to quantify the SCC, the proposed methodology does not use an integrated assessment model to derive an equilibrium value for the SCC. Instead, the SCC is estimated as the marginal damage of climate change evaluated at the forecasted level of emissions.
This simplicity in the framework is achieved by directly incorporating expert elicitation for equilibrium trajectories of growth, emissions, and population. However, it comes at the risk of producing an out-of-equilibrium object that is potentially inconsistent with the estimated SCC.7 The importance of the output from this research, an updated SCC to be used in US policy, makes this concern even more crucial. The SCC is a vital number for US policymaking. As explained in the article, "as political leaders and stakeholders debate both the broad outlines and the fine details of policies to reduce carbon dioxide emissions, the SCC lies in the background as a remarkably important calculation, used by the US federal government for more than a decade for developing vehicle fuel economy standards and power plant emissions rules." Precisely for this reason, it is important to ensure that the predicted SCC is consistent with the fundamentals that inform it. Estimating damages from climate change cannot be independent of an accurate understanding of the abatement cost function. Indeed, the equilibrium feedback loop of the calculation becomes even more critical if the obtained SCC has a direct impact on climate policy, and thus on the emissions trajectory.
6. Experts are considering beliefs on technological progress and breakthroughs when forecasting growth and emissions, but these are not quantitatively modeled.
7. In line with this concern, Diaz and Moore (2017) emphasize the value of using more detailed integrated assessment models that model more closely the functioning of resources and sectors but that ensure that the calculations of the SCC are in equilibrium.
To make this point more concrete, it is useful to take the illustrative estimate of a $61 SCC in the article. While this SCC is larger than previously estimated by the US federal government, it is unclear that it is consistent with the substantial and rapid decarbonization efforts needed to achieve the emissions reductions expected by the experts. In the electricity sector, one of the most easily decarbonized, it is unclear that such a threshold for cost-benefit analysis would lead to the massive transformation and nearly net zero generation that many governments have announced.
One response to this criticism is that nearly net zero targets in the electricity sector are not a desirable goal, as implied by the estimated SCC. However, because the estimate is not an equilibrium object under the proposed methodology (the heart of this criticism), it is unclear how emissions at this SCC can decrease as projected by the experts. Thus, coupling the estimated SCC with information about the actions expected to pass cost-benefit analysis at that value, even if limited to a subset of policies, would be tremendously useful. Figure 1 clarifies this point. The increasing curve represents the marginal cost of abatement. The downward-sloping curve represents the marginal damages of climate change, which are highest when no abatement occurs and the temperature rises the most.8 In a simplified environment without uncertainty, the goal is to equate the marginal cost of abatement with the avoided marginal damages.
In the proposed methodology, the experts determine where on the damage curve the economy ends up. The researcher elicits emissions reductions from the experts and then estimates the SCC at that point. Suppose the experts do not have equilibrium beliefs about emissions reductions and are optimistic, concluding that abatement efforts will reach point E1. This would imply an SCC of SCC1 (point A). If these beliefs are out of equilibrium, however, abatement efforts at a cost-benefit threshold of SCC1 might be much lower, and emissions reductions might only reach E2 (point B). At E2, however, marginal damages should have been estimated to be larger and equal to SCC2 (point C).
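The out-of-equilibrium loop just described can be sketched numerically. The following is a minimal illustration with purely hypothetical linear curves and numbers (none of them from the paper), using abatement E, a marginal abatement cost curve MC(E), and a marginal damage curve MD(E):

```python
# Toy sketch (hypothetical numbers) of the out-of-equilibrium feedback loop
# in figure 1. E is the quantity of emissions reduction (abatement).

def mc(E, c=2.0):
    """Marginal cost of abatement, increasing in abatement: MC(E) = c * E."""
    return c * E

def md(E, d0=100.0, d1=3.0):
    """Marginal damages, decreasing in abatement: MD(E) = d0 - d1 * E."""
    return d0 - d1 * E

c, d0, d1 = 2.0, 100.0, 3.0

# Equilibrium: MC(E*) = MD(E*)  ->  c E* = d0 - d1 E*
E_star = d0 / (c + d1)          # 20.0
scc_star = mc(E_star)           # 40.0

# Optimistic expert belief: abatement reaches E1 = 30 (point A),
# so the elicited trajectory implies SCC1 = MD(E1).
E1 = 30.0
scc1 = md(E1)                   # 10.0

# But a cost-benefit threshold of SCC1 only induces abatement up to
# MC(E2) = SCC1 (point B), far short of E1.
E2 = scc1 / c                   # 5.0

# At E2, marginal damages should have been SCC2 (point C), well above SCC1.
scc2 = md(E2)                   # 85.0

print(E_star, scc_star, scc1, E2, scc2)
```

With these numbers, optimistic beliefs at E1 imply SCC1 = 10, but that threshold induces only E2 = 5 units of abatement, where marginal damages are SCC2 = 85, far above the elicited value and the true equilibrium price of 40.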
Any guidance on whether the emissions reductions implied by E1 are plausible under an SCC equal to SCC1 would add validity to the approach. One possibility would be to take a battery of more complex integrated assessment models and run them under the assumption that the benchmark for cost-benefit analysis is SCC1. Because complex integrated assessment models can be sensitive to their technological modeling, the researcher could get a sense of the sectoral and technological assumptions consistent with such an equilibrium outcome.

8. Note that the illustration gives the illusion that this might be a static formulation, but it should be interpreted as a summary of complex relationships with dynamic abatement and damage curves, including stochastic discounting and so forth.
I believe this is an important missing link. In the example above, I highlight the case where experts might be optimistic about emissions declines because this is the riskiest scenario. Underestimating the need for climate action could lead to substantially more serious marginal damages. Thus, having a more explicit bridge between experts' beliefs and bottom-up approaches that more systematically model expected policies and their impacts on emissions reductions could prove useful to make the recommendations more robust.
CONCLUSIONS

The SCC is a critical measure to inform climate policy. The authors present an expanded approach that incorporates stochastic discounting and expert elicitation into the estimation of the SCC. A few limitations remain in these calculations. First, it would be important to consider expanded damage functions that are more robust than those based on the DICE model.

GENERAL DISCUSSION

Michael Kiley offered a comment on the emerging literature in the monetary policy space and highlighted the implications of the developing consensus for the paper. Kiley began by noting that while the authors make some effort to respond to recent data suggesting lower long-run real interest rates, they may still have underreacted to the plausibility of baseline real interest rates below, or even well below, 2 percent. He argued that estimates of the social cost of carbon (SCC) will be highly sensitive to the prevailing real interest rate, particularly as that rate approaches zero. Given the paper's focus on the various sources of uncertainty in estimates of the SCC, he suggested that this is yet another source of uncertainty, one with the potential to substantially increase the SCC.
Kiley continued, remarking that the climate beta value is often positive, reflecting assumptions about the damage function and GDP projection. He added that those GDP projection pathways generally reflect fairly stable economic growth over the long run, albeit with considerable uncertainty. Kiley contended that bad climate scenarios are potentially ones in which the macroeconomy is riskier, implying a negative climate beta and a possibly higher SCC. He noted that this is intuitive at the microeconomic level, with increased probability of localized climate disasters resulting in poor economic outcomes, but argued there are possibilities for macroeconomic growth disasters due to warming that increase the welfare impacts of mitigation efforts. Kiley concluded by acknowledging that this would be a new area, but one that is potentially valuable for future iterations of the SCC literature.
David Popp referenced discussant Mar Reguant's concluding point, which identified a feedback loop wherein climate policy itself is an additional source of uncertainty and is directly affected by calculations of the SCC. Popp wondered whether the model could include a projection of the emissions pathway if climate policies followed the proposed SCC, and thus limit the calculation to internally consistent estimates of the SCC, avoiding cases where, for example, the SCC is estimated at $56 conditional on some emissions pathway, but that pathway is not itself achievable under a $56 cost of carbon. Popp questioned whether there might be some way to link those elements of the calculation to arrive at an internally consistent estimate.
Wendy Edelberg pointed out that, due to the wide bands of uncertainty in the paper's SCC estimates, the distributions sometimes suggest a negative value for the SCC. She wondered if that was a natural consequence of simulations that are not bounded at zero or whether that is a legitimate possibility. Edelberg also asked how adaptation plays into the model.
While acknowledging that projecting adaptation into the future is a difficult task, she wondered if that was factored in and whether that may be a contributing factor to the distributions including negative values for the SCC estimates.
Glenn Rudebusch pointed out that there are also cases where the SCC can become infinite, particularly when the real interest rate is negative for a long time. He noted that this phenomenon has already been evident in recent data, so it is an additional factor the discounting module would have to grapple with. In particular, Rudebusch questioned how long negative real interest rates could persist, and in turn how far r* could fall. He referenced his paper with Michael Bauer that attempted to answer a similar question.1

Adrian Raftery responded to the questions posed by discussant Reguant and followed up on by Popp regarding climate feedback loops within the model. Raftery specified that for the population model, climate feedbacks were not explicitly included but were implicitly included. He pointed out that it would be difficult to identify exactly what those feedbacks would be, and thus the population model is a statistical model with some adjustments based on expert elicitation. He clarified that the biggest driver of future population is fertility, and there is no evidence to date that climate has a direct impact on fertility (though there is evidence for the inverse). Raftery then mentioned that there are big climate impacts on mortality, but while they result in large damages overall, they account for only about one-third of 1 percent of total deaths.2 He argued that small changes in public health would likely have larger overall mortality impacts than the climate. He continued with the point that migration has very little effect on world population overall. One exception is international migration, with people from poorer countries moving to relatively richer countries. Raftery posited that climate disasters tend to lead to more internal migration than external migration, with exceptions being small island nations and Bangladesh, meaning that the ultimate effects are hard to nail down.3 He concluded by arguing that the authors have implicitly included climate feedbacks on population because the model is estimated on seventy years of data, from 1950 to 2020, a period during which there has been 1 degree Celsius of warming.4 Because the model is estimated on data from a period of substantial warming and anticipates similar warming rates, it implicitly incorporates elements of climate feedbacks.
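Rudebusch's observation that the SCC can become infinite under persistently negative real rates can be illustrated with a toy present-value calculation (hypothetical numbers; a sketch, not the paper's discounting module). The discounted sum of a constant marginal damage stream converges as the horizon grows only when the discount rate is positive:

```python
# Toy illustration (hypothetical numbers): present value of a constant
# $1-per-year marginal damage stream under different constant real rates.
# With r > 0 the sum converges as the horizon lengthens; with r <= 0 it
# grows without bound, which is the sense in which a persistently
# negative rate makes the SCC infinite.

def pv_constant_damages(rate, horizon):
    """Sum of 1/(1+rate)^t for t = 1..horizon."""
    return sum((1.0 + rate) ** -t for t in range(1, horizon + 1))

for r in (0.03, 0.01, 0.0, -0.01):
    pv300 = pv_constant_damages(r, 300)
    pv600 = pv_constant_damages(r, 600)
    print(f"r = {r:+.2f}: PV(300y) = {pv300:10.1f}, PV(600y) = {pv600:10.1f}")
```

At r = 3 percent the present value is essentially unchanged between a 300-year and a 600-year horizon; at r = 0 it equals the horizon itself; and at r = −1 percent it more than doubles when the horizon doubles, with no finite limit.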
Kevin Rennert followed, clarifying that by design the paper does not directly incorporate feedbacks (for example, from climate damages onto economic growth), because the intention is for the integrated assessment model itself to calculate the damages and then feed them back onto the pathway for economic growth. This is preferred to having the experts or the underlying statistical model calculate damages and add them on top, which would lead to double counting.
Richard Newell spoke to the distinction between the marginal benefits and the marginal costs of taking action, and the degree to which they should be matched or equated within the model. Newell specified that while integrated assessment models are sometimes used to determine an optimal carbon tax (set at a level at which emissions are consistent with marginal damages), the authors intentionally do not do that in this paper. Rather than trying to inform a decision about the globally optimal level of emissions, Newell pointed out that the paper is intended to aid cost-benefit and regulatory analysis for individual regulations or other actions that are small in the scheme of global climate action. He contended that it is appropriate to separate, and not directly couple, emissions within the model to marginal damage estimates.