In recent months, including at COP15 in Copenhagen, a large number of individuals have called for mitigation and remediation of global CO2 emissions on a scale that would rapidly reduce the atmospheric concentration to 350 ppm or lower. The 350 target was selected in part, but not entirely, due to the findings of Hansen et al.,1 which considered the current influence of climate forcings and what climate changes human civilization could adapt to at reasonable financial and human cost. Others have proposed a 450 ppm target, at which average global temperature increase is believed to stay below 2 degrees C, allowing maintenance of a generally stable human society with proper adaptation strategies. However, the 350 camp adamantly believes that stabilization at 450 ppm would not permit the comfortable survival of the human species on Earth; thus, although 450 ppm may be tolerable for a very brief period, 350 ppm must be the plateau for a final atmospheric CO2 concentration.
Depending on which measurements one uses, the current atmospheric concentration of CO2 sits between 386 and 388 ppm.1,2 Unfortunately most proponents of the 350 movement fail to realize that this number represents CO2 alone and its resultant climate forcing, not the other greenhouse gases, which can be folded in by measuring concentration in CO2 equivalents. The CO2-equivalent concentration ranges higher still, very likely in the low to mid 400s. The 350 ppm goal must therefore be framed in terms of CO2 equivalency, not CO2 alone; otherwise the goal is structured so that achieving it directly may not deliver the intended result.
Now one may find fault with the statement that CO2 equivalency is over 400 ppm, so how was that statement derived? The IPCC uses the following formula to calculate CO2 equivalency:
Total Climate Forcing = 5.35 × ln(CO2 equivalency / CO2 pre-industrial)
First, note that CO2 pre-industrial is the atmospheric CO2 concentration before humans began emitting large amounts into the atmosphere due to the industrial revolution and beyond; this concentration is commonly taken as 278-280 ppm.1 Most people view total climate forcing as the forcing from all significant greenhouse gases, where the significant gases are those defined by the Kyoto Protocol (CO2, CH4, N2O, HFCs, CFCs, etc.).
Using these elements alone, a total climate forcing relative to 2007 of approximately 2.71 W/m^2 can be calculated. This value leads to a CO2 equivalency of approximately 461.35 ppm to 464.67 ppm. However, this methodology does not account for the other forcings that also influence the climate, such as aerosols, surface albedo, clouds and ozone. The IPCC 4th Assessment Report, released in 2007, used empirical information from 2006 and earlier due to its filing deadline. That information generated a forcing map defining a total climate forcing of approximately 1.6-1.7 W/m^2 when all relevant factors are taken into consideration. The figure below outlines these forcings.3
These climate forcing numbers result in a CO2 equivalency of approximately 374.9 ppm to 382 ppm (very similar to the concentration of atmospheric CO2 at the time). The CO2 equivalency drops significantly because of the negative forcing assigned to aerosols, clouds and surface albedo, among other elements. Unfortunately, since the publication of the 2007 IPCC report, new empirical evidence has re-evaluated the forcing influence of aerosols, calculating a weaker negative climate forcing than previously thought.4 New information has also emerged regarding clouds and the durability of their impact on climate forcing. Like aerosols, clouds provide a negative climate forcing that reduces the overall rate of increase in surface temperatures; however, this new information suggests that as sea surface temperatures increase, low-level stratiform clouds decrease in both size and frequency.5 Thus the ability of clouds to reduce the severity of climate change is itself reduced as temperatures increase, so their influence will significantly wane over time.
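These ranges can be checked by inverting the simplified forcing formula above. The sketch below is illustrative only; the function name and the 278 ppm default baseline are choices made for this example, not the IPCC's actual code:

```python
import math

# IPCC simplified expression: F = 5.35 * ln(CO2eq / CO2_preindustrial).
# Inverting it gives CO2eq = CO2_preindustrial * exp(F / 5.35).
def co2_equivalent(total_forcing_wm2, co2_preindustrial_ppm=278.0):
    """CO2-equivalent concentration (ppm) implied by a total forcing (W/m^2)."""
    return co2_preindustrial_ppm * math.exp(total_forcing_wm2 / 5.35)

# Greenhouse-gas-only forcing of ~2.71 W/m^2:
print(co2_equivalent(2.71))          # ~461 ppm (~465 ppm with a 280 ppm baseline)

# All-factors forcing of 1.6-1.7 W/m^2 from the 2007 report:
print(co2_equivalent(1.6))           # ~375 ppm
print(co2_equivalent(1.7))           # ~382 ppm
```

Both quoted ranges fall out directly: 461-465 ppm for greenhouse gases alone, and roughly 375-382 ppm once the negative forcings are netted in.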
In addition to the waning influence of aerosols and clouds, surface albedo both on land and at sea, especially at sea, has been taking a beating in recent years, reducing its ability to limit the rate of climate change. Finally, simple intuition applied to current empirical evidence suggests an atmospheric CO2 equivalency higher than the current atmospheric CO2 concentration. For example, the rate of ice melt in the Arctic, Greenland, and Western and even Eastern Antarctica significantly eclipses the predictions made in the IPCC 4th Assessment Report, which suggests either incorrect assumptions regarding climate forcing or the exclusion of a significant factor influencing climate change in a negative (temperature-increasing) way. Given the size of the error bars associated with previous climate forcing calculations, the first option seems more probable. Overall, with such rapid and negative changes to the climate, everyone had better hope that CO2 equivalency is in the 400s and not the 300s; otherwise the situation is much worse than anyone previously thought.
With all this said, a number of people believe the 350 ppm target is unrealistic, in that humans do not possess the necessary tools and/or determination to accomplish such a goal, and that humans are better off preparing for a world with an average temperature at least 2 degrees C warmer. Proponents counter that a phase-out of coal over the next 20 years plus aggressive anti-deforestation and reforestation programs would go a long way toward the 350 goal at modest cost. Unfortunately for the 350 ppm proponents, the real failure to maintain a familiar ecosystem and environment may come not from a failure of human will, but from a failure of tactics built on inaccurate information. The chief concern is that improper tactics are being recommended because the information underlying the warming trend is incomplete.
There are two crucial elements pertaining to the probability of achieving a specific ceiling and stabilization of global surface temperatures: the climate sensitivity of the Earth, and the atmospheric concentration of CO2 and other greenhouse gases. Climate sensitivity describes how the surface temperature changes in response to a stabilized doubling of atmospheric CO2. It is important because it tries to provide a direct correlation between surface temperature and changes in CO2 concentration; basically, climate sensitivity describes the influence of greenhouse gases on temperature change. In 2007 the IPCC 4th Assessment Report defined the range of climate sensitivity as between 2 and 4.5 degrees C.3 For 350 and other temperature ceiling movements such a range should be troubling, because the lower bound was raised by 0.5 degrees C from the 2001 IPCC 3rd Assessment Report, which defined climate sensitivity as between 1.5 and 4.5 degrees C;3 in only a few years the estimated floor jumped 33%.
The primary means of deducing climate sensitivity is correlating known temperature change trends in the past with changes in atmospheric CO2 concentration. The best historical data comes from the Last Glacial Maximum because of the size and accuracy of the temperature and CO2 concentration shifts. For example, during the Last Glacial Maximum the CO2 concentration was approximately 180 ppm, versus 280 ppm in typical pre-industrial times and the 386-388 ppm that currently exists.6 Average surface temperatures were 7 degrees C lower in relation to this CO2 concentration, which generates a climate sensitivity of 11.2 degrees C.6 Despite this number, most climatologists consider it flawed due to questions surrounding how feedbacks like existing sea ice, clouds and water vapor, along with other particulates, were factored into its calculation. Most believe that these feedback elements were more pronounced during the Last Glacial Maximum than they are now, which significantly reduces climate sensitivity in the present.
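That headline figure can be roughly reproduced with the same simplified forcing formula used earlier. This is a back-of-the-envelope sketch only; the cited study's actual methodology is more involved, so the result lands near, not exactly on, the published 11.2 degrees C:

```python
import math

# Forcing difference implied by the CO2 shift from ~180 ppm (LGM)
# to ~280 ppm (pre-industrial), using F = 5.35 * ln(C / C0):
forcing_lgm = 5.35 * math.log(280.0 / 180.0)   # ~2.36 W/m^2

# Forcing from a doubling of CO2:
forcing_doubling = 5.35 * math.log(2.0)        # ~3.71 W/m^2

# Scale the observed ~7 C glacial-interglacial shift to a per-doubling value:
sensitivity = 7.0 * forcing_doubling / forcing_lgm
print(sensitivity)                             # ~11 C per doubling
```

The point of the arithmetic is simply that attributing the full 7 degree C shift to CO2 forcing alone yields an enormous sensitivity, which is exactly why climatologists insist the other glacial feedbacks must be disentangled from it.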
Overall, the most widely accepted value for climate sensitivity comes from Charney, who calculated a climate sensitivity of 3 degrees C when incorporating fast feedbacks.7 Unfortunately this calculation assumed an instantaneous doubling of CO2 with no surface changes, and eliminating surface changes limited its accuracy. Hansen et al.1 used paleoclimate data that included slow surface albedo feedback, assuming a first-order relationship between the area of surface/sea ice and global temperature, to calculate a climate sensitivity approximately double Charney's (6 degrees C vs. 3 degrees C). Normally such a calculation would not matter much, because slow feedbacks operate over centuries to millennia; but those timescales were only ever experienced under natural cycles, not with humans dumping hundreds of gigatons of CO2 into the atmosphere. Thus, it is difficult to rule out these slower feedbacks exerting an influence after decades instead of centuries. Empirical evidence, especially the rapidly melting surface ice in the Arctic, already demonstrates the significance of these slower feedbacks.
Determining a reasonable climate sensitivity is important because it is a principal element in how predictions are made regarding future changes in surface temperature and the overall climate in general. Predictions about the future climate come from many different climate models combined with current empirical observations. Modeling the climate is incredibly complicated, requiring thousands of different variables as well as hundreds to thousands of interactions between those variables just to generate results that can be considered 'in the ballpark'. Applying these interactions and variables places a tremendous demand on the time and energy of the computers involved in the modeling. Therefore, to ensure that generating a single result does not take weeks or months, certain elements are removed from consideration in the final results. In addition, there are gaps in knowledge regarding how certain variables interact with others; to preserve some level of accuracy, these interactions are also removed, or estimated as best one can and modeled accordingly.
Unfortunately these omissions create inaccuracy in the model's ability to predict how the climate will change relative to how it actually changes. It is these omissions that most climate skeptics have attacked, claiming that because a given model is potentially inaccurate, the very essence of climate change is wrong. Of course any rational person realizes that such claims are utter nonsense: all of the valid empirical evidence still demonstrates that climate change is occurring almost entirely due to human action, and the lack of a completely accurate model, something that probably will never exist in the first place, does nothing to taint that evidence. However, few people have considered that the predictions made by climate models might be incorrect on the other side of the coin: that they underestimate the rate of climate change. Due to new empirical evidence, largely surrounding the much more rapid ice and glacier melt in the Arctic,8 more individuals are questioning whether the results of the IPCC 4th Assessment Report were inaccurate, predicting too slow a surface temperature shift.
When predicting future temperature changes, climate models tend to generate either a linear or a quasi-exponential increase in average global air temperature over future years. For example, after modeling four distinct scenarios of future human action, the IPCC 4th Assessment Report illustrated surface temperature changes as shown in the graph below.3
At first glance such predictions may seem reasonable, based largely on how much CO2 and other greenhouse gases humans continue to emit into the atmosphere. However, when considering all of the potential environmental feedbacks that could trigger during warming, such results seem less and less plausible. The more noteworthy feedback factors with a high probability of driving additional future temperature increase include: increased water evaporation leading to more water vapor in the atmosphere,9,10 CO2 and methane release from melting permafrost,11,12 nitrous oxide release from peat sources,13 decreased surface albedo (and thus greater heat absorption) due to Arctic ice melt,3 new cloud synthesis or disappearance at different altitudes,5,14 longer rainforest dry seasons reversing sink-to-source behavior,15,16 and increased ocean temperatures driving conversion from sink to source. Although all of the feedbacks listed above demand concern, permafrost melt and ocean out-gassing demand the most due to the sheer amount of CO2 that either process could eventually release into the atmosphere. Both of these problems have previously been addressed on this blog at the following links:
Ocean Out-Gassing: http://bastionofreason.blogspot.com/2009/09/ocean-acidity-danger-and-remediation.html
It must be noted that the IPCC report identifies the potential inaccuracy in its conclusions due to feedback processes that were not included in the modeling. The decision to exclude most feedback information seems to stem from the lack of conclusive and accurate empirical data on those processes. Basically the mindset is: 'some inaccuracy from not including feedback process A is better than gross inaccuracy from interpreting that feedback process incorrectly.'
Even though the potential inaccuracy is discussed, sadly it appears that increased water vapor was the only significant feedback attempted, with varied results.3 A discussion was also given regarding the potential reduction of land and oceanic sinks due to surface temperature increases, but no direct comments regarding sink-to-source transformation were made. The inaccuracy of the models used by the IPCC, and of some of its conclusions, has become quite evident, most notably in the rapid pace of ice melt in the Arctic and new conclusions that the Arctic may be completely free of summer ice by 2015-2020 instead of 2080-2100. The most unfortunate element in all of this is that most climate-action proponents themselves do not incorporate the potential feedbacks into their strategies for limiting surface temperature increases to a certain ceiling. Overall, with feedbacks included, future average global surface temperature increases will more than likely follow a more severe trend than shown in the above graph.
The assumption of a more severe warming trend finds support when one considers the influence of the ocean in the carbon cycle. In large respects the ocean can be viewed as a dynamic, self-replenishing buffer of sorts. Various denizens of the ocean, most notably phytoplankton, are able to absorb CO2 either directly from the atmosphere or from the ocean itself for photosynthesis. When these organisms die, the carbon fixed in photosynthesis is typically confined to sediment at the bottom of the ocean. After that confinement to sediment, the capacity of the ocean to absorb more CO2 increases. In short, through the interaction of oceanic organisms the ocean is able to continually increase, or at least maintain, its ability to draw CO2 from the atmosphere.
Unfortunately a buffer can only neutralize pH changes up to a certain concentration of added counter-agent (acid or base); beyond that concentration the buffer collapses and the pH shifts. The same is true of the ocean and its ability to absorb CO2. As the atmospheric concentration of CO2, largely due to human-driven activities, increases, the ocean continues to absorb that CO2, but the rate of absorption outpaces the pathway responsible for burying carbon in sediment. Thus CO2 builds up in the ocean, both reducing the ocean's ability to absorb further CO2 from the atmosphere and decreasing the efficiency of the removal pathway by limiting the organisms responsible for that removal. The number of organisms falls because increasing acidity leaves less calcium carbonate available for food-chain-critical organisms to build their calcium carbonate structures. When these creatures, like coral, are unable to create calcium carbonate shells, it negatively affects large portions of the oceanic food chain, including organisms that aid in CO2 removal. Eventually this process concludes at an equilibrium point where the ocean can no longer absorb CO2 from the atmosphere, eliminating its capacity as a CO2 sink.
Now while the loss of the ocean as a sink is bad enough, there is a very real possibility that the ocean will eventually become a source of atmospheric CO2 rather than a sink. Most of the warming due to excess CO2 in the atmosphere has occurred not on land but in the ocean. This warming matters because gas solubility in a liquid decreases as temperature increases: higher temperature means more kinetic energy and more molecular movement, which increases the probability of bond breaking and reduces the ability of the gas to remain in solution. Therefore, as the ocean continues to warm, its maximum capacity for CO2 storage in dynamic equilibrium with the atmosphere will decrease, causing it to release CO2 into the atmosphere until a new, lower storage equilibrium is established. Although it is unclear how much CO2 could be released by such a solubility shift, the fact that the ocean has absorbed approximately 118 ± 19 gigatons of anthropogenic carbon over the last 200 years17 and takes up approximately 8-10 gigatons of CO2 a year from the atmosphere paints a dreary picture. Overall, although out-gassing would be horrible, the loss of the ocean as a CO2 sink would be far worse over the course of decades.
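The solubility argument can be illustrated with a standard van 't Hoff temperature scaling of Henry's law. This is a rough illustration, not an ocean-chemistry model: the ~0.034 mol/(L·atm) reference constant and the ~2400 K temperature-dependence parameter are commonly tabulated values for CO2 in fresh water, and real seawater carbonate chemistry is considerably more complicated.

```python
import math

# Van 't Hoff scaling of the Henry's law constant for CO2:
# k_H(T) = k_H(T0) * exp(d * (1/T - 1/T0)), with d ~ 2400 K for CO2.
def co2_solubility(temp_c, k_h_ref=0.034, d=2400.0, temp_ref_c=25.0):
    """Approximate Henry's law constant (mol/(L*atm)) for CO2 at temp_c."""
    t = temp_c + 273.15
    t0 = temp_ref_c + 273.15
    return k_h_ref * math.exp(d * (1.0 / t - 1.0 / t0))

# Warmer water holds less dissolved CO2 at the same partial pressure:
print(co2_solubility(15.0))   # highest of the three
print(co2_solubility(25.0))
print(co2_solubility(30.0))   # lowest of the three
```

Even this crude sketch shows the direction of the effect: every degree of surface-ocean warming lowers the equilibrium amount of CO2 the water can hold against a given atmospheric concentration.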
Tie the feedback-driven loss of the ocean as a CO2 sink and the potential for out-gassing to the prospect of continual CO2 and methane release from the estimated 1,672 gigatons of carbon stored in permafrost,18,19 and those feedback elements alone could create a huge shift in surface temperatures regardless of what humans emit in the decades to come.
Suppose one rejects the above contention of rapid, severe warming due to these feedback effects. Even if such warming is rejected, and that rejection turns out to be correct, those wishing to hold temperature increases to a ceiling of 2 degrees C still have the problem of 'backwash warming': warming that has yet to catch up with the influence of the current level of climate forcing. Basically, the climate forcing already applied to the environment has not been fully compensated for through changes in average surface temperature, largely due to slow feedbacks. That is to say, if all human CO2 emissions ceased tomorrow, the average global temperature would still increase another x degrees. Although it is not clear how much warming will actually occur through this 'backwash', Hansen et al.1 estimate an additional global temperature increase of approximately 1.4 degrees C. Add that to the 0.6 to 0.9 degrees C already accrued since pre-industrial times (depending on which temperature record is used), and an increase of 2 degrees C is extremely probable regardless of what actions humans take. The graph used by Hansen et al.1 to illustrate this additional future warming is shown below.
Reduction or mitigation of future emissions is an important element in limiting the total temperature increase, but it is clear that the governments of the world are unwilling to make the cuts that would allow mitigation to stand alone as a strategy without further technological intervention. Currently, despite the cries and curses of the environmentalist movement, it is unlikely that enough viable trace/zero-emission energy can be generated to compensate for the drawdown from coal and natural gas at the speeds required. Also, regardless of how some environmentalists spin it, like Joe Romm of Climateprogress, China issuing a non-binding pledge to reduce carbon intensity is rather meaningless, because reducing carbon intensity (with the economic growth still available to China) instead of doing nothing is like getting a 32% on a test instead of a 19%: it is still failure. As it currently stands, if the cuts required to avoid significant increases in surface temperature (2-3 degrees C) were made, they would significantly reduce overall global economic output due to less available energy. The energy gap that emission mitigation creates for the United States was previously discussed in detail here:
So if mitigation is not occurring rapidly enough and generic geo-engineering tactics are basically worthless, what is to be done? The principal action beyond mitigation must be to draw CO2 out of the atmosphere by technological means. Technology must be harnessed simply because natural methods are not fast enough. Not only are current carbon sinks becoming compromised by current warming,20 but it is highly unrealistic that enough trees can be planted in the near future to enhance land sink capacity, especially when REDD, the most promising anti-deforestation proposal, has yet to expand in any significant capacity. Soil strategies using various tilling methods or bio-char may increase sink capacity by minute amounts, but nothing close to the level required. Thus, technology must be used.
In short, all research funds that individuals want directed towards point-source carbon capture (a.k.a. clean coal) must instead be directed to non-point-source carbon capture (a.k.a. air capture). This blog has previously discussed the outstanding concerns with air capture, and they are significant, but realistically the only way humans stop an increase in average surface temperature of even 3 degrees C, short of a miracle, is a combination of reduced carbon emissions and some form of air capture. Finally, the development of a technology that could draw CO2 from the ocean would be an exceptionally useful tool in furthering mitigation by increasing the sink capacity of the ocean.
Overall the environmentalist movement needs to shift gears; it is somewhat humorous that its members vent frustration at the portrayal of global warming as a legitimate debate, yet these same individuals do the same thing by continuing to talk to species annihilators (global warming deniers) in an attempt to convince them. At this point the debate is over; anyone who still believes that humans are not the driving force behind climate change will not change his/her opinion regardless of what facts and evidence are highlighted, and it is not worth wasting more time trying to convince them. The only thing that will convince these individuals is negative climate events that directly affect them, which is nothing environmentalists can provide. In addition, further discussion and proposition of foolish and inefficient boycotts of high-emitting bad guys like Exxon should cease, because such a strategy would be ineffective and would take time and personnel away from more meaningful and effective endeavors. Instead it is time to move into research and innovation mode.
Solutions and strategies need to be prepared before they are needed. An honest assessment must identify which energy technologies will be needed to replace coal and natural gas. A quick note for wind supporters: wind will not come close to providing the necessary energy for global growth, or even growth in the United States, especially if wind speeds continue to fall. Is nuclear really so expensive that widespread adoption is prevented, or is the expense contingent on using 2nd-generation technology instead of 3rd or 4th generation? What new energy strategies will need to be researched? There are many more questions beyond the few mentioned above that demand discussion and attention. These discussions cannot be broad-based with weak statements like 'oh, all sorts of trace-emission energy sources like wind, solar, nuclear and geothermal will be needed for the future'. No, they must be full of details and specifics, so businesses and researchers know exactly what future markets demand and expect.
In the end, although mitigation is important, remediation is also important because the environment has reached a point where nature cannot restore the balance on its own. There are important questions to be asked regarding remediation, and it is time for the environmental movement to start focusing on those questions rather than lamenting or championing the latest meaningless poll regarding the public's view of global warming, clean energy or whatever else is the subject of the poll du jour.
1. Hansen, James, et al. "Target Atmospheric CO2: Where Should Humanity Aim?" The Open Atmospheric Science Journal. 2008. 2: 217-231.
2. Tans, Pieter. NOAA/ESRL (www.esrl.noaa.gov/gmd/ccgg/trends)
3. Climate Change 2007: Synthesis Report. Intergovernmental Panel on Climate Change.
4. Myhre, Gunnar. "Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect." Science. June 18, 2009. DOI: 10.1126/science.1174461.
5. Clement, Amy, Burgman, Robert, and Norris, Joel. "Observational and Model Evidence for Positive Low-Level Cloud Feedback." Science. July 24, 2009. 325: 460-464. DOI: 10.1126/science.1171255.
6. Kohler, Peter, et al. "What caused Earth's temperature variations during the last 800,000 years? Data-based evidence on radiative forcing and constraints on climate sensitivity." Quaternary Science Reviews. 2009. 1-17. doi:10.1016/j.quascirev.2009.09.026.
7. Charney, J. "Carbon Dioxide and Climate: A Scientific Assessment." National Academy of Sciences Press: Washington DC. 1979. 33.
8. Hawkins, Richard, et al. "In Case of Emergency." Climate Safety. Public Interest Research Centre. 2008.
9. Santer, B, et al. "Identification of human-induced changes in atmospheric moisture content." PNAS. 2007. 104: 15248-15253.
10. Dessler, A, et al. "Water-vapor climate feedback inferred from climate fluctuations, 2003-2008." Geophysical Research Letters. 2008. 35: L20704.
11. Åkerman, H, and Johansson, M. "Thawing permafrost and thicker active layers in sub-arctic Sweden." Permafrost and Periglacial Processes. 2008. 19: 279-292.
12. Jin, H.-j, et al. "Changes in permafrost environments along the Qinghai-Tibet engineering corridor induced by anthropogenic activities and climate warming." Cold Regions Science and Technology. 2008. 53: 317-333.
13. Dorrepaal, E, et al. "Carbon respiration from subsurface peat accelerated by climate warming in the subarctic." Nature. 2009. 460: 616-619.
14. Booth, B, et al. "Global warming uncertainties due to carbon cycle feedbacks exceed those due to CO2 emissions." Geophysical Research. 2009. 11: 4179.
15. Cook, K, and Vizy, E. "Effects of Twenty-First Century Climate Change on the Amazon Rain Forest." Journal of Climate. 2008. 21: 542-560.
16. Phillips, O, et al. "Drought sensitivity of the Amazon rainforest." Science. 2009. 323: 1344-1347.
17. Sabine, C, et al. "The Oceanic Sink for Anthropogenic CO2." Science. 2004. 305: 367-371.
18. Schuur, E, et al. "Vulnerability of permafrost carbon to climate change: Implications for the global carbon cycle." BioScience. 2008. 58: 701-714.
19. Tarnocai, C, et al. "Soil organic carbon pools in the northern circumpolar permafrost region." Global Biogeochemical Cycles. 2009. 23: GB2023.
20. Canadell, J, et al. "Contributions to accelerating atmospheric CO2 growth from economic activity, carbon intensity, and efficiency of natural sinks." PNAS. 2007. 104: 18866-18870.