The PDFs of the papers can also be found in this filecabinet. Alternatively, you can download many of my papers at John Moore's web site. Otherwise just send me an email and I will provide you with a copy.
Detection and attribution of past changes in cyclone activity is hampered by biased cyclone records due to changes in observational capabilities. Here we relate a new homogeneous record of Atlantic tropical cyclone activity based on storm surge statistics from tide gauges to changes in global temperature patterns. We examine 10 competing hypotheses using non-stationary generalized extreme value analysis with different predictors (North Atlantic Oscillation, Southern Oscillation, Pacific Decadal Oscillation, Sahel rainfall, Quasi-Biennial Oscillation, Radiative Forcing, Main Development Region temperatures and their anomalies, global temperatures, and gridded temperatures). We find that gridded temperatures, Main Development Region temperatures, and global average temperature explain the observations best. The most extreme events are especially sensitive to temperature changes, and we estimate a doubling of Katrina magnitude events associated with the warming over the 20th century. The increased risk depends on the spatial distribution of the temperature rise, with the highest sensitivity from the tropical Atlantic, Central America and the Indian Ocean. Statistically downscaling 21st century warming patterns from 6 climate models results in a 2-7 fold increase in the frequency of Katrina magnitude events for a 1°C rise in global temperature (using BNU-ESM; BCC-CSM-1-1; CanESM2; HadGEM2-ES; INM-CM4; NorESM1-M).
Aslak Grinsted, John C. Moore, and Svetlana Jevrejeva (2013), Projected Atlantic hurricane surge threat from rising temperatures, PNAS, doi:10.1073/pnas.1209980110
Figure caption: The frequency of Katrina magnitude surges increases greatly in a globally warming climate.
The surge index record used in this paper is based on six long high resolution tide gauge records (see map) from the US Atlantic and Gulf coasts.
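The non-stationary GEV approach described above can be sketched numerically: the location parameter of the extreme value distribution is made a linear function of a temperature-like predictor, and the parameters are found by maximum likelihood. The sketch below is a minimal illustration on synthetic data, not the paper's actual fitting code; the numbers and the simple location-only temperature dependence are assumptions for the demo.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic illustration: annual surge maxima whose GEV location shifts
# with a temperature-like covariate T (all numbers are invented).
n = 100
T = np.linspace(-0.3, 0.7, n) + 0.1 * rng.standard_normal(n)
x = genextreme.rvs(c=-0.2, loc=1.0 + 0.8 * T, scale=0.5, random_state=rng)

def nll(params, x, T):
    """Negative log-likelihood of a GEV whose location is linear in T."""
    mu0, mu1, log_sigma, xi = params
    mu = mu0 + mu1 * T
    sigma = np.exp(log_sigma)            # keeps the scale positive
    # scipy's shape convention: c = -xi relative to the climate literature
    ll = genextreme.logpdf(x, c=-xi, loc=mu, scale=sigma)
    if not np.all(np.isfinite(ll)):
        return 1e10                      # parameters outside the support
    return -ll.sum()

res = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.1],
               args=(x, T), method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x
print(f"fitted location sensitivity to T: {mu1:.2f} (generated with 0.8)")
```

Once fitted, competing predictors can be compared by their maximized likelihoods (e.g. via AIC), which is the spirit of testing the 10 hypotheses against each other.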
Abstract: I assess the feasibility of multi-variate scaling relationships to estimate glacier volume from glacier inventory data. Scaling laws are calibrated against volume observations, optimized for the specific purpose of estimating total global glacier ice volume. I find that adjustments for continentality and elevation range improve the skill of area-volume scaling. These scaling relationships are applied to each record in the Randolph Glacier Inventory, which is the first globally complete inventory of glaciers and ice caps. I estimate that the total volume of all glaciers in the world is 0.35±0.07 m sea level equivalent, including ice sheet peripheral glaciers. This is substantially less than a recent state-of-the-art estimate. Area-volume scaling biases for large ice masses and incomplete inventory data are offered as explanations for the difference.
Citation: Grinsted, A. (2013): An estimate of global glacier volume, The Cryosphere, 7, 141–151, doi:10.5194/tc-7-141-2013
- Comparisons to Radic and Hock 2010, and Huss & Farinotti 2012.
- Area-volume scaling and glacier slope.
- Physical vs. statistical estimates of glacier volume.
- Area-volume scaling: controversial?
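For readers unfamiliar with area-volume scaling, the single-predictor version of the idea is just a power law, V = c·A^γ. The sketch below uses illustrative coefficient values of the kind found in the scaling literature, not the multivariate coefficients calibrated in the paper (which also use continentality and elevation range); the density and ocean-area numbers are standard round values.

```python
import numpy as np

# Single-predictor area-volume scaling, V = c * A**gamma
# (volume in km^3, area in km^2). c and gamma are illustrative
# values only, NOT the coefficients calibrated in the paper.
def glacier_volume_km3(area_km2, c=0.0433, gamma=1.29):
    return c * area_km2 ** gamma

areas_km2 = np.array([0.5, 5.0, 50.0, 500.0])
volumes_km3 = glacier_volume_km3(areas_km2)

# Convert a total ice volume to sea level equivalent, assuming an ice
# density of ~900 kg/m^3 and an ocean area of ~3.62e8 km^2.
def sea_level_equivalent_m(volume_km3):
    return volume_km3 * 0.9 / 3.62e8 * 1e3

sle_mm = sea_level_equivalent_m(volumes_km3.sum()) * 1e3  # in mm
```

Because γ > 1, volume grows faster than area, which is why the largest glacier complexes dominate the global total and why scaling biases for large ice masses matter so much.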
Thoughts on my experience with the open discussion format
This is my first open discussion paper, and I think it was an excellent choice for this manuscript. It is also my first solo paper, and the comments from the referees were therefore especially welcome. (Thank you anon. referee #1, R. Braithwaite, and M. Huss for comments that spurred great improvement over the discussion paper.)
Figure caption: The volume fraction stored in all glaciers larger than a given area. Dark cyan shows the results of this study; thin bright cyan excludes regions with many glacier complexes in RGI v2. The distribution from Huss and Farinotti (2012) is shown in green.
This figure shows that ~85% of the global glacier volume is stored in the ~1000 largest RGI glacier complexes (>100 km²). In the paper I suggest that we can improve the global volume estimate through detailed studies of those complexes.
We present here surface water vapor isotopic measurements conducted from June to August 2010 at the NEEM camp, NW Greenland (77.45° N, 51.05° W, 2484 m a.s.l.). Measurements were conducted at 9 different heights from 0.1 m to 13.5 m above the snow surface using two different types of cavity-enhanced near-infrared absorption spectroscopy analyzers. For each instrument, specific protocols were developed for calibration and drift corrections. The inter-comparison of corrected results from different instruments reveals excellent reproducibility, stability, and precision, with a standard deviation of ~0.23‰ for δ18O and ~1.4‰ for δD. Diurnal and intra-seasonal variations show strong relationships between changes in local surface humidity and water vapor isotopic composition, and with local and synoptic weather conditions. This variability probably results from the interplay between local moisture fluxes, linked with firn-air exchanges, boundary layer dynamics, and large-scale moisture advection. Particularly remarkable are several episodes characterized by high (> 40‰) surface water vapor deuterium excess. Air mass back-trajectory calculations from atmospheric analyses and water tagging in the LMDZiso atmospheric model reveal that these events are associated with a predominantly Arctic air mass origin. The analysis suggests that high deuterium excess levels are a result of strong kinetic fractionation during evaporation at the sea ice margin.
Citation:
Steen-Larsen, H. C., Johnsen, S. J., Masson-Delmotte, V., Stenni, B., Risi, C., Sodemann, H., Balslev-Clausen, D., Blunier, T., Dahl-Jensen, D., Ellehøj, M. D., Falourd, S., Gkinis, V., Grindsted, A., Jouzel, J., Popp, T., Sheldon, S., Simonsen, S. B., Sjolte, J., Steffensen, J. P., Sperlich, P., Sveinbjörnsdóttir, A. E., Vinther, B. M., and White, J. W. C.: Continuous monitoring of summer surface water vapour isotopic composition above the Greenland Ice Sheet, Atmos. Chem. Phys. Discuss., 13, 1399-1433, doi:10.5194/acpd-13-1399-2013, 2013.
in discussion here
Status: accepted
We examine the limitations of a semi-empirical model characterized by a sea level projection of 73 cm by 2100 under the RCP4.5 scenario. Calibrating the model with data up to 1990 and then simulating the period 1993-2009 produces sea level in close agreement with the acceleration in sea level rise observed by satellite altimetry. Contributors not driven by radiative forcing, such as the long-term adjustment of the Greenland and Antarctic ice sheets since the Last Glacial Maximum, abyssal ocean warming and terrestrial water storage, may bias the model calibration; correcting for them tends to reduce median sea level projections at 2100 by 2-10 cm, though this is within the confidence interval. We apply the semi-empirical approach to simulate the individual contributions from thermal expansion and small glacier melting. Steric sea level projections agree within 3 cm of output from process-based climate models. In contrast, the semi-empirical simulation of melting from glaciers is 26 cm, which is twice as large as estimates from some process-based models; however, all process models lack a simulation of calving, which likely accounts for 50% of small glacier mass loss worldwide. Furthermore, we argue that changes in the surface mass balance and dynamics of the Greenland ice sheet contributed to sea level rise in the early 20th century, fall within the semi-empirical model calibration period, and hence are included in semi-empirical sea level projections by 2100. The Antarctic response is probably absent from semi-empirical models, which will lead to an underestimate in sea level rise if, as is probable, Antarctica loses mass by 2100.
Jevrejeva, S., J. C. Moore, and A. Grinsted (2012), Potential for bias in 21st century semi-empirical sea level projections, J. Geophys. Res., doi:10.1029/2012JD017704, in press.
In one of the tests we calibrate against LeClercq's historical contribution from small glaciers and compare the resulting projections to Radic and Hock. Since then Marzeion et al. have projected much greater contributions, which are more in line with what we obtain with the semi-empirical model.
Caption: Steric sea level simulated with the semi-empirical model (black line), using the Gregory et al. data over the period 1800-2000 for calibration (blue) and steric sea level after 2000 from CMIP3 AOGCM experiments (blue). Colour bars are the 5-95% confidence limits for the semi-empirical model simulation (grey) and the AOGCMs (blue).
Efforts to extract a Greenland ice core with a complete record of the Eemian interglacial (130,000 to 115,000 years ago) have until now been unsuccessful. The response of the Greenland ice sheet to the warmer-than-present climate of the Eemian has thus remained unclear. Here we present the new North Greenland Eemian Ice Drilling (‘NEEM’) ice core and show only a modest ice-sheet response to the strong warming in the early Eemian. We reconstructed the Eemian record from folded ice using globally homogeneous parameters known from dated Greenland and Antarctic ice-core records. On the basis of water stable isotopes, NEEM surface temperatures after the onset of the Eemian (126,000 years ago) peaked at 8 ± 4 degrees Celsius above the mean of the past millennium, followed by a gradual cooling that was probably driven by the decreasing summer insolation. Between 128,000 and 122,000 years ago, the thickness of the northwest Greenland ice sheet decreased by 400 ± 250 metres, reaching surface elevations 122,000 years ago of 130 ± 300 metres lower than the present. Extensive surface melt occurred at the NEEM site during the Eemian, a phenomenon witnessed when melt layers formed again at NEEM during the exceptional heat of July 2012. With additional warming, surface melt might become more common in the future.
NEEM Community Members, 2012, Eemian interglacial reconstructed from a Greenland folded ice core. Nature, 493, 489–494, doi:10.1038/nature11789
Primary author: D. Dahl-Jensen. Great work!
This will be a challenge to reproduce by ice sheet models. Current generation models show a terminal decline at much lower temperatures than what is indicated here. I have a feeling the ice sheet geometry feedback on atmospheric flow is a key to explaining this. It will also be good to see studies investigating how much of the large isotope signal can be explained by isotope enabled circulation models using different ice sheet geometries.
Proxy data form natural time series used to lengthen instrumental climatic records, and may contain a significant degree of autocorrelation. Increased serial correlation limits the number of independent observations, violating the assumptions of conventional statistical methods. We estimate the significance of calibration and verification statistics used in dendroclimatic reconstructions by combining Monte Carlo iterations with frequency-domain (Ebisuzaki) or time-domain (Burg) time series modelling. Significance tests are presented for the Coefficient of Determination (R²), Coefficient of Correlation (r²), Reduction of Error (RE) and Coefficient of Error (CE) for time series ranging from very low to very high autocorrelation. Increased autocorrelation implies a higher occurrence of relatively high but spurious reconstruction statistics. Ebisuzaki time series modelling shows greater robustness and its use is recommended over Burg's method, which suffers from the restriction on the number of autoregressive coefficients imposed by the Akaike Information Criterion. Positive RE and CE values, traditionally viewed as indicating a successful reconstruction, are not necessarily significant and depend on the temporal structure of the time series used. The approach is also applied successfully to compute confidence intervals based on the temporal structure of the residuals of the transfer function. A Matlab® package and a Windows executable for non-Matlab® users are provided to perform the described analyses.
Macias-Fauria, Grinsted, Helama, Holopainen (2012), Persistence matters: Estimation of the statistical significance of paleoclimatic reconstruction statistics from autocorrelated time series. Dendrochronologia, 30 (2), doi:10.1016/j.dendro.2011.08.003
This paper is an advancement because the conventional wisdom in dendroclimatology is that "... there is no significance test for CE; any value greater than zero indicates some degree of model skill (Cook et al., 1994)" (e.g. p. 272 of this 2009 book). Well, we provide such a test now, and we show that you commonly need to be more than just positive to beat p=0.05.
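The flavour of the test can be sketched as follows: generate phase-randomized (Ebisuzaki-style) surrogates of the proxy, redo the calibration/verification exercise for each surrogate, and compare the observed RE against the resulting null distribution. This is a minimal sketch on synthetic red-noise series, not the published Matlab® package; the series, split, and surrogate count are invented for the demo.

```python
import numpy as np

def phase_randomize(x, rng):
    """One Ebisuzaki-style surrogate: same power spectrum, random phases."""
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0                  # keep the mean term real
    if x.size % 2 == 0:
        phases[-1] = 0.0             # the Nyquist term must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

n = 200
cal, ver = slice(0, n // 2), slice(n // 2, n)

def verification_re(proxy, clim):
    """Calibrate a linear transfer function on the first half, then compute
    RE over the second half, with the calibration mean as no-skill baseline."""
    b1, b0 = np.polyfit(proxy[cal], clim[cal], 1)
    est = b1 * proxy[ver] + b0
    base = clim[cal].mean()
    return 1.0 - np.sum((clim[ver] - est) ** 2) / np.sum((clim[ver] - base) ** 2)

rng = np.random.default_rng(1)
# Two strongly autocorrelated (random walk) series with no true relationship.
proxy = np.cumsum(rng.standard_normal(n))
clim = np.cumsum(rng.standard_normal(n))

re_obs = verification_re(proxy, clim)
re_null = np.array([verification_re(phase_randomize(proxy, rng), clim)
                    for _ in range(500)])
p_value = np.mean(re_null >= re_obs)
# For persistent series, spuriously positive RE values occur by chance,
# so RE > 0 alone is not evidence of reconstruction skill.
```

The significance threshold is then a high percentile (e.g. the 95th) of `re_null` rather than zero, which is exactly the sense in which "you need to be more than just positive".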
Sea level rise over the coming centuries is perhaps the most damaging consequence of rising temperature (Anthoff et al, 2009). The economic costs and social consequences of coastal flooding and forced migration will probably be among the dominant impacts of global warming (Sugiyama et al, 2008). To date, however, few studies (Anthoff et al, 2009; Nicholls et al, 2008) on infrastructure and socio-economic planning include provision for multi-century and multi-meter rises in mean sea level. Here we use a physically plausible sea level model constrained by observations, and forced with four new Representative Concentration Pathway (RCP) radiative forcing scenarios (Moss et al, 2010), to project median sea level rises of 0.57 m for the lowest forcing and 1.10 m for the highest forcing by 2100, which rise to 1.84 m and 5.49 m respectively by 2500. Sea level will continue to rise for several centuries even after stabilization of radiative forcing, with most of the rise after 2100 due to the long response time of sea level. The rate of sea level rise would remain positive for centuries, requiring 200-400 years to drop to the 1.8 mm/yr 20th-century average, except for RCP3PD, which would rely on geoengineering.
Figure caption: The graph shows how sea levels will change for four different pathways for human development and greenhouse gas pollution. The green, yellow and orange lines correspond to scenarios where it takes 10, 30, or 70 years before emissions are stabilized. The red line can be considered to represent business as usual, where greenhouse gas emissions are increasing over time.
Scenarios used: RCP3PD(green), RCP4.5(yellow), RCP6(orange), RCP8.5(red). More details on scenarios can be found here and here. [PDF]
Citation: S. Jevrejeva, J. C. Moore, A. Grinsted, Sea level projections to AD2500 with a new generation of climate change scenarios, Global and Planetary Change, Available online 21 September 2011, ISSN 0921-8181, doi:10.1016/j.gloplacha.2011.09.006.
Many coastal areas will need protection against rising sea levels in the coming centuries, regardless of climate change mitigation strategies. A one meter rise is practically certain except in the most optimistic scenarios.
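The qualitative behaviour above, with centuries of continued rise after the forcing stabilizes, falls out of any semi-empirical model in which sea level relaxes slowly toward an equilibrium set by the forcing. The sketch below integrates one such toy model; the functional form, parameter values and stylized forcing path are invented for illustration and are not the calibrated model of the paper.

```python
import numpy as np

# Toy semi-empirical model: sea level relaxes toward an equilibrium that
# is linear in the radiative forcing, dS/dt = (S_eq(F) - S) / tau.
# a, b, tau and the forcing path below are invented for illustration.
def project(F, years, a=0.5, b=0.0, tau=300.0, S0=0.0):
    """Integrate dS/dt = (a*F + b - S)/tau with a 1-year Euler step.
    F is a forcing anomaly (W/m^2); S is sea level in metres."""
    S = np.empty(years.size)
    S[0] = S0
    for i in range(1, years.size):
        S_eq = a * F[i - 1] + b
        S[i] = S[i - 1] + (S_eq - S[i - 1]) / tau
    return S

years = np.arange(2000, 2501)
# Stylized forcing: ramps up for 70 years, then held constant.
F = np.minimum((years - 2000) / 70.0, 1.0) * 6.0
S = project(F, years)
# Because tau is long compared to the stabilization time, sea level keeps
# rising for centuries after 2070, with most of the rise after 2100.
```

Changing the year at which the forcing flattens (10, 30 or 70 years) reproduces the ordering of the green, yellow and orange curves in the figure: earlier stabilization caps the equilibrium lower, but the slow relaxation continues in every case.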
Kemp et al. obtained a really nice proxy of relative sea level from North Carolina. North Carolina is subsiding, and they therefore apply a 'GIA' correction to obtain a sea level record free of this particular local effect. In our comment we argue that the uncertainty they use in the subsidence correction is too small. The important point is not that we have a better estimate than theirs, but that there is a range of different estimates, and that their uncertainty does not bracket them all. Further, the particular estimate they have chosen is an outlier, and cannot explain tide gauge observations (see right column). The effect of a greater subsidence correction and greater uncertainty is shown in this figure (click to zoom):
In their paper they then proceed to calibrate a semi-empirical model of global sea level, to this local sea level record. I believe this is not a suitable calibration target precisely because it is local. Further, our key point is that the uncertainties are too low, and that the proxy record as a consequence should not provide as strong constraints to the fitting procedure (therefore the model confidence intervals are artificially too tight). In my personal opinion the NCRSL record is much better suited as a validation for sea level models (in contrast to a calibration target).
Links to paper, comment and reply:
My reaction to their response:
Their response deals almost exclusively with what is the best value to choose for the GIA. This is exemplified in their concluding paragraph: "Regional geological data remain the most robust means to estimate late Holocene GIA ...". However, this response does not at all address the key problem we are highlighting: they have an overly optimistic uncertainty interval. We believe that the uncertainty interval should be several times greater than their stated value, and we base this on the work of Engelhart et al. [Disclaimer: Kemp et al. use the Engelhart paper to support the 'optimistic' uncertainty interval. The authors therefore feel that we misrepresent Engelhart. I would argue that we represent Engelhart's data exactly as they are and draw our own conclusions. Can that really be called a misrepresentation?]
The figure and text in the column on the right shows that a 1mm/yr subsidence rate cannot explain the regional tide gauge trends.
Detailed response to their 4 points:
- [ICE5G is a poor choice]: Our comment concerns the uncertainty interval, not the best-guess value. It is clear from our text that we chose ICE5G because it is closer to a mid-range value when considering the various estimates (including their geologic estimate). So, this point is moot.
- [GPS rates are too uncertain to determine regional trends]. This is the relevant quote from Engelhart et al. showing just how different geologic and gps rates are:
- [Grinsted et al. 'wrongly claim' a too large local tide gauge rate]: Our claim is based on the supplementary material to Engelhart et al., which can be downloaded here. The figure in question is reproduced in the column on the right; look for X=2950 km. If anything our 4 mm/yr claim seems too low!
- [The larger rate is inconsistent with Mediterranean proxy records]: First, we do not state we have the best estimate, only that the claimed uncertainty is too low. So, again the point is moot. Secondly, North Carolina relative sea level is not global sea level, so it is allowed not to match other regions. (And perhaps there are other local issues with these records.)
- "The geological rates of subsidence decline rapidly with distance from Hudson Bay along the U.S. Atlantic coast compared to the GPS observations. The GPS observations suggest that high rates of subsidence from the collapse of the forebulge extend into Virginia and the Carolinas (Sella et al., 2007; Snay et al., 2007). For example, the geological data within Chesapeake Bay, Virginia, estimate subsidence of 0.9 ± 0.3 mm a–1 compared to nearby GPS observations of 3.5 ± 1.6 mm a–1 (Sella et al., 2007) and 2.6 ± 1.2 mm a–1 (Snay et al., 2007)"
- This quote shows that GPS-inferred subsidence is significantly greater than the geologic estimate. They may be right that it is hard to extrapolate exactly what it should be in NC, but it is clear that GPS indicates the forebulge is collapsing significantly faster than geologic estimates allow.
Tide gauges show that the geologic GIA estimate is too low
The figure below is from the supplementary info to Engelhart et al. The important thing here is that each tide gauge (red dot) has been chosen so that it should be unaffected by confounding factors like ground water pumping. The distance between Tump Point, NC and Churchill, Canada is ~2950 km. For comparison, the global average rate of rise over the 20th century is 2 mm/yr. We can subtract the global rate from the local relative rate of sea level rise to estimate the subsidence. That gives a subsidence of at least 2 mm/yr.
It seems clear that before this issue is resolved, it is dangerous to assume global=local and that it therefore is premature to use the record as a calibration target.
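The arithmetic behind that claim is simply a subtraction, using the numbers quoted in this discussion (the ~4 mm/yr local relative rate from the Engelhart et al. supplementary figure, and the ~2 mm/yr 20th-century global average):

```python
# Back-of-envelope check, numbers taken from the discussion above.
local_rsl_rate = 4.0   # mm/yr, relative sea level rise near Tump Point, NC
global_rate = 2.0      # mm/yr, 20th-century global average rise
subsidence = local_rsl_rate - global_rate
# => 2 mm/yr, roughly double the ~1 mm/yr geologic GIA correction
#    applied by Kemp et al.
```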
We analyze the global sea-level budget since 1850. Good estimates of sea-level contributions from glaciers and small ice caps, the Greenland ice sheet and thermosteric sea level are available over this period, though considerable scope for controversy remains in all. Attempting to close the sea-level budget by adding the components results in a residual displaying a likely significant trend of ~0.37 mm/a from 1955 to 2005, which can, however, be reasonably closed using estimated melting from unsurveyed high-latitude small glaciers and ice caps. The sea-level budget from 1850 is estimated using modeled thermosteric sea level and inferences from a small number of mountain glaciers. This longer-term budget has a residual component displaying a rising trend likely associated with the end of the Little Ice Age, with much decadal-scale variability probably associated with variability in the global water cycle, ENSO and long-term volcanic impacts.
Moore, Jevrejeva, and Grinsted, The Historical Sea Level Budget, Ann. Glaciol., 52(59), 2011. [link]