Models

Overview

High-Resolution CESM (HR-CESM)

The Earth system model used for iHESP research is the Community Earth System Model version 1.3 (CESM1.3) developed at NCAR in collaboration with the broader climate modeling community. This particular version of CESM has been run and evaluated thoroughly for century-long climate simulations in a high-resolution configuration [1], whereas no high-resolution configuration of the more recent CESM2 is available yet. Furthermore, CESM2 is computationally much more expensive, primarily because of the many additional physics parameterizations included in its new atmospheric model version. The particular version of CESM1.3 used in this study (CESM1.3-beta17_sehires38) is based on an earlier version, CESM1.3-beta17_sehires20, described in Meehl, Yang, et al. (2019), which was developed specifically to support a high-resolution CESM version with a 0.25° atmosphere and a standard-resolution, nominal 1° ocean model. The CESM1.3 component models are the Community Atmosphere Model version 5 (CAM5; Neale et al., 2012), the Parallel Ocean Program version 2 (POP2; Danabasoglu et al., 2012; Smith et al., 2010), the Community Ice Code version 4 (CICE4; Hunke & Lipscomb, 2008), and the Community Land Model version 4 (CLM4; Lawrence et al., 2011).

A detailed description of the progression from CESM1.1 to CESM1.3-beta17 is provided in Meehl, Yang, et al. (2019). The most significant changes and improvements were made in the atmospheric model. A major change from CESM1.1 to CESM1.2 (also used in CESM1.3) was the move from an Eulerian to a Lagrangian vertical advection scheme within the Spectral Element dynamical core (SE dycore). The most noteworthy additional changes from CESM1.2 to CESM1.3 were (1) a rearrangement of the microphysics, (2) a change in the radiation code (Rapid Radiative Transfer Model for General Circulation Models, RRTMG), (3) updates to the heterogeneous freezing code and the gravity wave scheme (McFarlane, 1987; Richter et al., 2010), and (4) changes to dust tuning and soil erodibility. As shown by Meehl, Yang, et al. (2019), these changes in CESM1.3 improved the position of the Southern Hemisphere (SH) jet and produced high and low cloud simulations in better agreement with the available observations.

Although a high-resolution version of CESM1.1 was configured and successfully integrated for more than a century under a perpetual present-day (year 2000) climate forcing [1], a similar configuration for CESM1.3 was not available before the iHESP project. Therefore, the first concerted effort of iHESP was directed at configuring and testing a high-resolution version of CESM1.3, leading to the current CESM1.3-beta17_sehires38 tag with a 0.25° resolution in CAM5 and CLM4 and a nominal 0.1° resolution in POP2 and CICE4. There are 30 vertical levels in the atmosphere with a model top at 3 hPa, and the atmospheric model parameter settings remain unchanged from those in CESM1.3-beta17_sehires20 described in Meehl, Yang, et al. (2019), except for a couple of bug fixes: (1) a fix in the radiation code, which had omitted diffusivity angle calculations for key longwave radiation bands, ultimately causing unrealistic atmospheric temperatures and cloud formation, and (2) a fix for logic errors in the computation of snow water path and snow cloud fraction. A minor change to the processor decomposition for the SE dycore is also included in CESM1.3-beta17_sehires38.

The ocean and sea-ice models are essentially the same as those used in Small et al. [1]. The ocean model has 62 vertical levels with a maximum depth of 6,000 m, and both the horizontal and vertical grids are identical to those used in Small et al. [1]. Compared to CESM1.3-beta17_sehires20, two changes were made to POP2. First, a more efficient elliptic solver for the barotropic mode was back-ported from CESM2.0 to CESM1.3 to increase the computational efficiency of high-resolution simulations at large processor counts (e.g., Hu et al., 2015; Huang et al., 2016). The development of this new solver was led by the Chinese team in collaboration with NCAR, representing a contribution of the Chinese climate modeling community to CESM development. Second, the ocean coupling frequency was changed from 1 hr to 30 min to alleviate coupling instabilities between the ocean and sea-ice models. In the sea-ice model, an older penetrative shortwave calculation method was replaced with the newer delta-Eddington shortwave computation of Briegleb and Light (2007). Although the latter had been the default shortwave computation for standard-resolution simulations since the Community Climate System Model version 4 (CCSM4) and CESM1, it was not used in the high-resolution CESM1.1 simulation [1]. Unlike the standard-resolution version, the high-resolution POP2 in CESM1.3-beta17_sehires38 includes neither mesoscale and submesoscale parameterizations nor the overflow parameterization (Danabasoglu et al., 2010), the same as in [1]. Finally, the new coupler infrastructure developed for CESM2.0, the Common Infrastructure for Modeling the Earth (CIME), was back-ported to CESM1.3-beta17_sehires38.
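Updating the barotropic (vertically averaged) mode in POP2 requires solving a large two-dimensional elliptic problem for the surface pressure at every time step, and this is the part of the model that the back-ported solver accelerates at large processor counts. As a rough illustration of the kind of computation involved, the sketch below solves a small 2-D Poisson problem with an unpreconditioned conjugate gradient iteration; it is only a stand-in, not the solver described in Hu et al. (2015) and Huang et al. (2016), and the grid size and tolerance are illustrative.

```python
import numpy as np

def laplacian(p, dx):
    """5-point Laplacian with zero (Dirichlet) boundaries, illustrative only."""
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] +
                       p[1:-1, 2:] + p[1:-1, :-2] -
                       4.0 * p[1:-1, 1:-1]) / dx**2
    return lap

def cg_solve(rhs, dx, tol=1e-8, max_iter=500):
    """Conjugate gradient solve of -Laplacian(p) = rhs.
    A stand-in for the barotropic elliptic solve, not the POP2 solver."""
    p = np.zeros_like(rhs)
    r = rhs - (-laplacian(p, dx))          # initial residual
    d = r.copy()
    rs_old = np.sum(r * r)
    for _ in range(max_iter):
        Ad = -laplacian(d, dx)
        alpha = rs_old / np.sum(d * Ad)
        p += alpha * d
        r -= alpha * Ad
        rs_new = np.sum(r * r)
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

# Example: small square grid with a point source in the interior.
n, dx = 64, 1.0
rhs = np.zeros((n, n))
rhs[n // 2, n // 2] = 1.0
p = cg_solve(rhs, dx)
```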

CESM1.3-beta17_sehires38 was first ported and optimized on Stampede2 at the Texas Advanced Computing Center (TACC) for testing and tuning. Several decades of test runs were made, during which the melting snow grain radius and melt onset temperature parameters were adjusted in CICE4 to ensure that sea-ice thickness and extent were within the observed estimates and that the top-of-atmosphere (TOA) radiation imbalance was small (~−0.05 W m⁻² averaged over years 6 to 20). This finalized CESM1.3-beta17_sehires38 tag was then sent to the Chinese team for porting and optimization on the Chinese HPC. That effort was a major software engineering undertaking that took nearly a year to complete; details are described in Zhang et al. (2020). The outcome was a highly parallelized and efficient Chinese HPC version of CESM1.3-beta17_sehires38 that achieves close to five simulated years per calendar day on 61,600 cores with no model output. With high-frequency model output, including daily mean and 6-hourly variables, the throughput decreases to less than three simulated years per day (a ~40% decrease). The Chinese HPC version of the model used for this study is available via GitHub at https://github.com/ihesp/CESM_SW. Note that this model version uses the SE dycore in CAM5, which was employed for both the high- and low-resolution simulations conducted on the Chinese HPC.
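The TOA imbalance quoted above is an area-weighted global mean computed from the model's radiation diagnostics. As a hedged illustration, the sketch below computes such a diagnostic from monthly CAM history files with xarray; the file pattern is hypothetical, the standard CAM top-of-model variable names FSNT (net solar flux) and FLNT (net longwave flux) are assumed, and cosine-latitude weighting is used as an approximation appropriate for regridded lat-lon output (native SE output would use the per-column area variable instead).

```python
import numpy as np
import xarray as xr

# Illustrative file pattern; the actual iHESP output file names will differ.
ds = xr.open_mfdataset("case.cam.h0.*.nc", combine="by_coords")

# Net TOA imbalance: net downward shortwave minus outgoing longwave.
net_toa = ds["FSNT"] - ds["FLNT"]

# Area weights: cos(latitude), assuming regridded lat-lon output.
weights = np.cos(np.deg2rad(ds["lat"]))
global_mean = net_toa.weighted(weights).mean(dim=("lat", "lon"))

# Average over, e.g., simulation years 6-20 as in the tuning described above.
print(float(global_mean.sel(time=slice("0006", "0020")).mean("time")))
```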

Regional CESM (R-CESM)

The development of high-resolution, fully coupled, regional Earth system modeling systems is important for improving our understanding of climate variability, future projections, and extreme events at regional scales. The iHESP team has been developing the Regional Community Earth System Model (R-CESM), which incorporates the ocean model ROMS (Regional Ocean Modeling System v3.5) and the atmospheric model WRF (Weather Research and Forecasting model v3.5.1) into CESM2 (v2.1) using the CIME coupling infrastructure. Presently, this configuration has also been tested with CLM4 as the land model; other CESM2 components are planned to be incorporated in the future. R-CESM also adds WRF's alternative parameterizations for computing air-sea fluxes, so it offers two surface flux options:

  1. the standard CESM flux scheme developed for large-scale climate simulations
  2. WRF’s native flux schemes developed for synoptic-scale weather simulations.

By having both the climate-focused CESM scheme and the weather-focused WRF schemes as options, we can compare their performance across different resolutions and simulation lengths to address whether the CESM scheme is adequate and effective for simulating extreme events at high resolution.
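Both options are bulk aerodynamic schemes that relate surface turbulent fluxes to near-surface air-sea gradients through exchange coefficients; they differ mainly in how those coefficients are obtained. The sketch below is a generic constant-coefficient bulk formula for sensible and latent heat flux, intended only to show what a surface flux scheme produces; neither the CESM nor the WRF scheme actually uses fixed coefficients, and all numbers are illustrative.

```python
# Generic bulk-formula sketch for air-sea turbulent fluxes (illustrative only;
# the CESM and WRF schemes compute stability-dependent exchange coefficients
# rather than the fixed values assumed here).
RHO_AIR = 1.22      # air density, kg m-3
CP_AIR = 1004.0     # specific heat of air, J kg-1 K-1
LV = 2.5e6          # latent heat of vaporization, J kg-1
CH = CE = 1.2e-3    # assumed (constant) heat/moisture exchange coefficients

def turbulent_fluxes(wind_speed, t_air, t_sst, q_air, q_sat_sst):
    """Sensible and latent heat fluxes (W m-2), positive from ocean to atmosphere."""
    sensible = RHO_AIR * CP_AIR * CH * wind_speed * (t_sst - t_air)
    latent = RHO_AIR * LV * CE * wind_speed * (q_sat_sst - q_air)
    return sensible, latent

# Example: 8 m/s wind, 1.5 K air-sea temperature difference, 3 g/kg humidity deficit.
print(turbulent_fluxes(8.0, 298.0, 299.5, 0.017, 0.020))
```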

The R-CESM component models and their connectivities for the CESM and WRF surface flux schemes are shown in the figure above. Note that CLM is the standard choice for the land model in R-CESM irrespective of the choice of surface flux scheme. The major change made to the original ROMS and WRF source codes is the addition of wrapper code that allows them to interface with the CESM2/CIME framework. For ROMS this includes: i) bypassing surface forcing input and the surface flux calculation within ROMS; ii) sending sea surface temperature (SST) and surface currents to CIME; and iii) receiving surface fluxes and surface pressure from CIME. For WRF using the standard CESM flux scheme, this includes: i) bypassing the calculation of surface and turbulent mixing fluxes within WRF; ii) sending temperature, humidity, and winds on the lowest model level, surface downward radiative fluxes, and surface pressure to CIME; and iii) receiving surface and turbulent mixing fluxes from CIME. CIME computes the atmosphere-ocean fluxes, whereas the atmosphere-land fluxes are computed within CLM and then passed to WRF through CIME.
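As a concrete, if simplified, picture of this exchange, the sketch below groups the coupled fields named above into plain Python data classes. The groupings follow the description in the text, but the field names are hypothetical and do not correspond to the actual CIME field dictionary.

```python
from dataclasses import dataclass

# Illustrative field bundles for the exchange described above; the actual
# coupler uses CIME's field dictionary, not these hypothetical names.

@dataclass
class RomsToCoupler:
    sst: float               # sea surface temperature
    u_current: float         # surface zonal current
    v_current: float         # surface meridional current

@dataclass
class CouplerToRoms:
    net_heat_flux: float     # surface fluxes computed by CIME
    freshwater_flux: float
    taux: float              # zonal wind stress
    tauy: float              # meridional wind stress
    surface_pressure: float

@dataclass
class WrfToCoupler:
    t_bot: float             # temperature at the lowest model level
    q_bot: float             # humidity at the lowest model level
    u_bot: float             # zonal wind at the lowest model level
    v_bot: float             # meridional wind at the lowest model level
    sw_down: float           # surface downward shortwave radiation
    lw_down: float           # surface downward longwave radiation
    surface_pressure: float

@dataclass
class CouplerToWrf:
    sensible_heat_flux: float
    latent_heat_flux: float
    taux: float              # surface momentum flux (zonal)
    tauy: float              # surface momentum flux (meridional)
```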

Some of the surface-layer variables required by the WRF physics parameterization schemes (for example, the bulk Richardson number and roughness length required by the planetary boundary layer scheme) were not readily available from the CESM surface scheme, so they have been added as outputs of that scheme. Similar modifications have been made to the atmosphere-land flux calculation in CLM to provide the variables required by the WRF physics parameterization schemes.
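For instance, the bulk Richardson number expected by WRF's planetary boundary layer schemes is a simple stability diagnostic of the surface layer. A minimal formula sketch is given below, assuming the virtual potential temperatures at the surface and at the lowest model level, the height of that level, and the winds there are already available; the guard value for calm winds is an arbitrary illustrative choice.

```python
G = 9.81  # gravitational acceleration, m s-2

def bulk_richardson(thv_sfc, thv_air, z, u, v, u_min=0.1):
    """Bulk Richardson number of the surface layer (dimensionless).

    thv_sfc, thv_air : virtual potential temperature at the surface and at
                       the lowest model level (K)
    z                : height of the lowest model level (m)
    u, v             : wind components at the lowest model level (m s-1)
    """
    wind_sq = max(u * u + v * v, u_min * u_min)   # guard against calm winds
    return G * z * (thv_air - thv_sfc) / (thv_air * wind_sq)

# Example: stable surface layer (air warmer than the surface).
print(bulk_richardson(thv_sfc=288.0, thv_air=289.0, z=30.0, u=3.0, v=1.0))
```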


Coupled Data Assimilation

Generating skillful forecasts of climate variations would be of great value to society. The coupling between the various Earth system components (the ocean, atmosphere, and land) plays a critical role in climate variability. Therefore, the development of coupled data assimilation (CDA) for coupled general circulation models is emerging as an important strategy for improving such forecasts. A state-of-the-art CDA scheme combines observations from different model components with a coupled general circulation model to provide accurate initial conditions for climate forecasts.

Currently, we have implemented an online ensemble Kalman filter (EnKF) based data assimilation capability in the ocean component of R-CESM. We are testing this capability using both pseudo-observations and real observations with a Gulf of Mexico domain configuration at eddy-resolving resolution. In the next phase of this research, we will implement online data assimilation in the atmospheric component of R-CESM and configure a CDA system for the Gulf of Mexico.
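The analysis step at the heart of such an EnKF system is the ensemble-estimated Kalman update. The sketch below is a minimal stochastic (perturbed-observation) EnKF update written with numpy, intended only to illustrate the algorithm; it is not the assimilation code implemented in R-CESM, and the observation operator and error covariances are illustrative.

```python
import numpy as np

def enkf_update(X, y, H, R, rng=None):
    """One stochastic (perturbed-observation) EnKF analysis step.

    X : (n_state, n_ens) forecast ensemble
    y : (n_obs,)         observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs)   observation-error covariance
    """
    rng = np.random.default_rng() if rng is None else rng
    n_state, n_ens = X.shape

    # Ensemble mean and anomalies.
    x_mean = X.mean(axis=1, keepdims=True)
    A = X - x_mean

    # Sample covariances in observation space.
    HA = H @ A
    Pf_Ht = A @ HA.T / (n_ens - 1)        # P_f H^T
    S = HA @ HA.T / (n_ens - 1) + R       # H P_f H^T + R

    # Kalman gain.
    K = Pf_Ht @ np.linalg.inv(S)

    # Perturb the observations so the analysis ensemble retains proper spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T

    # Analysis ensemble.
    return X + K @ (Y - H @ X)

# Tiny example: 3-variable state, 1 observation of the first variable.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 20))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([0.5]), H, R, rng)
```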

Released Source Code

  • The High-Resolution CESM model for Intel processors. Download from the iHESP GitHub page.
  • The High-Resolution CESM model for the Chinese HPC architecture. Download from the iHESP GitHub page.


[1] Small, R. J., Bacmeister, J., Bailey, D., Baker, A., Bishop, S., Bryan, F., ... & Vertenstein, M. (2014). A new synoptic scale resolving global climate simulation using the Community Earth System Model. Journal of Advances in Modeling Earth Systems, 6, 1065–1094. https://doi.org/10.1002/2014MS000363