The Community Earth System Model (CESM) is a coupled climate model for simulating Earth's climate system. Composed of separate models simultaneously simulating the Earth's atmosphere, ocean, land, land-ice, and sea-ice, plus one central coupler component, CESM allows researchers to conduct fundamental research into the Earth's past, present, and future climate states.
The CESM system can be configured in a number of different ways from both a science and a technical perspective. CESM supports numerous resolutions and component configurations. In addition, each model component has input options to configure specific model physics and parameterizations. CESM can be run on a number of different hardware platforms and has a relatively flexible design with respect to the processor layout of components. CESM also supports both an internally developed set of component interfaces and the ESMF-compliant component interfaces (see the Section called BASICS: How do I use the ESMF library and ESMF interfaces? in Chapter 6).
The CESM project is a cooperative effort among U.S. climate researchers. Primarily supported by the National Science Foundation (NSF) and centered at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, the CESM project enjoys close collaborations with the U.S. Department of Energy and the National Aeronautics and Space Administration. Scientific development of the CESM is guided by the CESM working groups, which meet twice a year. The main CESM workshop is held each year in June to showcase results from the various working groups and coordinate future CESM developments among the working groups. The CESM website provides more information on the CESM project, such as the management structure, the scientific working groups, downloadable source code, and online archives of data from previous CESM experiments.
The following are the external system and software requirements for installing and running CESM.
UNIX-style operating system, such as CNL, AIX, or Linux
csh, sh, and perl scripting languages
Subversion client, version 1.4.2 or greater
Fortran (2003 recommended, 90 required) and C compilers; pgi, intel, and xlf are the recommended compilers
MPI (not strictly required when running CESM on a single processor)
Trilinos may be required for certain configurations
LAPACK, or a vendor-supplied equivalent, may also be required for some configurations.
CMake 2.8.6 or newer is required for configurations that include CISM.
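As a quick sanity check before attempting a build, the presence and versions of these tools can be confirmed from a shell; a minimal sketch, assuming the tools are already on your PATH:

```
svn --version --quiet     # Subversion client; expect 1.4.2 or greater
perl --version | head -2  # perl interpreter used by the CESM scripts
cmake --version           # expect 2.8.6 or newer (only needed for CISM)
which mpif90 mpicc        # MPI compiler wrappers, if running in parallel
```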
The following table lists the software versions in use at the time of release; these versions are known to work on the specified hardware.
Table 1-1. Recommended Software Package Versions by Machine
Machine | Version Recommendations |
---|---|
Cray XT Series | pgf95 12.4.0 |
IBM Power Series | xlf 12.1, xlC 10.1 |
IBM Bluegene/P | xlf 12.01, xlC 10.01 |
Linux Machine | ifort, icc (intel64) 12.1.4 |
Caution: NetCDF must be built with the same Fortran compiler as CESM. In the netCDF build, the FC environment variable specifies which Fortran compiler to use. CESM is written mostly in Fortran; netCDF is written in C. Because there is no standard way to call a C program from a Fortran program, the Fortran-to-C layer between CESM and netCDF will vary depending on which Fortran compiler you use for CESM. When a function in the netCDF library is called from a Fortran application, the netCDF Fortran API calls the netCDF C library. If you do not use the same compiler to build netCDF and CESM, you will in most cases get errors from netCDF saying certain netCDF functions cannot be found.
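For example, building netCDF against the same Intel compilers that will later build CESM might look like the following minimal sketch of a classic netCDF autotools build (the install prefix is arbitrary):

```
# Point the netCDF build at the same compilers CESM will use.
export FC=ifort
export CC=icc
./configure --prefix=/usr/local/netcdf-intel
make
make install
```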
Parallel-netCDF, also referred to as pnetcdf, is optional. If you choose to use pnetcdf, version 1.2.0 or later should be used with CESM. It is a library that is file-format compatible with netCDF and provides higher performance by using MPI-IO. Pnetcdf is enabled by setting the PNETCDF_PATH variable in the Macros file. You must also request pnetcdf at runtime via the io_typename argument, which can be set to either "netcdf" or "pnetcdf" for each component.
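As an illustration, enabling pnetcdf might involve the two settings below; the install path is hypothetical, and the runtime variable shown (the PIO_TYPENAME xml variable) should be checked against your CESM version:

```
# In the Macros file for your machine, point to the pnetcdf installation:
#   PNETCDF_PATH := /usr/local/pnetcdf-1.3.1
# Then, from the case directory, request pnetcdf output at runtime:
./xmlchange -file env_run.xml -id PIO_TYPENAME -val pnetcdf
```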
CESM consists of seven geophysical models: atmosphere (atm), sea-ice (ice), land (lnd), river-runoff (rof), ocean (ocn), land-ice (glc), and ocean-wave (wav - stub only), plus a coupler (cpl) that coordinates the time evolution of the geophysical models and passes information between them. Each model may have "active," "data," "dead," or "stub" component versions, allowing for a variety of "plug and play" combinations.
During the course of a CESM run, the model components integrate forward in time, periodically stopping to exchange information with the coupler. The coupler receives fields from the component models, computes, maps, and merges this information, then sends the fields back to the component models. The coupler brokers this sequence of communication interchanges and manages the overall time progression of the coupled system. A CESM component set consists of eight components: one from each model (atm, lnd, rof, ocn, ice, glc, and wav) plus the coupler. Model components are written primarily in Fortran 90/95/2003.
The active (dynamical) components are generally fully prognostic, and they are state-of-the-art climate prediction and analysis tools. Because the active models are relatively expensive to run, data models that cycle input data are included for testing, spin-up, and model parameterization development. The dead components generate scientifically invalid data and exist only to support technical system testing. The dead components must all be run together and should never be combined with any active or data versions of models. Stub components exist only to satisfy interface requirements when the component is not needed for the model configuration (e.g., the active land component forced with atmospheric data does not need ice, ocn, or glc components, so ice, ocn, and glc stubs are used).
The CESM components can be summarized as follows:
Model Type | Model Name | Component Name | Type | Description |
---|---|---|---|---|
atmosphere | atm | cam | active | The Community Atmosphere Model (CAM) is a global atmospheric general circulation model developed from the NCAR CCM3. |
atmosphere | atm | datm | data | The data atmosphere component is a pure data component that reads in atmospheric forcing data |
atmosphere | atm | xatm | dead | |
atmosphere | atm | satm | stub | |
land | lnd | clm | active | The Community Land Model (CLM) is the result of a collaborative project between scientists in the Terrestrial Sciences Section of the Climate and Global Dynamics Division (CGD) at NCAR and the CESM Land Model Working Group. Other principal working groups that also contribute to the CLM are Biogeochemistry, Paleoclimate, and Climate Change and Assessment. |
land | lnd | dlnd | data | The data land component no longer has data-runoff functionality. It now runs as a pure data-land component, reading in coupler history data for atm/land fluxes and land albedos produced by a previous run. |
land | lnd | xlnd | dead | |
land | lnd | slnd | stub | |
river-runoff | rof | rtm | active | The river transport model (RTM) was previously part of CLM and was developed to route total runoff from the land surface model to either the active ocean or marginal seas, which enables the hydrologic cycle to be closed (Branstetter 2001, Branstetter and Famiglietti 1999). This is needed to model ocean convection and circulation, which is affected by freshwater input. |
river-runoff | rof | drof | data | The data runoff model was previously part of the data land model and functions as a purely data-runoff model (reading in runoff data). |
river-runoff | rof | xrof | dead | |
river-runoff | rof | srof | stub | |
ocean | ocn | pop | active | The ocean model is an extension of the Parallel Ocean Program (POP) Version 2 from Los Alamos National Laboratory (LANL). |
ocean | ocn | docn | data | The data ocean component has two distinct modes of operation. It can run as a pure data model, reading ocean SSTs (normally climatological) from input datasets, interpolating in space and time, and then passing these to the coupler. Alternatively, docn can compute updated SSTs based on a slab ocean model where bottom ocean heat flux convergence and boundary layer depths are read in and used with the atmosphere/ocean and ice/ocean fluxes obtained from the coupler. |
ocean | ocn | xocn | dead | |
ocean | ocn | socn | stub | |
sea-ice | ice | cice | active | The sea-ice component (CICE) is an extension of the Los Alamos National Laboratory (LANL) sea-ice model and was developed through collaboration within the CESM Polar Climate Working Group (PCWG). In CESM, CICE can run as a fully prognostic component or in prescribed mode where ice coverage (normally climatological) is read in. |
sea-ice | ice | dice | data | The data ice component is a partially prognostic model. The model reads in ice coverage and receives atmospheric forcing from the coupler, and then it calculates the ice/atmosphere and ice/ocean fluxes. The data ice component acts very similarly to CICE running in prescribed mode. |
sea-ice | ice | xice | dead | |
sea-ice | ice | sice | stub | |
land-ice | glc | cism | active | The CISM component is an extension of the Glimmer ice sheet model. |
land-ice | glc | sglc | stub | |
ocean-wave | wav | xwav | dead | Support for a separate ocean wave component has been added to the system. At the present time, only stub and dead versions of the wave model are available in this release. Development of a prognostic wave model is underway, and it may be added to the system at some future time. |
ocean-wave | wav | swav | stub | |
coupler | cpl | cpl | active | The CESM coupler was built primarily through a collaboration of the NCAR CESM Software Engineering Group and the Argonne National Laboratory (ANL). The MCT coupling library provides much of the infrastructure. |
The CESM components can be combined in numerous ways to carry out various scientific or software experiments. A particular mix of components, along with component-specific configuration and/or namelist settings, is called a component set or "compset." CESM has a shorthand naming convention for the component sets that are supported out-of-the-box.
A compset name usually has a well-defined first letter followed by characters indicative of the configuration setup, and each compset name has a corresponding short name. Users are not limited to the predefined component set combinations and may define their own component sets.
See supported component sets for a complete list of supported compset options. Running create_newcase with the "-list" option will also list the supported out-of-the-box component sets for the local version of CESM, as in the example below.
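For example, from the scripts directory of a CESM source tree:

```
# Show the out-of-the-box compsets (and other valid settings, such as
# grids and machines) known to this local CESM version:
./create_newcase -list
```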
In general, the first letter of a compset name indicates which components are used. An exception to this rule is the use of "G" as a second letter to indicate use of the active glc model, CISM.
Grids are specified in CESM by setting an overall model resolution. Once the overall model resolution is set, each component reads in the appropriate grid files and the coupler reads in the appropriate mapping-weights files. Coupler mapping weights are always generated externally in CESM. The components send their grid data to the coupler at initialization, and the coupler checks that the component grids are consistent with each other and with the mapping-weights files.
In CESM1.2, the ocean and ice must be on the same grid, but the atmosphere, land, and river-runoff components can each be on different grids. Each component determines its own unique grid decomposition based upon the total number of pes assigned to that component, as sketched below.
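For example, in a created case the pe counts that drive each component's decomposition are set in env_mach_pes.xml and can be changed with xmlchange (variable names are those of the CESM1.x scripts; the values are illustrative):

```
# Decompose the atmosphere over 128 MPI tasks and the ocean over 64;
# each component computes its own decomposition from its assigned pes.
./xmlchange -file env_mach_pes.xml -id NTASKS_ATM -val 128
./xmlchange -file env_mach_pes.xml -id NTASKS_OCN -val 64
```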
CESM supports several types of grids out-of-the-box, including single point, finite volume, spectral, cubed sphere, displaced pole, and tripole. The page Conservative Remapping on Spherical Grids illustrates a number of these grid types. These grids are used internally by the models. Input datasets are usually on the same grid, but in some cases they can be interpolated from regular lon/lat grids in the data models. The finite volume and spectral grids are generally associated with the atmosphere and land models, but the data ocean and data ice models are also supported on those grids. The cubed sphere grid is used only by the active atmosphere model, cam, while the displaced pole and tripole grids are used by the ocean and ice models. Not every grid can be run by every component. The ocean and ice models run on either a Greenland dipole or a tripole grid. The Greenland pole grid is a latitude/longitude grid with the North Pole displaced over Greenland to avoid singularity problems in the ocn and ice models; the low-resolution Greenland pole mesh from CCSM3 is illustrated in Figure 1b of Yeager et al. (2006), "The Low-Resolution CCSM3." Similarly, the tripole grid is a latitude/longitude grid with three poles that are all centered over land.
CESM1.2 has a completely new naming convention for model resolutions. Using this naming convention, the complete list of currently supported grid resolutions can be viewed on the supported resolutions page.
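For instance, a supported resolution is requested by passing its alias to create_newcase. The sketch below uses f19_g16, an alias that pairs the 1.9x2.5 finite volume atmosphere/land grid with the gx1v6 displaced Greenland pole ocean/ice grid; the case name, compset, and machine are illustrative:

```
# The -res alias selects consistent grids for all components at once.
./create_newcase -case ~/cases/f19_test -compset B_1850 -res f19_g16 -mach yellowstone
```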
Scripts for supported machines and userdefined machines are provided with the CESM release. Supported machines have machine-specific files and settings added to the CESM scripts and should run CESM cases out-of-the-box. Machines are supported in CESM on an individual basis and are usually listed by their common site-specific name. To get a machine ported and functionally supported in CESM, local batch, run, environment, and compiler information must be configured in the CESM scripts. The machine name "userdefined" refers to any machine that the user defines; it requires the user to edit the resulting xml files to fill in the information required for the target platform, as in the sketch below. This functionality is handy for accelerating the porting process and quickly getting a case running on a new platform. For more information on porting, see Chapter 5. The list of available machines is documented in CESM supported machines. Running create_newcase with the "-list" option will also show the list of available machines for the current local version of CESM. Supported machines have undergone the full CESM porting process. The machines available in each of these categories change as access to machines changes over time.
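For a platform that is not yet supported, a port might begin from the userdefined machine; a hypothetical sketch (the case name, compset, and resolution are arbitrary; the X compset of all-dead components is commonly used for port testing):

```
# Create a skeleton case against the generic machine definition:
./create_newcase -case ~/cases/port_test -compset X -res f19_g16 -mach userdefined
# Then edit the xml files (e.g., env_build.xml, env_run.xml) and the
# Macros file in the new case directory to supply compiler, MPI, and
# batch details for the target platform.
```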
Although CESM can be run out-of-the-box for a variety of resolutions, component combinations, and machines, most combinations of component sets, resolutions, and machines have not undergone rigorous scientific climate validation. Control runs accompany the "scientifically supported" component sets and resolutions and are documented on the release page. These control runs should be scientifically reproducible on the original platform or on other platforms. Bit-for-bit reproducibility cannot be guaranteed due to variations in compiler or system versions. Users should carry out their own validations on any platform prior to doing scientific runs or scientific analysis and documentation.