Chapter 5. Porting and Validating CESM on a new platform
Once a case is running, the local setup for that case can be converted into a specific set of machine files, so that future cases can be set up using your local machine name rather than "userdefined" and should run out-of-the-box without going through step 1 above. To do this, you will need to add and modify files in the directory $CCSMROOT/scripts/ccsm_utils/Machines. This section describes how to add that out-of-the-box support for your machine to the CESM scripts.
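Concretely, the steps below edit or create the following files under $CCSMROOT/scripts/ccsm_utils/Machines (the "wilycoyote" name is the example machine name chosen below):

$CCSMROOT/scripts/ccsm_utils/Machines/
    config_machines.xml             (edit: add an entry for wilycoyote)
    config_compilers.xml            (edit: add wilycoyote-specific compiler settings)
    env_mach_specific.wilycoyote    (new: copied from the test1 case)
    mkbatch.wilycoyote              (new: copied from a similar supported machine)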
Pick a name that will be associated with your machine. Generally, this will be identical to the name of your machine, but it could be anything. "wilycoyote" will be used in the description to follow. It is also helpful to identify as a starting point one or more supported machines that are similar to your machine. To add wilycoyote to the list of supported machines, do the following:
Edit config_machines.xml and add a section for "wilycoyote". You can simply copy one of the existing entries and then edit it. The machine-specific environment variables that need to be set in config_machines.xml for wilycoyote are already set in the env files of the test1 case directory that you created from the userdefined machine in Step 1 above; carry those values over into the config_machines.xml section for wilycoyote. While the compiler options for a given compiler are fairly consistent across machines, the way the compiler is invoked and the local paths for libraries are not. Several variables are set here; their definitions can be found in the env_build.xml, env_run.xml and env_mach_pes.xml files. Some of the important ones are:

- MACH, which should be set to wilycoyote.
- EXEROOT, which should be set to a generic working directory such as /tmp/scratch/$CCSMUSER/$CASE that is shared by, and writable from, all compute nodes.
- DIN_LOC_ROOT, which should be set to the path of the CESM inputdata directory (readable from all compute nodes).
- BATCHQUERY and BATCHJOBS, which specify the query and submit command lines for batch jobs and are used to chain jobs together in production.
- MAX_TASKS_PER_NODE, which sets the maximum number of tasks allowed on each hardware node.
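The exact layout of a machine entry should be copied from an existing entry in config_machines.xml rather than typed from scratch; the fragment below is only an illustrative sketch for wilycoyote, restricted to the variables discussed above, with hypothetical paths and batch commands.

<machine MACH="wilycoyote">
  <!-- Illustrative values only; start from a copy of a similar machine's
       entry and carry over the values that worked in the test1 case. -->
  <EXEROOT>/tmp/scratch/$CCSMUSER/$CASE</EXEROOT>
  <DIN_LOC_ROOT>/shared/data/cesm/inputdata</DIN_LOC_ROOT>
  <BATCHQUERY>qstat</BATCHQUERY>
  <BATCHJOBS>qsub</BATCHJOBS>
  <MAX_TASKS_PER_NODE>16</MAX_TASKS_PER_NODE>
</machine>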
Edit config_compilers.xml to translate the additions you made to the Macros file in the test1 case into "wilycoyote"-specific settings.
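As with config_machines.xml, it is safest to copy the block of a similar supported machine and edit it. The fragment below is only a sketch, assuming the GNU compilers and a hypothetical netCDF installation path; take the actual element names and values from the block you copy and from the Macros settings that worked in the test1 case.

<compiler MACH="wilycoyote" COMPILER="gnu">
  <!-- Hypothetical library location; use the paths that worked in the
       test1 case's Macros file. -->
  <NETCDF_PATH>/usr/local/netcdf</NETCDF_PATH>
  <ADD_SLIBS>-L$(NETCDF_PATH)/lib -lnetcdff -lnetcdf</ADD_SLIBS>
</compiler>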
Create an env_mach_specific.wilycoyote file. This should be a copy of the env_mach_specific file from the test1 case directory in Step 1 above:
> cd $CCSMROOT/scripts/test1
> cp env_mach_specific $CCSMROOT/scripts/ccsm_utils/Machines/env_mach_specific.wilycoyote
Create an mkbatch.wilycoyote file. The easiest way to do this is to find the supported machine closest to your machine and copy its mkbatch file to mkbatch.wilycoyote. Then edit mkbatch.wilycoyote to match the changes made to the test1.userdefined.run file in the test1 case in Step 1. In particular, the batch commands and the job launching will probably need to be changed. The batch commands and setup are in the first section of the script; the job launching can be found by searching for the string "CSM EXECUTION".
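For example, assuming yellowstone is the supported machine closest to yours (yellowstone is used here purely as a placeholder; substitute whichever machine you actually chose), the copy would be:

> cd $CCSMROOT/scripts/ccsm_utils/Machines
> cp mkbatch.yellowstone mkbatch.wilycoyote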
Test the new machine setup. Create a new case based on test1 using the wilycoyote machine setup:
> cd $CCSMROOT/scripts
> create_newcase -case test1_wilycoyote \
                 -res f45_g37 \
                 -compset X \
                 -mach wilycoyote
> cd test1_wilycoyote
> ./cesm_setup
> ./test1_wilycoyote.build
> qsub test1_wilycoyote.run
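Once the batch job finishes, a quick sanity check is to confirm that the coupler log ends with its normal successful-termination message. The log location depends on your RUNDIR/EXEROOT settings; the path below assumes the example EXEROOT shown earlier and a run subdirectory beneath it:

> grep "SUCCESSFUL TERMINATION" /tmp/scratch/$CCSMUSER/test1_wilycoyote/run/cpl.log.*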