
NOAA Environmental Modeling System

Running NEMS

This page is meant to guide you in checking out NEMS and running experiments for the different models within its framework. If you are interested in starting out with just regression tests, please refer to 'Getting Started'.

Libraries & Dependencies

Currently, NEMS requires the following external libraries to be installed on the system in order to run properly.
  • ESMF version 6.3.0r is supported. Available from the ESMF web site.
  • netcdf - available from the UNIDATA website.
  • The following libraries are from NCEP and are available via /nwprod/lib:
    • bacio
    • w3
    • sp
    • nemsio
      • The following will be used only when Post is set to be used:
        • g2
        • g2tmpl
        • sigio, sfcio
        • nceppost - /global/save/Shrinivas.Moorthi/nceppost_moorthi/sorc/ncep_post.fd
        • png, jasper - jasper is the library to make jpeg2000 packing
The last two libraries (png and jasper) may already be installed on your Linux system, though not necessarily by default. PNG in turn requires zlib, which is almost always installed, at least on Linux. Both are required to compile the g2 (GRIB2) library.
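For context, this dependency chain shows up in the link order when building code against g2 (the compiler invocation below is illustrative, not an actual NEMS build line):

```shell
# g2 depends on jasper (JPEG2000 packing) and png, and png depends on zlib,
# so dependents precede their dependencies on the link line (illustrative):
gfortran myprog.f90 -lg2 -ljasper -lpng -lz -o myprog
```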

Check Out and Build NEMS:

If you are interested in just running regression tests, please refer to 'Getting Started'. Otherwise, continue on below.

Process to check out and build NEMS (Action / Active Directory / Command):

1. Check out NEMS.
   Active Directory: /**/save/Firstname.Lastname (WCOSS)
                     /scratch*/portfolios/NCEPDEV/**/save/Firstname.Lastname (Zeus)
                     (* = 1 or 2; ** = global, meso, etc.)
   Command: svn checkout

2. Configure using the version of the ESMF library that will be used (if running, this will be done automatically). Make note of machine-specific options.
   Active Directory: .../trunk/src
   Command:
     ESMF 3.1.0rp2:
       ./configure 3_wcoss
       ./configure 3_zeus (for the ESMF 3.1.0r series library)
       ./configure 3_yellowstone
     ESMF 6.3.0r:
       ./configure 6.3r_nmm_wcoss
       ./configure 6.3r_gsm_wcoss
       ./configure 6_nmm_zeus
       ./configure 6_gsm_zeus
       ./configure 6_yellowstone
     ESMF with reference NUOPC layer (currently ESMF 7.0.0 beta snapshot):
       ./configure nuopc_zeus
       ./configure nuopc_gaea

3. Source the appropriate module file. Note the machine and ESMF version needed.
   Active Directory: .../trunk/src
   Command:
     source conf/modules.nems.Gaea_ESMF_NUOPC
     source conf/modules.nems.wcoss_ESMF_3
     source conf/modules.nems.wcoss_ESMF_630rAPI_gsm
     source conf/modules.nems.wcoss_ESMF_630rAPI_nmm
     source conf/modules.nems.Zeus_ESMF_310rAPI
     source conf/modules.nems.Zeus_ESMF_630rAPI_gsm
     source conf/modules.nems.Zeus_ESMF_630rAPI_nmm
     source conf/modules.nems.Zeus_ESMF_NUOPC

4. Clean any previous compilations.
   Active Directory: .../trunk/src
   Command: gmake clean

5. Compile and build the core(s) needed (/job/tests/ will build the default settings - NMM and GFS with GOCART).
   Active Directory: .../trunk/src
   Command:
     NMM & GSM: gmake nmm_gsm
     GEN only: gmake gen
     GSM & GEN w/ GOCART & post: gmake gsm_gen_post GOCART_MODE=full
     NMM & GSM w/ GOCART: gmake nmm_gsm GOCART_MODE=full
     NMM only: gmake nmm
     NMM with post: gmake nmm_post
     GSM only: gmake gsm
     GSM with post: gmake gsm_post
     GSM w/ GOCART: gmake gsm GOCART_MODE=full
In order to build NEMS with ocean components, all of the above targets support the optional OCN variable. Supported values are: "dummy", "hycom", and "mom5".

In order to build NEMS with sea-ice components, all of the above targets support the optional ICE variable. Supported values are: "dummy" and "cice".

Specifying multiple values is supported, e.g. "gmake nmm_gsm OCN=mom5,dummy,hycom ICE=dummy".
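Putting the table together, a single build session might look like the following (WCOSS, ESMF 6.3.0r, GSM with GOCART and post, dummy ocean and ice; the particular combination is illustrative):

```shell
# Illustrative build session; paths are relative to the checked-out NEMS trunk.
cd trunk/src

./configure 6.3r_gsm_wcoss                          # machine/ESMF configuration
source conf/modules.nems.wcoss_ESMF_630rAPI_gsm     # matching module file
gmake clean                                         # clear any previous build
gmake gsm_post GOCART_MODE=full OCN=dummy ICE=dummy # GSM + post, dummy ocean/ice

ls ../exe/NEMS.x                                    # the executable appears in /exe
```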

The executable, NEMS.x, will appear in /exe.

How To Run NMM-B

NEMS NMM-B documentation, code, and how-to can be found on the DTC | NEMS-NMMB Users' Page.

Information regarding the NMM-B Launcher can be found under the 'Launcher' tab of this website. There is also an evolving draft of Launcher documentation here.

How to Run GFS


Currently, the scripts to run the GFS with NEMS are in flux. Follow the temporary instructions below to run, but bear in mind you must have a Subversion account to follow these directions.

GFS Parallel Scripts:

All script work is being done in the GFS branch, gfs4nems. Use "--ignore-externals" when checking out the branch unless you also want to check out all of the externals. Also, if you have not been tasked to work on any of the branches mentioned in this section, please do not commit anything to them.

Please note that work will continue to be committed to this branch, so you will want to follow its progress via ticket #184.

Config files:

para_config_T574 and para_config_T1534 in the gfs4nems branch now run semi-Lagrangian.

Note the new $NEMS variable near the top. When NEMS=YES, it sets DYNVARS and PHYVARS for NEMS.x and when NEMS=NO, it sets FCSTVARS for global_fcst.
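As a sketch, the toggle amounts to shell logic like the following (the variable contents are placeholders, not the real DYNVARS/PHYVARS/FCSTVARS settings):

```shell
# Hypothetical sketch of the $NEMS toggle near the top of a para_config file.
# The namelist strings below are placeholders, not the real settings.
NEMS=YES

if [ "$NEMS" = "YES" ]; then
    # NEMS path: namelist overrides split into dynamics and physics variables
    DYNVARS="placeholder_dynamics_settings"
    PHYVARS="placeholder_physics_settings"
    FCSTEXEC=NEMS.x
else
    # legacy path: a single FCSTVARS string for global_fcst
    FCSTVARS="placeholder_forecast_settings"
    FCSTEXEC=global_fcst
fi

echo "$FCSTEXEC"
```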

To run either configuration, please do the following:
  • For T1534 runs: be on phase 2. T1534 semi-lag only runs on phase 2, so CUE2RUN=dev2, use stmpp2, ptmpp2, etc.
  • Check out your own copy of the gfs4nems GFS branch or point to a friend's copy.
  • Build the code from the top of the branch ('sh -c gfs'). You will not use global_fcst, but you will want global_chgres.
  • Grab either para_config_T1534 or para_config_T574 from the gfs4nems branch, as well as the rlist.
  • Change BASEDIR in the config to point to your copy of the gfs4nems branch.
  • Change BASE_NEMS in the config to point to a recent copy of the NEMS trunk. NOTE: You will want to change the gsm external link in the copy of NEMS to use Moorthi's gsm branch.
  • Use global_chgres (code and script) in the gfs4nems branch to convert sigio initial conditions to nemsio.
  • Submit your run just like past parallel runs (PSUB CONFIG CDATE CDUMP CSTEP) and make sure to use psub from the branch.
GSM Code:

All model code work is being done in the GSM branch, gsm_slg. All questions about the code should be directed to Moorthi. How to build the current NEMS/GSM code:
  1. Check out the NEMS trunk and make sure NOT to ignore externals. You want to grab the GSM code through the external under /src/atmos.
  2. Update the GSM external under /src/atmos to point to Moorthi's branch if you want to run semi-Lagrangian. Leave the external alone if trying Eulerian first.
    1. Type "svn propedit svn:externals"
      (Seek help if you are unsure about how to do this.)
    2. Type "svn update" after changing the external. That will change the contents of the GSM directory under /src/atmos to Moorthi's branch.
  3. To build, do the following under /src:
    1. Type "./configure ESMF_OPTION". (Typing just "./configure" will show options, but you will want 6.3r_gsm_wcoss for the ESMF_OPTION for now.)
    2. Type "source conf/modules.nems".
    3. Type "gmake gsm".
    4. You will get a NEMS.x in the /exe directory. Use that instead of global_fcst.
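Condensed into one session, the checkout-and-build flow above looks like this (the trunk URL is left as a placeholder, since it is not given here; the external edit is only needed for semi-Lagrangian runs):

```shell
# Condensed NEMS/GSM build flow on WCOSS; mirrors the numbered steps above.
svn checkout <NEMS-trunk-URL> nems   # keep externals so /src/atmos/gsm is populated

cd nems/src/atmos
svn propedit svn:externals .         # semi-Lagrangian only: point gsm at Moorthi's branch
svn update                           # refresh /src/atmos/gsm from the new external

cd ..
./configure 6.3r_gsm_wcoss           # "./configure" alone lists the options
source conf/modules.nems
gmake gsm

ls ../exe/NEMS.x                     # use this instead of global_fcst
```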

The global_chgres code and script within the gfs4nems GFS branch is new and working. You should build and use that copy to convert your sigio ICs to nemsio. There are a number of wrapper scripts floating around that can be used to run chgres offline. Feel free to use one you already have or try this one, /gpfs/td1/emc/global/noscrub/Kate.Howard/para_gfs/data_4_gfs4nems/

  1. You can NOT currently convert nemsio back to sigio. Keep that in mind!
  2. Since we can't currently convert nemsio files to other resolutions, you will need to only run 1 segment forecasts for now. The para_config files have fseg=1 to reflect this.
  3. Make sure to go through the script to edit the settings to your preferences. For OUTTYP=1 (nemsio), you need GFSOUT instead of SIGOUT.
  4. For nemsio, the siganl file is named gfsanl. The scripts are set up to look for gfsanl if NEMSIO_IN=.true. and siganl if SIGIO_IN=.true.
  5. The sfcanl file does not change, but needs to be converted as well.
  6. Most likely, nemsio files will be used in binary format. You will notice GRDFMT='bin4' in the CHGRESVARS within the scripts.
  7. A new variable, NEMS, near the top of the config file needs to be set to 'YES' to run NEMS/GSM. Setting it to 'YES' tells the config to use BASE_NEMS, NEMS.x, and DYNVARS/PHYVARS instead of global_fcst and FCSTVARS.
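The analysis-file naming convention in note 4 amounts to logic along these lines (a standalone sketch; the real scripts are more involved):

```shell
# Sketch of how the scripts pick the atmospheric analysis file name
# from the input-format flags (note 4 above).
NEMSIO_IN=.true.
SIGIO_IN=.false.

if [ "$NEMSIO_IN" = ".true." ]; then
    ANLFILE=gfsanl      # nemsio input: look for gfsanl
elif [ "$SIGIO_IN" = ".true." ]; then
    ANLFILE=siganl      # sigio input: look for siganl
fi

echo "$ANLFILE"
```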

The nemsio_get utility is used for nemsio files, just as global_sighdr is used for sigio files. You can use the following copy on Tide, /nwprod/ngac.v1.0.0/exec/nemsio_get. All questions about nemsio_get should be directed to Jun Wang.

Status of Eulerian and semi-Lagrangian:

The current NEMS trunk has been successfully used to run Eulerian sigio to nemsio forecasts (SIGIO_IN=.true. and NEMSIO_OUT=.true.) for all supported resolutions under T878. Eulerian nemsio to nemsio (NEMSIO_IN=.true. and NEMSIO_OUT=.true.) has also been successful for T574. The gsm_slg branch has just started to be used to run semi-Lagrangian T1534, with a few glitches that will hopefully be remedied soon.

On Zeus:

A suite of scripts and sample initial conditions are available for running the NEMS GFS on Zeus. The code is located in /scratch2/portfolios/NCEPDEV/global/save/glopara/TUTORIAL. It is recommended to start with the lower-resolution tutorial, tutorlow. It is also highly recommended that you read the associated README file (README_tutorlow) before starting; it contains all the information needed to run. Below are the basic steps from the README file:
  1. Setup your experiment directories, including EXPDIR, ROTDIR, scratch space, and ARCDIR.
  2. Populate ROTDIR with initial conditions (starting at both the gdas fcst1 and gdas efmn steps). Sample initial conditions are provided in /scratch2/portfolios/NCEPDEV/global/noscrub/glopara/TUTORIAL/ICs/low_res/.
  3. Change configuration file as needed. It is suggested that you leave most things set as is for your first run. Just double check on EXPDIR, ROTDIR, and ACCOUNT settings. Once your first run is complete feel free to change what you wish.
  4. Submit your run.
  5. Follow your run. Once you have submitted your first job a runlog will appear in your EXPDIR. Each time a step in the parallel system is submitted, begun, failed, or ended it will be noted in the runlog.
  6. Check out your output.
One of the most important files within your output is the dayfile ($PSLOT$CDATE$CDUMP$CSTEP.dayfile or $PSLOT$CDATE$CDUMP$CSTEP_#.dayfile).

There is a dayfile for each step of the run (each CDATE, each CDUMP [gdas or gfs], each job/step, and each rerun of that step [some jobs]). The dayfile will contain information on what occurred during that step of the run and is a great tool for both learning and troubleshooting. Pick a dayfile and take some time to explore it. Whenever you have a failure in your run you should get used to looking at the bottom of the dayfile first, to see what may have gone wrong.
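A quick way to see the dayfile naming pattern (the values here are made up):

```shell
# Constructing a dayfile name from the run variables (illustrative values).
PSLOT=prtest
CDATE=2014010100
CDUMP=gdas
CSTEP=fcst1

DAYFILE=${PSLOT}${CDATE}${CDUMP}${CSTEP}.dayfile
echo "$DAYFILE"
```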

Contributors: Eugene Mirvis, Tom Black, Ed Colon, Jun Wang