NOAA Environmental Modeling System

Launcher



Introduction


The NEMS Launcher was developed to provide a user-friendly way to set up and run experiments with the regional NEMS NMM-B. When the appropriate settings are provided, Launcher output is comparable to that of operations. This page explains how to use the NEMS Launcher, which runs on the WCOSS machines (Tide or Gyre) and on Zeus, with various limitations depending on the platform. The state of the Launcher on WCOSS is in flux; for reference, please consult the evolving draft of the Launcher documentation.

Before continuing, some information:
  • Links on the right provide navigation to specific sections of Launcher information, including how to run the Launcher.
  • This page is for internal NCEP users, i.e., those who can access the WCOSS machines (Tide/Gyre) or Zeus and have a Subversion account.
  • To check out the Launcher from Subversion, use 'svn checkout https://svnemc.ncep.noaa.gov/projects/launcher/trunk/nmmb/wcoss' or 'svn checkout https://svnemc.ncep.noaa.gov/projects/launcher/trunk/nmmb/zeus'.
  • References to 'run/run_nmmb*' mean any of the run scripts located in the Launcher /run directory.
  • If at any time you are confused and cannot find the information you need, please email for help.

Current Status:

  • The Loadleveler queue submission environment has been changed to PBS on Zeus and LSF on WCOSS.
  • Step dependencies are keyed off of PBS "flat chain" process IDs on Zeus, and are handled directly at the end of process launcher task scripts on WCOSS.
  • A number of routines (metgrid, graphics, verification, BUFR processing, and plot soundings) have been "serialized" due to less-than-optimal performance of poescript job submissions.
  • Launcher steps currently non-functional:
    • WCOSS: Grid-to-Grid verification.
    • Zeus: All GemPak dependent steps.

Capabilities:

The NEMS Launcher is capable of performing the following tasks:
  • Retrieve needed input files (NAM or GFS grib files).
  • Run the NEMS Preprocessing System (NPS) to derive boundary and initial condition files.
  • Use the cold start utility to substitute the land states with those used in the production NAM model, providing consistent initial conditions between the experiment (NMM-B) and the control (NAM) as needed for graphical comparisons and validation.
  • Run GSI to provide a means for data assimilation using inputs derived from NDAS/GDAS, satellite, radar, and other sources.
  • Run post-processing utilities needed for the rendering of forecast graphics and verification.
  • Run the NCEP verification package.
  • Invoke the NCEP Grid-to-Grid utility in order to derive key forecast verification statistics from gridded verification and analysis data sets.
  • Run the FVS verification plotting package used to render the user-requested statistics in graphical form.
  • Produce surface fields and vertical profiles from model forecasts at over 1300 station locations in BUFR format, which are then converted into surface and sounding GEMPAK formatted files.
  • Run the GEMPAK utility to produce sounding plots of forecast temperature, moisture, and winds against rawinsonde observations.
  • Run supplemental utilities: scripts to clear disk space from old Launcher runs, GrADS scripts to produce multi-level plots of 3D fields over small plot domains, GrADS scripts for plotting vertical cross sections, and FVS verification scripts to produce combined precipitation and meteorological verification from a series of Launcher runs.

The flowchart below provides a visual guide to the process the Launcher uses to run the regional NMM-B and create output files.



The launcher has the ability to run on different regional domains, which can be specifically tailored for the user's needs. The most frequently used domains are:
  • The 12-km "air quality" (AQ) domain, which covers roughly half the area of the operational NAM domain (most of North America). Sample 2-m forecast temperatures over this grid can be viewed here.
  • The 4-km domain covering the contiguous US (CONUS) that is nested within the 12-km AQ domain, a sample of which can be viewed here.
  • A standalone 4-km domain run over CONUS (settings for 3-km CONUS domain also available).
  • Small, easy-to-relocate "tiny" domains that can be run on a single processor, which are useful for physics testing and development. The size of these domains can be easily adjusted by the user and is usually limited to less than 100 grid points in the zonal and meridional directions.
The Launcher user can run on any arbitrary domain based on the grid settings specified in the text files located in the directory /run/domains. As an example, the user may decide to customize the domain by setting these key parameters:
  • Keep nmm_domains=1 unless nesting is wanted.
  • LNSH=${LNSH-5} - defined boundary width (default is 5)
  • LNSV=${LNSV-5} - defined boundary width (default is 5)
  • nx=${nx-(user defined)} - number of grid elements in the x direction
  • ny=${ny-(user defined)} - number of grid elements in the y direction
  • nz=${nz-(user defined)} - number of grid elements in the z direction
  • dx=${dx-(user defined)} - grid element size in the x direction (fraction of a degree)
  • dy=${dy-(user defined)} - grid element size in the y direction (fraction of a degree)
  • cenlat=${cenlat-(user defined)} - Default is new NAM center point at 54.0
  • cenlon=${cenlon-(user defined)} - Default is new NAM center point at -106.0
  • dt_whole=${dt_whole-(user defined)} - Default is to use the same dt as in NAM (26 2/3 s)
  • dt_num=${dt_num-(user defined)}
  • dt_denom=${dt_denom-(user defined)}
Changes in the text files are allowable as long as they are physically consistent: for example, the time step selected must correspond to the specified grid element size, and the domain location and size must remain consistent with the input source grib files [usually the NAM grid 221 input files (32-km, North America domain) or the GFS global domain]. The Launcher run can also be initialized from an existing regional NMM-B native restart file, although this requires some work on the part of the user to invoke. The flag RESTARTR in the /run run scripts allows one to restart the model from a nonzero forecast hour, and RESTH allows the post processing to start at that hour.
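
The parameter list above can be collected into a domain text file. Below is a hypothetical example of such a file for a small "tiny"-style test domain; the ${var-default} idiom and the parameter names come from the list above, but the specific values (97x97x60 grid, 0.108-degree spacing, etc.) are illustrative only and not taken from an actual /run/domains file:

```shell
#!/bin/ksh
# Hypothetical custom domain file (illustrative values only).
# The ${var-default} form keeps any value already exported by the
# calling run script and falls back to the default otherwise.
nmm_domains=1                 # keep =1 unless nesting is wanted
LNSH=${LNSH-5}                # boundary width (default is 5)
LNSV=${LNSV-5}                # boundary width (default is 5)
nx=${nx-97}                   # grid elements in x (illustrative)
ny=${ny-97}                   # grid elements in y (illustrative)
nz=${nz-60}                   # grid elements in z (illustrative)
dx=${dx-0.108}                # grid spacing in x (fraction of a degree)
dy=${dy-0.108}                # grid spacing in y (fraction of a degree)
cenlat=${cenlat-54.0}         # default NAM center point latitude
cenlon=${cenlon--106.0}       # default NAM center point longitude
dt_whole=${dt_whole-26}       # whole part of the NAM time step (s)
dt_num=${dt_num-2}            # fractional numerator  -> 26 2/3 s total
dt_denom=${dt_denom-3}        # fractional denominator
echo "domain: ${nx}x${ny}x${nz}, dt=${dt_whole} ${dt_num}/${dt_denom} s"
```

Because each assignment only supplies a default, a run script can still export its own nx, ny, or dt values before sourcing the file.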

Nesting is available; generally, a 12-km AQ parent domain and a 4-km nested domain are run. Several domain configuration examples (including specs) can be found in the directory /run/domains. The Launcher assumes that the user understands what kind of nesting domain to set up based on the configurations found in run/run_nmmb_1nest or run/run_nmmb_2nests and run/domains/1nest or run/domains/2nests.

The resulting model output can also be post-processed to produce GRIB output at the NMM-B's native horizontal resolution on the staggered Arakawa B-grid, as well as interpolated GRIB output on one or more of the following Lambert conformal or polar stereographic output grids:
  • 218 - 12 km over CONUS
  • 221 - 32 km CONUS
  • 242 - 12 km Alaska
  • 227 - 5 km over CONUS
These output grids are also used by the operational NAM, and they are described in some detail with sample maps showing area of coverage here. Detailed grid specifications are also available at this site, as well as at the master list of NCEP storage grids.

README Highlights:

A README file explaining different portions of the Launcher can be found in the /wcoss or /zeus directories. Provided below is some information contained in the README file.

  1. Users should start by reviewing any of the run scripts, run/run_nmmb*. Each script configures a specific domain and has extensive comments describing almost every variable.

  2. Nearly all of the more frequently modified parameters are set in run/run_nmmb, each with a description of varying length. These 'flags' are clearly identified by variables beginning with the string 'RUN_'; to run a step, set the value of its variable to >0. For example, set 'RUN_MODEL=1' to run just the model, or 'RUN_MODEL=2' to run the model and the post processing package side by side. (NOTE: Post processing needs some updates to fully run with the Launcher.) All other flags are set to '0' to skip that step, or '1' (values >1 also work) to run it. Only the RUN_MODEL variable distinguishes between values of '1' and '2'. See the table in the following section for more information about each flag.
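
As an illustration, here is a hypothetical combination of flag settings inside a run_nmmb* script that runs input retrieval, NPS, and the model with post processing and GrADS graphics while skipping the verification steps. The flag names are from the run scripts; this particular combination is only an example, not a recommended configuration:

```shell
# Illustrative RUN_ flag settings for a run_nmmb* script:
RUN_GET=1        # fetch input grib files from disk or hpss
RUN_INIT=1       # run NPS to build initial/boundary conditions
RUN_MODEL=2      # run model with post + prdgen/copygb side by side
RUN_POST=0       # not needed separately when RUN_MODEL=2
RUN_GRAPHICS=1   # produce GrADS-based graphics
RUN_VERIF=0      # skip the verification package
RUN_FVS=0        # skip FVS verification plots
```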

  3. For most users, the "home directory" is /run. It is where the run_nmmb* scripts reside, of which you can make many copies with different changes. Once you edit a script, the NMM-B Launcher is executed by running that run_nmmb* script. IT MUST BE RUN FROM THE /run DIRECTORY FOR THE LAUNCHER TO WORK! Once executed, the script does the following:
    • Creates a batch queue run script (PBS on Zeus, LSF on WCOSS).
    • Creates output directories and copies files needed to run the steps you want to run to the necessary directories.
    • Checks to make sure necessary files exist (input and parameter files, executables, etc.), otherwise exits with an instructive error message.
    • Submits the job to PBS (qsub) if the user sets the variable QSUBMIT > 0 (near the bottom of run_nmmb*).

  4. You must have access to a compiled and working copy of NEMS, and have the Launcher point to the main NEMS trunk directory (variable $NMM_SRC) and to the name of the NEMS executable file (variable $NMM_EXE), which is usually set to NEMS.x by default. The actual path of the NMM-B executable is specified in the Launcher to be ${NMM_SRC}/exe/$NMM_EXE.
    IMPORTANT: Users need to make sure that the directory specified by $NMM_SRC is consistent with the special dyn_state.txt and phy_state.txt files in the directory $NMM_SRC/job/regression_tests. Users may move or copy executables from other sources to their own ${NMM_SRC}/exe directory, but they should also copy the "job/regression_tests/*.txt" files from the source location to their ${NMM_SRC}/job/regression_tests directory, or else the model may not run.
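
The copy described above might be sketched as follows. The snippet uses throwaway scratch directories to stand in for the real builds, so the paths are purely illustrative; the essential point is that the executable and its state files travel together:

```shell
# Demonstration in a scratch area: create a stand-in "other" NEMS
# build, then copy the executable and its state files together.
OTHER_SRC=$(mktemp -d)/other_trunk    # stands in for another build (hypothetical)
NMM_SRC=$(mktemp -d)/my_trunk         # stands in for your NEMS trunk
mkdir -p $OTHER_SRC/exe $OTHER_SRC/job/regression_tests
mkdir -p $NMM_SRC/exe $NMM_SRC/job/regression_tests
touch $OTHER_SRC/exe/NEMS.x
touch $OTHER_SRC/job/regression_tests/dyn_state.txt \
      $OTHER_SRC/job/regression_tests/phy_state.txt

# The essential pairing: the executable AND its *.txt state files.
cp $OTHER_SRC/exe/NEMS.x $NMM_SRC/exe/
cp $OTHER_SRC/job/regression_tests/*.txt $NMM_SRC/job/regression_tests/
```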

  5. A long list of variables that define the horizontal domain, vertical coordinate and number of levels, nesting options, time steps, type of surface initialization, tracer advection options, start and frequency of when history and restart files are written, physics options, frequency of physics calls, and various physics-related accumulators are described within files in the run/domains directory.

    Two sample files are present:
    • na12aq_nogwd is the NMM-B version of the North American 12-km Air Quality (na12aq) domain with no gravity wave drag.
    • na12aq_gwd is the same physical domain with gravity wave drag.

    To run over a new domain or with different settings, specify them in a new file (e.g., "hiresw") and then specify the name of that file by setting "DOMAIN_SETUP=hiresw" in the run_nmmb* script.

  6. All of the settings in run/$DOMAIN_SETUP and some in the run_nmmb* file are used to define a long list of values in the 'configure_regional_nam' file, hereafter referred to as the 'configure' file.

  7. Once a user executes the run_nmmb* script, it executes a master script, templates/master. Almost everything in the /templates directory is used to create the PBS run script or to copy files to the output directories so that certain job steps can run. You can think of it as setting the table for the job to run smoothly; in fact, practically every step has a Korn shell (ksh) script named 'setup_$step' that does just that. Nearly all of the qsub scripts (*.qsub files) in the /templates directory are templates, which get modified by the appropriate 'setup_$step' script. The name of the PBS run script is represented by the variable $LAUNCH in the /templates scripts.

  8. A brief description of the other subdirectories:
    • /docs/ - documentation in the form of ascii files. Many of these files are referenced in the extensive comments in run_nmmb*.
    • /parm/ - parameter and input files used by various steps in the NMM-B Launcher's execution.
    • /tables/ - lookup tables and other files needed to run the model.
    • /scripts/ - the processing scripts that execute pre-processing, modeling, post-processing, and verification steps.
    • /externals/ - utilities outside of the launcher framework that are collected in one subdirectory in order to facilitate portability between different computing systems.
    • /util/ - contains several useful utilities, such as:
      • /3d/ - directory containing code to make multi-layer plots
      • /xsect/ - directory containing code to plot vertical cross sections.
      • run_web - script that combines WRF NMM and NMM-B forecast plots and sends them to the rzdm server (may not work)
      • run_verif_web - script that runs the verification for an NMM-B run and a WRF NMM run and sends them to rzdm.

Continue on to more information about the Launcher scripts or skip ahead for how to run the Launcher.

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

A Look At Run Scripts:


  • Run scripts in which to make changes based on your preferences (e.g. run_nmmb_na12AQ): /run
  • Master script to run the Launcher: /templates/master
  • Make sure you have the proper job submission settings for your account.
  • 'ROOT_DIR' allows you to run in whichever directory you would like. 'NMM_SRC' may need to be changed to point to your own NEMS.x.
  • Information about the RUN Flags:

    Each of the run scripts in the Launcher /run directory is set up in the same way and is fairly well documented. However, to run various parameters and domains, the flags in the scripts must be set accordingly. Provided below is a list of the flags and descriptions that appear at the beginning of those scripts. The steps listed as non-functional in the Current Status section above are currently not working on WCOSS.

    Flag - Description
    RUN_GET - Get input files from disk or hpss
    RUN_GET_ANL - Get 00h analysis files from disk or hpss every 6h, initializing off of NAM (GFS_IC=0) or GFS (GFS_IC>0) (see ../templates/setup_getanl)
    RUN_INIT - Run NPS to initialize the model if >0
    RUN_COLD_START - Run the Cold Start utility if >0
    RUN_MODEL - Run the model if >0: =1 runs the model only; >=2 runs the model and the post + prdgen/copygb step together (recommended)
    RUN_POST - If RUN_MODEL=0 or 1, set RUN_POST>0 to run the post and prdgen/copygb: =1 runs prdgen if weights files are present, otherwise copygb (recommended); =2 runs prdgen; >2 runs copygb
    RUN_GRAPHICS - Run GrADS-based graphics (if >0)
    RUN_VERIF - Run the verification package to make vsdb files (if >0)
    RUN_VMAPS - Make gridtobs verification maps using Carley's Python codes (if >0): make ctl & nmmb maps if =1, nmmb maps only if >1
    RUN_G2G - Run the grid-to-grid verification step (if >0)
    RUN_FVS - Run FVS to make verification plots (if >0)
    RUN_PCPVER - Run precipitation verification, vsdb files + plots (if >0)
    RUN_BUFR - Run BUFR processing
    RUN_PLTSND - Make GEMPAK skew-T sounding plots
    RUN_ARCHIVE - Remove unnecessary files (clean the directory) and archive output to hpss under the path $HPSSDIR
    RUN_NEXT - Run the next job in a series if >0


    Additional Options in the Run Scripts:

    In addition to the flags described above, there are other settings you may want to modify within the run scripts, most of which are found in /run/run_nmmb_common. They were moved to this file because they are less likely to be changed than those located in the run scripts themselves.
    Option - Description
    node - Number of computational nodes used in the model run; assumes 64 (60 compute, 4 I/O) tasks per node
    CYCLE - Forecast cycle
    LENGTH - Forecast length (h)
    GRIDS - List of output grids for graphics and verification; default if unspecified: GRIDS="218"
    CTL_FORM - When =2, creates NAM variables to compare to experiment output
    SFCVER_REGIONS - Used for plotting; specifies a set of regions for surface verification (described in the ../docs/FVS text file)
    SITES - Sets specific station numbers for soundings
    NEW_CONFIG - Replaces the configure file in NPS with new settings in the /parm output subdirectory used in the model run
    HISTH - Frequency of history output in *hours*, used by graphics, verification, BUFR, soundings
    HISTM - Frequency of history output in *minutes*, written by the model and processed by the post
    LBCH - Frequency of lateral boundary conditions (h)
    PLTREGS - Which pre-defined plot regions are available (described in the ../docs/PLTREGS text file)
    PLTVARS - Which sets or "families" of plots are available (described in the ../docs/PLTVARS text file)
    G2G_GRID - NCEP input grid used for grid-to-grid verification (only active if RUN_G2G=1) (described in ../docs/README_G2G_GRIDS)
    G2G_OBSVS - Data set used in the grid-to-grid verification (only active if RUN_G2G=1) (described in ../docs/README_G2G)
    G2G_HISTH - Output file interval (h) for the grid-to-grid verification data sets listed in G2G_OBSVS (only active if RUN_G2G=1) (described in ../docs/README_G2G)
    BSUBMIT - Set BSUBMIT>0 to automatically bsub the job script; if <=0 the job script will be created but not submitted
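
    A hedged example of overriding a few of these options; the option names come from the list above, but the particular values are illustrative:

```shell
# Illustrative overrides of common run options:
CYCLE=2013010100     # YYYYMMDDHH forecast cycle
LENGTH=84            # forecast length (h)
GRIDS="218 221"      # output grids for graphics/verification
HISTH=3              # history output every 3 h for downstream steps
BSUBMIT=0            # build the job script but do not submit it
```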

    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Getting Started:


    Steps to Successfully Run the NEMS Launcher:

    The following step-by-step instructions use /run/run_nmmb_na12AQ as an example of how the Launcher can be run. Currently, portions of the Launcher do not run properly on Zeus and WCOSS; this is being addressed.
    1. Before attempting to use the WCOSS Launcher, a number of utility modules need to be loaded; otherwise key components of the WCOSS Launcher will be inoperable. The following modules may be activated in any of the user-specified login/setup files (.profile, .cshrc, .bashrc, etc.):
      • . /usrx/local/Modules/3.2.9/init/$shell, where $shell is your preferred unix/linux shell, i.e., bash, csh, ksh, sh, tcsh, or zsh
      • module load ibmpe
      • module load ics
      • module load lsf
      • module load GrADS/2.0.2
      • module load hpss
      • module load imagemagick/6.8.3-3
    2. You must have a compiled copy of NEMS available to use. If you are not sure how to get and compile your own copy, go here for more information.
    3. Check out the Launcher from Subversion using the following command: 'svn checkout https://svnemc.ncep.noaa.gov/projects/launcher/trunk/nmmb/wcoss(or zeus) launcher'.
    4. Next, build the NPS and POST executables needed by the pre-processing and post-processing Launcher steps. Go to /externals and type ./build_nps. Once the NPS build is complete, do the same for the POST utility with the command ./build_post.
    5. Open /run/run_nmmb_na12AQ in your favorite editor. This particular script is for a single, air quality domain NMM-B run. The other run scripts may be used and modified depending on your needs.
    6. For this case, most of the default settings will be used. Make the appropriate changes as follows:
      • Make sure 'NMM_SRC' points to your NEMS trunk directory.
      • Make sure 'NMM_EXE' points to your own NEMS executable (e.g., NEMS.x) in the ${NMM_SRC}/exe directory.
      • Make sure you have the proper job submit settings for your account.
    7. Once you have made all necessary changes, run run_nmmb_na12AQ interactively by entering 'run_nmmb_na12AQ' on the command line. Make sure you set the switch BSUBMIT=0; otherwise the Launcher will proceed without giving you a chance to examine the run script log file.
    8. Check the file, nmmb_na12AQ_ctl_2013010100.log, to make sure there are no errors/problems. If there are, make any necessary corrections as directed by the log file.
    9. If there are no errors, submit your new run file.
    10. Information regarding output is provided in the following section.

    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Output:


    Understanding the /com Directory - Files Used to Generate Output

    When the Launcher is run, two directories will be created under your $ROOT_DIR directory (default: /ptmp/$USER/$CYCLE), where $CYCLE is the YYYYMMDDHH cycle for the initial conditions that you want to run. The first directory is explained in more detail in 'Understanding the Various Types of Launcher Output' below. The second directory, /com, is where any files used to create output are located. Following are descriptions of the directories under /com:

    • /log - contains log files
    • /nam.g218 - contains NAM GRIB files derived for grid 218. GRIB files for graphical comparisons.
    • /nam.g221 - same as before except for grid 221.
    • /nam.g242 - same as before except for grid 242.
    • /nam.rst - contains the NAM restart file needed for the coldstart step, which replaces the land states within the NPS-derived boundary files with operational NAM land states.
    • /nps_nmmb_na12AQ_ctl (but may be named differently depending on your region) - contains NPS-derived output files needed to run the NMM-B model. View the README located in this directory for more information about each file.
    • /obs_g2g - observation files needed for g2g.
    • /pcpver - files needed for precip verification.
    • /prepbufr - prepbufr files used in the production of sounding plots. The README contains more detailed information.

    Understanding the Various Types of Launcher Output

    Output files are written in the following locations under your $ROOT_DIR directory. All directories and paths below are beneath /meso/noscrub/$USER/$CYCLE.

    NPS output is in $NPS_DIR (specified in run_nmmb*), but if no path is provided, the default location is /com/nps_$DOMAIN_$EXPER based on $DOMAIN, $EXPER variables specified in run_nmmb*.

    The other model output is written beneath a 'run' directory, which is specified by the variable $label in run_nmmb*. If $label is not specified, the output is written in the subdirectory "nmmb_$DOMAIN_$EXPER"; the full path is /meso/noscrub/$USER/$CYCLE/nmmb_$DOMAIN_$EXPER.
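
    As a concrete illustration of how that path is assembled, the sketch below uses a hypothetical user name and cycle; the defaulting of $label follows the description above:

```shell
# Hypothetical values, showing how the run-directory path is assembled.
unset label                # pretend $label was not specified in run_nmmb*
USER=jdoe                  # illustrative user name
CYCLE=2013010100           # YYYYMMDDHH initial-condition cycle
DOMAIN=na12AQ              # domain name from run_nmmb*
EXPER=ctl                  # experiment name from run_nmmb*
label=${label-nmmb_${DOMAIN}_${EXPER}}   # default when $label is unset
RUN_DIR=/meso/noscrub/$USER/$CYCLE/$label
echo $RUN_DIR              # -> /meso/noscrub/jdoe/2013010100/nmmb_na12AQ_ctl
```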

    The remaining output files are now relative to the path mentioned previously:
    • /log - directory containing standard error/output files for each step
    • /model - where NMM-B model is run; contains NEMSIO history and restart files
    • /plots_218 - GrADS forecast graphics over CONUS from 12-km grid 218 files for the NMM-B run only, organized in subdirectories for different user-specified domains and plot fields
    • /plots_221 - same as before, except over most of the domain (i.e., North America) from 32-km grid 221 files
    • /plots_242 - same as before, except over Alaska from 12-km grid 242 files
    • /post - native (NMMPRS) and interpolated (nmmb.AWP218/221/242 for grids 218/221/242, respectively) GRIB files
    • /verif - contains plots and vsdb verification files
      • /plots - subdirectory containing upper-air and surface verification plots. Files with smaller lengths are usually upper air plots (e.g., z12.218.gif), while files with the string 'diurnal' are diurnal traces, and 'biasrmse' are bias + RMSE surface statistics
      • /precip - subdirectory containing EQ threat + bias plots for different regions and times
      • /ctl - subdirectory containing vsdb records for the control run (can be WRF Launcher, NMM-B, or NAM; see description associated with $CTL_DIR and $CTL_FORM variables in run_nmmb*); surface & upper air statistics only
      • /nmmb - subdirectory containing vsdb records for the NMM-B Launcher run; surface & upper air stats only
      • /precip/vsdb - contains precipitation-based vsdb records for the NMM-B (/nmmb/ subdirectory) and the control run (/ctl/ subdirectory)

    The following table lists the output fields and their descriptions as found in the plots_* directories. This output consists of GrADS plots of each field.

    Name - Description
    10mw - 10m wind
    1500mv - 1524m AGL wind
    250 - 250mb z-wind
    2mt - 2m temperature
    2mt-td - 2m temperature minus 2m dew point temperature
    2mt-tskin - 2m temperature minus surface skin temperature
    2mtd - 2m dewpoint temperature
    500 - 500mb z-vorticity
    500wind - 500mb wind
    700 - 700mb z, pw
    700rh - 700mb relative humidity and omega
    700wind - 700mb z-wind
    850 - 850mb z-temperature
    850rh - 850mb relative humidity and omega
    850wind - 850mb z-wind
    925rh - 925mb relative humidity and omega
    925wind - 925mb z-wind
    albdo - midday albedo
    bli - best lifted index
    blrh - lowest level boundary layer relative humidity
    cape - best cape (convective available potential energy)
    ccon - canopy conductance
    cice - low-level cloud ice
    cin - best convective inhibition
    cldbot - cloud bottom pressure
    cldice - cloud ice + snow
    cldrain - cloud water + rain
    cldtop - cloud top pressure
    cnvcbot - convective cloud bottom pressure
    cnvcf - convective cloud fraction
    cnvctop - total convective cloud top pressure
    cpcp - accumulated convective precipitation using 3-hour buckets
    cpcptot - 0-84 hour cpcp
    cprate - instantaneous convective precipitation rate
    cwtr - low-level cloud water
    deepcbot - deep convective cloud bottom pressure
    deepctop - deep convective cloud top pressure
    dlwf - surface downward longwave flux
    dswf - surface downward shortwave flux
    effalb - instantaneous albedo
    frzp - % frozen precipitation
    ghf - ground heat flux
    gridcbot - gridded cloud bottom pressure
    gridctop - gridded cloud top pressure
    hicf - high cloud fraction
    iprate - frozen precipitation rate (mm/hr)
    lhf - latent heat flux
    lowcf - low cloud fraction
    midcf - middle cloud fraction
    netlwf - surface net longwave flux
    netswf - surface net shortwave flux
    netswflwf - surface net radiative flux
    pbl - planetary boundary layer height
    pcp - accumulated non-convective precipitation using 3-hour buckets
    pcptot - 0-84 hour accumulated non-convective precipitation
    prate - instantaneous precipitation rate
    ptype - precipitation type
    ref1 - 1km AGL radar reflectivity
    ref4 - 4km AGL radar reflectivity
    refc - composite radar reflectivity
    rimef - low-level rime factor
    rnmr - low-level rain
    rrate - rain rate (mm/hr)
    sfcexc - surface exchange coefficient
    sfcfric - surface friction velocity
    sh2oeq - snow water equivalent
    shf - sensible heat flux
    slp - sea-level pressure
    snmr - low-level snow
    snrate - snow/ice rate (in/hr)
    soilm1 - 0-10cm soil moisture
    soilm2 - 10-40cm soil moisture
    soilm3 - 40-100cm soil moisture
    soilm4 - 100-200cm soil moisture
    soilt1 - 0-10cm soil temperature
    soilt2 - 10-40cm soil temperature
    soilt3 - 40-100cm soil temperature
    soilt4 - 100-200cm soil temperature
    thick700.ptype - 850-700mb thickness precipitation type
    thick850.ptype - 1000-850mb thickness precipitation type
    totcf - total cloud fraction
    totcond - condensates within a total column
    tskin - skin temperature
    tskin-tsoil - surface skin temperature minus the soil temperature at the first layer (closest to the surface)
    ulwf - surface upward longwave flux
    uswf - surface upward shortwave flux
    vpcp - 0-24 hour apcp
    z0 - roughness length
    zs - surface terrain height


    The following table provides a list of fields plotted for verification found in the /verif/plots directory, including bias/rmse and diurnal plots of each field at specific stations/locations:
    Name - Description
    dpt2m_biasrmse - dewpoint temperature (C) bias & rmse time series
    dpt2m_diurnal - diurnal dewpoint temperature (C)
    rh00.218 - relative humidity bias & rmse profile at 00h over grid G236
    rh12.218 - relative humidity bias & rmse profile at 12h over grid G236
    rh24.218 - relative humidity bias & rmse profile at 24h over grid G236
    rh36.218 - relative humidity bias & rmse profile at 36h over grid G236
    rh2m_biasrmse - relative humidity bias & rmse time series
    rh2m_diurnal - diurnal relative humidity
    t00.218 - temperature (C) bias & rmse profile at 00h over grid G236
    t12.218 - temperature (C) bias & rmse profile at 12h over grid G236
    t24.218 - temperature (C) bias & rmse profile at 24h over grid G236
    t36.218 - temperature (C) bias & rmse profile at 36h over grid G236
    t2m_biasrmse - temperature (C) bias & rmse time series
    t2m_diurnal - diurnal temperature (C)
    v10m_biasrmse - wind speed (m/s) bias, vector rmse time series
    v10m_diurnal - diurnal mean wind speed (m/s)
    vw00.218 - wind (m/s) RMS speed bias, vector RMS error profile at 00h over grid G236
    vw12.218 - wind (m/s) RMS speed bias, vector RMS error profile at 12h over grid G236
    vw24.218 - wind (m/s) RMS speed bias, vector RMS error profile at 24h over grid G236
    vw36.218 - wind (m/s) RMS speed bias, vector RMS error profile at 36h over grid G236
    z00.218 - height (m) bias and rmse profile at 00h over grid G236
    z12.218 - height (m) bias and rmse profile at 12h over grid G236
    z24.218 - height (m) bias and rmse profile at 24h over grid G236
    z36.218 - height (m) bias and rmse profile at 36h over grid G236

--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Ongoing Efforts:


External Utilities

  • In the earlier CCS iteration of the NEMS/NMM-B Launcher, external utilities such as NPS, Post, and other pre- and post-processing tools were kept in a fixed directory (other_codes) maintained by the Launcher administrators.
  • During the transition to WCOSS, these external tools were integrated into an svn-controlled launcher directory (/externals) for the sake of portability between machines. However, this approach effectively decoupled the utilities stored in the launcher externals directory from the tops of the project trunks.
  • We are currently experimenting with the 'svn propset svn:externals' command to link the tops of the svn repository project trunks to the launcher /externals directories.

Task Allocation

  • Optimizing task allocation scheme for computationally intensive tasks (model, post).
  • The ideal allocation scheme is still in development.

Initialization Based on GFS Grib Output

  • One can select to initialize an experiment using either pressure-based GFS grib files or GFS spectral coefficients (plus pressure-based GFS files from which surface conditions are derived).
  • Pressure level based GFS file initialization:
    • /run/run_nmmb_(domain)
      • GRIBDIR=gfs.${cycle}
    • /run/run_nmmb_common
      • GFS_IC=1
    • /templates/master
      • SPECTRAL=.false.
  • Spectral coefficient based GFS initialization:
    • Same settings as in the pressure-level based initialization except:
      • /templates/master
        • SPECTRAL=.true.
  • Currently, initialization based on GFS spectral coefficients is being adapted to the WCOSS environment. Some modifications to NPS are being developed to handle the larger grib file sizes associated with this data set.
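
Collected in one place, the pressure-level GFS initialization amounts to the following settings. Each line belongs in the file noted in its comment, and ${cycle} is assumed to be set by the run script:

```shell
# In /run/run_nmmb_(domain):
GRIBDIR=gfs.${cycle}      # point input retrieval at GFS grib files

# In /run/run_nmmb_common:
GFS_IC=1                  # initialize from GFS rather than NAM

# In /templates/master:
SPECTRAL=.false.          # pressure-level files; .true. selects spectral coefficients
```

For the spectral-coefficient path, only the last setting changes (SPECTRAL=.true.).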

Questions or comments about the Launcher? Please contact Ed Colon