NCEP/EMC REGIONAL SPECTRAL MODEL (RSM)
Workstation Execution User's Guide
Compiled by Lt Col Mitchell
(IMA to SYSM)
This material was pulled together, edited, and considerably expanded
from inputs provided by Dr Henry Juang, NCEP/EMC RSM creator and developer.
This document provides the basic step-by-step guidance for downloading
the RSM software and data files and executing the RSM locally on a
Unix workstation.
In addition to running on the Cray Supercomputer mainframes of NCEP,
the RSM is easily configured to run externally from NCEP on a number
of Unix workstation platforms, including SGI, HP, Sun SPARC, IBM RISC
(including SP2), and DEC Alpha. These workstation versions of the RSM
are maintained centrally at NCEP, and are applied in an Off-the-Shelf
(OTS) manner by external modeling groups. Current U.S. groups running
the RSM on workstations include 1) Scripps Institution of Oceanography,
2) the NWS WFO in Hawaii, 3) the NWS WFO in Oregon, and 4) the
Tennessee Valley Authority (TVA). International groups externally
executing the RSM on workstations include groups in Taiwan (2), Kenya,
India, South Korea, and Chile. Given that the National Weather Service
is part of the Dept of Commerce, which in turn has a strong interest
in promoting economic ties with accelerating international economies,
these international RSM initiatives are encouraged.
The RSM model is a direct, regional-model descendant of the NCEP/EMC
Global Spectral Model (GSM), more commonly known as the Aviation
Model (AVN) or the Medium-Range Forecast Model (MRF). The basic
governing equations of the RSM solve for regional higher-order
(finer resolution) spectral modes superimposed on the lower-resolution
global spectral modes of the GSM. Hence GSM execution and GSM global
analysis and forecast outputs in terms of spectral coefficients on GSM
sigma surfaces are pre-requisites for RSM execution. This GSM output
can either be obtained via 1) ftp from NCEP GSM operational runs (T126)
or 2) a lower-resolution (e.g. T60) workstation version of the GSM
executed locally in tandem with the local workstation version of
the RSM. This second option offers powerful local flexibility, as
only the initial GSM ANALYSIS coefficients and initial surface fields
(SST, etc) need be ftp'd from NCEP, amounting to a SMALL ftp volume
of only about 9.5 Mb. Furthermore, the disadvantage of running a
lower resolution GSM on the local workstation may be offset by the
advantage of much more frequent updates of the RSM lateral boundary
conditions (e.g. hourly updates in place of the 6-hourly NCEP output).
The latter local scenario is rendered even more powerful by the
nesting capability of the RSM, coupled with an RSM nonhydrostatic
option. Hence, an appealing local workstation configuration may
consist of A) a T60 GSM execution, B) an outer 50-80 km
hydrostatic RSM window covering a domain of say CONUS or Europe,
and C) a 5-10 km nonhydrostatic RSM window covering a domain of
say the state of Oklahoma or Bosnia.
Since the RSM is a direct descendant of the GSM, the RSM source code
in fact shares a great majority of the GSM source code, including all
of the GSM physical parameterization subroutines. Hence the ongoing
operational upgrades to GSM physics are immediately realized also
in the RSM. These include state-of-the-art PBL physics, cloud and
radiation physics, and soil/vegetation/snowpack physics. The GSM/RSM
PBL physics is in fact now one of the PBL options in the NCAR/PennState
MM5 model and NCAR global CCM2 model.
The RSM has been in development and testing at NCEP/EMC for about six
years. It is now running daily in realtime parallel demonstration mode
in both the 00Z and 12Z production cycles on the Cray Supercomputer
mainframes at NCEP; and RSM output is available to NWS field forecasters
via NCEP/EMC Web sites. Additionally, the RSM model is used daily at
NCEP to produce ensemble forecasts, along with ensemble forecasts from
the NCEP AVN and Eta models. The RSM is expected to achieve formal
operational status at NCEP in about 1-2 years.
2.0 ACQUIRING RSM SOFTWARE AND FIXED DATA FILES FOR LOCAL EXECUTION
The RSM software suite includes several versions, each slightly
reconfigured to run on various Unix workstations. As cited earlier,
the currently supported Unix platforms include (as of Apr 97):
A - HP
B - SGI
C - DEC-ALPHA
D - IBM-RISC/SP2
E - Sun SPARC
For any of these platforms, the RSM software/fixed-files suite is
maintained as a partitioning into two groups: 1 - a "COMMON" group,
which is the same across all platforms, and 2 - a "PLATFORM" specific
group. These two groups are traditionally available from NCEP/EMC
as two separate compressed TAR files staged on an anonymous NCEP ftp
server with filenames typically given as follows:
1 - common.tar.Z
2 - dec.tar.Z (or hp.tar.Z, or sgi.tar.Z, etc)
To acquire the above via ftp, contact your designated RSM Focal Point
at NCEP/EMC (the Focal Point to be determined following a formal
pre-arranged agreement with NCEP). Ask your RSM Focal Point to set
up an NCEP anonymous ftp server site/directory with the above two
files for your desired Unix platform.
FTP the above two files to your local Unix workstation in a directory
hereafter assumed to be named "../rsm".
(NOTE: On TORNADO, the above directory structure was set up in ...)
First, in the ../rsm directory, uncompress and untar the
"common.tar.Z" file. The untar operation of this file will create
four subdirectory names as follows:
1 - ../rsm/source96
2 - ../rsm/fmtdata
3 - ../rsm/co2gen
4 - ../rsm/docblock96
The contents of the above subdirectories are discussed later below.
IMPORTANT CAUTION: Do NOT proceed to further uncompress any
compressed *.Z files that may appear under above directories, as the
scripts that run later sometimes assume compressed *.Z form on input.
Second, uncompress and untar the "dec.tar.Z" file. The untar operation
of this file will create one subdirectory named as follows:
1 - ../rsm/dec
Under this ../rsm/dec directory, the aforementioned untar operation
has also created the following eight subdirectories:
1 - ../rsm/dec/1st96
2 - ../rsm/dec/msm1st96
3 - ../rsm/dec/2nd96
4 - ../rsm/dec/msm2nd96
5 - ../rsm/dec/condata
6 - ../rsm/dec/etc
7 - ../rsm/dec/gradsdec
8 - ../rsm/dec/gribdec (actually gribalphaOSF)
The contents of the above subdirectories are discussed later below.
IMPORTANT CAUTION (AGAIN): Do NOT proceed to further uncompress any
compressed *.Z files that may appear under above directories, as the
scripts run later sometimes assume compressed *.Z form on input.
Next is a brief summary of the contents of the directory structure
you have created thus far under ../rsm AND under ../rsm/dec :
a - source96: all "common" non-platform specific RSM subroutines,
as subroutine files, one filename per subroutine name
b - docblock96: subroutine documentation for most but not all
of the RSM subroutines, in the form of subroutine
header "doc block" files, one per subroutine name,
with either the prefix "DB" or "DR" added to name
c - fmtdata: formatted and compressed versions of some of the
fixed data files used by the RSM, e.g. global
10-min Navy terrain
d - co2gen: source code, compile/ load makefiles, and fixed CO2 data
files to generate the CO2 climatology on RSM grid needed
by RSM radiation calculations
UNDER ../rsm/dec (or ../rsm/sgi or ../rsm/hp etc)
e - 1st96: scripts and flexible model domain/resolution parameter
files to run the RSM using global model AVN output for
the initial and lateral boundary conditions
f - msm1st96: As in 1st96, but for a non-hydrostatic version of RSM
g - 2nd96: scripts and flexible model domain/resolution parameter
files to RUN AN INNER NESTED RSM DOMAIN INSIDE THE
DOMAIN OF 1st96 AND USING THE OUTPUT FROM 1st96
EXECUTION AS INITIAL AND LATERAL BOUNDARY CONDITIONS.
NOTE: By reconfiguring the flexible domain/resolution
parameters in 2nd96, one can create a 3rd96, etc, and
execute 3rd and 4th nests of the RSM, each using
initial/boundary conditions from the previous nest !!
h - msm2nd96: analogous to msm1st96, but for 2nd96
i - condata: scripts to read compressed formatted fixed data files
in directory ../rsm/fmtdata and uncompress, read, write,
and create corresponding unformatted binary fixed files
for given platform (including the RH-to-Cloud diagnostic
tables for my "CCA" diagnostic cloud scheme).
j - etc: small set of machine-dependent utility software ("tools")
k - gribdec: (actually named gribalphaOSF) GRIB packing and
unpacking subroutines library
l - gradsdec: the machine-dependent subroutine library of the
GRADS graphics/plotting package (developed by COLA).
(NOTE: AFGWC Users may ignore this directory and
its software if they have their own graphics package,
e.g. PV-WAVE, and their own software that will unpack
the RSM GRIB output files and convert them to the data
format expected by their graphics package. Nonetheless,
be aware of GRADS' ADVANTAGES of being
powerful, widely used, free unlicensed shareware for
plotting NWP model output that a user can easily
learn and port to any workstation. COLA, the creating
organization of GRADS, maintains extensive online
GRADS support, Bulletin Boards, and User's Guides.
GRADS is widely used at NCEP, in NOAA labs, NASA,
and the university community.)
(ASIDE NOTE: I cannot help but beat my drum here a bit. The "CCA"
cloud scheme in subdirectory ../rsm/dec/condata is the diagnostic
cloud scheme which I developed in the late 80's at Air Force Phillips
Lab, and which NCEP/EMC has implemented and retained since 1993 in
their operational GSM/AVN/MRF - and hence here in the RSM. It is in
support of the MRF/AVN/RSM use of the CCA cloud scheme that NCEP
worked through the DOD's Shared Processing Network committees in the
early 90's to get the present-day NCEP realtime access to AFGWC's
3-hourly RTNEPH. So far, AVN/MRF tests of competing explicit
prognostic cloud water schemes have failed to yield cloud-cover
forecasts and radiative heating rates/fluxes as good as the CCA scheme.
Hence, an explicit prognostic cloud scheme has NOT been implemented
operationally in the MRF/AVN. For some 10 years, I have urged SYSM to
apply the CCA scheme in RWM, GSM, AVN, NOGAPS, MM5, etc. I do believe
an explicit prognostic cloud scheme will be implemented in the MRF/AVN
soon, but it has taken the NCEP "explicit cloud modelers" several more
years of toil and refinement than anticipated to beat CCA. In short,
CCA has been a tough act to follow, thanks in large part to my NCEP/EMC
colleague Ken Campana, who was a CCA believer early on. The popular
bandwagon that maligns diagnostic cloud schemes is somewhat misguided
and overoptimistic. Finally, CCA tuning techniques can in fact
be adapted and applied to the fractional cloud-cover algorithms
that even explicit prognostic cloud schemes MUST apply.)
3.0 DOWNLOADING THE REALTIME NCEP/EMC GLOBAL AVN FIELDS USED
FOR RSM INITIAL AND BOUNDARY CONDITIONS
We next describe the procedures for downloading the realtime global
AVN model fields used for RSM initial and boundary conditions.
These fields are acquired via anonymous ftp from the NCEP public
domain Internet server, known as the NIC (NCEP Information Center)
at IP address "nic.fb4.noaa.gov".
Before presenting the details, we remind RSM users that there is an
appealing local alternative for providing the RSM with time-dependent
lateral boundary conditions; namely a workstation version of the global
spectral model (GSM/AVN/MRF), which can be run in low resolution (say
T40 or T60) on the workstation in tandem with the RSM. In this case,
the user need only ftp from NCEP 10 Mb of initial condition files!!
THIS USER'S GUIDE DOES NOT DESCRIBE THIS OPTION; THUS CONTACT YOUR
DESIGNATED NCEP RSM FOCAL POINT FOR FURTHER INFORMATION.
The atmospheric initial and forecast AVN fields/files are downloaded
in spectral coefficient form on the global model's sigma levels. The
single initial file of earth surface fields of SST, snowdepth, etc
are downloaded in grid-point form on the GSM's Gaussian grid. All
can be downloaded from your desired choice of the 00Z, 06Z, 12Z, or
18Z cycles of the NCEP AVN production suite.
To download the AVN fields for the RSM, first create a subdirectory
by the name of "daily_data" under directory ../rsm.
(NOTE: Because of file space limitations on /home/modelers
on TORNADO, Tanya set up an alternate directory tree structure
for the big AVN initial and boundary condition files in
/home2/kem/rsm/daily_data/t00z and .../t12z)
Then "cd" to ../rsm/daily_data and from there establish an ftp
connection to "nic.fb4.noaa.gov", logging in using id "anonymous"
and password given by your official personal email address.
Once you are logged in and see the "ftp>" prompt, do a "pwd"
to confirm you are in directory "/". Then cd to "pub/rsm/para80/t00z"
(or t12z, etc) and do another "pwd" to confirm your directory.
If the NIC directories /pub/rsm/para80/t00z, t12z, etc do not exist,
then contact your designated NCEP RSM Focal Point to find out
where the new directories/ftp-site have been moved.
Once in the desired directory, do an "ls -l" command to see the
AVN initial and forecast files and byte sizes. First initiate
a "get" of the one-line ascii files "currentdate" and "dateftpavn".
Then from the "ftp>" prompt, key in "binary" to switch to the
binary ftp mode to next acquire the compressed ".Z" AVN fields.
In binary mode, initiate get requests for all the remaining
files in the directory. These 14 compressed files and their
typical byte sizes (still compressed) for the current T126,
28-layer AVN model are:
TWO INITIAL 00-HOUR FIELDS:
1 - sfcanl126.fmt.Z (smallest file at about 2.0 Mb, and the only
non-spectral coefficient file. Rather, this
is a gridded file on the global Gaussian grid
and contains ALL initial earth-surface fields,
such as the time-dependent fields of SST,
snowdepth, soil moisture, soil temperature,
sea-ice cover, albedo, etc, and constant
surface fields such as sfc roughness length.)
2 - siganl126.fmt.Z (size around 7.55 Mb, containing initial 00-hour
T126 spectral coefficients of atmospheric state
variables on 28 sigma levels)
TWELVE 6-HOURLY FORECAST FIELDS (around 7.72 Mb each)
sig126fXX.fmt.Z , where XX = 06,12,18,24,30,36,42,48,54,60,66,72
NOTE: You only need to download the forecast files spanning the
RSM forecast length you expect to run, e.g. 24 or 48 hours.
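As a quick sanity check on disk and transfer requirements, the per-file sizes quoted above imply the following approximate total download volume. This is a rough sketch: the 2.0/7.55/7.72 Mb figures are the typical compressed sizes cited in the text and will vary from cycle to cycle.

```python
def avn_download_mb(forecast_hours):
    """Approximate compressed download volume (Mb) for one AVN cycle,
    using the typical sizes quoted above: sfcanl ~2.0 Mb, siganl
    ~7.55 Mb, and ~7.72 Mb per 6-hourly forecast file."""
    n_fcst = forecast_hours // 6      # one sig126fXX file per 6 hours
    return 2.0 + 7.55 + n_fcst * 7.72

# A 24-hour RSM run needs the 06/12/18/24-hour files: about 40 Mb.
# Downloading the full 72-hour suite is roughly 102 Mb.
```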
Once you have downloaded all the AVN files, quit the ftp and
key-in a "pwd" and "ls -l" on your local workstation to verify
existence of expected filenames and file sizes.
Using an editor, examine the one-line date files "currentdate"
and "dateftpavn" to confirm that the cycle time is what you
expected and desire. The ASCII format and contents of these
one-line date files is obvious by inspection.
WARNING CAUTION: Leave all the AVN "sig..." initial and forecast
files in COMPRESSED format. Do NOT uncompress these files, as the
scripts run later below assume these files are LEFT COMPRESSED.
ASIDE: NOTE ON CONTENTS OF AVN "SIG..." FILES
This NOTE describes what one would find in the contents of the
AVN "sig..." files if one placed them in a working scratch directory,
uncompressed them, and browsed them with an editor.
After uncompression, all AVN "sig..." files, such as sig126fXX.fmt, are
ASCII formatted files and can be browsed via your workstation editor.
These files are sometimes too big for some "VI" editors, but they can
be inspected with the old reliable Unix line editor "ed" (e.g.
"ed sig126fXX.fmt", then key-in "x,y p" to screen print lines
numbered "x" through "y", and "Q" to quit the editor without
altering the file).
Browsing the first two lines of file sfcanl126.fmt shows the
following (note the "ed" utility may be able to "bring in"
only about the first 20 percent of the file):
NMC SURFACE INPUT FORMATTED FILE
0.000000E+00 hh dd mm yy ii jj
where the first number is the "forecast" hour (here the initial
00-hour), hh, dd, mm, and yy are 2-digit integers giving the GMT hour,
day of month, month, and year, and ii and jj are the i- and
j-dimensions of the Gaussian grid (namely 384 x 190).
All following lines are 78-column fixed-length records each containing
six "e13.6" formatted floating point data values. Each 384 x 190
Gaussian grid surface field spans 12160 formatted lines. There are
16 surface fields in all in this file, but at the time of this writing
(Apr 97), I do not know the specific order of the individual surface
fields.
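The record layout just described can be checked arithmetically: with six e13.6 values per 78-column line, each 384 x 190 surface field must span exactly 12160 lines. A small sketch of that bookkeeping (the helper name is my own):

```python
import math

def lines_per_field(ni=384, nj=190, values_per_line=6):
    """Number of 78-column formatted lines spanned by one Gaussian-grid
    surface field, at six e13.6 values per line."""
    return math.ceil(ni * nj / values_per_line)

# 384 * 190 = 72960 values -> 72960 / 6 = 12160 lines per field
```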
Browsing files siganl126.fmt and sig126fXX.fmt, we find that
the first two lines are given by
NMC SIGMA SURFACE FORMATTED FILE
0.xx0000E+XX hh dd mm yy
where the first real value is the valid forecast length
in hours and hh,dd,mm,yy as before.
Lines 3-509 of files siganl126.fmt and sig126fXX.fmt are the
single-valued real spectral coefficients related to "wave number
zero" modes. All remaining lines of the file contain 6 floating
point values (3 real/imaginary pairs of complex numbers),
representing 3 complex spectral coefficients per line.
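For readers who want to pull the coefficients into a program, here is a minimal parsing sketch based only on the layout described above (two header lines, single-valued reals on lines 3-509, then three complex coefficients per line). The function name and return structure are my own, not part of the RSM suite:

```python
def parse_sig_file(lines):
    """Split an uncompressed sig*.fmt file (given as a list of text
    lines) into header, forecast hour, wave-zero scalars, and complex
    spectral coefficients."""
    header = lines[0].strip()            # "NMC SIGMA SURFACE FORMATTED FILE"
    fhour = float(lines[1].split()[0])   # valid forecast length in hours
    # lines 3-509: one real "wave number zero" value per line
    scalars = [float(ln) for ln in lines[2:509]]
    # remaining lines: 3 real/imaginary pairs (3 complex values) per line
    coeffs = []
    for ln in lines[509:]:
        v = [float(x) for x in ln.split()]
        coeffs.extend(complex(v[i], v[i + 1]) for i in range(0, len(v), 2))
    return header, fhour, scalars, coeffs
```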
END OF ASIDE NOTE ON CONTENTS OF AVN "SIG..." FILES
4.0 SETTING UP THE RSM FOR EXECUTION
The RSM execution set-up procedures (compile/load/execute),
consist of four parts:
1) a one-time initial set-up sequence, which among other things
configures the needed fixed input data files,
2) the set-up of the variable domain/resolution choices,
3) compiling the preprocessor, RSM forecast, and postprocessor source code,
4) creating and defining the model terrain file
These four set-up parts are discussed separately below.
4.1) ONE-TIME RSM INITIAL SET-UP
Begin by "cd" to your local workstation directory ../rsm
Then depending on your Unix platform type, cd to subdirectory
"dec", or "sgi", or "hp".
In this User's Guide, we assume the DEC-Alpha workstation,
so we cd to "dec".
1 - cd to gribalphaOSF (generically referred to earlier as "gribdec")
2 - enter "compallg.sh" and answer "n", i.e. no, to the roughly
dozen queries of "remove *.o " object files
3 - cd ../etc
4 - enter "makefortprep.s" (script to create executable fortprep.x)
5 - enter "makeDATE.s" (script to create executable DATE.x)
(NOTE: As of this writing, 20 Apr 97, neither of the machine-
dependent compile/load scripts makefortprep.s or makeDATE.s
were present in the ../etc subdirectory of the untar'd dec.tar.Z,
but the resulting executables produced by these scripts WERE
in ../etc, namely "fortprep.x" and "DATE.x" respectively,
so all is temporarily well while SYSM pursues getting
the DEC-Alpha versions of these two DEC/OSF-compatible
compile/load scripts from NCEP; contact Lt Col Mitchell.)
6 - cd ../condata
7 - enter "unfmtmtnvar.s" (to produce binary fixed data file mtnvar.126)
(Note: I corrected line 4 of script to cite proper subdirectory
level and to change "fort..*" to "fort.*" in last line)
8 - enter "unfmttune.s" (to produce binary fixed data file tune1)
(Note: I corrected line 3 of script to cite proper subdirectory
level and to change "tune.0" to "tune.o" in last line)
9 - enter "unfmtsig126.s" (to produce binary initial file siganl126 )
(Note: I changed line 4 of the script to cite the "hardwired"
TORNADO subdirectory /home2/kem/rsm/daily_data discussed
earlier, to deal with limited available file space on TORNADO.)
10 - enter "makefile.co2con" (to create executable co2new.x)
11 - enter "makeco2con.s" (to produce binary fixed CO2 file co2con.28)
(NOTE: As of this writing, 20 Apr 97, neither of the machine-
dependent scripts "makefile.co2con" or "makeco2con.s" were
present in the ../condata subdirectory of the untar'd dec.tar.Z,
but the resulting binary fixed CO2 data file produced by
these scripts WAS in ../condata, namely "co2con.28",
so all is temporarily well while SYSM pursues getting
the DEC-Alpha versions of these two DEC/OSF-compatible
compile/load scripts from NCEP; contact Lt Col Mitchell.)
12 - cd ../1st96
13 - using an editor, inspect and if necessary change the file
named "DCLSYS" (which means "Declare System"). The purpose of
this parameter file is to specify whether the RSM will run on
a CRAY mainframe, an HP workstation, or a non-HP workstation.
This 9-line file looks like the following:
%DCL DEFINE WHICH MACHINE TO RUN;
%##CRA ='C-CRA'; %DCL CRAY CODE ;
%##HP ='C-HP'; %DCL WORKSTATION HP ;
%##AFA =' '; %DCL WORKSTATION ;
%DCL DEFINE MODEL PHYSICS DIMENSION FOR LOCAL;
% #ILOT = #IGRD12 ;
% #KLOT = #LEVR ;
% #ILOR = #IGRD12 ;
% #KSPH = #LEVR ;
The crucial items here are lines 2-4 above and the
single-quoted 4-5 character string after each equal sign.
The C-CRA and C-HP non-blank strings in lines 2-3 serve to
DISABLE those choices; the one line with a BLANK quoted string
(5 blanks) signifies the active configuration. Hence only one
blank string should be present, and in this non-HP workstation
(DEC-Alpha) case it belongs in line 4, the ##AFA line.
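A simple guard against the easy mistake of leaving zero or two blank strings enabled is to scan the three machine-selection lines. The sketch below is my own illustration (the helper name and sample strings are not part of the RSM suite):

```python
def check_dclsys(machine_lines):
    """machine_lines: the three %##... machine-selection lines of DCLSYS.
    Exactly one of them must carry the 5-blank quoted string."""
    enabled = [ln for ln in machine_lines if "'     '" in ln]
    if len(enabled) != 1:
        raise ValueError("exactly one machine line must hold a blank string")
    return enabled[0]

# For a DEC-Alpha (non-HP workstation), only the ##AFA line is blank:
dcl = ["%##CRA ='C-CRA'; %DCL CRAY CODE ;",
       "%##HP  ='C-HP'; %DCL WORKSTATION HP ;",
       "%##AFA ='     '; %DCL WORKSTATION ;"]
```

Running check_dclsys(dcl) returns the ##AFA line; passing a file with no blank string (or two) raises an error instead of letting the build proceed misconfigured.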
4.2) SET-UP MODEL DOMAIN LOCATION AND GRID DIMENSIONS/RESOLUTION
Next set up your desired RSM grid dimension and model domain location.
This is where the flexible relocatability is exercised. The model
grid dimensions are set via several parameters in the file "DCLRSM"
in directory ../rsm/dec/1st96. The model domain location is set
via several parameters in the file "LOCRSM" in the same directory.
PARAMETER FILE "DCLRSM":
File DCLRSM is a rather extensive declaration of about 60 RSM
parameters controlling various options of model physics, numerics,
outputs, and grid dimensions. In our present basic learning
exercise of a single non-nested coarse grid, we need only concern
ourselves with the following 4 lines in the 60-70 line file DCLRSM:
% ##G2R = ' '; %DCL GLOBAL TO REGIONAL NEST;
% ##C2R = 'C-C2R'; %DCL COARSE REGIONAL TO FINE REGIONAL;
% #IGRD = '64'; %DCL NUMBER OF I GRID INTERVALS;
% #JGRD = '65'; %DCL NUMBER OF J GRID INTERVALS;
The top two lines control whether this RSM run will be for a
"coarse" outer domain or for an inner "fine" nested domain.
In both lines, a quoted string of 5 blanks, i.e. ' ',
turns on that option and a string of 5 non-blanks disables
the option. The list above shows the coarse outer mode enabled
and the fine nested mode disabled.
Secondly, the "I" and "J" grid dimensions are given by IGRD and JGRD,
which must satisfy the following: the j-grid dimension must be
an odd integer (I chose 65), and the i-grid dimension must be
a product (2**n)(3**m), where the powers (n,m) can
be any integers 0,1,2,3,... etc, but not both zero (I chose 64 = 2**6).
The user must also set the "BGF" parameter to the ratio of the outer
grid spacing to the inner grid spacing, rounded to the nearest integer.
For example, if the outer nest is the set of T126 AVN coefficients
(dx = 106 km) and the RSM window is at 27 km, then BGF = 106/27, or
approximately 4.
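These constraints are easy to check programmatically. A sketch (the helper names are my own, not part of the RSM suite): JGRD must be odd, IGRD must factor entirely into powers of 2 and 3, and BGF is the rounded spacing ratio.

```python
def valid_igrd(n):
    """True if n is a product of powers of 2 and 3 (not both zero),
    i.e. n = (2**p)(3**q) with p, q >= 0 and n > 1."""
    if n < 2:
        return False
    for p in (2, 3):
        while n % p == 0:
            n //= p
    return n == 1

def valid_jgrd(n):
    """True if n is an odd integer."""
    return n % 2 == 1

def bgf(outer_dx, inner_dx):
    """BGF: outer-to-inner grid-spacing ratio, nearest integer."""
    return round(outer_dx / inner_dx)

# The document's choices: IGRD=64 (=2**6) and JGRD=65 both pass, and
# the 106-km outer / 27-km inner pairing gives BGF = round(3.93) = 4.
```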
NOTE 1: The RSM executable that results from above four DCLRSM
parameters can be used for different domains without recompiling
provided the User wants to continue to use the same grid dimensions
and a single non-nested domain.
NOTE 2: While file DCLRSM sets the RSM grid dimensions, the file
LOCRSM next sets the location and size of domain and grid spacing.
PARAMETER FILE "LOCRSM":
The file LOCRSM sets 18 parameter values, but for the simple
case of one non-nested RSM domain, only the first 9 of 18 have
to be set; the last 9 are set to zero.
The list of 18 LOCRSM parameters for a single non-nested N. American
domain roughly equivalent to the 50 nm RWM N. American fixed window
is given next:
RPROJ = 1.0 ,
RTRUTH = 60. ,
RORIENT = -80.0 ,
RDELX = 93400. ,
RDELY = 93400. ,
RCENTLAT = 40. ,
RCENTLON = -95. ,
RLFTGRD = 32. ,
RBTMGRD = 32. ,
CPROJ = 0. ,
CTRUTH = 0. ,
CORIENT = 0. ,
CDELX = 0. ,
CDELY = 0. ,
CCENLAT = 0. ,
CCENLON = 0. ,
CLFTGRD = 0. ,
CBTMGRD = 0. , where
RPROJ = the map projection choice flag
= 0.0 for Mercator
= 1.0 for N.H. polar stereographic
= -1.0 for S.H. polar stereographic
RTRUTH = the true latitude of the chosen map projection
= 60. for N.H. polar, -60. for S.H. polar
= typically +30. to -30. for Mercator
RORIENT = the longitude line parallel to y-axis of map proj.
= -80. (for 80 W) is typical of N.H. polar stereo
RDELX = the resolution grid spacing in meters
RDELY = set equal to RDELX
RCENTLAT = the latitude of the domain center point
RCENTLON = the longitude of the domain center point (any longitude
           works for polar and Mercator projections)
(RLFTGRD, RBTMGRD) = the grid location of the north pole
           (determined with a worksheet)
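The official worksheet for (RLFTGRD, RBTMGRD) is not reproduced here, but the geometry behind it can be sketched with the standard polar-stereographic radius formula. Everything below (function name, earth radius, and formula) is my own illustration, NOT the NCEP worksheet:

```python
import math

def pole_distance_grid_units(cent_lat_deg, true_lat_deg=60.0,
                             delx_m=93400.0, a_m=6371200.0):
    """Distance from the north pole to the domain center point, in grid
    intervals, using the standard N.H. polar-stereographic radius
    r = a * cos(lat) * (1 + sin(true_lat)) / (1 + sin(lat))."""
    lat = math.radians(cent_lat_deg)
    tru = math.radians(true_lat_deg)
    r = a_m * math.cos(lat) * (1.0 + math.sin(tru)) / (1.0 + math.sin(lat))
    return r / delx_m

# For the sample domain above (RCENTLAT=40., RDELX=93400.), the center
# point sits roughly 59 grid intervals from the pole; together with
# RORIENT, this fixes where the pole falls relative to the grid corner.
```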
4.3) COMPILING THE PREPROCESSOR, FORECAST MODEL, AND POSTPROCESSOR
While still in directory ../rsm/dec/1st96, inspect via editor
the contents of the compilation script named PCMPL. Around lines 36-50,
ensure that in the first three "echo" statements which route output to
the compiler-option file "cmpopt", all compilation options are set
to "yes" (i.e. no occurrences of the string "no_" should appear in
the first three 'echo "............" >cmpopt' lines).
Change any "no_" occurrences to "yes".
IMPORTANT!: In the fourth and last 'echo "yes " >cmpopt', just prior
to the line "read RUNFCST <cmpopt", change this "yes" to "no_", so
that the RSM model execution is NOT immediately kicked off at the
end of the PCMPL script.
Then save the changed PCMPL script, exit the editor, and
invoke compilation by entering the script name PCMPL.
When the PCMPL script finishes (it takes 10-20 minutes), cd
to its newly created directory ../rsm/run/dec/exec and invoke "ls"
and confirm that the following new executables exist there:
rloc.x - sets up and defines the RSM grid
rmtn.x - defines terrain heights on model grid
rinp.x - RSM preprocessor (defines initial conditions on RSM grid)
rbln.x - creates RSM lateral boundary conditions
rsm.x - the RSM forecast model executable
rpgbn.x - posts RSM to pressure levels
rsgbn.x - posts RSM to sigma levels
4.4) CREATING THE TERRAIN FIELD ON THE RSM GRID
While in directory ../rsm/dec/1st96, enter and invoke the script
PRMTN, AFTER deleting the "more stdout.rmtn" in second to last line
in the file (TORNADO hangs up on this "more" stmnt). This script
will create directory ../rsm/run/dec/rmtn wherein it will copy,
uncompress, and unformat the 10-min global Navy terrain data and
then it will run the executable ../rsm/run/dec/exec/rmtn.x, resulting
in five "rmtn..." terrain-related files used later in the RSM model
execution. Printed output created by the execution of rmtn.x is
found in file "stdout.rmtn" in ../rsm/run/dec/rmtn.
5.0 EXECUTING THE RSM FORECAST MODEL
In directory ../rsm/dec/1st96, inspect and set the RSM runtime
parameters in scripts PMAIN and PFCST, such as the forecast
length, the output time interval, and the lateral boundary
condition input interval, time step, etc, as follows:
In PMAIN, inspect around lines 10-15 for the block beginning with
the comment line "# RSM SETUP". Following that comment line,
set the parameters FHOUR (desired RSM forecast length in hours,
here 24 hours), INCHOUR (the time interval of input AVN lateral
boundary conditions, here 6 hours), and PRTHOUR (the desired output
interval in hours, here 6 hours).
IMPORTANT: Finally in script PMAIN, check the subdirectory
specified in the assignment of "DIRAVN" around line 11. Make
sure it is the directory path ending in the subdirectory
"daily_data" where you stored the compressed and formatted
AVN forecast output files.
Next, in script PFCST, set the time step parameter "dt_reg"
to the proper number of seconds. Some common values are:
For DELX grid spacing of 50000 ( 50-km), use dt_reg=300.
For DELX grid spacing of 100000 (100-km), use dt_reg=600.
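The two values quoted are consistent with a simple linear scaling of the time step with grid spacing. The sketch below is an assumed rule of thumb, not an official RSM formula; always confirm stability for your own domain:

```python
def dt_reg_seconds(delx_m):
    """Time step scaled linearly with grid spacing, matching the two
    reference points in the text: 50 km -> 300 s, 100 km -> 600 s.
    An assumed rule of thumb, not an official RSM formula."""
    return 300.0 * (delx_m / 50000.0)

# The 93.4-km sample domain scales to about 560 s; rounding down to a
# value that divides the output interval evenly is prudent.
```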
Finally, invoke the running of the RSM preprocessor,
RSM forecast model, and RSM postprocessors by keying
in the script PMAIN as follows:
nohup PMAIN >nohup.out
The script PMAIN internally invokes the subordinate scripts
PRINP, PFCST, and PPOST, in turn.
(NOTE: Initial execution attempt uncovered an error in PFCST in
the building of NAMELIST file "rfcstparm", which is read in later
as stdin Unit 5 by executable "rsm.x". The correction was to
edit out the first "echo" line in PFCST and move the "&NANSMF"
namelist label to the next line with "CON(1)", and modify
">>" to " >" in that line.)
The execution time of PMAIN for a 24-hour forecast on
a 64 x 65 grid at 93.4 km grid spacing with a 600-second time step
on the DEC-Alpha is estimated to be 2 hours (based on my RSM run
that aborted between the 6- and 12-hour forecasts).
6.0 EASY RE-EXECUTION OF THE RSM FOR ANY DOMAIN SIZE/LOCATION
If one chooses to keep the same RSM grid dimensions (here 64 x 65),
then it is trivial to re-define the size and location of the RSM
domain and re-run. Merely edit the straightforward handful of domain
size/location parameters in file LOCRSM, as described in Sec. 4.2.
Then, in turn, re-execute scripts:
A) PRMTN (after deleting all files in ../rsm/run/dec/rmtn)
B) PMAIN (after deleting all files in ../rsm/run/dec/dirrun
and in ../rsm/run/dec/output)
That is all there is to it!! (i.e. no recompiling)
If you DO change the grid dimensions or resolution, revisit the
following files (and then recompile):
1. LOCRSM - Set location parameters.
2. DCLRSM - Set grid dimensions (make sure they satisfy the
   constraints of Sec. 4.2), and BGF.
3. PFCST - Set model timestep.
7.0 RUNNING THE INNER NEST
For running a second window nested within the original regional
window, enter the directory "2nd96". Here you will find duplicates
of many of the routines that you set up to run the outer nest.
First, a directory must be created to receive the data. On the DEC, I use /home2/kem/rsm/2nd96_output.
Then, the routines in 2nd96 must be set. Probably the first one
that must be adjusted is "DCLRSM":
a. Set the values of #IGRD and #JGRD to the desired grid dimension values (remember the constraints).
b. Set CIGRD to IGRD+1, and CJGRD to JGRD+1, where IGRD and JGRD are the grid dimensions taken from the OUTER window (from DCLRSM in 1st96).
c. Set BGF to the ratio between the outer and inner grid spacings (to the nearest integer).
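Steps a-c above amount to simple arithmetic on the outer-window settings; a sketch (the helper name is my own):

```python
def inner_nest_params(outer_igrd, outer_jgrd, outer_dx, inner_dx):
    """CIGRD/CJGRD are the outer-window grid dimensions plus one; BGF is
    the outer-to-inner grid-spacing ratio, nearest integer."""
    cigrd = outer_igrd + 1
    cjgrd = outer_jgrd + 1
    bgf = round(outer_dx / inner_dx)
    return cigrd, cjgrd, bgf

# With the 64 x 65 outer window at 93.4 km and, say, a 27-km inner
# window: CIGRD=65, CJGRD=66, BGF=round(3.46)=3.
```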
Following this, get into the file "LOCRSM":
a. Set all the "C" values to those of the outer window.
b. Set all the "R" values to those of the inner window.
Enter the file "PMAIN":
a. Set DIRAVN to the output directory of the OUTER window (1st96_output).
b. Set DIRRSM to the new output directory you set up for the INNER
   window (2nd96_output).
This allows the code to read the output of the outer window as the
boundary conditions for the inner window.
c. Comment out the code in the section titled "prepare gsm data from
   daily fmt", and add the following lines:
cp $DIRAVN/r_sigf00 siganl
cp $DIRAVN/r_sfcf00 sfcanl
This will allow the code to use outer-window data (rather than AVN
or GSM data) to initialize the model.
d. Comment out the code in the section titled "RSM Forecast" (EXCEPT
   for the line $DIRHOME/PFCST), and add the following line:
cp $DIRAVN/r_sigf$FH sigf
Make sure this line is BEFORE the call to PFCST.
This will allow the code to use outer-window data (rather than AVN
or GSM data) as boundary conditions for the model.
Compile (PCMPL) and run just as usual!