The Real-Time Mesoscale Analysis (RTMA) system, recently
developed at NCEP and ESRL, is intended to enhance existing analysis
capabilities and to generate CONUS-scale analyses of various weather
elements for the National Digital Forecast Database. The main
component of the RTMA is the 5 km resolution GSI-2dVar hourly analysis
of surface data, including high-density mesonet observations.
Quality control (QC) plays a critical role in the RTMA, especially in
the assimilation of the often poorly verified mesonet data. Indeed, it
is the lack of effective QC methods that often discourages the use of
these data in some analysis systems, despite their desirable
characteristics of high spatial density and temporal resolution. In
this research, we explore the use of the “gross check” and variational
QC as a way of generating dynamic lists of poor-quality mesonet data
to be excluded from the analysis. In our experiments, the analysis is
validated against pre-selected sets of observations that are withheld
from the assimilation system, a method known as cross-validation. In
this talk, we also discuss the use of the space-filling Hilbert curve
as an effective means of generating geographically representative sets
of cross-validation data, given the highly inhomogeneous surface data
distribution of the RTMA. In addition, we show how cross-validation
can be used as a tool for finding the optimal parameter values of the
anisotropic background error covariance model of the analysis system.
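To illustrate the gross check in its generic form, the sketch below flags an observation when its innovation (observation minus background) exceeds a multiple of the expected innovation spread, and collects the failing stations into a dynamically generated reject list. The threshold value, error magnitudes, and function names here are illustrative assumptions, not the RTMA's actual settings.

```python
import math

def gross_check(obs, bkg, sigma_o, sigma_b, c=3.0):
    # Accept the observation when |obs - bkg| is within c standard
    # deviations of the expected innovation spread, which combines
    # the observation and background error standard deviations.
    spread = math.sqrt(sigma_o ** 2 + sigma_b ** 2)
    return abs(obs - bkg) <= c * spread

def reject_list(ids, obs, bkg, sigma_o, sigma_b, c=3.0):
    # Station ids whose observations fail the gross check form a
    # dynamic exclusion list for the analysis.
    return [i for i, o, b in zip(ids, obs, bkg)
            if not gross_check(o, b, sigma_o, sigma_b, c)]
```

Variational QC differs in that, rather than rejecting outright, it down-weights suspect observations inside the analysis cost function via a non-Gaussian observation error model.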
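The Hilbert-curve selection of cross-validation stations can be sketched as follows: map each station to a cell of a 2^k-by-2^k grid, order the stations by their position along the Hilbert curve through that grid, and withhold every k-th station in that ordering. Because nearby points on the curve are nearby in space, a regular stride through the curve yields a geographically representative subset even from a highly inhomogeneous station distribution. The grid order and stride below are illustrative choices, not the values used in the RTMA.

```python
def xy2d(n, x, y):
    # Distance of grid cell (x, y) along the Hilbert curve covering
    # an n-by-n grid, where n is a power of two.
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect the quadrant so sub-curves are oriented consistently.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

def withhold_stations(stations, every=10, order=8):
    # Sort (lon, lat) stations along a Hilbert curve over their bounding
    # box and withhold every `every`-th one as cross-validation data.
    n = 2 ** order
    lons = [s[0] for s in stations]
    lats = [s[1] for s in stations]
    lon0, lon1 = min(lons), max(lons)
    lat0, lat1 = min(lats), max(lats)
    def cell(v, v0, v1):
        # Scale a coordinate into an integer grid index in [0, n-1].
        return min(n - 1, int((v - v0) / (v1 - v0 + 1e-12) * n))
    ranked = sorted(stations,
                    key=lambda s: xy2d(n, cell(s[0], lon0, lon1),
                                       cell(s[1], lat0, lat1)))
    return ranked[::every]
```

The stride parameter controls the fraction withheld; striding along the curve, rather than sampling stations at random, prevents the withheld set from clustering in densely observed regions.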