Complete Examples of Evaluation Declarations
WRES performs an evaluation by following instructions provided by a user in an evaluation project declaration file, as described in the Declaration language.
The following sections provide examples of project declarations, instructions to execute those examples through the current WRES web-service landing page, a link to the current project declaration schema, and support information.
To understand how to declare evaluations for the WRES, we have provided the following examples. In each case, the linked project declaration file is annotated with comments explaining the different aspects of the declaration. A minimal sketch of the overall shape of a declaration is shown below the table.
| # | Description | Link |
| --- | --- | --- |
| 1 | Ensemble forecasts evaluated against observations, all in PI-timeseries XML format, with the forecasts provided in a gzipped tarball. A baseline of ESP forecasts is included and used to calculate skill scores. This example uses one year of an HEFS Baseline Validation data set. | WRES Project Declaration Example 1 |
| 2 | Single-valued, operational streamflow forecasts evaluated against observations provided in an NWS datacard file, including a comparison against persistence forecasts generated from the same datacard observations. The evaluation is performed at a 24-hour, mean desired time scale, with lead times pooled into windows that are adjacent, non-overlapping, and 24 hours wide. Because the lead times are restricted to the first 48 hours and a time offset is needed to align the observations and forecasts at the 24-hour time scale, there is only one lead time in the evaluation output. | WRES Project Declaration Example 2 |
| 3 | Evaluation of operational stage forecasts against corresponding observations through a contingency table, created for various stage thresholds and probability classifiers that determine when an ensemble forecast is considered to have forecast an exceedance of a threshold. This example illustrates how to supply thresholds in support of metric computation and how to temporally rescale the data to 4-day maximums (peak stages). | WRES Project Declaration Example 3 |
| 4 | Evaluation of single-valued forecasts against observations through a time-series analysis, including the metrics “time to peak error” and “time to peak relative error”. The data backing this evaluation is available in the repository at https://github.com/NOAA-OWP/wres/tree/master/systests/smalldata. Clone the repository or download the data, and modify the declaration accordingly. | WRES Project Declaration Example 4 |
| 5 | An example that uses the USGS NWIS, NWM channel routing NetCDF output files, and the WRDS AHPS forecast data services as data sources. | WRES Project Declaration Example 5 |
Additional examples are under construction.
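For orientation, the sketch below shows the overall shape of an XML project declaration: sources for the left (observed) and right (predicted) data, pairing options, metrics, and output formats. The element and attribute names here are illustrative assumptions based on the annotated examples, not an authoritative reference; consult the linked examples and the schema below for the exact structure.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal, hypothetical sketch of a WRES project declaration.
     Element names are illustrative; verify against the schema. -->
<project name="example_evaluation">
    <inputs>
        <left>
            <!-- Observations to evaluate against -->
            <type>observations</type>
            <source>observations.xml</source>
            <variable>QINE</variable>
        </left>
        <right>
            <!-- Forecasts to be evaluated -->
            <type>ensemble forecasts</type>
            <source>forecasts.tgz</source>
            <variable>SQIN</variable>
        </right>
    </inputs>
    <pair>
        <!-- Measurement unit and geographic feature for pairing -->
        <unit>CMS</unit>
        <feature left="DRRC2" right="DRRC2" />
        <leadHours minimum="0" maximum="48" />
    </pair>
    <metrics>
        <metric><name>mean error</name></metric>
    </metrics>
    <outputs>
        <destination type="csv" />
    </outputs>
</project>
```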
Additionally, the schema for the XML in which the above examples are written, including descriptions of various options (more descriptions will be added in the future), can be found here (for WRES version 6.14).
This can be used as guidance for identifying all of the different options available as part of a WRES evaluation project declaration. For further information, see The WRES declaration schema.
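As an illustration of the kinds of options the schema documents, Example 2 above declares a 24-hour, mean desired time scale and adjacent, non-overlapping, 24-hour lead-time pools. A fragment of such a declaration might look like the sketch below; the element names are assumptions to be verified against the schema and the annotated example.

```xml
<pair>
    <!-- Rescale both datasets to a 24-hour mean before pairing -->
    <desiredTimeScale>
        <function>mean</function>
        <period>24</period>
        <unit>hours</unit>
    </desiredTimeScale>
    <!-- Pool lead times into adjacent, non-overlapping, 24-hour windows -->
    <leadTimesPoolingWindow>
        <period>24</period>
        <unit>hours</unit>
    </leadTimesPoolingWindow>
</pair>
```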
If support is needed at any time, please post a GitHub issue at https://github.com/NOAA-OWP/wres/issues/new, and we will address it as soon as possible.