3. POD development checklist¶
This section lists the steps required to submit a POD for inclusion in the MDTF framework.
3.1. Code and documentation submission¶
The material in this section must be submitted through a pull request to the NOAA-GFDL GitHub repo, as described in Git-based development workflow.
The example POD should be used as a reference for how each component of the submission should be structured.
Your POD feature branch must be up to date with the NOAA-GFDL main branch, with no outstanding merge conflicts. See Git-based development workflow for instructions on syncing your fork with NOAA-GFDL and pulling updates from the NOAA-GFDL main branch into your feature branch.
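For orientation, a minimal sketch of that sync (the remote names origin/upstream and the feature-branch name are assumptions; the Git-based development workflow page is authoritative):

% git remote add upstream https://github.com/NOAA-GFDL/MDTF-diagnostics.git
% git fetch upstream
% git checkout feature/my_pod
% git merge upstream/main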
POD source code¶
All scripts should be placed in a subdirectory of diagnostics/. Among the scripts, there should be 1) a main driver script, 2) an html template, and 3) a settings.jsonc file. The POD directory and the html template should be named after your POD's short name. For instance, diagnostics/convective_transition_diag/ contains its driver script convective_transition_diag.py, its html template convective_transition_diag.html, and its settings file settings.jsonc. The framework calls the driver script, which in turn calls the other scripts in the same POD directory.
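As a sketch of what a driver script looks like (a minimal illustration, not a definitive implementation: the environment variable names and the data-path convention below are assumptions to verify against the example POD):

# example_pod.py -- minimal driver script sketch.
# Env var names (WORK_DIR, DATADIR, CASENAME) and the file layout are
# illustrative; the framework passes paths and case information to the
# POD via environment variables -- check the example POD for the real set.
import os
import matplotlib
matplotlib.use("Agg")  # write figures without a display
import matplotlib.pyplot as plt
import xarray as xr

work_dir = os.environ["WORK_DIR"]   # per-run output directory
data_dir = os.environ["DATADIR"]    # preprocessed model data
case = os.environ["CASENAME"]       # name of the model run being analyzed

# Read one model variable and produce one figure for the html template.
ds = xr.open_dataset(os.path.join(data_dir, "day", case + ".pr.day.nc"))
ds["pr"].mean(dim="time").plot()
plt.savefig(os.path.join(work_dir, "example_fig.png"))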
If you need a new Conda environment, add a new .yml file to src/conda/, and install the environment using the conda_env_setup.sh script as described in Getting Started.
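For a quick sanity check that your .yml file builds before going through the framework's installer (the file name here is hypothetical), plain conda works:

% conda env create -f src/conda/env_example_pod.yml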
POD settings file¶
The format of this file is described in POD settings file summary and in more detail in Diagnostic settings file format.
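As an abridged sketch of the layout (all field values here are illustrative; the pages above define the authoritative schema):

// settings.jsonc sketch -- abridged, illustrative values only
{
  "settings": {
    "driver": "example_pod.py",
    "long_name": "Example diagnostic",
    "realm": "atmos",
    "runtime_requirements": {"python3": ["numpy", "xarray", "matplotlib"]}
  },
  "data": {"frequency": "day"},
  "dimensions": {
    "lat": {"standard_name": "latitude"},
    "lon": {"standard_name": "longitude"},
    "time": {"standard_name": "time"}
  },
  "varlist": {
    "pr": {
      "standard_name": "precipitation_flux",
      "units": "kg m-2 s-1",
      "dimensions": ["time", "lat", "lon"]
    }
  }
}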
POD html template for output¶
The html template will be copied by the framework into the output directory to display the figures generated by the POD. You should be able to create a new html template by simply copying and modifying the example templates from existing PODs, even without prior knowledge of html syntax.
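For orientation, a fragment of what such a template contains (a sketch only; the {{CASENAME}}-style placeholders are substituted by the framework at run time, and the exact markup should be copied from an existing POD):

<h3>Example diagnostic: {{CASENAME}}</h3>
<p><a href="doc/example_pod.html">Full documentation</a></p>
<img src="example_fig.png">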
Preprocessing scripts for digested data¶
The “digested” supporting data policy is described in Section 1.2.
For maintainability and provenance purposes, we request that you include the code used to generate your POD’s “digested” data from raw data sources (any source of data that’s permanently hosted). This code will not be called by the framework and will not be used by end users, so the restrictions and guidelines concerning the POD code don’t apply.
POD documentation¶
The documentation for the framework is automatically generated using Sphinx, which works with files in reStructuredText (reST, .rst) format. In order to include documentation for your POD, we require that it be in this format. Use the example POD documentation as a template for the information required for your POD by modifying its .rst source code. This should include a one-paragraph synopsis of the POD, developers' contact information, the required programming language and libraries, the required model output variables, a brief summary of the presented diagnostics, and references in which more in-depth discussions can be found.
The .rst file and all linked figures should be placed in a doc subdirectory under your POD directory (e.g., diagnostics/convective_transition_diag/doc/). The most convenient way to write and debug reST documentation is with an online editor; we recommend https://livesphinx.herokuapp.com/ because it also recognizes Sphinx-specific commands.
For reference, see the reStructuredText introduction, quick reference, and in-depth guide.
Also see a reST syntax comparison to other text formats you may be familiar with.
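A bare-bones sketch of the expected .rst layout (section titles follow the items listed above; all names and values are placeholders, and the example POD's documentation source remains the authoritative template):

Example Diagnostic
==================

Synopsis: a one-paragraph description of what the POD measures and why.

Version & contact info
----------------------

- Version 1.0; PI: Jane Doe (Example University, jdoe@example.edu)

Required programming language and libraries
-------------------------------------------

Python 3, with numpy, xarray, and matplotlib.

Required model output variables
-------------------------------

- pr: daily precipitation flux (kg m-2 s-1)

References
----------

(citations in which more in-depth discussions can be found)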
For maintainability, all scripts should be self-documenting, with in-line comments. The main driver script (e.g., convective_transition_diag.py) should contain a comprehensive header providing the same information as the POD documentation, except for the "More about this diagnostic" section (a sketch of such a header follows this paragraph). The one-paragraph POD synopsis, along with a link to the full documentation, should be placed at the top of the html template (e.g., convective_transition_diag.html).
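For instance, the header of the driver script might read (contents abbreviated and hypothetical; mirror the items in your own POD documentation):

# ======================================================================
# example_pod.py
#
# Example Diagnostic
#
# Synopsis: one-paragraph description, identical to the POD documentation.
# Version & contact: v1.0; Jane Doe, Example University (jdoe@example.edu)
# Language/libraries: Python 3; numpy, xarray, matplotlib
# Model output variables: pr (daily precipitation flux, kg m-2 s-1)
# References: as listed in doc/example_pod.rst
# ======================================================================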
Preprocessing script documentation¶
The “digested” supporting data policy is described in Section 1.2.
For maintainability purposes, include in the doc directory all the information needed for a third party to reproduce your POD's digested data from its raw sources. This information is not published on the documentation website and can be in any format. In particular, please document the raw data sources used (DOIs/versioned references preferred) and the dependencies/build instructions (e.g., a conda environment) for your preprocessing script.
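For example, a minimal description file (the file name doc/preprocessing_README.txt and its contents are hypothetical) might record:

Raw sources: reanalysis daily wind fields (DOI or versioned reference)
Environment: Python 3; conda environment spec in env_preprocess.yml
Steps: run preprocess_obs.py to regrid, compute EOFs, and write the
       digested NetCDF files placed under inputdata/obs_data/$pod_name/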
3.2. Sample and supporting data submission¶
Data hosting for the MDTF framework is currently managed manually. The data is hosted via anonymous FTP on UCAR’s servers.
Digested observational or supporting data¶
Create a directory under inputdata/obs_data/ named after the short name of your POD, and put all your digested observational data in it (or, more generally, any quantities that are independent of the model being analyzed). The "digested" data policy is described in Section 1.2.
Requirements:

- Digested data should be in the form of numerical data, not figures.
- The data files should be small (preferably a few MB) and just enough for producing figures for model comparison. If you really cannot reduce the data size and your POD requires more than 1 GB of space, consult with the lead team.
- Include in the directory a "README.txt" description file with original source info.
- Include in the directory any necessary licensing information, files, etc. (if applicable).
Create a tar file of your obs_data directory:

- Use the --hard-dereference flag so that all users can read your file.
- Naming convention: $pod_name.yyyymmdd.tar, where yyyymmdd is the file creation date. Alternatively, you may use some other version tag to allow the framework to check compatibility between the POD code and the data provided.
- Create the tar file from within the obs_data directory so that the archived paths start with your POD's name (as in the example and check below).
- Example (c-shell):
set pod_name = MJO_suite
set tartail = `date +'%Y%m%d'`
cd inputdata/obs_data
tar cfh $pod_name.$tartail.tar --hard-dereference $pod_name
To check:
% tar tf $pod_name.$tartail.tar
MJO_suite/
MJO_suite/ERA.v200.EOF.summer-0.png
MJO_suite/ERA.u200.EOF.summer-1.png
After following the above instructions, please refer to the GitHub Discussion on transferring obs_data, email Dani Coleman at bundy at ucar dot edu, or contact your liaison on the MDTF Leads Team.
Files will be posted for guest/anonymous access at ftp://ftp.cgd.ucar.edu/archive/mdtf/obs_data_latest/{$pod_name}.latest.tar, with 'latest' pointing to the date- or version-tagged tar file.
Note that, prior to version 3, obs_data from all PODs was consolidated in one tar file. To improve usability as the number of PODs grows, the files are now available individually, with the responsibility for creating the tar files falling on the developer.
Sample model data¶
For PODs dealing with atmospheric phenomena, we recommend that you use sample data from the following sources, if applicable: