PI-WRF Dataset Processing
I've pushed my wget script and initial summary here: https://code.usgs.gov/wma/nhgf/geo-data-portal/thredds-config/-/tree/master/mnt_thredds/PI-WRF
Data are in S3 here: s3://nhgf-development/thredds/PI-WRF/
Closure criteria: the content on S3 has been converted into one or more Zarr datasets as appropriate, and a plan for documentation in STAC and ScienceBase has been made through follow-up planning with @ellenbrown and others as needed.
Relevant comms:
For our project, Near-term Climate Projections to Inform Adaptation in the Hawaiian Islands,
we have prepared two sets of data products based on WRF model simulations for the Hawaiian Islands region, which we want to post on the USGS GeoData Portal (GDP).
We have already put these datasets into two separate repositories. One contains daily model output:
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/IYVPAY
The other contains seasonal-mean model output:
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/CVIS05
The NetCDF data were created with standard Python support packages and should be able to be imported into your data servers (e.g., an OPeNDAP server).
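A quick sanity check that the files open with standard tooling could look like the sketch below; the filename is hypothetical, and it assumes xarray with a netCDF backend installed. A stand-in file is created so the check runs without the Dataverse download:

```python
import os
import tempfile

import xarray as xr

def summarize(nc_path):
    """List variable names and their dimensions, the kind of check a data
    server operator might run before publishing a file."""
    with xr.open_dataset(nc_path) as ds:
        return {name: ds[name].dims for name in ds.data_vars}

# Create a tiny stand-in file in place of a real PI-WRF download:
path = os.path.join(tempfile.mkdtemp(), "sample.nc")
xr.Dataset({"RAIN": (("time",), [0.0, 1.5])}).to_netcdf(path)
info = summarize(path)
```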
@dblodgett's initial take:
I pulled down some sample files and this looks pretty straightforward. The documentation for the climatologies will be a little tricky with so many unique variables, but that's workable with the metadata that already exist.
I then pulled the data down and pushed it to the S3 location linked above:
I got the data downloaded (94G in 208 files), which appears to match what's on the Dataverse site.
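The verify-and-upload step can be sketched as below. The local directory name is a placeholder, and a throwaway directory stands in for the real download so the count check is demonstrable; the actual `aws s3 sync` requires configured credentials with write access to the bucket, so it is only echoed here:

```shell
# Stand-in for the wget output directory (the real one held 208 files, ~94G):
dir=$(mktemp -d)/PI-WRF
mkdir -p "$dir"
touch "$dir/file1.nc" "$dir/file2.nc"

# Verify the file count against the reported total before uploading.
count=$(find "$dir" -type f | wc -l)
echo "files: $count"

# Mirror the local copy to the S3 prefix; --dryrun previews the transfer.
echo aws s3 sync "$dir" s3://nhgf-development/thredds/PI-WRF/ --dryrun
```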