Water Mission Area issues
https://code.usgs.gov/groups/wma/-/issues (updated 2024-03-28T19:10:09Z)

**[Remove hard coding of PYGEOAPI_VERSION_TAG](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/22)** (Sarah Jordan, 2024-03-28T19:10:09Z)

In [MR #91](https://code.usgs.gov/wma/nhgf/pygeoapi/-/merge_requests/91), one of the changes we had to make to re-deploy was hard-coding `PYGEOAPI_VERSION_TAG` as `0.16.dev0+limno.1-beta`, because using the GitHub API to access the latest release was failing. Hard-coding the release is a fine short-term fix, but we should come up with a better strategy for defining this variable.

**[Update CI/CD Pipeline](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/16)** (Paul Tomasula, 2024-03-28T18:44:01Z)

## Background
Switching to a single multi-stage Dockerfile (#14) and exposing additional environment information (#5) will require updates to the CI/CD pipeline.
## Closure Criteria
* CI/CD pipeline has been updated to reflect building off a single Dockerfile (see issue #14).
* New features (and some currently undocumented features) of the CI/CD pipeline are documented.
* CI/CD pipeline passes.

Milestone: Builder Refactor for Improved Clarity and Ease of Use. Assignee: Paul Tomasula.

**[GWTC Slides](https://code.usgs.gov/wma/nhgf/mainstems/-/issues/94)** (Blodgett, David L., 2024-03-25T17:04:32Z)

[AWRA_gwtc_2024_hydrofabric.pptx](/uploads/f7d9230076df12a005ce422f186831bd/AWRA_gwtc_2024_hydrofabric.pptx)

**[downgrade to pydantic<2.0](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/25)** (Sarah Jordan, 2024-03-22T20:14:45Z)

pygeoapi recently downgraded to pydantic 1 as discussed [here](https://github.com/geopython/pygeoapi/issues/1573). When we bring in the latest upstream changes, we will need to downgrade to pydantic 1. I think this will mainly consist of changing the versions of gdptools we import.

**[add ability to limit precision of output from agg_writer](https://code.usgs.gov/wma/nhgf/toolsteam/gdptools/-/issues/42)** (Wieczorek, Michael E., 2024-03-15T18:27:19Z)

Add the ability to limit the precision of output from agg_writer. The decimal places default to 14, making file sizes very large.

**[Update test suite to reflect new repo structure](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/15)** (Paul Tomasula, 2024-03-14T13:40:20Z)

## Background
The work in issues #5 and #14 will significantly alter the structure of the current repository. This issue is to define an explicit requirement that the test suite be updated and passing prior to closing the milestone.
## Closure Criteria
* The existing (or updated) test suite passes both locally and in the CI/CD pipeline.

Milestone: Builder Refactor for Improved Clarity and Ease of Use. Assignee: Kieran Bartels.

**[NHDPlusHR Crosswalk with NHDPlusMR](https://code.usgs.gov/wma/nhgf/mainstems/-/issues/93)** (Blodgett, David L., 2024-03-13T15:04:48Z)

Plan to create two tables of hydrolocations that relate inlet and outlet nodes to flowlines in the other dataset.
|id|type|mainstem|reference_id|reference_id_measure|
|--|--|--|--|--|
|identifier of point feature|type of point feature|mainstem id|id of reference feature|measure along reference feature|
* type: inlet, outlet
* reference_id_measure: 0 at inlet, 100 at outlet

Process requirements:

* inlet and outlet locations with known mainstems
* full flowline geometry from the reference dataset with known mainstems
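The hydrolocation schema above can be sketched as a small table. This is a pandas sketch in which all ids and mainstem values (`hl-1`, `hl-2`, `111`, `9001`, `9002`) are made up purely for illustration:

```python
import pandas as pd

# Hydrolocation table following the schema above; values are illustrative only.
hydrolocations = pd.DataFrame(
    [
        # inlets sit at measure 0 on their reference flowline
        {"id": "hl-1", "type": "inlet", "mainstem": 111,
         "reference_id": 9001, "reference_id_measure": 0},
        # outlets sit at measure 100
        {"id": "hl-2", "type": "outlet", "mainstem": 111,
         "reference_id": 9002, "reference_id_measure": 100},
    ],
    columns=["id", "type", "mainstem", "reference_id", "reference_id_measure"],
)

# Encode the stated conventions as sanity checks.
assert set(hydrolocations["type"]) <= {"inlet", "outlet"}
inlets = hydrolocations["type"] == "inlet"
assert (hydrolocations.loc[inlets, "reference_id_measure"] == 0).all()
```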
Strategy: Build process input lists from each data source based on mainstem id. First build out cross-region paths then iterate through hu04.
Implementation: The nhdplushr workflow will generate all process artifacts, as iteration through HU04s will be easiest there.

Assignee: Blodgett, David L.

**[Consolidate NHDPlusHR network fixes in the same way as NHD was done](https://code.usgs.gov/wma/nhgf/mainstems/-/issues/91)** (Blodgett, David L., 2024-03-13T01:57:52Z)

The network changes are all still very ad hoc and need to be consolidated into a table form.

Assignee: Blodgett, David L.

**[Create reference network v1, v2, v3, mainstem id mapping table](https://code.usgs.gov/wma/nhgf/mainstems/-/issues/92)** (Blodgett, David L., 2024-03-13T01:57:29Z)

Need to be able to cross-validate and track levelpath across reference network releases.

**[Add --no-fail-on-invalid-collection flag to builder](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/24)** (Sarah Jordan, 2024-03-12T19:00:33Z)

The upstream pygeoapi now includes the optional flag `--no-fail-on-invalid-collection`. When used, the OpenAPI document will be generated without the failing collection(s); skipped datasets will be logged along with the reason for failure.
- [ ] Update LimnoTech fork to include this change and issue a new release
- [ ] Add the flag to `entrypoint.sh`
I propose waiting to do this until we have resolved [#1578](https://github.com/geopython/pygeoapi/issues/1578) so that more of the USGS datasets will be included.

Assignee: Sarah Jordan.

**[GDP - Provider Throwing an Error](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/23)** (Paul Tomasula, 2024-03-12T19:00:32Z)

# Background
@rmcd @ahopkins1 and I encountered an issue when running the GDP container built from the current main branch. It seems like one (or possibly more) of the resources that use the xarray provider in pygeoapi are missing an epsg_code, causing the provider to fail in generating the openapi.yml file.
I think there are at least two components to this issue.
1. The xarray provider in pygeoapi should be updated to fail gracefully, issue/log a warning for the offending resource, and then continue to the next one. That would make it easier to identify the data source that needs to be corrected, and would also ensure that a single invalid dataset doesn't prevent the whole document from building.
2. We need to identify the invalid dataset(s) and add the missing epsg_code.
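A minimal sketch of the fail-gracefully behavior described in point 1, using a stand-in collection builder rather than the actual pygeoapi provider code (the resource names and configs here are hypothetical):

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("openapi-builder")

# Hypothetical resource configs; "bad_dataset" is missing its epsg_code,
# mirroring the failure described above.
resources = {
    "good_dataset": {"epsg_code": 4326, "path": "good.zarr"},
    "bad_dataset": {"path": "bad.zarr"},
}

def build_collections(resources):
    """Build collection entries, logging and skipping invalid resources
    instead of letting one bad dataset abort the whole openapi.yml build."""
    collections = {}
    for name, cfg in resources.items():
        try:
            crs = f"EPSG:{cfg['epsg_code']}"  # KeyError when epsg_code is absent
        except KeyError as err:
            log.warning("Skipping resource %r: missing %s", name, err)
            continue
        collections[name] = {"crs": crs, "path": cfg["path"]}
    return collections

built = build_collections(resources)
```

With this shape, the invalid resource is logged and dropped, and `built` contains only the valid collection.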
# Closure Criteria
The pygeoapi fork and/or the GDP config has been updated to allow the container to initialize.

Assignee: Sarah Jordan.

**[Erroneous Topology in Refactor 01, Sciencebase issues](https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/issues/140)** (Mike Johnson, 2024-03-08T20:29:19Z)

Using the latest refactor_01.gpkg I have _locally_, there is a topology issue on Mainstem/LevelPath 2111164.
While all flowpaths are "off" on this mainstem, the most visible in isolation is that the pink flowpath (ID = 10018009) flows to the blue one (ID: 10040690).
![Screenshot_2023-11-21_at_2.18.34_PM](/uploads/becf62b116c23dab91d0bf4c76e76823/Screenshot_2023-11-21_at_2.18.34_PM.png)
I wanted to make sure my data is up to date with ScienceBase (I am pretty sure it is); however, I get this when trying to download the file (I am signed in).
![image](/uploads/9ac4cfcbd86dc0de18140bfd2f7e65c9/image.png)

Assignee: Mike Johnson.

**[Suggested change for Reference Topology 10U](https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/issues/141)** (Mike Johnson, 2024-03-08T20:28:48Z)

**Where does it occur**: VPU 10U right on the Idaho border. Mainstem ID: 1881305
All images come from the `refactor_10U.gpkg`
![image](/uploads/a2aa6a9d5eaf5feef08af94415ae6042/image.png)
**What is the issue**: There is an artificial path that is only half represented in the NHD flowlines (as seen from the OSM basemap).
The lower portion of the artificial path seems to be the primary divergence (and the toID) coming out of Mainstem: 1881306 and does NOT have an associated divide.
![image](/uploads/1997504eec98d8c8571198153f5a074c/image.png)
Given only part of the artificial path is represented, I think it would be more appropriate for ID:10010387 to flow to ID:10010388 (mainstem 1881303), and ID:10010386 to be removed.
What do you think?

Assignee: Mike Johnson.

**[Suggested change for Reference Topology 10L](https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/issues/143)** (Mike Johnson, 2024-03-08T20:28:15Z)

**Where does it occur**: VPU 10L near the North Platte River Authority Airport. Mainstem ID: 1107638 (ID: 10023804)
All images come from the `refactor_10L.gpkg`
![image](/uploads/cbe46a427181bc3cb882002aee71966e/image.png)
**What is the issue**: There is a flowpath (red) that is part Fremont Slough AND part natural channel (South Platte Tributary). The natural channel is the mainstem (1107638); however, the topology flows to the slough, which exits the flowpath mid-segment.
This leads to one of the only cases where a mid-segment exit occurs, and looking at the context it seems incorrect, although no simple solution may be ideal:
## **Possible solutions**:
1. Change the toID of ID:10023804 to 10023793. This would continue the flow on the mainstem and drain to the Platte.
2. Change the mainstem of 10023804 to 1142381. This would place this segment on the Fremont Slough.
3. Split the flowline in some way to have perpendicular flows running along the South Platte Tributary and the Fremont Slough, where the latter is an artificial flowline with no divide.
What do you think?

Assignee: Mike Johnson.

**[Enable downloading subsets for certain operations](https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/issues/134)** (Ross, Jesse C, 2024-03-08T19:36:34Z)

Currently, it's hard to know which data is necessary in order to run the pipeline. The `00_get_data.Rmd` script tries to download a whole lot of data. This is a barrier to entry for running the pipeline. Anybody who doesn't have at least 100 GB to spare will run out of space, and the downloads take a long time.
It might be nice to make it clear how to winnow this down. This could start with a configuration section specifying which types of data are necessary, and for which regions. It would also be nice for regional modelers to be able to download a single VPU for their region, rather than the entire seamless lower-48 hydrofabric. I don't yet know enough about the use cases to get into details, but as we develop the pipeline into a product that modelers can use themselves, we will want to put some thought into this. Just making a note for now.
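One possible shape for that configuration section, sketched in Python. The data-type names, VPU codes, sizes, and the manifest itself are all hypothetical, made up for illustration (the real pipeline is driven from `00_get_data.Rmd`):

```python
# Hypothetical download manifest: (data_type, vpu, size_gb).
MANIFEST = [
    ("flowlines", "10U", 2.5),
    ("flowlines", "10L", 2.1),
    ("catchments", "10U", 4.0),
    ("elevation", "10U", 55.0),
]

# User-editable configuration: only fetch what this run actually needs.
config = {"data_types": {"flowlines", "catchments"}, "vpus": {"10U"}}

def select_downloads(manifest, config):
    """Winnow the full download list to the requested data types and VPUs."""
    return [
        (dtype, vpu, size)
        for dtype, vpu, size in manifest
        if dtype in config["data_types"] and vpu in config["vpus"]
    ]

wanted = select_downloads(MANIFEST, config)
total_gb = sum(size for _, _, size in wanted)
```

A manifest like this would also let the script report the expected download size up front, before anyone runs out of disk.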
CC @mewieczo

Assignee: Ross, Jesse C.

**[TODOs in HI Navigate](https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/issues/147)** (Blodgett, David L., 2024-03-07T21:41:19Z)

Some stuff is a bit ragged -- someone needs to chase down what's right and wrong and get it fixed up.
Search for "TODO" in https://code.usgs.gov/wma/nhgf/reference-hydrofabric/-/blob/main/workspace/02_HI_navigate.Rmd?ref_type=heads

**[Update base network as combination of enhd, nhdplushr, and merit](https://code.usgs.gov/wma/nhgf/mainstems/-/issues/87)** (Blodgett, David L., 2024-02-15T14:46:40Z)

Need to stitch together and normalize everything into a consistently structured reference fabric.
Planning for this work to go in this issue. Can be closed once the workflow and a ready-for-review gpkg containing the complete set of flowlines are ready.
Include attributes:
id, toid (dendritic), fromnode, tonode, divergence, name, lengthkm, totdasqkm, mainstem -- others?

Assignee: Blodgett, David L.

**[Create NHGFStacData class](https://code.usgs.gov/wma/nhgf/toolsteam/gdptools/-/issues/41)** (Hopkins, Anders L, 2024-02-09T19:11:10Z)

Create a new data class for reading in gridded data from the NHGF STAC catalog.

Assignee: Hopkins, Anders L.

**[Create STAC catalog example notebook](https://code.usgs.gov/wma/nhgf/toolsteam/gdptools/-/issues/40)** (Hopkins, Anders L, 2024-02-08T15:12:25Z)

Create a use case example notebook of gdptools pulling data from a STAC catalog. We may have to edit gdptools to read in zarr data.

Assignee: Hopkins, Anders L.

**[Improve pipeline workflow](https://code.usgs.gov/wma/nhgf/pygeoapi/-/issues/2)** (Grahn, Ethan Kalob, 2024-02-06T19:05:55Z)

The current pipeline carries a lot of overhead in unnecessary image builds and only supports one merge request at a time. All base images (e.g. nox, server base, etc.) should only be rebuilt if the associated files are changed; adding rules to the pipeline will allow us to accomplish this. Currently, merge requests build their images into the same container registry tag, so if there are multiple merge requests open there will be potential conflicts over who is building what. Having merge requests build their own server images, and adding tag expiration rules, will help us maintain independent branches while still minimizing container storage.