Commit bcd93669 authored by Powers, Peter M.

Merge branch 'doc-edits' into 'main'

Doc edits

See merge request !697
parents 3a88f7bf 7be6a472
Showing 431 additions and 285 deletions

@@ -3,7 +3,7 @@

***nshmp-haz*** is a U.S. Geological Survey ([USGS](https://www.usgs.gov)) developed software stack
that supports probabilistic seismic hazard (PSHA) and related analyses. It is maintained by the
National Seismic Hazard Model Project ([NSHMP](https://earthquake.usgs.gov/hazards/)) within the
USGS [Earthquake Hazards Program](http://earthquake.usgs.gov).

*nshmp-haz* supports high performance seismic hazard calculations required to generate detailed
maps over large areas and supports a variety of web services and applications related to

@@ -18,8 +18,8 @@ use *nshmp-haz* as well as underlying model implementation details.

* [Developer Basics](./pages/Developer-Basics.md)
* [Calculation Configuration](./pages/Calculation-Configuration.md)
* [Site Specification](./pages/Site-Specification.md)
* [Using Docker](./pages/Using-Docker.md)
* See also the [examples](../../etc/examples) directory
* [Hazard Model](./pages/Hazard-Model.md)
* [Model Structure](./pages/Model-Structure.md)
* [Model Files](./pages/Model-Files.md)

...

@@ -18,7 +18,7 @@ considered in other industries such as real estate lending. A 9-member steering

academic and industry experts provides technical oversight and recommendations to the NSHMP.

An NSHM defines the set of likely earthquake sources and their rates in a particular region. Given
parameters of the earthquake source and a site of interest, ground motion models (GMMs) are used
to estimate ground shaking from the set of earthquakes. The NSHMP routinely updates NSHMs for the
U.S. and its territories to consider the best available science.

...

# Building & Running

[TOC]

## Build and Run Locally

@@ -21,8 +9,8 @@ Building and running *nshmp-haz* requires prior installation of Git and Java. Pl

### Building

*nshmp-haz* uses the [Gradle](https://gradle.org/) build tool. Navigate to a location on your
system where you want *nshmp-haz* code to reside, clone the repository, and compile:

```bash
cd /path/to/project/directory
```

@@ -36,206 +24,108 @@ This creates a single file, `build/libs/nshmp-haz.jar` that may be used for haza
users using the native command prompt). This executes any tasks (e.g. `assemble`) after
downloading all required dependencies, including Gradle itself.
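
For reference, a typical first-time build is sketched below. This is a minimal sketch: the
repository URL matches the clone command on the [Developer Basics](./Developer-Basics.md) page,
and `/path/to/project/directory` is whatever location you chose above.

```bash
# Clone and build nshmp-haz (assumes Git and Java are already installed)
cd /path/to/project/directory
git clone https://code.usgs.gov/ghsc/nshmp/nshmp-haz.git
cd nshmp-haz
./gradlew assemble

# The build product used for command line calculations
ls build/libs/nshmp-haz.jar
```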

### Running

*nshmp-haz* applications may be run from the command line or as a local web service. Command line
applications are recommended for long-running hazard and disaggregation calculations, but most
users will find the web service endpoints to be more flexible. Web services return
[JSON](https://www.json.org/json-en.html) responses that can be parsed by most programming
languages and data analysis and visualization programs (e.g. Matlab).

### Web Services

To run *nshmp-haz* web services:

```bash
./gradlew run
# or './gradlew run -t' to recompile and relaunch when code changes
```

By default, when the web services start up, they load the 2018 NSHM for the conterminous U.S.
To use a different model, run the web services using Java and specify a model path:

```bash
./gradlew assemble
java -jar build/libs/nshmp-haz.jar --model=path/to/model
```

After startup, web services and documentation are available at <http://localhost:8080/>.

See the [Matlab](../../etc/matlab) directory for examples of how to call the web services. To
run the ground motion model (GMM) web services, please use the
[*nshmp-haz-ws*](https://code.usgs.gov/ghsc/nshmp/nshmp-ws) repository.
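
As a quick smoke test that the services are up, request the landing page. This is a minimal
sketch: it assumes the default port shown above and that `curl` is available; the individual
service endpoints are listed in the documentation served at that address.

```bash
# Verify that the local web services have started (default port 8080)
curl -s http://localhost:8080/ > /dev/null && echo "nshmp-haz services are up"
```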

### Command Line Hazard Calculation

The `HazardCalc` program computes hazard curves at one or more sites for a variety of intensity
measures. For example:

```bash
java -cp path/to/nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc model sites [config]
```

At a minimum, the hazard source [model](./Hazard-Model.md) and the
[site](./Site-Specification.md)(s) at which to perform calculations must be specified. The source
model should be specified as a path to a directory. A single site may be specified with a string;
multiple sites must be specified using either a comma-delimited (CSV) or
[GeoJSON](http://geojson.org) file. The path to a custom
[configuration](./Calculation-Configuration.md) file containing user-specific settings may
optionally be supplied as a third argument. It can be used to override any calculation settings;
if absent, [default](./Calculation-Configuration.md) values are used.

See the [examples](../../etc/examples) directory for more details.
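
As a concrete sketch, the commands below compute hazard using the Hawaii NSHM hosted on the USGS
GitLab; `sites.geojson` and `config.json` are assumed, user-supplied site and configuration files,
and the jar path matches the build step above.

```bash
# Download a model, e.g. the Hawaii NSHM
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git

# Compute hazard at the sites in sites.geojson; config.json is optional
java -cp build/libs/nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc \
    nshm-hawaii \
    sites.geojson \
    config.json
```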

### Command Line Disaggregation Calculation

Like `HazardCalc`, the `DisaggCalc` program performs disaggregations at one or more sites for a
variety of intensity measures. The return period for the disaggregation is defined in the config;
see [`disagg.returnPeriod`](./Calculation-Configuration.md#calculation-configuration-parameters).
Example:

```bash
java -cp nshmp-haz.jar gov.usgs.earthquake.nshmp.DisaggCalc model sites [config]
```

Disaggregations build on and output `HazardCalc` results along with other disaggregation-specific
files. Disaggregations also have some independent
[configuration](./Calculation-Configuration.md#calculation-configuration-parameters) options.
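
For example, a minimal sketch: the nested JSON layout mirrors the dotted parameter names on the
[Calculation Configuration](./Calculation-Configuration.md) page, and `nshm-hawaii` and
`sites.geojson` are the assumed model directory and site file from the hazard example above.

```bash
# Write a config that sets the disaggregation return period (years)
cat > config.json << 'EOF'
{
  "disagg": {
    "returnPeriod": 2475
  }
}
EOF

# Disaggregate hazard at the sites in sites.geojson
java -cp build/libs/nshmp-haz.jar gov.usgs.earthquake.nshmp.DisaggCalc \
    nshm-hawaii \
    sites.geojson \
    config.json
```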

## Customizing Code

Whereas *nshmp-haz* contains code to run command line applications and web services, model
loading and hazard calculations are handled in the dependent library
[*nshmp-lib*](https://code.usgs.gov/ghsc/nshmp/nshmp-lib). To use a local, modified version of
*nshmp-lib*, set an environment variable `NSHMP_LIB_LOCAL=true` and *nshmp-haz*
will look for *nshmp-lib* in a directory adjacent to *nshmp-haz*. If *nshmp-lib* is located
somewhere else, modify the path specified in
[`gradle/dependencies.gradle`](../../gradle/dependencies.gradle). When using a local version
of *nshmp-lib*, first build the *nshmp-lib* project using `./gradlew fatJar` so that
required dependencies are included.

Summary of steps to check out and start running code for local development:

```bash
# --> Set a NSHMP_LIB_LOCAL=true environment variable on your system
cd yourProjectDirectory
git clone https://code.usgs.gov/ghsc/nshmp/nshmp-lib.git
cd nshmp-lib
./gradlew fatJar
cd ..
git clone https://code.usgs.gov/ghsc/nshmp/nshmp-ws.git
cd nshmp-ws
./gradlew run
```

---

## Related Pages

* [Building & Running](./Building-&-Running.md#building-&-running)
* [Developer Basics](./Developer-Basics.md#developer-basics)
* [Calculation Configuration](./Calculation-Configuration.md#calculation-configuration)
* [Site Specification](./Site-Specification.md#site-specification)
* [Using Docker](./Using-Docker.md#using-docker)
* See also the [examples](../../etc/examples) directory
* [**Documentation Index**](../README.md)

---

...

@@ -13,7 +13,7 @@ may be overridden. See [building and running](./Building-&-Running.md) and the

Parameter | Type | Default | Notes |
--------- | ---- | ------- | ----- |
**`hazard`**
&nbsp;&nbsp;&nbsp;`.exceedanceModel` |`String` | `"TRUNCATION_3SIGMA_UPPER"`| [`ExceedanceModel`][url-exceedance]
&nbsp;&nbsp;&nbsp;`.truncationLevel` |`Double` | `3.0` | [1](#notes)
&nbsp;&nbsp;&nbsp;`.imts` |`String[]` | `["PGV","PGA","SA0P01","SA0P02",`<br>`"SA0P03","SA0P05","SA0P075",`<br>`"SA0P1","SA0P15","SA0P2","SA0P25",`<br>`"SA0P3","SA0P4","SA0P5","SA0P75",`<br>`"SA1P0","SA1P5","SA2P0","SA3P0",`<br>`"SA4P0","SA5P0","SA7P5","SA10P0"]` | [`Imt`][url-imt]
@@ -23,25 +23,25 @@ __`hazard`__
&nbsp;&nbsp;&nbsp;`.useSiteData` |`Boolean` | `true` | Enable site data (e.g. basin depths)
&nbsp;&nbsp;&nbsp;`.customImls` |`Map<String, Double[]>` | `{}` (empty object) | [2](#notes)
&nbsp;&nbsp;&nbsp;`.valueFormat` |`String` | `"ANNUAL_RATE"` | [`ValueFormat`][url-valueformat]
**`gmm`**
&nbsp;&nbsp;&nbsp;`.dampingRatio` |`Double` | `0.05` (5%) | Limited to range [0.005..0.3] (0.5% to 30%)
&nbsp;&nbsp;&nbsp;`.sigmaScale` |`Double` | `1.0` (100%, no scaling) | Limited to range [0.5..1.0] (50% to 100%)
&nbsp;&nbsp;&nbsp;`.vertical` |`Boolean` | `false` | Compute vertical ground motions
**`disagg`**
&nbsp;&nbsp;&nbsp;`.returnPeriod` |`Double` | `2475` |
&nbsp;&nbsp;&nbsp;`.bins` |`Object` | | [4](#notes)
&nbsp;&nbsp;&nbsp;`.contributorLimit` |`Double` | `0.1` | [5](#notes)
**`rate`**
&nbsp;&nbsp;&nbsp;`.bins` |`Object` | | [6](#notes)
&nbsp;&nbsp;&nbsp;`.distance` |`Double` | `20` km
&nbsp;&nbsp;&nbsp;`.distributionFormat` |`String` | `"INCREMENTAL"` | [`DistributionFormat`][url-distribution]
&nbsp;&nbsp;&nbsp;`.timespan` |`Double` | `30` years
&nbsp;&nbsp;&nbsp;`.valueFormat` |`String` | `"ANNUAL_RATE"` | [`ValueFormat`][url-valueformat]
**`output`**
&nbsp;&nbsp;&nbsp;`.directory` |`String` | `hazout`
&nbsp;&nbsp;&nbsp;`.dataTypes` |`String[]` | `["TOTAL","MAP"]` | [`DataType`][url-datatype]
&nbsp;&nbsp;&nbsp;`.returnPeriods` |`Double[]` | `[475,975,2475,10000]` | [`ReturnPeriods`][url-returnperiods]
**`performance`**
&nbsp;&nbsp;&nbsp;`.optimizeGrids` |`Boolean` | `true` | [7](#notes)
&nbsp;&nbsp;&nbsp;`.smoothGrids` |`Boolean` | `true` | [8](#notes)
&nbsp;&nbsp;&nbsp;`.systemPartition` |`Integer` | `1000` | [9](#notes)
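
For illustration only, a partial override file might look like the following sketch (hypothetical
values; the nested JSON grouping mirrors the dotted parameter names in the table above, and any
parameter not listed keeps the default shown).

```bash
# Write a partial config override; unspecified parameters keep their defaults
cat > my-config.json << 'EOF'
{
  "hazard": {
    "imts": ["PGA", "SA0P2", "SA1P0"]
  },
  "output": {
    "directory": "my-hazard-out",
    "returnPeriods": [475, 2475]
  }
}
EOF
```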

@@ -109,9 +109,9 @@ T ≤ 10 s | 0.000333, 0.000499, 0.000749, 0.00112, 0.00169, 0.00253, <br>0.00

* [Developer Basics](./Developer-Basics.md#developer-basics)
* [Calculation Configuration](./Calculation-Configuration.md#calculation-configuration)
* [Site Specification](./Site-Specification.md#site-specification)
* [Using Docker](./Using-Docker.md#using-docker)
* See also the [examples](../../etc/examples) directory
* [**Documentation Index**](../README.md)

---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)

...

# Code Versions

The static datasets of USGS NSHMs prior to 2014 were computed using Fortran (see
[*nshmp-haz-fortran*](https://github.com/usgs/nshmp-haz-fortran)). The static datasets for the
2014 Conterminous U.S. NSHM were computed using the Fortran codes and
[OpenSHA](https://opensha.org/) (for the California portion of the model). The dynamic versions
of the 2008 and 2014 Conterminous U.S. models were then implemented in the 1st version of
[*nshmp-haz*](https://github.com/usgs/nshmp-haz) (on GitHub). This updated Java codebase uses XML
source models and supports the web services behind the dynamic calculations of the
[Unified Hazard Tool](https://earthquake.usgs.gov/hazards/interactive/) (UHT).

The 2nd version of *nshmp-haz* (this repository) supersedes prior codebases. The development of this
version involved a significant refactoring of both the computational code and source model format.
The source models are now defined using JSON, GeoJSON, and CSV files to better reflect the
underlying logic trees and support uncertainty analysis.

## Transitioning from *nshmp-haz* v1 to v2

NSHMs are very detailed, and migrating from one format to another is not trivial and is prone to
error. Moreover, approximations (e.g. using 3.1415 for Pi rather than the value built into most
languages) can yield different results. When multiple such small changes exist, deciphering what
is giving rise to differences in results can be challenging.

To document the transition from *nshmp-haz* v1 to v2, we attach comparison maps at four return
periods (475, 975, 2475, and 10,000 years) for the 2018 Conterminous U.S. model. Maps are included
for PGA and 3 spectral periods (0.2 s, 1 s, and 5 s). There are no differences in the Central &
Eastern U.S., and differences in the WUS are <<1%. The difference in hazard in the vicinity of
Salt Lake City arises from the cluster models in *nshmp-haz* v1 not being able to consider the
additional epistemic uncertainty added to the NGA-West2 ground motion models.

We continue to investigate the cause of other differences, but they are small enough that we are
comfortable moving forward deploying *nshmp-haz* v2 codes and models to our public web services and
applications. This repository includes end-to-end tests for supported NSHMs that may be run
on demand.

...

@@ -50,15 +50,6 @@ cd /directory/for/code
git clone https://code.usgs.gov/ghsc/nshmp/nshmp-haz.git
```

---

## Related Pages

@@ -67,8 +58,8 @@ Gradle > Existing Gradle Project`

* [Developer Basics](./Developer-Basics.md#developer-basics)
* [Calculation Configuration](./Calculation-Configuration.md#calculation-configuration)
* [Site Specification](./Site-Specification.md#site-specification)
* [Using Docker](./Using-Docker.md#using-docker)
* See also the [examples](../../etc/examples) directory
* [**Documentation Index**](../README.md)

---

...

@@ -29,17 +29,17 @@ in magnitude, *M*, and distance, *R*). This formulation relies on models of grou

give the probability that an intensity measure level of interest will be exceeded conditioned on
the occurrence of a particular earthquake. Such models are commonly referred to as:

* **Intensity measure relationships**
* **Attenuation relationships**
* **Ground motion prediction equations (GMPEs)**
* **Ground motion models (GMMs)**

The parameterization of modern models (e.g. NGA-West2; Bozorgnia et al., 2014) extends to much
more than magnitude and distance, including, but not limited to:

* **Multiple distance metrics** (e.g. rJB, rRup, rX, rY)
* **Fault geometry** (e.g. dip, width, rupture depth, hypocentral depth)
* **Site characteristics** (e.g. basin depth terms, site type or Vs30 value)

## Simple, yes, but used for so much more…

@@ -119,10 +119,10 @@ foreach IMT {

Break the traditional PSHA formulation down into discrete steps and preserve the data associated
with each step:

* **[1]** Source & Site parameterization
* **[2]** Ground motion calculation (mean and standard deviation only)
* **[3]** Exceedance curve calculation (per source)
* **[4]** Recombine

Whereas the traditional pipeline looks something like this:

@@ -132,9 +132,9 @@ The functional pipeline can be processed stepwise:

![image](images/psha-functional.png "PSHA functional pipeline")

**Need a disaggregation?** Revisit and parse the results of steps 1 and 2.

**Need a response spectrum?** Spawn more calculations, one for each IMT, at step 2.

## Benefits

@@ -161,7 +161,7 @@ __Need a response spectra?__ Spawn more calculations, one for each IMT, at step

---

[**Documentation Index**](../README.md)

---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)

...

@@ -5,7 +5,7 @@ the occurrence of various earthquakes. The following tables list the GMMs suppor

is not uncommon for a GMM to have multiple concrete implementations. For instance, most subduction
GMMs, as published, support both interface and intraslab events.

[TOC]

## GMM Configuration

@@ -44,7 +44,7 @@ The following sample `gmm-config.json` file applies the NGA-West 2 epistemic unc

## GMM Uncertainty Models

*nshmp-haz* supports additional epistemic uncertainty models derived from the PEER NGA-West 1
and PEER NGA-West 2 projects. These models both have factors for distance (`Rrup`) bins
(Rrup < 10 km, 10 km <= Rrup < 30 km, and 30 km <= Rrup) and for magnitude bins
(M < 6.0, 6.0 <= M < 7.0, and 7.0 <= M). These models can be applied within the
`gmm-config.json` file as shown in

@@ -52,14 +52,14 @@ the [GMM Uncertainty](#gmm-uncertainty) section above.

## GMMs By Tectonic Setting

GMMs available in *nshmp-haz* are tabulated by tectonic setting below. See the javadocs for the
[GMM Package](https://earthquake.usgs.gov/nshmp/docs/nshmp-lib/gov/usgs/earthquake/nshmp/gmm/package-summary.html)
for implementation details of each GMM and comprehensive lists of GMM IDs.

### Active Crust GMMs

Reference | ID | Component | Notes
:---------|:---|:----------|:------:
**NGA-West 2**
[Abrahamson et al., 2014](http://dx.doi.org/10.1193/070913EQS198M) | ASK_14<br>ASK_14_BASIN | RotD50 |
[Boore et al., 2014](http://dx.doi.org/10.1193/070113EQS184M) | BSSA_14<br>BSSA_14_BASIN | RotD50 |

@@ -93,7 +93,7 @@ NGA-East<br>[Goulet et al., 2017](https://peer.berkeley.edu/sites/default/files/
[Shahjouei and Pezeshk, 2016](http://dx.doi.org/10.1785/0120140367) | NGA_EAST_SEED_SP16 | RotD50 | 3
**Other**
[Atkinson, 2008](http://dx.doi.org/10.1785/0120070199)<br>[Atkinson & Boore, 2011](http://dx.doi.org/10.1785/0120100270) | ATKINSON_08_PRIME | horizontal | 4
[Atkinson & Boore, 2006](http://dx.doi.org/10.1785/0120050245) | AB_06_*<br>140BAR \| 200BAR<br>none \| \_AB \| \_J | horizontal | 4
[Atkinson & Boore, 2006](http://dx.doi.org/10.1785/0120050245)<br>[Atkinson & Boore, 2011](http://dx.doi.org/10.1785/0120100270) | AB_06_PRIME | horizontal | 4
[Campbell, 2003](http://dx.doi.org/10.1785/0120020002) | CAMPBELL_03<br>CAMPBELL_03_AB<br>CAMPBELL_03_J | Geometric mean | 4
[Frankel et al., 1996](https://pubs.usgs.gov/of/1996/532/) | FRANKEL_96<br>FRANKEL_96_AB<br>FRANKEL_96_J | not specified | 4

@@ -112,30 +112,28 @@ for individual NGA-East component model IDs

### Subduction GMMs

*Note: See the [GMM javadocs](https://earthquake.usgs.gov/nshmp/docs/nshmp-lib/gov/usgs/earthquake/nshmp/gmm/Gmm.html)
for a comprehensive list of GMM IDs.*

Reference | ID | Component | Notes
:---------|:---|:----------|:------:
**NGA-Subduction**
[Abrahamson & Gülerce, 2020](https://peer.berkeley.edu/sites/default/files/2020_25.pdf) | AG_20_*<br>GLOBAL \| CASCADIA \| ALASKA<br>INTERFACE \| SLAB<br>(none) \| \_BASIN | RotD50
[Kuehn et al., 2020](https://peer.berkeley.edu/sites/default/files/2020_04_kuehn_final.pdf) | KBCG_20_*<br>GLOBAL \| CASCADIA \| ALASKA<br>INTERFACE \| SLAB<br>(none) \| \_BASIN | RotD50
[Parker et al., 2020](https://peer.berkeley.edu/sites/default/files/2020_03_parker_final.pdf) | PSHAB_20_*<br>GLOBAL \| CASCADIA \| ALASKA<br>INTERFACE \| SLAB<br>(none) \| \_BASIN | RotD50
**Other**
[Atkinson & Boore, 2003](http://dx.doi.org/10.1785/0120020156) | AB_03_*<br>GLOBAL \| CASCADIA<br>INTERFACE \| SLAB<br>(none) \| \_LOW_SAT | horizontal |
[Atkinson & Macias, 2009](http://dx.doi.org/10.1785/0120080147) | AM_09_INTERFACE<br>AM_09_INTERFACE_BASIN | Geometric mean | 1
BC Hydro<br>[Abrahamson et al., 2016](http://dx.doi.org/10.1193/051712EQS188MR) | BCHYDRO_12_*<br>INTERFACE \| SLAB<br>(none) \| \_BASIN<br>(none) \| \_BACKARC | Geometric mean |
BC Hydro NGA<br>[Abrahamson et al., 2018](https://peer.berkeley.edu/sites/default/files/2018_02_abrahamson_9.10.18.pdf)² | BCHYDRO_18_NGA_*<br>INTERFACE \| SLAB<br>(none) \| \_NO_EPI | Geometric mean | 3
[McVerry et al., 2000](http://doi.org/10.5459/BNZSEE.39.1.1-58) | MCVERRY_00_INTERFACE<br>MCVERRY_00_SLAB<br>MCVERRY_00_VOLCANIC | Max-horizontal,<br>also supports geometric mean | 4
[Youngs et al., 1997](http://dx.doi.org/10.1785/gssrl.68.1.58) | YOUNGS_97_INTERFACE<br>YOUNGS_97_SLAB | Geometric mean |
[Zhao et al., 2006](http://dx.doi.org/10.1785/0120050122) | ZHAO_06_*<br>INTERFACE \| SLAB<br>(none) \| \_BASIN | Geometric mean |

¹ Interface only
² Likely to be superseded by the final EQ Spectra paper
³ Calibrated for Cascadia use only
⁴ New Zealand model, does not correspond directly with US site class model

### Regional and Specialized GMMs

...

@@ -5,7 +5,7 @@ capable of generating, otherwise known as a magnitude-frequency distribution (MF

types of MFDs supported in a hazard model are described below. Unless otherwise noted, all the
members listed in the JSON examples below are required.

[TOC]

## Single

...

@@ -46,7 +46,7 @@ Hawaii | 1998 | 1.1.0 | | TBD | |
Hawaii | 1998 | 1.0.0 |:small_blue_diamond:| | ASCE7-10 |
Puerto Rico & <br/> U.S. Virgin Islands | 2003 | v1.0.0 | | | |

<sup></sup> **Note on the 2014 conterminous U.S. NSHM:** Initial publication of the
[2014 model](https://www.usgs.gov/natural-hazards/earthquake-hazards/science/2014-united-states-lower-48-seismic-hazard-long-term)
included data to support updates to the U.S. Building Code, specifically hazard curves for peak
ground acceleration (PGA), and 0.2 and 1.0 second spectral accelerations, all at a BC boundary site

@@ -99,7 +99,7 @@ one of the dynamic editions is likely better.

* [Model Editions](./Model-Editions.md#model-editions)
* [Logic Trees & Uncertainty](./Logic-Trees-&-Uncertainty.md#logic-trees-&-uncertainty)
* [Code Versions](./Code-Versions.md#code-versions)
* [**Documentation Index**](../README.md)

---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)

...

@@ -4,7 +4,7 @@ A variety of logic tree, configuration, and data files that are common to all mo

types are described below and on related pages for specific model components, for example, ground
motion models (GMMs) or magnitude-frequency distributions (MFDs).

[TOC]

## Model and Calculation Configuration

@@ -81,7 +81,7 @@ or its children will be processed; any standalone sources will be ignored. For e

**source-group.json:** A specialized form of logic tree that describes model branches that are
additive and therefore does not include weights. Examples from the NSHM for the conterminous U.S.
include the Cascadia segmented and partial-rupture models, and the New Madrid 1500-yr cluster
branches. The branch objects in a source group *may* include an optional `scale` member that can
be used to impose a probability of occurrence or other scaling required by an NSHM. If absent, the
`scale` value is one.

@@ -99,7 +99,7 @@ be used to impose a probability of occurrence or other scaling requred by a NSHM
```

**tree-info.json:** Top level source trees and groups must be accompanied by a file that contains
a unique integer ID for the logic tree. This file *may* also include a `name` field that, if
present, will be used instead of the enclosing directory name, but it is not required.

```json

@@ -124,7 +124,7 @@ For example:
```

See the [ground motion models](./Ground-Motion-Models.md) page for details on GMMs supported in
*nshmp-haz* and the related `gmm-config.json` files that govern GMM behavior.

### MFD Logic Trees

@@ -151,7 +151,7 @@ source-tree branches.

How MFDs are initialized (or realized) depends on the presence and contents of `mfd-config.json` and
`rate-tree.json` files. See the
[magnitude frequency distributions](./Magnitude-Frequency-Distributions.md) page for details on
these files and the types of MFDs supported in *nshmp-haz*.

## Rupture Sets

...

@@ -6,7 +6,7 @@ self describing and represent logic trees and other relationships between source

(JSON), source geometry (GeoJSON features), and earthquake rate data (CSV). JSON is well-suited
for representing model data and relationships and is supported in most programming languages.

[TOC]

## Directory Structure

...

@@ -4,7 +4,7 @@ The sites at which to perform hazard and related calculations may be defined in

ways. Examples of the file formats described below are available in the resource directory:
[`etc/nshm`](../../etc/nshm/README.md).

**Note on Coordinates:** *nshmp-haz* supports longitude and latitude values in the closed
ranges `[-360° ‥ 360°]` and `[-90° ‥ 90°]`. However, mixing site and/or source
coordinates across the antimeridian (the -180° to 180° transition) will yield unexpected results.
For Pacific models and calculations, always use positive or negative longitudes exclusively.

@@ -104,9 +104,9 @@ outside the 'calculation' polygon are set to zero. For an example, see the

* [Developer Basics](./Developer-Basics.md#developer-basics)
* [Calculation Configuration](./Calculation-Configuration.md#calculation-configuration)
* [Site Specification](./Site-Specification.md#site-specification)
* [Using Docker](./Using-Docker.md#using-docker)
* See also the [examples](../../etc/examples) directory
* [**Documentation Index**](../README.md)

---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)

...
...@@ -10,17 +10,17 @@ represent models of smoothed seismicity in both crustal and subduction intraslab ...@@ -10,17 +10,17 @@ represent models of smoothed seismicity in both crustal and subduction intraslab
may also be used for fault zones where there is a history of large earthquakes but the fault may also be used for fault zones where there is a history of large earthquakes but the fault
geometry itself is unknown or very poorly defined. geometry itself is unknown or very poorly defined.
[[_TOC_]] [TOC]
Source models for use with _nshmp-haz_ are defined using [JSON](https://www.json.org) and Source models for use with *nshmp-haz* are defined using [JSON](https://www.json.org) and
[GeoJSON](https://geojson.org). _nshmp-haz_ makes determinations about how to represent a source [GeoJSON](https://geojson.org). *nshmp-haz* makes determinations about how to represent a source
based on a GeoJSON geometry type in conjunction with supporting JSON configuration files. Example based on a GeoJSON geometry type in conjunction with supporting JSON configuration files. Example
source configuration files, `*-config.json`, are provided with each source type description. source configuration files, `*-config.json`, are provided with each source type description.
Configuration files must be fully specified with `null` JSON member values used to specify 'do Configuration files must be fully specified with `null` JSON member values used to specify 'do
nothing' where appropriate. Any configuration member value in ALL_CAPS indicates the value is one nothing' where appropriate. Any configuration member value in ALL_CAPS indicates the value is one
of a fixed number of options commonly referred to as an enum. of a fixed number of options commonly referred to as an enum.
__Note on Coordinates:__ _nshmp-haz_ supports longitude and latitude values in the closed ranges **Note on Coordinates:** *nshmp-haz* supports longitude and latitude values in the closed ranges
`[-360°‥360°]` and `[-90°‥90°]`. Note, however, that mixing site and/or source coordinates across `[-360°‥360°]` and `[-90°‥90°]`. Note, however, that mixing site and/or source coordinates across
the antimeridian (the -180° to 180° transition) will yield unexpected results. For Pacific models the antimeridian (the -180° to 180° transition) will yield unexpected results. For Pacific models
and calculations, always use positive or negative longitudes exclusively. and calculations, always use positive or negative longitudes exclusively.
...@@ -43,7 +43,7 @@ Grid sources are represented in a model using a logic tree with a `rupture-sets. ...@@ -43,7 +43,7 @@ Grid sources are represented in a model using a logic tree with a `rupture-sets.
ruptures on each branch. Because gridded seismicity models may be governed by regionally ruptures on each branch. Because gridded seismicity models may be governed by regionally
varying MFD properties (e.g. `mMax`), rupture sets for grids are defined in a JSON array. varying MFD properties (e.g. `mMax`), rupture sets for grids are defined in a JSON array.
__rupture-sets.json__: Defines an array of one or more rupture sets. Multiple rupture sets are **rupture-sets.json**: Defines an array of one or more rupture sets. Multiple rupture sets are
used to model regional differences in MFD properties such as maximum magnitude. The `feature` used to model regional differences in MFD properties such as maximum magnitude. The `feature`
member points to the ID of a geojson feature (in the `grid-sources/features` directory) that member points to the ID of a geojson feature (in the `grid-sources/features` directory) that
defines the bounds of the gridded seismicity source. A grid rupture set `mfd-tree` is never defines the bounds of the gridded seismicity source. A grid rupture set `mfd-tree` is never
...@@ -62,7 +62,7 @@ defined inline and always points to a tree in a ...@@ -62,7 +62,7 @@ defined inline and always points to a tree in a
] ]
``` ```
__grid-config.json__: A `grid-depth-map` defines a mapping of magnitude ranges to logic trees of **grid-config.json**: A `grid-depth-map` defines a mapping of magnitude ranges to logic trees of
depth distributions. The map can use arbitrary names as keys, but the magnitude ranges defined by depth distributions. The map can use arbitrary names as keys, but the magnitude ranges defined by
each member must be non-overlapping. The magnitude ranges are interpreted as closed (inclusive) – each member must be non-overlapping. The magnitude ranges are interpreted as closed (inclusive) –
open (exclusive), e.g. [mMin..mMax). `maxDepth` constrains the maximum depth of any pseudo fault open (exclusive), e.g. [mMin..mMax). `maxDepth` constrains the maximum depth of any pseudo fault
...@@ -107,7 +107,7 @@ to upper-right). While most gridded rate files contain columns of longitude, lat ...@@ -107,7 +107,7 @@ to upper-right). While most gridded rate files contain columns of longitude, lat
some may contain depth values (intraslab sources), maximum magnitude caps, or other values. Scaled some may contain depth values (intraslab sources), maximum magnitude caps, or other values. Scaled
spatial PDFs are the preferred approach to modeling regional rate variations, however it is also spatial PDFs are the preferred approach to modeling regional rate variations, however it is also
possible to define explicit MFDs at each grid node. To do so, the `spatial-pdf` member possible to define explicit MFDs at each grid node. To do so, the `spatial-pdf` member
of a __rupture-sets.json__ is replaced with `grid-mfds`. See of a **rupture-sets.json** is replaced with `grid-mfds`. See
`2018 CONUS NSHM > active-crust > grid-sources` for examples of both approaches. `2018 CONUS NSHM > active-crust > grid-sources` for examples of both approaches.
## Zone Sources ## Zone Sources
...@@ -149,7 +149,7 @@ directly. ...@@ -149,7 +149,7 @@ directly.
} }
``` ```
__zone-config.json:__ Zone source model configuration is identical to a grid source configuration. **zone-config.json:** Zone source model configuration is identical to a grid source configuration.
```json ```json
{ {
...@@ -188,7 +188,7 @@ they occur in multiple locations on the fault surface with appropriately scaled ...@@ -188,7 +188,7 @@ they occur in multiple locations on the fault surface with appropriately scaled
associated with finite fault models may be explicitly defined or derived from slip rates. associated with finite fault models may be explicitly defined or derived from slip rates.
Fault rupture rates may be modeled using explicitly defined MFDs or logic trees of slip rate. Fault rupture rates may be modeled using explicitly defined MFDs or logic trees of slip rate.
__fault-source.geojson__: Defines the geometry and properties of a single source. In the example **fault-source.geojson**: Defines the geometry and properties of a single source. In the example
below the presence of a `rate-map` property indicates MFDs should be constructed from the supplied below the presence of a `rate-map` property indicates MFDs should be constructed from the supplied
slip rates and using the weights defined in a `rate-tree.json`. slip rates and using the weights defined in a `rate-tree.json`.
...@@ -235,7 +235,7 @@ slip rates and using the weights defined in a `rate-tree.json`. ...@@ -235,7 +235,7 @@ slip rates and using the weights defined in a `rate-tree.json`.
} }
``` ```
__fault-config.json__: Controls the point spacing on a gridded surface used to realize the fault **fault-config.json**: Controls the point spacing on a gridded surface used to realize the fault
geometry as well as define the models to use for magnitude scaling and rupture floating. Dip geometry as well as define the models to use for magnitude scaling and rupture floating. Dip
variations and an associated slip-rate scaling model are also supported for normal faults. variations and an associated slip-rate scaling model are also supported for normal faults.
...@@ -249,9 +249,9 @@ variations and an associated slip-rate scaling model are also supported for norm ...@@ -249,9 +249,9 @@ variations and an associated slip-rate scaling model are also supported for norm
} }
``` ```
__rupture-set.json__: When a fault source is represented with a logic tree a **rupture-set.json**: When a fault source is represented with a logic tree a
`rupture-set.json` defines the ruptures for each branch. A rupture set _may_ also define custom `rupture-set.json` defines the ruptures for each branch. A rupture set *may* also define custom
properties and _may_ also contain a `sections` member that defines the fault sections for the properties and *may* also contain a `sections` member that defines the fault sections for the
rupture set (see note on fault section stitching, [below](#fault-section-stitching)). rupture set (see note on fault section stitching, [below](#fault-section-stitching)).
```json ```json
...@@ -272,10 +272,10 @@ rupture set (see note on fault section stitching, [below](#fault-section-stitchi ...@@ -272,10 +272,10 @@ rupture set (see note on fault section stitching, [below](#fault-section-stitchi
When multiple sections are defined for a rupture, the sections must be defined in an order that When multiple sections are defined for a rupture, the sections must be defined in an order that
preserves the U.S. structural geology right-hand-rule. When stitched together, repeated locations preserves the U.S. structural geology right-hand-rule. When stitched together, repeated locations
at the endpoints of adjacent sections, if present, are removed. The properties of the first section at the endpoints of adjacent sections, if present, are removed. The properties of the first section
govern the properties of the stitched fault, however, a rupture-set _may_ include a properties govern the properties of the stitched fault, however, a rupture-set *may* include a properties
member, the contents of which will override the properties of the first stitched section. Although member, the contents of which will override the properties of the first stitched section. Although
it would be better to have geometric properties of stitched sections be calculated dynamically, it would be better to have geometric properties of stitched sections be calculated dynamically,
the current approach preserves support for past models. A rupture-set _may_ also include a the current approach preserves support for past models. A rupture-set *may* also include a
`coordinates` member that can be used to represent a smoothed trace geometry where stitched `coordinates` member that can be used to represent a smoothed trace geometry where stitched
sections do not share common endpoints. sections do not share common endpoints.
...@@ -287,9 +287,9 @@ as the joint probability of exceeding ground motions from each independent event ...@@ -287,9 +287,9 @@ as the joint probability of exceeding ground motions from each independent event
a cluster may only have an mfd-tree composed of `Mfd.Type.SINGLE` MFDs and the mfd-trees must a cluster may only have an mfd-tree composed of `Mfd.Type.SINGLE` MFDs and the mfd-trees must
match across all sources in a cluster (i.e. each mfd-tree has the same IDs and weights). match across all sources in a cluster (i.e. each mfd-tree has the same IDs and weights).
__cluster-set.json__ A specialized type of rupture set, this file defines the array of fault **cluster-set.json** A specialized type of rupture set, this file defines the array of fault
rupture sets that make up a 'cluster'. As with fault sources, the nested rupture sets in a cluster rupture sets that make up a 'cluster'. As with fault sources, the nested rupture sets in a cluster
set _may_ define `properties` and `sections` members. set *may* define `properties` and `sections` members.
```json ```json
{ {
...@@ -320,13 +320,13 @@ This specialized fault source type supports inversion based rupture rate estimat ...@@ -320,13 +320,13 @@ This specialized fault source type supports inversion based rupture rate estimat
on a fault network such as that used for UCERF3 in the 2014 and 2018 NSHMs for the conterminous on a fault network such as that used for UCERF3 in the 2014 and 2018 NSHMs for the conterminous
U.S. Fault system source sets require three files: `rupture_set.json`, `sections.geojson`, and U.S. Fault system source sets require three files: `rupture_set.json`, `sections.geojson`, and
`ruptures.csv` that are placed together within folders defining branches of a fault system `ruptures.csv` that are placed together within folders defining branches of a fault system
logic tree. Note that system sources _may_ have complementary gridded seismicity source models logic tree. Note that system sources *may* have complementary gridded seismicity source models
with matching logic trees. with matching logic trees.
__rupture-set.json__: Provides identifying information for the ruptures defined in the adjacent **rupture-set.json**: Provides identifying information for the ruptures defined in the adjacent
sections and ruptures files. sections and ruptures files.
__sections.geojson__: defines a feature collection of the fault sections in a fault network. **sections.geojson**: defines a feature collection of the fault sections in a fault network.
Because fault sections are derived by dividing larger faults into subsections, section features Because fault sections are derived by dividing larger faults into subsections, section features
contain several properties that differ from standalone fault section source models (e.g. contain several properties that differ from standalone fault section source models (e.g.
`dip-direction`). `dip-direction`).
...@@ -362,7 +362,7 @@ contain several properties that differ from standalone fault section source mode ...@@ -362,7 +362,7 @@ contain several properties that differ from standalone fault section source mode
} }
``` ```
__ruptures.csv__: Defines the properties of every rupture. The last column in a rupture file **ruptures.csv**: Defines the properties of every rupture. The last column in a rupture file
defines the ordered array of participating fault section IDs using the shorthand defines the ordered array of participating fault section IDs using the shorthand
`1127:1131-2411:2412`. Colons denote continuous ranges of sections and hyphens denote breaks. `1127:1131-2411:2412`. Colons denote continuous ranges of sections and hyphens denote breaks.
...@@ -389,7 +389,7 @@ using an `slab-config.json` file. ...@@ -389,7 +389,7 @@ using an `slab-config.json` file.
* [Magnitude Frequency Distributions (MFDs)](./Magnitude-Frequency-Distributions.md#magnitude-frequency-distributions) * [Magnitude Frequency Distributions (MFDs)](./Magnitude-Frequency-Distributions.md#magnitude-frequency-distributions)
* [Rupture Scaling Relations](./Rupture-Scaling-Relations.md#rupture-scaling-relations) * [Rupture Scaling Relations](./Rupture-Scaling-Relations.md#rupture-scaling-relations)
* [Ground Motion Models (GMMs)](./Ground-Motion-Models.md#ground-motion-models) * [Ground Motion Models (GMMs)](./Ground-Motion-Models.md#ground-motion-models)
* [__Documentation Index__](../README.md) * [**Documentation Index**](../README.md)
--- ---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov) ![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)
......
# Using [Docker](https://docs.docker.com/install/)
*nshmp-haz* is available as a public image from
[Docker Hub](https://hub.docker.com/r/usgs/nshmp-haz) with the following tags:
* `latest`: Latest updates from the main or production branch
* `development-latest`: Latest updates from forks of the repository
* `staging-latest`: Latest updates associated with the
  [main](https://code.usgs.gov/ghsc/nshmp/nshmp-haz/-/tree/main) branch
* `production-latest`: Latest stable release associated with the
  [production](https://code.usgs.gov/ghsc/nshmp/nshmp-haz/-/tree/production) branch
To ensure you have the latest *nshmp-haz* updates associated with a specific tag,
always pull the image from Docker Hub first:
```bash
docker pull usgs/nshmp-haz:<tag>
```
> Replace `<tag>` with one of the above tags.
Example:
```bash
docker pull usgs/nshmp-haz:production-latest
```
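To see which *nshmp-haz* tags are already cached locally, the standard Docker CLI can be used
(nothing below is specific to *nshmp-haz*):

```bash
# List locally cached nshmp-haz images and their tags
docker image ls usgs/nshmp-haz
```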
## Docker Memory on Mac
By default, Docker Desktop for Mac is set to use 2 GB runtime memory. To run *nshmp-haz*, the
memory available to Docker must be [increased](https://docs.docker.com/docker-for-mac/#advanced)
to a minimum of 4 GB.
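To confirm the new allocation took effect, the memory visible to the Docker engine can be checked
from the command line; this is a quick sanity check using the standard Docker CLI:

```bash
# Total memory (in bytes) available to the Docker engine;
# expect at least ~4 GB (about 4.2e9 bytes) for nshmp-haz runs
docker info --format '{{.MemTotal}}'
```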
## Run *nshmp-haz* in Docker
```bash
# Pull docker image
docker pull usgs/nshmp-haz:latest
# Run docker image
docker run \
--env CLASS_NAME=<DisaggCalc | HazardCalc | RateCalc> \
--env IML=<NUMBER> \
--env RETURN_PERIOD=<NUMBER> \
--volume /absolute/path/to/sites/file:/app/sites.<geojson | csv> \
--volume /absolute/path/to/config/file:/app/config.json \
--volume /absolute/path/to/output:/app/output \
usgs/nshmp-haz
```
Where:
* `CLASS_NAME` is the nshmp-haz class to run:
* [DisaggCalc](../../src/main/java/gov/usgs/earthquake/nshmp/DisaggCalc.java)
* [HazardCalc](../../src/main/java/gov/usgs/earthquake/nshmp/HazardCalc.java)
* [RateCalc](../../src/main/java/gov/usgs/earthquake/nshmp/RateCalc.java)
* Other arguments (local files mapped to files within the Docker container with `:/app/...`):
* (required) The absolute path to a [USGS model (NSHM)](./USGS-Models.md)
* Example: `$(pwd)/nshm-hawaii:/app/model`
* (required) The absolute path to a GeoJSON or CSV [site(s)](./Site-Specification.md) file
* CSV example: `$(pwd)/my-csv-sites.csv:/app/sites.csv`
* GeoJSON example: `$(pwd)/my-geojson-sites.geojson:/app/sites.geojson`
* (required) The absolute path to an output directory
* Example: `$(pwd)/my-hazard-output:/app/output`
* (optional) The absolute path to a [configuration](./Calculation-Configuration.md) file
* Example: `$(pwd)/my-custom-config.json:/app/config.json`
## Examples
### [`HazardCalc`](../../src/main/java/gov/usgs/earthquake/nshmp/HazardCalc.java) Example
The following example runs the `HazardCalc` program in nshmp-haz with the
[nshm-hawaii](https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git) model and
assumes a GeoJSON [site](./Site-Specification.md) file named `sites.geojson` exists
in the current directory.
```bash
# Download Hawaii NSHM
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git
# Pull image
docker pull usgs/nshmp-haz:latest
# Run nshmp-haz HazardCalc
docker run \
--env CLASS_NAME="HazardCalc" \
--volume "$(pwd)/nshm-hawaii:/app/model" \
--volume "$(pwd)/sites.geojson" \
--volume "$(pwd)/hawaii-hazard-output:/app/output" \
usgs/nshmp-haz
```
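When the container exits, results are written to the mounted output directory on the host and can
be inspected directly. The listing below is illustrative; the exact layout depends on the
calculation configuration, but hazard curves are typically written as CSV files organized by
intensity measure type.

```bash
# Inspect results written to the mounted output directory
ls -R "$(pwd)/hawaii-hazard-output"
```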
### [`DisaggCalc`](../../src/main/java/gov/usgs/earthquake/nshmp/DisaggCalc.java) Example
The following example runs the `DisaggCalc` program in nshmp-haz with the
[nshm-hawaii](https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git) model and
assumes a GeoJSON [site](./Site-Specification.md) file named `sites.geojson` exists
in the current directory.
```bash
# Download Hawaii NSHM
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git
# Pull image
docker pull usgs/nshmp-haz:latest
# Run nshmp-haz DisaggCalc
docker run \
--env CLASS_NAME="DisaggCalc" \
--env RETURN_PERIOD=475 \
--volume "$(pwd)/nshm-hawaii:/app/model" \
--volume "$(pwd)/sites.geojson" \
--volume "$(pwd)/hawaii-disagg-output:/app/output" \
usgs/nshmp-haz:latest
```
### [`RateCalc`](../../src/main/java/gov/usgs/earthquake/nshmp/RateCalc.java) Example
The following example runs the `RateCalc` program in nshmp-haz with the
[nshm-hawaii](https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git) model and
assumes a GeoJSON [site](./Site-Specification.md) file named `sites.geojson` exists
in the current directory.
```bash
# Download Hawaii NSHM
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-hawaii.git
# Pull image
docker pull usgs/nshmp-haz:latest
# Run nshmp-haz RateCalc
docker run \
--env CLASS_NAME="RateCalc" \
--volume "$(pwd)/nshm-hawaii:/app/model" \
--volume "$(pwd)/sites.geojson" \
--volume "$(pwd)/hawaii-rate-output:/app/output" \
usgs/nshmp-haz
```
## Run Customization
When running *nshmp-haz* with Docker, the maximum JVM memory size can
be set with the environment flag (`-e`, `--env`):
```bash
docker run \
--env JAVA_MEMORY=<MEMORY> \
...
usgs/nshmp-haz
# Example
docker run \
--env JAVA_MEMORY="12g" \
...
usgs/nshmp-haz
```
Where:
* `JAVA_MEMORY` is the maximum memory for the JVM (default: 8g)
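For example, the `HazardCalc` run shown earlier can be given a larger heap by adding the
`JAVA_MEMORY` variable (the `12g` value below is illustrative; size the heap to the model and
the memory available to Docker):

```bash
docker run \
    --env CLASS_NAME="HazardCalc" \
    --env JAVA_MEMORY="12g" \
    --volume "$(pwd)/nshm-hawaii:/app/model" \
    --volume "$(pwd)/sites.geojson:/app/sites.geojson" \
    --volume "$(pwd)/hawaii-hazard-output:/app/output" \
    usgs/nshmp-haz
```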
## Run *nshmp-haz* web services in Docker
### Build and Run Docker Locally
The Docker image may be built with the provided web service [Dockerfile](../../ws.Dockerfile).
```bash
cd /path/to/nshmp-haz
# Build docker image
docker build -f ws.Dockerfile -t nshmp-haz-ws .
# Run Docker image
docker run -p 8080:8080 -v "path/to/model:/model" nshmp-haz-ws
```
The web service runs at [http://localhost:8080/](http://localhost:8080/).
The hazard model is read in via a Docker volume.
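To verify the service started before opening a browser, a request can be made to the service root;
this is a simple liveness check and assumes no service-specific endpoints:

```bash
# Expect an HTTP response once the service has finished starting up
curl -i http://localhost:8080/
```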
#### Local Docker Example with NSHM
```bash
# Build docker image
cd /path/to/nshmp-haz
docker build -f ws.Dockerfile -t nshmp-haz-ws .
# Download NSHM CONUS
cd ..
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-conus
# Run web services
docker run -p 8080:8080 -v "$(pwd):/model" nshmp-haz-ws
```
Open a browser to [http://localhost:8080/](http://localhost:8080/).
### Run from Container Registry
A public Docker image is available from [Docker Hub](https://hub.docker.com/r/usgs/nshmp-haz-ws).
There are four main tags:
* `latest`: Latest updates from the main or production branch
* `development-latest`: Latest updates from forks of the repository
* `staging-latest`: Latest updates associated with the
  [main](https://code.usgs.gov/ghsc/nshmp/nshmp-haz/-/tree/main) branch
* `production-latest`: Latest stable release associated with the
  [production](https://code.usgs.gov/ghsc/nshmp/nshmp-haz/-/tree/production) branch
```bash
# Pull image
docker pull usgs/nshmp-haz-ws:latest
# Run
docker run -p 8080:8080 -v "/path/to/model:/model" usgs/nshmp-haz-ws
```
The web service runs at [http://localhost:8080/](http://localhost:8080/).
The hazard model is read in via a Docker volume.
#### Container Registry Example with NSHM
```bash
# Pull image
docker pull usgs/nshmp-haz-ws:latest
# Download NSHM CONUS
git clone https://code.usgs.gov/ghsc/nshmp/nshms/nshm-conus
# Run web services
docker run -p 8080:8080 -v "$(pwd)/nshm-conus:/model" usgs/nshmp-haz-ws
```
Open a browser to [http://localhost:8080/](http://localhost:8080/).
### Java Memory
When running *nshmp-haz* web services with Docker,
the initial (`-Xms`) and maximum (`-Xmx`) JVM memory sizes can
be set with the environment flag (`-e`, `--env`):
```bash
docker run -p <PORT>:8080 -e JAVA_OPTS="-Xms<INITIAL> -Xmx<MAX>" -d usgs/nshmp-haz-ws
# Example
docker run -p 8080:8080 -e JAVA_OPTS="-Xms1g -Xmx8g" -d usgs/nshmp-haz-ws
```
Where `<INITIAL>` and `<MAX>` should be set to the desired initial and maximum memory sizes,
respectively.
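Because these examples run the container detached (`-d`), startup output, including any JVM
memory errors, can be checked with standard Docker commands:

```bash
# Identify the running container, then follow its logs
docker ps
docker logs --follow <CONTAINER_ID>
```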
---
## Related Pages
* [Building & Running](./Building-&-Running.md#building-&-running)
* [Developer Basics](./Developer-Basics.md#developer-basics)
* [Calculation Configuration](./Calculation-Configuration.md#calculation-configuration)
* [Site Specification](./Site-Specification.md#site-specification)
* [Using Docker](./Using-Docker.md#using-docker)
* See also the [examples](../../etc/examples) directory
* [**Documentation Index**](../README.md)
---
![USGS logo](./images/usgs-icon.png) &nbsp;[U.S. Geological Survey](https://www.usgs.gov)
National Seismic Hazard Mapping Project ([NSHMP](https://earthquake.usgs.gov/hazards/))
# Examples # Examples
These examples are designed to be executed locally while following the READMEs on GitLab. These examples are designed to be executed locally while following the READMEs on GitLab and
assume the reader has successfully downloaded and built the code as described in the
[documentation](../../docs/README.md). Because each example builds on prior concepts, we recommend
stepping through all the examples in order.
All examples avoid a lengthy call to Java and the `HazardCalc` program by using the following All examples avoid a lengthy call to Java and the `HazardCalc` program by using the following
system alias: system alias:
...@@ -8,8 +12,6 @@ system alias: ...@@ -8,8 +12,6 @@ system alias:
alias hazard='java -cp /path/to/nshmp-haz/build/libs/nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc' alias hazard='java -cp /path/to/nshmp-haz/build/libs/nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc'
``` ```
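With this alias in place, and assuming the standard `HazardCalc` argument order of
`model sites [config]`, a typical invocation looks like the following (paths are placeholders):

```bash
# hazard <model> <sites> [config]
hazard path/to/model path/to/sites.geojson path/to/config.json
```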
Because each example builds on prior concepts, it is best step through all the examples, however quickly.
<!-- markdownlint-disable MD001 --> <!-- markdownlint-disable MD001 -->
#### Start: [Example 1 – A simple hazard calculation](1-hazard-curve/README.md) #### Start: [Example 1 – A simple hazard calculation](1-hazard-curve/README.md)
......
dependencies { dependencies {
// NSHMP // NSHMP
......
...@@ -5,7 +5,6 @@ ...@@ -5,7 +5,6 @@
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "nshmp-haz",
"version": "2.0.0", "version": "2.0.0",
"devDependencies": { "devDependencies": {
"markdownlint-cli": "^0.31.1", "markdownlint-cli": "^0.31.1",
......
...@@ -260,7 +260,7 @@ public class DisaggCalc { ...@@ -260,7 +260,7 @@ public class DisaggCalc {
.splitToList(line) .splitToList(line)
.stream() .stream()
.skip(colsToSkip) .skip(colsToSkip)
.mapToDouble(Double::valueOf) .mapToDouble(Double::parseDouble)
.toArray(); .toArray();
EnumMap<Imt, Double> imtImlMap = new EnumMap<>(Imt.class); EnumMap<Imt, Double> imtImlMap = new EnumMap<>(Imt.class);
......
...@@ -107,7 +107,7 @@ public class HazardMaps { ...@@ -107,7 +107,7 @@ public class HazardMaps {
double[] imls = header.subList(headerCount, header.size()) double[] imls = header.subList(headerCount, header.size())
.stream() .stream()
.mapToDouble(Double::valueOf) .mapToDouble(Double::parseDouble)
.toArray(); .toArray();
StringBuilder mapHeader = new StringBuilder(siteStr); StringBuilder mapHeader = new StringBuilder(siteStr);
...@@ -176,7 +176,7 @@ public class HazardMaps { ...@@ -176,7 +176,7 @@ public class HazardMaps {
double[] rates = elements double[] rates = elements
.stream() .stream()
.skip(headerCount) .skip(headerCount)
.mapToDouble(Double::valueOf) .mapToDouble(Double::parseDouble)
.toArray(); .toArray();
for (double returnPeriod : returnPeriods) { for (double returnPeriod : returnPeriods) {
......