# Legacy nshmp-haz

[![Build Status](https://travis-ci.com/usgs/nshmp-haz.svg?branch=master)](https://travis-ci.com/usgs/nshmp-haz)
[![codecov](https://codecov.io/gh/usgs/nshmp-haz/branch/master/graph/badge.svg)](https://codecov.io/gh/usgs/nshmp-haz)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/26ee9b93f61d4ee097c611caa17875ab)](https://www.codacy.com/app/pmpowers/nshmp-haz?utm_source=github.com&utm_medium=referral&utm_content=usgs/nshmp-haz&utm_campaign=Badge_Grade)

…Mapping Project ([NSHMP](https://earthquake.usgs.gov/hazards/)) code for performing probabilistic seismic hazard (PSHA) and related analyses. These codes are intended for use with seismic hazard models developed by the NSHMP for the U.S. and its territories. Please see the [docs](./docs/README.md) for more information.

> This repository has been moved to GitLab:
> [https://code.usgs.gov/ghsc/nshmp/nshmp-haz](https://code.usgs.gov/ghsc/nshmp/nshmp-haz).
<!-- Overview -->
### Overview
* [Build and Run Locally](#build-and-run-locally)
* [Run with Docker](#run-with-docker)
<!-- Build and Run Locally -->
## Build and Run Locally
### Build Requirements
Building and running *nshmp-haz* requires prior installation of Git and Java.
Please see [this page](developer-basics) for system configuration guidance.
### Building
Navigate to a location on your system where you want *nshmp-haz* code to reside, clone the repository, and compile:
```bash
cd /path/to/project/directory
git clone https://github.com/usgs/nshmp-haz.git
cd nshmp-haz
./gradlew assemble
```
This creates a single file, `build/libs/nshmp-haz.jar`, with no dependencies for use in hazard calculations. `./gradlew` executes the Gradle Wrapper script (there is a `gradlew.bat` equivalent for Windows users using the native command prompt). This executes any tasks (e.g. `assemble`) after downloading all required dependencies, Gradle itself included.
### Computing Hazard
The [`HazardCalc`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/HazardCalc.html) program computes hazard curves at one or more sites for a variety of intensity measures. Computing hazard curves requires at least 2, and at most 3, arguments. For example:
```bash
java -cp nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc model sites [config]
```
At a minimum, the [source model](Source-Models) and the [site(s)](Sites) at which to perform calculations must be specified. The source model should be specified as a path to a directory or zip file. A single site may be specified with a string; multiple sites must be specified using either a comma-delimited (CSV) or [GeoJSON](http://geojson.org) file. Under the 2-argument scenario, model initialization and calculation configuration settings are drawn from the [configuration](configuration) file that *must* reside at the root of the model directory. Alternatively, the path to a custom configuration file may also be supplied as a third argument. It can be used to override any calculation settings, but not model initialization settings.
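For example, a hazard calculation for a single site might look like the following, where the model path and the site string are placeholders, not real inputs:
```bash
java -cp nshmp-haz.jar gov.usgs.earthquake.nshmp.HazardCalc \
  /path/to/model-directory \
  "Los Angeles CA,-118.25,34.05"
```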
See the [examples](/usgs/nshmp-haz/tree/master/etc/examples) directory for more details.
### Computing Deaggregations
Like `HazardCalc`, the [`DeaggCalc`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/DeaggCalc.html) program performs deaggregations at one or more sites for a variety of intensity measures, but requires an additional `returnPeriod` argument, in years. For example:
```bash
java -cp nshmp-haz.jar gov.usgs.earthquake.nshmp.DeaggCalc model sites returnPeriod [config]
```
Deaggregations build on and output `HazardCalc` results along with other deaggregation-specific files. Deaggregations also have some independent [configuration](configuration#config-deagg) options.
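For example, a deaggregation at a single site for a 2475-year return period might be run as follows (again, the model path and site string are placeholders):
```bash
java -cp nshmp-haz.jar gov.usgs.earthquake.nshmp.DeaggCalc \
  /path/to/model-directory \
  "Los Angeles CA,-118.25,34.05" \
  2475
```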
### Computing Earthquake Rates and Probabilities
*TODO*
### Matlab Access
Another convenient way to use the nshmp-haz library is via Matlab. A [sample script](/usgs/nshmp-haz/blob/master/etc/matlab/) contains the necessary instructions.
<!-- Run with Docker -->
## Run with Docker
### Build Requirements
- [Docker](https://docs.docker.com/install/)
### Getting Latest Docker Image
To ensure you are running the latest version of nshmp-haz, pull the image from Docker Hub:
```bash
docker pull usgs/nshmp-haz-legacy
```
### Docker Memory on Mac
By default, Docker Desktop for Mac is set to use 2 GB runtime memory. To run nshmp-haz, the
memory available to Docker must be [increased](https://docs.docker.com/docker-for-mac/#advanced)
to a minimum of 4 GB.
### Running
The nshmp-haz application may be run as a Docker container, which removes the need to install
Git, Java, or any dependencies other than Docker. A public image is available on
Docker Hub at <https://hub.docker.com/r/usgs/nshmp-haz-legacy>
which can be run with:
```bash
docker run \
-e PROGRAM=<deagg | hazard | rate> \
-e MODEL=<WUS-20[08|14|18] | CEUS-20[08|14|18] | AK-2007> \
-e RETURN_PERIOD=<RETURN_PERIOD> \
-v /absolute/path/to/sites/file:/app/sites.<geojson | csv> \
-v /absolute/path/to/config/file:/app/config.json \
-v /absolute/path/to/output:/app/output \
usgs/nshmp-haz-legacy
# Example
docker run \
-e PROGRAM=hazard \
-e MODEL=WUS-2018 \
-v $(pwd)/sites.geojson:/app/sites.geojson \
-v $(pwd)/config.json:/app/config.json \
-v $(pwd)/hazout:/app/output \
usgs/nshmp-haz-legacy
```
Where:
* `PROGRAM` is the nshmp-haz program to run:
* deagg = [`DeaggCalc`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/DeaggCalc.html)
* hazard = [`HazardCalc`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/HazardCalc.html)
* rate = [`RateCalc`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/RateCalc.html)
* `MODEL` is the [USGS model](https://github.com/usgs/nshmp-haz/wiki/usgs-models) to run with:
* CEUS-2018 or WUS-2018: [Conterminous U.S. 2018](https://github.com/usgs/nshm-cous-2018)
* CEUS-2014 or WUS-2014: [Conterminous U.S. 2014](https://github.com/usgs/nshm-cous-2014)
* CEUS-2008 or WUS-2008: [Conterminous U.S. 2008](https://github.com/usgs/nshm-cous-2008)
* AK-2007: [Alaska 2007](https://github.com/usgs/nshm-ak-2007)
* `RETURN_PERIOD`, in years, is only required when running a deaggregation
* The [site(s)](Sites) file must be a GeoJSON or CSV file, and its absolute path must be given
* CSV example: `$(pwd)/my-csv-sites.csv:/app/sites.csv`
* GeoJSON example: `$(pwd)/my-geojson-sites.geojson:/app/sites.geojson`
* The [configuration](configuration) file is optional; if supplied, its absolute path must be given
* Example: `$(pwd)/my-custom-config.json:/app/config.json`
* To obtain any results from the Docker container, the absolute path to an output directory must be given
* Example: `$(pwd)/my-hazard-output:/app/output`
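Putting these pieces together, a deaggregation run might look like the following sketch (the model, return period, and paths are illustrative only):
```bash
docker run \
  -e PROGRAM=deagg \
  -e MODEL=WUS-2018 \
  -e RETURN_PERIOD=2475 \
  -v $(pwd)/sites.geojson:/app/sites.geojson \
  -v $(pwd)/deagg-output:/app/output \
  usgs/nshmp-haz-legacy
```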
#### Running Customization
When running nshmp-haz with Docker, the initial (Xms) and maximum (Xmx) JVM memory sizes can
be set with the environment flag (`-e`, `--env`):
```bash
docker run \
-e JAVA_XMS=<JAVA_XMS> \
-e JAVA_XMX=<JAVA_XMX> \
...
usgs/nshmp-haz-legacy
```
Where:
* `JAVA_XMS` is the initial memory for the JVM
* Default: 8g
* `JAVA_XMX` is the maximum memory for the JVM
* Default: 8g
### Next: [Configuration](configuration)
A `config.json` file resides at the root of every [source model](source-models).
This file supplies model initialization and default calculation configuration parameters; see the [examples](/usgs/nshmp-haz/tree/master/etc/examples) directory, or any [USGS model](usgs-models), for examples.
The file must be composed of well-formed [JSON](http://json.org). [Why JSON?](#why-use-json)
### Model Initialization Parameters
Model initialization parameters *must* be supplied; there are no default values. In addition, these parameters may *not* be overridden once a model has been initialized in memory. However, one can configure parts of a model differently, [for example](/usgs/nshmp-model-cous-2014/blob/master/Western%20US/Interface/config.json).
Parameter | Type | Notes |
--------- | ---- | ----- |
__`model`__ |
&nbsp;&nbsp;&nbsp;`.name` |`String` |
&nbsp;&nbsp;&nbsp;`.surfaceSpacing` |`Double` |(in km)
&nbsp;&nbsp;&nbsp;`.ruptureFloating` |`String` |[RuptureFloating](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/fault/surface/RuptureFloating.html)
&nbsp;&nbsp;&nbsp;`.ruptureVariability` |`Boolean` |
&nbsp;&nbsp;&nbsp;`.pointSourceType` |`String` |[PointSourceType](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/PointSourceType.html)
&nbsp;&nbsp;&nbsp;`.areaGridScaling` |`String` |[AreaSource.GridScaling](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/AreaSource.GridScaling.html)
### Calculation Configuration Parameters
Calculation configuration parameters are optional (i.e. defaults are used for missing values) and overridable. Please see [building and running](building-&-running) and the [examples](/usgs/nshmp-haz/tree/master/etc/examples) for details on how to override a calculation configuration.
Parameter | Type | Default | Notes |
--------- | ---- | ------- | ----- |
<a id="config-hazard"></a>__`hazard`__ |
&nbsp;&nbsp;&nbsp;`.exceedanceModel` |`String` | `TRUNCATION_UPPER_ONLY` |[`ExceedanceModel`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/ExceedanceModel.html)
&nbsp;&nbsp;&nbsp;`.truncationLevel` |`Double` | `3.0`
&nbsp;&nbsp;&nbsp;`.imts` |`String[]` | `[ PGA, SA0P2, SA1P0 ]` |[`Imt`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/gmm/Imt.html)
&nbsp;&nbsp;&nbsp;`.defaultImls` |`Double[]` | PGA, SA = `[ 0.0025, 0.0045, 0.0075, 0.0113, 0.0169, 0.0253, 0.0380, 0.0570, 0.0854, 0.128, 0.192, 0.288, 0.432, 0.649, 0.973, 1.46, 2.19, 3.28, 4.92, 7.38 ]`<br/><br/>PGV = `[ 0.0100, 0.0177, 0.0312, 0.0552, 0.0976, 0.173, 0.305, 0.539, 0.953, 1.68, 2.98, 5.26, 9.30, 16.4, 29.1, 51.3, 90.8, 160, 284, 501 ]`
&nbsp;&nbsp;&nbsp;`.customImls` |`Map<String, Double[]>` | *empty* |[example](/usgs/nshmp-haz/blob/master/etc/examples/2-custom-config/config.json)
&nbsp;&nbsp;&nbsp;`.gmmUncertainty` |`Boolean` | `false` |[:one:](#one-hazardgmmuncertainty)
&nbsp;&nbsp;&nbsp;`.valueFormat` |`String` | `ANNUAL_RATE` |[`ValueFormat`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/ValueFormat.html)
<a id="config-deagg"></a>__`deagg`__ | | |[:two:](#two-deagg)
&nbsp;&nbsp;&nbsp;`.bins` |`Object` |
&nbsp;&nbsp;&nbsp;`.contributorLimit` |`Double` | `0.1`
<a id="config-rate"></a>__`rate`__ |
&nbsp;&nbsp;&nbsp;`.bins` |`Object` | |[:three:](#three-ratebins)
&nbsp;&nbsp;&nbsp;`.distance` |`Double` | `20` km
&nbsp;&nbsp;&nbsp;`.distributionFormat` |`String` | `INCREMENTAL` |[`DistributionFormat`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/DistributionFormat.html)
&nbsp;&nbsp;&nbsp;`.timespan` |`Double` | `30` years
&nbsp;&nbsp;&nbsp;`.valueFormat` |`String` | |[`ValueFormat`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/ValueFormat.html)
<a id="config-site"></a>__`site`__ |
&nbsp;&nbsp;&nbsp;`.vs30` |`Double` | `760.0` |[`Site`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/Site.html)
&nbsp;&nbsp;&nbsp;`.vsInferred` |`Boolean` | `true`
&nbsp;&nbsp;&nbsp;`.z1p0` |`Double` | `null` or `NaN` |[:four:](#four-sitez1p0-sitez2p5)
&nbsp;&nbsp;&nbsp;`.z2p5` |`Double` | `null` or `NaN` |[:four:](#four-sitez1p0-sitez2p5)
<a id="config-output"></a>__`output`__ |
&nbsp;&nbsp;&nbsp;`.directory` |`String` | `hazout`
&nbsp;&nbsp;&nbsp;`.dataTypes` |`String[]` | `[ TOTAL ]` |[`DataType`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/DataType.html)
<a id="config-performance"></a>__`performance`__ |
&nbsp;&nbsp;&nbsp;`.optimizeGrids` |`Boolean` | `true` |[:six:](#six-performanceoptimizegrids)
&nbsp;&nbsp;&nbsp;`.collapseMfds` |`Boolean` | `true` |[:seven:](#seven-performancecollapsemfds)
&nbsp;&nbsp;&nbsp;`.systemPartition` |`Integer` | `1000` |[:eight:](#eight-performancesystempartition)
&nbsp;&nbsp;&nbsp;`.threadCount` |`String` | `ALL` |[`ThreadCount`](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/calc/ThreadCount.html)
###### :one: `hazard.gmmUncertainty`
If additional epistemic uncertainty on ground motion has been defined for a model, this setting enables or disables it.
###### :two: `deagg`
The `deagg.bins` field is mapped to a data container that specifies the following default ranges and intervals for distance, magnitude, and epsilon binning:
``` JSON
"bins": {
"rMin": 0.0,
"rMax": 200.0,
"Δr": 10.0,
"mMin": 5.0,
"mMax": 8.4,
"Δm": 0.2,
"εMin": -3.0,
"εMax": 3.0,
"Δε": 0.5
}
```
The `bins` object must be fully specified; partial overrides do not apply to nested JSON objects.
`contributorLimit` specifies the cutoff (in %) below which contributing sources are not listed in deaggregation results.
###### :three: `rate.bins`
The `rate.bins` field is mapped to a data container that specifies the following default magnitude binning range and interval:
``` JSON
"bins": {
"mMin": 4.2,
"mMax": 9.4,
"Δm": 0.1
}
```
The `bins` object must be fully specified; partial overrides do not apply to nested JSON objects.
###### :four: `site.z1p0`, `site.z2p5`
Basin terms may be specified as `null` or `NaN` (both unquoted). `null` is preferred as `NaN` does not conform to the JSON spec. When trying to override default values, however, a `null` term will be ignored whereas `NaN` will override any existing value.
###### :five: `output.flushLimit`
The number of results to write at a time. More memory is required for higher limits.
###### :six: `performance.optimizeGrids`
Grid optimizations are currently implemented for any non-fixed strike grid source. For any site, rates across all azimuths are aggregated in tables of distance and magnitude.
###### :seven: `performance.collapseMfds`
Some magnitude-frequency distribution logic-trees can be combined. This is currently always `true`.
###### :eight: `performance.systemPartition`
The number of ruptures in a fault-system source to process concurrently.
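Any of the calculation parameters above may be overridden by supplying a partial `config.json`, either as a third argument to `HazardCalc` or mounted into the Docker container. A minimal sketch, following the parameter groupings in the table above (the values shown are illustrative only):
```JSON
{
  "hazard": {
    "imts": ["PGA", "SA0P2", "SA1P0", "SA2P0"],
    "truncationLevel": 3.0
  },
  "site": {
    "vs30": 260.0
  },
  "output": {
    "directory": "my-hazard-output"
  }
}
```
Note that nested objects such as `deagg.bins` and `rate.bins` must be fully specified if overridden (see the notes above).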
### Why Use JSON?
Whereas sources are defined using XML, configuration files use JSON. Although one format or the other could be used exclusively, each has unique characteristics and benefits. For instance, for long files, XML is somewhat more readable than JSON, and it permits inline comments, whereas JSON does not. Java has built-in support for writing highly performant XML parsers, and because there is not a 1:1 relationship between source model XML and the resultant Java objects in *nshmp-haz*, XML is a bit easier to work with. XML also permits a formal structure to be imposed, and although XML schemas are not presently used, they may be in the future.
JSON, on the other hand, is a very simple data exchange format designed with the web in mind. The values that govern calculation settings are most likely to be exposed via web services, and the job of exchanging such data is much more straightforward using a format that maps natively to JavaScript objects.
### Next: [Site Specification](sites)
This project loosely adheres to [Google Java Style](https://google.github.io/styleguide/javaguide.html) and is informed by [Effective Java](http://en.wikipedia.org/wiki/Joshua_Bloch#Effective_Java). All source files and documentation are encoded using [UTF-8](https://en.wikipedia.org/?title=UTF-8). Developers should also consider the following:
* Line lengths:
* code: __100__, but shorter is better.
* comments: __80__, for readability.
* Use unicode symbols where possible: `double μ = 0.4;`
* Use static imports for only the most common utilities and preconditions:
* `java.lang.Math`
* `com.google.common.base.Preconditions`
### Dependencies
The following are required to build and run *nshmp-haz*. Additional testing and code coverage dependencies are listed in the [license](/usgs/nshmp-haz/blob/master/LICENSE.md).
1. [Java 8 JDK](http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html) (or higher)
2. [Ant](http://ant.apache.org)
3. [Guava](http://code.google.com/p/guava-libraries/)
4. [Gson](https://code.google.com/p/google-gson/)
The following provides basic guidance on how to set up command-line use of nshmp-haz.
### What are Git and GitHub?
* __Git__ is a distributed version control system ([more](https://git-scm.com)).
* __GitHub__ is a public hosting service that facilitates sharing and collaborative development of code and provides a home on the web for documentation and other project resources ([more](https://help.github.com/categories/bootcamp/)).
* Git repositories may be created anywhere on your system; they may be created locally and pushed to GitHub for sharing, or created on GitHub and cloned to your local system for editing.
### Required Software
* Java 8 JDK: [Oracle](http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html) or [OpenJDK](http://openjdk.java.net/install/)
* [Git](https://git-scm.com/downloads)
- Git is included in the macOS [developer tools](https://developer.apple.com/xcode/).
- Windows users may want to consider [Git for Windows](https://git-for-windows.github.io) or [GitHub Desktop](https://desktop.github.com), both of which include a Linux-like terminal (Git Bash) in which subsequent commands listed here will work.
Other dependencies are managed with [Gradle](https://gradle.org/), which does not require a separate installation. Gradle is clever about finding Java, but some users may have to explicitly define a `JAVA_HOME` environment variable. For example, on Unix-like systems with `bash` as the default shell, one might add the following to `~/.bash_profile`:
```bash
# macOS
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
# Linux
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_144
```
On Windows systems, environment variables are set through the `System Properties > Advanced > Environment Variables...` control panel. Depending on where Java is installed, `JAVA_HOME` might be:
```bash
JAVA_HOME C:\Program Files\Java\jdk1.8.0_144
```
### Set Up Git
Follow the instructions on [GitHub](https://help.github.com/articles/set-up-git). Some users may find it easier to use [Git for Windows](https://git-for-windows.github.io) or [GitHub Desktop](https://desktop.github.com). This installs the components your system requires along with a helpful desktop application for managing communication between local repositories and GitHub and for viewing file diffs as you make changes. Launch the application and perform any configuration steps, using your GitHub username and email address.
### Get the Code
```bash
cd /directory/for/code
git clone https://github.com/usgs/nshmp-haz.git
```
### Eclipse Integration (Optional)
Eclipse provides automatic compilation, syntax highlighting, and integration with Git, among other useful features. To build or modify *nshmp-haz* using [Eclipse](http://www.eclipse.org/), install:
* Eclipse IDE for [Java Developers](http://www.eclipse.org/downloads/packages/eclipse-ide-java-developers/oxygenr) or [Java EE Developers](http://www.eclipse.org/downloads/packages/eclipse-ide-java-ee-developers/oxygenr), if you plan on developing web services.
* `File > Import > Gradle > Existing Gradle Project`
### Return to: [Building & Running](building-&-running)
### Abstract
Probabilistic seismic hazard analysis (PSHA; Cornell, 1968) is elegant in its relative simplicity. However, in the more than 40 years since its publication, the methodology has come to be applied to increasingly complex and non-standard source and ground motion models. For example, the third Uniform California Earthquake Rupture Forecast ([UCERF3](http://pubs.usgs.gov/of/2013/1165/)) upended the notion of discrete faults as independent sources, and the USGS national seismic hazard model uses temporally clustered sources. Moreover, as the logic trees typically employed in PSHAs to capture epistemic uncertainty grow larger, so too does the demand for a more complete understanding of uncertainty. At the USGS, there are additional requirements to support source model mining, deaggregation, and map-making, often through the use of dynamic web applications. Implementations of the PSHA methodology commonly iterate over all sources that influence the hazard at a site and sequentially build a single hazard curve. Such a linear PSHA computational pipeline, however, proves difficult to maintain and modify to support the additional complexity of new models, hazard products, and analyses. The functional programming paradigm offers some relief. The functional approach breaks calculations down into their component parts or steps, storing intermediate results as immutable objects, making it easier to: chain actions together; preserve intermediate data or results that may still be relevant (e.g. as in a deaggregation); and leverage the concurrency supported by many modern programming languages.
#### Traditional PSHA formulation (after Baker, 2013):
![PSHA formulation of Baker (2013)](images/psha-formula.png)
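Written out (a transcription of the standard formulation pictured above, after Baker, 2013):
```latex
\lambda(IM > x) =
  \sum_{i=1}^{n_{sources}} \lambda(M_i > m_{min})
  \sum_{j=1}^{n_M} \sum_{k=1}^{n_R}
  P(IM > x \mid m_j, r_k)\, P(M_i = m_j)\, P(R_i = r_k)
```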
Briefly, the rate, *λ*, of exceeding an intensity measure, *IM*, level may be computed as a summation of the rate of exceeding such a level for all relevant earthquake sources (discretized in magnitude, *M*, and distance, *R*). This formulation relies on models of ground motion that give the probability that an intensity measure level of interest will be exceeded conditioned on the occurrence of a particular earthquake. Such models are commonly referred to as:
* __Intensity measure relationships__
* __Attenuation relationships__
* __Ground motion prediction equations (GMPEs)__
* __Ground motion models (GMMs)__
The parameterization of modern models (e.g. NGA-West2; Bozorgnia et al., 2014) extends to much more than magnitude and distance, including, but not limited to:
* __Multiple distance metrics__ (e.g. rJB, rRup, rX, rY)
* __Fault geometry__ (e.g. dip, width, rupture depth, hypocentral depth)
* __Site characteristics__ (e.g. basin depth terms, site type or Vs30 value)
#### Simple, yes, but used for so much more…
While this formulation is relatively straightforward and is typically presented with examples for a single site, using a single GMM, and a nominal number of sources, modern PSHAs commonly include:
* Multiple thousands of sources (e.g. the 2014 USGS NSHM in the Central & Eastern US includes all smoothed seismicity sources out to 1000 km from a site).
* Different source types, the relative contributions of which are important, and the GMM parameterizations of which may be different.
* Sources (and associated ruptures – source filling or floating) represented by logic trees of magnitude-frequency distributions (MFDs).
* Source MFDs subject to logic trees of uncertainty on Mmax, total rate (for the individual source, or over a region, e.g. as in UCERF3) or other properties of the distribution.
* Logic trees of magnitude scaling relations for each source.
* Source models that do not adhere to the traditional formulation (e.g. cluster models of the NSHM).
* Logic trees of ground motion models.
#### And further extended to support…
* Response Spectra, Conditional Mean Spectra – multiple intensity measure types (IMTs; e.g. PGA, PGD, PGV, multiple SAs)
* Deaggregation
* Banded deaggregation (multiple deaggregations at varying IMLs)
* Maps – many thousands of sites
* Uncertainty analyses
#### How are such calculations managed?
* PSHA codes typically compute hazard in a linear fashion, looping over all relevant sources for a site.
* Adding more GMMs, logic trees, IMTs, and sites is addressed with additional outer loops:
```PHP
foreach IMT {
foreach Site {
foreach SourceType {
foreach GMM {
foreach Source {
// do something
}
}
}
}
}
```
* Support for secondary analyses, such as deaggregation, is supplied by a separate code or codes and can require repeating many of the steps performed to generate an initial hazard curve.
#### What about scalability, maintenance, and performance?
* Although scalability can be addressed for secondary products, such as maps, by distributing individual site calculations over multiple processors and threads, it is often difficult to leverage multi-core systems for an individual site calculation. This hampers one's ability to keep pace with ever more complex source and ground motion models and their respective logic trees.
* A linear pipeline complicates testing, requiring end-to-end tests rather than tests of discrete calculations.
* Multiple codes repeating identical tasks invite error and complicate maintenance by multiple individuals.
#### Enter functional programming…
* http://en.wikipedia.org/wiki/Functional_programming
* Functional programming languages have been around for some time (e.g. Haskell, Lisp, R), and fundamental aspects of functional programming/design are common in many languages. For example, a cornerstone of the functional paradigm is the anonymous (or lambda) function; in Matlab, one may write `sqr = @(x) x.^2;`.
* In Matlab, one may pass function 'handles' (references) to other functions as arguments. This is also possible in JavaScript, where such handles serve as callbacks. Given the rise in popularity of the functional style, Java 8 recently added constructs in the form of the function and streaming APIs, and libraries exist for other languages.
#### How do PSHA and related calculations leverage such an approach?
Break the traditional PSHA formulation down into discrete steps and preserve the data associated with each step:
* **[1]** Source & Site parameterization
* **[2]** Ground motion calculation (mean and standard deviation only)
* **[3]** Exceedance curve calculation (per source)
* **[4]** Recombine
Whereas the traditional pipeline looks something like this:
![PSHA linear pipeline](images/psha-linear.png)
The functional pipeline can be processed stepwise:
![PSHA functional pipeline](images/psha-functional.png)
**Need a deaggregation?** Revisit and parse the results of steps 1 and 2.
**Need a response spectrum?** Spawn more calculations, one for each IMT, at step 2.
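As a rough illustration only (this is **not** the nshmp-haz API; the types, the toy GMM, and the logistic stand-in for the lognormal exceedance model are all invented for this sketch), such a stepwise pipeline might be expressed with Java 8 streams, keeping intermediate results available for reuse:
```Java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch only; NOT the nshmp-haz API. It demonstrates the stepwise
// decomposition above: [2] compute ground motions (mean, sigma) per rupture,
// [3] derive per-source exceedance curves, and [4] recombine, keeping the
// intermediate results available for reuse (e.g. deaggregation).
public class FunctionalPshaSketch {

  static final class Rupture {
    final double magnitude, distance, annualRate;
    Rupture(double magnitude, double distance, double annualRate) {
      this.magnitude = magnitude;
      this.distance = distance;
      this.annualRate = annualRate;
    }
  }

  static final class GroundMotion {
    final Rupture rupture;
    final double mean, sigma; // mean ln(IM) and its standard deviation
    GroundMotion(Rupture rupture, double mean, double sigma) {
      this.rupture = rupture;
      this.mean = mean;
      this.sigma = sigma;
    }
  }

  public static void main(String[] args) {
    List<Rupture> ruptures = Arrays.asList(
        new Rupture(6.5, 10.0, 1.0e-3),
        new Rupture(7.0, 25.0, 5.0e-4));
    double[] imls = { 0.1, 0.2, 0.4, 0.8 };

    // [2] Ground motion step: a toy GMM giving mean ln(IM) and sigma.
    List<GroundMotion> groundMotions = ruptures.stream()
        .map(r -> new GroundMotion(r,
            -1.0 + 0.5 * (r.magnitude - 6.0) - 0.01 * r.distance, 0.6))
        .collect(Collectors.toList());

    // [3] Per-source exceedance curves, [4] summed into a total rate curve.
    double[] totalCurve = groundMotions.stream()
        .map(gm -> exceedanceCurve(gm, imls))
        .reduce(new double[imls.length], FunctionalPshaSketch::add);

    System.out.println(Arrays.toString(totalCurve));
    // 'groundMotions' is still in scope here and could be revisited, e.g.
    // binned by magnitude and distance for a deaggregation, without recomputation.
  }

  // Toy exceedance model: a logistic stand-in for the lognormal CCDF.
  static double[] exceedanceCurve(GroundMotion gm, double[] imls) {
    double[] rates = new double[imls.length];
    for (int i = 0; i < imls.length; i++) {
      double p = 1.0 / (1.0 + Math.exp((Math.log(imls[i]) - gm.mean) / gm.sigma));
      rates[i] = gm.rupture.annualRate * p;
    }
    return rates;
  }

  static double[] add(double[] a, double[] b) {
    double[] c = new double[a.length];
    for (int i = 0; i < a.length; i++) {
      c[i] = a[i] + b[i];
    }
    return c;
  }
}
```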
#### Benefits:
* It’s possible to build a single calculation pipeline that will handle a standard hazard curve calculation and all of its extensions without repetition.
* Pipeline performance scales with available hardware.
* No redundant code.
* Can add or remove transforms or data at any point in the pipeline, or build new pipelines without adversely affecting existing code.
#### Drawbacks:
* Greater memory requirements.
* Additional (processor) work to manage the flow of calculation steps.
#### References
* Baker J.W. (2013). An Introduction to Probabilistic Seismic Hazard Analysis (PSHA), White Paper, Version 2.0, 79 pp.
* Bozorgnia, Y., et al. (2014) NGA-West2 Research Project, *Earthquake Spectra*, Vol. 30, No. 3, pp. 973-987.
* Cornell, C.A., 1968, Engineering seismic risk analysis, *Bulletin of the Seismological Society of America*, Vol. 58, No. 5, pp. 1583-1606.
Ground motion models (GMMs) forecast the range of ground motions that may be observed conditioned on the occurrence of various earthquakes. Internally, every earthquake in a `HazardModel` is parameterized as a `GMM_Input`, for which ground motions can be calculated.
### Ground Motion Model File Structure
```XML
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<GroundMotionModels>
<ModelSet maxDistance="150.0">
<Model id="BA_08" weight="0.3333"/>
<Model id="CB_08" weight="0.3333"/>
<Model id="CY_08" weight="0.3334"/>
</ModelSet>
</GroundMotionModels>
```
Ground motion model files also provide limited support for additional epistemic uncertainty on ground motion ([example](https://github.com/usgs/nshmp-model-cous-2008/blob/master/Western%20US/Fault/gmm.xml)) and will likely support uncertainty on ground motion sigma with the anticipated implementation of [NGA-East](http://peer.berkeley.edu/ngaeast/). The format for these uncertainty models is subject to change at any point in the future.
A second distance cutoff is also supported ([example](https://github.com/usgs/nshmp-model-cous-2014/blob/master/Central%20%26%20Eastern%20US/Fault/gmm.xml)). The models specified at the greater distance must be a subset of the near-distance models, but may have different weights. This feature may be removed in future updates.
The weights in any set of ground motion models need not sum to 1.0, but this may be enforced in the future.
### Next: [USGS Models](usgs-models)
This page documents known implementation differences between nshmp-haz, USGS Fortran codes, and OpenSHA.
* __NGA Focal Mechanism Flags:__ Whereas BA 2008 and other models accepted double values for focal mechanism weights (f_s, f_r, f_n, f_u), their intended use was for only one of them to be set to 1 and the rest to 0. The 2008 Fortran codes supplied fractions (e.g. f_s=0.5 and f_r=0.5). The current code treated each focal mechanism type independently and then added the weighted hazard curves. This was corrected in the 2014 NSHM Fortran code, but nshmp-haz behavior remains inconsistent with the 2008 Fortran behavior.
***nshmp-haz*** is a Java-based platform for conducting probabilistic seismic hazard (PSHA) and related analyses. It is developed and maintained by the National Seismic Hazard Mapping Project ([NSHMP](http://earthquake.usgs.gov/hazards/)) within the U.S. Geological Survey's ([USGS](https://www.usgs.gov)) Earthquake Hazards Program ([EHP](http://earthquake.usgs.gov)). The primary goals of *nshmp-haz* are to:
1. Enable high-performance seismic hazard calculations required to generate detailed maps over large areas, and
2. Support a variety of USGS web services and applications related to seismic hazards research and the dissemination of hazard data.
### Overview
* [Building & Running](building-&-running)
* [Configuration](configuration)
* [Site Specification](sites)
* [Source Models](source-models)
* [Source Types](source-types)
* [Ground Motion Models](ground-motion-models)
* [USGS Models](usgs-models)
* [Examples](https://github.com/usgs/nshmp-haz/tree/master/etc/examples)
* [Dependencies & Development](dependencies-&-development)
* [Functional PSHA](functional-psha)
* [API Docs](http://usgs.github.io/nshmp-haz/javadoc)
* [License](https://github.com/usgs/nshmp-haz/blob/master/LICENSE.md)
For more information on PSHA see:
* [Probabilistic Seismic Hazard Analysis, a Primer](http://www.opensha.org/sites/opensha.org/files/PSHA_Primer_v2_0.pdf) by Edward Field
* [An Introduction to Probabilistic Seismic Hazard Analysis](http://web.stanford.edu/~bakerjw/Publications/Baker_(2015)_Intro_to_PSHA.pdf) by Jack Baker
### Next: [Building & Running](building-&-running)
The sites at which to perform hazard and related calculations may be defined in a variety of ways. Examples of the file formats described below are available in the resource directory: [`etc/nshm`](/usgs/nshmp-haz/tree/master/etc/nshm).
__Note on Coordinates:__ *nshmp-haz* supports longitude and latitude values in the closed ranges `[-360° ‥ 360°]` and `[-90° ‥ 90°]`. Note, however, that mixing site and/or source coordinates across the antimeridian (the -180° to 180° transition) will yield unexpected results. For Pacific models and calculations, always use positive or negative longitudes exclusively.
### Site String
In the simple case where there is a single site of interest, most *nshmp-haz* programs accept a comma-delimited string of the form: `name,lon,lat[,vs30,vsInf[,z1p0,z2p5]]`, where `vs30`, `vsInf`, `z1p0`, and `z2p5` are optional. Note that if `vs30` is supplied, so too must `vsInf`. Likewise if `z1p0` is supplied, so too must `z2p5`. If the string contains any spaces, escape them or wrap the entire string in double quotes.
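For example, both of the following are valid site strings (the values are illustrative):
```
"Los Angeles CA,-118.25,34.05"
"Los Angeles CA,-118.25,34.05,760.0,true"
```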
For any site parameter values that are not supplied on the command line or in the file formats below, the following defaults are used (see the [site defaults configuration](configuration#calculation-configuration-parameters)):
```
name: Unnamed
vs30: 760.0
vsInf: true
z1p0: NaN (GMM will use default basin depth model)
z2p5: NaN (GMM will use default basin depth model)
```
### Comma-Delimited Format (\*.csv)
* Header row must identify columns.
* Valid and [optional] column names are:
`[name,] lon, lat [, vs30] [, vsInf] [, z1p0] [, z2p5]`
* At a minimum, `lon` and `lat` must be defined.
* Columns can be in any order and any missing fields will be populated with the default values listed above.
* If a site `name` is supplied, it is included in the first column of any output curve files.
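A small illustrative file (column order and values are arbitrary):
```
name,lon,lat,vs30
Site A,-122.25,37.80,760.0
Site B,-118.25,34.05,560.0
```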
### GeoJSON Format (\*.geojson)
Although more verbose than the comma-delimited format, [GeoJSON](http://geojson.org) is a more versatile format for encoding geographic data. Moreover, GeoJSON files are rendered directly on GitHub, providing immediate visual feedback as to the contents of a file. See the [GitHub GeoJSON](https://help.github.com/articles/mapping-geojson-files-on-github/) page for more information.
*nshmp-haz* uses GeoJSON for both lists of sites and to define map regions. If you encounter problems when formatting JSON files, use [JSONLint](http://jsonlint.com) or [GeoJSONLint](http://geojsonlint.com) for validation.
#### Site Lists
A site list is expected as a `Point` `FeatureCollection`. For example:
```JSON
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [-122.25, 37.80]
},
"properties": {
"title": "Oakland CA",
"vs30": 760.0,
"vsInf": true,
"z1p0": 0.048,
"z2p5": 0.607
}
}
]
}
```
As with the CSV format, the minimum required data is a `geometry` `coordinates` array. All `properties` are optional. When using GeoJSON, the `title` property maps to the name of the site. Additional properties, if present, are ignored by *nshmp-haz* but permitted as they may be relevant for other applications. For example, [styling properties](https://help.github.com/articles/mapping-geojson-files-on-github/#styling-features) may be used to improve rendering on GitHub and elsewhere.
For a fully fledged example, see the [NSHM test sites](/usgs/nshmp-haz/blob/master/etc/nshm/sites-nshmp.geojson) file.
#### Map Regions
GeoJSON is also used to define *nshmp-haz* map regions. For example, see the file that defines a region commonly used when creating hazard and other maps for the [Los Angeles basin](/usgs/nshmp-haz/blob/master/etc/nshm/map-la-basin.geojson).
A map region is expected as a `Polygon` `FeatureCollection`. Currently, *nshmp-haz* only supports a `FeatureCollection` with 1 or 2 polygons. When a single polygon is defined, it must consist of a single, simple closed loop. Additional arrays that define holes in the polygon (per the GeoJSON spec) are not processed and results are undefined for self-intersecting coordinate arrays. This polygon feature supports the same (optional and/or extra) `properties` as the `Point` features described above. The site properties are applied to all sites in the defined region. In addition, this feature *must* define a `spacing` property with a value in decimal degrees that governs the site density within the region. This polygon defines the area over which hazard calculations are to be performed.
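A minimal single-polygon calculation region might look like the following sketch (the coordinates, `spacing`, and optional `vs30` property are illustrative):
```JSON
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "spacing": 0.1,
        "vs30": 760.0
      },
      "geometry": {
        "type": "Polygon",
        "coordinates": [[
          [-119.0, 34.0],
          [-117.0, 34.0],
          [-117.0, 35.5],
          [-119.0, 35.5],
          [-119.0, 34.0]
        ]]
      }
    }
  ]
}
```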
A second polygon may be used to define a map region. This polygon *must* be defined first, *must* have a feature `id` of `Extents`, and *must* be rectangular (in a Mercator projection) with edges parallel to lines of latitude and longitude. This polygon is used primarily when outputting hazard curves in the USGS binary format. Any points in the 'calculation' polygon outside the 'extents' polygon are ignored; hazard values at any points within the 'extents' polygon but outside the 'calculation' polygon are set to zero. For an example, see the [NSHMP Western US](/usgs/nshmp-haz/blob/master/etc/nshm/map-wus.geojson) map site file.
### Next: [Source Models](source-models)
Sources are representations of earthquakes that occur with some rate. They can be well-defined faults with a specific geometry, or point sources derived from historical earthquake catalogs for which planar pseudo-fault representations may be used. In either case, there may be multiple earthquake sizes associated with a given source.
### Outline
* [Source File Structure](#source-file-structure)
- [Sample File Structure](#sample-file-structure)
* [Common Source Elements](#common-source-elements)
- [Magnitude-Frequency Distributions (MFDs)](#magnitude-frequency-distributions-mfds)
- [Rupture-Scaling Relations](#rupture-scaling-relations)
### Source File Structure
An earthquake source model, or [HazardModel](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/HazardModel.html), is specified using XML files grouped in folders by [SourceType](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/SourceType.html). The currently supported source types are:
```
[ Area, Cluster, Fault, Grid, Interface, Slab, System ]
```
One level of nested folders is permitted to facilitate logical 'source groups' and the use of different ground motion models ([GMMs](ground-motion-models)) or [model initialization](configuration) settings. Source files, except those for [Fault System Sources](source-types#system-sources), and nested 'source group' folders may be given any name. Ground motions are computed using the models defined in the `gmm.xml` file in each source directory. If a 'source group' does not contain `gmm.xml`, one is required in the parent 'source type' folder. See the file structure example below.
#### Sample File Structure:
```
ModelDirectory/
├─ config.json
├─ Fault/ <─── source type
│ ├─ faultSources1.xml
│ ├─ faultSources2.xml
│ ├─ ...
│ ├─ gmm.xml
│ │
│ └─ FaultGroup/ <─── source group
│ ├─ otherFaultSources.xml
│ └─ gmm.xml
├─ Grid/
│ ├─ gridSources.xml
│ ├─ ...
│ ├─ gmm.xml <─── uses parent GMMs ─┐
│ │ │
│ ├─ GridGroup1/ │
│ │ └─ otherGridSources.xml ──────────┘
│ │
│ └─ GridGroup2/
│ ├─ otherGridSources.xml
│ └─ gmm.xml
└─ ...
```
### Common Source Elements
#### Magnitude-Frequency Distributions (MFDs)
A source requires a description of the sizes and rates of all earthquakes it is capable of generating, otherwise known as a magnitude-frequency distribution. For conciseness, all source files may supply default MFDs so that individual sources need only specify those attributes that are different from the default. Standard [MFD types](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/mfd/MfdType.html):
* Single
* [Gutenberg-Richter](http://en.wikipedia.org/wiki/Gutenberg–Richter_law)
* [Tapered Gutenberg-Richter](http://scec.ess.ucla.edu/~ykagan/moms_index.html)
* Incremental
XML examples:
```XML
<!-- Single magnitude MFD
- Used to represent the rupture of a specific size event
on a fault; historically this may have been referred to
as a 'characteristic' rupture.
- Can float over or fill source. -->
<IncrementalMfd type="SINGLE" weight="0.25" id="NAME"
rate="1.0" floats="[true | false]" m="6.5" />
<!-- Gutenberg-Richter MFD
- Used to represent a range of evenly discretized magnitude
events with systematically varying rates.
- Always floats when used for a fault source; never floats
for grid sources.
- 'a' is the incremental log10(number of M=0 events). -->
<IncrementalMfd type="GR" weight="0.25" id="NAME"
a="1.0" b="1.0" dMag="0.1" mMin="5.0" mMax="7.0" />
<!-- Tapered Gutenberg-Richter MFD
- Same as Gutenberg-Richter, above, but with an exponential
taper applied to the cumulative number of events with
seismic moment greater than M.
- Only used for grid sources. -->
<IncrementalMfd type="GR_TAPER" weight="0.25" id="NAME"
a="1.0" b="1.0" dMag="0.1" mCut="6.5" mMin="5.0" mMax="7.0" />
<!-- Incremental MFD
- General purpose MFD that consists of varying magnitudes
and rates.
- Can float over or fill source. -->
<IncrementalMfd type="INCR" weight="0.25" id="NAME"
mags="[5.05, 5.15, ...]" rates="[1.0e-2, 0.9e-2, ...]"
floats="[true | false]" />
```
#### Rupture Scaling Relations
Fault-based sources commonly require a model that describes the geometry of partial (or floating) ruptures. Likewise, grid sources require a model of fault-length or -area when building pseudo-faults or performing point source distance corrections that consider magnitude-dependent rupture sizes with unknown strike. Such models are composed of published magnitude-length or -area relations and restrictions on aspect-ratio and/or maximum rupture width that are independent of the published model.
See the [API Docs](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/fault/surface/RuptureScaling.html) for details on specific model combinations and implementations. A rupture scaling model is always specified as an attribute of `<SourceProperties />`; see examples in the next section.
### Next: [Source Types](source-types)
There are seven basic [source types](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/SourceType.html):
```
[ Area, Cluster, Fault, Grid, Interface, Slab, System ]
```
Source files for each are written in [plain old XML](http://en.wikipedia.org/wiki/Plain_Old_XML). All attributes present in the examples below are required with the exception of MFDs for which default distributions are being used; see examples below.
__Note on value types:__ Most attribute values or element content are parsed as a `String`, `Double`, `Double[]` (array) or `Boolean`; those attribute values in ALL_CAPS are parsed as `enum` types and are case sensitive, [for example](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/mfd/MfdType.html).
__Note on using default MFDs:__ Default MFDs, if supported and present, must be fully specified. Any MFD encountered in a source will map all missing attributes from the default(s) of the same `type`. If no default for an MFD `type` is present, the source MFD must be fully specified.
__Note on Coordinates:__ *nshmp-haz* supports longitude and latitude values in the closed ranges `[-360° ‥ 360°]` and `[-90° ‥ 90°]`. Note, however, that mixing site and/or source coordinates across the antimeridian (the -180° to 180° transition) will yield unexpected results. For Pacific models and calculations, always use positive or negative longitudes exclusively.
### Outline
* [Area Sources](#area-sources)
* [Cluster Sources](#cluster-sources)
* [Fault Sources](#fault-sources)
* [Grid Sources](#grid-sources)
* [Interface Sources](#interface-sources)
* [Slab Sources](#slab-sources)
* [System Sources](#system-sources)
### Area Sources
Area sources are similar to [Grid Sources](#grid-sources) except that a single MFD applies to an entire area with rates proportionally scaled for use at all grid nodes. The discretization of area sources into grids may be allowed to [vary with distance](http://usgs.github.io/nshmp-haz/javadoc/index.html?gov/usgs/earthquake/nshmp/eq/model/AreaSource.GridScaling.html) from a site as specified by a model initialization [configuration](configuration).
```XML
<?xml version="1.0" encoding="UTF-8"?>
<AreaSourceSet name="Source Set Name" weight="1.0">
<!-- Sources ... -->
<Source name="Area Source Name" id="0">
<!-- Specify MFDs ... -->
<IncrementalMfd type="GR" weight="1.0" id="name"
a="0.0" b="0.8" dMag="0.1" mMax="7.0" mMin="5.0" />
<Geometry>
<!-- Border polygon. Individual locations specified by
whitespace separated tuples of longitude,latitude,depth
(NO SPACES); same as the KML <coordinates/> format. -->
<Border>
-117.0,34.0,0.0
-117.1,34.1,0.0
-117.3,34.2,0.0
...
</Border>
</Geometry>
</Source>
<!-- Add more sources ... -->
<Source />
...
</AreaSourceSet>
```
### Cluster Sources
Cluster sources are composed of two or more fault sources that rupture independently but very closely spaced in time. Ground motions from cluster sources are modeled as the joint probability of exceeding ground motions from each independent event.
```XML
<?xml version="1.0" encoding="UTF-8"?>
<ClusterSourceSet name="Source Set Name" weight="1.0">
<!-- (optional) Settings block for any data that applies to all
sources. -->
<Settings>
<!-- (optional) The reference MFD to use. Note: Cluster sources
only support SINGLE MFDs at this time. -->
<DefaultMfds>
<IncrementalMfd type="SINGLE" weight="1.0" id="name"
rate="0.002" floats="false" m="0.0" />
</DefaultMfds>
<!-- Although not used, a rupture scaling model is required
to initialize the fault sources nested in each cluster. -->
<SourceProperties ruptureScaling="NSHM_FAULT_WC94_LENGTH" />
</Settings>
<!-- Sources must follow Settings ... -->
<Cluster name="Cluster Source Name" id="0" weight="0.2">
<Source name="Fault Source Name" id="0">
<!-- Specify MFDs; only SINGLE is supported -->
<IncrementalMfd type="SINGLE" weight="0.2" id="name" m="6.6" />
<IncrementalMfd type="SINGLE" weight="0.5" id="name" m="6.9" />
<IncrementalMfd type="SINGLE" weight="0.3" id="name" m="7.3" />
<!-- Then geometry ... -->
<Geometry dip="45.0" rake="0.0" width="14.0">
<!-- Trace must follow right-hand rule. -->
<!-- Individual locations specified by whitespace
separated tuples of longitude,latitude,depth
(NO SPACES); same as KML <coordinates/> format. -->
<Trace>
-117.0,34.0,0.0
-117.1,34.1,0.0
-117.3,34.2,0.0
...
</Trace>
</Geometry>
</Source>
</Cluster>
<!-- Add more sources ... -->
<Cluster />
...
</ClusterSourceSet>
```
### Fault Sources
```XML
<?xml version="1.0" encoding="UTF-8"?>
<FaultSourceSet name="Source Set Name" weight="1.0">
<!-- (optional) Settings block for any data that applies to all
sources. -->
<Settings>
<!-- (optional) The reference MFDs to use. -->
<DefaultMfds>
<IncrementalMfd type="SINGLE" weight="0.6667" id="name"
rate="0.0" floats="false" m="0.0" />
<IncrementalMfd type="GR" weight="0.3333" id="name"
a="0.0" b="0.8" dMag="0.1" mMin="5.0" mMax="7.0" />
...
</DefaultMfds>
<!-- (optional) A magnitude uncertainty model that will be
applied to every source:
- <Epistemic/> varies mMax and scales variant rates by
the supplied weights; it is only ever applied to SINGLE
and GR MFDs.
- 'cutoff' is magnitude below which uncertainty will be
disabled.
- <Aleatory/> applies a (possibly moment-balanced) ±2σ
Gaussian distribution to mMax; it is only ever applied
to SINGLE MFDs (possibly in conjunction with epistemic).
- 'count' is the number of magnitude bins spanned by
the distribution.
- <Aleatory/> or '<Epistemic/>', or the entire block
may be omitted. -->
<MagUncertainty>
<Epistemic cutoff="6.5"
deltas="[-0.2, 0.0, 0.2]" weights="[0.2, 0.6, 0.2]" />
<Aleatory cutoff="6.5"
moBalance="true" sigma="0.12" count="11" />
</MagUncertainty>
<SourceProperties ruptureScaling="NSHM_FAULT_WC94_LENGTH" />
</Settings>
<!-- Sources must follow Settings ... -->
<Source name="Fault Source Name" id="0">
<!-- Specify MFDs ...
- at a minimum, 'type' must be defined, assuming
reference MFDs are present. -->
<IncrementalMfd type="SINGLE" rate="1.0" m="7.4" />
<IncrementalMfd type="GR" a="1.0e-2" dMag="0.1" mMax="7.4" />
<!-- Then geometry ... -->
<Geometry depth="1.0" dip="45.0" rake="0.0" width="14.0">
<!-- Trace must follow right-hand rule. -->
<!-- Individual locations specified by whitespace separated
tuples of longitude,latitude,depth (NO SPACES); same
as KML <coordinates/> format. -->
<Trace>
-117.0,34.0,0.0
-117.1,34.1,0.0
-117.3,34.2,0.0
...
</Trace>
</Geometry>
</Source>
<!-- Add more sources ... -->
<Source />
...
</FaultSourceSet>
```
### Grid Sources
```XML
<?xml version="1.0" encoding="UTF-8"?>
<GridSourceSet name="Source Set Name" weight="1.0" id="0">
<!-- (optional) Settings block for any data that applies to all
sources. -->
<Settings>
<!-- (optional) The reference MFDs to use; although optional,
using reference MFDs greatly reduces grid source file
sizes. -->
<DefaultMfds>
<IncrementalMfd type="GR" weight="1.0" id="name"
a="0.0" b="0.8" dMag="0.1" mMax="7.0" mMin="5.0" />
<IncrementalMfd type="INCR" weight="1.0" id="name"
mags="[5.05, 5.25, 5.45, 5.65, 5.85, 6.05, 6.25, 6.45]"
rates="[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]" />
...
</DefaultMfds>
<!-- Grid sources require additional information about the
distribution of focal mechanisms and depths to use:
- 'magDepthMap' is a ';' separated list of cutoff magnitudes
mapped to depths and associated weights. In the example
below events of M<6.5 are located at a depth of 5 km,
with a full weight of 1. The [depth:weight] mapping may
contain multiple, ',' separated values, e.g.
[6.5::[5.0:0.8,1.0:0.2], ...].
- 'maxDepth' constrains the maximum depth of any finite
point source representations.
- 'focalMechMap' is a ',' separated list of focal
mechanism identifiers and associated weights.
- In both maps above, weights must sum to 1.
- Use 'NaN' for unknown strike. Note that if a strike
value is defined, sources will be implementated as
FIXED_STRIKE and any configuration settings will be
ignored. -->
<SourceProperties
magDepthMap="[6.5::[5.0:1.0]; 10.0::[1.0:1.0]]"
maxDepth="14.0"
focalMechMap="[STRIKE_SLIP:0.5,NORMAL:0.0,REVERSE:0.5]"
ruptureScaling="NSHM_POINT_WC94_LENGTH"
strike="120.0" />
</Settings>
<!-- Nodes are specialized <IncrementalMfd/> elements that specify
the location of individual grid sources and have the necessary
attributes to define the MFD for the source. -->
<Nodes>
<Node type="GR" a="0.0823" mMax="7.2">-119.0,34.0,0.0</Node>
<Node type="GR" a="0.0823" mMax="7.1">-119.1,34.0,0.0</Node>
<Node type="GR" a="0.0823" mMax="6.8">-119.2,34.0,0.0</Node>
<Node type="GR" a="0.0823" mMax="7.1">-119.3,34.0,0.0</Node>
<Node type="SINGLE" rates="[1.0e-2, ...]">-119.4,34.0,0.0</Node>
<Node type="SINGLE" rates="[1.0e-2, ...]">-119.5,34.0,0.0</Node>
<Node type="GR" a="0.0823" mMax="6.9">-119.3,34.0,0.0</Node>
...
</Nodes>
</GridSourceSet>
```
### Interface Sources
```XML
<?xml version="1.0" encoding="UTF-8"?>
<SubductionSourceSet name="Source Set Name" weight="1.0">
<!-- See Fault Sources for 'Settings' examples. -->
<Settings />
<!-- Sources must follow Settings ... -->
<Source name="Subduction Source Name" id="0">
<!-- Specify MFDs ... -->
<IncrementalMfd type="SINGLE" weight="1.0" id="name"
rate="1.0" m="8.2" />
<!-- Then geometry ... -->
<Geometry rake="90.0">
<!-- As with Fault Sources, trace must follow right-hand
rule. -->
<!-- Individual locations specified by whitespace separated
tuples of longitude,latitude,depth (NO SPACES); same as
KML <coordinates/> format. -->
<Trace>
-124.7,41.0,0.0
-124.6,44.0,0.0
-124.5,47.0,0.0
...
</Trace>
<!-- (optional) Subduction sources may specify a lower trace,
also following the right-hand-rule. If a lower trace
is NOT defined, the parent geometry element must include
'depth', 'dip', and 'width' attributes in the same manner
as a fault source. -->
<LowerTrace>
-124.5,41.0,0.0
-124.4,44.0,0.0
-124.3,47.0,0.0
...
</LowerTrace>
</Geometry>
</Source>
<!-- Add more sources ... -->
<Source />
...
</SubductionSourceSet>
```
### Slab Sources
Subduction intraslab sources are currently specified the same way as [Grid Sources](#grid-sources).
### System Sources
Fault system source sets require three files:
* `fault_sections.xml`
* `fault_ruptures.xml`
* `grid_sources.xml`
that are placed together within a _source group_ folder. Fault system source sets represent a single logic-tree branch or an average over a group of branches and have a gridded (or smoothed seismicity) source component that is coupled with the fault-based rates in the model.
`fault_sections.xml` defines the geometry of a fault network as a set of indexed fault sections:
```XML
<?xml version="1.0" encoding="UTF-8"?>
<SystemFaultSections name="Source Set Name">
<!-- Specify section 'index' and 'name' -->
<Section index="0" name="Section Name">
<!-- Specify section geometry -->
<Geometry aseis="0.1" dip="50.0" dipDir="89.459"
lowerDepth="13.0" upperDepth="0.0">
<!-- Unlike Fault Sources, trace does not need to follow
right-hand rule as 'dipDir' is supplied above. -->
<!-- Individual locations specified by whitespace separated
tuples of longitude,latitude,depth (NO SPACES); same as
KML <coordinates/> format. -->
<Trace>
-117.75,35.74,0.00
-117.76,35.81,0.00
</Trace>
</Geometry>
</Section>
<!-- Add more sections ... -->
<Section />
</SystemFaultSections>
```
`fault_ruptures.xml` defines the geometry of fault sources, referencing fault sections by index:
```XML
<?xml version="1.0" encoding="UTF-8"?>
<SystemSourceSet name="Source Set Name" weight="1.0">
<!-- <Settings/> block may be included; see Fault Sources and
Grid Sources for examples. -->
<!-- Sources must follow Settings ...
- indexed fault sources do not require a name.-->
<Source>
<!-- Specify MFDs ... -->
<IncrementalMfd rate="1.0e-05" floats="false" m="6.58" type="SINGLE"
weight="1.0" />
<!-- Then geometry ...
- 'indices' is an array of index ranges, ordered from
one end of the source to the other -->
<Geometry indices="[[0:5],[13:22],[104:106]]" rake="0.0" />
</Source>
<!-- Add more sources ... -->
<Source />
...
</SystemSourceSet>
```
`grid_sources.xml` is structured and processed in the same way as a [grid source](#grid-sources).
### Next: [Ground Motion Models](ground-motion-models)
The USGS has historically updated the earthquake hazard model for the conterminous U.S. every 6 years. In addition, the USGS intermittently creates or updates models for other regions such as Alaska, Hawaii, and other U.S. territories, and periodically produces models for other countries. These models are intended for use with the USGS probabilistic earthquake hazard codebase: [*nshmp-haz*](https://github.com/usgs/nshmp-haz).
Each model region and year has a dedicated repository:
* [Conterminous US (2014)](/usgs/nshm-cous-2014)
* [Conterminous US (2008)](/usgs/nshm-cous-2008)
* [Alaska (2007)](/usgs/nshm-ak-2007)
See also:
* [USGS Earthquake Hazards website](http://earthquake.usgs.gov/hazards/)
### Go to the [examples](/usgs/nshmp-haz/tree/master/etc/examples) ...or return [home](home).
*TODO need about-deaggregation; check Ex 7 link*
### What are ε and ε₀?
The ground motion, _Y_, that might be recorded at a specific site from an earthquake 'source' with a given magnitude and distance is typically modeled as a log-normal variate such that the logarithm of _Y_, which we denote _y_, has a normal distribution, with mean, _μ_, and standard deviation, _σ_. Epsilon is defined as the standardized _y_ at a site from a specific source: _ε_ = (_y_ – _μ_) ∕ _σ_. Because we compute the probability of exceeding some ground motion at a site, _p₀_, in probabilistic seismic hazard analysis, our deaggregation tools instead report _ε₀_ = (_y₀_ – _μ_) ∕ _σ_, where _y₀_ is the ground motion level with exceedance probability _p₀_.
[<img valign="middle" src="images/usgs-icon.png">](https://www.usgs.gov) &nbsp;[U.S. Geological Survey](https://www.usgs.gov) – National Seismic Hazard Mapping Project ([NSHMP](https://earthquake.usgs.gov/hazards/))
[Home](home)
[Building & Running](building-&-running)
&nbsp;&nbsp;&nbsp;[Configuration](configuration)
&nbsp;&nbsp;&nbsp;[Site Specification](sites)
[Source Models](source-models)
&nbsp;&nbsp;&nbsp;[Source Types](source-types)
&nbsp;&nbsp;&nbsp;[Ground Motion Models](ground-motion-models)
&nbsp;&nbsp;&nbsp;[USGS Models](usgs-models)
[Examples](https://github.com/usgs/nshmp-haz/tree/master/etc/examples)
[Dependencies & Development](dependencies-&-development)
[Functional PSHA](functional-psha)
[API Docs](http://usgs.github.io/nshmp-haz/javadoc)
[License](https://github.com/usgs/nshmp-haz/blob/master/LICENSE.md)