Commit 72f134d4 authored by Laura A DeCicco

Updates to help files.

parent f188d3a5
Merge request !53: Help file updates
@@ -16,7 +16,7 @@
 #' @param format string, can be "tsv" or "xml", and is only applicable for daily and unit value requests. "tsv" returns results faster, but there is a possibility that an incomplete file is returned without warning. XML is slower,
 #' but will offer a warning if the file was incomplete (for example, if there was a momentary problem with the internet connection). It is possible to safely use the "tsv" option,
 #' but the user must carefully check the results to see if the data returned matches what is expected. The default is therefore "xml".
-#' @param expanded logical defaults to FALSE. If TRUE, retrieves additional information, only applicable for qw data.
+#' @param expanded logical defaults to \code{TRUE}. If \code{TRUE}, retrieves additional information, only applicable for qw data.
 #' @param ratingType can be "base", "corr", or "exsa". Only applies to rating curve data.
 #' @keywords data import USGS web service
 #' @return url string
@@ -43,7 +43,7 @@
 #' url_meas <- constructNWISURL(siteNumber, service="meas")
 #' }
 constructNWISURL <- function(siteNumber,parameterCd="00060",startDate="",endDate="",
-                             service,statCd="00003", format="xml",expanded=FALSE,
+                             service,statCd="00003", format="xml",expanded=TRUE,
                              ratingType="base"){
   service <- match.arg(service, c("dv","uv","iv","qw","gwlevels","rating","peak","meas"))
...
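A minimal usage sketch of the new default shown above, assuming the package that provides constructNWISURL() is loaded; the site number and parameter code are illustrative, not from this commit:

  siteNumber <- "04024430"
  # expanded = TRUE is now the default for water-quality URLs
  urlExpanded <- constructNWISURL(siteNumber, "34247", "2010-01-01", "", service = "qw")
  # pass expanded = FALSE explicitly to reproduce the old default behavior
  urlBasic <- constructNWISURL(siteNumber, "34247", "2010-01-01", "", service = "qw",
                               expanded = FALSE)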
@@ -6,20 +6,20 @@
 #' recommended to use the RDB format for importing multi-site data.
 #'
 #' @param obs_url character containing the url for the retrieval
-#' @param asDateTime logical, if TRUE returns date and time as POSIXct, if FALSE, Date
-#' @param qw logical, if TRUE parses as water quality data (where dates/times are in start and end times)
+#' @param asDateTime logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date
+#' @param qw logical, if \code{TRUE} parses as water quality data (where dates/times are in start and end times)
 #' @param tz character to set timezone attribute of datetime. Default is an empty quote, which converts the
 #' datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
 #' Possible values to provide are "America/New_York","America/Chicago", "America/Denver","America/Los_Angeles",
 #' "America/Anchorage","America/Honolulu","America/Jamaica","America/Managua","America/Phoenix", and "America/Metlakatla"
-#' @param convertType logical, defaults to TRUE. If TRUE, the function will convert the data to dates, datetimes,
+#' @param convertType logical, defaults to \code{TRUE}. If \code{TRUE}, the function will convert the data to dates, datetimes,
 #' numerics based on a standard algorithm. If false, everything is returned as a character
 #' @return A data frame with the following columns:
 #' \tabular{lll}{
 #' Name \tab Type \tab Description \cr
 #' agency_cd \tab character \tab The NWIS code for the agency reporting the data\cr
 #' site_no \tab character \tab The USGS site number \cr
-#' datetime \tab POSIXct \tab The date and time of the value converted to UTC (if asDateTime = TRUE), \cr
+#' datetime \tab POSIXct \tab The date and time of the value converted to UTC (if asDateTime = \code{TRUE}), \cr
 #' \tab character \tab or raw character string (if asDateTime = FALSE) \cr
 #' tz_cd \tab character \tab The time zone code for datetime \cr
 #' code \tab character \tab Any codes that qualify the corresponding value\cr
...
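A minimal sketch of the importRDB1() workflow documented above, assuming the package is loaded and an internet connection is available; the site number is illustrative:

  # "tsv" produces an RDB file, which is the format importRDB1() reads
  obs_url <- constructNWISURL("05114000", "00060", "2014-10-01", "2014-10-07",
                              service = "dv", format = "tsv")
  dailyQ <- importRDB1(obs_url, asDateTime = FALSE)  # dates returned as class Date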
@@ -4,7 +4,7 @@
 #' NWIS site, parameter code, statistic, startdate and enddate.
 #'
 #' @param obs_url character containing the url for the retrieval
-#' @param asDateTime logical, if TRUE returns date and time as POSIXct, if FALSE, Date
+#' @param asDateTime logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date
 #' @param tz character to set timezone attribute of datetime. Default is an empty quote, which converts the
 #' datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
 #' Possible values to provide are "America/New_York","America/Chicago", "America/Denver","America/Los_Angeles",
...
@@ -4,7 +4,7 @@
 #' but the general functionality is correct.
 #'
 #' @param obs_url character containing the url for the retrieval
-#' @param asDateTime logical, if TRUE returns date and time as POSIXct, if FALSE, Date
+#' @param asDateTime logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date
 #' @param tz character to set timezone attribute of datetime. Default is an empty quote, which converts the
 #' datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
 #' Possible values to provide are "America/New_York","America/Chicago", "America/Denver","America/Los_Angeles",
@@ -42,7 +42,9 @@
 #' }
 importWaterML2 <- function(obs_url, asDateTime=FALSE, tz=""){
-  if(url.exists(obs_url)){
+  if(file.exists(obs_url)){
+    doc <- xmlTreeParse(obs_url, getDTD = FALSE, useInternalNodes = TRUE)
+  } else {
     doc = tryCatch({
       h <- basicHeaderGatherer()
       returnedDoc <- getURL(obs_url, headerfunction = h$update)
@@ -62,9 +64,7 @@ importWaterML2 <- function(obs_url, asDateTime=FALSE, tz=""){
       message(paste("URL does not seem to exist:", obs_url))
       message(e)
       return(NA)
     })
-  } else {
-    doc <- xmlTreeParse(obs_url, getDTD = FALSE, useInternalNodes = TRUE)
   }
   if(tz != ""){
...
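With the reordered branches above, importWaterML2() parses a local WaterML2 document directly when file.exists() is TRUE and only falls back to getURL() for remote URLs. A minimal sketch, assuming a WaterML2 response has already been saved to disk (the file name is hypothetical):

  localFile <- "SiteDV.xml"  # hypothetical saved WaterML2 response
  if (file.exists(localFile)) {
    dv <- importWaterML2(localFile, asDateTime = TRUE, tz = "America/Chicago")
  }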
@@ -3,7 +3,7 @@
 #' Imports data from NWIS about a measured parameter based on user-supplied parameter code.
 #' This function gets the data from here: \url{http://nwis.waterdata.usgs.gov/nwis/pmcodes}
 #'
-#' @param parameterCd character of USGS parameter codes. This is usually an 5 digit number.
+#' @param parameterCd character of USGS parameter codes (or multiple parameter codes). This is usually an 5 digit number.
 #' @keywords data import USGS web service
 #' @return parameterData data frame with all information from the USGS about the particular parameter.
 #'
@@ -19,7 +19,6 @@
 #'
 #' @export
 #' @examples
-#' # These examples require an internet connection to run
 #' paramINFO <- readNWISpCode(c('01075','00060','00931'))
 readNWISpCode <- function(parameterCd){
...
@@ -5,16 +5,18 @@
 #' A list of statistic codes can be found here: \url{http://nwis.waterdata.usgs.gov/nwis/help/?read_file=stat&format=table}
 #'
 #' @param siteNumbers character of USGS site numbers. This is usually an 8 digit number
-#' @param pCodes character of USGS parameter code(s). This is usually an 5 digit number.
+#' @param parameterCd character of USGS parameter code(s). This is usually an 5 digit number.
 #' @param startDate character starting date for data retrieval in the form YYYY-MM-DD. Default is "" which indicates
 #' retrieval for the earliest possible record.
 #' @param endDate character ending date for data retrieval in the form YYYY-MM-DD. Default is "" which indicates
 #' retrieval for the latest possible record.
-#' @param expanded logical defaults to TRUE. If TRUE, retrieves additional information. Expanded data includes
+#' @param expanded logical defaults to \code{TRUE}. If \code{TRUE}, retrieves additional information. Expanded data includes
 #' remark_cd (remark code), result_va (result value), val_qual_tx (result value qualifier code), meth_cd (method code),
-#' dqi_cd (data-quality indicator code), rpt_lev_va (reporting level), and rpt_lev_cd (reporting level type).
-#' @param reshape logical. Will reshape the data to a wide format if TRUE (default is FALSE). This is only
-#' available for 'expanded' data.
+#' dqi_cd (data-quality indicator code), rpt_lev_va (reporting level), and rpt_lev_cd (reporting level type). If \code{FALSE},
+#' only returns remark_cd (remark code) and result_va (result value). Expanded = \code{FALSE} will not give
+#' sufficient information for unbiased statistical analysis.
+#' @param reshape logical, reshape the data. If \code{TRUE}, then return a wide data frame with all water-quality in a single row for each sample.
+#' If \code{FALSE} (default), then return a long data frame with each water-quality result in a single row.
 #' @param tz character to set timezone attribute of datetime. Default is an empty quote, which converts the
 #' datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
 #' Possible values to provide are "America/New_York","America/Chicago", "America/Denver","America/Los_Angeles",
@@ -38,6 +40,8 @@
 #' url \tab character \tab The url used to generate the data \cr
 #' queryTime \tab POSIXct \tab The time the data was returned \cr
 #' comment \tab character \tab Header comments from the RDB file \cr
+#' siteInfo \tab data.frame \tab A data frame containing information on the requested sites \cr
+#' variableInfo \tab data.frame \tab A data frame containing information on the requested parameters \cr
 #' }
 #' @export
 #' @import reshape2
@@ -47,16 +51,16 @@
 #' siteNumbers <- c('04024430','04024000')
 #' startDate <- '2010-01-01'
 #' endDate <- ''
-#' pCodes <- c('34247','30234','32104','34220')
+#' parameterCd <- c('34247','30234','32104','34220')
 #'
-#' rawNWISqwData <- readNWISqw(siteNumbers,pCodes,startDate,endDate)
-#' rawNWISqwDataReshaped <- readNWISqw(siteNumbers,pCodes,
+#' rawNWISqwData <- readNWISqw(siteNumbers,parameterCd,startDate,endDate)
+#' rawNWISqwDataReshaped <- readNWISqw(siteNumbers,parameterCd,
 #'           startDate,endDate,reshape=TRUE)
 #'
-readNWISqw <- function (siteNumbers,pCodes,startDate="",endDate="",
+readNWISqw <- function (siteNumbers,parameterCd,startDate="",endDate="",
                         expanded=TRUE,reshape=FALSE,tz=""){
-  url <- constructNWISURL(siteNumbers,pCodes,startDate,endDate,"qw",expanded=expanded)
+  url <- constructNWISURL(siteNumbers,parameterCd,startDate,endDate,"qw",expanded=expanded)
   data <- importRDB1(url,asDateTime=TRUE, qw=TRUE, tz = tz)
   originalHeader <- comment(data)
@@ -70,7 +74,7 @@ readNWISqw <- function (siteNumbers,pCodes,startDate="",endDate="",
     wideDF <- dcast(longDF, ... ~ variable + parm_cd )
     wideDF[,grep("_va_",names(wideDF))] <- sapply(wideDF[,grep("_va_",names(wideDF))], function(x) as.numeric(x))
-    groupByPCode <- as.vector(sapply(pCodes, function(x) grep(x, names(wideDF)) ))
+    groupByPCode <- as.vector(sapply(parameterCd, function(x) grep(x, names(wideDF)) ))
     data <- wideDF[,c(1:length(columnsToMelt)-1,groupByPCode)]
     comment(data) <- originalHeader
@@ -81,7 +85,7 @@ readNWISqw <- function (siteNumbers,pCodes,startDate="",endDate="",
   }
   siteInfo <- readNWISsite(siteNumbers)
-  varInfo <- readNWISpCode(pCodes)
+  varInfo <- readNWISpCode(parameterCd)
   attr(data, "siteInfo") <- siteInfo
   attr(data, "variableInfo") <- varInfo
...
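A minimal sketch of the renamed parameterCd argument and the new siteInfo/variableInfo attributes, reusing the sites and parameter codes from the examples above (network access assumed):

  rawNWISqwData <- readNWISqw(c('04024430','04024000'),
                              c('34247','30234','32104','34220'),
                              startDate = '2010-01-01')
  attr(rawNWISqwData, "siteInfo")      # site metadata from readNWISsite()
  attr(rawNWISqwData, "variableInfo")  # parameter metadata from readNWISpCode()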
@@ -2,7 +2,7 @@
 #'
 #' Imports data from the USGS site file. This function gets data from here: \url{http://waterservices.usgs.gov/}
 #'
-#' @param siteNumbers character USGS site number. This is usually an 8 digit number
+#' @param siteNumbers character USGS site number (or multiple sites). This is usually an 8 digit number
 #' @keywords data import USGS web service
 #' @return A data frame with at least the following columns:
 #' \tabular{lll}{
...
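Since siteNumbers is now documented as accepting multiple sites, a minimal sketch (site numbers reused from the readNWISqw examples above):

  siteInfo <- readNWISsite(c('04024430','04024000'))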
@@ -107,14 +107,14 @@ readNWISuv <- function (siteNumbers,parameterCd,startDate="",endDate="", tz=""){
 readNWISpeak <- function (siteNumbers,startDate="",endDate=""){
   # Doesn't seem to be a peak xml service
-  url <- constructNWISURL(siteNumber,NA,startDate,endDate,"peak")
+  url <- constructNWISURL(siteNumbers,NA,startDate,endDate,"peak")
   data <- importRDB1(url, asDateTime=FALSE)
   data$peak_dt <- as.Date(data$peak_dt)
   data$gage_ht <- as.numeric(data$gage_ht)
-  siteInfo <- readNWISsite(siteNumber)
+  siteInfo <- readNWISsite(siteNumbers)
   attr(data, "siteInfo") <- siteInfo
   attr(data, "variableInfo") <- NULL
...
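The fix above makes readNWISpeak() use its own siteNumbers argument (it previously referenced an undefined siteNumber), so a vector of sites flows through to constructNWISURL() and readNWISsite(). A minimal sketch, reusing site numbers from earlier examples with no guarantee that peak data exist for them:

  peakData <- readNWISpeak(c('04024430','04024000'), startDate = '2010-01-01')
  attr(peakData, "siteInfo")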
No preview for this file type
@@ -5,7 +5,7 @@
 \usage{
 constructNWISURL(siteNumber, parameterCd = "00060", startDate = "",
   endDate = "", service, statCd = "00003", format = "xml",
-  expanded = FALSE, ratingType = "base")
+  expanded = TRUE, ratingType = "base")
 }
 \arguments{
 \item{siteNumber}{string or vector of strings USGS site number. This is usually an 8 digit number}
@@ -27,7 +27,7 @@ retrieval for the latest possible record.}
 but will offer a warning if the file was incomplete (for example, if there was a momentary problem with the internet connection). It is possible to safely use the "tsv" option,
 but the user must carefully check the results to see if the data returned matches what is expected. The default is therefore "xml".}
-\item{expanded}{logical defaults to FALSE. If TRUE, retrieves additional information, only applicable for qw data.}
+\item{expanded}{logical defaults to \code{TRUE}. If \code{TRUE}, retrieves additional information, only applicable for qw data.}
 \item{ratingType}{can be "base", "corr", or "exsa". Only applies to rating curve data.}
 }
...
@@ -9,11 +9,11 @@ importRDB1(obs_url, asDateTime = FALSE, qw = FALSE, convertType = TRUE,
 \arguments{
 \item{obs_url}{character containing the url for the retrieval}
-\item{asDateTime}{logical, if TRUE returns date and time as POSIXct, if FALSE, Date}
-\item{qw}{logical, if TRUE parses as water quality data (where dates/times are in start and end times)}
-\item{convertType}{logical, defaults to TRUE. If TRUE, the function will convert the data to dates, datetimes,
+\item{asDateTime}{logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date}
+\item{qw}{logical, if \code{TRUE} parses as water quality data (where dates/times are in start and end times)}
+\item{convertType}{logical, defaults to \code{TRUE}. If \code{TRUE}, the function will convert the data to dates, datetimes,
 numerics based on a standard algorithm. If false, everything is returned as a character}
 \item{tz}{character to set timezone attribute of datetime. Default is an empty quote, which converts the
@@ -27,7 +27,7 @@ A data frame with the following columns:
 Name \tab Type \tab Description \cr
 agency_cd \tab character \tab The NWIS code for the agency reporting the data\cr
 site_no \tab character \tab The USGS site number \cr
-datetime \tab POSIXct \tab The date and time of the value converted to UTC (if asDateTime = TRUE), \cr
+datetime \tab POSIXct \tab The date and time of the value converted to UTC (if asDateTime = \code{TRUE}), \cr
 \tab character \tab or raw character string (if asDateTime = FALSE) \cr
 tz_cd \tab character \tab The time zone code for datetime \cr
 code \tab character \tab Any codes that qualify the corresponding value\cr
...
@@ -8,7 +8,7 @@ importWaterML1(obs_url, asDateTime = FALSE, tz = "")
 \arguments{
 \item{obs_url}{character containing the url for the retrieval}
-\item{asDateTime}{logical, if TRUE returns date and time as POSIXct, if FALSE, Date}
+\item{asDateTime}{logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date}
 \item{tz}{character to set timezone attribute of datetime. Default is an empty quote, which converts the
 datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
...
@@ -8,7 +8,7 @@ importWaterML2(obs_url, asDateTime = FALSE, tz = "")
 \arguments{
 \item{obs_url}{character containing the url for the retrieval}
-\item{asDateTime}{logical, if TRUE returns date and time as POSIXct, if FALSE, Date}
+\item{asDateTime}{logical, if \code{TRUE} returns date and time as POSIXct, if \code{FALSE}, Date}
 \item{tz}{character to set timezone attribute of datetime. Default is an empty quote, which converts the
 datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
...
@@ -6,7 +6,7 @@
 readNWISpCode(parameterCd)
 }
 \arguments{
-\item{parameterCd}{character of USGS parameter codes. This is usually an 5 digit number.}
+\item{parameterCd}{character of USGS parameter codes (or multiple parameter codes). This is usually an 5 digit number.}
 }
 \value{
 parameterData data frame with all information from the USGS about the particular parameter.
@@ -26,7 +26,6 @@ Imports data from NWIS about a measured parameter based on user-supplied parameter
 This function gets the data from here: \url{http://nwis.waterdata.usgs.gov/nwis/pmcodes}
 }
 \examples{
-# These examples require an internet connection to run
 paramINFO <- readNWISpCode(c('01075','00060','00931'))
 }
 \keyword{USGS}
...
@@ -3,13 +3,13 @@
 \alias{readNWISqw}
 \title{Raw Data Import for USGS NWIS QW Data}
 \usage{
-readNWISqw(siteNumbers, pCodes, startDate = "", endDate = "",
+readNWISqw(siteNumbers, parameterCd, startDate = "", endDate = "",
   expanded = TRUE, reshape = FALSE, tz = "")
 }
 \arguments{
 \item{siteNumbers}{character of USGS site numbers. This is usually an 8 digit number}
-\item{pCodes}{character of USGS parameter code(s). This is usually an 5 digit number.}
+\item{parameterCd}{character of USGS parameter code(s). This is usually an 5 digit number.}
 \item{startDate}{character starting date for data retrieval in the form YYYY-MM-DD. Default is "" which indicates
 retrieval for the earliest possible record.}
@@ -17,12 +17,14 @@ retrieval for the earliest possible record.}
 \item{endDate}{character ending date for data retrieval in the form YYYY-MM-DD. Default is "" which indicates
 retrieval for the latest possible record.}
-\item{expanded}{logical defaults to TRUE. If TRUE, retrieves additional information. Expanded data includes
+\item{expanded}{logical defaults to \code{TRUE}. If \code{TRUE}, retrieves additional information. Expanded data includes
 remark_cd (remark code), result_va (result value), val_qual_tx (result value qualifier code), meth_cd (method code),
-dqi_cd (data-quality indicator code), rpt_lev_va (reporting level), and rpt_lev_cd (reporting level type).}
-\item{reshape}{logical. Will reshape the data to a wide format if TRUE (default is FALSE). This is only
-available for 'expanded' data.}
+dqi_cd (data-quality indicator code), rpt_lev_va (reporting level), and rpt_lev_cd (reporting level type). If \code{FALSE},
+only returns remark_cd (remark code) and result_va (result value). Expanded = \code{FALSE} will not give
+sufficient information for unbiased statistical analysis.}
+\item{reshape}{logical, reshape the data. If \code{TRUE}, then return a wide data frame with all water-quality in a single row for each sample.
+If \code{FALSE} (default), then return a long data frame with each water-quality result in a single row.}
 \item{tz}{character to set timezone attribute of datetime. Default is an empty quote, which converts the
 datetimes to UTC (properly accounting for daylight savings times based on the data's provided tz_cd column).
@@ -48,6 +50,8 @@ Name \tab Type \tab Description \cr
 url \tab character \tab The url used to generate the data \cr
 queryTime \tab POSIXct \tab The time the data was returned \cr
 comment \tab character \tab Header comments from the RDB file \cr
+siteInfo \tab data.frame \tab A data frame containing information on the requested sites \cr
+variableInfo \tab data.frame \tab A data frame containing information on the requested parameters \cr
 }
 }
 \description{
@@ -59,10 +63,10 @@ A list of statistic codes can be found here: \url{http://nwis.waterdata.usgs.gov
 siteNumbers <- c('04024430','04024000')
 startDate <- '2010-01-01'
 endDate <- ''
-pCodes <- c('34247','30234','32104','34220')
-rawNWISqwData <- readNWISqw(siteNumbers,pCodes,startDate,endDate)
-rawNWISqwDataReshaped <- readNWISqw(siteNumbers,pCodes,
+parameterCd <- c('34247','30234','32104','34220')
+rawNWISqwData <- readNWISqw(siteNumbers,parameterCd,startDate,endDate)
+rawNWISqwDataReshaped <- readNWISqw(siteNumbers,parameterCd,
   startDate,endDate,reshape=TRUE)
 }
 \seealso{
...
@@ -6,7 +6,7 @@
 readNWISsite(siteNumbers)
 }
 \arguments{
-\item{siteNumbers}{character USGS site number. This is usually an 8 digit number}
+\item{siteNumbers}{character USGS site number (or multiple sites). This is usually an 8 digit number}
 }
 \value{
 A data frame with at least the following columns:
...
No preview for this file type