The Availability of Water

Oct. 23, 2012

Introduction
Water is the natural capital of the growing world population. Services built on our natural capital are the currency of the 21st century. The timing and spatial distribution of surface water quantity—and the variability in quality of that water—define how we design and build the infrastructure necessary for our energy, agriculture, mining, transportation, and industrial sectors.

While water supports our infrastructure, it can also take lives. Droughts and floods are threats that require our constant vigilance. Our abilities to predict flooding, plan for droughts, and support healthy ecosystems are challenged by land-use and climate change. Safe drinking water sources and entire ecosystems depend on continuous improvements in our understanding of, and efforts to protect, our water resources.

In fact, it is difficult to overstate the importance of the availability, reliability, and accuracy of data from water monitoring programs. Today’s hydrometric monitoring networks range from volunteer stewardship of small watersheds to continental-scale programs. Collectively, they are the basis for every action taken to support beneficial uses of water and to minimize threats from water.

Written for water resource managers, this whitepaper outlines the five essential elements of a successful hydrological monitoring program:

1. Quality Management System

2. Network Design

3. Technology

4. Training

5. Data Management

The day-to-day work of the stream hydrographer has changed substantially from even a decade ago. It is time to review how these changes impact the end-to-end system for collecting and publishing credible and defensible data. This document presents a modern “best practices” approach to hydrometric monitoring. The practices are fully scalable to any size of network and can improve the availability, reliability, and accuracy of all of our water information assets.

Quality Management System
A Quality Management System (QMS) is a set of standard operating procedures that govern the data production process to ensure that the data are of consistent, known quality. Every monitoring program requires clear objectives for (1) data quality, (2) service, and (3) security that are closely linked with the needs of the end users. The QMS provides rules to direct and control an organization towards meeting these quality management objectives.

In evaluating or creating a QMS, water resource managers must keep in mind the concept of “fitness for purpose.” Data adequate to order an evacuation of a floodplain, for example, may be inadequate for testing a hypothesis about a trend. End-users of data develop a trust relationship with data providers based on their confidence that the quality management objectives (for data quality, service, and security) have been met with respect to their intended purpose.

Quality Objectives. Quality is a result of observation and information production processes. These processes need to be enforced by formal compliance with documented standard operating procedures. There are several industry sources for hydrometric standards, including

  • US Geological Survey (USGS) Techniques & Methods
  • USGS Techniques of Water Resources Investigations
  • ISO Technical Committee 113
  • World Meteorological Organization (WMO) Operational Hydrology Reports

A commitment to internationally accepted technical standards provides a basis for inter-comparability of data. Data produced by different agencies (or even by different hydrographers within the same agency) should have similar accuracy and precision. This means that if hydrographers were to independently monitor the same gauge, the resultant discharge hydrographs would be very similar and without systematic bias.

Service Objectives. The service objectives address the completeness of the data (for given levels of quality assurance at different lag times since observation). Historically, hydrometric data were published annually, as aggregated daily values and extreme statistics. Today, the focus is on real-time, continuous publication of unit value data. A modern hydrometric service needs to address evolving expectations for data reliability and timeliness.

Achieving the desired service objectives is primarily a function of the balance between:

  • staffing (e.g., response time for instrument failure);
  • equipment specifications (i.e., instrument reliability);
  • life-cycle management of equipment (i.e., calibration and control procedures);
  • efficiencies in data production (e.g., automated notifications, auto-corrections, and auto-publication); and,
  • feedback from the data production process (e.g., sufficient metadata to support a continuous improvement process).
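
As a simple illustration of the completeness element of these service objectives, the Python sketch below computes the fraction of expected unit values actually received within a reporting window. The 15-minute reporting interval and the record structure are assumptions for illustration only, not a prescribed format.

    from datetime import datetime, timedelta

    def completeness(timestamps, start, end, interval=timedelta(minutes=15)):
        """Fraction of expected unit values present in [start, end)."""
        expected = int((end - start) / interval)   # scheduled observations in the window
        received = sum(1 for t in timestamps if start <= t < end)
        return received / expected if expected else 1.0

    # Example: one value missing out of four expected in a one-hour window.
    obs = [datetime(2012, 10, 23, 0, 0),
           datetime(2012, 10, 23, 0, 15),
           datetime(2012, 10, 23, 0, 45)]
    print(completeness(obs, datetime(2012, 10, 23), datetime(2012, 10, 23, 1)))  # 0.75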

There is also an increasing expectation that data should be openly discoverable, searchable, and accessible. Harmonized standards for data interoperability are provided by the Open Geospatial Consortium (OGC). For example, the WaterML2.0 standard provides for the exchange of (1) point-based time series data, (2) processed values such as forecasts and aggregations, and (3) relevant information on monitoring points, procedures, and context. By working within the OGC framework, water resource managers ensure that observations can be provided in the context of relevant coverages and features.
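
For illustration, the Python sketch below extracts time-value pairs from a WaterML2.0 document. The namespace and element names (MeasurementTVP, time, value) are assumed from the WaterML2.0 time-value-pair encoding and should be verified against the documents your service actually returns.

    import xml.etree.ElementTree as ET

    # Namespace and element names assumed from the WaterML2.0 time-value-pair
    # encoding; verify against the documents actually served.
    WML2 = "{http://www.opengis.net/waterml/2.0}"

    def read_timeseries(xml_text):
        """Extract (time, value) pairs from a WaterML2.0 measurement time series."""
        root = ET.fromstring(xml_text)
        points = []
        for tvp in root.iter(WML2 + "MeasurementTVP"):
            t = tvp.findtext(WML2 + "time")
            v = tvp.findtext(WML2 + "value")
            if t is not None and v is not None:
                points.append((t, float(v)))
        return points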

Security Objectives. Hydrometric data are valuable. There are large capital, human, and operational investments in discharge information. The security objectives aim to protect these investments over the life of the data. In a well maintained data management environment, the value of the data accrues with time.

But any information legacy is vulnerable to neglect, loss, and destruction. Technological advancements can result in fragmented records and incompatible formats. Continuity between modern systems and historical archives must be managed with care and diligence.

The Global Climate Observing System (GCOS) Principles provide several best practices for maintaining data integrity when managing time series data. In particular: “The details and history of local conditions, instruments, operating procedures, data processing algorithms, and other factors pertinent to interpreting data (i.e., metadata) should be documented and treated with the same care as the data.”

Best practices for data curating ensure that (1) the data are secure and stored out of harm’s way, (2) metadata are complete, and (3) documentation is available for any changes in methods that could potentially impact the integrity of the data.

Results Focus. It is one thing to clearly articulate the desired data quality, service, and security objectives. However, the Quality Management System must also verify that the product meets the needs of end-users. Any departure from expected results should provide feedback, creating a loop of continuous improvement. The needs of end-users change with time so the QMS has to be adaptive.

Verifying that the quality objectives have been met is a two-step process. Quality Control is a system of routine and consistent checks to ensure data integrity, completeness, and compliance with stated standard operating procedures. Quality Assurance is a system of independent review procedures to verify that the data quality objectives are met.
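
The Python sketch below illustrates the kind of routine quality control checks referred to here: a plausible-range test and a flat-line (stuck sensor) test. The thresholds are placeholders to be set per station under the QMS, not recommended values.

    def qc_flags(values, low=-0.5, high=10.0, max_repeats=8):
        """Flag each value in a stage series as 'ok', 'range', or 'flatline'.
        Thresholds are station-specific placeholders set under the QMS."""
        flags, run = [], 0
        for i, v in enumerate(values):
            if not (low <= v <= high):
                flags.append("range")      # outside the plausible physical range
                run = 0
                continue
            run = run + 1 if i and values[i - 1] == v else 0
            flags.append("flatline" if run >= max_repeats else "ok")
        return flags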

Most National Hydrometric Services have developed their own QMS; however, some are choosing to become certified under the standardized ISO 9000 family of quality management standards.

Network Design
Network design is an ongoing process with new stations being established and existing stations being discontinued as program priorities and funding evolve. This process must be managed with selective thinning and pruning, while nurturing new growth to fill data voids. Updating the design of a network is fundamentally a sampling problem. The challenge is to find the right balance between hydrometric monitoring objectives and site desirability.

Sampling the Phenomena of Interest. How will the information be used? The design process must begin with the end in mind. Locations upstream and downstream of dams or diversions are both useful, but for very different purposes. An upstream location integrates all runoff processes occurring in the contributing watershed, whereas a downstream location is rich in information about what will be happening in receiving aquatic and riparian ecosystems. A good location is one where the variation in discharge is sensitive to the phenomena of interest.

Figure 1. Recommended minimum station density (area in km² per station)

The monitoring objectives determine which parameters need to be included in the network design. If the objective is regulatory compliance or to develop statistics for engineering design, then perhaps the only parameter needed is discharge. However, if the purpose is to understand runoff processes, to develop water management policies, or to calibrate predictive models, then network design should consider all relevant components of the water cycle, including stores (e.g., groundwater, snowpack, and lake levels) and fluxes (e.g., temperature, evaporation, and precipitation). The measurement of some parameters (e.g., sediment and water quality) must be co-located with discharge gauging if loadings are a requirement. Jurisdictional collaboration is integral to the network design process and ensures an efficient, coordinated approach to monitoring within a watershed.

Sampling the Hydroscape. The design of a successful hydrometric monitoring network must next consider how the variability in space needs to be sampled so that the variability in time can be effectively monitored. In other words, the location of gauges should reflect the geophysical complexity of the landscape. In order to satisfy the assumption that the data are scalable and representative, gauges must be located across the scale of the geophysical variability of the watershed.

The WMO Guide to Hydrological Practices recommends the station densities shown in Figure 1. Ultimately, the pragmatic station density in a region is a function of risk tolerance. These regional-scale density recommendations may be inadequate to fully characterize local-scale threats from flooding or to provide the needed guidance for local-scale water supply management. Risk tolerance is often particularly high in the developing world, resulting in a perpetual need to react to, rather than prevent, water-related crises.

Selecting the Site. Once the monitoring objectives and criteria for geophysical representativeness are established, then a specific reach of river can be selected for monitoring. A desirable location is one with (1) uniform, gradually varying flow, (2) inexpensive site access, (3) stable geophysical features for vertical control benchmarks and for channel control, and (4) safe stream gauging conditions.

Monitoring objectives often restrict the choice of possible locations to those with adverse monitoring conditions. A mismatch between local conditions and appropriate technology results in poor quality data and high maintenance requirements for both field and office procedures. Technologies are available to mitigate almost any compromise needed in site selection, but the most reliable and affordable solutions are predicated on good site selection.

Site selection affects the following outcomes:

  • data persistence (i.e., a well selected location should produce data for generations to come),
  • data quality (e.g., conformance with underlying assumptions),
  • data representativeness (i.e., relevance to ungauged locations),
  • operational costs (e.g., site access),
  • liability risks (i.e., occupational and/or public safety),
  • selection of methods (e.g., use of rating curve vs. index velocity method), and
  • reliability risks (e.g., exposure to vandalism).

With so much at stake, a significant investigation is warranted for any change in network size. Unfortunately, water resource managers often come under pressure to expand or contract the network on short notice (for example, to make the change by fiscal year end). Thus, many important decisions are made in haste. As a best practice, network design should be an ongoing process with preparedness to make wise choices on short notice.

Technology
Selecting the best technology for a given location is more complex than ever before. Even when choosing a simple pressure transducer, a hydrologist must consider the type (e.g., piezoelectric, capacitive, inductive, potentiometric, vibrating wire, vibrating cylinder, or strain-gauge) and the method of deployment (e.g., bubbler, vented, or compensated). For each combination of these technologies there are numerous vendors and products available, and each product has a performance specification that can be characterized by an error band, hysteresis, resolution, sensitivity, and time constant.

Hydrometric network operators must consider several additional factors:

  • Reliability requirements: an acceptable mean time between failures.
  • Accuracy in the deployed setting: the blanking distance of some acoustic Doppler current profilers (ADCPs), for example, may be too great to correctly measure discharge for some stream geometries.
  • Cost of site access: for remote sites, the incremental cost of an acoustic Doppler velocity meter (ADVM) used with an index-velocity model may be easily recouped by reduced site visits (a sketch of the index-velocity computation follows this list).
  • Local site factors: high sediment transport, algal blooms, and river ice all warn against deploying expensive submersible technology.
  • Instrument sensitivity and precision: these determine the time and effort spent on post-processing of the data.
  • Training and familiarity: limiting the variety of products deployed in a region can greatly reduce both the training burden and the likelihood of blunders caused by a lack of familiarity with a specific device.
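
A minimal sketch of the index-velocity computation mentioned above, assuming a simple linear rating between index velocity and cross-section mean velocity and a hypothetical stage-area relation; operational ratings may also depend on stage and are fitted from repeated gaugings.

    def discharge_index_velocity(stage, v_index, area_of_stage, a=0.05, b=0.95):
        """Q = mean velocity x area, where mean velocity is estimated from the
        ADVM index velocity via a fitted linear rating: V_mean = a + b * V_index.
        The coefficients a and b here are hypothetical."""
        v_mean = a + b * v_index
        return v_mean * area_of_stage(stage)

    # Example with a hypothetical rectangular channel 20 m wide:
    q = discharge_index_velocity(stage=1.8, v_index=0.60, area_of_stage=lambda h: 20.0 * h)
    print(round(q, 1))  # about 22.3 m3/s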

Total Cost of Ownership. Factors that affect the total cost of ownership of technology include the initial capital cost, field calibration and service frequency requirements, unscheduled field visits to repair or replace, time and effort spent on corrections and post-processing of the data, data lost due to sensor failure, the amount of data degraded by high uncertainty, and supplies (e.g., compressed gas and/or power source). Money saved at the time of purchase can be easily exceeded by operations and maintenance costs.

Low-cost monitoring equipment does, nonetheless, have its place. For example, in monitoring a high-risk location (e.g., during a dynamic river ice breakup), one needs to get as much data as possible before the sensor is inevitably lost or destroyed. There can be as much as an order of magnitude of difference in the cost of sensors. Low-cost sensors have also led to the concept of “a network as a sensor” where several redundant sensors can be deployed at a gauge. In some cases, it is advantageous to use the average of these independent, if imprecise, measurements and also get a measure of the aggregate uncertainty. This concept also lends itself to deploying many low-cost sensors to sample landscapes at the scale of space-based observation systems.
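
As a small illustration of the “network as a sensor” idea, the sketch below pools redundant (hypothetical) stage readings from co-located low-cost sensors, reporting the mean and the standard error of the mean as a simple measure of aggregate uncertainty.

    import statistics

    def pooled_reading(readings):
        """Mean of redundant sensor readings and the standard error of the mean,
        used here as a simple aggregate-uncertainty measure."""
        mean = statistics.fmean(readings)
        sem = statistics.stdev(readings) / len(readings) ** 0.5
        return mean, sem

    # Five inexpensive stage sensors at one gauge (hypothetical values, in metres):
    print(pooled_reading([1.21, 1.19, 1.24, 1.20, 1.22]))  # roughly (1.212, 0.009)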

In the context of total cost of ownership, telecommunication technologies offer a significant improvement in data reliability as a result of real-time station health monitoring and improvements in the timing of stream gauging activities.

Training
No investment in technology can compensate for poor choices in data collection and data handling. Errors caused by procedural blunders are the most difficult to detect and correct in data post-processing. Training accelerates the rate at which competencies are gained while simultaneously reducing the frequency of blunders. Training is, arguably, more important than ever. The demographic profile of many monitoring agencies today has a double hump of new recruits and pre-retirees, creating an urgent need to compensate for the loss of experience with improvements in knowledge.

Stream hydrographers must be skilled in many disciplines to be truly effective. The measurement of flowing water is a sophisticated application of science and engineering principles. Decisions made in the field and for data interpretation require a basic understanding of physics, chemistry, biology, hydrology, hydrodynamics, fluvial geomorphology, math, and statistics.

Additionally, the installation and operation of hydrometric monitoring equipment requires skills in plumbing, wiring, and programming. Stream gauging requires expert interpretation of quality management protocols with respect to the selection and application of methodologies while considering the specific context of the measurement conditions. The stream hydrographer must make decisions to limit adverse environmental effects and to preserve both personal and public safety.

While there are limited options for training, here are some sources to consider. Some National Hydrometric Services (e.g., USGS) offer courses to the general public. Short courses in hydrometric methods are also available from hardware and software vendors, various colleges, and UNESCO. Some useful, if limited, online training resources include

  • USGS Surface Water Training
  • World Hydrological Cycle Observing System (WHYCOS)
  • University of Idaho
  • Humboldt College
  • COMET Training

Investments in training improve data quality, increase productivity, improve gauge reliability, and enhance safety. Training in stream hydrography must be a continuous process to keep current with best practices as they apply to new and emerging technologies.

Data Management
Improvements to hydrological monitoring programs often focus on field-based technologies. What is frequently overlooked is how the data are managed after acquisition. Hydrologic data are complex. Stream hydrographers are responsible for storing, validating, analyzing, and reporting on vast amounts of water data.

Specialized Hydrologic Data Management Systems are available to meet the evolving needs of hydrologists and to support current industry standards for water information management. Software designed specifically for hydrologists is required to achieve excellence and effectiveness in any hydrological monitoring program. The advantages of modern Data Management Systems are explored next.

Auditable and Defensible Data. As discussed, the Quality Management System establishes the credibility of the data production process. One important role of the Data Management System is to establish the defensibility of the data by providing evidence of compliance with the QMS. This means the Data Management System must preserve the full history of the data, including who did what, when, how, and why.

As a best practice, raw data must be preserved intact and all changes must be recorded and be reversible, if needed. This means that data can be rolled back in time to show exactly what edits, corrections, approvals, or notes were applied at any point in time. This is particularly important when dynamically publishing data using webpages or Web services, as opposed to static documents. The complete history (of who did what, when, where, how, and why) supports peer quality control and supervisory quality assurance. This history confirms the second half of the quality management mantra: “say what you do, do what you say.”
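
A minimal sketch of the append-only edit history this implies: raw values are never overwritten, every correction is recorded with who, when, and why, and the series can be reconstructed as it stood at any point in time. The structure is illustrative only, not that of any particular product.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Callable, Dict, List

    @dataclass
    class Edit:
        who: str
        when: datetime
        why: str
        apply: Callable[[Dict], Dict]      # a pure transformation of the series

    @dataclass
    class AuditedSeries:
        raw: Dict                          # raw data are preserved intact
        history: List[Edit] = field(default_factory=list)

        def record(self, edit: Edit) -> None:
            self.history.append(edit)      # append-only: nothing is destroyed

        def as_of(self, when: datetime) -> Dict:
            """Reconstruct the series as it stood at a given point in time."""
            data = dict(self.raw)
            for e in self.history:
                if e.when <= when:
                    data = e.apply(data)
            return data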

Centralized and Accessible Data. Hydrologists must manage many types of data in all kinds of formats, for example: lab data in Excel, time series in CSV, gauging data in hardware vendor software, and station data in GIS. As a best practice, all of this data and supporting metadata are consolidated and managed as a secure, coherent collection. The best solutions support relational queries of this data collection. Web service connections to this database mean that data and metadata are accessible from anywhere, at any time.
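
A toy illustration of such a consolidated, queryable store, using Python's built-in sqlite3 module; the schema, station identifiers, and values are hypothetical, and a production system would use a purpose-built hydrologic database.

    import sqlite3

    # Toy schema: station metadata and unit-value time series in one relational store.
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE station (id TEXT PRIMARY KEY, name TEXT, basin TEXT);
        CREATE TABLE ts (station_id TEXT REFERENCES station(id), t TEXT, stage REAL);
    """)
    db.execute("INSERT INTO station VALUES ('STN01', 'Example Creek near Townsville', 'Example Basin')")
    db.execute("INSERT INTO ts VALUES ('STN01', '2012-10-23T00:00Z', 1.21)")

    # A relational query joining the data with its metadata:
    for row in db.execute("""SELECT s.name, ts.t, ts.stage
                             FROM ts JOIN station s ON s.id = ts.station_id
                             WHERE s.basin = 'Example Basin'"""):
        print(row)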

Real-Time Data and Automation. A modern hydrometric monitoring system delivers data dynamically in real-time. Ideally, the best data are continuously available and can be served using international standards for interoperability. This means that end-users benefit as soon as new data are appended, erroneous values are filtered, corrections are applied, rating curves are updated, or shift corrections are applied. The best solutions also provide end-users with informative metadata about the quality and status of the data. Data can be filtered based on the state of the data in the QMS process. Archival quality data are clearly identified and “locked” from further editing.

Automated notifications provide timely warnings about hydrological events and alert hydrographers to any faults or station health indicators that require immediate attention. Automated data correction algorithms censor invalid values and correct persistent and/or predictable errors in real-time. This eliminates some of the most onerous and repetitive tasks, allowing the stream hydrographer to focus on high value interpretive analysis. Automated reporting provides high value data products to water resources professionals and decision makers on an event-driven or scheduled basis.
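
One common correction of the kind described, sketched in Python: implausible values are censored (set to missing) and a constant offset (a hypothetical, station-specific drift or datum correction) is applied, while the raw record is left untouched for the archive.

    def auto_correct(raw, low=-0.5, high=10.0, offset=0.0):
        """Censor implausible values and apply a constant correction offset.
        Returns a new list; the raw record itself is never modified."""
        corrected = []
        for v in raw:
            if v is None or not (low <= v <= high):
                corrected.append(None)        # censored: marked missing, not deleted
            else:
                corrected.append(v + offset)  # e.g., a datum or drift correction
        return corrected

    print(auto_correct([1.2, 99.9, 1.3, None, 1.4], offset=0.02))
    # approximately [1.22, None, 1.32, None, 1.42]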

Credible Rating Curves. The best solutions for developing and validating rating curves are engineered from basic hydraulic principles. The full suite of information gathered in the field is relevant to the calibration process, not just the x, y coordinates of the rating measurements. This includes consideration of site photos, cross sections, field notes, measurement quality, control conditions, historical ratings, and the time series of stage data. It has been shown to be less work and more accurate to use an evidence-based approach to curve-fitting rather than to be forever “chasing” the curve using statistical regression techniques.
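
For reference, the conventional single-segment stage-discharge model is a power law, Q = C(h - h0)^b, where h0 approximates the gauge height of zero flow. The sketch below fits such a curve to a handful of hypothetical gaugings using SciPy; an evidence-based workflow would constrain C, b, and h0 with the field evidence listed above rather than rely on the least-squares fit alone.

    import numpy as np
    from scipy.optimize import curve_fit

    def rating(h, C, h0, b):
        """Single-segment power-law rating: Q = C * (h - h0) ** b."""
        return C * np.clip(h - h0, 1e-6, None) ** b

    # Hypothetical gaugings: stage (m) and measured discharge (m^3/s).
    stage = np.array([0.42, 0.65, 0.90, 1.30, 1.85])
    discharge = np.array([1.1, 3.0, 6.2, 14.0, 29.0])

    (C, h0, b), _ = curve_fit(rating, stage, discharge, p0=[10.0, 0.2, 1.7])
    print(f"Q = {C:.1f} * (h - {h0:.2f}) ** {b:.2f}")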

With modern hydrometric monitoring systems, discharge derivation models are calibrated with respect to underlying hydraulic science and engineering principles. The result is:

  • improved confidence in extrapolation (within the range of known channel geometry);
  • improved agreement on a solution (i.e., different hydrographers will independently produce similar results); and
  • improved defensibility of results (i.e., rating curve parameters help to constrain the solution).

It is often necessary to accommodate shifting channel control conditions with corrections to the stage-discharge model. The best solutions for managing shift corrections include the inspection and interpretation of field observations, residual plots, and time series visualizations.
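
A minimal sketch of one such shift correction, assuming shifts determined at two field visits are prorated linearly in time and added to the recorded stage before the rating is applied; the dates and shift magnitudes are hypothetical.

    from datetime import datetime

    def shifted_stage(t, stage, t1, shift1, t2, shift2):
        """Prorate a shift correction linearly in time between two measurements
        and return the effective stage to use for the rating lookup."""
        frac = (t - t1) / (t2 - t1)          # 0 at the first visit, 1 at the second
        shift = shift1 + frac * (shift2 - shift1)
        return stage + shift

    # Hypothetical shifts of -0.02 m and -0.05 m determined three weeks apart:
    h_eff = shifted_stage(datetime(2012, 10, 15), 1.30,
                          datetime(2012, 10, 1), -0.02,
                          datetime(2012, 10, 22), -0.05)
    print(round(h_eff, 3))  # 1.26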

Advanced visual interpretation and analysis of the data is needed to identify errors that cannot be detected automatically. Sophisticated graphical tools available with data management systems make it easier to calibrate time series data using field observations from a reference gauge. Specialized corrections can be made for many of the common, often repetitive, errors typical of the technologies used for hydrometric monitoring. Sophisticated methods are needed to estimate longer gaps in the data and for periods of ice effect. Comprehensive markup capabilities are also required to comment on these actions, add event markers and quality grades, and change the status of the data.

Data Visualization, Correction, and Markup

Reporting and Publication. The best data management systems provide for continuity in reporting with customizable report templates that can be tailored to match legacy reports. New high-value reports can be developed from scratch or by modifying templates for industry standard reports. The content for the reports can be filtered according to status in the QMS so that reports of archival quality data can be readily produced for conventional publication. Access to Web services provides the ability to dynamically publish data, based on metadata filters, using industrywide standards.

About the Author

Stu Hamilton

Stu Hamilton is a senior hydrologist at Aquatic Informatics.
