Got Tough Questions? Ask a Model

Computer simulations make it possible to try out new ideas without the risk of disrupting communities or ecosystems, and to evaluate how a watershed would respond to acts of nature or of engineering. Jeff Shiner believes computer modeling gives engineers and decision makers a set of powerful tools to make better decisions for the wellbeing of the community and the health of the watershed. Simulations also make it possible to envision nature’s potential for good or for ill, and to be ready before extraordinary events occur.

Shiner, a civil engineer with the Metropolitan St. Louis Sewer District (MSD) in Missouri, notes that MSD has developed an ongoing relationship with the software provider Vieux & Associates to perform important computer modeling tasks for its stormwater program. According to Shiner, the company’s RainVieux software is a workhorse of the program. The agency uses the package to obtain the highly accurate rainfall data it needs for planning. RainVieux computationally combines radar rainfall data with measurements from rain gauges MSD owns and operates throughout the county. The application allows users to download, display, and query rainfall in near real time.

Using RainVieux, MSD crosschecks its rain gauges against radar data and receives bias-corrected rainfall reports on a daily basis. This allows managers to pinpoint malfunctioning or errant rain gauges whose readings fall outside the bounds of probability and then repair, replace, or calibrate them, ensuring the data are as accurate as possible.
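RainVieux’s algorithms are proprietary, but the general idea behind gauge-radar merging can be sketched in a few lines. The Python example below is a minimal illustration, not MSD’s or Vieux’s actual code: it computes a mean-field bias factor from collocated gauge and radar totals, applies it to the radar grid, and flags gauges whose readings fall far outside the group. The function names and the tolerance are assumptions for illustration only.

```python
import numpy as np

def mean_field_bias(gauge_totals, radar_at_gauges):
    """Ratio of summed gauge rainfall to summed radar rainfall at the gauge pixels."""
    return gauge_totals.sum() / radar_at_gauges.sum()

def bias_correct(radar_grid, bias):
    """Scale every radar pixel by the gauge-derived bias factor."""
    return radar_grid * bias

def flag_suspect_gauges(gauge_totals, radar_at_gauges, tol=3.0):
    """Flag gauges whose gauge/radar ratio is far from the network median."""
    ratios = gauge_totals / np.maximum(radar_at_gauges, 0.01)
    median = np.median(ratios)
    return np.flatnonzero((ratios > tol * median) | (ratios < median / tol))

# Daily totals (mm) at five gauges and radar estimates over those pixels
gauges = np.array([25.0, 30.0, 22.0, 28.0, 2.0])  # last gauge looks clogged
radar = np.array([20.0, 24.0, 18.0, 23.0, 21.0])
print(flag_suspect_gauges(gauges, radar))          # -> [4]
print(mean_field_bias(gauges[:4], radar[:4]))      # ~1.24
```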

Shiner says that about a decade ago, hydrologists performed a study to develop rainfall depth, duration, and frequency (DDF) curves for St. Louis County to help engineers envision the effects of precipitation from the storms the region would be likely to dish out. More recently, MSD updated the data for its DDF curves, using the capabilities of RainVieux to analyze thousands of storm cells that had passed over the county. The studies verified the rainfall statistics, giving the municipality a reliable basis for capital improvement planning.
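The statistical machinery behind a DDF curve is well established, even though the MSD study itself drew on RainVieux’s storm-cell archive. As a hedged illustration, the sketch below fits a Gumbel distribution by the method of moments to a series of annual-maximum depths for a single duration and returns the depth for a chosen return period; the sample data are invented.

```python
import numpy as np

def gumbel_depth(annual_max, return_period_yr):
    """Depth (same units as input) for a return period, Gumbel fit by method of moments."""
    mean, std = annual_max.mean(), annual_max.std(ddof=1)
    beta = std * np.sqrt(6) / np.pi    # Gumbel scale parameter
    mu = mean - 0.5772 * beta          # Gumbel location parameter
    p = 1.0 - 1.0 / return_period_yr   # annual non-exceedance probability
    return mu - beta * np.log(-np.log(p))

# Invented 24-hour annual-maximum depths (inches); one point on a DDF curve:
depths = np.array([2.1, 3.4, 2.8, 4.0, 2.5, 3.1, 2.2, 3.7, 2.9, 3.3])
print(gumbel_depth(depths, 100))  # estimated 100-year, 24-hour depth
```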

A Stream of Knowledge
Jared Barr, an engineer with MSD, says MSD has embarked on a few additional modeling exercises using a different Vieux & Associates software package called Vflo. According to Barr, it works seamlessly with RainVieux, combining Vflo’s hydrology model with RainVieux’s highly accurate rainfall data to simulate runoff during storms.

Barr says MSD maintains a high-resolution light detection and ranging (LIDAR) dataset for the county’s topography, which is updated about every two years, along with digitized impervious area data from aerial photography, both of which can be easily entered into the Vflo program. In addition, he supplies the model with infiltration parameters from various sources such as the US Geological Survey soil data sets or the National Land Cover Dataset.
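The article doesn’t spell out Vflo’s equations, but distributed runoff models commonly derive Green-Ampt infiltration parameters from exactly these kinds of soils datasets. A minimal sketch, assuming a simple texture-class lookup (representative values after Rawls et al., 1983; verify against local soils data before any real use) and the standard Green-Ampt rate equation:

```python
# Representative Green-Ampt parameters by soil texture (after Rawls et al., 1983):
# saturated hydraulic conductivity Ks (cm/hr), wetting-front suction psi (cm),
# and effective porosity.
GREEN_AMPT = {
    "sand": (11.78, 4.95, 0.417),
    "loam": (0.34, 8.89, 0.434),
    "clay": (0.03, 31.63, 0.385),
}

def infiltration_rate(texture, cum_infiltration_cm, moisture_deficit):
    """Green-Ampt potential infiltration rate: f = Ks * (1 + psi * d_theta / F)."""
    ks, psi, _ = GREEN_AMPT[texture]
    f_cum = max(cum_infiltration_cm, 1e-6)  # avoid division by zero at t = 0
    return ks * (1.0 + psi * moisture_deficit / f_cum)

print(infiltration_rate("loam", 1.0, 0.3))  # cm/hr early in a storm
```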

After running the model, Barr says, the results become very visible. “When you’re looking at the model, you see a bunch of grid cells, and then you’ll see arrows indicating flow path. Obviously, you can add background images. Without doing any of the computations, you can select one of the pixels and the program will highlight all of the other pixels that are upstream of that area, so it gives you an indication of what the drainage area is immediately, or the size of the drainage area. When you perform your computation for whatever event you’re looking at, it will generate hydrographs for that outlet or for a set of outlets. You can also choose other observation areas within the watershed—it will generate a hydrograph for that area, and it can give you an indication of depth of flow for those areas as well.”
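The upstream-highlighting Barr describes is a standard operation on a flow-direction grid. The sketch below is not Vflo’s code, just a minimal Python illustration: given a map of which cell each cell drains into (the output of a D8-style flow-direction analysis), it walks upstream from a selected outlet and returns every contributing cell.

```python
from collections import defaultdict

def upstream_cells(flow_to, outlet):
    """All cells draining to `outlet`, where flow_to[cell] is the cell it drains into."""
    drains_into = defaultdict(list)  # invert the drainage map
    for cell, downstream in flow_to.items():
        drains_into[downstream].append(cell)
    found, stack = set(), [outlet]
    while stack:                     # depth-first walk upstream
        for up in drains_into[stack.pop()]:
            if up not in found:
                found.add(up)
                stack.append(up)
    return found

# Toy grid: cells drain 0 -> 1 -> 2, 3 -> 2, 2 -> 4 (cell 4 is the outlet)
print(upstream_cells({0: 1, 1: 2, 3: 2, 2: 4}, 4))  # -> {0, 1, 2, 3}
```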

Barr says part of the problem with stormwater in urban areas is “you get overland flooding that can sometimes surcharge a sanitary or combined sanitary stormwater sewer.” He adds that Vflo can help analyze these situations as well.

“The advantage of Vflo is not only is it able to give you a hydrograph based on rainfall, but it can also give an estimate of what the flow path is, and what the flow depth is,” allowing engineers to evaluate the potential impacts of overland flooding. Barr says that to model overland flooding with other tools, such as 1D modeling software, “you have to have some estimate of flow path.” In contrast, he notes, “You don’t have to, outside of the program, determine your drainage path, so that makes it quicker to use. Vflo takes the whole 2D surface and it will compute for each grid cell which way the water is flowing, what’s the depth of that grid, and where the water is coming from.”
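Vflo’s solver is proprietary, but the per-cell logic Barr describes, with each grid cell passing water downslope while tracking its own depth, can be caricatured in a few lines. The sketch below takes one explicit kinematic-wave step over a grid using Manning’s equation for sheet flow; the array names, units, and time-step stability are all simplifying assumptions made for illustration.

```python
import numpy as np

def kinematic_step(depth_m, slope, n_manning, flow_to, dx_m, dt_s, rain_m):
    """One explicit step of grid-based overland routing (kinematic wave).
    flow_to[c] is the index of the cell that cell c drains into (-1 = grid edge)."""
    # Manning velocity for wide sheet flow (hydraulic radius ~ depth)
    v = depth_m ** (2.0 / 3.0) * np.sqrt(slope) / n_manning
    outflow = np.minimum(depth_m, v * dt_s / dx_m * depth_m)  # depth leaving each cell
    new_depth = depth_m - outflow + rain_m
    for c, d in enumerate(flow_to):   # route each cell's outflow downslope
        if d >= 0:
            new_depth[d] += outflow[c]
    return new_depth

# Three cells in a row draining left to right; uniform rain for one step
depth = np.array([0.01, 0.01, 0.01])
print(kinematic_step(depth, slope=0.02, n_manning=0.05,
                     flow_to=[1, 2, -1], dx_m=30.0, dt_s=10.0, rain_m=0.001))
```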

Shiner observes that the ability to run simulations is having a big influence on how engineers think about stormwater, helping them see beyond the old Band-Aid solutions of building concrete channels to treat water as an inconvenience or a hazard to be dispatched as rapidly as possible. He believes models like Vflo help practitioners take a more holistic approach to stormwater, looking at “the overall health of the watershed, and trying to determine where the source of the problems are.”

With Vflo, he says, “You can see flood waves move down the system and be confident that it’s based on good science and good input data, and that you’re going to get a reasonable and accurate result. That does help you do a better job of planning the improvements you need on a watershed-wide basis.”

True Grid
Although it is important to know where stormwater is coming from and where it is going, Arthur Deicke, owner of Environmental Pollution Solutions LLC in Santa Rosa, CA, says it is also critical in stormwater management to get a handle on the relationship between stormwater infrastructure and other aspects of a site.

When the new California Industrial Stormwater General Permit goes into effect in July 2015, every industrial facility, whether light industry or heavy manufacturing, will need to have a grasp on this kind of information and will need to produce stormwater management maps as part of its permit requirements. Deicke is well prepared to meet these new rules. As part of a volunteer team working to develop a training program for Qualified Industrial Stormwater Practitioners (QISPs), he has been helping prepare the way for others as well.

Under the new rules, permittees will be required to assign a QISP for each facility covered under the permit that has “entered Level 1 status in the Exceedance Response Action (ERA).” One of the duties of the QISP will be keeping track of stormwater management activities at the facility.

Drawing up stormwater management site maps may well be an unfamiliar task for many new practitioners, Deicke says. As a result, he believes many of the facility managers covered by the new permit requirements will not have the technical expertise or resources to produce detailed site maps from scratch using traditional CAD approaches. However, Deicke believes a web-based mapping tool called Mapistry will ease the way for managers with little or no engineering background or drafting skill to comply with the permit requirements.

In fact, as an engineer, Deicke admits that drawing up site maps used to be rather time-consuming in his own practice, but technology has recently changed the way he works.

In the past, Deicke found himself constrained to using complex CAD software for the task. “I typically used four separate maps,” he says. Mapistry, by contrast, is built on satellite imagery and GIS, and allows users to type in an address and move about the site, logging the locations of areas of concern and stormwater BMPs on the fly. “It has the tools you can use to draw lines, shapes, and polygons and other tools to label them. I take that and make a polygon noting where the facility is in relation to lakes and bays—it’s a quick and easy site map.”
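Mapistry’s internal format isn’t described in the article, but the kind of site map Deicke sketches, labeled shapes over satellite imagery, maps naturally onto GeoJSON, the convention most web mapping tools use. A hedged example, with invented coordinates and labels:

```python
import json

# Hypothetical site map: a labeled polygon over a materials-storage area and a
# point marking a BMP, expressed as GeoJSON (longitude, latitude pairs).
site_map = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {"label": "Outdoor materials storage"},
            "geometry": {
                "type": "Polygon",
                "coordinates": [[
                    [-122.7000, 38.4400], [-122.6990, 38.4400],
                    [-122.6990, 38.4410], [-122.7000, 38.4410],
                    [-122.7000, 38.4400],  # polygon rings close on the first vertex
                ]],
            },
        },
        {
            "type": "Feature",
            "properties": {"label": "Drain inlet filter BMP"},
            "geometry": {"type": "Point", "coordinates": [-122.6995, 38.4405]},
        },
    ],
}
print(json.dumps(site_map, indent=2))
```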

He says, “There is a clear advantage to using satellite images for stormwater planning maps. It brings out things you can’t see with a typical AutoCAD map. You can see what’s going on on adjoining properties, you can see trees—all of that is useful. When I’m looking at a parking lot of an industrial site, I can see if somebody is storing something that has the potential to affect stormwater runoff or if there may be something near the site that could generate run-on.”

He also appreciates the flexibility Mapistry allows. “You can take it out and enter the information on an iPad or tablet and take it back to the office or just call it in.”

For the QISP training project, Deicke and his team developed hypothetical sites, built maps, wrote plans, and imagined scenarios of what could go wrong to help prepare prospective practitioners.

Ryan Janoch, CEO of Mapistry, says a new release from the company in response to California’s new industrial permit will allow users to “quickly prepare a permit-required site map and to prepare a SWPPP [stormwater pollution prevention plan] automatically, based on information in the site map and questions they answer through the program interface—like TurboTax does for taxes.”

Janoch explains, “Our tool will have layers in the mapping tool already created that match the permit requirements, so users just have to add in their site-specific information. It also will have a built-in checklist that has the IGP requirements and a training tool that shows them how to build a site map themselves.

“The site map they create can be used for submission with a No Exposure Certification that many light industries—previously exempt from the California permit—will do, or can be submitted with the SWPPP as part of the Notice of Intent permit-required documents that the rest of the facilities will do. This will save engineers and facility managers thousands of dollars and multiple days of drafting and writing time.”
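The “TurboTax” comparison boils down to rule-driven document assembly: answers to simple questions select which permit-required sections appear in the plan. Mapistry’s implementation isn’t public in this article, so the sketch below is a purely hypothetical illustration of the pattern, with invented topics and boilerplate.

```python
# Hypothetical questionnaire-driven SWPPP assembly: each "yes" answer
# pulls the matching template section into the plan.
QUESTIONS = {
    "outdoor_storage": "Do you store materials outdoors?",
    "vehicle_fueling": "Is there on-site vehicle fueling?",
}
SECTIONS = {
    "outdoor_storage": "BMPs: cover and contain outdoor material storage areas...",
    "vehicle_fueling": "BMPs: spill kits and response procedures at fueling areas...",
}

answers = {"outdoor_storage": True, "vehicle_fueling": False}
plan = [SECTIONS[topic] for topic, yes in answers.items() if yes]
print("\n\n".join(plan))
```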

A Model Spreads Its Waters
As president and owner of GMV Engineering, Gabor Vasarhelyi relies on XP Solutions’ XPSWMM as his key simulation tool for a variety of projects related to stormwater and drainage. From his company’s base in Burnaby, BC, he has provided consulting engineering services for numerous unique projects in the Corporation of Delta, one of the most ecologically sensitive regions in North America.

With the elevation of Delta very near sea level, Vasarhelyi likens the region to the lowlands of Holland on the North Sea coast. Covering 13,000 hectares, Delta is protected by levees with 21 outfalls. The gem of the region is the rare and environmentally sensitive Burns Bog. The 3,000-hectare Burns Bog watershed, whose only source of water is direct precipitation, marks the most southerly location of a raised peat bog in North America. Apart from providing abundant wildlife habitat, hosting more than 40 species of mammals, it has also been nominated as a United Nations Educational, Scientific and Cultural Organization (UNESCO) World Heritage site, a designation that reflects its tremendous ecological significance and demands an extra level of care and stewardship. Accordingly, Vasarhelyi says, projects undertaken in the area must support Burns Bog management and restoration objectives.

According to Vasarhelyi, the landscape where GMV Engineering operates, while by no means alpine, includes areas of steep hillsides and canyons. In some of these areas mountainous terrain suddenly gives way to wide, flat floodplains, with very abrupt changes in hydrologic regime. During storms, which in the region typically last from two to five days, the unusual topography generates high-velocity flows that can almost instantly be transformed into overland flows when the water reaches the flatlands, with water flowing over the boundaries of natural channels from one watershed into neighboring watersheds.

He says a good model relies on good information and detail, but not too much of it. “My philosophy is to describe the system in as much detail as is necessary for the application, but not more.” His goal is to limit the number of simplifications and assumptions that go into describing the system, and hence to minimize the uncertainty in the results.

With projects spanning a wide spectrum, from designing storm drainage for a 40-plus-kilometer lowlands highway to irrigation enhancement projects for the agricultural portions of Delta and a comprehensive study of the entire Delta watershed system, including Burns Bog, Vasarhelyi says he finds XPSWMM’s link-node model flexible and robust enough for numerous situations. The software allows him to model the many unique features of the region.

“It’s suitable to urban and natural watersheds,” he says. “I can apply it to projects in different areas; I can use it to model pump stations, flood gates, and hydrology with real-time controls.”

Because the area is so flat, Vasarhelyi says, it is important to keep in mind that “when you make some changes in a drainage system in one location it could impact the drainage system of the large area. A key point is that when I build a development or whatever the specific project, I always analyze it in the context of the overall drainage system, so that’s not like looking at an isolated element, but looking at it as part of a large system.”

He explains, “I always analyze the specific system that is subject of my study at the time, in the context of the overall system.” In the case of a project covering perhaps 200 hectares, “you still look at it in the context of the entire 13,000-hectare watershed.”

As he works on a variety of projects for a variety of clients, Vasarhelyi says, XPSWMM is flexible enough to handle different levels of detail, making it possible to add new detail to the model as each project demands. XP Solutions says that along with link-node capability, its model also has 2D grid capability for overland flows, a combination that differentiates it from many models on the market.

One particular highway project, Vasarhelyi says, started as a conceptual model with limited detail when he began working with it eight years ago. He has since doubled the number of nodes from the original 1,500 to 3,000 and is working to generate from it customized “design-level models” for individual projects in the region.

EPA’s Top Models
EPA’s Stormwater Management Model (SWMM) has been around for more than 40 years. However, it has seen some big changes in those years, evolving with the rest of the computer world while transitioning from its first home on the most advanced mainframe computers of the time to ever more convenient hosts. Today EPA offers the program as a free download that runs on desktop PCs under Microsoft Windows.

SWMM’s source code is freely available in the public domain for anyone to work with, either to run the model or to modify it and generate upgrades of their own. According to an agency spokesperson, commercial vendors and software developers have added a lot of value to SWMM in this way with their upgraded versions of its hydrological models. These upgrades have a multiplier effect, adapting and fine-tuning the models with features applicable to an ever-expanding array of locales and situations, including many overseas that otherwise might have extremely limited access to pollution control software.

Among the features enterprising vendors and computer experts have introduced to their versions of SWMM are GIS interactivity, proprietary interface designs, and upgrades from EPA’s standard one-dimensional (1D) capability to 2D capability, making it possible to model the movement of stormwater overland and simulate the depth of floodwater as it flows over topographical features.

“What makes SWMM unique is that you can model storms as the rain falls, you can take a rainfall record and see how the sewer will respond—people can model 20 years of storms in a matter of days or hours,” says Michael Tryby, environmental engineer and expert SWMM programmer with EPA’s National Risk Management Research Laboratory.

Vasarhelyi agrees. SWMM, he says, provides a good way to simulate the natural environment. “You can simulate decades in a few hours and generate the information you need fast, depending on the size of the model.”
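Because the engine and its source are open, SWMM can be scripted as well as run interactively. One open-source route, not mentioned in the article, is the pyswmm Python wrapper around the SWMM engine; assuming a SWMM input file and a node named “J1” (both placeholders for your own model), a long-record run reduces to a short loop:

```python
from pyswmm import Simulation, Nodes  # pip install pyswmm

# Step through an entire simulation period, tracking peak depth at one node.
# "model.inp" and node "J1" are placeholders for your own SWMM model.
with Simulation("model.inp") as sim:
    junction = Nodes(sim)["J1"]
    peak_depth = 0.0
    for _ in sim:  # each iteration advances the engine one routing step
        peak_depth = max(peak_depth, junction.depth)
    print("Peak depth at J1:", peak_depth)
```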

Reality Check
“All models are wrong; some are useful,” says Barr, quoting the modeling text Empirical Model-Building and Response Surfaces by G. E. P. Box and N. R. Draper. Vasarhelyi expresses the sentiment another way: “All models are bad, but some are less bad than others.”

All models are based on numerical equations; that is both their strength and their weakness. The physical phenomena models represent have been distilled, if not to their essence, at least to the most relevant factors that can be reliably represented in mathematical terms.

Barr elaborates on these strengths and weaknesses in general terms: “Far too often in engineering, numerical models are held as the standard of truth when describing physical phenomena.”

But, he says, “They are approximations of reality and are only as good as the basic hypothesis upon which the model equations are developed. This is why it is essential for an engineer to understand the model and not treat it as a ‘black box’ that gives answers based upon the input. For an engineer to trust the solutions provided by a model, it must be demonstrated that the simplifications made by the model are reasonably close to the reality of the situation being analyzed.”

Vasarhelyi says that although engineers need not know the source code of a model, it is critical that they understand the physics of the water and how the models they use work. “You need to know the formulas and the different parameters” that can be represented in them, such as the shape of a channel, the roughness of the channel bed, the character of the soil or vegetation, and other conditions of a site that could be described within the model.

Barr says it’s also important to know the right questions to ask a model. “It is essential to know what is needed before a model can be selected. For example, in a hydrologic analysis, if the engineer wants to have a peak flow at the outlet of a small watershed, then it may be sufficient to use the Rational Method rather than a more detailed rainfall runoff model; after all, the engineer does not require the hydrograph shape or total flow volume. However, if the engineer desires to understand hydrograph attenuation in a watershed with multiple reservoirs, then the question requires a detailed hydrologic/hydraulic model. Without knowing the real question to be answered, the engineer is at risk of using either an overly complex model when it is not needed, or worse, expecting a model to provide an answer to a question it is not capable of answering.”
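Barr’s Rational Method example is simple enough to state in full. In US customary units the method reduces to Q = CiA, with the unit conversion factor (about 1.008) conventionally taken as 1; the site values below are invented for illustration.

```python
def rational_peak_flow_cfs(runoff_coeff, intensity_in_hr, area_acres):
    """Rational Method peak flow: Q = C * i * A (US customary units).
    The acre-inch/hour to cfs conversion factor (~1.008) is taken as 1."""
    return runoff_coeff * intensity_in_hr * area_acres

# e.g., a 20-acre commercial site, C = 0.6, 10-year design intensity 2.5 in/hr
print(rational_peak_flow_cfs(0.6, 2.5, 20.0))  # -> 30.0 cfs
```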

Finally, a model’s work is not complete until a human being steps up to put the results in context. Barr explains, “It’s rare that I find you can take the results just as they come out of the model and apply them directly.

“One example is in river flood models. Often a 1D model is used to provide an estimate of flood levels and inundation area, but too often the black line drawn on a floodplain map is viewed as exact and that one side is flooded and the other is not. Also, in a 1D model, the flood elevation is prescribed as constant across a cross-section.”

Barr says taking raw modeling results literally in this way invites “a fallacy.” He says in stormwater modeling, practitioners should keep in mind that “the elevation computed is an average elevation and may in fact vary greatly across a wide floodplain. It is the engineer’s responsibility to interpret these results and communicate them to the end users.”

John Hopcroft, winner of the 1986 Turing Award, widely considered the Nobel Prize of computer science, has witnessed firsthand a quantum leap since he started working in the field of computing more than 50 years ago. Speaking last year at the Heidelberg Laureate Forum to some of the world’s top computer scientists and mathematicians, he said the question he and his colleagues strove to answer as they contemplated circuits and switches, algorithms and data, was “How can you make a computer useful?” Today, he says, computer scientists are teaming with practitioners from all walks of life and asking a very different question: “What can you use computers to do?” Creating better and better models of our world to help society work through important decisions is one increasingly visible answer.

Michael Tryby of EPA places the matter in perspective. He says a model is an abstraction providing a numerical representation of a physical phenomenon. It is not a replica of reality, but rather, “It is a way of looking at reality. SWMM has a unique way of abstracting reality that works well for stormwater.”

Navigating Digital Dilemmas

This past summer at the Heidelberg Laureate Forum in Heidelberg, Germany, I met and spoke with some of the world’s top experts in computer science and math as they shared their ideas with 200 of the most promising young researchers in engineering, math, and the sciences.

But it wasn’t all shop talk about solving the trickiest equations or the most difficult algorithms; in fact, a few of the esteemed laureates offered their own insights on how to help society get through life and work in a world increasingly dominated by digital technology.

Manuel Blum, a Turing Award winner in 1995 for work in computational complexity theory, shared helpful hints anyone can use. Recognizing the challenge many of us who work with computers face with online security, he suggested that anyone who uses the same password, or even similar passwords, for more than one website or vendor is, to put it in terms milder than his own, taking on needless risk.

Showing his concern over the dilemma of whether to trust our passwords to our memory or our hiding places, he shared his own hack-proof method of devising innumerable secure passwords without the need to write down or memorize a single one of them.

Using a system he devised, which he calls human computable protocols, all it takes is a few simple mental games transposing letters, numbers, and secretly imagined objects to create an easily remembered virtual mental machine that can generate and recall an unlimited supply of fresh, uncrackable passwords on demand. He says anyone using his technique will greatly enhance their security when accessing the Web or mobile devices or when interacting with applications on the cloud.

Vinton Cerf, known along with Robert Kahn as one of the two “Fathers of the Internet,” shared the 2004 Turing Award with Kahn. In contrast to the backpacks, sweaters, and jeans predominant among the engineers and math scholars in the crowd, Cerf walked onto the podium outfitted in a natty three-piece suit accented by a silk necktie with a classically impeccable Windsor knot. In spite of his cheerful persona and social grace (it seemed over the course of the week-long Forum he’d probably conversed, at least briefly, with each of the 200 participants), from the podium he made a gloomy prediction, introducing Forum participants to the dismal specter of a looming digital dark age.

Now holding the title of Google’s chief Internet evangelist, as his business card reads, Cerf has seen, with increasing frequency, new generations of operating systems, hardware, and software rapidly replace one another, driving competitors to extinction with each new advance of technology. Obviously he’s not opposed to advances in technology. However, he says, the downside of these advances is that no one has taken any systematic measures to ensure that the computers of the future have any means to extract the data on, by then, abandoned and defunct operating systems and formats used by today’s technologies. Cerf believes, furthermore, that intellectual property rights and market forces encourage software and hardware providers to maintain secrecy about what goes on inside their black boxes (so that no one can duplicate their machines without paying their price) right up until they happen to get blindsided by the next generation of wizardry.

Cerf warns that, as a result, when today’s data systems or formats vanish from the scene, they’ll potentially take massive chunks of data and even parts of history with them. While some of that data may be recoverable as raw strings of numbers, he says, certain types of information, which he terms metadata (essential to modeling), would be exceedingly difficult to reconstruct. Metadata tell the computer how to handle the strings of numbers in its memory, indicating, for example, whether they represent temperature data, rainfall records, pressures, or formatting for spreadsheets and tables.

And he’s seen it happen firsthand. For instance, the early correspondence between the collaborators who shaped the Internet is now lost forever, partly because the teletype machines on which the messages were transmitted no longer exist and cannot be resurrected. He meekly admits some complicity. At the time, not thinking it would be such a big deal, Cerf and his fellow young researchers on the DARPA (Defense Advanced Research Projects Agency) project that resulted in the Internet routinely disposed, with little ceremony, of teletype notes that now would probably be considered historic treasures.

He has proposals for two possible solutions, one technical and one political. Cerf imagines a new kind of self-reading instruction set that would be embedded with each piece of data entered into computers, indicating to machines of the future, no matter what shape they may take, how the data are to be decoded. At the moment no one knows how to do that. The other solution he proposes would be a political mandate that functional specimens of computer hardware, operating systems, and programs be held in escrow by some permanent agency, before they go extinct, in a sort of archival digital zoo, ready if called upon some time in the future to unravel mysterious data lurking on some long-forgotten gizmo. Politics being what they are today, no one knows how to do that yet either.

About the Author

David C. Richardson

David C. Richardson is a frequent contributor to Forester Media publications.