Recent U.S. Geological Survey research has found that natural biochemical processes in water moving back and forth between a stream and its underlying sediment were significant in removing nitrate from streams in the Illinois River basin, one of the world’s most intensively farmed regions.
The USGS study in a nitrogen-polluted stream found that the flow of streamwater through a very thin zone of sediment enhances chemical reactions that decrease nitrate delivery to coastal areas where nitrogen fuels formation of hypoxic "dead zones."
"One of the thorniest issues in the overall quality of our Nation's waters is relatively high levels of nitrates and other nutrients in many of our streams and rivers,” said Lori Caramanian, Department of the Interior Deputy Assistant Secretary for Water and Science. "A better understanding of the natural processes that reduce nutrients in our streams and rivers will help us mange our waterways in a more effective manner."
Beneath all streams and rivers is a shallow layer of sediment that is permeated by water exchange across the sediment surface. This boundary between the world of earth and water in streams is referred to by scientists as the "hyporheic" zone, from Greek words meaning "under the flow." The hyporheic zone can be thought of as the stream's "skin," since it serves vital functions such as the removal of dissolved and particulate contaminants being transported by the stream.
Previous research under laboratory conditions established that hyporheic flow should be critical to sparking reactions that improve stream water quality, but field studies have generally been unable to quantify its contribution to decreasing the delivery of contaminants to sensitive downstream waters.
This field study determined that a very thin skin, a mere four centimeters (1.6 in.) of sediment, was effective in removing nitrate from streams of the Illinois River basin during late summer. The crucial investigative approach was labeling in-stream nitrate with an isotopic tracer that could be followed at very fine scales in the sediment and simultaneously tracked for kilometers downstream.
The study scientists found that hyporheic flow increased nitrate removal by renewing the supply of dissolved organic carbon and nitrate to specialized bacteria in the sediment that performed denitrification, a reaction that converts dissolved nitrate to gaseous nitrogen and so removes nitrate permanently from flowing water.
The top four centimeters of sediment had the greatest abundance of denitrifying bacteria, in addition to the highest levels of hyporheic flow. Sediment properties in this thin layer were also conducive to the formation of oxygen-free micro zones that are required for the reaction to take place.
"USGS hydrologic research is focused on, among other things, improving our understanding of the biochemical processes at work in our waterways so that we can provide policy makers with information that will lead to better informed decisions." observed Jerad Bales, USGS Acting Associate Director for Water. "This work is an excellent example of how science is critically important for effectively addressing the one of the important environmental issues of our time."
Stream restoration is a billion-dollar industry in the U.S., although its water quality benefits are not widely proven. Most restoration structures are designed in a manner that creates relatively deep hyporheic flow, which, this study demonstrated, adds only minimally to hyporheic exchange and nitrogen removal in comparison to shallow hyporheic flow operating alone.
The study suggests how restoration structures might be modified to protect naturally functioning hyporheic zones and how hyporheic flow could be increased in order to stimulate greater removal of stream nitrate by denitrification.
These findings have immediate importance to the U.S. Environmental Protection Agency’s ongoing effort to evaluate federal jurisdiction in headwater streams, ponds, and wetlands where processes such as hyporheic flow may positively influence water quality and deliver additional benefits to downstream ecological health and recreational values of rivers and estuaries.
The study was published in the October 2013 edition of Water Resources Research. The findings were presented December 11 at the fall meeting of the American Geophysical Union.
The Department of the Interior's U.S. Geological Survey (USGS) and NASA presented the 2013 William T. Pecora Award for achievement in Earth remote sensing to Dudley B. Chelton, distinguished professor of Earth, Ocean, and Atmospheric Sciences at Oregon State University, Corvallis.
Chelton was recognized for his contributions to ocean remote-sensing science, education, and applications. The award was presented Wednesday by Suzette Kimball, USGS acting director, and Michael Freilich, director of the Earth Science Division in NASA's Science Mission Directorate, at the American Geophysical Union meeting in San Francisco.
The Department of the Interior and NASA present the Pecora Awards to honor outstanding contributions in the field of remote sensing and its application to understanding Earth. The award was established in 1974 to honor the memory of William T. Pecora, former USGS director and Interior undersecretary. Pecora was influential in the establishment of the Landsat satellite program, which created a continuous, 40-plus-year record of Earth's land areas.
"Every year the Pecora Award signifies the very high value that both the USGS and NASA place in observing Earth from space," said Kimball. "As our natural resources around the world continue to be stressed by a growing population and changing climate, it is more critical than ever that we have an objective, comprehensive view of the changes happening to our planet."
Chelton is a pioneer in the oceanographic use of satellite data to explore the role of the ocean in the Earth's climate system. His work has led to new hypotheses in ocean studies and has inspired many follow-up investigations by the ocean remote-sensing community, increasing the practice and appreciation of ocean remote-sensing.
"Throughout his career, Dudley has been known for developing statistical methods to analyze existing satellite data while preparing for the next generation of remote-sensing instruments," said Freilich.
After receiving a Ph.D. in physics from the University of Colorado, Boulder, he moved to NASA's Jet Propulsion Laboratory in 1980 to analyze newly available data from Seasat. His 1981 paper in Nature demonstrated the ability of satellite instruments to make global observations of the ocean. Chelton moved to Oregon State University in 1983 where he established an ocean remote-sensing program that has grown into national prominence.
The comprehensive understanding of the technical and statistical aspects of ocean remote-sensing serves as the foundation of Chelton's major scientific discoveries. For over thirty years, he has led efforts to improve satellite-derived measurements of the four primary ocean variables that can be sensed remotely: sea surface height, surface winds, sea surface temperature, and ocean surface biological productivity.
Chelton is a Fellow of the American Geophysical Union and the American Meteorological Society and received a NASA Public Service Medal. Many of his 110 papers and book chapters have become standard references in his field.
Reporters: An example map (Yellowstone National Park summer temperature data) from the viewer is available at the end of this release, but you can also find your county's data here.
For the first time, maps and summaries of historical and projected temperature and precipitation changes for the 21st century for the continental U.S. are accessible at a county-by-county level on a website developed by the U.S. Geological Survey in collaboration with the College of Earth, Oceanic and Atmospheric Sciences at Oregon State University.
The maps and summaries are based on NASA downscaling of the 33 climate models used in the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and the current Intergovernmental Panel on Climate Change (IPCC) Assessment Report. The resulting NASA dataset is on an 800-meter grid with national coverage.
The USGS leveraged this massive dataset and distilled the information into easily understood maps, three-page summaries, and spreadsheet-compatible data files for each state and county in the United States. A similar implementation for the USGS nested hydrologic units will be available in the next month.
"This product is innovative, user-friendly and invaluable for assessing and understanding climate model simulations of local and regional climate and climate change whether you’re a policy maker, a manager, a planner, an educator or another engaged U.S. citizen," said Matthew Larsen, associate director for the USGS Climate and Land Use Program. "The maps and summaries at the county level condense a huge volume of data into formats that are informative for planning, teaching, adaptation and mitigation purposes."
USGS scientists Jay Alder and Steve Hostetler, who designed and implemented the project as part of their other efforts at visualizing climate models, noted that users can not only view the county average of all of the climate models, but can also select individual models to see how they compare or differ.
To make the number of permutations more manageable for the viewer, Alder and Hostetler averaged the data for the historical period and two future IPCC climate scenarios into 25-year periods (1980-2004, 2025-2049, 2050-2074 and 2075-2099) that span the 21st century. Absolute values and changes in temperature and precipitation for these periods are accessible through the viewer. Other useful tools for characterizing climate change include plots of monthly averages of temperature and precipitation, time-series spanning 1950-2099, and tables that summarize possible changes in the extremes of temperature and precipitation.
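As a rough sketch of the kind of binning described above (the function and names here are hypothetical, not the USGS implementation), a model year can be mapped to its 25-year averaging period like this:

```python
# Illustrative sketch only: map a model year to one of the viewer's
# 25-year averaging periods (1980-2004, 2025-2049, 2050-2074, 2075-2099).
PERIODS = [(1980, 2004), (2025, 2049), (2050, 2074), (2075, 2099)]

def averaging_period(year):
    """Return the (start, end) period containing `year`, or None for gap years."""
    for start, end in PERIODS:
        if start <= year <= end:
            return (start, end)
    return None  # e.g., 2005-2024 falls between the historical and future windows

print(averaging_period(2060))  # (2050, 2074)
```

Note the deliberate gap: years between the historical window and the first future window belong to no averaging period.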
"We believe that this product will be useful for a variety of purposes," Alder said. "For example," he said, "farmers and land managers can use the information to help them think about adaptation and mitigation strategies, or educators can use it to teach students about aspects of climate model simulations that underpin IPCC Assessment Reports."
The maps and summaries are available here.
More information about USGS Climate and Land-Use Research is available here.

Example of the web application displaying changes in maximum summer (July) temperature for Park County, WY (home of Yellowstone National Park). The time-series chart below the map displays two emission scenarios, RCP 8.5 ("business as usual") and RCP 4.5 ("greenhouse gas reduction/remediation"), from 1950-2100. By the end of the century, the maximum temperature in Park County is projected to warm by 7.5 °C (13.5 °F) under the RCP 8.5 scenario and 3.9 °C (7.0 °F) under RCP 4.5.
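The Celsius and Fahrenheit figures in the caption can be checked with simple arithmetic: a temperature *change* converts by a factor of 9/5 only, with no 32-degree offset (this snippet is illustrative, not part of the USGS product):

```python
# Convert a temperature CHANGE (delta) from Celsius to Fahrenheit.
# Unlike absolute temperatures, deltas use only the 9/5 scale factor.
def delta_c_to_f(delta_c):
    return delta_c * 9.0 / 5.0

print(round(delta_c_to_f(7.5), 1))  # 13.5 (RCP 8.5 projection)
print(round(delta_c_to_f(3.9), 1))  # 7.0  (RCP 4.5 projection)
```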
Communities and coastal habitats in the southern Chesapeake Bay region face increased flooding because, as seawater levels are rising in the bay, the land surface is also sinking. A new USGS report released today concludes that intensive groundwater withdrawals are a major cause of the sinking land, or 'land subsidence,' that contributes to flooding risks in the region.
"From a practical viewpoint, sea level is relative to the land surface," said Jerad Bales, Acting Associate Director for Water at USGS. “Whether the water is rising or the land is sinking, or both, the effect is the same: greater vulnerability to coastal storms and loss of important coastal habitat, both of which result in economic losses."
The new study presents a variety of data and findings from previous studies to examine land subsidence in the southern Chesapeake Bay region.
Previous USGS studies established that the Chesapeake Bay region has the highest rates of relative sea-level rise on the East Coast. Sea-level rise rates around the Chesapeake Bay range from 3.2 to 4.7 mm per year, with 4.4 mm per year at Norfolk. (A penny is about 1 mm thick.) Land subsidence alone causes more than half of the observed relative sea-level rise in the southern Chesapeake Bay.
While there are several factors influencing land subsidence, aquifer system compaction, caused by extensive groundwater pumping in the Virginia Coastal Plain, is a major cause in the Norfolk area. Land subsidence has occurred around Norfolk at an average rate of 3 mm/year since 1940.
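A back-of-envelope check of the figures above (illustrative only, using the ~3 mm/yr Norfolk subsidence rate against the 4.4 mm/yr relative sea-level rise reported for Norfolk) bears out the "more than half" claim:

```python
# Rough share of relative sea-level rise attributable to subsidence at Norfolk,
# using the rates quoted in the text. Illustrative arithmetic only.
subsidence_mm_per_yr = 3.0
relative_slr_mm_per_yr = 4.4

share = subsidence_mm_per_yr / relative_slr_mm_per_yr
print(f"subsidence share: {share:.0%}")  # about 68%
```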
Low-lying communities and critical habitats in the Chesapeake Bay region are especially vulnerable to damage from the relative sea-level rise caused by land subsidence. Communities in the southern Bay can experience increased flooding. The loss of coastal marsh and wetlands decreases the extent of specific habitat that waterfowl need to winter in the Bay region.
The report suggests that changing groundwater management practices could slow or mitigate land subsidence and relative sea-level rise. Moving groundwater pumping away from high-risk areas or decreasing groundwater withdrawal rates can reduce subsidence in low-lying areas prone to flooding. These results will be used by federal and state managers to consider adaptation strategies in their efforts to restore and protect the Chesapeake Bay.
Continued monitoring, mapping, and modeling are scientific tools needed to help natural resource managers and urban planners understand and reduce or mitigate land subsidence.
Changing resource management practices in response to rising seas and sinking land will require sustained public commitment.
Sea Level Rise Accelerating in U.S. Atlantic Coast (USGS release, 6/24/2012)
SAN FRANCISCO — The U.S. Geological Survey participates in the American Geophysical Union's fall meeting with hundreds of technical presentations. Below are some highlights of USGS science at AGU this year. Highlights of the technical sessions are presented in chronological order with session numbers and room numbers in San Francisco's Moscone Convention Center (either Moscone South, MS, or Moscone West, MW). For more information, visit the AGU Fall meeting website.
News media representatives are invited to visit the USGS booth in the AGU Exhibit Hall. This is an easy place to connect with USGS data, publications, and information. Please contact Leslie Gordon to arrange for an interview with the USGS scientists.
News Conferences – Moscone West, Room 3000, Level 3
Dynamic Mars from Long-Term Observations
Tuesday, 12/10, 11:30 a.m. – Participating USGS Scientist Colin Dundas
Associated oral session with USGS Scientist Colin Dundas
Observations of Ice-Exposing Impacts on Mars over Three Mars Years
Wednesday, 12/11, 9:20 a.m., MW 2022/P31C-07
Titan as You've Never Seen it Before
Thursday, 12/12, 11:30 a.m. – Participating USGS Scientist Randolph Kirk
Associated oral session with USGS Scientist Randolph Kirk
Cassini RADAR Observes Titan’s Kraken Mare, The Largest Extraterrestrial Sea
Friday, 12/13, 11:05 a.m., MW 2007/P52B-04
Public Lecture -- Sunday
Sunday, 12/8, 12:00 p.m. – MS 102
Free Public Lecture - Imagine an America without Los Angeles: Natural Hazards and the Complexity of Urban America
USGS Scientist: Lucy Jones
Lucy Jones will discuss how science can improve society’s resiliency to earthquakes. Free and open to the public.
Technical Sessions -- Monday
Monday, 12/9, 8:00 a.m. – MS Poster Hall
Influence of Older Structure on Quaternary Faulting in Northeastern California
USGS Scientist: Vicki Langenheim
Geologically young faulting and volcanism may be influenced by a concealed crustal structure between Mt. Shasta and Lassen Peak. This structure is revealed by tiny perturbations in the Earth's gravity and magnetic fields caused by differences in rock density and magnetization.
Monday, 12/9, 8:15 a.m. – MW 2004
Deep Soil Carbon and Vulnerabilities to Anthropogenic Change
USGS Scientist: Jennifer Harden
Soils store large amounts of organic carbon (C) and thus help regulate greenhouse gases and temperatures in Earth's atmosphere. Land use change and rapid warming now influence the capacity of soils to actively store carbon. Scientists explore basic principles of soil formation and C cycling in order to understand how soils will respond to anthropogenic change.
Monday, 12/9, 1:40 p.m. – MS Poster Hall
Science For Decision-Makers: Climate Change Indicators For The North-Central California Coast And Ocean
USGS Scientist: Tom Suchanek
Ocean climate indicators were developed in a project based at NOAA’s Gulf of the Farallones National Marine Sanctuary for the North-central California coast and ocean, from Año Nuevo to Point Arena, including the Pacific coastline of the San Francisco Bay Area. These represent the first regional ocean climate indicators in the National Marine Sanctuary System. The indicators were developed in collaboration with over 50 regional research scientists and resource managers representing federal and state agencies, research universities and institutions, and non-governmental organizations.
Monday, 12/9, 1:40 p.m. – MS Poster Hall
Comparison of Nutrient Sources in a Former Salt Pond Under Restoration
USGS Scientist: Brent Topping
Nutrient level fluctuations can disturb an ecosystem, and a key monitoring question during wetland restoration efforts is nutrient flux and transport. With the implementation of the South Bay Restoration Program in 2008, water quality in the Alviso Salt Ponds, California, has been monitored to document the effects of changing hydrologic connections among the ponds and the adjacent pond, slough and estuary. Ongoing research is shedding light on how bottom transport may be an important movement mechanism for both nutrients and toxicants in a rebuilding ecosystem.
Tuesday, 12/10, 9:15 a.m. – MW 3016
Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada
USGS Scientist: Richard Niswonger
Using a new-generation, linked surface- and groundwater-flow model, we examine impacts of climate changes and extremes in the Lake Tahoe basin. Climatic impacts are simulated in terms of water-availability and flood responses to selected climate-change projections and to an extreme ("ARkStorm") storm scenario and its resulting floods.
Tuesday, 12/10, 9:45 a.m. – MW 2003
Predicting Barrier Island Evolution Through Numerical-Model Scenarios
USGS Scientist: Nathaniel Plant
Prediction of barrier island evolution using numerical models can explain which processes, natural or human, are most important to long-term changes that affect future vulnerability to storms, sea-level rise, and human modification. Scientists will show numerical simulations of processes that transport sand along and across a barrier island during storms.
Tuesday, 12/10, 11:05 a.m. – MW 2000
A Global Perspective on Warmer Droughts as a Key Driver of Forest Disturbances and Tree Mortality
USGS Scientist: Craig Allen
Global warming and droughts are causing greater forest-water stress across large regions, and amplifying forest disturbances, particularly drought-induced tree mortality, wildfire, and insect outbreaks. Emerging global-scale patterns of drought- and heat-induced forest die-off are presented, including a newly updated map overview of documented die-off events from around the world, demonstrating the vulnerability of all major forest types to forest drought stress, even in typically wet environments.
Tuesday, 12/10, 1:40 p.m. – MS 103
Recent Microscopic Imager Results from Opportunity
USGS Scientist: Ken Herkenhoff
Exploration of Endeavour crater by the Mars Exploration Rover Opportunity continues, with the rover approaching more exposures of clay minerals detected from orbit; the latest Microscopic Imager results will be presented.
Tuesday, 12/10, 1:40 p.m. – MS Poster Hall
Magnetic Tides of Honolulu
USGS Scientists: Jeffrey Love, E. Joshua Rigler
Geomagnetic tides are time-periodic variations in the Earth's magnetic field. Using almost a century of magnetic observatory data collected by the USGS in Honolulu, Hawaii, we analyze magnetic tides caused by the relative motion and interaction of the Earth, Moon, and Sun, and the sunspot solar cycle.
Tuesday, 12/10, 1:40 p.m. – MS 103
Limits of Statistical Climate-fire Modeling: What Goes Up Must Come Down
USGS Scientist: Jeremy Littell
Climate affects wildfires, but “how” varies across ecosystems. Water balance (water surplus and drought) characterizes these effects, and scientists used it to project how fire could change under climate change. Will the whole West burn up? In some forests, it might appear so, but the whole story is more nuanced.
Tuesday, 12/10, 2:10 p.m. – MS 103
Different Climate–Fire Relationships on Forested and Non-Forested Landscapes in California
GC23G-03/ Oral presentation
USGS Scientist: Jon Keeley
Although wildfire activity is expected to increase due to global warming and other climate changes in the future, this study shows it is more complicated than a simple increase in fires with increased temperature. While climate will likely play an important role in determining fire regimes in the high elevation mountain forests, there is less evidence that it will alter fires at lower elevations. Future fires in California’s foothill and coastal environments will be affected by many global changes, particularly increases in human populations.
Tuesday, 12/10, 2:55 p.m. – MS 103
Can climate change increase fire severity independent of fire intensity?
USGS Scientist: Phillip van Mantgem
Regional warming may be linked to increasing fire size and frequency in forests of the western United States. Recent studies have also suggested that warming temperatures are correlated with increased fire severity (post-fire tree mortality), though the precise mechanism is unclear. Our research presents evidence that trees subject to environmental stress are more sensitive to subsequent fire damage. (see related news: http://www.usgs.gov/newsroom/article.asp?ID=3649)
Tuesday, 12/10, 3:25 p.m. – MW 3002
Are Large-scale Manipulations of Streamflow for Ecological Outcomes Effective Either as Experiments or Management Actions?
USGS Scientist: Chris Konrad
Water managers increasingly address ecological sustainability as part of dam operations. Dam releases for ecological outcomes have been practiced for over half a century to improve ecological conditions in rivers and estuaries. A review of more than 100 large-scale flow experiments evaluates their effectiveness for learning how to achieve sustainable water management.
Wednesday, 12/11, 8:00 a.m. – MS Poster Hall
Surprise and Opportunity for Learning in Grand Canyon: The Glen Canyon Dam Adaptive Management Program
USGS Scientist: Ted Melis
Flow experiments from Glen Canyon Dam since 1990 have informed federal managers trying to mitigate peak water flow impacts on Colorado River resources. Results were not predicted, but were "surprise" learning opportunities for adaptive river managers. Major uncertainties remain about the influence of global warming on the river's native fish and beaches.
Wednesday, 12/11, 11:20 a.m. – MS 307
Missing Great Earthquakes
USGS Scientist: Susan Hough
The past decade has witnessed an apparent bumper crop of great earthquakes, with a total of six events above M8.5. The best available historical catalogs reveal only seven M≥8.5 earthquakes during the entire 19th century. Although the average long-term rate of global great earthquakes remains uncertain, one can show that great earthquakes are missing from and/or underestimated in the best-available historical catalogs. Since the largest known earthquakes in many regions occurred before seismometers were developed around 1900, some of our estimates of the largest possible magnitudes are likely too low. This suggests that so-called black swan events like the 2011 Tohoku, Japan, earthquake, while still not commonplace, are not such rare beasts after all.
Wednesday, 12/11, 11:20 a.m. – MW 3003
An Integrated, Indicator Framework for Assessing Large-Scale Variations and Change in Seasonal Timing and Phenology
USGS Scientist: Julio Betancourt
As part of the National Climate Assessment's Indicator System, the Seasonality and Phenology Indicators Technical Team proposed a framework for tracking variations and trends in seasonal timing of surface climate, snow and ice, vegetation green-up and flammability, and bird migration across the U.S. These national indicators are measured by day-of-year, number of days, or latitude of observation at a given date.
Wednesday, 12/11, 1:40 p.m. – MS Poster Hall
Tracking Hydrothermal Feature Changes in Response to Seismicity and Deformation at Mud Volcano Thermal Area, Yellowstone
USGS Scientist: Angie Diefenbach
Mapping 50 years of surficial change at the Mud Volcano thermal area in Yellowstone, using readily accessible archives of aerial photographs from several federal agencies, gives scientists a better understanding of the links between seismicity and deformation episodes and increased heat and gas emissions at thermal areas.
Wednesday, 12/11, 1:55 p.m. – MW 3009
Influences on the Morphologic Response to Hurricane Sandy: Fire Island, NY
USGS Scientist: Cheryl Hapke
Hurricane Sandy fundamentally altered the geomorphology of Fire Island, NY. Changes included severe beach erosion, razing of the dunes, extensive overwash and breaching of the island. The response during Sandy varied considerably along the island and appears to be largely controlled by the local geology (associated poster session Monday, 12/9 at 1:40 p.m. – MS Poster Hall).
Wednesday, 12/11, 3:10 p.m. – MW 3009
Sandy-related Morphologic Changes in Barnegat Bay, NJ
USGS Scientist: Jennifer Miselis
Estuaries are some of the most productive habitats in the world. Biological, chemical, and physical estuarine processes are influenced by changes in depth and sediment composition, but storm-related changes are rarely measured. Our study integrates airborne and boat-based sensors and sampling to understand estuarine changes caused by Superstorm Sandy.
Thursday, 12/12, 8:00 a.m. – MS Poster Hall
Fog as an ecosystem service in northern California
USGS Scientist: Alicia Torregrosa
Humans benefit greatly from the cooling provided by coastal fog, for example through fewer hospital visits and emergency-response requests among heat-stress-vulnerable population sectors and decreased energy consumption during periods when summer maximum temperatures are lower than normal. The thermal relief provided by summertime fog and low clouds is equivalent in magnitude to the temperature increase projected by the driest and hottest of regional downscaled climate models using the A2 ("worst case") IPCC scenario. Extrapolating these thermal calculations can facilitate future quantification of the ecosystem service provided by summertime low clouds and fog.
Thursday, 12/12, 8:00 a.m. – MS Poster Hall
SAFRR Tsunami Scenario: Economic Impacts and Resilience
USGS Scientist: Anne Wein
The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an M9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We provide an overview of the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. Scenario users are those who must make mitigation decisions before, response decisions during, and recovery decisions after future tsunamis.
(associated oral presentations on Friday, 12/13 starting at 5:30 p.m. – MS 309)
Thursday, 12/12, 8:00 a.m. – MS Poster Hall
Multi-Temporal Harmonization of Independent Land-Use/Land-Cover Datasets for the Conterminous United States
USGS Scientist: Chris Soulard
USGS Land Change research aims to extend LULC change monitoring beyond 1973-2000 to more recent dates, without resource-intensive manual interpretation. We leveraged a range of existing LULC products and improved LULC classification by identifying agreement between datasets. This process, termed harmonization, has proven to be a cost efficient way to create reliable LULC maps.
Thursday, 12/12, 8:00 a.m. – MS Poster Hall
Megasplash at Lake Tahoe
USGS Scientist: Jim Moore, Richard Schweickert (Univ. of Nevada)
One of the largest landslides on the continent occurred in Lake Tahoe 12,000 to 21,000 years ago. Backwash from the gigantic splash caused by the 2.5 cubic-mile landslide formed major tsunamis. This backwash was equivalent to 15 major rivers flowing into the lake at the same time, and would have decimated life in the splash zone surrounding the lake.
Thursday, 12/12, 8:45 a.m. – MW 3001
Integrated Climate/Land Use/Hydrological Change Scenarios for Assessing Threats to Ecosystem Services on California Rangelands
USGS Scientist: Kristin Byrd
Scientists have developed integrated climate/land use/hydrological change scenarios for assessing threats to ecosystem services on California rangelands. Model outputs quantify the impact of urbanization on water supply and show the importance of soil storage capacity. Scenarios have applications for climate-smart conservation and land use planning.
Thursday, 12/12, 9:15 a.m. – MS 307
Understanding the Largest Deep Earthquake Ever Recorded
USGS Scientist: Robert Graves, Shengji Wei (Caltech)
In May 2013 a M8.3 earthquake ruptured beneath the Sea of Okhotsk at a depth of 610 kilometers, far below the Earth's crust. The entire earthquake sequence took just 30 seconds with energy released in four major shocks. This suggests that deep earthquakes are more efficient in dissipating stress than shallow earthquakes.
Thursday, 12/12, 11:20 a.m. – MW 3007
High Resolution Space-Time Analysis of Ice Motion at a Rapidly Retreating Tidewater Glacier
USGS Scientist: Shad O’Neel
Rapid changes to rates of sea level rise are forced in large part by tidewater glacier dynamics. With unprecedented detail, we analyze discharge from Alaska’s Columbia Glacier supporting other lines of evidence that the retreat has peaked and is now declining, suggesting regional ice mass loss rates may also decrease.
Thursday, 12/12, 3:25 p.m. – MS 308
Detecting Deep Crustal Magma Movement: Exploring Linkages Between Increased Gas Emission, Deep Seismicity, and Deformation Prior to Recent Volcanic Activity
USGS Scientist: Cynthia Werner
In 2003, deep long-period earthquakes, CO2 emissions, and surface uplift were described as three 'promising indicators' of deep magmatic processes. Now, ten years later, new data suggest that the combination of very subtle changes in these parameters can indeed help scientists understand and predict changes in volcanic activity months in advance.
Thursday, 12/12, 5:00 p.m. – MS 307
Megacity Megaquakes: Two Near Misses, and the Clues they Leave for Earthquake Interaction
USGS Scientist: Ross Stein
Two recent mega-earthquakes, a M8.8 earthquake off the Chilean coast and a M9.0 earthquake off the coast of Japan, resulted in a large number of fatalities. Even though the capital cities of Santiago and Tokyo escaped severe damage, the rate of lesser shocks beneath each city jumped by a factor of about 10 following each megaquake. What does this portend for the likelihood of future large earthquakes? Are these really aftershocks, and are large shocks more probable now than before the mega-earthquakes?
Thursday, 12/12, 5:30 p.m. – MS 305
Crowd-Sourcing for Earthquake Monitoring and Rapid Response
USGS Scientist: Sarah Minson
Earthquake early warning systems are being implemented in select locations. Expansion to high-risk regions lacking seismic infrastructure, however, is cost-limited. Scientists demonstrate that a stand-alone system comprising cell-phone quality GPS stations is inexpensive enough to be implemented globally and accurate enough to provide early warning of large earthquakes and tsunami.
Thursday, 12/12, 5:45 p.m. – MS 309
Six Large Tsunamis in the Past ~1700 years at Stardust Bay, Sedanka Island, Alaska
USGS Scientist: Robert Witter
On a small island facing the Aleutian-Alaska subduction zone, the 1957 Andreanof Islands tsunami deposited beach sand and stranded drift logs 18 meters above sea level. Five older sand sheets suggest great earthquakes along this part of the Aleutian megathrust generate Pacific-wide tsunamis on average every 325 years. Intriguingly, the age of the predecessor of the 1957 tsunami overlaps the time of unusual marine flooding on Kaua'i about 400 years ago.
Friday, 12/13, 8:00 a.m. – MS Poster Hall
Recent Applications of Continental-Scale Phenology Data for Science, Conservation, and Resource Management
USGS Scientist: Jake Weltzin
Professional and “citizen” scientists are contributing data on seasonal plant and animal activity across the United States – as part of a national project called Nature’s Notebook – to inform science and natural resource management. Featured applications include a national index of spring and tools to support detection and eradication of invasive plants.
Friday, 12/13, 11:05 a.m. – MS 309
Community Vulnerability to Tsunami Threats in the U.S. Pacific Northwest
USGS Scientist: Nathan Wood
Coastal communities in northern California, Oregon, and Washington are classified based on similar characteristics of vulnerability to tsunamis associated with Cascadia subduction zone earthquakes. Research focuses on the number and type of at-risk individuals in hazard zones, including estimates of needed evacuation time. Results can be used to prioritize risk-reduction efforts that address common issues across multiple communities.
Friday, 12/13, 2:10 p.m. – MS 307
The Effect of Porosity on Fault Slip Mechanisms at East Pacific Rise Transform Faults: Insight From Observations and Models at the Gofar Fault
USGS Scientist: Emily Roland
East Pacific Rise transform (strike-slip) faults demonstrate significant variability along their length in their ability to generate large earthquakes. Using observations and models, scientists consider how changes in fault zone material properties, specifically porosity of fault zone rocks and pore fluid pressure, may influence rupture segmentation.
Friday, 12/13, 5:30 p.m. – MS 309
The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula
USGS Scientist: Stephanie Ross
The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an M9.1 earthquake occurring offshore from the Alaska Peninsula, and its impacts on the California coast. The scenario includes the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California. It provides a basis for improving tsunami preparedness, mitigation, and continuity planning, which can reduce damage and economic impacts and enhance recovery efforts.
Friday, 12/13, 5:45 p.m. – MS 309
Environmental and Environmental-Health Implications of the USGS SAFRR California Tsunami Scenario
USGS Scientist: Geoffrey Plumlee
The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an M9.1 earthquake occurring offshore from the Alaska Peninsula, and its impacts on the California coast. Environmental impacts from contamination, and the potential for human exposure to contaminants and hazardous materials, are an underappreciated tsunami hazard. Inundation-related damage to major ports, boat yards, and many marinas could release complex debris, crude oil, various fuel types, other petroleum products, liquid and dry bulk cargo, and diverse other pollutants into nearby coastal marine environments and onshore in the inundation zone.
Rising water temperatures as a result of climate change may harm already endangered or threatened native freshwater mussels in North America, according to a new U.S. Geological Survey report.
During laboratory tests, USGS scientists and partners found that the heart and growth rates of some species of young freshwater mussels declined as a result of elevated water temperatures, and many died. Freshwater mussels have been compared to the "canary in the coal mine" in that they are indicators of good water and sediment quality in U.S. rivers. They are also important in the aquatic food web, filter large amounts of water and suspended particles, and serve as food for other organisms. The study is published in the December issue of the journal Freshwater Science.
"Native freshwater mussels may be especially sensitive to climate change because of their patchy distribution, limited mobility, and dependence on host fish for their larval stage, as well as fragmentation of their ranges by habitat alteration," said Teresa Newton, USGS scientist and an author of the report. "Many species are currently in danger of extinction."
The scientists studied the effects of high water temperatures, ranging from 20-35 degrees Celsius (68-95 degrees Fahrenheit), on three species of two-month-old freshwater mussels: pink mucket, fat mucket and washboard. Temperatures at which at least 50 percent of the populations died after 28 days ranged from 25.3-30.3 degrees Celsius (about 78-87 degrees Fahrenheit). Heart rates in the pink mucket and washboard mussels declined significantly with increasing water temperature.
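As a back-of-the-envelope illustration of the kind of threshold reported here, the temperature at which at least half of a test population dies (often called the LT50) can be estimated by interpolating between the two observations that bracket 50 percent survival. The survival figures below are hypothetical, not the study's measurements:

```python
def lt50(temps_c, survival_fractions):
    """Interpolate the temperature at which survival crosses 50%."""
    points = list(zip(temps_c, survival_fractions))
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if s0 >= 0.5 >= s1:  # survival drops through 50% in this interval
            return t0 + (s0 - 0.5) * (t1 - t0) / (s0 - s1)
    return None  # 50% mortality never reached over the tested range

# Hypothetical 28-day survival fractions at four test temperatures (deg C)
temps = [20, 25, 30, 35]
survival = [0.95, 0.80, 0.40, 0.05]
print(round(lt50(temps, survival), 1))  # estimated LT50, deg C
```

With these made-up numbers the crossing falls between 25 and 30 degrees Celsius, squarely in the 25.3-30.3 degree range the study reports for real mussels.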
The observed effects may ultimately decrease biodiversity and cause a shift to more temperature-tolerant mussel species.
"Freshwater mussels are the most endangered group of organisms in the U.S. and in the world," Newton said. "More estimates of the upper thermal limits in native mussels are urgently needed to assess the potential effects of global climate change on native mussel populations."
Over 70 percent of North America’s 302 mussel species are imperiled or extinct. Declines in the abundance and diversity of these mussels have been attributed to a wide array of human activities that cause pollution, water-quality degradation and habitat destruction.
This research was supported by the USGS National Climate Change and Wildlife Science Center which provides scientific information to help land managers effectively respond to climate change.
Additional information on native freshwater mussels in the midwestern U.S. is available at the USGS Upper Midwest Environmental Sciences Center website.
The Alum Shale in Denmark contains an estimated mean of 6.9 trillion cubic feet of undiscovered, technically recoverable natural gas, according to a new report by the U.S. Geological Survey. This estimate comes from the first-ever USGS assessment of shale gas resources in Denmark.
The assessment's geological foundation was built on data provided by the Geological Survey of Denmark and Greenland (GEUS). The USGS released the assessment to GEUS at a meeting earlier this morning.
"This is a potential resource for Denmark, although there is no current production there," said USGS Acting Director Suzette Kimball. "The complicated geology in Denmark and the difficulty involved in assessing it really demonstrates how important it is to have a robust geologic model underpinning all of our assessments."
The Alum Shale is part of the Baltic Basin and comprises two assessment units, one onshore and one offshore. The offshore unit was estimated to contain a mean of 4.4 trillion cubic feet of gas, and the onshore unit a mean of 2.5 trillion cubic feet.
Using its geology-based methodology, the USGS team estimated recoverable gas resources to range from 0 to 13.3 trillion cubic feet of natural gas in the Alum Shale. The wide range in the estimate reflects the geological uncertainty inherent in this as-yet largely untested resource. A complicated geological history of burial, uplift, and erosion may have led to the loss of the natural gas over time, which contributes to the large uncertainty.
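As a quick consistency check on the figures above: mean estimates are additive across assessment units, so the basin-wide mean is simply the sum of the two unit means (percentile-based range endpoints, such as the 0 to 13.3 tcf range, generally are not additive).

```python
# Values from the release; means sum across assessment units.
offshore_mean_tcf = 4.4  # mean undiscovered gas, offshore assessment unit
onshore_mean_tcf = 2.5   # mean undiscovered gas, onshore assessment unit

total_tcf = offshore_mean_tcf + onshore_mean_tcf
print(round(total_tcf, 1))  # matches the reported basin-wide mean of 6.9
```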
In most areas of Denmark, the burial history of the Alum Shale resulted in temperatures consistent with the formation of oil; however, subsequent additional heating transformed the oil into natural gas. Thus shale oil is not expected from the Alum Shale.
Areas of the Alum Shale with potential for gas production are found beneath Jutland and the Island of Zealand, including the City of Copenhagen, and beneath parts of the North Sea, the Kattegat and the Baltic Sea near Bornholm.
This assessment is part of the USGS World Petroleum Project, in which the USGS is assessing conventional and unconventional formations to determine undiscovered, technically recoverable resource potential. Previously published USGS assessments of unconventional resources have included areas in Poland, India, China, and several countries in South America.
Continuous oil and gas, also sometimes referred to as unconventional, remains in or near the original source rock; instead of escaping the source rock and collecting in distinct accumulations like conventional oil and gas, it is dispersed unevenly over large geographic areas.
Technically recoverable oil resources are those producible using currently available technology and industry practices. USGS is the only provider of publicly available estimates of undiscovered technically recoverable oil and gas resources of the world.
To learn more about this or other geologic assessments, please visit the USGS Energy Resources Program website. Stay up to date with USGS energy science by subscribing to our newsletter or by following us on Twitter.
Since 1972, the Landsat program has allowed scientists and analysts to observe the world beyond the power of human sight, monitor changes to the land, and detect critical trends in the conditions of natural resources.
To learn more about who uses Landsat imagery and the value these users see in Landsat imagery, the U.S. Geological Survey analyzed responses to a survey of more than 40,000 individuals who accessed free Landsat images from the archive at the USGS Earth Resources Observation and Science (EROS) Center in Sioux Falls, S.D. Over 11,000 users responded to the survey.
Recently published in a USGS report, the survey findings demonstrate that a very wide range of customers use Landsat — from educators to Earth scientists, foresters to urban planners, agricultural managers and water users, and many more. These diverse users were surveyed about their specific utilization of Landsat imagery, as well as the impacts of doing without Landsat imagery and its value to each group.
"The value of Landsat's unique 40-year archive of Earth imagery is incalculable," said Anne Castle, Department of the Interior Assistant Secretary for Water and Science, who welcomed publication of the survey. "But with this study, we can begin to quantify the benefits of Landsat to the national economy and to its many users."
Respondents used Landsat imagery in 38 different primary applications, ranging from environmental sciences to agriculture to planning, administration of natural resources, and humanitarian aid. Three-quarters of respondents said the imagery is somewhat or very important to their work and stated that they were moderately or very dependent on Landsat imagery to do their jobs. Almost two-thirds of users reported that they would have to discontinue half of their work, on average, if new and archived Landsat imagery were unavailable.
The value of Landsat imagery was quantified through a contingent valuation method that estimates the aggregated annual economic benefits derived from the imagery. Based on the survey results, economists estimated the benefits from Landsat imagery distributed directly by the USGS in 2011 to be just over $1.79 billion for U.S. users and almost $400 million for international users, resulting in a total annual economic benefit of $2.19 billion. This estimate does not include benefits from further distribution and reuse of the imagery after it has been obtained from the USGS or from the use of value-added products derived from Landsat imagery.
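The headline aggregation is straightforward, since mean benefit estimates are additive; the sketch below uses the rounded values from the release ("almost $400 million" is taken here as $0.40 billion).

```python
# Rounded benefit estimates from the release, in billions of dollars.
us_benefit_b = 1.79    # annual benefit to U.S. users
intl_benefit_b = 0.40  # annual benefit to international users

total_b = us_benefit_b + intl_benefit_b
intl_share = intl_benefit_b / total_b

print(round(total_b, 2))     # total annual economic benefit, $ billions
print(round(intl_share, 2))  # international users' share of the total
```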
Landsat images are unique in that they provide complete global coverage, they span over 41 years of continuous Earth observation, and they are available for free to anyone in the world. No other satellite provides that combination of attributes.
The USGS report, "Users, Uses, and Value of Landsat Satellite Imagery—Results from the 2012 Survey of Users," is available online. The survey was the second completed as part of a larger study, which also includes a survey conducted in 2009. The Landsat program is jointly managed by USGS and NASA.
The U.S. Geological Survey (USGS) and NASA will host a public meeting on December 4 in which both agencies will provide details about how user needs will be assessed to help inform NASA's Sustainable Land Imaging Program. User requirements will be a key consideration in the design and implementation of future space-borne systems that are intended to provide global, continuous Landsat-quality observations of Earth for at least the next 25 years.
The Users Forum will feature a structured methodology that USGS has been developing for acquiring and evaluating user requirements for Earth observation. USGS presenters at the forum will explain some preliminary findings and offer opportunities for feedback about the approach and the requirements gathered to date.
Both USGS and NASA value public participation in establishing long-term user needs for Landsat or equivalent Earth observation data. This is a notice of a meeting, not a solicitation of any kind.
Event: USGS/NASA Sustainable Land Imaging Users Forum
Time: 1:00 – 4:15 p.m. EST
Date: Wednesday, December 4, 2013
Location: NASA Goddard Visitors Center Auditorium, 8800 Greenbelt Road, Greenbelt, MD, 20771.
Registration: Online at NASA website.
Visit the NASA website for further details and to register.
For the first time since 1995, the U.S. Geological Survey will reinstate reporting of the amount of water consumed in the production of thermoelectric power.
Tracking the consumptive use of water by thermoelectric power plants could allow water resource managers to evaluate the influence of this type of use on the overall water budget of a watershed. The use of heat and water budgets to estimate water consumption at individual thermoelectric plants provides a useful check on other estimation approaches and in many cases may be the most accurate method available.
Thermoelectric water withdrawal refers to the water removed from groundwater or surface water for use in a thermoelectric power plant, mainly for cooling purposes. Much of the water currently withdrawn for cooling is returned to the environment and is immediately available for reuse.
Consumptive use occurs when some of the water is evaporated during the cooling process or incorporated into byproducts of electricity generation. Once consumed, the water cannot be returned to the environment.
"Thermoelectric withdrawal occurs in both freshwater and saline water sources," says Eric J. Evenson, Coordinator, USGS National Water Census. "It is the most significant use of saline water in the country."
This study presents a method for collecting plant location and cooling-equipment data; an upcoming study will provide the consumption estimates derived from the heat- and water-budget models. About half of all water withdrawals in the United States are for thermoelectric cooling; however, most of that water is returned to the environment after use.
The methods for estimating evaporation presented in this study will play a key role in the National Water Census, a USGS research program on national water availability and use that develops new water accounting tools and assesses water availability at the regional and national scales.
"The most significant contribution of this report," according to Timothy H. Diehl, Hydrologist at the Tennessee Water Science Center, "is to present an updated method for estimating evaporation from surface water downstream from once-through cooling systems, and make the tool available in the form of a spreadsheet."
The USGS classifies water withdrawals for thermoelectric cooling by the two types of cooling systems used at the plants: recirculating systems and once-through systems. In a recirculating cooling system, water circulates through the generating plant's condensers and is then cooled in a structure such as a cooling tower or cooling pond before being reused in the same process. A once-through cooling system withdraws water from a surface-water source, circulates it through the generating plant's condensers, and then discharges the water back to surface water at a higher temperature.
"Most consumption by once-through cooling systems and recirculating ponds takes the form of evaporation from surface water. This type of consumption has been estimated by a variety of methods and has sometimes been considered insignificant," according to Diehl.
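The spreadsheet tool mentioned above implements a full heat-budget method; as a rough, hypothetical illustration of the underlying idea, the evaporation forced by a once-through cooling system can be bounded by dividing the heat rejected to the water body by the latent heat of vaporization. The plant size and evaporative fraction below are made up for the example:

```python
LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water, J/kg

def forced_evaporation_kg_per_s(waste_heat_mw, evaporative_fraction):
    """Evaporation rate if a given fraction of the rejected heat drives
    evaporation (the rest leaves by radiation and conduction)."""
    heat_w = waste_heat_mw * 1e6
    return heat_w * evaporative_fraction / LATENT_HEAT_J_PER_KG

# Hypothetical plant rejecting 500 MW of heat, half of it via evaporation
rate = forced_evaporation_kg_per_s(500, 0.5)
print(round(rate, 1))  # kg of water evaporated per second
```

The real method must also account for the downstream temperature plume and local weather, which is why a calibrated spreadsheet tool is needed rather than this one-line bound.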
This action was taken at the recommendation of Government Accountability Office reports on the Energy-Water Nexus and represents a joint effort between the USGS and the Energy Information Administration.
- Methods for Estimating Water Consumption for Thermoelectric Power Plants in the United States
- Tennessee Water Science Center
- National Water Census
- Progress Toward Establishing a National Assessment of Water Availability and Use
A new study based on Earth-observing satellite data comprehensively describes changes in the world's forests from the beginning of this century. Published in Science today, this unparalleled survey of global forests tracked forest loss and gain at the spatial granularity of an area covered by a baseball diamond (30-meter resolution).
Led by Matthew C. Hansen of the University of Maryland and assisted by USGS co-author Thomas R. Loveland, a team of scientists analyzed data from the Landsat 7 satellite to map changes in forests from 2000 to 2012 around the world at local to global scales. The uniform data from more than 650,000 scenes taken by Landsat 7 ensured a consistent global perspective across time, national boundaries, and regional ecosystems.
"Tracking changes in the world's forests is critical because forests have direct impacts on local and national economies, on climate and local weather, and on wildlife and clean water," said Anne Castle, Assistant Secretary of the Interior for Water and Science. "This fresh view of recent changes in the world’s forests is thorough, objective, visually compelling, and vitally important."
Overall, the study found that from 2000 to 2012 global forests experienced a loss of 888,000 square miles (2.3 million square kilometers), roughly the land area of the U.S. states east of the Mississippi River. During the study period, global forests also gained an area of 309,000 square miles (800,000 square kilometers), approximately the combined land area of Texas and Louisiana.
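The net change implied by those two figures is simple arithmetic on the release's totals:

```python
# Gross totals reported for the 2000-2012 study period (values from the release).
loss_km2 = 2_300_000  # gross forest loss, square kilometers
gain_km2 = 800_000    # gross forest gain, square kilometers

net_change_km2 = gain_km2 - loss_km2
avg_annual_loss_km2 = loss_km2 / 12  # gross loss averaged over the 12 years

print(net_change_km2)              # net change in forest area (negative = loss)
print(round(avg_annual_loss_km2))  # average gross loss per year
```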
The global survey found that Russia experienced the most forest loss overall (in absolute numbers) over the study period. Brazil was the nation with the second highest level of forest loss, but other countries, including Malaysia, Cambodia, Cote d’Ivoire, Tanzania, Argentina and Paraguay, experienced a greater proportional loss of forest cover. Indonesia exhibited the largest increase in forest loss; its losses on an annual basis during 2011-12 were twice what they were during 2000-03.
Brazil is a global exception in terms of forest change during this timeframe, with a dramatic policy-driven reduction in the rate of deforestation in the Amazon Basin. Brazil's use of free Landsat data in documenting trends in deforestation was crucial to its policy formulation and implementation. To date, only Brazil produces and shares spatially explicit information on annual forest extent and change.
In the United States, the most intensive forest change was noted in the southeastern states where pine plantations allow for cyclic tree harvesting for timber, followed by immediate planting of tree replacements. In this area, over 30 percent of the forest cover was either lost or regrown during the study period.
Deforestation and deliberate forest regrowth are the human factors that accounted for most of the forest change. Natural forces — for instance, wildfire, windstorms, insect infestations, and regrowth of abandoned agricultural areas — also caused forest changes, and these too were methodically mapped.
"Ever since the USGS made Landsat data free to anyone in 2008, Landsat imagery has served as a reliable common record, a shared vocabulary of trusted data about Earth conditions," Castle continued. "It's been said that the free data policy is like giving every person on the globe a free library card to the world's best library on Earth observations."
"With the free data policy, we have seen a remarkable revolution in the use of Landsat for documenting the changes in the Earth's land cover," said Tom Loveland, chief scientist at the USGS Earth Resources Observation and Science (EROS) Center and a co-author of the study. "This multi-organization project was only feasible with the existence of free Landsat data. The invaluable Landsat archive supplies high-quality, long-term, consistent global data at a scale appropriate for tracking forest gains and losses."
The research team included scientists from the University of Maryland, the U.S. Geological Survey, Google, the State University of New York, Woods Hole Research Center, and South Dakota State University. The collaborative study is published in the Nov. 15 issue of the journal Science.
Key to the project was collaboration with the Google Earth Engine team, who leveraged sophisticated cloud computing technology to enable University of Maryland researchers to compute the vast amount of data in a matter of days. Additional project funding was provided to the University of Maryland by the Gordon and Betty Moore Foundation.
Since 1972, the Landsat program has played a critical role in supplying continuous, objective data that can be used to monitor, understand, and manage the resources needed to sustain human life, such as food, water, and forests. The Landsat 8 satellite launched in February 2013 is designed to extend the four-decade Landsat record of Earth observation. The Landsat program is jointly managed by NASA and USGS.
- "High-resolution global maps of 21st-century forest cover change," Science, 15 Nov 2013.
- Forest cover maps in Google Earth Engine
- Global Forest Change animation videos
- USGS Landsat
- NASA Landsat
Learn even more by participating
November 18 at 10:00 a.m. PT / 1:00 p.m. ET
The USGS, in cooperation with other Federal agencies, has posted 748 new US Topo quadrangle maps for Ohio, which include partial Public Land Survey System (PLSS) data. Ohio is the first state east of the Mississippi River to have PLSS data added to US Topo maps, joining Wyoming and Colorado in the West.
"It is great to have these 748 updated US Topo maps for our state available online at no charge," said Charley Hickman, the Geospatial Liaison for Ohio. "We appreciate the continuing improvements in this product, including the availability of PLSS township, range, and section information."
The PLSS is a way of subdividing and describing land in the United States. All lands in the public domain are subject to subdivision by this rectangular system of surveys, which is regulated by the U.S. Department of the Interior. Other selected states will begin receiving PLSS map data during their next revision cycles.
The new design for US Topo maps improves readability for online and printed use, while retaining the look and feel of the traditional USGS topo map. Map symbols remain easy to read even when the digital aerial-photograph layer is turned on.
Other re-design enhancements and new features:
- New shaded relief layer for enhanced view of the terrain
- Military installation boundaries, post offices and cemeteries
- New road classification
- A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
- New PDF legend attachment
- Metadata formatted to support multiple browsers
US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps. The new US Topo maps provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users.
The new digital topographic maps are PDF documents with geospatial extensions (GeoPDF®) and may be viewed using Adobe Reader, available as a no-cost download.
These new quads replace the first edition US Topo maps for Ohio. The replaced maps will be added to the USGS Historical Topographic Map Collection and are also available for free download from The National Map and the USGS Map Locator & Downloader website.
US Topo maps are updated every three years. The initial round of coverage for the 48 conterminous states was completed in September 2012, and Hawaii and Puerto Rico maps have recently been added. More than 400 new US Topo maps for Alaska have been added to the USGS Map Locator & Downloader, but complete coverage of that state will take several years.
For more information, go to: http://nationalmap.gov/ustopo/
USGS scientists have determined that high-salinity groundwater found more than 1,000 meters (0.6 mi.) deep under the Chesapeake Bay is actually remnant water from the Early Cretaceous North Atlantic Sea and is probably 100-145 million years old. This is the oldest sizeable body of seawater to be identified worldwide.
Twice as salty as modern seawater, the ancient water was preserved like a prehistoric fly in amber, aided in part by the impact of a massive comet or meteorite that struck the area about 35 million years ago, creating Chesapeake Bay.
"Previous estimates of the temperature and salinity of geologic-era oceans around the globe have been made indirectly from various types of evidence in deep sediment cores," said Ward Sanford, a USGS research hydrologist and lead author of the investigation. "In contrast, our study identifies ancient seawater that remains in place in its geologic setting, enabling us to provide a direct estimate of its age and salinity."
The largest crater discovered in the United States, the Chesapeake Bay impact crater is one of only a few oceanic impact craters that have been documented worldwide.
About 35 million years ago, a huge rock or chunk of ice traveling through space blasted a 56-mile-wide hole in the shallow ocean floor near what is now the mouth of the Chesapeake Bay. The force of the impact ejected enormous amounts of debris into the atmosphere and spawned a train of gigantic tsunamis that probably reached as far as the Blue Ridge Mountains, more than 110 miles away.
The impact of the comet or meteorite would have deformed and broken up the existing arrangement of aquifers (water-bearing rocks) and confining units (layers of rock that restrict the flow of groundwater). Virginia's "inland saltwater wedge" is a well-known phenomenon that is thought to be related to the impact crater. The outer rim of the crater appears to coincide with the boundary separating salty and fresh groundwater.
"We knew from previous observations that there is deep groundwater in quite a few areas in the Atlantic Coastal Plain around the Chesapeake Bay that have salinities higher than seawater,” said Jerad Bales, acting USGS Associate Director for Water. "Various theories related to the crater impact have been developed to explain the origin of this high salinity. But, up to this point, no one thought that this was North Atlantic Ocean water that had essentially been in place for about 100 million years."
"This study gives us confidence that we are working directly with seawater that dates far back in Earth’s history,” Bales continued. “The study also has heightened our understanding of the geologic context of the Chesapeake Bay region as it relates to improving our understanding of hydrology in the region."
USGS Professional Paper 1612. "The Effects of the Chesapeake Bay Impact Crater on the Geological Framework and Correlation of Hydrogeologic Units of the Lower York-James Peninsula, Virginia."
New research by the U.S. Geological Survey on the Delmarva Peninsula, which forms the Eastern Shore of the Chesapeake Bay, indicates that, because of the influence of groundwater, it may take several decades for many water-quality management practices aimed at reducing nitrogen input to the Bay to achieve their full benefit.
The USGS findings provide critical information on how long it may take to see the water quality in the Bay improve as more stringent practices are implemented to reduce nutrients and sediment to tidal waters. Having established the allowable load of nitrogen, phosphorus, and sediment pollutants for the Chesapeake watershed, known as the total maximum daily load (TMDL), the U.S. Environmental Protection Agency (EPA) is working with Maryland, Pennsylvania, Virginia, and the four other Bay watershed jurisdictions to ensure that all water-quality practices needed to reduce the flow of nutrients and sediment to the Bay are in place by 2025.
"This new understanding of how groundwater affects water-quality restoration in the Chesapeake Bay will help sharpen our focus as many agencies, organizations, and individuals work together to improve conditions for fish and wildlife," said Lori Caramanian, Department of the Interior Deputy Assistant Secretary for Water and Science. "In turn, improved environmental conditions will serve to further people’s enjoyment and promote the economic benefits of the Nation’s largest estuary."
The responses of watershed systems and ecosystems to environmental management actions at any location can vary from rapid changes (such as the swift beneficial effect of a wastewater treatment plant upgrade) to longer improvement intervals of several decades. In the Chesapeake Bay, "lag response time" refers to the time between implementing management actions and the resulting improvements in water quality. Lag times will vary for nitrogen, phosphorus, and sediment.
This USGS study focused on nitrogen. Some nitrogen runs off directly into streams, but a large portion on the Delmarva (more than two-thirds) moves slowly from its land source through underground aquifers before reaching a receiving stream or estuary.
Sources of nitrogen include fertilizer and manure applications to agricultural land, wastewater and industrial discharges, runoff from urban areas, domestic septic drain fields, and air emissions. Excess nitrogen contributes to algal blooms that cause low dissolved oxygen in the Bay and related fish kills each summer and impact recreational activities.
For this study USGS scientist Ward Sanford developed a complex model for water, geology, and chemical interactions that he applied to seven separate watersheds on the Delmarva Peninsula. Based on the concept of nitrogen mass-balance regression, the model was able to reproduce the time history of nitrate concentrations in area streams and wells, including a recent slowdown in the rate of concentration increase in streams. The model was then also used to forecast future nitrogen delivery from the Delmarva Peninsula to the Bay under different nitrogen management scenarios.
The new study shows that ages of groundwater and associated nitrogen from the Delmarva Peninsula into the Chesapeake Bay range from less than a year to centuries, with median ages ranging from 20 to 40 years. These groundwater age distributions are markedly older than previously estimated for areas west and north of the Bay, which have a median age of 10 years. The older ages occur because the porous, sandy aquifers on the Delmarva yield longer groundwater return times than the fractured-rock areas of the Bay watershed.
The USGS research found that in some areas of the Delmarva the groundwater currently discharging to streams is gradually transitioning to waters containing higher amounts of nitrate due to fertilizer used during the 1970s through the 1990s. Similarly, the total amount of nitrogen in the groundwater is continuing to rise as a result of the slow groundwater response times.
Without additional management practices, the study forecasts about a 12% increase in nitrogen loads from the Delmarva to the Bay by 2050. The study provides several scenarios for reducing nitrogen to the water table and the amount of time needed to see the reductions in groundwater discharging to streams. For example, the model predicts that a 25% reduction in the nitrogen load to the water table would be required to achieve a 13% reduction in load to the Bay.
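The lag dynamics described here can be sketched with a toy convolution model, in which the nitrogen reaching a stream today reflects inputs from decades past, weighted by a groundwater age distribution. The age distribution and input history below are illustrative assumptions, not the study's calibrated mass-balance regression model:

```python
import numpy as np

# Hypothetical groundwater age distribution: exponential with a ~30-year
# median, broadly consistent with the 20-40 year medians reported for the
# Delmarva (illustrative only, not the study's fitted distribution).
years = np.arange(0, 200)
mean_age = 30 / np.log(2)            # exponential mean that gives a 30-yr median
age_pdf = np.exp(-years / mean_age) / mean_age
age_pdf /= age_pdf.sum()             # normalize the discrete distribution

# Hypothetical nitrogen input to the water table: ramps up from 1950 to
# 1990, then is held flat (units are arbitrary).
t = np.arange(1950, 2051)
n_input = np.clip((t - 1950) / 40.0, 0, 1)

# Stream load is the input convolved with the age distribution: water
# (and nitrate) discharging to a stream today recharged years ago.
stream_load = np.convolve(n_input, age_pdf)[: len(t)]

# Even with inputs flat after 1990, the simulated stream load keeps
# rising for decades: the "lag time" the study quantifies.
i1990, i2050 = list(t).index(1990), list(t).index(2050)
print(round(float(stream_load[i1990]), 2), round(float(stream_load[i2050]), 2))
```

The point of the sketch is qualitative: because much of the discharging groundwater is decades old, stream loads continue to climb long after inputs stabilize, and reductions at the water table take comparably long to appear in streams.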
However, the results also indicate that nutrient management practices implemented over the past decade or so have begun to work, and they confirm that future nitrogen loading to streams will depend on the rigor of the water-quality practices implemented to reduce nutrients today.
This study highlights the complexities of environmental restoration of the Bay. The findings help refine the expectations of resource managers and citizens alike of how long it may take to see substantial water-quality improvements in the Bay, and they may provide additional insight into the effectiveness of different types of land management practices given the time lag created by local groundwater response times.
The study was done as part of increased federal efforts under the President’s 2009 Chesapeake Executive Order, which directs Federal agencies, including the EPA and the Department of the Interior, to "begin a new era" in protection and restoration of the Chesapeake Bay. With a watershed that spreads across six states and Washington, DC, the Chesapeake Bay is the largest estuary in the United States and one of the largest and most biologically productive estuaries in the world.
The study, "Quantifying Groundwater's Role in Delaying Improvements to Chesapeake Bay Water Quality," is published in Environmental Science & Technology.
SACRAMENTO, Calif. — Although dousing the flames was foremost in people's minds during the recent Rim Fire in Stanislaus National Forest and Yosemite National Park, U.S. Geological Survey scientific work continues well after the fire is out. USGS scientists are continuing their critical research characterizing the hidden dangers faced after large wildfires.
While the fire was still smoldering in September, the multi-agency BAER (Burned Area Emergency Response) team developed a burn-severity map and shared it with USGS scientists. USGS assessed the potential for debris flows, which tend to occur when winter rains soak steep slopes after a fire, by combining that map with critical information on soil characteristics, the ruggedness of the terrain, and the typical amount of rainfall in the area to model the likelihood and possible volumes of debris flows. The just-published Rim Fire debris-flow hazards assessment map will help land and emergency managers focus mitigation treatments where post-fire debris flows might do the greatest damage.
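Conceptually, a post-fire debris-flow hazard assessment combines basin characteristics into a likelihood estimate for each drainage. A minimal logistic-style sketch of that idea follows; the coefficients are made up for illustration, whereas the actual USGS models are empirically calibrated from data on past fires:

```python
import math

# Schematic of how a debris-flow likelihood model combines basin inputs.
# Coefficients are hypothetical; real models are fit to observed fires.
def debris_flow_probability(burn_severity, slope_gradient, storm_rain_mm):
    """Logistic-regression-style likelihood from basin characteristics."""
    score = (-4.0
             + 2.0 * burn_severity      # fraction of basin burned at high severity
             + 3.0 * slope_gradient     # average gradient (rise over run)
             + 0.05 * storm_rain_mm)    # design-storm rainfall
    return 1.0 / (1.0 + math.exp(-score))

# A steep, severely burned basin in a strong storm vs. a gentle,
# lightly burned one in a weak storm:
print(round(debris_flow_probability(0.8, 0.6, 40), 2))   # high likelihood
print(round(debris_flow_probability(0.1, 0.1, 10), 2))   # low likelihood
```

The logistic form keeps the output between 0 and 1, so the mapped product can shade each basin by its estimated probability of generating a debris flow under a given storm.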
As the Rim Fire growth slowed and the BAER team began their onsite evaluation of post-fire threats to life and property, the USGS California Water Science Center and the San Francisco Public Utilities Commission, operator of the Hetch Hetchy Regional Water System, assessed project facilities and USGS streamgages within the fire perimeter. Many of these streamgages were scorched and in need of repair, and the facilities are now subject to increased potential of debris flows and flash flooding.
"Our ongoing science research and the relationships and collaborations we have built with wildfire responders and land managers help mitigate and reduce the hazards of wildfire in the future," said USGS Pacific Regional Director Mark Sogge.
The most time-critical task for USGS scientists now is to work with the California Department of Water Resources, Modesto Irrigation District, and Turlock Irrigation District to establish a streamgage and extensive water-quality monitoring at a location downstream from the fire extent and upstream from Don Pedro Reservoir in the Sierra Nevada foothills. Documenting the quantity and quality of water entering the reservoir, and modeling streamflow changes in response to the fire, gives reservoir managers the tools to understand the cumulative effects of the Rim Fire on future water supplies.
Throughout the fire, USGS satellite imagery, such as Landsat's before-and-after images of the Rim Fire, provided firefighters with situational awareness and helped them mount their campaign against the intense, fast-moving flames. Maps and satellite imagery were a daily element of Integrated Fire Command briefings.
USGS continues to work with the California Department of Water Resources, the SFPUC, and the Don Pedro Reservoir managers to include the Rim Fire burn area data in, and thereby increase the utility of, hydrologic models currently being developed. Additional remote sensing information will be incorporated into the updated hydrologic model to help the managers understand how runoff and streamflow change through the years after the fire.
USGS is also monitoring soil moisture and incorporating soil properties, burn severity, and expected rates of vegetation recovery into a computer model to understand the hydrologic processes, responses, and sensitivities of the burned areas to a range of potential climatic conditions in the next five years.
Although the challenging job of managing such fires rests with other agencies, USGS provides the underlying science for sound land management decisions, before, during, and after wildfires. The USGS role studying natural hazards such as floods, landslides, earthquakes and volcanoes is well known, but fewer people are aware of the USGS scientific work in major wildfire events, which are one of the most regular and sometimes most devastating natural hazards in the West.
Well before the Rim Fire, USGS fire ecologists routinely provided scientific information to help state and federal agencies manage wildfire as a critical and natural ecosystem process and as a potential hazard to human life, property and infrastructure in the Sierra Nevada as well as Southern California. A new video tells this story. USGS ecologists also monitor the effects of ash on wildlife.
These are only a few of the many ways USGS scientists and experts help to manage the tremendous potential for wildfire-related calamity in the West.
Image caption: Likelihood of significant debris-flow hazard within and downstream of the Rim Fire burn area. Red areas are most likely to generate debris flows.
Although recent declines in nitrate in the Illinois River are promising, increasing nitrate levels at other sites throughout the basin are a continuing cause for concern.
Nitrate levels in the Illinois River decreased by 21 percent between 2000 and 2010, marking the first time substantial, multi-year decreases in nitrate have been observed in the Mississippi River Basin since 1980, according to a new USGS study.
Unfortunately, similar signs of progress were not widespread. "Nitrate levels continue to increase in the Missouri and Mississippi Rivers, including the Mississippi’s outlet to the Gulf of Mexico," said Lori Sprague, USGS research hydrologist.
"These results show that solving the problem of the dead zone will not be easy or quick. We will need to work together with our federal and state partners to develop strategies to address nitrate concentrations in both groundwater and surface water," said Lori Caramanian, Department of the Interior Deputy Assistant Secretary for Water and Science.
"Expanded research and monitoring is absolutely essential to tracking progress in reducing nutrient pollution in the Mississippi River Basin," said Nancy Stoner, acting Assistant Administrator for Water at the U.S. Environmental Protection Agency and co-chair of the Hypoxia Task Force. "The federal agencies and states that are part of the Hypoxia Task Force greatly appreciate this work by USGS and how it advances the science in the Mississippi River Basin."
Excessive nitrate and other nutrients from the Mississippi River strongly influence the extent and severity of the hypoxic zone that forms in the northern Gulf of Mexico every summer. This hypoxic zone, known as the "dead zone," is characterized by extremely low oxygen levels in bottom or near-bottom waters, degraded water quality, and impaired marine life. The 2013 Gulf hypoxic zone encompassed 5,840 square miles, an area the size of Connecticut.
The reasons for increases or declines in annual nitrate levels are unknown. Reliable information on trends in contributing factors, such as fertilizer use, livestock waste, agricultural management practices, and wastewater treatment improvements, is needed to better understand what is causing increases or decreases in nitrate.
Nitrate trends from 1980 to 2010 were determined at eight long-term USGS monitoring sites in the Mississippi River Basin, including four major tributaries (the Iowa, Illinois, Ohio and Missouri rivers) and four locations along the Mississippi River, using a methodology that adjusts for year-to-year variability in streamflow conditions.
- Nitrate concentrations steadily decreased by 21 percent in the Illinois River from 2000 to 2010. Decreases were also noted in the Iowa River during this time, but the declines were not as large (10 percent).
- Consistent increases in nitrate concentrations occurred between 2000 and 2010 in the upper Mississippi River (29 percent) and the Missouri River (43 percent).
- Nitrate concentrations in the Ohio River are the lowest among the eight Mississippi River Basin sites and have remained relatively stable over the last 30 years.
- Nitrate concentrations increased at the Mississippi River outlet by 12 percent between 2000 and 2010.
Nitrate increased at low streamflows throughout the basin, except for the Ohio and Illinois Rivers. Groundwater is likely the dominant source of nitrate during low flows. It may take decades for nitrate to move from the land where it was applied, migrate through groundwater, and eventually flow into these rivers. Because of this lag time, it can take several years or decades before the full water-quality effects of either increased groundwater contamination, or of improved management practices, are evident in the rivers.
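The flow-adjustment idea behind these trend estimates can be sketched with a simple regression on synthetic data, separating the effect of year-to-year streamflow variability from the underlying time trend. This is a simplified stand-in; the actual USGS methodology is considerably more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual data for 1980-2010: log-streamflow anomalies and a
# nitrate concentration that depends on both flow and a slow time trend.
years = np.arange(1980, 2011)
log_q = rng.normal(0.0, 0.4, size=years.size)        # log streamflow anomaly
true_trend = 0.012                                   # ~1.2%/yr underlying rise
log_c = (0.5 * log_q                                 # flow dependence
         + true_trend * (years - 1980)               # underlying trend
         + rng.normal(0.0, 0.05, size=years.size))   # observational noise

# Regress log concentration on log flow and time; the time coefficient
# is the flow-adjusted trend, insulated from wet vs. dry years.
X = np.column_stack([np.ones_like(log_q), log_q, years - 1980])
coef, *_ = np.linalg.lstsq(X, log_c, rcond=None)
trend_pct_per_yr = 100 * (np.exp(coef[2]) - 1)
print(f"flow-adjusted trend: {trend_pct_per_yr:.2f}%/yr")
```

Without the flow term, a run of dry or wet years at the end of the record could masquerade as a concentration trend; including it is what lets the analysis attribute changes to sources rather than to streamflow variability.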
The USGS report and additional information on nitrate trends in concentration and flux for each of the eight sites are available online. Additional information on USGS long-term monitoring sites in the Mississippi River Basin is also available online (PDF).
Research on nitrate trends is conducted as part of the USGS National Water-Quality Assessment (NAWQA) program. This program provides an understanding of water-quality conditions, whether conditions are getting better or worse over time, and how natural features and human activities combine to affect water quality. Additional information on the NAWQA program can be found online.
USGS Long-term Nitrate Trends Monitoring Sites
The accompanying graphic depicts the Mississippi River drainage in the central U.S. and shows the geographic locations of the eight long-term nitrate monitoring stations:
- Mississippi River at Clinton, IA
- Iowa River at Wapello, IA
- Illinois River at Valley City, IL
- Mississippi River below Grafton, IL
- Missouri River at Hermann, MO
- Mississippi River at Thebes, IL
- Ohio River near Grand Chain, IL
- Mississippi River above Old River Outflow Channel, LA
Americans place high value on butterfly royalty. A recent study suggests they are willing to support monarch butterfly conservation at high levels, totaling about $6.5 billion if extrapolated to all U.S. households.
If even a small percentage of the population acted upon this reported willingness, the cumulative effort would likely translate into a large, untapped potential for conservation of the iconic butterfly.
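As a back-of-envelope check on the extrapolation, dividing the aggregate figure by a rough count of U.S. households gives the implied per-household value. Both inputs below are illustrative assumptions, not figures reported by the study itself:

```python
# Back-of-envelope arithmetic (illustrative assumptions only).
us_households = 115_000_000        # rough count of U.S. households, circa 2012
total_value = 6_500_000_000        # ~$6.5 billion aggregate, per the release

per_household = total_value / us_households
print(f"implied willingness to pay: about ${per_household:.0f} per household")
```

Framed this way, the headline figure corresponds to a modest per-household contribution, which is why even partial follow-through could fund substantial habitat planting.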
Monarch butterfly populations have been declining across Mexico, California and other areas of the United States since 1999. A 2012 survey at the wintering grounds of monarchs in Mexico showed the lowest colony size ever recorded.
"The multigenerational migration of the monarch butterfly is considered one of the world’s most spectacular natural events," said Jay Diffendorfer, a USGS scientist and the study’s lead author. "However, managing migratory species is difficult because they can cross international borders and depend on many geographic areas for survival."
Much of the decline in monarch numbers has been blamed on the loss of milkweed, the native plants on which monarch caterpillars feed.
"While many factors may be affecting monarch numbers, breeding, migrating, and overwintering habitat loss are probably the main culprits," said Karen Oberhauser, a monarch biologist at the University of Minnesota and a co-author of the study. "In the U.S., the growing use of genetically-modified, herbicide-tolerant crops, such as corn and soybeans, has resulted in severe milkweed declines and thus loss of breeding habitat."
The authors suggest that the universal popularity of monarchs could encourage a market for monarch-friendly plants.
"This is the first nation-wide, published, economic valuation survey of the general public for an insect. The study indicates that economic values of monarch butterflies are potentially large enough to mobilize people for conservation planting and funding habitat conservation," said John Loomis, the lead economist on the study from Colorado State University.
"The life cycle of monarchs creates opportunities for untapped market-based conservation approaches," Diffendorfer continued. "Ordinary households, conservation organizations, and natural resource agencies can all plant milkweed and flowering plants to offset ongoing losses in the species’ breeding habitat."
According to the annual survey of the National Gardening Association, households that identify as "do-it-yourself lawn and gardeners" spent $29.1 billion in related retail sales in 2012.
"By reallocating some of those purchases to monarch-friendly plants, people would be able to contribute to the conservation of the species as well as maintain a flower garden," said Oberhauser. "Helping restore the monarch’s natural habitat, and potentially the species’ abundance, is something that people can do at home by planting milkweed and other nectar plants."
Unfortunately, many plants purchased by gardeners have been treated with systemic insecticides that can kill both pollinators that consume the nectar, and caterpillars, like monarchs, that eat the leaves.
"This study shows that not only might consumers pay more for monarch-friendly milkweeds grown without systemic insecticides in the potting soil, but also that consumers might be more interested overall in buying nectar-producing plants or milkweeds if they knew a small percentage of sales will be donated to habitat conservation," said Diffendorfer.
The study, released today in Conservation Letters, was authored by researchers with the USGS, Colorado State University, the University of Minnesota, and others, who participated in a USGS John Wesley Powell Center for Analysis and Synthesis working group.
About Monarch Butterflies
Monarchs are very popular in both society and education. The monarch butterfly is currently the official insect or butterfly of seven different U.S. states, and is celebrated in festivals held across North America. Monarchs have been the focus of many schools' science curricula as well as the subjects of multiple citizen-science projects.
- National Valuation of Monarch Butterflies Indicates an Untapped Potential for Incentive-based Conservation
- John Wesley Powell Center for Analysis and Synthesis
- Ecosystem Services
- Geosciences and Environmental Change Science Center
- Colorado State University, Agricultural Resource Economics
- University of Maryland, Department of Biology
- University of Arizona, School of Natural Resources and the Environment
- University of Arizona, Udall Center for Studies in Public Policy
- Scripps Institution of Oceanography
- National Gardening Association
- European Forest Institute
- Upper Midwest Environmental Sciences Center
- Wiley Online Library: Conservation Letters
- Monarch Butterfly Fund
- Monarch Joint Venture
Satellite Data Yield New Understanding Of How Galápagos Volcanoes Are Formed And May Erupt In The Future
HAWAI`I ISLAND, Hawaii — The chance transit of a satellite over the April 2009 eruption of Fernandina volcano — the most active in South America's famed Galápagos archipelago — has revealed for the first time the mechanism behind the characteristic pattern of eruptive fissures on the island chain's volcanoes, according to a new study by University of Miami (UM) and U.S. Geological Survey (USGS) scientists. Their model not only sheds light on how Galápagos volcanoes grow, which has been a subject of debate since Darwin’s time, but may also help in forecasting the locations of future eruptions, adding to the vast scientific knowledge acquired by study of this iconic island chain.
In the study, Marco Bagnardi, a doctoral candidate at the UM Rosenstiel School of Marine & Atmospheric Science (RSMAS) and visiting scientist at USGS' Hawaiian Volcano Observatory, analyzed surface deformations on Fernandina from European Space Agency (ESA) ENVISAT satellite images acquired just two hours before the 2009 eruption. He sought to explain why Hawaiian and Galápagos volcanoes, while similar in some respects, show different eruptive patterns. Eruptions from Hawai‘i's volcanoes tend to occur along narrow rift zones that extend radially from the summit, while the shields of western Galápagos volcanoes have eruptive fissures that circle the summit near the calderas but are oriented radially on the flanks below.
Bagnardi found that magma began moving away from Fernandina's reservoir as a sill, or horizontal feature, before rotating vertically and erupting in a fissure perpendicular to the summit on the volcano's southwest flank. A sill also fed the 2005 eruption at Fernandina, but that magma moved upward to form a fissure circling the volcano's summit closer to the top. These data suggest that sills feed all eruptive activity in the Galápagos but that they rotate in different ways as they propagate toward the host volcano's surface.
"Our findings have literally turned our perspective by 90 degrees," Bagnardi said. "For decades, we thought that eruptions in the Galápagos were starting with the intrusion of vertical blade-like bodies. We now know this is not the case."
"Without knowing why the fissures are arranged the way they are, we had no way to infer the geologic history of the volcanoes and how they evolved over time," said USGS geophysicist Mike Poland, a co-author of the study. "This research allows scientists to better model how the volcanoes grow and behave, especially with respect to past and future activity," Poland said. "You can't move forward with solving the puzzle because you are missing a key part of the story. This work fills in a major missing piece and will allow better interpretations of a multitude of parameters about Galapagos volcanoes."
UM RSMAS is part of a long-term satellite-based monitoring program of the Galápagos volcanoes using Interferometric Synthetic Aperture Radar (InSAR), a technique that uses repeat satellite observations to measure millimeter-scale ground displacements. Bagnardi credited the "extremely fortunate" timing of data collected by the ENVISAT satellite, which passed over the island just when magma was already on its way toward the surface. Fernandina erupts every four to five years, on average, and the satellite passed over the archipelago only once every several days, he said.
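The phase-to-displacement relation underlying InSAR is standard: because the radar signal travels to the ground and back, a line-of-sight displacement d shifts the measured phase by 4πd/λ. A small sketch using the ENVISAT C-band wavelength (the relation is textbook InSAR, not anything specific to this study's processing):

```python
import math

wavelength_cm = 5.6          # ENVISAT ASAR C-band radar wavelength

def los_displacement_cm(delta_phase_rad: float) -> float:
    """Line-of-sight ground displacement for a repeat-pass phase change.

    The two-way travel path means a displacement d changes the phase
    by 4*pi*d/wavelength, hence d = wavelength * phase / (4*pi).
    """
    return wavelength_cm * delta_phase_rad / (4 * math.pi)

# One full interferogram fringe (2*pi of phase) corresponds to half a
# wavelength of line-of-sight motion:
print(f"{los_displacement_cm(2 * math.pi):.1f} cm per fringe")
```

Since each fringe represents only a few centimeters of motion, and phase can be resolved to a small fraction of a fringe, repeat observations can detect the millimeter-scale displacements mentioned above.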
The team theorizes that Hawaiian and Galápagos volcanoes behave differently because neighboring volcanoes of the Galápagos grew concurrently and did not affect one another as they formed. Hawaiian volcanoes, however, grow sequentially, meaning that older volcanoes control the structural development, including eruptive patterns, of younger edifices.
Based on the relations between deformation and eruptions at Fernandina in 1995 and 2005, the authors argue that the next eruption of Fernandina will be from a fissure circling the volcano's summit on the southwest side of the caldera in the area uplifted by the 2009 sill intrusion.
"Unfortunately, we are still not able to predict the timing of future eruptions in the Galápagos," Bagnardi said. "However, providing a forecast for the location and type of eruption is a step in the right direction."
Satellite imagery used in the study was also provided by the Japan Aerospace Exploration Agency (JAXA). This research was supported by the National Aeronautics and Space Administration (NASA) and the National Science Foundation (NSF).
The paper in Earth and Planetary Science Letters, "A New Model for the Growth of Basaltic Shields Based on Deformation of Fernandina Volcano, Galápagos Islands," by Marco Bagnardi, Falk Amelung and Michael P. Poland, is available online.
Since January 2009, more than 200 magnitude 3.0 or greater earthquakes have rattled Central Oklahoma, marking a significant rise in the frequency of these seismic events.
The U.S. Geological Survey and Oklahoma Geological Survey are conducting collaborative research quantifying the changes in earthquake rate in the Oklahoma City region, assessing the implications of this swarm for large-earthquake hazard, and evaluating possible links between these earthquakes and wastewater disposal related to oil and gas production activities in the region.
Studies show that from 1975 to 2008, one to three earthquakes of magnitude 3.0 or greater occurred each year, while the average grew to about 40 earthquakes per year from 2009 to mid-2013.
"We've statistically analyzed the recent earthquake rate changes and found that they do not seem to be due to typical, random fluctuations in natural seismicity rates," said Bill Leith, USGS seismologist. "These results suggest that significant changes in both the background rate of events and earthquake triggering properties needed to have occurred in order to explain the increases in seismicity. This is in contrast to what is typically observed when modeling natural earthquake swarms."
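As a rough illustration of why such a jump cannot be dismissed as chance, one can compute the probability of the new rate arising under the old one if earthquakes occurred as a simple Poisson process. This is a toy calculation; the USGS analysis models background rate and earthquake triggering jointly, which this sketch ignores:

```python
import math

baseline_rate = 2.0    # roughly 1-3 M3+ earthquakes per year, 1975-2008
observed = 40          # average M3+ earthquakes per year, 2009 to mid-2013

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of exactly k events for a Poisson process with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Chance of seeing 40 or more events in a year if the old rate still held
# (terms beyond k=150 are negligibly small, so the sum is truncated there).
p_tail = sum(poisson_pmf(k, baseline_rate) for k in range(observed, 150))
print(f"P(>= {observed} events/yr at the historical rate) ~ {p_tail:.1e}")
```

The probability is vanishingly small, which is consistent with the quoted conclusion that the increase is not a typical random fluctuation in natural seismicity, though real seismicity clusters in time, so the simple Poisson assumption overstates the certainty.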
The analysis suggests that a contributing factor to the increase in earthquakes may be activities such as wastewater disposal, a phenomenon known as injection-induced seismicity. The OGS has examined the behavior of the seismicity throughout the state, assessing the optimal fault orientations and stresses within the region of increased seismicity, particularly the unique behavior of the Jones swarm just east of Oklahoma City. The USGS and OGS are now focusing on determining whether evidence exists for such triggering, which is widely viewed as having been demonstrated in recent years in Arkansas, Ohio and Colorado.
This "swarm" includes the largest earthquake ever recorded in Oklahoma, a magnitude 5.6 that occurred near Prague on Nov. 5, 2011. It damaged a number of homes as well as the historic Benedictine Hall at St. Gregory's University in Shawnee, Okla. Almost 60 years earlier, in 1952, a comparable magnitude-5.5 earthquake struck El Reno and Oklahoma City. More recently, earthquakes of magnitude 4.4 and 4.2 hit east of Oklahoma City on April 16, 2013, causing objects to fall off shelves.
Following the earthquakes that occurred near Prague in 2011, the agencies issued a joint statement, focusing on the Prague event and ongoing seismic monitoring in the region. Since then, the USGS and OGS have continued monitoring and reporting earthquakes, and have also made progress evaluating the significance of the swarm.
Important to people living in the Oklahoma City region is that earthquake hazard has increased as a result of the swarm. USGS calculates that ground motion probabilities, which relate to potential damage and are the basis for the seismic provisions of building codes, have increased in Oklahoma City as a result of this swarm. While it’s been known for decades that Oklahoma is "earthquake country," the increased hazard has important implications for residents and businesses in the area.
To more accurately determine the locations and magnitudes of earthquakes in Oklahoma, the OGS operates a 15-station seismic network. Data from this system, and from portable seismic stations installed in the Oklahoma City region, are sent in real time to the USGS National Earthquake Information Center, which provides around-the-clock reporting on earthquakes worldwide.
COOK, Wash. — The eggs of endangered Kootenai River white sturgeon are less likely to hatch on some of the surfaces that have been made more common by human, or anthropogenic, changes on the river, a new U.S. Geological Survey report has found.
The white sturgeon (Acipenser transmontanus), once common in much of North America, is a very large, slow-to-mature fish that has evolved little from its ancestors of some 175 million years ago. It has great cultural significance for the Kootenai Tribe of Idaho and many other Northwest Tribes. White sturgeon was harvested in many places for caviar, and dams and other development have altered its habitat in ways whose implications are still being studied. White sturgeon in Idaho and Montana's Kootenai River basin were listed as endangered in 1994, and poor recruitment (the number of a species' young that survive to maturity) in other West Coast populations is a concern.
"Sturgeons are imperiled across the globe. Our scientists are committed to working with partners, including tribes, to address sturgeon issues across the region," said Jill Rolland, director of the USGS Western Fisheries Research Center.
In the report, prepared in cooperation with the Kootenai Tribe of Idaho, USGS research fishery biologist Mike Parsley and biological science technician Eric Kofoot examined hatch success in the laboratory on various surfaces, such as clean rocks, algae-covered rocks and sand, that sturgeon eggs settle and adhere to in the wild while they develop into larvae. The scientists found sand to be a poor surface, because the developing sturgeon embryos failed to attach to it. River rocks covered in algae yielded poor results, in part because they were more hospitable to fungus that threatens sturgeon embryos, while waterlogged wood and clean rocks performed well.
The report notes that sand substrates, or surfaces, now dominate the highly altered Kootenai River in areas currently used by spawning sturgeon, and that dam operation for flood management and hydropower during the spawning season has largely eliminated the spring scouring flows that typically would clean rocks of algae and other growth. Finally, the report raises several possibilities, based on the findings, for maximizing white sturgeon recruitment, including substrate-type recommendations for spawning-habitat restoration and the incorporation of scouring flows to clean spawning substrate prior to the spawning season.
"This is another piece in the puzzle of understanding why some white sturgeon populations in highly altered river systems succeed and others don’t," Parsley said.
The publication, "Hatch Success of White Sturgeon Embryos Incubated on Various Substrates," USGS Report Series 2013-5180, by Michael J. Parsley and Eric Kofoot, is available online.