Today the U.S. Geological Survey led a congressional briefing featuring state and regional water stakeholders, who spoke about vital needs for comprehensive water information that would be met by the National Water Census called for in the SECURE Water Act of 2009.
Growing populations, increased energy development, and the uncertain effects of a changing climate magnify the need for an improved understanding of water use and water availability. However, no comprehensive and current national assessment of water resources exists.
A report released in April, "Progress Toward Establishing a National Assessment of Water Availability and Use," fulfilled a requirement under the 2009 law for the Secretary of the Interior to report to Congress on progress made in implementing the national water availability and use assessment program, also referred to as a National Water Census.
"It’s true in other fields and no less so for water: you can’t manage what you don’t measure," said Anne Castle, Department of the Interior Assistant Secretary for Water and Science. "The Water Census will quantify water supply and demand consistently across the entire country, fill in gaps in existing data, and make that information available to anyone who needs it—and that represents a huge step forward on the path toward water sustainability."
The National Academy of Sciences applauded the concept of a Water Census in 2009, suggesting that it would be "an ongoing, effective tool, on a par with the social and economic censuses, that supports national decision making."
"As competition for water grows — for irrigation of crops, for the production of energy, for use by cities and communities, and for the environment — the need for information and tools to aid water-resource managers also grows," said Tony Willardson, Executive Director, Western States Water Council.
"The more accurately we can assess the quantity and quality of our water resources, the better we can know whether our strategies for conserving and improving those resources are actually having the desired beneficial effect," observed Bob Tudor, Deputy Director, Delaware River Basin Commission. Willardson and Tudor were speakers at the briefing.
A Water Census is a complex undertaking, which helps explain why national water availability and use have not been comprehensively assessed in more than 35 years. Since then, competition for water resources has increased greatly and, in addition to human use, considerably more importance is now attached to the availability of water for environmental and ecosystem needs.
The USGS envisions the Water Census to be a key ongoing activity that, like the population census mandated by the Constitution, supports national decision-making in many different ways.
The resources currently available for this census are finite, however. USGS foresees that estimates of flow at ungaged locations and estimates of evapotranspiration will be among the earliest products of the National Water Census. Providing complete water-use information and adequately assessing the Nation’s groundwater resources with respect to water availability will require additional time.
Although the existing data are limited and much work remains to be done, funding over the past two years has allowed important progress. The USGS will continue to work with partner agencies and organizations to maximize the utility of the information for a broad range of uses.
The Water Census is part of an overarching Department of the Interior (DOI) initiative known as WaterSMART (Sustain and Manage America’s Resources for Tomorrow). Through WaterSMART, the Department is working to achieve a sustainable water strategy to help meet the Nation’s water needs. The Water Census will help inform that strategy.
The USGS is developing plans for the Water Census in coordination with other Federal and non-Federal agencies, universities, and other organizations. Collaboration across agency boundaries ensures that information produced by the USGS can be aggregated with data on other types of physical, social, economic, and environmental factors that affect water availability. Both the USGS and the U.S. Bureau of Reclamation (USBR) have substantive responsibilities under the Department of the Interior WaterSMART initiative.
One of the geographic focus areas of the Water Census is the drainage basin of the Colorado River, which covers parts of seven states, delivers water to more than 30 million people, irrigates nearly 4 million acres of cropland in the U.S. and Mexico, and supplies hydropower plants that annually generate more than 10 billion kilowatt-hours. Increasing population, decreasing streamflows, and the uncertain effects of a changing climate amplify the need for an improved understanding of water use and water availability in this crucial watershed.
To keep pace with demand, the USGS has posted new US Topo quadrangles covering Colorado (1,794 maps) and Minnesota (1,689 maps). These new quads replace the first-edition US Topo maps for those states. The replaced maps will be added to the USGS Historical Topographic Map Collection and remain available for free download from The National Map and the USGS Map Locator & Downloader website.
The new design for US Topo maps improves readability of maps for online and printed use, while retaining the look and feel of the traditional USGS topo map. Also, map symbols are now easier to read over the digital aerial photograph layer whether the imagery is turned on or off.
Other re-design enhancements and new features:
- New shaded relief layer for enhanced view of the terrain
- Military installation boundaries, post offices and cemeteries
- New road classification
- A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
- New PDF legend attachment
- Metadata formatted to support multiple browsers
In addition, the new Colorado US Topo quads include recreational trails in National Forests, provided by the U.S. Forest Service. Although this first test of trails was successful, the Forest Service does not yet have comparable data in other states, and schedules for adding trails in all National Forests have not been set.
"We are excited to about these two updates that are part of our continual effort to improve US Topo maps for our users," said Vicki Lukas, USGS Chief of Partner and User Engagement. "First, the new design makes US Topo maps even easier to use, and the new Colorado maps include Forest Service trails as a new feature."
US Topo maps are updated every three years. Initial coverage of the 48 conterminous states was completed last September. Hawaii and Puerto Rico maps are being completed this year. New US Topo maps for Alaska have been started, but will take several years to complete.
US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps. The new US Topo maps provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users.
For more information, go to: http://nationalmap.gov/ustopo/

[Figure: Proposed US Topo production schedule. States updated in 2012 are yellow; states that have been or will be updated in 2013 are red; states scheduled for update in 2014 are blue.]
ST. PETERSBURG, Fla. — Acidification of the Arctic Ocean is occurring faster than projected, according to new findings published in the journal PLOS ONE. The increased rate is being blamed on rapidly melting sea ice, a process that may have important consequences for the health of the Arctic ecosystem.
Ocean acidification is the process by which the pH of seawater decreases as the oceans absorb greater amounts of carbon dioxide from the atmosphere. Currently, the oceans absorb about one-fourth of the greenhouse gas. Lower pH levels make water more acidic, and lab studies have shown that more acidic water decreases calcification rates in many calcifying organisms, reducing their ability to build shells or skeletons. These changes, in species ranging from corals to shrimp, have the potential to impact species up and down the food web.
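For readers who want the underlying chemistry, the standard carbonate-system reactions (general chemistry, not a result specific to this study) show why absorbed carbon dioxide lowers pH: dissolved CO2 forms carbonic acid, which dissociates and releases hydrogen ions:

$$\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}$$

Because pH is defined as $-\log_{10}[\mathrm{H^+}]$, the added hydrogen ions push pH down; they also combine with carbonate ions ($\mathrm{CO_3^{2-}}$), reducing the carbonate available to shell-building organisms.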
The team of federal and university researchers found that the decline of sea ice in the Arctic summer has important consequences for the surface layer of the Arctic Ocean. As sea ice cover recedes to record lows, as it did late in the summer of 2012, the seawater beneath is exposed to carbon dioxide, which is the main driver of ocean acidification.
In addition, the freshwater melted from sea ice dilutes the seawater, lowering pH levels and reducing the concentrations of calcium and carbonate, which are the constituents, or building blocks, of the mineral aragonite. Aragonite and other carbonate minerals make up the hard part of many marine micro-organisms’ skeletons and shells. The lowering of calcium and carbonate concentrations may impact the growth of organisms that many species rely on for food.
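A standard way to quantify that effect (again, general marine chemistry rather than a finding of the paper) is the aragonite saturation state,

$$\Omega_{\mathrm{arag}} = \frac{[\mathrm{Ca^{2+}}]\,[\mathrm{CO_3^{2-}}]}{K_{sp}},$$

where $K_{sp}$ is the solubility product of aragonite. When $\Omega_{\mathrm{arag}}$ falls below 1, seawater becomes corrosive to aragonite; dilution by sea-ice meltwater lowers both ion concentrations and therefore lowers $\Omega_{\mathrm{arag}}$.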
The new research shows that acidification in surface waters of the Arctic Ocean is rapidly expanding into areas that were previously isolated from contact with the atmosphere due to the former widespread ice cover.
"A remarkable 20 percent of the Canadian Basin has become more corrosive to carbonate minerals in an unprecedented short period of time. Nowhere on Earth have we documented such large scale, rapid ocean acidification" according to lead researcher and ocean acidification project chief, U.S. Geological Survey oceanographer Lisa Robbins.
Globally, Earth's ocean surface is becoming acidified due to absorption of man-made carbon dioxide. Ocean acidification models show that with increasing atmospheric carbon dioxide, the Arctic Ocean will have critically low concentrations of dissolved carbonate minerals, such as aragonite, in the next decade.
"In the Arctic, where multi-year sea ice has been receding, we see that the dilution of seawater with melted sea ice adds fuel to the fire of ocean acidification" according to co-author, and co-project chief, Jonathan Wynn, a geologist from the University of the South Florida. "Not only is the ice cover removed leaving the surface water exposed to man-made carbon dioxide, the surface layer of frigid waters is now fresher, and this means less calcium and carbonate ions are available for organisms."
Researchers were able to investigate seawater chemistry at high spatial resolution during three years of research cruises in the Arctic, alongside joint U.S.-Canada research efforts aimed at mapping the seafloor as part of the U.S. Extended Continental Shelf (ECS) program. In addition to the NOAA-supported ECS ship time, the ocean acidification researchers were funded by the USGS, the National Science Foundation, and the National Oceanic and Atmospheric Administration.
Compared to other oceans, the Arctic Ocean has been rather lightly sampled. "It's a beautiful but challenging place to work," said Robert Byrne, a USF marine chemist. Using new automated instruments, the scientists were able to make 34,000 water-chemistry measurements from the U.S. Coast Guard icebreaker. "This unusually large data set, in combination with earlier studies, not only documents remarkable changes in Arctic seawater chemistry but also provides a much-needed baseline against which future measurements can be compared." Byrne credits scientists and engineers at the USF College of Marine Science with developing much of the new technology.
A new report by the U.S. Geological Survey evaluates how well the USGS streamgage network meets needs for streamflow information by assessing the ability of the network to produce various streamflow statistics at locations that have streamgages (gaged) and that do not have streamgages (ungaged).
The report analyzes where there are gaps in the network of gaged locations, how accurately useful statistics can be calculated with a given length of record, and whether the current network allows for estimation of these statistics at ungaged locations. The results of the report indicate that there is variability across the Nation’s streamflow data-collection network in terms of the spatial and temporal coverage of streamgages.
In general, the eastern United States has better coverage than the western U.S. The arid southwestern U.S., Alaska, and Hawaii were observed to have the poorest spatial coverage, using the dataset assembled for this study. With the exception of Hawaii, these areas also tended to have short streamflow records.
Differences in hydrology lead to differences in the uncertainty of statistics calculated in different regions of the country. Arid and semiarid areas of the central and southwestern U.S. generally exhibited the highest levels of interannual variability in flow, leading to larger uncertainty in flow statistics at both gaged and ungaged locations.
At ungaged locations, information can be transferred from nearby streamgages if there is sufficient similarity between the gaged watersheds and the ungaged watersheds of interest. Areas where streamgages exhibit high correlation with other streamgages are most likely to be suitable for this type of information transfer. The areas with the most highly correlated streamgages appear to coincide with mountainous areas of the U.S. Lower correlations are found in the central U.S. and coastal areas of the southeastern U.S.
Information transfer from gaged basins to ungaged basins is also most likely to be successful when basin attributes show high similarity. At the scale of the analysis completed in this study, the attributes of basins upstream of USGS streamgages cover the full range of basin attributes observed at potential locations of interest fairly well. Some exceptions included very high or very low elevation areas and very arid areas.
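As a concrete illustration of such information transfer, the widely used drainage-area-ratio method scales a flow statistic from a gaged basin by the ratio of drainage areas. The sketch below is hypothetical (names and numbers invented for illustration) and is not the specific analysis performed in the report:

```python
# Illustrative sketch, not taken from the USGS report: the drainage-area-
# ratio method is one common way to transfer a streamflow statistic from
# a gaged basin to a hydrologically similar ungaged basin.

def transfer_flow_statistic(gaged_stat_cfs, gaged_area_sqmi,
                            ungaged_area_sqmi, exponent=1.0):
    """Scale a flow statistic (e.g., mean annual flow) by drainage area.

    The exponent is often near 1 but is usually calibrated regionally; the
    method assumes the two watersheds have similar hydrology, which is the
    kind of similarity the report's streamgage-correlation and basin-
    attribute analyses are meant to assess.
    """
    return gaged_stat_cfs * (ungaged_area_sqmi / gaged_area_sqmi) ** exponent

# Hypothetical example: a gage on a 250-square-mile basin records a mean
# annual flow of 400 cubic feet per second; estimate the same statistic
# for a nearby 100-square-mile ungaged basin.
print(transfer_flow_statistic(400.0, 250.0, 100.0))  # -> 160.0
```

Methods like this work best exactly where the report finds high streamgage correlation and overlapping basin attributes, and worst in the arid, high-variability regions it flags.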
- A national streamflow network gap analysis (USGS Scientific Investigations Report 2013-5013)
- USGS Surface Water Information
- USGS National Streamflow Information Program (NSIP)
- National Water Census
While scientists can't predict when a great earthquake will generate a pan-Pacific tsunami, new tools being developed by federal and state officials allow them to offer more accurate insight into the likely impacts when tsunamis occur. This knowledge can help officials and the public reduce the risk from the future tsunamis that will impact California.
What are the potential economic impacts? Which marinas could be destroyed? Who needs to be prepared for evacuations? A newly published report looks at these factors and more.
A hypothetical yet plausible scenario was developed in which a tsunami created by an earthquake offshore of the Alaska Peninsula travels to the California coast. The scenario is detailed in a new report, the U.S. Geological Survey's Science Application for Risk Reduction (SAFRR) Tsunami Scenario.
Some of the issues highlighted in the scenario include public safety and economic loss. In this scenario approximately 750,000 people would need to be evacuated, with 90,000 of those being tourists and visitors. Additionally, one-third of the boats in California's marinas could be damaged or completely sunk, resulting in $700 million in losses. It was concluded that neither of California's nuclear power plants would likely be damaged by this particular event.
Looking back to 2011, not only was Japan devastated by the magnitude 9.1 Tohoku earthquake, but the resulting tsunami also swept through California and caused $50-100 million in damage. Tsunamis generated both near and far can thus cause significant losses in California, and a larger or closer event could lead to billions of dollars in losses.
The SAFRR Tsunami Scenario is a collaborative effort between the USGS, the California Geological Survey, the California Governor's Office of Emergency Services, the National Oceanic and Atmospheric Administration, other agencies, and academic and other institutions.
"The good news is that three-quarters of California's coastline is cliffs, and thus immune to the harsher and more devastating impacts tsunamis could pose," said Lucy Jones, who is the USGS Science Advisor for Risk Reduction and leads the SAFRR Project. "The bad news is that the one-quarter at risk is some of the most economically valuable property in California."
"In order to effectively protect communities from tsunamis, we must first know what to plan for," continued Jones. "By starting with science, there is a clearer understanding on how tsunamis function and their potential impacts. The scenario will serve as a long-lasting resource to raise awareness and provide scientifically-sound and unbiased information to decision makers in California and abroad."
In this scenario, scientists specifically outline the likely inundation areas; current velocities in key ports and harbors; physical damage and repair costs; economic consequences; environmental impacts; social vulnerability; emergency management; and policy implications for California.
This scenario will also be the focus of discussion at a workshop series starting September 4, convened in partnership with the California Tsunami Hazard Mitigation Program. USGS scientists and partners will explain the scenario and results to stakeholders in the coastal communities of California. The workshop series aims to establish a community of experts while fostering the use of science in decision-making.
The workshops will be hosted by the Cabrillo Marine Aquarium (September 4); Santa Barbara County Office of Emergency Management (September 5); San Diego County Office of Emergency Management (September 6); Santa Cruz County Office of Emergency Management (September 9); and the Port of San Francisco (September 10).
The SAFRR Project is the same USGS research group that produced the ShakeOut Scenario in 2008, examining the consequences of a probable major earthquake on the southern San Andreas Fault, and the ARkStorm Scenario in 2011, examining the risks associated with extreme rain events driven by atmospheric rivers.

[Photo: The March 11, 2011, Tohoku-oki tsunami caused significant damage to ships and docks within Crescent City Harbor in California, sinking a number of ships; because of extensive sedimentation and potentially contaminated debris within the harbor, recovery efforts took over a year. Credit: Rick Wilson, California Geological Survey.]

[Figure: Maximum current speeds for the Port of Los Angeles (POLA) and the Port of Long Beach (POLB) under the SAFRR Tsunami Scenario. In the POLA, currents are strongest at Angels Gate, the Cabrillo Marina, the Boat Yard, and the old Navy Yard; once the tsunami is underway, navigation through the Gate would be very dangerous, and currents in the Cabrillo Marina and Boat Yard are likely strong enough to break apart floating docks, damage piles, and pull small vessels from their mooring lines. The strongest currents occur in the old Navy Yard, though no floating assets are exposed in that immediate area. At the POLB, strong currents again occur at the Gate, and strong, jet-like currents are predicted at the entrance to the main cargo container area (Pier J), where they may be strong enough to damage, and possibly break, mooring lines. Credit: SAFRR Tsunami Scenario.]

[Figure: Areas that would be inundated (in red) under the SAFRR Tsunami Scenario at Oakland Airport and Alameda, California, including large portions of Bay Farm Island and Oakland Airport. Credit: SAFRR Tsunami Scenario.]

[Figure: Areas that would be inundated (in red) under the SAFRR Tsunami Scenario in Newport Beach, Orange County, California, including complete and partial flooding of islands and near-overtopping of the Balboa Peninsula. Credit: SAFRR Tsunami Scenario.]

[Figure: Areas of the City of Long Beach predicted to be inundated (in red) under the SAFRR Tsunami Scenario, including the Long Beach Convention Center and many retail businesses. Credit: SAFRR Tsunami Scenario.]
Wastewater treatment plants that process waters from oil and gas development were found to discharge elevated levels of toxic chemicals known as brominated disinfection byproducts, according to a new study by the U.S. Geological Survey.
Disinfection byproducts are created by chemical reactions when water is disinfected. Of the hundreds of known or suspected disinfection byproducts created by disinfection processes, the brominated forms are among the most toxic.
"While these findings do not indicate an immediate threat to aquatic life or human health, the study provides new data on the water quality of streams receiving discharged wastewater that can be used to inform decisions about management and treatment of produced waters," said Michelle Hladik, primary author of the report.
Waters that are co-produced when oil and gas resources are extracted from deep geological formations are commonly called produced waters. Produced waters are composed of naturally occurring materials characteristic of the geologic formations in which they originate. Often, the water in these formations is a brine with high concentrations of bromide, iodide, and other ions such as sodium and chloride.
Produced waters can originate from unconventional (e.g., hydraulic fracturing) and conventional oil and gas extraction. Management of produced waters includes a variety of methods, such as recycling, road spreading, deep-well injection, and processing by wastewater treatment plants.
Several different types of brominated disinfection byproducts can be created when produced waters with high levels of bromide are disinfected.
Currently, and during the time of the study, most wastewaters from unconventional oil and gas activities such as hydraulic fracturing in the study area have been disposed of by deep-well injection and therefore not processed by wastewater treatment plants. However, the study did not attempt to quantify the relative proportions of produced waters originating from the various unconventional or conventional oil and gas extraction activities.
The study examined river water samples downstream from the discharges of publicly-owned and commercial wastewater treatment plants that were processing produced waters with high levels of naturally occurring bromide. These samples were compared with water just upstream of the plants and with samples from wastewater treatment plants that did not process produced waters from oil and gas development.
The water was examined for 29 different disinfection byproducts, including brominated and non-brominated disinfection byproducts. The brominated disinfection byproducts were detected more frequently and at much higher levels in river water impacted by disinfected produced waters than at other sites.
The study is entitled "Discharges of produced waters from oil and gas extraction via wastewater treatment plants are sources of disinfection by-products to receiving streams," and is published in Science of the Total Environment. The study may be accessed online.
The USGS provides information on the quality of our environment; identifies emerging environmental issues; and provides information to aid decision-making by regulators, policymakers, industry and the public. To learn more about this study and other USGS Environmental Health research, please visit the USGS Environmental Health website or sign up for the USGS GeoHealth Newsletter.
Two recent images from the Landsat 8 satellite compare land conditions in the vicinity of Yosemite National Park before and during the Rim Fire. The images, from August 15 before the fire began and from August 31, can be contrasted and downloaded from the USGS Earth Resources Observation and Science (EROS) Center.
As of September 3, the Rim Fire, currently the fourth-largest wildfire in California history, has burned over 235,841 acres (about 16 times the land area of Manhattan Island) and is 70-percent contained. The Rim Fire started August 17 on lands to the west of Yosemite National Park, but spread quickly into western regions of the park.
Landsat imagery provides critical vegetation and fuels information that is used to model fire behavior and make tactical decisions. After a fire, scientists and land managers use Landsat imagery to determine the severity of the fire's effects and to monitor the recovery of the land.
Both images are false-colored to allow identification of critical vegetation and fuels information. In the images, fire appears bright red, vegetation is green, smoke is blue, clouds are white, and bare ground is tan.
The USGS supports both the Department of the Interior and U.S. Forest Service wildfire response. Throughout the fire season, USGS regularly uploads images for wildfires from several satellites to the Hazard Data Distribution System. These remotely sensed data are used by wildfire responders to map potential risks to communities and determine immediate post-fire effects. So far in 2013, 2,156 images have been distributed for wildfires.
The USGS helps staff the Geospatial Multi-Agency Coordination Group (GeoMAC). The GeoMAC viewer is a mapping application that allows fire managers and the public to access online maps of current fire locations and fire perimeters. Currently, for 2013, GeoMAC is maintaining up-to-date perimeter information for 620 wildfires across the United States.
Landsat is a joint effort of the USGS and NASA. Landsat images are unique in that they provide complete global coverage, are available for free, and span more than 41 years of continuous Earth observation.
- USGS Southern California Wildfire Risk Scenario Project
- NASA 'Fire Towers' in Space
- USGS Landsat
- NASA Landsat
PASADENA, Calif. — Imagine the Delaware River abruptly rising toward Philadelphia in a tsunami-like wave of water. Scientists now propose that this might not be a hypothetical scenario. A newly published paper concludes that a modest (one-foot) tsunami-like event on the East Coast was generated in the past by a large offshore earthquake. This result may have potential ramifications for emergency management professionals, government officials, businesses and the general public.
Early in the morning of Jan. 8, 1817, earthquake shaking was felt along the Atlantic seaboard as far north as Baltimore, Md., and at least as far south as Charleston, S.C. Later that morning, a keen observer documented an abrupt rise in the tide on the Delaware River near Philadelphia, commenting on the earthquake felt earlier to the south, and remarking that the tidal swell was most likely "the reverberation or concussion of the earth operating on the watery element."
Scientists have previously interpreted this earthquake to have a magnitude around 6 and a location somewhere in the Carolinas or slightly offshore. In a new study, USGS research geophysicist Susan Hough and colleagues reconsider the accounts of shaking and, for the first time, consider in detail the Delaware River account. They show that the combined observations point to a larger magnitude and a location farther offshore than previously believed. In particular, they show that a magnitude-7.4 earthquake located 400-500 miles off South Carolina or Georgia could have generated a tsunami wave large enough to account for the tidal swell on the Delaware. Using new computer-assisted research techniques, they uncover first-hand accounts from newspapers and ships' logs that give a wider perspective on the 1817 event. Notably, the predicted timing of such a tsunami wave from this location matches the documented timing in the eyewitness account.
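The timing check lends itself to a back-of-envelope calculation. In the open ocean a tsunami travels at roughly the shallow-water wave speed, the square root of gravitational acceleration times water depth; the sketch below uses assumed round numbers for depth and distance (they are not values from the paper) to show that the arrival time is plausible:

```python
import math

# Back-of-envelope plausibility check, not the study's calculation.
g = 9.81              # gravitational acceleration, m/s^2
depth_m = 4000.0      # assumed average ocean depth along the travel path
distance_km = 1100.0  # assumed distance for a source 400-500 miles
                      # offshore of South Carolina or Georgia

speed_ms = math.sqrt(g * depth_m)                 # ~198 m/s (~440 mph)
hours = distance_km * 1000.0 / speed_ms / 3600.0  # ~1.5 hours

print(f"speed: {speed_ms:.0f} m/s, open-ocean travel time: {hours:.1f} h")
```

Propagation up the shallow Delaware Bay and river would be considerably slower, adding time, which is broadly consistent with shaking felt early in the morning and a tidal swell observed "later that morning."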
The USGS monitors earthquakes offshore, and in recent years has undertaken research to better understand shaking and tsunami hazard from offshore earthquakes and landslides. Scientific understanding of faults and geological processes in this part of the Atlantic is limited. Still, it has long been understood that large, infrequent offshore earthquakes may pose a tsunami hazard to the Atlantic coast. In 1978, a magnitude-6 earthquake occurred roughly 240 miles southwest of Bermuda, even farther offshore than the inferred location of the 1817 earthquake. In 1929, the magnitude-7.2 Grand Banks, Newfoundland, earthquake triggered a submarine landslide that generated a large tsunami. Waves 10-13 feet high struck the Newfoundland coast, killing 29 people and leaving 10,000 temporarily homeless.
The inferred 1817 tsunami was significantly smaller than the Newfoundland disaster. However, the new interpretation by Hough and colleagues highlights the potential earthquake and tsunami hazard along the Atlantic seaboard from the still poorly understood offshore earthquake faults. The new study highlights that there is still work to be done to characterize this hazard in the southeastern United States.
The study, "Reverberations on the Watery Element: A Significant, Tsunamigenic Historical Earthquake Offshore the Carolina Coast," by Susan E. Hough, Jeffrey Munsey, and Steven N. Ward, is published in the September/October issue of Seismological Research Letters.
More than 400 new topographic maps are now available for the state of Alaska. The new maps are part of the U.S. Geological Survey Alaska Mapping Initiative (AMI), which aims to update foundational data for the state and replace existing maps that are about 50 years old.
"These new digital maps of Alaska are elevating our visual record of the surface of the state to 21st century levels," said Anne Castle, Assistant Secretary of the Interior for Water and Science. "The associated advances in human safety, navigation, and natural resource management cannot be overestimated. The productive partnership between the State government and the USGS is facilitating acquisition of the necessary data to complete digital mapping of Alaska, which is a critical chapter in the history of our geographical knowledge of the North American continent."
The first 400-plus new US Topo maps for Alaska are now accessible and mark the beginning of a multi-year project that will ultimately produce more than 11,000 new maps for the entire state. The goal of the AMI is the production of a complete series of digital topographic maps at a scale of 1:25,000 to replace the 1:63,360-scale maps produced about 50 years ago. The maps will be published in digital PDF format (GeoPDF©) and are available for free download and manipulation on a computer.
These new maps include several layers, with an option for the user to turn them on or off. Major updated features include:
- Satellite image layers, which allow a recent view of the earth's surface.
- Contours and shaded relief layers showing the lay of the land derived from newly acquired 5-meter radar elevation data.
- Surface water features from the USGS National Hydrography Dataset, which are updated by local stewards and USGS.
- Glaciers updated using Randolph Glacier Inventory data.
- Boundaries integrated from multiple sources, including Census and major Federal landholders.
- The Public Land Survey System layer from the Bureau of Land Management.
- Roads from a commercial vendor under a USGS contract.
- Railroads and the Trans-Alaska oil pipeline data from local sources.
- Important buildings including police stations, schools, and hospitals.
- Airports, heliports and seaplane landing strips compiled by USGS from multiple sources.
- Feature names from the USGS-maintained Geographic Names Information System.
To ensure that the maps meet current accuracy specifications and standards, the maps will be made using newly acquired elevation and imagery data from multiple state, federal and commercial sources. The map-making process will be largely automated using software specially adapted by the USGS to create approximately 11,275 digital map quadrangles, covering the entire area of the state.
Mapping in Alaska did not keep pace with mapping in the rest of the nation because of difficult terrain, remote locations, and vast distances. Modern mapping information does not exist for the majority of land in the state. Prior to this effort, topographic maps for much of Alaska were about 50 years out of date and not produced to current standards, which rely largely on high-resolution digital imagery and elevation data. As a consequence, essential public services have suffered, among them transportation planning and safety, urban and regional planning, economic development, natural resource management, conservation, and scientific research.
This new generation of digital topographic maps will continue the rich and valuable USGS cartographic history, and serve the Nation by providing reliable scientific information to describe and understand the Earth; minimize loss of life and property from natural disasters; manage water, biological, energy, and mineral resources; and enhance and protect quality of life.
For more information and download, go to: http://nationalmap.gov/alaska/

[Photo: Historically, Alaska has been a proving ground for many developments in modern surveying and mapping. Field surveying and topographic mapping of the Alaskan interior by the USGS began in the 1890s, following the discovery of gold in the Yukon. Travel was often by dog sled and pack train, canoe, and walrus-skin kayak, as shown in this undated photo.]

[Figure: Part of a new US Topo quadrangle map of Fairbanks, Alaska (southwest borough), with enhanced elevation contours, surface water, names, transportation, and structures data.]
Hydraulic fracturing fluids are believed to be the cause of the widespread death or distress of aquatic species in Kentucky's Acorn Fork, after spilling from nearby natural gas well sites. These findings are the result of a joint study by the U.S. Geological Survey and the U.S. Fish and Wildlife Service.
The Acorn Fork, a small Appalachian creek, is habitat for the federally threatened Blackside dace, a small, colorful minnow. The Acorn Fork is designated by Kentucky as an Outstanding State Resource Water.
"Our study is a precautionary tale of how entire populations could be put at risk even with small-scale fluid spills," said USGS scientist Diana Papoulias, the study's lead author. "This is especially the case if the species is threatened or is only found in limited areas, like the Blackside dace is in the Cumberland."
The Blackside dace typically lives in small, semi-isolated groups, so harmful events run the risk of completely eliminating a local population. The species is primarily threatened with loss of habitat.
After the spill of hydraulic fracturing fluid, state and federal scientists observed a significant die-off of aquatic life in Acorn Fork including the Blackside dace as well as several more common species like the Creek chub and Green sunfish. They had been alerted by a local resident who witnessed the fish die-off. The U.S. Fish and Wildlife Service and the Commonwealth of Kentucky are currently working towards restoration of the natural resources that were injured by the release.
To determine the cause of the fish die-off, the researchers collected water and fish samples immediately following the chemical release in 2007.
The sample analyses clearly showed that the hydraulic fracturing fluids degraded water quality in Acorn Fork to the point that fish developed gill lesions and suffered liver and spleen damage as well.
"This is an example of how the smallest creatures can act as a canary in a coal mine," said Tony Velasco, Ecologist for the Fish and Wildlife office in Kentucky, who coauthored the study, and initiated a multi-agency response when it occurred in 2007. "These species use the same water as we do, so it is just as important to keep our waters clean for people and for wildlife."
The gill lesions were consistent with exposure to acidic water and toxic concentrations of heavy metals. These results matched water quality samples from Acorn Fork that were taken after the spill.
After the fracturing fluids entered Acorn Fork Creek, the water’s pH dropped from 7.5 to 5.6, and stream conductivity increased from 200 to 35,000 microsiemens per centimeter. A low pH number indicates that the creek had become more acidic, and the stream conductivity indicated that there were higher levels of dissolved elements including iron and aluminum.
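Because pH is a base-10 logarithmic scale, that 1.9-unit drop corresponds to a roughly 80-fold increase in hydrogen-ion concentration (a worked illustration of the scale, not a figure reported in the study):

$$\frac{[\mathrm{H^+}]_{\mathrm{pH}\,5.6}}{[\mathrm{H^+}]_{\mathrm{pH}\,7.5}} = \frac{10^{-5.6}}{10^{-7.5}} = 10^{1.9} \approx 79$$

For comparison, the conductivity increase from 200 to 35,000 microsiemens per centimeter is a 175-fold jump.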
The Blackside dace is a species of ray-finned fish found only in the Cumberland River basin of Kentucky and Tennessee and the Powell River basin of Virginia. It has been listed as a federally threatened species by the Service since 1987.
Hydraulic fracturing is the most common method for natural gas well-development in Kentucky.
The report is entitled "Histopathological Analysis of Fish from Acorn Fork Creek, Kentucky Exposed to Hydraulic Fracturing Fluid Releases," and is published in the scientific journal Southeastern Naturalist, in a special edition devoted to the Blackside dace.
To learn more about this study and other contaminants research, please visit the USGS Environmental Health web page, the USGS Columbia Environmental Research Center, or the U.S. Fish and Wildlife Service Environmental Contaminants web page.
Declining bighorn sheep populations may be vulnerable to some of the fatal diseases, including chronic wasting disease (CWD), that are found in their western U.S. habitats, according to a new U.S. Geological Survey study.
USGS National Wildlife Health Center (NWHC) research showed that bighorn sheep are likely susceptible to the deadly neurological diseases scrapie and CWD, which are occurring in or near natural bighorn sheep environments. These fatal diseases are caused by mysterious proteins called prions, and are known to infect domestic sheep (scrapie) and non-domestic deer, elk, and moose (CWD). The USGS study is published in the journal BMC Veterinary Research, and is available online.
"Bighorn sheep are economically and culturally important to the western U.S.," said Dr. Christopher Johnson, USGS scientist and senior author of the report. "Understanding future risks to the health of bighorn sheep is key to proper management of the species."
USGS laboratory tests found evidence that bighorn sheep could be vulnerable to CWD from either white-tailed deer or elk, and to a domestic sheep prion disease known as scrapie. However, none of a small number of bighorn sheep sampled in the study showed evidence of infection.
"Our results do not mean that bighorns get, or will eventually get, prion diseases," Johnson said. "However, wildlife species like bighorn sheep are increasingly exposed to areas where CWD occurs as the disease expands to new geographical areas and increases in prevalence."
The laboratory test results could be useful to wildlife managers because bighorn sheep habitats overlap with farms and ranches with scrapie-infected sheep and regions where CWD is common in deer, elk, and moose.
Bighorn sheep populations in western North America have declined from habitat loss and, more recently, epidemics of fatal pneumonia thought to be transmitted to them from domestic sheep. Prion diseases are another possible threat to this valuable species.
For more information on prion diseases such as CWD, please visit the USGS NWHC website.
Hurricane Sandy Eroded Half of Fire Island's Beaches and Dunes: New Report Quantifies Coastal Change
ST. PETERSBURG, Fla. – Beaches and dunes on Fire Island, New York, lost more than half of their pre-storm volume during Hurricane Sandy, leaving the area more vulnerable to future storms.
While the damage and destruction on Fire Island was immediately evident after the storm, a new U.S. Geological Survey study released today is the first to quantify the actual changes to the coast caused by the storm.
"The beaches and dunes of the island were severely eroded during Sandy," said Cheryl Hapke, a USGS research geologist and lead author of the study. "The island was breached in three locations, and there was widespread damage and destruction of coastal infrastructure, including private residences. The report shows that the beaches and dunes lost 54.4 percent of their pre-storm volume, and the dunes experienced overwash along 46.6 percent of the island, dramatically changing the island’s shape."
Field surveys conducted immediately after Sandy documented low, flat beaches and extensive dune erosion. Assessment of overwash deposits -- the material that was carried to the interior of the island -- indicates that most of the sand lost from the beaches and dunes during Hurricane Sandy was moved offshore, carried by waves and storm surge. Of the volume of sand that was lost from the beaches and dunes, 14 percent was deposited inland.
"The impact from Sandy was unprecedented in recent times," said Hapke. "It is important that efforts to rebuild on the island be guided by the science, which shows that Sandy profoundly altered the shape and position of the barrier island, shifting it landward and redistributing large amounts of sand. Storms like Sandy are part of the natural evolution of barrier islands, which ultimately result in islands that are more resilient to sea level rise."
The extreme erosion of the beach and loss of dunes made the island more vulnerable to subsequent winter storms. Over the following winter months, the shoreline position shifted as much as 57.5 meters (189 feet) inland. Although several areas began to experience some recovery in the early spring, by the end of the survey period only a small fraction, 18 percent, of the pre-Sandy beach volume had returned.
"Barrier islands provide natural protection against storms, shielding coastlines from rising waves and tides," said Hapke. "The loss of so much sand increases the vulnerability of this area of coastline to future storms."
Fire Island is the longest of the barrier islands that lie along the south shore of Long Island, New York. The majority of the island is part of Fire Island National Seashore and not only provides the first line of defense against storms, but is a unique and important recreational and ecosystem resource. USGS research on Fire Island focuses on understanding the evolution of the form and structure of the barrier system on a variety of time scales, including storm-driven change in the region.
Scientists have detected magmatic water — water that originates from deep within the Moon's interior — on the surface of the Moon. These findings, published in the August 25 issue of Nature Geoscience, represent the first such remote detection of this type of lunar water, and were arrived at using data from NASA's Moon Mineralogy Mapper (M3).
The discovery represents an exciting contribution to the rapidly changing understanding of lunar water, said Rachel Klima, a planetary geologist at the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md., and lead author of the paper, "Remote detection of magmatic water in Bullialdus Crater on the Moon."
"For many years, researchers believed that the rocks from the Moon were 'bone dry' and that any water detected in the Apollo samples had to be contamination from Earth," said Klima, a member of the NASA Lunar Science Institute's (NLSI) Scientific and Exploration Potential of the Lunar Poles team. "About five years ago, new laboratory techniques used to investigate lunar samples revealed that the interior of the Moon is not as dry as we previously thought. Around the same time, data from orbital spacecraft detected water on the lunar surface, which is thought to be a thin layer formed from solar wind hitting the lunar surface."
"This surficial water unfortunately did not give us any information about the magmatic water that exists deeper within the lunar crust and mantle, but we were able to identify the rock types in and around Bullialdus crater," said co-author Justin Hagerty, of the U.S. Geological Survey. "Such studies can help us understand how the surficial water originated and where it might exist in the lunar mantle."
In 2009, the M3, aboard the Indian Space Research Organisation's Chandrayaan-1 spacecraft, fully imaged the lunar impact crater Bullialdus. "It's within 25 degrees latitude of the equator and so not in a favorable location for the solar wind to produce significant surface water," Klima explained. "The rocks in the central peak of the crater are of a type called norite that usually crystallizes when magma ascends but gets trapped underground instead of erupting at the surface as lava. Bullialdus crater is not the only location where this rock type is found, but the exposure of these rocks combined with a generally low regional water abundance enabled us to quantify the amount of internal water in these rocks."
After examining the M3 data, Klima and her colleagues found that the crater has significantly more hydroxyl — a molecule consisting of one oxygen atom and one hydrogen atom — compared to its surroundings. "The hydroxyl absorption features were consistent with hydroxyl bound to magmatic minerals that were excavated from depth by the impact that formed Bullialdus crater," Klima writes.
The internal magmatic water provides information about the Moon's volcanic processes and internal composition, Klima said. "Understanding this internal composition helps us address questions about how the Moon formed, and how magmatic processes changed as it cooled. There have been some measurements of internal water in lunar samples, but until now this form of native lunar water has not been detected from orbit."
The detection of internal water from orbit means that scientists can begin to test some of the findings from sample studies in a broader context, including in regions that are far from where the Apollo sites are clustered on the near side of the Moon. "Now we need to look elsewhere on the Moon and try to test our findings about the relationship between the incompatible trace elements (e.g., thorium and uranium) and the hydroxyl signature," Klima said. "In some cases this will involve accounting for the surface water that is likely produced by interactions with the solar wind, so it will require integration of data from many orbital missions."
"This impressive research confirms earlier lab analyses of Apollo samples, and will help broaden our understanding of how this water originated and where it might exist in the lunar mantle," said NLSI Director Yvonne Pendleton.
Along with Klima and Hagerty, Joshua Cahill and David Lawrence, both of APL, co-authored the paper. The research was supported by the NASA Lunar Advanced Science and Engineering Program, NLSI and the NASA Planetary Mission Data Analysis Program.
The Applied Physics Laboratory, a not-for-profit division of The Johns Hopkins University, meets critical national challenges through the innovative application of science and technology. For more information, visit the JHUAPL website.
Real-time Monitoring Pays Off for Tracking Nitrate Pulse in Mississippi River Basin to the Gulf of Mexico
Cutting-edge optical sensor technology is being used in the Mississippi River basin to more accurately track the nitrate pulse from small streams, through large tributaries, and ultimately to the Gulf of Mexico.
Excessive springtime nitrate runoff from agricultural land and other sources in the Mississippi drainage eventually flows into the Mississippi River. Downstream, this excess nitrate contributes to the Gulf of Mexico hypoxic zone, an area with low oxygen known commonly as the "dead zone." NOAA-supported researchers reported that the summer 2013 dead zone covered about 5,840 square miles, an area the size of Connecticut.
These optical sensors measure and transmit nitrate data every 15 minutes to 3 hours. They are located at the mouth of the Mississippi River near Baton Rouge, LA, and at several large tributaries to the Mississippi River, including the Missouri River at Hermann, MO; the Ohio River at Olmsted, IL; the Illinois River at Florence, IL; and the Iowa River at Wapello, IA, to track how nitrate concentrations from different areas of the watershed pulse in response to rainfall and seasons.

[Graph: The pulsing of the nitrate concentration of the Mississippi River at Baton Rouge, LA, along with streamflow at the same point, from November 2011 to August 2013.]
About 622 million pounds of nitrogen were transported past the Mississippi River station at Baton Rouge in May and June of 2013, said Brian Pellerin, USGS research hydrologist. "This is roughly equivalent to the amount of fertilizer nitrogen applied annually to about 4 million acres of corn."
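The comparison is easy to check; the per-acre rate is simply the quotient of the two figures quoted above (a quick arithmetic sketch, not a number from the release):

```python
# Rough arithmetic behind Pellerin's comparison (illustrative only).
total_nitrogen_lbs = 622e6  # nitrogen past Baton Rouge, May-June 2013
corn_acres = 4e6            # acres of corn in the comparison

print(total_nitrogen_lbs / corn_acres)  # -> 155.5
# i.e., the two-month load equals an annual application rate of roughly
# 155 pounds of fertilizer nitrogen per acre of corn.
```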
Nitrate sensors in Iowa, Illinois, Indiana, Nebraska, Kansas, Wisconsin, Missouri, and Arkansas provide new insights for researchers into the storage and transport of nitrate from headwaters to the Gulf of Mexico.
"Real-time information will improve our ability to measure the effectiveness of management actions by allowing us to track the movement and quantity of nitrate delivered from small streams all the way to the Gulf Coast," said Lori Caramanian, the Department of the Interior’s Deputy Assistant Secretary for Water and Science. "These sensors will give us an unprecedented level of precision in tracing the origins of excessive nitrate, and will be a valuable tool in tracking progress toward the goal of reducing the size of the dead zone."
Real-time nitrate monitoring in Iowa is being used by drinking water utilities to determine when to switch on nitrate-removal systems or when to blend in water from other sources that have lower concentrations. Both actions result in higher costs for drinking water. "Real-time nitrate concentrations in the Raccoon River at Van Meter, Iowa, peaked at 20.7 milligrams per liter in May 2013. This is more than double the U.S. Environmental Protection Agency's maximum contaminant level for drinking water," said Kevin Richards, Director of the USGS Iowa Water Science Center. (The EPA maximum contaminant level for nitrate in drinking water is 10 milligrams per liter.)
The USGS, in cooperation with numerous local, state, and other federal agencies, currently operates over 52 real-time nitrate sensors across the Nation, of which 36 are in the Mississippi River Basin. These data are available at USGS Water-Quality Watch. Real-time nitrate monitoring is supported by the USGS National Stream Quality Accounting Network, Cooperative Water Program, and the National Water-Quality Assessment Program.
The USGS also continuously monitors water levels and streamflows at thousands of the nation's streams on a real-time basis. These data are available at USGS Current Streamflow Conditions.
Plans for remapping parts of the East Coast where Hurricane Sandy altered seafloors and shorelines, destroyed buildings, and disrupted millions of lives last year are being announced today by three federal agencies. This remapping plan comes one day after the Administration's Hurricane Sandy Rebuilding Task Force progress report.
The USGS, NOAA, and the U.S. Army Corps of Engineers are using emergency supplemental funds provided by Congress to survey coastal waters and shorelines, acquiring data that will update East Coast land maps and nautical charts.
Using ships, aircraft, and satellites, the agencies will measure water depths, look for submerged debris, and record altered shorelines in high priority areas from South Carolina to Maine, as stipulated by Congress in the Disaster Relief Appropriations Act of 2013. The areas to be remapped will be based on their relative dangers to navigation, effects from the storm, and discussions with state and local officials as well as the maritime industry.
"Our approach is to map once, then use the data for many purposes," said NOAA Rear Admiral Gerd Glang, director of NOAA’s Office of Coast Survey. "Under the Ocean and Coastal Mapping Integration Act, NOAA and its federal partners are taking a 'whole ocean' approach to get as much useful information as possible from every dollar invested to help states build more resilient coastlines."
Preliminary U.S. damage estimates are near $50 billion, making Sandy the second-costliest cyclone to hit the United States since 1900. There were at least 147 direct deaths recorded across the Atlantic basin due to Sandy, with 72 of these fatalities occurring in the mid-Atlantic and northeastern United States. This is the greatest number of U.S. direct fatalities related to a tropical cyclone outside of the southern states since Hurricane Agnes in 1972.
"The human deaths and the powerful landscape-altering destruction caused by Hurricane Sandy are a stark reminder that our nation must become more resilient to coastal hazards," said Kevin Gallagher, associate director for Core Science Systems at USGS. "Sandy's most fundamental lesson is that storm vulnerability is a direct consequence of the elevation of coastal communities in relation to storm waves. Communities will benefit greatly from the higher resolution and accuracy of new elevation information to better prepare for storm impacts, develop response strategies, and design resilient and cost-efficient post-storm redevelopment."
The USGS will collect very high-resolution elevation data to support scientific studies related to the hurricane recovery and rebuilding activities, watershed planning and resource management. USGS will collect data in coastal and inland areas depending on their hurricane damages and the age and quality of existing data. The elevation data will become part of a new initiative, called the 3D Elevation Program, to systematically acquire improved, high-resolution elevation data across the United States.
The data acquired by the three agencies, much of which will be stored at NOAA's National Geophysical Data Center, and through NOAA's Digital Coast, will be open to local, state, and federal agencies as well as academia and the general public. The information can be applied to updating nautical charts, removing marine debris, replenishing beaches, making repairs, and planning for future storms and coastal resilience.
The three federal agencies are collaborating for greater topographic and hydrographic coverage and to promote efficiency. Earlier this year, a NOAA navigation response team surveyed the waters around Liberty Island and Ellis Island in New York harbor, measuring water depths and searching for debris that could cause a danger to navigation. Also, NOAA Ship Thomas Jefferson began surveying the approaches to the Delaware Bay in June.
NOAA plans to contract with commercial firms for additional hydrographic survey projects and high resolution topographic and bathymetric elevation data and imagery in the region.
The Army Corps of Engineers and its Joint Airborne Lidar Bathymetry Technical Center of Expertise are covering particular project areas in Massachusetts, Virginia, and New Jersey. They will coordinate operations, research, and development in airborne lidar bathymetry and complementary technologies for USACE, NOAA, and the U.S. Navy.
The mapping crowd-sourcing program, known as The National Map Corps (TNMCorps), encourages citizens to collect structures data by adding new features, removing obsolete points, and correcting existing data for The National Map database. Structures being mapped in the project include schools, hospitals, post offices, police stations and other important public buildings.
Since the start of the project in 2012, more than 780 volunteers have made in excess of 13,000 contributions. In addition to basic editing, a second volunteer peer review process greatly enhances the quality of data provided back to The National Map. A few months ago, volunteers in 35 states were actively involved. This final release of states opens up the entire country for volunteer structures enhancement.
To show appreciation for volunteers' efforts, The National Map Corps has instituted a recognition program that awards "virtual" badges to volunteers. The badges consist of a series of antique surveying instruments, ranging from the Order of the Surveyor's Chain (25-50 points) to the Theodolite Assemblage (2,000+ points). Additionally, volunteers are publicly acclaimed (with permission) via Twitter, Facebook and Google+.
"I enjoy mapping structures, it's a unique combination of validating structures from aerial photography and web-based sources," says TNMCorps volunteer Don Kloker. "My structures contributions have provided me with an excellent geography lesson and I have learned many things about communities that I most likely would not have been otherwise able to experience." Don has contributed more than 2,000 points and quickly reached the highest recognition badge, the Theodolite Assemblage.
The citizen geographers/cartographers who participate in this program make a significant addition to the USGS's ability to provide accurate information to the public. Data collected by volunteers become part of The National Map structures dataset which is available to users free of charge.
"TNMCorps allows me to update structure locations and their official names from the Geographic Names Information System (GNIS)," said Corey Plank, cartographer for the U.S. Bureau of Land Management. "These updates allow The National Map and the US Topo map series to better represent ground structures and official labels."
As part of an effort to engage civilian organizations, this year's 4-H National Youth Science Day, planned for October 9, 2013, will feature geographic technology projects that are part of TNMCorps data collection efforts.
Tools on the TNMCorps website explain how a volunteer can edit any area, regardless of their familiarity with the selected structures. Becoming a volunteer for TNMCorps is easy: go to The National Map Corps website to learn more and to sign up. If you have access to the Internet and are willing to dedicate some time to editing map data, we hope you will consider participating!
[Image: The first virtual badge, the Order of the Surveyor's Chain, awarded to TNMCorps volunteers who collect more than 25 points.]
[Image: Status map of the U.S. showing the current locations of volunteers and the dates when each state opened for collection.]
The Ice Age Trail, one of 11 National Scenic Trails in the U.S. and Wisconsin's only State Scenic Trail, follows the edge of the most recent continental glacier as it traveled south through the upper Midwest. The state's glacial landscape is world-renowned as one of the best places to see the variety of landforms that resulted from glaciation. Several of the 1,109 new US Topo quadrangles display parts of the 640 miles of established Ice Age Trail segments, all of which lie within the state's borders.
"Wisconsin’s US Topo maps display the relief of the state's glacial features that the Trail then interprets on the ground," says Tim Malzhan, director of trail operations for the Ice Age Trail Alliance, the nonprofit arm in a partnership that manages and supports the Trail. "When people use these maps, seeing the ribbon of the Ice Age Trail as it crosses the state will allow them to learn about and explore Wisconsin’s glacial history."
The Ice Age Trail will one day be a continuous 1,200-mile footpath spanning the state from the Minnesota border on the west to Lake Michigan on the east. It was designated a National Scenic Trail on Oct. 3, 1980, when Congress amended the National Trails System Act to add it to the National Trails System.
The USGS partnered with the National Park Service, the Wisconsin Department of Natural Resources, and the Ice Age Trail Alliance to incorporate the Ice Age Trail onto Wisconsin's maps. These partners have worked cooperatively to shape the IAT into one of the premier hiking trails in the country. The USGS plans to include all National Scenic Trails in The National Map products, including future updates to the US Topo map series.
As with all US Topo map updates, the replaced maps will be added to the USGS Historical Topographic Map Collection, where they will remain available for download.
To download US Topo maps: http://nationalmap.gov/ustopo/
[Image: Map of Wisconsin showing the path of the Ice Age National Scenic Trail across the state. The IAT starts on the western border of Wisconsin in the northern part of the state, meanders east to north-central Wisconsin, turns south almost to the state's southern border, then turns northeast and winds up to the Door County Peninsula, where it ends overlooking the shores of Green Bay.]
The National Trails System was established by an Act of Congress in 1968 (amended 2009). The Act grants the Secretary of the Interior and the Secretary of Agriculture authority over the National Trails System and defines four types of trails. Two of these types, National Historic Trails and National Scenic Trails, can be designated only by an Act of Congress.
There are 11 National Scenic Trails:
- Appalachian National Scenic Trail
- Pacific Crest National Scenic Trail
- Continental Divide National Scenic Trail
- North Country National Scenic Trail
- Ice Age National Scenic Trail
- Potomac Heritage National Scenic Trail
- Natchez Trace National Scenic Trail
- Florida National Scenic Trail
- Arizona National Scenic Trail
- New England National Scenic Trail
- Pacific Northwest National Scenic Trail
Key factors have been identified that help determine the vulnerability of public-supply wells to contamination. A new USGS report describes these factors, providing insight into which contaminants in an aquifer might reach a well, and when, how, and at what concentration they might arrive.
About one-third of the U.S. population gets their drinking water from public-supply wells.
"Improving the understanding of the vulnerability of public-supply wells to contamination is needed to safeguard public health and prevent future contamination," said Suzette Kimball, acting USGS Director. "By examining ten different aquifers across the nation, we have a more thorough and robust understanding of the complexities and factors affecting water quality in our public supplies."
The study explored factors affecting public-supply-well vulnerability to contamination in ten study areas across the Nation: Modesto, Calif.; Woodbury, Conn.; near Tampa, Fla.; York, Nebr.; near Carson City and Sparks, Nev.; Glassboro, N.J.; Albuquerque, N.M.; Dayton, Ohio; San Antonio, Tex.; and Salt Lake City, Utah.
Measures that are crucial for understanding public-supply-well vulnerability include: 1) the sources of the water and contaminants in the water that infiltrate the ground and are drawn into a well; 2) the geochemical conditions encountered by the groundwater; and 3) the range of ages of the groundwater that enters a well.
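To illustrate the third measure, the age mix of water arriving at a well determines how today's pumped concentration blends decades of past inputs. The Python sketch below assumes a simple exponential age distribution and a hypothetical contaminant input history; the models in the report are considerably more detailed.

    # Minimal sketch of why groundwater age distributions matter: the
    # concentration pumped today mixes contributions from many recharge years.
    # The exponential age distribution and input history are assumptions.
    import numpy as np

    mean_age = 20.0                      # assumed mean groundwater age, years
    tau = np.arange(0, 200)              # travel times considered, years
    weights = np.exp(-tau / mean_age)
    weights /= weights.sum()             # discrete exponential age distribution

    # Hypothetical contaminant input at the water table: zero before 1960,
    # rising linearly afterward (mg/L).
    years = np.arange(1900, 2014)
    c_in = np.where(years < 1960, 0.0, (years - 1960) * 0.5)

    # Well concentration in 2013 = input concentration in year (2013 - tau),
    # weighted by the fraction of water with that travel time.
    c_well_2013 = sum(w * np.interp(2013 - t, years, c_in)
                      for t, w in zip(tau, weights))
    print(f"Modeled 2013 well concentration: {c_well_2013:.2f} mg/L")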
"Common sense might say that wells located near known contaminant sources would be the most vulnerable, but this study found that even where contaminant sources are similar, there are differences in public-supply-well vulnerability to contamination," said Sandra Eberts, the study team leader.
The study found that conditions in some aquifers enable contaminants to remain in the groundwater longer, or travel to wells more rapidly, than conditions in other aquifers. Direct pathways, such as fractures in rock aquifers or the wellbores of non-pumping wells, frequently affect groundwater and contaminant movement, making it difficult to identify which areas at land surface are the most important to protect from contamination. An unexpected finding was that human-induced changes in recharge and groundwater flow, caused by irrigation and high-volume pumping for public supply, altered aquifer geochemical conditions in numerous study areas. Such geochemical changes often release naturally occurring drinking-water contaminants, such as arsenic and uranium, into the groundwater, increasing concentrations in public-supply wells.
Water managers can use knowledge of how human activities alter the aquifer conditions that control which contaminants are released to groundwater, and how persistent those contaminants are once released, to anticipate future water quality and associated treatment costs.
The quality of drinking water from the Nation's public water systems is regulated by the U.S. Environmental Protection Agency under the Safe Drinking Water Act. The USGS studies are intended to complement drinking water monitoring required by federal, state and local programs.
This new report, Factors affecting public-supply-well vulnerability to contamination: understanding observed water quality and anticipating future water quality, was produced by the USGS National Water-Quality Assessment (NAWQA) Program. NAWQA conducts regional and national assessments of the Nation's water quality to provide an understanding of water-quality conditions, whether conditions are getting better or worse over time, and how natural features and human activities affect those conditions.
Learn more about the transport of contaminants to public-supply wells:
About 4,000 people are expected to attend the annual meeting of the Ecological Society of America in Minneapolis from Aug. 4 to 9, 2013. The theme of this year’s conference is Sustainable Pathways: Learning from the Past and Shaping the Future.
Forest Drought Stress in Southwest May Exceed Most Severe Droughts in Last Thousand Years: Severe wildfires and drought-induced tree deaths have increased greatly over the past two decades in the southwestern United States. Historical and ecological records of Southwest fire regimes and forest patterns over the past 10,000 years provide context for recent fire and vegetation trends. Specifically, these records show that regional forest landscapes are greatly affected by interactions among human land management, climate, and disturbances. Such linkages are further emphasized through the newly developed forest drought-stress index (FDSI) for the Southwest, which uses robust relationships among historical tree-ring growth, warm-season temperature, and cold-season precipitation to reconstruct the FDSI back to A.D. 1000 from a large archive of tree-ring growth data. This research by USGS, Los Alamos National Laboratory, the University of Arizona, and other university partners shows strong relationships between FDSI and regional forest productivity, tree mortality, bark-beetle outbreaks, and wildfire. If temperatures increase as projected, background levels of southwestern forest drought stress by the 2050s will exceed those of the most severe droughts in the past 1,000 years, which would almost certainly mean imminent changes in forest structure and composition. Overall, interactions among climate, land-use history, and disturbance processes are driving the current pulse of major forest transitions in the Southwest. In the face of such rapid changes, it is imperative to explore adaptation strategies that foster ecosystem resilience. This work is addressed in two presentations: 1) Land cover change in the Southwest: wildfire risk, drought-induced tree mortality, and the convergence of climate, land management, and disturbance trends in regional forests and woodlands, will be in room 101C on Aug. 9 at 8:40 a.m. Contact Craig Allen, firstname.lastname@example.org, 505-795-1571; and 2) A forest is not a pan of water: temperature and vapor-pressure deficit as potent drivers of regional forest drought stress, will be in room 101A on Aug. 6 at 9:50 a.m. Contact Park Williams, email@example.com, 505-667-6551.
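As a rough illustration of how a drought-stress index of this kind can be built, the Python sketch below combines standardized cold-season precipitation and warm-season temperature anomalies. The weights, the sign convention (higher value = more stress), and the synthetic data are all assumptions for illustration; they are not the published FDSI formulation.

    # Schematic drought-stress index: a weighted combination of standardized
    # climate anomalies. Weights and data are illustrative, not the real FDSI.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1950, 2013)
    cold_precip = rng.lognormal(mean=5.0, sigma=0.3, size=years.size)        # mm
    warm_temp = 22 + 0.02 * (years - 1950) + rng.normal(0, 0.8, years.size)  # deg C

    def zscore(x):
        # Standardize a series to zero mean and unit variance.
        return (x - x.mean()) / x.std()

    # Dry cold seasons and warm growing seasons both push stress upward.
    fdsi_like = -0.5 * zscore(np.log(cold_precip)) + 0.5 * zscore(warm_temp)
    print("Most drought-stressed year:", years[np.argmax(fdsi_like)])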
Response of North American Desert Plants to Climate Change: Forecasts for Management and Planning: Forecasted climate warming and changes in precipitation patterns in North American deserts can have a strong impact on plant species already stressed by low water availability. Accurate forecasts of climate-induced changes to desert plant assemblages are needed by managers because dryland ecosystems are prone to abrupt and potentially irreversible degradation and reductions in productivity. To help managers mitigate and adapt to climate change impacts, USGS researchers have synthesized over a century (1906-2012) of climate and vegetation monitoring results from more than 1,500 vegetation plots across the Colorado Plateau, and the Sonoran, Chihuahuan and Mojave deserts. In all of these deserts, dominant plant species and plant diversity responded to drought and elevated temperature.
On the Colorado Plateau, large declines of cool-season perennial bunchgrasses occurred, primarily driven by high temperatures. In the Sonoran Desert, increases in cacti occurred with hotter temperatures, and decreases in warm-season perennial grasses and sub-shrubs occurred with less annual precipitation. Tree and shrub species in the Sonoran Desert were less responsive to changing climatic conditions than other species, but some woody species were sensitive to warmer temperatures and less winter precipitation, especially on south-facing slopes. In the Chihuahuan Desert, many grasses and forbs had large responses to summer precipitation, whereas most woody vegetation showed small responses to winter precipitation. In the Mojave Desert, winter drought was related to declines of shrubs at some sites. USGS research also highlights “climate pivot points” that mark important shifts from increases to decreases in plant abundance along climatic gradients. These results are being used to assist with management decisions, improve monitoring protocols and inform climate change vulnerability assessments for land managers. This presentation (OOS 16-4), Regional signatures of plant response to climate across North American deserts: Forecasts for management and planning, will take place in room 101B on Aug. 7 at 9 a.m. Contact Seth Munson, firstname.lastname@example.org, work cell, 303-810-4896.
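One simple way to estimate a pivot point of the kind described here is to fit a two-segment linear model of plant abundance against a climate variable and choose the breakpoint that minimizes total squared error. The Python sketch below does this on synthetic data; it is a stand-in for, not a reproduction of, the USGS analyses.

    # Locate a "climate pivot point": the breakpoint where abundance switches
    # from increasing to decreasing along a climate gradient. Data are fake.
    import numpy as np

    rng = np.random.default_rng(1)
    temp = np.linspace(10, 30, 120)          # climate gradient, deg C
    true_pivot = 21.0
    abundance = np.where(
        temp < true_pivot,
        5 + 0.8 * temp,                                       # rises, then...
        5 + 0.8 * true_pivot - 1.2 * (temp - true_pivot))     # ...falls
    abundance += rng.normal(0, 1.0, temp.size)

    def sse_for_pivot(p):
        # Total squared error of separate linear fits on each side of p.
        total = 0.0
        for mask in (temp < p, temp >= p):
            coeffs = np.polyfit(temp[mask], abundance[mask], 1)
            resid = abundance[mask] - np.polyval(coeffs, temp[mask])
            total += float(resid @ resid)
        return total

    candidates = temp[5:-5]  # keep enough points on each side for a fit
    best = min(candidates, key=sse_for_pivot)
    print(f"Estimated pivot point: {best:.1f} deg C")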
Large, Invasive Rodents: Are They Heading Your Way? The nutria is a large, prolific, water-loving invasive rodent that has become established in many parts of the world after being introduced for the fur industry, including the Southeast and Pacific Northwest regions of the United States. In the Southeast and elsewhere, nutria wreak havoc in coastal marshes and bald cypress swamps by feeding on the tender roots of plants, seedlings, and saplings, completely stripping vegetation in areas where the animals are concentrated. Historically, nutria ranges have expanded northward following milder winters and contracted southward following more severe winters. This USGS study examined the current and potential distribution of nutria in the Pacific Northwest. Following a string of relatively mild winters, nutria populations have been expanding northward in the United States; because climate change models predict milder winter temperatures throughout the country, nutria could substantially extend their range in the Pacific Northwest, the Mississippi Valley, and along the Eastern Seaboard. Large-scale management of the species requires knowledge of its current and potential distribution. This presentation, Using a combined hydrologic network-climate model of the invasive nutria (Myocastor coypus) to understand current distributions and range expansion potential under climate change scenarios, will be in room 101H on Aug. 9 at 10:50 a.m. Contact Catherine Jarnevich, email@example.com, 970-226-9439.
In a related study, USGS scientists investigated the activity patterns of urban nutria populations in Lafayette, La., and Portland, Ore., since little is known about this subject. The study found that daily as well as seasonal activity patterns differed in the two geographic areas, leading to current efforts to explore the role that alternative factors might play in the differing activity patterns. This presentation, Comparison of activity patterns of nutria (Myocastor coypus) between urban pond complexes in Lafayette, Louisiana, USA and Portland, Oregon, USA, will be in Room L100B on Aug. 5 at 1:30 p.m. Contact Jacoby Carter, firstname.lastname@example.org, 337-266-8620.
People, Cameras, and Action! Teaming Up to Better Understand Phenology: The implications and impacts of climate change on the earth’s phenology – the timing of plant and animal life-cycle events – are increasingly well documented. Two continental-scale observation networks, PhenoCam and the USA National Phenology Network (USA-NPN), which is managed by the USGS, are collaborating with the National Ecological Observatory Network (NEON) to develop more refined phenological monitoring processes and to explore new opportunities for collaborative research. While PhenoCam quantifies plant phenology by using high-frequency camera monitoring of plant canopies, the USA-NPN contributes ground-based plant and animal data through its crowd-sourced phenology program, Nature’s Notebook. Both organizations are collaborating with NEON, a continental-scale ecological observing system, to enhance and codify best practices for phenological data collection and to collect and integrate phenological data across multiple spatial scales. The joint efforts of these programs will bridge major knowledge gaps in the field of phenology: not only will cameras provide new techniques for validating satellite-derived land-surface phenology products, but multi-faceted phenology datasets will aid in investigations of the feedbacks between ecosystem phenology and the carbon, water, and energy fluxes that link the biosphere and atmosphere. This presentation, Integrating Phenocam and USA National Phenology Network continental-scale approaches into NEON phenology data products, will be in room L100I on Aug. 6 at 4:30 p.m. Contact Jake Weltzin, email@example.com (cell: 703-485-5138) or the lead author Michael Toomey, firstname.lastname@example.org, 860-986-3804.
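Camera-based canopy monitoring of this kind commonly reduces each image to a greenness value such as the green chromatic coordinate, GCC = G / (R + G + B), whose seasonal curve traces leaf-out and senescence. The release does not name the metric, so the Python sketch below should be read as a typical example rather than PhenoCam's exact processing chain.

    # Compute a green chromatic coordinate (GCC) for a region of interest.
    # Higher GCC = greener canopy. The random array stands in for real imagery.
    import numpy as np

    rng = np.random.default_rng(2)
    roi = rng.uniform(0, 255, size=(100, 100, 3))  # hypothetical RGB pixels

    r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
    gcc = g / (r + g + b)
    print(f"Mean GCC for this image: {gcc.mean():.3f}")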
Crowd-Sourcing Needed to Take the Pulse of Our Planet: The USA National Phenology Network serves science and society by collecting and organizing valuable data on plant and animal activity across the United States, and by setting global standards for integrated monitoring of plant and animal seasonal activity to understand impacts of climate change on ecological systems. Most data entered into the Network’s national database are submitted by citizen scientists through the national-scale, multi-taxa phenology observation program, Nature’s Notebook. With 2,500 active participants and more than 2.3 million contributions since the program went live in 2008, volunteers and professional scientists work side by side to observe and record the important phases in the annual life cycles of plants and animals. This presentation will provide a broad overview of the Network and its partners and participants, but will focus on recent successes embodied in local- to national-scale projects including detection of invasive species, recent and historical trends in phenology, and potential future changes in phenology in the eastern deciduous forest. This presentation, The National Phenology Database: A multi-taxa, continental-scale dataset for scientific inquiry, will be in room LL101 on Aug. 8 at 4 p.m. Contact Jake Weltzin, email@example.com, 703-485-5138.
Climate Science Centers: Sparking Collaboration through Research & Resource Management: This special session will introduce participants to the Department of the Interior Climate Science Centers (CSCs) and their unique position to unite researchers with cultural and natural resource managers, facilitating a full-cycle approach to the use of research in support of management decisions. A panel composed of leaders from the CSCs, members of the University Consortia and Landscape Conservation Cooperatives, and other collaborators and clients will provide an overview of the approaches used to support the CSC mission: to serve the scientific needs of managers of fish, wildlife, habitats, and ecosystems as they plan for a changing climate, by providing scientific support for the identification and implementation of climate-adaptation strategies across a full range of natural and cultural resources. Participants will gain an overview of CSC support capacities and research solicitation and funding processes, in hopes of sparking future collaborations. This presentation, Climate Science Centers: now supporting resource management with science at a location near you!, will be in room 101A on Aug. 5 at 10:15 a.m. Contact Stephen Gray (firstname.lastname@example.org), 907-301-7830.
Actionable Climate Change Science Strategically Tying Research to Management and Policy Needs: Prompt access to climate-adaptation science is vital for policymakers and managers to plan effectively for climate change. Until now, the scientific community has employed a largely ineffective "conveyor-belt" approach, in which managers define scientific needs and hand projects off to scientists. To streamline this process, the Department of the Interior Climate Science Centers have designed a more integrated approach that yields promptly actionable results. Using strategic, decision-based approaches, the CSCs are creating a series of pilot projects focused on developing science outcomes tied to strategic management decisions. Unlike previous models, these project teams, consisting of scientists, managers, decision-makers, and stakeholders, will work collaboratively throughout each project to ensure that science outputs are consistent with management needs. These CSC pilot projects will form the basis for a national science agenda that supports climate-adaptation decision-making. This presentation, Actionable science in an era of rapid climate change, tying observations and predictions to policies and action, will be in Auditorium room 3 on Aug. 8 at 4:10 p.m. Contact Doug Beard, email@example.com, 571-265-4623.
Bridging the Gap between Science and Decisions: Climate-science researchers and resource-management decision-makers inhabit different professional worlds, but those worlds must come together to ensure scientifically informed management decisions. Effective cooperation and interaction between these groups are essential, yet hampered by professional and institutional barriers. Despite these disconnects, numerous case studies exist in which research has been applied effectively to climate-change management decisions. These case studies provide a foundation for identifying best practices for both researchers and decision-makers. These best practices include patient, persistent engagement among relevant parties. If climate-change research is to be used effectively in decision-making, researchers will need to step outside traditional comfort zones, listen carefully to decision-makers, and maintain continuing dialogue. New professional models must be encouraged, in which effective engagement and actionable science become part of the professional reward structure in research institutions. This presentation, Seeking Leopold's Quadrant: how do we foster research that addresses needs of resource-management decision-makers?, will be in room 101C on Aug. 9 at 9:50 a.m. Contact Stephen T. Jackson, firstname.lastname@example.org, 307-760-0750.
U.S. Engagement in the Global Shift in Biodiversity and Ecosystem Services: The Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) is in full swing with 109 member countries; its first plenary conference took place in Bonn, Germany (also the site of its Secretariat), in January 2013. The United States scientific community should be engaged as a full participant in the IPBES process. The session speakers will discuss the changing landscape in global environmental science initiatives, present the latest updates in the IPBES process, share current and future opportunities for input, and discuss ways to broadly engage the U.S. scientific community and other stakeholders in preparation for the second IPBES plenary in December 2013. This presentation, Biodiversity and ecosystem services on the global stage: IPBES and you, will be in room L100F on Aug. 7 at 8:00 p.m. Contact Doug Beard, email@example.com, 571-265-4623.
The U.S. Geological Survey has developed analysis software to facilitate research of chemical mixtures. The new software consists of four computer programs to help hydrologists, toxicologists, and other professionals investigate chemical mixtures in the environment.
The study of mixtures is difficult largely because of the enormous number of possible environmental mixtures: the number of unique combinations grows exponentially with the number of co-occurring chemicals.
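The combinatorics are easy to state: with n co-occurring chemicals, the number of distinct mixtures of two or more chemicals is 2^n - n - 1 (every subset except the empty set and the n single chemicals). The short Python example below tabulates the growth; how the USGS software defines and filters mixtures in practice may of course differ.

    # Count possible mixtures (subsets of size >= 2) for n chemicals.
    for n in (5, 10, 20, 30):
        print(f"{n:>2} chemicals -> {2**n - n - 1:,} possible mixtures")
    # 5 -> 26; 10 -> 1,013; 20 -> 1,048,555; 30 -> 1,073,741,793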
"Studying chemical mixtures is important because most organisms, including people, are exposed to mixtures in their environments, and little is known about their potential health effects," said Jon Scott, a retired USGS scientist and primary author. "The software includes tools for investigating which chemicals are found in mixtures, how often the mixtures occur in the environment, and the concentrations of mixture components relative to various health benchmarks."
The new tool is documented in the online report, Software for Analysis of Chemical Mixtures—Composition, Occurrence, Distribution, and Possible Toxicity, by Jonathon Scott, Kenneth Skach, and Patricia Toccalino. The software described in the report, along with other USGS programs, can be obtained online.
The mixture-analysis software was developed by the USGS National Water Quality Assessment (NAWQA) Program to document methods for mixture analysis and serve as a foundation for future studies. The NAWQA Program conducts regional and national assessments of the Nation’s water quality to provide an understanding of water-quality conditions, whether conditions are getting better or worse over time, and how natural features and human activities affect those conditions.