The newest sets of US Topo maps cover the states of Texas and New York. The 4,309 quadrangles for Texas and 972 quads covering New York replace the existing US Topo maps for those states, and will be added to the USGS Historical Topographic Map Collection. All of these maps are available for free download from The National Map and the USGS Map Store website.
Last September the USGS marked the important milestone of completing the initial round of US Topo map production for the 48 contiguous states. The agency is continuing to improve the US Topo map product, moving into the next round of national map revisions. Hawaii is currently in production and Alaska production will start later this year.
"The US Topo is a dynamic product, and the new maps over Texas and New York demonstrate our commitment to a very aggressive three-year revision cycle while at the same time adding new content," said Mike Cooley, the US Topo Project Manager. "I encourage you to take a look at these maps and drop us a comment on how we are doing via our drop box, as your input is important to us."
New feature additions and improvements on the updated US Topo maps include:
- Woodland tint derived from the National Land Cover Dataset
- Fire stations
- State and county boundaries
- Forest service boundaries
- Commercial roads in lieu of census roads
- Forest Service roads and road numbers
US Topo maps are derived from key layers of geographic data in The National Map and include content, such as high-resolution aerial photography, that was not available on older paper-based topographic maps. The new US Topo maps also provide modern technical advantages that support wider, faster public distribution and on-screen geographic analysis tools for users.
Future enhancements to the US Topo are scheduled to include additional tools and map content, such as a shaded relief layer, updated structures, enhanced transportation, additional federal boundaries, and Forest Service trails. Wyoming, which was added in the fall of 2012, also features Public Land Survey System (PLSS) data. The USGS expects to produce more than 18,500 revised quadrangles annually, with each US Topo map updated every three years.
For more information, go to: http://nationalmap.gov/ustopo/
WASHINGTON, DC - Secretary of the Interior Ken Salazar today released a report to Congress on the progress of the National Water Census, which is being developed at the U.S. Geological Survey (USGS) to help the nation address its critical water needs.
Peak flooding on the Red River at Fargo will likely occur sometime after April 15, according to U.S. Geological Survey streamgage data and National Weather Service information.
Scientists with the USGS and NWS meteorologists are closely monitoring the Red River at Fargo, N.D., and Moorhead, Minn., in anticipation of April flooding. USGS streamgages indicate that on Wednesday, April 3, the river still had not begun its spring rise, meaning that the impending 2013 flood will be considerably later than the large floods of 2009 and 2011. The 2013 flood likely will be later than the 1997 flood, which was exacerbated by an early April blizzard.
"The large floods at Fargo that have previously occurred in April—1952, 1965, 1969, 1979, and 1997—peaked from April 15 to April 19," said Gregg Wiche, Director of the USGS North Dakota Water Science Center. "Above normal snowpack and cold March temperatures have contributed to this year’s late melt."
According to NWS preliminary data, March 2013 was the sixth coldest March since hydrologic observations began in 1900. This year also had the deepest average snow depth for the last day of March since weather records began in Fargo in the mid-1880s. The NWS ranked March 2013 as Fargo's 14th coldest by average temperature, 12th snowiest, and 11th wettest (including rain and melted snow) on record.
The USGS compares current Red River conditions to past large floods on its Fargo flood tracking webpage.
Additional data for the USGS Red River at Fargo streamgage is available online.
NWS flood forecasts for the Red River at Fargo are available online.
This chart compares current gage height of the Red River at Fargo, N.D., to floods in 1997, 2009, and 2011 at the same location. The chart is available for download.
COLUMBIA, S.C. – South Carolina and Georgia water resource managers have powerful new tools at their fingertips to help make critical decisions on the timing and quantity of freshwater availability in coastal rivers.
Developed by the U.S. Geological Survey and Advanced Data Mining International, the two new decision support systems will help decision makers determine how much drinking water they will be able to pull from rivers in the face of climate change, sea-level rise and saltwater intrusion.
The user-friendly products were developed as part of a new report titled "Simulation of salinity intrusion along the Georgia and South Carolina coasts using climate change scenarios."
Research shows that the availability of freshwater in coastal streams will likely be affected in the future due to the combination of climate change and sea-level rise. The balance between freshwater and saltwater in coastal streams is primarily governed by the interaction between streamflow and sea level, and coastal rivers are constantly responding to changing streamflow and tidal conditions.
The decision support systems -- which include salinity simulation models, model controls, historical databases, and model output in a spreadsheet application -- were created to predict saltwater intrusion near the municipal intakes of cities and towns on the Georgia and South Carolina coasts that withdraw drinking water from the Atlantic Intracoastal Waterway and the Waccamaw River in South Carolina and from the Savannah River in Georgia.
"Predicting changes in the frequency of salinity intrusion events is critical for water-resource planning in the coastal region of the Southeastern United States due to the large number of municipal water-supply intakes in coastal rivers," said Paul Conrads, a USGS hydrologist and lead author of the study.
At a location just downstream from an intake that provides drinking water for the Myrtle Beach area, the decision support system estimated that a 1-foot rise in sea level would increase the frequency of salinity intrusion at the intake and double the amount of time that freshwater would not be available there.
"The decision support systems for the two rivers are essentially easy-to-use spreadsheets that integrate all the science, data, and models needed to perform high quality risk assessments," said Edwin Roehl, lead software developer for the project.
The study also evaluated the effect of climate-change projections from a global circulation model on salinity intrusion. Global circulation models predict changes in precipitation and temperature, which can affect streamflow to the coast and, in turn, salinity intrusion. The model projections indicate that, for one intake, the annual number of salinity intrusion events would increase and that there would be a seasonal shift, with most events occurring in the fall rather than the summer.
Although increases in sea level and reductions in streamflow show substantial effects that would have operational consequences for municipal water-treatment plants, the climate change scenarios shown in the report would allow water-resource managers to plan adaptation efforts that minimize the effect of increased salinity of source water. Adaptation efforts may include timing withdrawals during outgoing tides, increasing storage of raw water, timing larger releases of regulated flows to move the saltwater-freshwater interface downstream, and blending higher-conductance surface water with lower-conductance water from an alternative source such as groundwater.
CORVALLIS, Ore.— New scientific findings published in Ecology reveal that interactions of climate, soils, shrubs, and a natural nitrogen fertilization process affect regrowth of forests following wildfire in southern Oregon and northern California. Managers can use this information to consider post-fire management practices, including fertilization and shrub-removal.
Scientists studying forests that burned in 1987 discovered an interesting pattern in a natural fertilization process. The highest levels of natural nitrogen fertilization occurred at cool, dry sites where tree growth is slow and where nitrogen for growth is needed the least. In contrast, the lowest nitrogen additions occurred at warm, moist sites where tree growth and associated nitrogen needs are greatest.
This counterintuitive result occurred because natural nitrogen fertilization by nitrogen-fixing shrubs was suppressed by competition with oaks, maples, and other vegetation where tree growth was greatest, in warm, moist sites.
Nitrogen, an essential nutrient for tree growth, often is lost during a forest fire. An important way to recover forest fertility is an ecological process called biological nitrogen fixation. Some common shrubs, like Ceanothus, form unique relationships with bacteria and convert inert nitrogen gas from the air into forms of nitrogen in the soil that the trees can use for growth. Free-living soil bacteria also fix nitrogen. This natural process is the main source of nitrogen fertility in forests.
The scientists found that the rate at which Ceanothus shrubs added nitrogen to the system could be suppressed as tree biomass increased. Even though warm, wet sites stimulated the growth of nitrogen-fixing shrubs, these conditions stimulated the growth of other plants even more. Eventually, these changes limited the recovery of nitrogen fertility in the most productive sites.
According to Stephanie Yelenik, the lead author of the study, nitrogen additions by Ceanothus shrubs and by free-living soil bacteria provided an average of 7.5 pounds of nitrogen per acre per year. Over the 22 years following the major fire when the forest’s vegetation and nitrogen burned, this added up to about 165 pounds of nitrogen per acre. Although probably insufficient to fully replace wildfire nitrogen losses on the study sites, these contributions were substantial. Yelenik was affiliated with Oregon State University at the time of the study.
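The cumulative figure follows directly from the reported annual rate; a minimal sketch of the arithmetic (assuming, for simplicity, a constant average fixation rate over the whole post-fire period, which smooths over the study's year-to-year variation):

```python
# Back-of-the-envelope check of the nitrogen accumulation reported above.
# Assumes the 7.5 lb/acre/yr average held constant for all 22 years,
# a simplification of the study's measurements.
annual_rate_lb_per_acre = 7.5   # average N input from shrubs + soil bacteria
years_since_fire = 22           # years elapsed since the 1987 fire

total_n = annual_rate_lb_per_acre * years_since_fire
print(f"Cumulative N input: {total_n:.0f} lb/acre")  # prints: Cumulative N input: 165 lb/acre
```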
"There are important related results. Biological nitrogen fixation involving Ceanothus shrubs was up to 90 times greater than contributions from free-living soil microorganisms," said USGS scientist Steve Perakis, who participated in the study. "The contribution from Ceanothus would be even greater if other plants didn't compete so strongly. So ultimately competition among different plant species governed nitrogen input in the forests studied."
"The loss of nitrogen to wildfire has always been of concern to managers; however, the enormity of this loss only recently has been quantified," said Tom Sensenig, a U.S. Forest Service ecologist. "This study not only informs managers about the importance of shrubs for restoring nitrogen, but identifies the dynamics among species and the specific processes influencing nitrogen fixation and recovery across differing sites. Principally, this new information will help in developing post-fire management options and plans for specific forest types in this region. For example, on drier lower-quality sites, Ceanothus, the most prevalent nitrogen-fixing shrub identified, could be retained to the greatest extent possible by only treating the minimal vegetation necessary to assure seedling survival. On wetter, higher-productivity sites, treating more competitive species at a higher intensity may be more effective for maximizing nitrogen recovery, while benefiting seedling survival as well."
According to Yelenik, without additional fire or other forms of disturbance, Ceanothus largely disappears from productive sites in about 30 years as the tree canopy shades out the understory vegetation. Because Ceanothus is the major player in biological nitrogen fixation, from then on, nitrogen levels may remain consistently low in sites that have the necessary temperature and moisture conditions to promote rapid tree growth. On these sites, there may be opportunities to conduct vegetation management or to allow low-severity fires to burn as a way of encouraging the presence of nitrogen-fixing shrubs in the forest understory.
The study sites were located in forested mountains of the Klamath Region. This region is prone to wildfires, and the frequency and severity of the fires shape vegetation patterns. The study occurred 20 to 22 years after fire in sites that were salvage logged in the first 2 to 3 years after fire and then planted with conifer trees. Perakis believes the results are best applied to this region, but the interactions between climate, soils, shrubs, and natural nitrogen fertilization merit study elsewhere to see if similar constraints to nitrogen fixation occur in other forests recovering from fire.
PORTLAND, Ore. — The U.S. Geological Survey has developed the "Shoreline Management Tool," a GIS software program designed to test ways of managing land and water resources adjacent to a lake or stream. The new software tool will help water-, land-, and wildlife-resource managers balance competing needs when managing surface-water levels for water quantity, water depth, area of inundation, and area of dry land. These factors relate directly to water supply, water quality, shoreline habitat for plants and animals, and human use of water and land areas.
Assessing the effects of changing surface-water levels historically has been difficult because of the complexity of the analysis. The management tool enables the user to define criteria such as water depth and land-surface slope and aspect to identify areas where conditions meet the needs for certain land or water uses or that provide habitat suitable for specific plants and animals.
The tool comprises an interactive GIS program and spreadsheets that allow users to specify the input data and criteria for analysis, process the data, and create results in the form of maps, data tables, and graphs. The tool is designed for use by natural-resource managers with only limited expertise with GIS.
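As a rough illustration of the criteria-based screening described above, the sketch below filters a handful of made-up grid cells by water depth and slope. All names, values, and thresholds here are hypothetical and are not taken from the USGS software, which operates on real gridded GIS data within ArcMap:

```python
# Illustrative sketch of criteria screening, in the spirit of the
# Shoreline Management Tool. Cell IDs, depths, slopes, and thresholds
# are invented for this example.
cells = [
    # (cell_id, water_depth_ft, slope_pct)
    ("A1", 0.5, 1.0),
    ("A2", 2.5, 0.5),
    ("B1", 0.0, 6.0),
    ("B2", 1.2, 2.0),
]

# Hypothetical habitat criteria: inundated, shallow water on gentle slopes.
MAX_DEPTH_FT = 2.0
MAX_SLOPE_PCT = 3.0

suitable = [
    cell_id
    for cell_id, depth, slope in cells
    if 0.0 < depth <= MAX_DEPTH_FT and slope <= MAX_SLOPE_PCT
]
print(suitable)  # prints: ['A1', 'B2']
```

A real analysis would apply the same kind of per-cell test across a raster of lake or floodplain elevations at a chosen water-surface level, then summarize the passing cells as maps, tables, and graphs.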
Although the tool was initially developed to evaluate conditions in the lower Wood River Valley in the upper Klamath Basin, Oregon, it is designed to be transferable to other areas using easily generated or readily available data.
The Shoreline Management Tool was conceived and developed by the USGS with cooperation from the Bureau of Land Management.
The program is documented in the report "The Shoreline Management Tool—An ArcMap Tool for Analyzing Water Depth, Inundated Area, Volume, and Selected Habitats, with an Example for the Lower Wood River Valley, Oregon," by Daniel T. Snyder, Tana L. Haluska, and Darius Respini-Irwin, published as U.S. Geological Survey Open-File Report 2012–1247 and available online.
Fishes residing near oil platforms in southern California have similar contaminant levels as fishes in nearby natural sites, according to two recent reports by the U.S. Geological Survey, which were conducted to assist the Bureau of Ocean Energy Management (BOEM) in understanding potential consequences of offshore energy development.
Since the underwater portion of many offshore oil and gas platforms often provides habitat to a large number of fishes and invertebrates, some stakeholders have called for ocean managers to consider a "rigs-to-reefs" option during the decommissioning phase of a platform. This option would maintain some of the submerged structure to function as an artificial reef after oil and gas production has ended. The findings of this study address questions regarding how the industrial legacy of this kind of artificial reef may affect local fish populations.
Scientists analyzed the amount of contaminants from crude oil exposure present in three species of fish residing at oil platforms within the Santa Barbara Channel and the San Pedro Basin in California. The amount of contaminants present in fish tissue samples at seven platform sites was compared to samples at nearby natural sites. The two recent USGS reports are available online.
"As part of this study, we developed methods capable of detecting the extremely low levels of contaminants that we anticipated in these ocean fishes, especially since they avoid natural oil seeps," said USGS scientist Robert Gale. "These results will assist decision-makers in helping to protect the environment off the coast of California."
Some of the most important contaminants related to oil operations are polycyclic aromatic hydrocarbons (PAHs). Several PAHs are probable human carcinogens and many are toxic to fish and other aquatic life. Scientists were able to develop a new, more accurate method of sampling small traces of PAHs that may have been ingested and broken down within the fish. Samples were taken from species thought to be most sensitive to PAH contamination. These species, including Pacific sanddab, kelp rockfish, and kelp bass, also tend to be targeted by fishermen. PAH concentrations were either very low or undetectable in all fish sampled for this study.
"These important results suggest two things," said BOEM marine biologist Donna Schroeder. "First, existing offshore oil platforms provide food and shelter to local fishes without increasing their background contaminant loads. Second, since there is no detectable PAH signal from ongoing operations, we would expect that if the State of California wanted to implement a rigs-to-reefs program, there would likely be no change, pollution-wise, in the quality of the offshore environment, which appears to be pretty good."
Scientists also looked at industrial chemicals in the Pacific sanddab species, including polychlorinated biphenyls (PCBs), flame retardants (polybrominated diphenyl ethers, PBDEs), and organochlorine pesticides (OCPs). These contaminants were also found at low levels in all fish sampled, with no observed pattern between natural and platform habitats.
The Bureau of Ocean Energy Management promotes energy independence, environmental protection and economic development through responsible, science-based management of offshore conventional and renewable energy. While the agency is responsible for analyzing the potential environmental impacts of removing oil and gas platforms in federal waters, the Bureau of Safety and Environmental Enforcement approves applications for decommissioning and ensures that they are conducted safely and in compliance with federal regulations. For additional information on BOEM activities, visit http://www.boem.gov/.
LAFAYETTE - Tiny sea creatures no bigger than a thumbtack are being credited for playing a key role in helping provide healthy habitats for many kinds of seafood, according to a new study by the Virginia Institute of Marine Science and U.S. Geological Survey.
The little crustacean “grazers,” some resembling tiny shrimp, are critical in protecting seagrasses from overgrowth by algae, helping keep these aquatic havens healthy for native and economically important species. Crustaceans are shelled animals, ranging from tiny to very large, that include crabs, shrimp, and lobsters.
The researchers found that these plant-eating animals feast on the nuisance algae that grow on seagrass, ultimately helping maintain the seagrass that provides nurseries for seafood. The grazers also serve as food themselves for animals higher on the food chain.
Drifting seaweed, usually thought of as a nuisance, also plays a part in this process, providing an important habitat for the grazing animals that keep the seagrass clean.
“Inconspicuous creatures often play big roles in supporting productive ecosystems,” said Matt Whalen, the study’s lead author who conducted this work while at VIMS and is now at the University of California, Davis. “Think of how vital honeybees are for pollinating tree crops or what our soils would look like if we did not have earthworms. In seagrass systems, tiny grazers promote healthy seagrasses by ensuring algae is quickly consumed rather than overgrowing the seagrass. And by providing additional refuge from predators, fleshy seaweeds that drift in and out of seagrass beds can maintain larger grazer populations and enhance their positive impact on seagrass.”
USGS scientist Jim Grace, a study coauthor, emphasized that seagrass habitats are also quite beneficial to people.
“Not only do these areas serve as nurseries for commercially important fish and shellfish, such as blue crabs, red drum, and some Pacific rockfish, but they also help clean our water and buffer our coastal communities by providing shoreline protection from storms,” Grace said. “These tiny animals, by going about their daily business of grazing, are integral to keeping healthy seagrass beds healthy.”
Comparison of algae fouling on eelgrass with and without grazers. Copyrighted photo courtesy of Matthew Whalen/UC Davis.
In fact, the authors wrote, if not for the algal munching of these grazers, algae could blanket the seagrasses, blocking out sunlight and preventing them from photosynthesizing, which would ultimately kill the seagrasses. Seagrass declines in some areas are attributed partly to excessive nutrients in water bodies stimulating excessive algal growth on seagrasses.
“Coastal managers have been concerned for years about excess fertilizer and sediment loads that hurt seagrasses,” said J. Emmett Duffy of Virginia Institute of Marine Science and coauthor of the study. “Our results provide convincing field evidence that grazing by small animals can be just as important as good water quality in preventing nuisance algae blooms and keeping seagrass beds healthy.”
The USGS scientists involved in this study serve as members of a worldwide consortium of researchers examining the health of seagrasses. This research by Virginia Institute of Marine Science and USGS researchers is the first in a series of studies worldwide on seagrass ecosystems.
The study, “Temporal shifts in top-down versus bottom-up control of epiphytic algae in a seagrass ecosystem,” was published in the recent issue of Ecology, a journal by the Ecological Society of America.
The USGS is expanding the involvement of volunteers to enhance data collection about structures for The National Map.
This program, known as The National Map Corps (TNMC), focuses on encouraging citizens to collect data relating to structures, both by adding new features and by correcting existing data within The National Map database. These structures can include schools, hospitals, post offices, police stations, and other important public places.
Collaborative pilot projects in Colorado were recently used to test the concept of crowdsourcing. While the project is ongoing, early indications point to positive results and show the success of using TNMC volunteers to enhance data sets.
Over a trial period of ten months, 143 volunteers collected, improved, or deleted data on more than 6,400 structures in Colorado. The volunteers’ actions were accurate and exceeded USGS quality standards. In the Colorado pilot project the volunteer-collected data showed an improvement of approximately 25 percent in both location and attribute accuracy for existing data points. Completeness, or the extent to which all appropriate features were identified and recorded, was nearly perfect.
The significant results of the Colorado pilot have led to a phased, nationwide expansion of the crowdsourcing/volunteer project. The states in the first expansion of TNMC are Arkansas, Alaska, Colorado, Delaware, Georgia, Idaho, Maryland, Michigan, Montana, North Dakota, New Jersey, New Mexico, Ohio, Oregon, South Carolina, Utah, Washington, and West Virginia.
After an evaluation of the quality and procedures of the first group of states, the second set will be made available. Ultimately, by the end of 2013, the third batch of states will complete the expansion of the program.
“The response by volunteers in Colorado exceeded our expectations, both in terms of the number of volunteers and the quality of the data they collected,” said Kari Craun, the Director of the USGS National Geospatial Technical Operations Center. “The Volunteered Geographic Information (VGI) community represents a fantastic, untapped resource to assist the USGS in maintaining data that are part of The National Map.”
While some familiarity with the area a volunteer chooses is helpful, one doesn’t have to live near a particular place to contribute. The tools on the TNMC website, along with ancillary information available on the Internet, are generally sufficient to edit a distant area.
There have been several instances of crowd-sourced geographic information making significant contributions to research and databases in government, private sector, and non-profit organizations. The goal of the TNMC is to provide data for the nation’s primary federal mapping agency in its effort to provide accurate and authoritative spatial data via the web-based National Map.
The citizen geographers/cartographers who participate in this program will make a significant addition to the USGS’s ability to provide accurate information to the public. Data collected by volunteers become part of TNM Structures dataset which is available to users free of charge.
Without a network of volunteers, the desired information would not be collected this year and the existing data would not be updated. TNMC volunteers perform important work that would otherwise not be accomplished in the foreseeable future.
Becoming a volunteer for TNMC is easy; go to the National Map Corps website to learn more and to sign up as a volunteer. If you have access to the Internet and are willing to dedicate some time to editing map data, we hope you will consider participating!
Twenty-five years of monitoring and studying Alaska's volcanoes by the Alaska Volcano Observatory have improved global understanding of how volcanoes work and how to live safely with volcanic eruptions. Timely warnings from AVO throughout its 25-year history have helped reduce the impact of erupting volcanoes, protecting lives, property, and economic well-being.
On April 1, the Alaska Volcano Observatory, a cooperative program of the U.S. Geological Survey, the University of Alaska Fairbanks Geophysical Institute, and the Alaska Division of Geological and Geophysical Surveys, will mark its 25th anniversary.
"Since 1988, AVO has responded to over 70 eruptive events from Alaska’s 52 historically active volcanoes," said John Power, USGS geophysicist and scientist-in-charge of AVO. "Many of these eruptions affected local and international air traffic, oil production, the fishing industry, municipalities, businesses, and citizens."
The primary volcano hazard in Alaska is airborne ash, which endangers aircraft flying the busy North Pacific air routes connecting North America and Asia. The hazard played out dramatically on December 15, 1989, when a wide-body passenger jet encountered an ash cloud from Redoubt Volcano and lost power in all four engines over the Talkeetna Mountains. Fortunately, after more than 4 harrowing minutes of descent, the engines were restarted and the plane landed safely in Anchorage. This near-tragedy prompted renewed international efforts to more effectively address the hazards of airborne volcanic ash.
In addition to endangering aircraft, volcanoes near population centers can pose significant hazards to infrastructure and communities from ash fall, lahars, and other rapidly flowing mixtures of hot rock fragments, fluids, and gases.
AVO has developed a far-reaching volcano monitoring program in Alaska and partnered with federal, state, and municipal agencies to improve warnings of volcanic eruptions. AVO led the development of the standard Aviation Color Code to communicate hazards in a simple, consistent manner; this warning system is now endorsed by the International Civil Aviation Organization for use by volcano observatories worldwide. AVO pioneered cooperative programs with volcanologists in the Russian Far East, also home to dozens of explosive volcanoes that threaten aircraft, to create a system that warns the aviation industry of eruptions in Kamchatka and the Kuriles.
Over 25 years, AVO expanded from an early focus on just Cook Inlet volcanoes to a current monitoring and research program that includes daily observations of all 52 historically active volcanoes in Alaska. To address the aviation hazard, AVO expanded ground-based monitoring networks from Cook Inlet to volcanoes on the Alaska Peninsula and Aleutian Islands. Throughout the years, AVO and its colleagues developed innovative ways to track earthquake activity, ground deformation, and volcanic gas output, and analyze satellite imagery in the harsh Alaskan environment. Geologic studies of volcanoes and eruptions by AVO scientists provide insights into eruptive histories, information needed to assess future hazards and inform planning efforts.
AVO issues daily and weekly updates of volcanic activity in Alaska. The most recent information along with a wide range of volcano information, real-time data, and images is available on the AVO website. Volcanic activity notices are also served through Twitter @alaska_avo.
The University of Alaska Fairbanks Geophysical Institute turns observations into information, from the center of the Earth to the center of the Sun. Visit the UAFGI website for more information.
The Alaska Division of Geological and Geophysical Surveys determines the potential of Alaskan land for resources, groundwater, and geologic hazards. More information is available online.
People living near asphalt pavement sealed with coal tar have an elevated risk of cancer, according to a study in the journal Environmental Science and Technology. Much of this calculated excess risk results from exposures in children, age six or younger, to polycyclic aromatic hydrocarbons (PAHs) from the sealant.
"The increased cancer risk associated with coal-tar-sealed asphalt (CSA) likely affects a large number of people in the U.S. Our results indicate that the presence of coal-tar-based pavement sealants is associated with significant increases in estimated excess lifetime cancer risk for nearby residents," said E. Spencer Williams, Ph.D., principal author of the study and Baylor University assistant research scientist at the Center for Reservoir and Aquatic Systems Research in Baylor's College of Arts & Sciences.
Researchers from Baylor University in Waco, Texas, and the U.S. Geological Survey in Austin, Texas, are the first to report on the potential human health effects of PAHs in settled house dust and soil in living spaces and soil adjacent to parking lots sealed with coal-tar-based products.
"Exposure to these compounds in settled house dust is a particularly important source of risk for children younger than six years of age, as they are expected to ingest this material at higher rates," Williams said. "This indicates that the use of coal-tar-based pavement sealants magnifies aggregate exposures to PAHs in children and adults in residences adjacent to where these products are used and is associated with human health risks in excess of widely accepted standards."
Data on PAHs in settled house dust used for this analysis were published previously by the same authors. In that study, settled house dust and parking lot dust were sampled for 23 ground-floor apartments in Austin, Texas. The parking lot surfaces adjacent to the apartments were coal-tar-sealed asphalt, asphalt-based sealant over asphalt pavement, or unsealed concrete. Concentrations of PAHs were 25 times higher in house dust in residences adjacent to coal-tar-sealed pavement compared to those with other pavement types.
"This study was the first to find a strong association between a product or a behavior and PAHs in house dust," said Barbara Mahler, the USGS research hydrologist who oversaw the study.
For this study, doses and risk associated with residences adjacent to unsealed asphalt lots were considered relative to those adjacent to CSA parking lots. Benzo(a)pyrene concentrations in CSA-affected settled house dust were high relative to those reported in most parts of the U.S. where coal-tar-based sealcoat is not used (California and Arizona). Data for PAHs in coal-tar-sealed asphalt-affected soils and unsealed asphalt-affected soils are available from samples from New Hampshire and suburban Chicago.
The analysis did not consider exposure to the dust on the pavement itself, which has PAH concentrations tens to hundreds of times higher than those in house dust or soil, or inhalation of air over sealed pavement.
"Over time, about half of the PAHs in the sealcoat are released into the air, and concentrations in air are extremely high, particularly in the hours to days after application," said Peter Van Metre, USGS research hydrologist and author of two papers on volatilization of PAHs from sealcoat.
Sealcoat is a black, shiny substance sprayed or painted on the asphalt pavement of parking lots, driveways, and playgrounds to improve appearance and protect the underlying asphalt. An estimated 85 million gallons of coal-tar-based sealant are applied to pavement each year, primarily east of the Continental Divide in the U.S. and parts of Canada. Coal-tar-based pavement sealants are 15 to 35 percent coal-tar pitch, which has been classified as a human carcinogen by the International Agency for Research on Cancer. Over time, the dried sealant is worn away from pavement surfaces, and the resulting mobile dust particles can be transported into nearby homes.
"Although the analysis presented here is based on a limited dataset, the results indicate that biomonitoring might be warranted to characterize the exposure of children and adults to PAHs associated with coal-tar-based pavement sealant," Williams said. "Further investigation is also needed into the impacts of coal-tar-based pavement sealants on PAH concentrations in indoor and outdoor environments."
To learn more about Baylor University and its nationally ranked research program, visit its website.
Water Quality Differences Affect Aquatic Health of Urban Streams in Kansas City and Independence, Missouri
Downstream areas of the Blue River and Little Blue River basins are highly affected by urban development, according to a new U.S. Geological Survey study that compares the aquatic-life status of streams in the Kansas City, Mo. metropolitan area using macroinvertebrate populations as an indicator of stream health.
This study increases our understanding of aquatic life and water quality in urban streams. The differences in aquatic-life status of the Blue River and Little Blue River indicate how stormwater, wastewater discharges, and upstream reservoirs affect urban streams.
Macroinvertebrates, or animals without a backbone that are visible to the unaided eye, were collected in the Blue River basin in Kansas City, Mo., and the Little Blue River and Rock Creek basins in Independence, Mo., as part of two urban water-quality studies to assess the aquatic-life status of urban streams. Aquatic macroinvertebrates, which include insects, worms, mussels, and crayfish, are at the base of the food chain in aquatic environments. They are the main food source for many other animals such as fish and ducks, so scientists commonly use them to study the ability of a stream to support aquatic life.
"None of the samples collected from the Blue River had characteristics considered to be fully able to support aquatic life," said USGS co-author Heather Krempa. "However, about one out of ten spring samples and about four out of ten fall samples from the Little Blue River did have characteristics considered to be fully supporting of aquatic life."
Macroinvertebrate samples were collected from streams and analyzed several ways, including counting the total number and types of macroinvertebrates collected, grouping them based on feeding methods, and calculating the tolerance of the macroinvertebrates to pollution and environmental stress. Samples were scored to provide information about the stream at the sample location and were compared among sites. The aquatic-life status scores for the Little Blue River and its tributaries were higher, indicating more optimal conditions, than for the Blue River and its tributaries.
A Stream Condition Index that combines several different measures of macroinvertebrate populations was used to describe and assign three categories to the stream sites: non-, partially, and fully biologically supporting.
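The scoring idea described above can be sketched as follows; the metrics, weights, and thresholds here are invented for illustration and are not the study's actual Stream Condition Index.

```python
# Hypothetical sketch: combine three 0-8 metric scores into an index and
# bin the result into the three support categories. The metrics and
# thresholds are invented, not the study's actual Stream Condition Index.
def condition_category(index_score):
    if index_score >= 16:
        return "fully supporting"
    elif index_score >= 10:
        return "partially supporting"
    return "non-supporting"

metrics = {"taxa_richness": 6, "feeding_balance": 5, "tolerance": 3}
score = sum(metrics.values())
print(score, condition_category(score))  # 14 partially supporting
```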
Wastewater-treatment plant discharges during low flows and combined sewer overflows into the Blue River lower aquatic-life scores and likely reduce water quality. Separate stormwater sewer system and reservoir releases to the Little Blue River may raise water quality and aquatic-life scores.
The study, "Assessment of Macroinvertebrate Communities in Adjacent Urban Stream Basins, Kansas City, Missouri, Metropolitan Area, 2007 through 2011," has been released as USGS Scientific Investigations Report 2012-5284 and is available online.
WASHINGTON -- NASA and the Department of the Interior's U.S. Geological Survey (USGS) have released the first images from the Landsat Data Continuity Mission (LDCM) satellite, which was launched Feb. 11.
The natural-color images show the intersection of the United States Great Plains and the Front Range of the Rocky Mountains in Wyoming and Colorado. In the images, green coniferous forests in the mountains stretch down to the brown plains with Denver and other cities strung south to north.
LDCM acquired the images at about 1:40 p.m. EDT March 18. The satellite's Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) instruments observed the scene simultaneously. The USGS Earth Resources Observation and Science Center in Sioux Falls, S.D., processed the data.
"We are very excited about this first collection of simultaneous imagery," said Jim Irons, LDCM project scientist at NASA's Goddard Space Flight Center in Greenbelt, Md. "These images confirm we have two healthy, functioning sensors that survived the rigors of launch and insertion into Earth orbit."
Since launch, LDCM has been going through on-orbit testing. The mission operations team has completed its review of all major spacecraft and instrument subsystems, and performed multiple spacecraft attitude maneuvers to verify the ability to accurately point the instruments.
The two LDCM sensors collect data simultaneously over the same ground path. OLI collects light reflected off the surface of Earth in nine different regions of the electromagnetic spectrum, including bands of visible light and near-infrared and short-wave-infrared bands, which are beyond human vision. TIRS collects data at two longer wavelength thermal infrared bands that measure heat emitted from the surface.
By looking at different band combinations, scientists can distinguish features on the land surface. These features include forests and how they respond to natural and human-caused disturbances, and the health of agricultural crops and how much water they use. Data from LDCM will extend a continuous, 40-year-long data record of Earth's surface from previous Landsat satellites, an unmatched, impartial perspective that allows scientists to study how landscapes all across the world change through time.
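As a simple illustration of the band math described above, the widely used normalized difference vegetation index (NDVI) contrasts near-infrared and red reflectance to separate healthy vegetation from bare ground. The sketch below uses toy reflectance values rather than actual Landsat pixels; for Landsat 8, the red and near-infrared measurements come from OLI bands 4 and 5.

```python
import numpy as np

# Illustrative band math (not an official USGS product): NDVI scores
# healthy vegetation high and bare ground near zero.
def ndvi(nir, red):
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

# Toy reflectance values for two pixels: vegetated vs. bare ground.
nir = np.array([0.50, 0.30])
red = np.array([0.10, 0.25])
print(ndvi(nir, red))  # vegetated pixel ~0.67, bare ground ~0.09
```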
"These first scenes from the new Landsat satellite continue the remarkable output from the Landsat program with better, more useful imagery and information," said Matthew C. Larsen, associate director for climate and land use change at the U.S. Geological Survey in Reston, Va. "We are gratified that this productive partnership between USGS and NASA has maintained the continuity and utility of this essential satellite tool, providing the foundation for land and water management around the globe."
As planned, LDCM currently is flying in an orbit slightly lower than its operational orbit of 438 miles (705 kilometers) above Earth's surface. As the spacecraft's thrusters raise its orbit, the NASA-USGS team will take the opportunity to collect imagery while LDCM is flying under Landsat 7, also operating in orbit. Measurements collected simultaneously from both satellites will allow the team to cross-calibrate the LDCM sensors with Landsat 7's Enhanced Thematic Mapper-Plus instrument.
"So far, our checkout activities have gone extremely well," said Ken Schwer, LDCM project manager at Goddard. "The mission operations team has done a tremendous job getting us to the point of imaging Earth." During the next few weeks, this team will calibrate the instruments and verify they meet performance specifications.
After its checkout and commissioning phase is complete, LDCM will begin its normal operations in May. At that time, NASA will hand over control of the satellite to the USGS, which will operate it throughout its planned five-year mission life. The satellite will be renamed Landsat 8. USGS will process data from OLI and TIRS and add it to the Landsat Data Archive at the USGS Earth Resources Observation and Science Center, where it will be distributed for free via the Internet.
Visit LDCM First Images to view the images.
For further information about LDCM, visit LDCM Mission.
For status and technical information about all Landsat satellites, visit Landsat Missions.
A new rapid water-quality test may prevent unnecessary beach closures by providing accurate same-day results of bacteria levels, according to a new study by the U.S. Geological Survey.
With increasing outbreaks of waterborne illnesses, beaches have been at the forefront of recent research on human health risk. This new rapid water-quality test, developed by the Environmental Protection Agency (EPA), will help managers across the country determine whether beaches are safe for swimming in order to keep the public from getting sick. Previous tests could not provide same-day results, so managers had to decide whether to close a beach based on findings from the day before.
USGS scientists analyzed the accuracy of EPA’s rapid test by looking at past water-quality data from five beaches along Lake Michigan to determine what the outcomes would have been if the rapid test had been used. These findings were then compared to two older methods of testing, which require 24 hours for results. Scientists discovered that results from the rapid test met EPA’s safe-swimming criteria more often than the older tests. If this method had been used during the study period examined, the summers of 2009 and 2010, it might have prevented hundreds of beach-closure days and significantly decreased incidences of waterborne illness. The full report is available online.
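The decision-making difference between same-day and next-day results can be sketched as follows; the threshold and daily counts are invented for illustration and are not the study's data.

```python
# Invented numbers, for illustration only. With a same-day result, today's
# advisory reflects today's water; a 24-hour culture test means today's
# advisory can only reflect yesterday's water.
CRITERION = 104               # hypothetical bacteria threshold per 100 mL
daily_counts = [30, 150, 40]  # measured levels on three consecutive days

same_day = [c > CRITERION for c in daily_counts]
prev_day = [False] + same_day[:-1]  # yesterday's result drives today's call
print(same_day)  # [False, True, False] -- day 2 correctly flagged
print(prev_day)  # [False, False, True] -- day 2 missed, day 3 closed needlessly
```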
“This study provides beach managers with a virtual 'test drive' of this tool; it gives them an idea of what they can expect in terms of beach monitoring decision making,” said USGS scientist Meredith Nevers. “Our research shows that EPA’s rapid test can be an effective tool for beach managers to help keep their recreational beach goers happy and safe.”
Beach closures not only impact recreational users in the summertime, but they also create huge losses for the local economy. Studies have found that the value of a beach trip is between $20-$36 per person per day — revenue which may be lost to local economies when beaches are closed.
The new rapid test, called quantitative polymerase chain reaction for enterococci, is recommended by the EPA, but it is not a requirement. The test has been included in the 2012 EPA guidelines for safe levels of indicator bacteria, including: Escherichia coli (E. coli) and enterococci. The test can be used at both freshwater and marine beaches. To learn more about EPA’s recreational water quality criteria, visit their webpage.
The Powder River Basin (PRB) of Wyoming and Montana contains about 162 billion short tons (BST) of recoverable coal from a total of 1.07 trillion short tons of in-place resources according to a new USGS assessment. This assessment also estimates that 25 BST of those resources are currently economical to recover, the first such estimate released by the USGS for coal for an entire basin.
The Powder River Basin—a large geologic feature located in northeastern Wyoming and southeastern Montana—contains the largest deposits of low-sulfur subbituminous coal in the world. This study is significant because it illustrates that only a relatively small percentage of in-place coal resources are technically and economically recoverable.

Powder River Basin Assessment Map: A map showing the four assessment units for the 2013 USGS Powder River Basin coal assessment.
“The United States is well-known for its rich endowment of coal resources and our in-place estimates bear that out,” said USGS Acting Director Suzette Kimball. “It’s important to note, however, the substantial difference between what is in-place and what is technically recoverable, let alone economic. This new basin-wide assessment provides that critical link for government and private managers to make informed decisions.”
In 2011, the 16 mines in the PRB produced 462 million short tons (MST), about 42 percent of the Nation’s total coal production that year. Subbituminous coal is typically used in electric power generation.
The key to this study was taking advantage of the wealth of recently available geologic data from the interpretation of thousands of new drill logs from coalbed methane development in the PRB. More than 8,000 new drill holes were added to the original Gillette coal field database alone, and about 30,000 data points in all were used in the PRB assessment. The geologic information interpreted from this recent drilling provided an unprecedented level of detail about the basin's coal resources.
The USGS developed the geologic information that formed the basis of this assessment in cooperation with the Wyoming State Geological Survey and the Montana Bureau of Mines and Geology.
The Basin was divided into four areas for assessment: the Montana Powder River Basin, the Northern Wyoming Powder River Basin, the Gillette coal field, and the Southwestern Wyoming Powder River Basin.
Within these four areas, the USGS assessed coal resources for 47 coal beds. The three largest beds by resource are the Canyon coal bed, the Anderson coal bed, and the Smith coal bed. These three coal beds together represent about 38 percent of the total coal resources for the Powder River Basin.
To arrive at the estimate of recoverable coal and 25 BST of reserves, USGS scientists selected portions of those coal beds from the total in-place resources that were deemed both shallow and thick enough to be recoverable using current surface mining technology. Ten conceptual mine models were developed to account for the differences in coal bed geology, using proven mining techniques for each of the four assessment areas of the PRB. Then, estimated mining costs were calculated for all of the modeled coal resources. Finally, those resources that could be produced at or below the current sales price for PRB coal were designated reserves.
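The final screening step described above can be sketched as a simple cost filter; the tonnages, costs, and sales price below are invented for illustration and are not USGS figures.

```python
# Illustrative cost filter (invented numbers, not USGS figures):
# recoverable tonnage whose estimated mining cost is at or below the
# current sales price counts as reserves.
SALES_PRICE = 12.0  # hypothetical dollars per short ton

# (tonnage in billions of short tons, estimated mining cost in $/short ton)
recoverable = [(5.0, 8.5), (10.0, 11.0), (4.0, 13.5), (6.0, 16.0)]

reserves = sum(tons for tons, cost in recoverable if cost <= SALES_PRICE)
total = sum(tons for tons, _ in recoverable)
print(f"{reserves} BST of {total} BST recoverable qualify as reserves")
```

As the sketch shows, a change in the sales price or mining costs moves tonnage in or out of the reserves category without changing the recoverable total.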
The current estimate of 25 BST of reserves does not mean that this is all the coal that remains mineable. The size of reserves changes because mining costs and coal sales prices are subject to fluctuation based on market conditions – recoverable resources become reserves with favorable changes in costs, demand, and prices.
The USGS Energy Resources Program research efforts yield modern, digital assessments of the quantity, quality, location, and accessibility of the Nation’s coal resources.
To learn more about this or other geologic assessments, please visit the USGS Energy Resources Program website. Stay up to date with USGS energy science by subscribing to our newsletter or by following us on Twitter.
For the first time, anyone can find out the current conditions on thousands of rivers and streams across the country, right from their phone, using the USGS' latest system, WaterNow. WaterNow makes the water conditions monitored by more than 16,000 streamgages and other sites across the country available via text or email.
Like its predecessor and companion program, WaterAlert, WaterNow seeks to make USGS gage information for streamflow, groundwater levels, springs, water quality, and lake levels more readily available to the general public. These data have been available for over 10 years at USGS Water Data for the Nation, which requires a web browser to access.
"USGS is the world's largest provider of hydrologic information, and our streamgages are a vital part of that water infrastructure," said USGS Associate Director for Water Bill Werkheiser. "WaterNow brings that information straight from our streamgages to your smartphone, and keeps USGS data flowing at the cutting edge."
Knowing about current water conditions is important for a variety of purposes, from disaster planning and response to recreation. For example, water levels in streams can be checked during floods to guide evacuations or on a bright weekend morning to plan a day of paddling.
Land and resource managers can benefit from WaterNow too. Not only can water levels be obtained, but also water temperatures can be checked to determine when it is necessary to release water from a reservoir to protect downstream trout fisheries.
WaterNow expands on the service provided by the USGS WaterAlert service. WaterAlert provides a notification only when conditions exceed a threshold set by a user, whereas WaterNow provides data anytime on demand. These data are typically collected at 15- to 60-minute intervals, stored onsite, and then transmitted to USGS offices every hour.
So how does it work? It's easy! All you have to do is find the gage you are interested in, using the instructions on the WaterNow page, and then send a message to WaterNow@usgs.gov with the site number of the gage you would like updates from.
You will receive a reply within a few minutes that includes the most recent values of water level and streamflow, if available for that site. These data are by far the most frequently requested; therefore, they have been pre-set as defaults. Data values are also available for other kinds of data-collection sites such as wells, springs, and lakes.
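The request itself is just an email whose body is the site number. A minimal Python sketch of composing such a message is below; the address comes from the article, the site number is only an example, and actually sending the message would require access to a mail server.

```python
from email.message import EmailMessage

# Sketch of composing a WaterNow request. The address and workflow come
# from the article; the site number is only an example.
def waternow_request(site_number: str) -> EmailMessage:
    msg = EmailMessage()
    msg["To"] = "WaterNow@usgs.gov"
    msg["Subject"] = "WaterNow request"
    msg.set_content(site_number)  # the body is simply the gage's site number
    return msg

msg = waternow_request("01646500")
print(msg["To"], msg.get_content().strip())
```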
For complete instructions and guides on what types of data might be useful to you or which streamgages might be of interest to you, visit the USGS WaterNow site!
Editors: The BaSE tool and supporting documentation can be found online.
NEW CUMBERLAND, Pa. -- Water resource managers can now estimate daily baseline streamflows in a matter of minutes for any location along Pennsylvania's waterways. The Baseline Streamflow Estimator, called "BaSE," provides users with estimated daily mean streamflow, minimally altered by human activities, for locations on Pennsylvania streams that don’t have streamgages. Pennsylvania is one of the first states in the nation to have such a tool.
"BaSE provides water-resource managers with nearly 50 years of daily mean streamflow for ungaged sites in a matter of minutes that they can use for their projects. These daily values can then be used to generate a number of streamflow statistics that may be needed for decision making," said Marla Stuckey, USGS hydrologist and project lead in Pennsylvania.
Water-resource managers use daily mean streamflow to evaluate withdrawal, allocation, and wastewater permit applications and to assess the health of the Commonwealth's streams. Historically, it has been difficult, costly, and time intensive to estimate daily mean streamflow for stream locations that were not gaged, or monitored. Now, BaSE allows users to estimate daily mean streamflow values and daily hydrographs by entering a few basic basin characteristics in an easy-to-use tool. The output is a summary spreadsheet, containing information about the location of interest, including daily mean streamflow for every day from 1960 to 2008.
BaSE relies on a methodology that uses flow-duration curves, which illustrate the percentage of time, or probability, that a flow value in a stream will equal or exceed a particular value. Flow-duration curves are generated for reference streamgage locations with monitored streamflow and the curves are transferred to ungaged locations to estimate daily mean streamflow.
BaSE chooses the most appropriate reference streamgage for the ungaged location and applies newly developed regression equations to convert the transferred flow duration curve to streamflow at the ungaged location.
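The curve-transfer idea can be sketched numerically: convert each daily flow at the reference gage to an exceedance probability on the reference curve, then read the flow with that same probability off the ungaged site's estimated curve. The curves and scaling factor below are invented for illustration and stand in for BaSE's regression equations.

```python
import numpy as np

# Illustrative flow-duration-curve transfer (invented curves; BaSE's
# actual regression equations are not reproduced here).
exceed_prob = np.array([0.05, 0.25, 0.50, 0.75, 0.95])   # fraction of time flow is exceeded
ref_curve = np.array([900.0, 400.0, 200.0, 90.0, 30.0])  # reference gage flows, cfs
est_curve = ref_curve * 0.4  # hypothetical estimated curve for a smaller ungaged basin

def transfer(daily_flow_ref):
    # flow -> exceedance probability at the reference gage
    # (np.interp needs increasing x, so reverse the descending curve)
    p = np.interp(daily_flow_ref, ref_curve[::-1], exceed_prob[::-1])
    # same probability -> flow on the ungaged site's curve
    return np.interp(p, exceed_prob, est_curve)

print(transfer(np.array([200.0, 400.0])))  # 80.0 and 160.0 cfs at the ungaged site
```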
A USGS Scientific Investigations Report describing BaSE can be found online.
MENLO PARK, Calif. — Predicted population increases in this century can be expected to translate into more people dying from earthquakes. There will be more individual earthquakes with very large death tolls, as well as more people dying during earthquakes than ever before, according to a newly published study led by U.S. Geological Survey engineering geologist Thomas L. Holzer.
Holzer and his USGS coauthor James Savage studied earthquakes with death tolls of more than 50,000, which they define as catastrophic, and reported global death tolls from roughly 1500 A.D. to the present. Comparing those events to estimates of world population, they found that the number of catastrophic earthquakes has increased as population has grown. After statistically correlating the number of catastrophic earthquakes in each century with world population, they were able to use new (2011) 21st-century population projections by the United Nations to project that approximately 21 catastrophic earthquakes will occur in the 21st century, a tripling of the seven that occurred in the 20th century. They also predict that total deaths in the century could more than double to approximately 3.5 million people if world population grows to 10.1 billion by 2100 from 6.1 billion in 2000.
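The statistical approach can be sketched as a simple linear regression of catastrophic-quake counts per century against world population; the numbers below are invented for illustration and are not Holzer and Savage's data.

```python
import numpy as np

# Invented numbers for illustration, not Holzer and Savage's data:
# regress catastrophic-quake counts per century on world population,
# then project for the UN's 2100 population figure of 10.1 billion.
pop = np.array([1.0, 1.7, 3.0, 6.1])  # hypothetical mid-century populations, billions
quakes = np.array([2, 3, 4, 7])       # hypothetical catastrophic quakes per century

slope, intercept = np.polyfit(pop, quakes, 1)
projected = slope * 10.1 + intercept  # projected count at 10.1 billion people
print(round(projected, 1))
```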
"This prediction need not be a prophesy: the National Earthquake Hazard Reduction Program (NEHRP) in the U.S. can be a model for how science can inform engineering designs that are adopted into life-saving building codes in earthquake-prone regions," said USGS Associate Director for Natural Hazards David Applegate. "I also cannot stress enough the value of educated citizens — those who understand the natural hazards of this planet and are empowered to take action to reduce their risk."
Four catastrophic earthquakes have already struck since the beginning of the 21st century, including the 2004 Sumatra-Andaman earthquake (and tsunami) and 2010 Haiti earthquake that each may have killed over 200,000 people. The study explains this increase in lethal earthquakes. It is not that we are having more earthquakes; it is that more people are living in seismically vulnerable buildings in the world's earthquake zones.
Holzer's study underscores the need to build residential and commercial structures that will not collapse and kill people during earthquake shaking.
"Without a significant increase in seismic retrofitting and seismic-resistant construction in earthquake hazard zones at a global scale, the number of catastrophic earthquakes and earthquake fatalities will continue to increase and our predictions are likely to be fulfilled," Holzer said.
The study, "Global Earthquake Fatalities and Population," is available online.
RESTON, Va. — For the first time, U.S. Geological Survey scientists have mapped long-term average evapotranspiration rates across the continental United States – a crucial tool for water managers and planners because of the huge role evapotranspiration plays in water availability.
Why are evapotranspiration rates so important to know? It's because the amount of water available for people and ecosystems is the amount of annual precipitation – that is, snow or rain – minus the amount of annual evapotranspiration. Evapotranspiration itself is the amount of water lost to the atmosphere from the ground surface. Much of this loss is the result of the "transpiration" of water by plants, which is the plant equivalent of breathing. Just as people release water vapor when they breathe, plants do too.
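The water balance described above is simple arithmetic; as a sketch with illustrative numbers:

```python
# Water-balance arithmetic with illustrative numbers: the water available
# to people and ecosystems is precipitation minus evapotranspiration.
def available_water(precip_in, et_in):
    return precip_in - et_in

# e.g., a basin receiving 40 inches of precipitation and losing 24 to ET
print(available_water(40.0, 24.0))  # 16.0 inches remain for streams and aquifers
```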
"Since evapotranspiration consumes more than half of the precipitation that happens every year, knowing the evapotranspiration rates in different regions of the country is a solid leap forward in enabling water managers and policy makers to know how much water is available for use in their specific region," said Bill Werkheiser, associate director for water at the USGS. "Just as importantly," he added, "this knowledge will help them better plan for the water availability challenges that will occur as our climate changes since transpiration rates vary widely depending on factors such as temperature, humidity, precipitation, soil type, and wind."
In spite of its importance, evapotranspiration has been difficult to measure accurately on a regional or continental scale. To produce these maps, USGS scientists Ward Sanford and David Selnick examined Landsat satellite imagery for climate and land-cover data from 1971 to 2000 and streamflow data for more than 800 watersheds for the same time period. This information allowed them to generate a mathematical equation that can be used to more precisely estimate long-term evapotranspiration at any location in the continental United States.
"The map of the long-term average annual evapotranspiration rates for different areas should be immensely helpful for ensuring the long-term, sustainable use of water in different regions, especially since forecasted climate change will, in many places, change the amount of precipitation and evapotranspiration that occurs," Sanford said. "This tool, for example, allows water managers to quantify surface water runoff to reservoirs or water recharge to aquifers. It will also enable natural resource planners to understand the water needed for healthy-functioning ecosystems."
One interesting finding illustrated in the maps is that in certain regions of the United States, such as the High Plains and the Central Valley of California, evapotranspiration exceeds the amount of precipitation because water is imported from other regions. The map also shows that the Pacific Northwest has many areas with low evapotranspiration-to-precipitation ratios because of the area’s very high rainfall and low-to-moderate temperatures. In contrast, counties in the arid Southwest have evapotranspiration rates that usually exceed 80 percent of precipitation.
The research was published this week in the Journal of the American Water Resources Association, where the article and maps are available online.
Interior Prepares to Conduct Landsat 8 Scientific Programs After Successful Launch of Latest Earth-Observing Satellite
VANDENBERG AFB, CA – Secretary of the Interior Ken Salazar today joined NASA Administrator Charles F. Bolden, Assistant Secretary of the Interior for Water and Science Anne Castle, United States Geological Survey (USGS) Director Dr. Marcia McNutt and other Interior and NASA officials to launch the nation's newest Earth-observing satellite into space.
Launched by NASA from Vandenberg Air Force Base in California, the satellite is expected to transmit images and data about the Earth within 100 days. Landsat data from more than 3 million current and archived images of Earth – available free of charge through the Interior Department’s USGS – have spurred extensive research and innovations, ranging from scientific investigations around the globe to the development of applications like Google Earth.
"Landsat has been delivering invaluable scientific information about our planet for more than forty years," said Salazar. "It's an honor to be a part of today's launch to ensure that this critical data will continue to help us better understand our natural resources and help people like water managers, farmers, and resource managers make informed decisions."
"Landsat is a centerpiece of NASA's Earth Science program, and today's successful launch will extend the longest continuous data record of Earth's surface as seen from space," NASA Administrator Charles Bolden said. "This data is a key tool for monitoring climate change and has led to the improvement of human and biodiversity health, energy and water management, urban planning, disaster recovery and agriculture monitoring – all resulting in incalculable benefits to the U.S. and world economy."
The Landsat program is a joint partnership between NASA and the USGS. NASA develops the remote-sensing instruments and spacecraft, launches satellites, and validates their performance. The USGS then assumes ownership and operation of the satellites, in addition to managing ground-data reception, archiving, product generation, and distribution. The result is a long-term, impartial register of natural and human-induced changes on the global landscape.
"Seeing the world from a birds-eye view has been a primal desire since the earliest days of our civilization, in order to gain a better understand of how the world operates," said Interior Assistant Secretary for Water and Science Anne Castle. "In an era of rapid world population growth, climate change, and increased competition for natural resources, we can't afford not to have the long-term, objective perspective that Landsat's eyes on the Earth provide."
From a distance of more than 400 miles above the Earth's surface, a single Landsat scene can record the condition of hundreds of thousands of acres of grassland, agricultural crops, or forests. Each Landsat image gives a view as broad as 12,000 square miles per scene while describing land cover in units the size of a baseball diamond.
The Landsat program also offers substantial economic benefits, including an estimated $100 million per year in management of water for irrigated agriculture in western states.
Federal, state and local agencies rely on Landsat as a data source on wildfires, consumptive water use, land cover change, crop conditions, rangeland status and wildlife habitat. Landsat images can show where vegetation is thriving and where it is stressed, where droughts are occurring, where wildland fire is a danger, and where erosion has altered coastlines or river courses.
"Over the last 40 years, students, land managers, scientists, relief workers, water managers, and ordinary citizens from nearly 200 nations have come to rely on Landsat as the authoritative source of unbiased information on changes in our planet's solid surface," said USGS Director Marcia McNutt. "The launch of Landsat 8, in the nick of time as Landsat 5 is decommissioned and Landsat 7 is experiencing continued hardware failures, allows us to continue to provide this vital information to the world."
Salazar today also released a new strategy to strengthen and inspire education and careers in science, technology, engineering and math (STEM). Interior's STEM strategic plan is designed to provide a five-year framework for engaging the American public—particularly youth underrepresented in STEM fields—to become scientifically literate stewards of our natural and cultural resources while building a future workforce that fully represents the diversity of America for the 21st century.
"We need to make sure that there's a next generation of cutting edge scientists to design and run Landsat 9, 10, 11 and beyond," said Salazar. "This new plan will pave the way for our youth to choose the innovative and technical careers that are increasingly needed in federal service and in managing increasingly complex natural and cultural resource challenges."
STEM careers can be found at all of Interior's nine agencies including not only USGS—the nation's premier science agency in various disciplines—but also the National Park Service, U.S. Fish and Wildlife Service, Bureau of Land Management, Bureau of Reclamation, Bureau of Ocean Energy Management and others.
Interior employs nearly 15,000 scientists and engineers, many of whom will be retiring in the coming decade. By emphasizing fields of study in STEM, the Department is better positioned to fill these critical gaps.
Over the next five years, Interior plans to engage more partners in science education, to better coordinate access to the Department's educational resources, to engage students and other citizens in place-based learning and service opportunities, and to strengthen career training and workforce development.
The five-year STEM plan is available online.