RESTON, Va.-- Aftershocks from the 2011 Virginia earthquake have helped scientists identify a previously unknown fault zone on which the earthquake occurred. The research marked one of the few times in the Eastern United States that a fault zone hosting a magnitude-5-or-greater earthquake was clearly delineated by aftershocks, and it is just one finding in a 23-chapter book presenting new information on the Virginia earthquake and eastern seismic hazards.
Research by the U.S. Geological Survey along with its partners and collaborators defined the newly recognized fault zone, which has been named the “Quail” fault zone. The USGS and others worked cooperatively to capture accurate locations of hundreds of aftershocks by deploying portable seismic instruments after the earthquake. Most of these aftershocks were in the Quail fault zone, and outlying clusters of shallow aftershocks helped researchers identify and locate other active faults. Knowing where to look for the active faults helped focus geologic mapping, geophysical imaging and other technologies to better understand earthquakes in the Central Virginia Seismic Zone and the Eastern U.S.
The book includes contributions by Virginia Tech, the Virginia Department of Mines, Minerals, and Energy and the U.S. Nuclear Regulatory Commission, among many others.
“Studies of the Virginia earthquake have improved our understanding of earthquakes and seismic hazards in Eastern North America,” said USGS geologist Wright Horton. “The Virginia earthquake served as a ‘wakeup call’ for many residents of the Eastern U.S., where the probability of major earthquakes is fairly low, but many buildings are vulnerable to damage during earthquakes.”
The new book, “The 2011 Mineral, Virginia, Earthquake, and Its Significance for Seismic Hazards in Eastern North America,” is a collection of articles covering a broad range of subjects relating to the 2011 earthquake. Highlights from the book include:
- Earthquake shaking and its effects, such as widespread changes in groundwater levels, occurred at greater distances from the source in this and other Eastern U.S. earthquakes than in earthquakes of comparable magnitude on the West Coast
- Shaking intensities and related damage were more severe along the northeast trend of the Appalachians than in northwestward directions across this trend
- Evidence that the earthquake ground motion was amplified in parts of D.C. and other areas around the Chesapeake Bay with thicker coastal plain sediments or artificial fill is stimulating further studies to determine how much seismic shaking is amplified by local geological conditions
- Analysis of data on residential property damage in the epicentral area delineates a “bull’s-eye” distribution of shaking intensities and also confirms that damage is influenced by the age and construction of homes
- Damage to unreinforced masonry buildings in D.C., as far as 80 miles from the epicenter, highlights the seismic risk to buildings in Eastern North American cities. Ground motions extend to greater distances from the epicenter on the East Coast than in other parts of the U.S., and buildings there are not as well designed to withstand these motions as in other locations
- Seismic reflection imaging—which is similar to medical sonograms—and geophysical flight surveys of the Earth’s magnetic and gravity fields were used to image geologic structures down to about 5 miles underground where the earthquake occurred
- Airborne laser swath mapping using lidar, and radiometric flight surveys—which mapped radioactive elements in rocks and soils within a few feet of the land surface—identified and accurately located preexisting linear features including faults associated with aftershock clusters for detailed surface geologic mapping and trenching studies
- New geologic mapping and trenching reveal previously unknown faults and evidence that the faults were active more than once in the past
- Recorded ground motions from the Virginia earthquake were consistent with previous USGS estimates for the region, and they are helping to improve the assessments of potential earthquake ground motions used to design buildings that will be better able to withstand strong earthquakes
Earthquakes in Eastern North America are not as frequent or as well understood as those along Earth's tectonic plate boundaries, such as on the West Coast. The magnitude 5.8 Virginia earthquake was the largest to occur in the eastern U.S. since the 1886 earthquake near Charleston, South Carolina, and it may have been felt by more people than any other earthquake in U.S. history. It was felt over much of the Eastern U.S. and Southeastern Canada, triggered the automatic safe shutdown of a nuclear power plant and caused significant damage from Central Virginia to the National Capital Region. The earthquake provided a wealth of modern scientific and engineering data to better understand earthquakes and seismic hazards in Eastern North America.
The President’s fiscal year 2016 budget request for the U.S. Geological Survey is $1.2 billion, an increase of nearly $150 million above the FY 2015 enacted level. The FY16 budget reflects the vital role the USGS plays in advancing the President’s ongoing commitment to scientific discovery and innovation to support a robust economy, sustainable economic growth, natural resource management, and science-based decision-making for critical societal needs.
The budget request includes increases that ensure the USGS is at the leading edge of earth sciences research. It includes robust funding for science to inform land and resource management decisions, advance a landscape-level understanding of ecosystems, and develop new information and strategies to support communities in responding to climate change, historic drought, water quality issues, and natural hazards. The budget also funds science to support the Nation’s energy strategy, to help identify critical mineral resources, and to address the impacts of energy and mineral development on the environment.
“The USGS has a strong 136-year legacy of providing reliable science to decision-makers,” said Suzette Kimball, Acting USGS Director. “This budget request recognizes our unique capabilities with multi-disciplinary earth science research and will allow the USGS to meet societal needs for our Nation now and in the future.”
Key increases in the FY 2016 Budget are summarized below. For more detailed information on the President’s 2016 budget, visit the USGS Budget, Planning, and Integration website.
Meeting Water Challenges in the 21st Century
The FY16 budget provides an increase of $14.5 million above the FY 2015 enacted level for science to support sustainable water management. Meeting the Nation’s water resource needs poses increasing challenges for resource managers, who must contend with changes in the frequency and magnitude of floods and droughts. As competition for water resources grows for activities such as farming, energy production, and community water supplies, so does the need for information and tools to aid decision-makers. The budget provides increased funding across several USGS mission areas to support resource managers in understanding and managing competing demands related to water availability and quality and to enable adaptive management of watersheds to support the resilience of the communities and ecosystems that depend on them. This includes a $3.2 million increase for science to understand and respond to drought, a $4 million increase for water use information and research, a $2.5 million increase to study ecological water flows, a $1.3 million increase for stream flow information, and a $1.0 million increase to advance the National Groundwater Monitoring Network.
Powering Our Future and Supporting Sustainable Energy and Mineral Development
The 2016 USGS budget provides $9.6 million in program increases across the energy, minerals and environmental health portfolio for science to support the sustainable development of unconventional oil and gas resources, renewable energy sources such as geothermal, wind, and solar, critical minerals such as rare earth elements, and to address the environmental impacts of uranium mining.
Specifically, the budget includes a program increase of $1 million for mineral resources science to continue life-cycle analysis for critical minerals such as rare earth elements and to develop new science and tools to reduce the impacts of minerals extraction, production, and recycling on the global environment and human health. A life-cycle analysis will trace the flow of critical minerals from generation and occurrence through the consequences of human activity to ultimate disposition and disposal. The Nation faces key economic decisions within each stage of the resource life cycle. Scientific understanding is an essential input to these decisions. The program change will support new workforce capability to address the main thrusts of the President’s four working groups in the Office of Science and Technology Policy that are currently focused on critical and strategic materials essential to national security, economic vitality, and environmental protection.
Responding to Natural Hazards
The budget provides an increase of more than $6.6 million above the FY 2015 enacted level for natural hazard science. This includes an increase of $4.9 million to expand the Global Seismic Network used for worldwide earthquake monitoring, tsunami warning, and nuclear treaty verification monitoring and research in partnership with the Department of Energy and the Department of Defense. It also includes a $1.7 million increase to support space weather (solar flare) geomagnetic monitoring. The increase will also support the installation and operation of rapid-deployable streamgages and expand the library of flood-inundation maps to help manage flood response activities. The proposed increase will also support landslide, wildfire, and sinkhole response capabilities as well as provide disaster scenario planning products for emergency managers. Included in the request is funding to build on investments to continue development of an earthquake early warning system, with the goal of implementing a limited public warning system for the U.S. west coast by 2018, as well as continued investments in volcano monitoring networks and science.
Building a Landscape-Level Understanding of Our Resources
The budget includes $15.6 million to expand, enhance, and initiate ecosystem science activities to increase the understanding of the Nation’s landscapes and how they work. This includes budget increases of $6.7 million in support of critical landscapes. Specifically it provides a $4.2 million increase for the Arctic, a $1 million increase to study sagebrush landscapes that provide habitat for survival of greater sage-grouse, and a $1.5 million increase that supports science for Puget Sound, Columbia River, and the upper Mississippi River. USGS research will continue to support restoration of other priority ecosystems, such as Chesapeake Bay, Everglades, Great Lakes, California Bay Delta, and the Gulf Coast. The budget request also provides an increase of $2.2 million for research on invasive plants and animals that cause significant economic losses in the U.S. and transmit diseases to wildlife and people, and $1.6 million to study the decline of insects, birds, and mammals that pollinate agricultural and other plants. Finally, the budget increases funding by $5.1 million to support coastal resilience to hazards and adaptation to long-term change from sea-level rise and coastal erosion.
Foundations for Land Management
The President’s budget request includes an increase of $37.8 million to provide data and tools to help land and resource managers make informed decisions across the landscape and provide data and information to the public for use in a wide variety of applications. The budgets of USGS and NASA provide complementary funding to sustain the Landsat data stream, which is critical to understanding global landscapes. An increase of $24.3 million in the USGS budget supports the ground system portion of the Sustained Land Imaging Program, including funding for ground systems development for a Thermal Instrument Free Flyer and Landsat 9 (a rebuild of Landsat 8), and for receiving data from international partners. The increase also will enhance the accessibility and usability of data. Specifically, the budget includes a $4 million increase for Landsat science products for climate and resource assessments.
The budget provides increases for other foundational data and tools needed to support landscape-level understanding. For example, an increase of $3.7 million will expand three-dimensional elevation data collection using ifsar (interferometric synthetic aperture radar) for Alaska and lidar (light detection and ranging) elsewhere in the U.S. in response to growing needs for high-quality, high-resolution elevation data to improve aviation safety, to understand and mitigate the effects of coastal erosion, storms, and other hazards, and to support many other critical activities. A $1.8 million increase will enhance understanding of the benefits of the Nation’s ecosystem services, and a $1.1 million increase for the Big Earth Data Initiative will make high-value data sets easier to discover, access and use. The accessibility and usability of these data are critical for land management, hazard mitigation, and building a landscape-level understanding of our resources.
Supporting Community Resilience in the Face of a Changing Climate
The USGS plays an important role in conducting research and developing information and tools to support communities in understanding, preparing for, and responding to the impacts of global change. The budget includes an increase of $32 million above the FY 2015 enacted level for science to support climate resilience and adaptation. Climate change requires the Nation to prepare for more intense drought, heatwaves, wildfire, flooding, and sea level rise. These challenges are already impacting infrastructure, food and water supplies, and physical safety in communities across the Nation. Understanding potential impacts to communities, ecosystems, water, plant and animal species, and other resources is crucial to federal, state, tribal, local, and international partners as they develop adaptive and resilient strategies in response to climate change. The budget includes a $6.8 million increase in science for adaptation and resilience planning, an increase of $2.3 million for the USGS to provide interagency coordination of regional climate science activities across the Nation, an increase of $8.7 million to support biological carbon sequestration, and an increase of $11 million for the USGS to support the community resilience toolkit, which is a web-based clearinghouse of data, tools, shared applications, and best practices for resource managers, decision-makers, and the public.
The estimated value of mineral production increased in the United States in 2014, despite the decline in price for most precious metals, the U.S. Geological Survey announced today in its Mineral Commodity Summaries 2015.
The estimated value of mineral raw materials produced at mines in the United States in 2014 was $77.6 billion, an increase of 4.6 percent from $74.2 billion in 2013. U.S. economic growth supported the domestic primary metals industry and industrial minerals industry; however, weak global economic growth and the strong U.S. dollar limited U.S. processed mineral exports, which decreased to $108 billion in 2014 from $129 billion in 2013. Meanwhile, low-priced metal imports increased during most of 2014.
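The year-over-year change reported above can be verified with simple arithmetic. The short sketch below is illustrative only; it reproduces the 4.6 percent figure from the dollar values stated in the release.

```python
# Quick check of the year-over-year change in the value of U.S. mine
# production, using the figures stated in the release. Rounding follows
# the release's one-decimal convention.
value_2013 = 74.2  # billion USD, 2013
value_2014 = 77.6  # billion USD, 2014

pct_change = (value_2014 - value_2013) / value_2013 * 100
print(f"Change in production value: {pct_change:.1f}%")  # -> 4.6%
```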
The annual report from the USGS is the earliest comprehensive source of 2014 mineral production data for the world. It includes statistics on about 90 mineral commodities essential to the U.S. economy and national security, and addresses events, trends, and issues in the domestic and international minerals industries.
"Decision-makers and policy-makers in the private and public sectors rely on the Mineral Commodity Summaries and other USGS minerals information publications as unbiased sources of information to make business decisions and national policy," said Steven M. Fortier, Director of the USGS National Minerals Information Center.
Mineral commodities remain an essential part of the U.S. economy, contributing to the real gross domestic product at several levels, including mining, processing and manufacturing finished products. The United States continues to rely on foreign sources for raw and processed mineral materials. In 2014, imports supplied more than one-half of U.S. apparent consumption for 43 mineral commodities, up from 40 commodities in 2013. The United States was 100 percent import-reliant for 19 of those commodities, including indium, niobium, and tantalum, which are among a suite of materials often designated as “critical” or “strategic.”
Mine production of 14 mineral commodities was worth more than $1 billion each in the United States in 2014. These were, in decreasing order of value: crushed stone, copper, gold, cement, construction sand and gravel, iron ore (shipped), industrial sand and gravel, molybdenum concentrates, phosphate rock, lime, salt, zinc, soda ash, and clays (all types). The estimated value of U.S. industrial minerals mine production in 2014 was $46.1 billion, about 7 percent more than that of 2013.
The estimated value of U.S. metal mine production in 2014 was $31.5 billion, slightly less than that of 2013. These raw materials and domestically recycled materials were used to process mineral materials worth $697 billion. These mineral materials, including aluminum, brick, copper, fertilizers, and steel, plus net imports of processed materials (worth about $41 billion) were, in turn, consumed by industries that use minerals to create products, with a value added to the U.S. economy of an estimated $2.5 trillion in 2014.
The construction industry continued to show signs of improvement in 2014, being led by nonresidential construction, with increased production and consumption of cement, construction sand and gravel, crushed stone, and gypsum mineral commodities.
In 2014, 12 states each produced more than $2 billion worth of nonfuel mineral commodities. These states were, in descending order of value: Arizona, Nevada, Minnesota, Texas, Utah, California, Alaska, Florida, Missouri, Michigan, Wyoming, and Colorado. The mineral production of these states accounted for 62 percent of the U.S. total output value.
The USGS Mineral Resources Program delivers unbiased science and information to understand mineral resource potential, production, consumption, and how minerals interact with the environment. The USGS National Minerals Information Center collects, analyzes, and disseminates current information on the supply of and the demand for minerals and materials in the United States and about 180 other countries.
The USGS report Mineral Commodity Summaries 2015 is available online. Hardcopies will be available later in the year from the Government Printing Office, Superintendent of Documents. For ordering information, please call (202) 512-1800 or (866) 512-1800 or go online.
For more information on this report and individual mineral commodities, please visit the USGS National Minerals Information Center.
The U.S. Geological Survey citizen science project, The National Map Corps, has seen a remarkable response. In less than two years, the volunteer-based project has harvested more than 100,000 “points”. Hundreds of volunteer cartographers are making significant additions to the USGS's ability to provide accurate mapping information to the public.
Using crowd-sourcing techniques, the USGS Volunteer Geographic Information project known as The National Map Corps (TNMCorps) encourages citizen volunteers to collect manmade structure data in an effort to provide accurate and authoritative spatial map data for the USGS National Geospatial Program’s web-based map products.
“I am 80 years old. I work three days a week for a golf course trapping moles and gophers,” said a very prominent citizen scientist volunteer who goes by the handle of “Mole Trapper.” “I spent 11 years volunteering for a fish and wildlife agency. When the big landslide at Oso, Washington, happened, I went on the USGS website and discovered the map corps. I worked summers while in high school for a surveyor who was very precise, and he told me an inaccurate survey is worthless. I hate inaccurate maps, so this program was just right for me. I hope my work is as accurate as it can be, but if it isn't, I plead old age.”
Structures being updated include schools, hospitals, post offices, police stations and other important public buildings. The data currently being collected by volunteers becomes part of The National Map structures dataset, which is made available to users free of charge.
"I am retired from an unrelated field, but I have loved maps and travel all my life,” explained another highly active volunteer who goes by “fconley”. “When I saw that USGS was looking for volunteers I immediately joined, first of all working with paper maps and quads. As digital mapping, satellite imagery, and GPS became more available I was enthralled. With the imagery now accessible it is almost like being able to travel sitting at my desk. At times, locating structures seems similar to solving puzzles or detective work. This whole project is not only enjoyable but it makes me feel that I am making a lasting and useful contribution. I am thankful for the opportunity to be involved in this fascinating endeavor."
Beginning as a series of pilot projects in 2011, The National Map Corps has grown state-by-state to include the entire U.S. By August of 2013, volunteers were editing in every state in the country and the U.S. territories. To date, the number of active volunteers has grown to 930 individuals, including some extremely energetic participants who have collected in excess of 6,000 points.
To show appreciation of the volunteers’ efforts, The National Map Corps has instituted a recognition program that awards “virtual" badges to volunteers. Each edit that is submitted is worth one point towards the badge level. The badges consist of a series of antique surveying instruments and images tracing the evolution of land surveying from early instruments to aerial observation of the Earth’s surface, such as pigeon-mounted cameras and hot air balloons. Additionally, volunteers are publicly acknowledged (with permission) via Twitter, Facebook and Google+.
Tools on the TNMCorps website explain how a volunteer can edit any area, regardless of familiarity with the selected structures. Becoming a volunteer for TNMCorps is easy: go to The National Map Corps website to learn more and to sign up as a volunteer. If you have access to the Internet and are willing to dedicate some time to editing map data, we hope you will consider participating.
Squadron of Biplane Spectators badge, currently the highest recognition award, is given to volunteers who submit more than 6,000 points.
Family of Floating Photogrammetrists badge is one of the new awards, given to volunteers who submit more than 3,000 points. Badges awarded for submitting edits, shown from first to last: Order of the Surveyor’s Chain (25-49 points), Society of the Steel Tape (50-99 points), Pedometer Posse (100-199 points), Surveyor’s Compass (200-499 points), Stadia Board Society (500-999 points), Alidade Alliance (1,000-1,999 points), and the Theodolite Assemblage (2,000-2,999 points).
Heidi Koontz (Phone: 303-202-4763)
Two new U.S. Geological Survey publications that highlight historical hydraulic fracturing trends and data from 1947 to 2010 are now available.
Hydraulic fracturing is presently the primary stimulation technique for oil and gas production in unconventional resource reservoirs. Comprehensive, published, and publicly available information regarding the extent, location, and character of hydraulic fracturing in the United States is scarce.
“These national-scale data and analyses will provide a basis for making comparisons of current-day hydraulic fracturing to historical applications,” said USGS scientist and lead author Tanya Gallegos.
“We now have an improved understanding of where the practice is occurring and how hydraulic fracturing characteristics have changed over time.”
This national analysis of nearly 1 million hydraulically fractured wells and 1.8 million fracturing treatment records from 1947 through 2010 identifies trends in drilling methods and in the use of proppants (sand or similar material suspended in water or other fluid to keep fissures open), treatment fluids, additives, and water in the United States. These trends are compared with the peer-reviewed literature to establish a common understanding of the differences in hydraulic fracturing and to provide context for understanding the costs and benefits of increased oil and gas production. The publications also examine how newer technology has affected the amount of water needed for the process and where hydraulic fracturing has occurred at different points in time. Although hydraulic fracturing is in widespread use across most major U.S. oil and gas basins for the development of unconventional oil and gas resources, Texas historically had the highest number of hydraulic fracturing treatments and associated wells documented in the datasets.
These datasets also illustrate the rapid expansion of water-intensive horizontal/directional drilling that has increased from 6 percent of new hydraulically fractured wells drilled in the United States in 2000 to 42 percent of new wells drilled in 2010. Increased horizontal drilling also coincided with the emergence of water-based “slick water” fracturing fluids. This is one example of how the most current hydraulic fracturing materials and methods are notably different from those used in previous decades and have contributed to the development of previously inaccessible unconventional oil and gas production target areas, namely in shale and tight-sand reservoirs.
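The trend calculation described above, the share of newly drilled hydraulically fractured wells that were horizontal or directional in a given year, can be sketched as follows. The well records below are invented placeholders; only the reported 2000 and 2010 shares (6 percent and 42 percent) come from the publications.

```python
# Illustrative sketch of computing the yearly share of horizontal/
# directional wells from (year, drill_type) records. The records here
# are hypothetical placeholders, not the actual USGS dataset.
from collections import Counter

wells = [
    (2000, "vertical"), (2000, "vertical"), (2000, "horizontal"),
    (2010, "horizontal"), (2010, "vertical"), (2010, "horizontal"),
]

def horizontal_share(records, year):
    """Fraction of a year's new wells drilled horizontally/directionally."""
    counts = Counter(kind for y, kind in records if y == year)
    total = sum(counts.values())
    return counts["horizontal"] / total if total else 0.0
```

Applied to the full treatment-record dataset, a calculation of this shape would yield the 6 percent (2000) and 42 percent (2010) shares reported in the publications.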
In a long-term field study, U.S. Geological Survey (USGS) and Virginia Tech scientists have found that changes in geochemistry from the natural breakdown of petroleum hydrocarbons underground can promote the chemical release (mobilization) of naturally occurring arsenic into groundwater. This geochemical change can result in potentially significant arsenic groundwater contamination.
While arsenic is naturally present in most soils and sediments at various concentrations, it is not commonly a health concern until it is mobilized by a chemical reaction and dissolves into groundwater. Elevated arsenic levels in groundwater used for drinking water are a significant public health concern because arsenic, a toxin and carcinogen, is linked to numerous forms of skin, bladder, and lung cancer.
For the past 32 years, a collaborative group of government, academic, and industry-supported scientists has studied the natural attenuation (biodegradation over time) of a 1979 petroleum spill in the shallow, glacial aquifer at the National Crude Oil Spill Fate and Natural Attenuation Research Site, near Bemidji, Minnesota.
Working at this intensively surveyed site, the researchers in this USGS-led investigation focused on a specific question: whether naturally occurring arsenic found in the glacial aquifers in this area might be mobilized in the presence of hydrocarbons because of chemical interactions involving iron hydroxides which also occur naturally. To address this question, arsenic concentrations were measured for several years in groundwater and in sediment up-gradient, within, and down-gradient from the hydrocarbon plume at Bemidji.
Carefully measured samples from the field reveal that arsenic concentrations in the hydrocarbon plume can reach 230 micrograms per liter — 23 times the current drinking water standard of 10 micrograms per liter. Arsenic concentrations fall below 10 micrograms per liter both up-gradient and down-gradient from the plume.
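The comparison above, measured concentrations against the 10-microgram-per-liter drinking water standard, can be expressed as a small sketch. The plume value is the peak reported in the study; the up-gradient and down-gradient values are assumed examples consistent with the statement that they fall below the standard.

```python
# Sketch comparing arsenic concentrations (micrograms per liter) with the
# drinking water standard cited in the release. Only the plume value is
# from the study; the other two are assumed illustrative values.
MCL_UG_PER_L = 10  # drinking water standard for arsenic, ug/L

samples = {
    "up-gradient": 8,    # assumed example, below the standard
    "plume": 230,        # peak reported in the hydrocarbon plume
    "down-gradient": 7,  # assumed example, below the standard
}

for location, conc in samples.items():
    ratio = conc / MCL_UG_PER_L
    flag = "EXCEEDS" if conc > MCL_UG_PER_L else "meets"
    print(f"{location}: {conc} ug/L ({ratio:.0f}x standard) {flag} standard")
```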
The scientists attributed the elevated arsenic in the hydrocarbon plume to a series of interrelated geochemical and biochemical processes that involve arsenic and iron oxides (both are commonly found in sediments across the country) and the metabolization of carbon-rich petroleum by microbes in anoxic (low oxygen) conditions. The complex chemical process is explained further at this USGS website and in the published research article.
The results from this work also suggest that the arsenic released in the plume may reattach to aquifer sediments down-gradient from the plume. This reattachment could be considered good news for limiting the extent of the arsenic contamination in the groundwater. However, the chemical reattachment process may also be reversible, highlighting the need for long-term monitoring of arsenic and other chemicals that pose a water quality concern in areas associated with petroleum hydrocarbon leaks and spills.
The presence and amount of naturally occurring arsenic and iron oxides and the condition of the groundwater in the study area are fairly typical of many geologic settings across the nation, suggesting that the process of arsenic mobilization that was observed in the presence of hydrocarbons is not geographically limited.
This research was supported by the USGS Toxic Substances Hydrology Program and Hydrologic Research and Development Program, the Virginia Polytechnic Institute and State University, and the National Crude Oil Spill Fate and Natural Attenuation Research Site, a collaborative venture of the USGS, the Enbridge Energy Limited Partnership, the Minnesota Pollution Control Agency, and Beltrami County, Minnesota. By law, the USGS, a science bureau of the U.S. Department of the Interior, does not have any regulatory authority or responsibility.
Jon Campbell (Phone: 703-648-4180)
Improved global topographic (elevation) data are now publicly available for most of Asia (India, China, southern Siberia, Japan, Indonesia), Oceania (Australia, New Zealand), and western Pacific Islands. See diagram below for geographic coverage.
The data are being released following the President’s commitment at the United Nations to provide assistance for global efforts to combat climate change. The broad availability of more detailed elevation data across the globe through the Shuttle Radar Topography Mission (SRTM) will improve baseline information that is crucial to investigating the impacts of climate change on specific regions and communities.
“We are pleased to offer improved elevation data to scientists, educators, and students worldwide. It’s free to whomever can use it,” said Suzette Kimball, acting USGS Director, at the initial release of SRTM30 data for Africa in September. “Elevation, the third dimension of maps, is critical in understanding so many aspects of how nature works. Easy access to reliable data like this advances the mutual understanding of environmental challenges by citizens, researchers, and decision makers around the globe.”
The SRTM30 datasets have a resolution of 30 meters and can be used worldwide to improve environmental monitoring, advance climate change research, and promote local decision support. The previous global resolution for these data was 90 meters.
SRTM30 elevation data are increasingly being used to supplement other satellite imagery. In India, for example, SRTM30 elevation data can be used to track changes to the Gangotri Glacier, a major source of water for the Ganges River. Changes to this glacier, which has retreated 345 meters over the past 25 years, directly affect the water resources for hundreds of millions of people on the Indian subcontinent.
The National Aeronautics and Space Administration (NASA) and the National Geospatial-Intelligence Agency (NGA) worked collaboratively to produce the enhanced SRTM data, which have been extensively reviewed by relevant government agencies and deemed suitable for public release. SRTM flew aboard the Space Shuttle Endeavour in February 2000, mapping Earth's topography between 56 degrees south and 60 degrees north of the equator. During the 11-day mission, SRTM used imaging radar to map the surface of Earth numerous times from different perspectives.
The USGS, a bureau of the U.S. Department of the Interior, distributes SRTM30 data free of charge via its user-friendly Earth Explorer website. NASA also distributes SRTM data versions through the Land Processes Distributed Active Archive Center (LPDAAC) operated by USGS along with descriptions of the various versions and processing options.
Enhanced 30-meter resolution SRTM data for the remainder of the globe (below 60 degrees latitude) are scheduled to be released in August 2015, the last of four releases.
Shaded grid over most of Asia, Japan, and Australia indicates the coverage of the third of four releases of improved topographic (elevation) data now publicly available through USGS archives. (High resolution image)
Some media are reporting that the Asian H5N1 strain of highly pathogenic avian influenza has entered the United States. This is incorrect. The avian flu recently found in a green-winged teal in Washington state is a different strain that incorporates genes from North American waterfowl-associated viruses. Unlike the Asian H5N1 strain found in Asia, Europe, and Africa, the Washington state strain has been found only in wild waterfowl, has not been detected in domestic poultry, and has not been associated with human illness.
Catherine Puckett, USGS (Phone: 352-377-2469)
BOZEMAN – Pallid sturgeon come from a genetic line that has lived on this planet for tens of millions of years; yet it has been decades since anyone has documented any of the enormous fish successfully producing young that survive to adulthood in the upper Missouri River basin.
Now, fisheries scientists with the U.S. Geological Survey, Montana State University and the U.S. Fish and Wildlife Service have shown why, detailing for the first time the biological mechanism that has caused the long decline of pallid sturgeon in the Missouri River and led to its being placed on the endangered species list 25 years ago.
In a paper published this week in the journal Fisheries, the scientists show that oxygen-depleted dead zones between dams in the upper Missouri River are directly linked with the failure of hatched pallid sturgeon embryos to survive to adulthood.
“This research is a notable breakthrough in identifying the reason why pallid sturgeon in the Missouri River have been declining for so many decades,” said Suzette Kimball, acting director of the USGS. “By pinpointing the biological mechanism responsible for the species’ decline, resource managers have vital information they can use as a focus of pallid sturgeon conservation.”
“We certainly think this is a significant finding in the story of why pallid sturgeon are failing to recruit in the upper Missouri River,” said Christopher Guy, the assistant unit leader with the USGS Montana Cooperative Fishery Research Unit and the MSU professor who was the lead author on the paper. “We’re basically talking about a living dinosaur that takes 20 years to reach sexual maturity and can live as long as the average human in the U.S. After millions of years of success, the pallid sturgeon population stumbled and now we know why. From a conservation perspective, this is a major breakthrough.”
The study is the first to directly link dam-induced changes in riverine sediment transport, the resulting reduction in dissolved oxygen levels, and the survival of an endangered species, the pallid sturgeon.
“This research shows that the transition zone between the freely flowing river and reservoirs is an ecological sink – a dead zone – for pallid sturgeon,” Guy said. “Essentially, hatched sturgeon embryos die in the oxygen-depleted sediments in the transition zones.”
Guy said fisheries biologists long suspected that the Missouri River’s massive reservoirs were preventing hatched embryonic pallid sturgeon from surviving to the juvenile stage. But early attempts to tie the problem to low levels of dissolved oxygen were unsuccessful.
“The reason for that is we hadn’t sampled deep enough,” Guy said. “It wasn’t until we sampled water down at the bottom, where those sediments are being deposited, that we found there was no dissolved oxygen. Because hatched pallid sturgeon embryos are negatively buoyant, they tend to sink into that hostile environment.”
“The lack of oxygen is a function of high microbial activity in the sediment-laden area,” said Eric Scholl, a Ph.D. student at Montana State University and a co-author on the study.
Hilary Treanor, an MSU research associate working with Guy, said they were able to show just how hostile these transition zones between riverine environment and reservoir could be to hatched sturgeon embryos.
In experiments at the U.S. Fish and Wildlife Service Fish Technology Center in Bozeman with coauthors Molly Webb, Kevin Kappenman, and Jason Ilgen, Treanor said hatched embryos of different ages were exposed to water with varying levels of dissolved oxygen. The lowest level they could recreate – 1.5 milligrams of oxygen per liter of water – was still higher than in samples pulled from the bottom at the upper end of Fort Peck Reservoir.
At those depleted levels, the hatched sturgeon embryos suffered almost immediately.
“We saw changes in their behavior fairly quickly. They became disoriented and weren’t able to move the way they should have,” Treanor said. “Within an hour we started to see mortality. By the end of the experiment they were all dead.”
Pallid sturgeon, native to the Missouri and Mississippi rivers, were listed as an endangered species in 1990. The species has a lifespan of as much as a century. According to the U.S. Fish and Wildlife Service, fewer than 175 wild-spawned pallid sturgeon – all adults – live in the free-flowing Missouri River above Lake Sakakawea. Since 1990, not a single wild-spawned pallid sturgeon is known to have survived to the juvenile stage, despite intensive searching.
In the past 5 years, researchers identified the most important reason for pallid sturgeon population declines in the Upper Missouri River: the lack of survival of naturally produced hatched sturgeon embryos.
Guy said this most recent study of sturgeon built on research conducted by USGS fisheries biologist Patrick Braaten, which demonstrated that hatched pallid sturgeon embryos in the upper Missouri River do not have enough drift distance before entering the reservoirs.
Before dams, hatched pallid sturgeon embryos would drift for hundreds of miles, eventually settling out of the river’s current in areas with low flow where they matured enough to negotiate the river’s flow.
“This team has shown how much we can do when we have a collaboration between MSU, USGS and world-renowned reproductive physiologists Molly Webb and Kevin Kappenman with the U.S. Fish and Wildlife Service,” Guy said. “In the process of doing this research, we’ve trained a dozen MSU graduate students and a number of undergraduate field and lab techs.”
Given the new findings that no oxygen is available to hatched pallid sturgeon embryos in these transition zones, the authors of the paper propose that officials will need to consider innovative approaches to managing Missouri River reservoirs if pallid sturgeon conservation is to have a chance. The findings could also provide guiding principles for the construction of new dams around the world, Guy said.
ANCHORAGE, Alaska — Melting glaciers are not just affecting sea level; they are also affecting the flow of organic carbon to the world’s oceans, according to new research that provides the first ever global-scale estimates for the storage and release of organic carbon from glaciers.
The research, published in the Jan. 19 issue of Nature Geoscience, is crucial to better understand the role glaciers play in the global carbon cycle, especially as climate warming continues to reduce glacier ice stores and release ice-locked organic carbon into downstream freshwater and marine ecosystems.
“This research makes it clear that glaciers represent a substantial reservoir of organic carbon,” said Eran Hood, the lead author on the paper and a scientist with the University of Alaska Southeast (Juneau). “As a result, the loss of glacier mass worldwide, along with the corresponding release of carbon, will affect high-latitude marine ecosystems, particularly those surrounding the major ice sheets that now receive fairly limited land-to-ocean fluxes of organic carbon.”
Polar ice sheets and mountain glaciers cover roughly 11 percent of the Earth’s land surface and contain about 70 percent of Earth’s fresh water. They also store and release organic carbon to downstream environments as they melt. Because this glacier-derived organic carbon is readily metabolized by microorganisms, it can affect productivity in aquatic ecosystems.
“This research demonstrates that the impacts of glacier change reach beyond sea level rise,” said U.S. Geological Survey research glaciologist and co-author of the research Shad O’Neel. “Changes in organic carbon release from glaciers have implications for aquatic ecosystems because this material is readily consumed by microbes at the bottom of the food chain.”
Due to climate change, glacier mass losses are expected to accelerate, leading to a cumulative loss of nearly 17 million tons of glacial dissolved organic carbon by 2050 — equivalent to about half of the annual flux of dissolved organic carbon from the Amazon River.
These estimates are the first of their kind, and thus have high uncertainty, the scientists wrote, noting that refining estimates of organic carbon loss from glaciers is critical for improving the understanding of the impacts of glacier change. The U.S. Department of the Interior Alaska Climate Science Center and USGS Alaska Science Center plan to continue this work in 2015 and beyond with new efforts aimed at studying the biophysical implications of glacier change.
This project highlights ongoing collaboration between academic and federal research and the transformative results that stem from such funding partnerships. Other institutions involved in the research include Ecole Polytechnique Fédérale de Lausanne and Florida State University.
The work was supported by the National Science Foundation, the USGS Alaska Science Center, and the DOI Alaska Climate Science Center. The Alaska Climate Science Center provides scientific information to help natural resource managers and policy makers respond effectively to climate change.
Newly released US Topo maps for Nebraska now feature trails provided to the USGS through a “crowdsourcing” project operated by the International Mountain Biking Association (IMBA). Several of the 1,376 new US Topo quadrangles for the state now display trails along with other improvements, such as redesigned map symbols and new road source data.
"As an avid cyclist I look forward to exploring the new US Topo maps for bike trails as I plan my trips," said Jim Langtry, National Map Liaison for Nebraska. "I look forward to the expansion of the trail network and hope this encourages the crowdsourcing effort to add and maintain trails for future updates. It would be great to see the Cowboy Trail, the nation’s longest rails-to-trail trek along the northern tier of Nebraska, included on the next update. You can hike, bike, or horseback ride a total of 195 miles on the completed trail from Norfolk to Valentine. Enjoy the small towns along the way, beautiful scenery and pristine air on the Cowboy Trail."
For Nebraska residents and visitors who want to explore the rolling “cornhusker” landscape on a bicycle seat, the new trail features on the US Topo maps will come in handy. The data are provided through a partnership with IMBA and MTB Project. During the past two years, the IMBA has been building a detailed national database of mountain bike trails with the aid and support of the MTB Project. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy and confirm that the trail is legal. This unique crowdsourcing venture has increased the trail data available through The National Map mobile and web apps and the revised US Topo maps.
These new maps replace the first edition US Topo maps for Nebraska and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.
To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.
For more information on US Topo maps: http://nationalmap.gov/ustopo/
New version of the North Platte, Nebraska US Topo quadrangle: 2014, with orthoimage turned on. (1:24,000 scale) (high resolution image 1.2 MB)
1902 historic version of the North Platte, Nebraska topographic quadrangle at 1:25,000 scale. (high resolution image 1.8 MB)
VANCOUVER, Wash. — The large landslide that occurred on March 22, 2014 near Oso, Washington was unusually mobile and destructive. The first published study from U.S. Geological Survey investigations of the Oso landslide (named the “SR530 Landslide” by Washington State) reveals that the potential for landslide liquefaction and high mobility are influenced by several factors, and the landslide process at Oso could have unfolded very differently (with much less destruction) if initial conditions had been only subtly different.
A major focus of the research reported this week is to understand the causes and effects of the landslide’s high mobility. High “mobility” implies high speeds and large areas of impact, which can be far from the landslide source area. Because high-mobility landslides overrun areas that are larger than normal, they present a significant challenge for landslide hazard evaluation. Understanding of the Oso event adds to the knowledge base that can be used to improve future hazard evaluations.
Computer reconstructions of the landslide source-area geometry make use of high-resolution digital topographic (lidar) data, and they indicate that the Oso landslide involved about 8 million cubic meters (about 18 million tons, or almost 3 times the mass of the Great Pyramid of Giza) of material. The material consisted of sediments deposited by ancient glaciers and in streams and lakes near the margins of those glaciers. The landslide occurred after a long period of unusually wet weather. Prolonged wet weather increases groundwater pressures, which act to destabilize slopes by reducing frictional resistance between sediment particles.
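The quoted mass follows from the lidar-derived volume and a bulk density typical of water-saturated glacial sediment. In this back-of-envelope sketch, the density and pyramid-mass values are illustrative assumptions, not numbers from the study:

```python
# Back-of-envelope check of the figures quoted above. The density is an
# assumed value consistent with water-saturated glacial sediment, and the
# Great Pyramid mass is a commonly cited estimate; neither is from the study.
volume_m3 = 8e6            # landslide volume from lidar reconstructions
density_t_per_m3 = 2.25    # assumed bulk density, tonnes per cubic meter

mass_tonnes = volume_m3 * density_t_per_m3
print(f"mass: {mass_tonnes / 1e6:.0f} million tonnes")   # ~18 million

great_pyramid_tonnes = 6e6  # commonly cited estimate
print(f"vs. Great Pyramid: {mass_tonnes / great_pyramid_tonnes:.1f}x")  # ~3x
```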
The slope that failed at Oso on March 22, 2014 had a long history of landslides, but none of the earlier events had exhibited exceptional mobility.
The area overrun by the March 22 landslide was about 1.2 square kilometers (one-half square mile), mostly on the nearly flat floodplain of the North Fork Stillaguamish River. Additional areas were affected by upstream flooding along the river, which was partially dammed by the landslide. Eyewitness accounts and seismic energy radiated by the landslide indicate that slope failure occurred in two stages over the course of about 1 minute. During the second stage of slope failure, the landslide greatly accelerated, crossed the North Fork Stillaguamish River, and mobilized to form a high-speed debris avalanche. The leading edge of the wet debris avalanche probably acquired additional water as it crossed the North Fork Stillaguamish River. It transformed into a water-saturated debris flow (a fully liquefied slurry of quicksand-like material) that entrained and transported virtually all objects in its path.
Field evidence and mathematical modeling indicate that the high mobility of the debris avalanche was caused by liquefaction at the base of the slide caused by pressures generated by the landslide itself. The physics of landslide liquefaction has been studied experimentally and is well understood, but the complex nature of natural geological materials complicates efforts to predict which landslides will liquefy and become highly mobile.
Results from a suite of computer simulations indicate that the landslide’s liquefaction and high mobility were very sensitive to its initial porosity and water content. Landslide mobility may have been far less if the landslide material had been slightly denser and/or drier. Computer simulations that best fit field observations and seismological interpretations indicate that the fast-moving landslide crossed the entire 1-km-wide river floodplain in about one minute, implying an average speed of about 40 miles per hour. Maximum speeds were even higher.
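The quoted average speed is simple arithmetic from the simulated crossing (roughly 1 km in about one minute); this sketch just makes the unit conversion explicit:

```python
# Converting the simulated runout (about 1 km in about 1 minute) to the
# average speed quoted in the text.
distance_m = 1000.0   # approximate width of the river floodplain crossed
time_s = 60.0         # approximate crossing time from simulations

speed_m_per_s = distance_m / time_s
speed_mph = speed_m_per_s * 3600 / 1609.344  # meters per mile
print(f"{speed_mph:.0f} mph")  # ~37 mph, i.e. "about 40 miles per hour"
```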
Only one individual landslide in U.S. history (an event in Mameyes, Puerto Rico in 1985 that killed at least 129) caused more fatalities than the 43 that occurred in the 2014 landslide near Oso.
The full paper, “Landslide mobility and hazards: implications of the 2014 Oso disaster” by R.M. Iverson et al., is published in the journal Earth and Planetary Science Letters and is freely available online.
Oso landslide simulation screen shot. (High resolution image) (Video)
[Access images for this release at: http://gallery.usgs.gov/tags/NR2015_01_12]
Newly released US Topo maps for Arizona now feature mountain bike trails, segments of the Arizona National Scenic Trail and Public Land Survey System data. Several of the 1,880 new US Topo quadrangles for the state now display these selected new features along with other improved data layers.
“Having recently returned to Arizona, I am excited to re-explore our state using the new USGS Arizona Topo maps,” said Curtis Pulford, Arizona State Cartographer. “Detailed topographic maps are one of the best ways I know to visualize the terrain one is planning to examine. All who use these will appreciate the newly updated reference features, such as BLM Public Lands Survey System, roadways, schools, fire and police stations, post offices, and hospitals. Mountain bikers will appreciate the addition of International Mountain Biking Association trails. And the addition of the 817 mile, border to border, Arizona National Scenic Trail will be an outstanding resource for nature enthusiasts, hikers and equestrians.”
For Arizona residents and visitors who want to explore the landscape on a bicycle seat, the new mountain bike trails will come in handy. The mountain bike trail data is provided through a partnership with the International Mountain Biking Association (IMBA) and MTB Project. During the past two years, the IMBA has been building a detailed national database of mountain bike trails with the aid and support of the MTB Project. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy and confirm that the trail is legal. This unique “crowdsourcing” project has made mountain bike trail data available through mobile and web apps, and the revised US Topo maps.
National Scenic Trail enthusiasts can now find the “Arizona Trail” on new US Topo map segments. The Arizona National Scenic Trail stretches more than 800 miles from the Mexican border to Utah to connect deserts, mountains, canyons, wilderness, history, communities and people. Rugged, wild and challenging, this trail showcases Arizona’s diverse vegetation, wildlife, scenery, and historic and prehistoric sites in a way that provides a unique and unparalleled Arizona experience.
“For more than 20 years the Arizona Trail Association’s members have been creating, maintaining, and mapping the Arizona National Scenic Trail,” said Aaron Seifert, GIS Director for the Arizona Trail Association. “Since the trail was designated as a National Scenic Trail in 2009 and completed in 2011, it is very exciting to display the entire trail on the new set of US Topo maps for many more to discover the diverse landscape of Arizona from this amazing trail.”
The USGS partnered with the U.S. Forest Service and the Arizona Trail Association to incorporate the trail data onto the Arizona US Topo maps. This NST joins the Ice Age National Scenic Trail, the Pacific Northwest National Scenic Trail, the North Country National Scenic Trail, the Pacific Crest National Scenic Trail, and the Appalachian National Scenic Trail in being featured on the new US Topo quads. The USGS hopes to eventually include all National Scenic Trails in The National Map products.
Another important addition to the new Arizona US Topo maps is the inclusion of Public Land Survey System (PLSS) data. The PLSS is a way of subdividing and describing land in the United States. All lands in the public domain are subject to subdivision by this rectangular system of surveys, which is regulated by the U.S. Department of the Interior.
These new maps replace the first edition US Topo maps for Arizona and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.
To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.
For more information on US Topo maps: http://nationalmap.gov/ustopo/
New (2014) Black Canyon City, Arizona US Topo quadrangle with orthoimage turned on. (1:24,000 scale) (high resolution image 1.3 MB)
Historical USGS topographic map of the Prescott, Arizona area (1887). 1:250,000 scale. (high resolution image 1.6 MB)
Zoom of the Black Canyon City, Arizona, US Topo quadrangle. The Black Canyon Trail (BCT) is denoted by a dashed line on the left side of the graphic. (high resolution image 1.2 MB)
Heidi Koontz (Phone: 303-202-4763)
While the number of large earthquakes fell to 12 in 2014, from 19 in 2013, several moderate temblors hit areas relatively new to seismicity, including Oklahoma and Kansas, according to the U.S. Geological Survey. Worldwide, 11 earthquakes reached magnitude 7.0-7.9 and one registered magnitude 8.2, in Iquique, Chile, on April 1. This is the lowest annual total of earthquakes magnitude 7.0 or greater since 2008, which also had 12.
Earthquakes were responsible for about 664 deaths in 2014, with 617 having perished in the magnitude 6.1 Ludian Xian, Yunnan, China, event on August 3, as reported by the United Nations Office for Coordination of Humanitarian Affairs. Deadly quakes also occurred in Chile, Nicaragua, Papua New Guinea, and the United States.
A magnitude 6.0 quake struck American Canyon, California (South Napa) in the early hours of August 24, triggering more than 41,300 responses via the USGS Did You Feel It? website. One woman died from her injuries 12 days later. This temblor also represents northern California’s strongest earthquake since the October 1989 Loma Prieta event.
The biggest earthquake in the United States, and the second largest quake of 2014, was a magnitude 7.9 event in the Aleutian Islands of Alaska on June 23. Several quakes below magnitude 5.0 rattled Oklahoma, Texas, Kansas, Arkansas and Arizona throughout the year. The USGS estimates that several million earthquakes occur throughout the world each year, although most go undetected because they have very small magnitudes or hit remote areas.
On average, the USGS National Earthquake Information Center (NEIC) publishes the locations for about 40 earthquakes per day, or about 14,500 annually. The USGS NEIC publishes worldwide earthquakes with a magnitude of 4.0 or greater and U.S. earthquakes of magnitude 2.5 or greater. On average each year since about 1900, 18 earthquakes worldwide have reached magnitude 7.0 or higher.
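As a quick consistency check, the two publication-rate figures above (about 40 per day, about 14,500 per year) agree; this back-of-envelope sketch is illustrative only, not part of the USGS release:

```python
# Checking that ~14,500 published earthquake locations per year is
# consistent with the quoted "about 40 earthquakes per day".
per_year = 14500
per_day = per_year / 365
print(f"{per_day:.1f} per day")  # ~39.7, i.e. "about 40 per day"
```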
To monitor earthquakes worldwide, the USGS NEIC receives data in real-time from about 1,700 stations in more than 90 countries. These stations include the 150-station Global Seismographic Network, which is jointly supported by the USGS and the National Science Foundation, and is operated by the USGS in partnership with the Incorporated Research Institutions for Seismology (IRIS) consortium of universities. Domestically, the USGS partners with 13 regional seismic networks operated by universities that provide detailed coverage for the areas of the country with the highest seismic risk.
In the U.S., 42 of the 50 states, plus Puerto Rico, may experience damaging ground shaking from an earthquake within 50 years, the nominal lifetime of a building. The USGS and its partners in the multi-agency National Earthquake Hazard Reduction Program are working to improve earthquake monitoring and reporting capabilities through the development of the USGS Advanced National Seismic System (ANSS). More information about ANSS can be found on the ANSS website.
Read a USGS feature story to learn more about other natural hazards in 2014.
Paul Laustsen (Phone: 650-329-4046)
Editors: B-roll footage of polar bear research is available for your use.
ANCHORAGE, Alaska — In a new polar bear study published today, scientists from around the Arctic have shown that recent generations of polar bears are moving towards areas with more persistent year-round sea ice.
Research scientists, led by the U.S. Geological Survey, found that the 19 recognized subpopulations of polar bears group into four genetically-similar clusters, corresponding to ecological and oceanographic factors. These four clusters are the Eastern Polar Basin, Western Polar Basin, Canadian Archipelago, and Southern Canada.
The scientists also detected directional gene flow towards the Canadian Archipelago within the last 1-3 generations. Gene flow of this type can result from populations expanding and contracting at different rates or directional movement and mating over generations. The findings of spatial structure (clusters) and directional gene flow are important because they support the hypothesis that the species is coalescing to the region of the Arctic most likely to retain sea ice into the future.
“The polar bear’s recent directional gene flow northward is something new,” said Elizabeth Peacock, USGS researcher and lead author of the study. “In our analyses that focused on more historic gene flow, we did not detect movement in this direction.” The study found that the predominant gene flow was from Southern Canada and the Eastern Polar Basin towards the Canadian Archipelago where the sea ice is more resilient to summer melt due to circulation patterns, complex geography, and cooler northern latitudes.
Projections of future sea ice extent in light of climate warming typically show greater retention of sea ice in the northern Canadian Archipelago than in other regions.
“By examining the genetic makeup of polar bears, we can estimate levels and directions of gene flow, which represents the past story of mating and movement, and population expansion and contraction,” said Peacock. “Gene flow occurs over generations, and would not be detectable by using data from satellite-collars which can only be deployed on a few polar bears for short periods of time.”
The authors also found that female polar bears showed higher fidelity to their regions of birth than did male polar bears. Data allowing comparison of the movements of male and female polar bears are difficult to obtain because male bears cannot be collared; their necks are wider than their heads.
The study also confirmed earlier work that suggests that modern polar bears stem from one or several hybridization events with brown bears. No evidence of current polar bear-brown bear hybridization was found in the more than 2,800 samples examined in the current study. Scientists concluded that the hybrid bears that have been observed in the Northern Beaufort Sea region of Canada represent a recent and currently localized phenomenon. Scientists also found that polar bear populations expanded and brown bear populations contracted in periods with more ice. In periods with less ice, the opposite was true.
The goal of the study was to see how genetic diversity and structure of the worldwide polar bear population have changed over the recent dramatic decline in their sea-ice habitat. The USGS and the Government of Nunavut led the study with scientists from 15 institutions representing all five nations with polar bears (U.S., Canada, Greenland, Norway, and Russia).
This circumpolar, multi-national effort provides a timely perspective on how a rapidly changing Arctic is influencing the gene flow and likely future distribution of a species of worldwide conservation concern.
The paper “Implications of the circumpolar genetic structure of polar bears for their conservation in a rapidly warming Arctic” was published today in the journal PLOS One.
CORVALLIS, Ore. — Scientists from the U.S. Geological Survey and Washington State University have discovered that endangered Chinook salmon can be detected accurately from DNA they release into the environment. The results are part of a special issue of the journal Biological Conservation on use of environmental DNA to inform conservation and management of aquatic species.
The special issue contains eleven papers that move the detection of aquatic species using eDNA from concept to practice and include a thorough examination of the potential benefits, limitations and biases of applying eDNA methods to research and monitoring of animals.
“The papers in this special edition demonstrate that eDNA techniques are beginning to realize their potential contribution to the field of conservation biology worldwide,” said Caren Goldberg, Assistant Professor at Washington State University and lead editor of the special issue.
DNA, or deoxyribonucleic acid, is the hereditary material that contains the biological instructions to build and maintain all life forms; eDNA is the DNA that animals release into the environment through normal biological processes from sources such as feces, mucous, skin, hair, and carcasses. Research and monitoring of rare, endangered, and invasive species can be done by analyzing eDNA in water samples.
A paper included in the special issue by USGS ecologists Matthew Laramie and David Pilliod, and Goldberg, looked at the potential for eDNA analysis to improve detection of Chinook salmon in the Upper Columbia River in Washington, USA and British Columbia, Canada. This is the first time eDNA methods have been used to monitor North American salmon populations. The successful project also picked up evidence of Chinook in areas where they have not been previously observed.
“The results from this study indicate that eDNA detection methods are an effective way to determine the distribution of Chinook across a large area and can potentially be used to document the arrival of migratory species, like Pacific salmon, or colonization of streams following habitat restoration or reintroduction efforts,” said Laramie.
Spring Chinook of the Upper Columbia River are among the most imperiled North American salmon and are currently listed as endangered under the Endangered Species Act. Laramie has been working with the Confederated Tribes of the Colville Reservation Fisheries Program in the use of eDNA to document the success of reintroduction of Spring Chinook into the Okanogan Basin of the Upper Columbia River.
The papers of the special issue focus on techniques for analyzing eDNA samples, eDNA production and degradation in the environment and the laboratory, and practical applications of eDNA techniques in detecting and managing endangered fish and amphibians.
The co-editors, Goldberg, Pilliod, and WSU researcher Katherine Strickler, open the special issue with an overview on the state of eDNA science, a field developed from the studies of micro-organisms in environmental samples and DNA collected from ancient specimens such as mummified tissues or preserved plant remains.
“Incorporating eDNA methods into survey and monitoring programs will take time, but dedicated professionals around the world are rapidly advancing these methods closer to this goal,” said Goldberg.
Strickler, Goldberg, and WSU Assistant Professor Alexander Fremier authored a paper quantifying the effects of ultraviolet radiation, temperature, and pH on eDNA degradation in aquatic systems. Using eDNA from bullfrog tadpoles, the scientists determined that DNA broke down faster at warmer temperatures and under higher levels of ultraviolet-B light.
“We need to better understand how long DNA can be detected in water under different conditions. Our work will help improve sampling strategies for eDNA monitoring of sensitive and invasive species,” said Strickler.
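The temperature and UV-B effects described above can be sketched as a simple first-order decay model. The rate constants below are illustrative assumptions for the sketch, not values reported by the WSU study.

```python
import math

# Hypothetical first-order decay model for eDNA persistence in water.
# Rate constants are placeholders chosen for illustration only.
def edna_remaining(c0, hours, temp_c, uvb_index):
    """Estimated eDNA concentration remaining after `hours` in water.

    Decay accelerates with temperature and UV-B exposure, matching the
    study's qualitative finding that warm, sunlit water degrades eDNA faster.
    """
    base_rate = 0.01                                       # per hour, cool/dark baseline (assumed)
    k = base_rate * (1 + 0.05 * temp_c) * (1 + 0.2 * uvb_index)
    return c0 * math.exp(-k * hours)

cool_dark = edna_remaining(100.0, hours=24, temp_c=5, uvb_index=0)
warm_sunny = edna_remaining(100.0, hours=24, temp_c=25, uvb_index=3)
# Warmer water and stronger UV-B leave less detectable eDNA after a day,
# which is why sampling strategies must account for local conditions.
```

A model of this shape lets survey designers estimate how long after an animal's presence a water sample is still likely to test positive.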
“These papers lead the way in advancing eDNA sample collection, processing, analysis, and interpretation,” said Pilliod. “eDNA methods have great promise for detecting aquatic species of concern and may be particularly useful when animals occur in low numbers or when there are regulatory restrictions on the use of more invasive survey techniques.”
For the first time, scientists have developed a detailed explanation of how white-nose syndrome (WNS) is killing millions of bats in North America, according to a new study by the U.S. Geological Survey and the University of Wisconsin. The scientists created a model for how the disease progresses from initial infection to death in bats during hibernation.
“This model is exciting for us, because we now have a framework for understanding how the disease functions within a bat,” said University of Wisconsin and USGS National Wildlife Health Center scientist Michelle Verant, the lead author of the study. “The mechanisms detailed in this model will be critical for properly timed and effective disease mitigation strategies.”
Scientists hypothesized that WNS, caused by the fungus Pseudogymnoascus destructans, makes bats die by increasing the amount of energy they use during winter hibernation. Bats must carefully ration their energy supply during this time to survive without eating until spring. If they use up their limited energy reserves too quickly, they can die.
The USGS tested the energy depletion hypothesis by measuring the amounts of energy used by infected and healthy bats hibernating under similar conditions. They found that bats with WNS used twice as much energy as healthy bats during hibernation and had potentially life-threatening physiologic imbalances that could inhibit normal body functions.
Scientists also found that these effects started before there was severe damage to the wings of the bats and before the disease caused increased activity levels in the hibernating bats.
“Clinical signs are not the start of the disease — they likely reflect more advanced disease stages,” Verant said. “This finding is important because much of our attention previously was directed toward what we now know to be bats in later stages of the disease, when we observe visible fungal infections and behavioral changes.”
Key findings of the study include:
- Bats infected with P. destructans had higher proportions of lean tissue to fat mass at the end of the experiment compared to the non-infected bats. This finding means that bats with WNS used twice as much fat as healthy control bats over the same hibernation period. The amount of energy they used was also higher than what is expected for normal healthy hibernating little brown bats.
- Bats with mild wing damage had elevated levels of dissolved carbon dioxide in their blood resulting in acidification and pH imbalances throughout their bodies. They also had high potassium levels, which can inhibit normal heart function.
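The energy-depletion mechanism can be illustrated with a toy hibernation budget. The numbers below (fat reserves, daily fat use) are hypothetical placeholders; the only element taken from the study is the roughly twofold fat-use rate in infected bats.

```python
# Back-of-the-envelope hibernation energy budget. All quantities are
# hypothetical placeholders, not measurements from the Verant et al. study;
# the study's key finding was that infected bats burned fat at roughly
# twice the rate of healthy controls.
def days_until_fat_depleted(fat_g, daily_fat_use_g, infected=False):
    rate = daily_fat_use_g * (2.0 if infected else 1.0)  # ~2x fat use with WNS
    return fat_g / rate

healthy = days_until_fat_depleted(fat_g=2.0, daily_fat_use_g=0.01)
infected = days_until_fat_depleted(fat_g=2.0, daily_fat_use_g=0.01, infected=True)
# An infected bat's reserves last about half as long, so it may run out
# of energy before spring, the proposed mechanism of mortality.
```

Doubling the burn rate halves the survivable hibernation period, which is why an infection that begins well before visible symptoms can still prove fatal by late winter.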
The study, “White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host,” is published in BMC Physiology. Learn more about WNS, ongoing research and actions that are being taken here:
- USGS National Wildlife Health Center, WNS page
- USGS Fort Collins Science Center, WNS page
- University of Wisconsin-Madison
Contact: Arlene Compher, 703-648-4282
WASHINGTON, D.C. — U.S. Secretary of the Interior Sally Jewell announced today that the Department of the Interior’s regional Climate Science Centers and the United States Geological Survey (USGS) National Climate Change and Wildlife Science Center are awarding nearly $6 million to universities and other partners for 50 new research projects to better prepare communities for impacts of climate change.
Highly Pathogenic H5 Avian Influenza Confirmed in Wild Birds in Washington State H5N2 Found in Northern Pintail Ducks & H5N8 Found in Captive Gyrfalcons
WASHINGTON, Dec. 17, 2014 — The United States Department of Agriculture's (USDA) Animal and Plant Health Inspection Service (APHIS) confirmed the presence of highly pathogenic avian influenza (HPAI) H5 in wild birds in Whatcom County, Washington. Two separate virus strains were identified: HPAI H5N2 in northern pintail ducks and HPAI H5N8 in captive gyrfalcons that were fed hunter-killed wild birds. Neither virus has been found in commercial poultry anywhere in the United States, and no human cases with these viruses have been detected in the United States, Canada or internationally. There is no immediate public health concern with either of these avian influenza viruses.
Both the H5N2 and H5N8 viruses have been found in other parts of the world and have not caused any human infection to date. While neither virus has been found in commercial poultry, federal authorities with the U.S. Department of Agriculture also emphasize that poultry, poultry products and wild birds are safe to eat, even if they carry the virus, provided they are properly handled and cooked to an internal temperature of 165 degrees Fahrenheit.
The finding in Whatcom County was reported and identified quickly due to increased surveillance for avian influenza in light of HPAI H5N2 avian influenza outbreaks in poultry affecting commercial poultry farms in British Columbia, Canada. The northern pintail duck samples were collected by officials from the Washington Department of Fish and Wildlife following a waterfowl die-off at Wiser Lake, Washington, and were sent to the U.S. Geological Survey (USGS) National Wildlife Health Center for diagnostic evaluation and initial avian influenza testing. The U.S. Department of the Interior's USGS, which also conducts ongoing avian influenza testing of wild bird mortality events, identified the samples as presumptive positive for H5 avian influenza and sent them to USDA for confirmation. The gyrfalcon samples were collected after the falconer reported signs of illness in his birds.
Following existing avian influenza response plans, USDA is working with the U.S. Department of the Interior and the U.S. Department of Health and Human Services as well as state partners on additional surveillance and testing of both commercial and wild birds in the nearby area.
Wild birds can be carriers of HPAI viruses without the birds appearing sick. People should avoid contact with sick/dead poultry or wildlife. If contact occurs, wash your hands with soap and water and change clothing before having any contact with healthy domestic poultry and birds.
HPAI would have significant economic impacts if detected in U.S. domestic poultry. Commercial poultry producers follow strict biosecurity practices and raise their birds in very controlled environments. Federal officials emphasize that all bird owners, whether commercial producers or backyard enthusiasts, should continue practicing good biosecurity. This includes preventing contact between your birds and wild birds, and reporting sick birds or unusual bird deaths to State/Federal officials, either through your state veterinarian or through USDA's toll-free number at 1-866-536-7593. Additional information on biosecurity for backyard flocks can be found at healthybirds.aphis.usda.gov.
CDC considers the risk to people from these HPAI H5 infections in wild birds to be low because (like H5N1) these viruses do not now infect humans easily, and even if a person is infected, the viruses do not spread easily to other people.
Avian influenza (AI) is caused by influenza type A viruses, which are endemic in some wild birds (such as wild ducks and swans) and can infect poultry (such as chickens, turkeys, pheasants, quail, domestic ducks, geese and guinea fowl). AI viruses are classified by a combination of two groups of proteins: hemagglutinin or "H" proteins, of which there are 17 (H1–H17), and neuraminidase or "N" proteins, of which there are 10 (N1–N10). Many different combinations of "H" and "N" proteins are possible. Each combination is considered a different subtype, and each subtype can be further broken down into different strains. AI viruses are also classified by their pathogenicity, that is, the ability of a particular virus to produce disease in domestic chickens.
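The subtype scheme is combinatorial: each H protein can in principle pair with each N protein. A quick sketch enumerates the possibilities:

```python
# Enumerate the H/N subtype names implied by the classification above:
# 17 hemagglutinin proteins (H1-H17) x 10 neuraminidase proteins (N1-N10).
subtypes = [f"H{h}N{n}" for h in range(1, 18) for n in range(1, 11)]

# The two strains detected in Washington State are among them.
assert "H5N2" in subtypes and "H5N8" in subtypes
print(len(subtypes))  # 170 possible H/N combinations
```

Only a fraction of these 170 combinations have actually been observed circulating in birds, and each observed subtype further subdivides into strains.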
For more information on avian influenza and wild birds, please visit the USGS National Wildlife Health Center. For other information visit the USDA avian influenza page and the USDA APHIS avian influenza page.
Average chloride concentrations exceed toxic levels in many streams across the northern United States because of the salt used to deice winter pavement, and the frequency of these exceedances has nearly doubled in two decades.
Chloride levels increased substantially in 84 percent of the urban streams analyzed, according to a U.S. Geological Survey study of monitoring records that begin as early as 1960 at some sites and extend through 2011. Levels were highest during the winter but increased during all seasons over time at the northern sites, including near Milwaukee, Wisconsin; Chicago, Illinois; Denver, Colorado; and other metropolitan areas. The report was published today in the journal Science of the Total Environment.
"Some freshwater organisms are sensitive to chloride, and the high concentrations that we found could negatively affect a significant number of species," said Steve Corsi, USGS scientist and lead author of the study. “If urban development and road salt use continue to increase, chloride concentrations and associated toxicity are also likely to increase.”
The scientists analyzed water-quality data from 30 monitoring sites on 19 streams near cities in Wisconsin, Illinois, Colorado, Michigan, Ohio, Pennsylvania, Maryland, Texas and the District of Columbia. Key findings include:
- Twenty-nine percent of the sites exceeded the U.S. Environmental Protection Agency’s chronic water-quality criterion for chloride (230 milligrams per liter) on an average of more than 100 days per year from 2006 through 2011, almost double the number of days from 1990 through 1994. This increase occurred at sites such as the Menomonee and Kinnickinnic Rivers near Milwaukee and Poplar Creek near Chicago.
- The lowest chloride concentrations were in watersheds that had little urban land use or cities without much snowfall, such as Dallas, Texas.
- In 16 of the streams, winter chloride concentrations increased over the study period.
- In 13 of the streams, chloride concentrations increased over the study period during non-deicing periods such as summer. This finding suggests that chloride infiltrates the groundwater system during the winter and is slowly released to the streams throughout the year.
- Chloride levels increased more rapidly than development of urban land near the study sites.
- The rapid chloride increases were likely caused by increased salt application rates, increased baseline conditions (the concentrations during summer low-flow periods) and greater snowfall in the Midwest during the latter part of the study.
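The exceedance statistic in the first finding above can be sketched as a simple daily count against the EPA's 230 mg/L chronic criterion. The chloride series below is synthetic, for illustration only, not USGS monitoring data.

```python
# Count the days in a year on which mean daily chloride exceeds the
# EPA chronic water-quality criterion of 230 mg/L.
EPA_CHRONIC_MG_L = 230

def exceedance_days(daily_chloride_mg_l):
    """Number of days whose mean chloride concentration exceeds the criterion."""
    return sum(1 for c in daily_chloride_mg_l if c > EPA_CHRONIC_MG_L)

# Synthetic winter-heavy series: 120 "winter" days near 400 mg/L
# followed by 245 warmer days near 90 mg/L.
series = [400] * 120 + [90] * 245
print(exceedance_days(series))  # 120
```

Computed per site and averaged across years, a count like this yields the "more than 100 days per year" figure the study reports for the exceeding sites.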
"Deicing operations help to provide safe winter transportation conditions, which is very important,” Corsi said. “Findings from this study emphasize the need to consider deicer management options that minimize the use of road salt while still maintaining safe conditions."
Road deicing by cities, counties and state agencies accounts for a significant portion of salt applications, but salt is also used by many public and private organizations and individuals to deice parking lots, walkways and driveways. All of these sources are likely to contribute to these increasing chloride trends.
Other major sources of salt to U.S. waters include wastewater treatment, septic systems, farming operations and natural geologic deposits. However, the new study found deicing activity to be the dominant source in urban areas of the northern U.S.
The USGS conducted this study in cooperation with the Milwaukee Metropolitan Sewerage District. For more information about winter runoff and water quality, please visit the USGS Wisconsin Water Science Center website.
[Access images for this release at: http://gallery.usgs.gov/tags/NR2010_09_02]