On June 18, 2015 in Canberra, Australia, the U.S. Geological Survey and Geoscience Australia signed a comprehensive new partnership to maximize land remote sensing operations and data that can help to address issues of national and international significance.
"This partnership builds on a long history of collaboration between the USGS and Geoscience Australia and creates an exciting opportunity for us to pool resources across our organizations,” said Dr. Frank Kelly, USGS Space Policy Advisor and Director of the USGS Earth Resources Observation and Science Center. “We will work collaboratively to implement a shared vision for continental-scale monitoring of land surface change using time-series of Earth observations to detect change as it happens.”
Dr. Chris Pigram, Geoscience Australia’s Chief Executive Officer, also welcomed the agreement. “This new partnership elevates an already very strong relationship to a new level, and will see both organizations harness their respective skillsets to further unlock the deep understanding of our planet that the Landsat program provides.”
Dr. Kelly and Dr. Pigram both observed, “Our shared vision is to develop systems that enable us to monitor the Earth and detect change as it happens. The ability to do this will be critical to our ability to engage with major challenges like water security, agricultural productivity, and environmental sustainability.”
A key element of the partnership involves a major upgrade to Geoscience Australia’s Alice Springs satellite antenna, which will give the station a much more significant role in the international Landsat ground-station network. Following this $3 million (AUD) upgrade, committed to by the Australian Government, the Alice Springs antenna will transmit command-and-control signals to the Landsat satellites and support downloading of satellite imagery for the broader Southeast Asia and Pacific region. Alice Springs will be one of only three international collaborator ground stations worldwide playing such a vital role in the Landsat program.
Dr. Kelly noted, “We are very pleased to see such a commitment from Australia to the future success and sustainability of the Landsat program. We appreciate the essential role that Australia continues to play in ensuring that Landsat data for this region is collected and then made available for societal benefit.”
The partnership will also include a strong focus on applying new science and ‘big data’ techniques, such as Geoscience Australia’s Geoscience Data Cube and the USGS’s land change monitoring, assessment, and projection capability, to help users unlock the full value of the data from the Landsat program.
Dr. Suzette Kimball, acting Director of the USGS, recently noted, “We are now beginning to see that the combination of high performance computing, data storage facilities, data preparation techniques, and advanced systems can materially accelerate the value of Landsat data.”
Dr. Kimball added, “By lowering barriers to this technology, we can enable government, research and industry users in the United States and Australia, as well as the broader world, to realize the full benefits of this open-access and freely available data.”
Scientists are expecting that this year’s Gulf of Mexico hypoxic zone, also called the “dead zone,” will be approximately 5,483 square miles or about the size of Connecticut — the same as it has averaged over the last several years.
The dead zone in the Gulf of Mexico affects nationally important commercial and recreational fisheries and threatens the region's economy. Hypoxic zones hold very little oxygen, and are caused by excessive nutrient pollution, primarily from activities such as agriculture and wastewater. The low oxygen levels cannot support most marine life and habitats in near-bottom waters.
This year marks the first time the results of four models were combined. The four model predictions ranged from 4,344 to 5,985 square miles, with a collective predictive interval of 3,205 to 7,645 square miles that takes into account variations in weather and oceanographic conditions.
The NOAA-sponsored Gulf of Mexico hypoxia forecast has improved steadily in recent years, a result of advancements of individual models and an increase in the number of models used for the forecast. Forecasts based on multiple models are called ensemble forecasts and are commonly used in hurricane and other weather forecasts.
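The multi-model combination described above can be sketched in a few lines of Python. This is only a toy illustration: the first and last point predictions are the published bounds of the four-model range, the two intermediate predictions and all per-model intervals are invented for the example, and the simple average-and-pool scheme stands in for NOAA's actual (more sophisticated) ensemble methodology.

```python
# Toy ensemble forecast: average several models' point predictions and
# pool their uncertainty ranges by taking the widest bounds overall.
# This simplified scheme is illustrative only, not NOAA's actual method.

def ensemble_forecast(predictions, intervals):
    """Average point predictions; pool intervals to the widest bounds."""
    mean = sum(predictions) / len(predictions)
    low = min(lo for lo, hi in intervals)
    high = max(hi for lo, hi in intervals)
    return mean, (low, high)

# The outer predictions match the reported 4,344-5,985 sq mi range;
# the middle two values and all intervals are hypothetical.
preds = [4344, 4800, 5400, 5985]                      # square miles
ivals = [(3205, 5600), (3900, 6200), (4500, 7000), (5000, 7645)]

mean, (lo, hi) = ensemble_forecast(preds, ivals)
# mean is the ensemble point forecast; (lo, hi) is the pooled interval
```

Pooling to the widest bounds is the most conservative way to merge intervals; it reproduces the reported collective predictive interval of 3,205 to 7,645 square miles from the hypothetical per-model intervals above.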
The ensemble models were developed by NOAA-sponsored modeling teams and researchers at the University of Michigan, Louisiana State University, Louisiana Universities Marine Consortium, Virginia Institute of Marine Science/College of William and Mary, Texas A&M University, North Carolina State University, and the U.S. Geological Survey (USGS). The hypoxia forecast is part of a larger NOAA effort to deliver ecological forecasts that support human health and well-being, coastal economies, and coastal and marine stewardship.
“NOAA, along with our partners, continues to improve our capability to generate environmental data that can help mitigate and manage this threat to Gulf fisheries and economies,” said Kathryn D. Sullivan, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “We are adding models to increase the accuracy of our dead zone forecast.”
The Gulf of Mexico hypoxia forecast is based on nutrient runoff and streamflow data from the USGS. The USGS operates more than 3,000 real-time stream gauges and 50 real-time nitrate sensors, and collects water quality data at long-term stations throughout the Mississippi River basin to track how nutrient loads are changing over time.
The USGS estimates that 104,000 metric tons of nitrate and 19,300 metric tons of phosphorus flowed down the Mississippi and Atchafalaya rivers into the Gulf of Mexico in May 2015. This is about 21 percent below the long-term (1980-2014) average for nitrogen and 16 percent above the long-term average for phosphorus.
"Real-time nitrate sensors are advancing our understanding of how nitrate is transported in small streams and large rivers, including the main stem of the Mississippi River,” said William Werkheiser, USGS associate director for water. “Long-term monitoring is critical for tracking how nutrient levels are changing in response to management actions and for improving modeling tools to estimate which sources and areas are contributing the largest amounts of nutrients to the Gulf. "
The confirmed size of the 2015 Gulf hypoxic zone will be released in early August, following a monitoring survey led by the Louisiana Universities Marine Consortium from July 28 to August 4.
A new GPS survey of Mount McKinley, the highest point in North America, will update the commonly accepted elevation of McKinley’s peak, 20,320 ft. The last survey was completed in 1953.
The USGS, along with NOAA’s National Geodetic Survey (NGS) and the University of Alaska Fairbanks (UAF), is supporting a Global Positioning System (GPS) survey of the Mount McKinley apex. Surveying technology and processes have improved greatly since the last survey, making it possible to establish a much more accurate height. With the acquisition of new interferometric synthetic aperture radar (ifsar) elevation data in Alaska as part of the 3D Elevation Program, there have been inquiries about the height of the summit. The survey party is being led by CompassData, a subcontractor for Dewberry on a task awarded under the USGS’ Geospatial Products and Services Contract (GPSC).
Using modern GPS survey equipment and techniques, along with better gravity data to improve the geoid model in Alaska, the partners will be able to report the summit elevation with a much higher level of confidence than has been possible in the past. It is anticipated the newly surveyed elevation will be published by the National Geodetic Survey in late August.
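The role of the geoid model mentioned above can be illustrated with the basic height relationship used in geodesy: GPS measures height above a reference ellipsoid (h), while the elevation reported on maps is orthometric height (H) relative to the geoid, related by the geoid undulation N as H = h − N. The numbers in this sketch are invented for illustration and are not the actual survey values.

```python
# GPS yields ellipsoidal height (h); map elevations are orthometric
# heights (H) relative to the geoid. The geoid model supplies the
# undulation N at the survey point, and H = h - N.
# All numeric values below are hypothetical, not the survey's results.

def orthometric_height(ellipsoidal_height_m, geoid_undulation_m):
    """Convert a GPS ellipsoidal height to an orthometric (map) height."""
    return ellipsoidal_height_m - geoid_undulation_m

FEET_PER_METER = 3.28084

h = 6206.0   # hypothetical ellipsoidal height at the summit, meters
N = 12.5     # hypothetical geoid undulation at the summit, meters

H_m = orthometric_height(h, N)      # orthometric height in meters
H_ft = H_m * FEET_PER_METER         # converted to feet for reporting
```

A more accurate geoid model changes N, which shifts the reported elevation even if the GPS measurement itself is unchanged; this is why improved gravity data in Alaska matter to the final published height.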
An experienced team of four climbers, one from UAF and three from CompassData, will start the precarious trek to the summit in mid-June, with the needed scientific instruments in tow. They plan to return on or before July 7 and then begin work with the University of Alaska Fairbanks and NGS processing the data to arrive at the new summit elevation. The survey team and their equipment are expected to face temperatures well below zero, high winds, and frequent snow.
Are you a developer, firm, or organization using mobile or web applications to enable your users? The USGS has publicly available geospatial services and data to help your application development and enhancement.
The USGS’ National Geospatial Technical Operations Center (NGTOC) will host a 30-minute webinar, “Using The National Map services to enable your web and mobile mapping efforts,” on June 16 at 9 a.m. Mountain Time.
This webinar will feature a brief overview of services, data and products that are publicly available, a quick overview on how AlpineQuest, a leading private firm, is leveraging this public data to benefit their users, and a Question & Answer session with a USGS developer to help you get the most out of the national geospatial services.
“This is an opportunity from NGTOC to bring developers and users together for some demonstrations and starting some dialogue,” said Brian Fox, the NGTOC Systems Development Branch Chief. “The webinar format allows us to improve awareness of USGS geospatial services and develop a better understanding of what users and developers need to make our data and services more available and usable.”
To access the webinar, you’ll need to activate Cisco WebEx and call into the conference number (toll free) 855-547-8255 and use the security code: 98212385. The webinar will display through WebEx, and you can access it via this address: http://bit.ly/1RHayxY
The session will be recorded, and a closed-caption option is available during the webinar at: https://recapd.com/w-a3c704
To find out more about this and other NGTOC webinars, go to: http://ngtoc.usgs.gov/webinars/webinar_june2015.html
Image captions: Screen shots of a mobile mapping service integrating USGS topographic data for hiking and biking trails south of Golden, Colo.; imagery with road and contour data, and trail data in KML/GPX, overlaid via AlpineQuest.
North America may have once been attached to Australia, according to research just published in Lithosphere and spearheaded by U.S. Geological Survey geologist James Jones and his colleagues at Bucknell University and Colorado School of Mines.
Approximately every 300 million years, the Earth completes a supercontinent cycle wherein continents drift toward one another and collide, remain attached for millions of years, and eventually rift back apart. Geologic processes such as subduction and rifting aid in the formation and eventual break-up of supercontinents, and these same processes also help form valuable mineral resource deposits. Determining the geometry and history of ancient supercontinents is an important part of reconstructing the geologic evolution of Earth, and it can also lead to a better understanding of past and present mineral distributions.
North America is a key component in reconstructions of many former supercontinents, and there are strong geological associations between the western United States and Australia, which is one of the world’s leading mineral producers.
In this study, Jones and others synthesized mineral age data from ancient sedimentary rocks in the Trampas and Yankee Joe basins of Arizona and New Mexico. They found that the ages of many zircon crystals—mineral grains that were eroded from other rocks and embedded in the sedimentary deposits—were approximately 1.6 to 1.5 billion years old, an age range that does not match any known geologic age provinces in the entire western United States.
This surprising result actually mirrors previous studies of the Belt-Purcell basin (located in Montana, Idaho and parts of British Columbia, Canada) and a recently recognized basin in western Yukon, Canada, in which many zircon ages between 1.6 and 1.5 billion years old are common despite the absence of matching potential source rocks of this age.
However, the distinctive zircon ages in all three study locations do match the well-known ages of districts in Australia and, though less thoroughly documented there, Antarctica.
This publication marks the first time a complete detrital mineral age dataset has been compiled to compare the Belt basin deposits to strata of similar age in the southwestern United States. “Though the basins eventually evolved along very different trajectories, they have a shared history when they were first formed,” said Jones. “That history gives us clues as to what continents bordered western North America 1.5 billion years ago.”
The tectonic model presented in this paper suggests that the North American sedimentary basins were linked to sediment sources in Australia and Antarctica until the breakup of the supercontinent Columbia. The dispersed components of Columbia ultimately re-formed into Rodinia, perhaps the first truly global supercontinent in Earth’s history, around 1.0 billion years ago. Continued sampling and analysis of ancient sedimentary basin remnants will remain a critical tool for further testing global supercontinent reconstructions.
Landsat satellite data have been produced, archived, and distributed by the U.S. Geological Survey since 1972. Data users in many different fields depend on this basic Earth observation information to conduct broad investigations of historical land surface change that cross large regions of the globe and span many years. Accordingly, this community of users requires consistently calibrated radiometric data that are processed to the highest standards.
Recognizing the need, the USGS has begun production of higher-level (more highly processed) Landsat data products to help advance land surface change studies. One such product is Landsat surface reflectance data.
Surface reflectance data products approximate what a sensor held just above the Earth’s surface would measure, if conditions were ideal without any intervening artifacts (interference or changing conditions) that may come from the Earth’s atmosphere, different levels of illumination, and the changing geometry of the view by the sensor from hundreds of miles above the Earth. The precise removal of atmospheric artifacts increases the consistency and comparability between images of the Earth’s surface taken at different times of the year and different times of the day.
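The idea of removing atmospheric artifacts can be illustrated with a deliberately simplified "dark object subtraction" correction. The actual Landsat surface reflectance products are generated with far more sophisticated radiative transfer methods; this toy sketch, with invented reflectance values, conveys only the basic notion of removing an additive atmospheric (path) signal from a band.

```python
# Highly simplified "dark object subtraction" atmospheric correction.
# Assumes the darkest pixel in a scene should be nearly black at the
# surface, so any signal it shows is attributed to the atmosphere and
# subtracted from every pixel. Illustrative only; the real Landsat
# surface reflectance algorithm is much more rigorous.

def dark_object_subtraction(toa_reflectance):
    """Estimate the atmospheric path signal from the darkest pixel
    and remove it from all top-of-atmosphere reflectance values."""
    path = min(toa_reflectance)
    return [max(r - path, 0.0) for r in toa_reflectance]

# Hypothetical top-of-atmosphere reflectance values for one band
toa = [0.08, 0.12, 0.25, 0.40, 0.09]
surface = dark_object_subtraction(toa)
```

Because the same additive offset contaminates every acquisition differently, removing it is what makes images taken under different atmospheric conditions comparable, which is the property the paragraph above emphasizes.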
Surface reflectance and other higher-level data products can be requested through the USGS Earth Resources Observation and Science (EROS) Center by accessing the EROS Science Processing Architecture (ESPA) interface. Surface reflectance data are also available through USGS EarthExplorer; select “Landsat CDR” under the tab for datasets.
Although record low precipitation has been the main driver of one of the worst droughts in California history, abnormally high temperatures have also played an important role in amplifying its adverse effects, according to a recent study by the U.S. Geological Survey and university partners.
Experiments with a hydrologic model for the period Oct. 2013-Sept. 2014 showed that if the air temperatures had been cooler, similar to the 1916-2012 average, there would have been an 86 percent chance that the winter snowpack would have been greater, the spring-summer runoff higher, and the spring-summer soil moisture deficits smaller.
To gauge the effect of high temperatures on drought, lead author Shraddhanand Shukla (University of California – Santa Barbara, UCSB) devised two sets of modeling experiments that compared climate data from water year 2014 (Oct. 2013-Sept. 2014) to similar intervals during 1916-2012.
In the first simulation set, Shukla substituted 2014 temperature values with the historical temperatures for each of the study’s 97 years, while keeping the 2014 precipitation values. In the second simulation set, he combined the observed 2014 temperatures with historical precipitation values for each of the preceding years, 1916-2012.
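The two simulation sets described above can be sketched as follows. The hydrologic model itself is not specified here; this sketch, with invented forcing values and a small number of years, only shows how the two sets of model inputs are constructed by swapping one forcing at a time.

```python
# Sketch of the swap-experiment design: hold one forcing fixed at its
# water-year-2014 value while substituting the other from each
# historical year. The forcing values below are hypothetical.

def build_experiments(temps, precips, obs_temp_2014, obs_precip_2014):
    """temps/precips: dicts mapping historical water year to forcing data.
    Returns the two sets of (temperature, precipitation) model inputs."""
    years = sorted(temps)
    # Set 1: 2014 precipitation paired with each historical temperature
    set1 = [(temps[y], obs_precip_2014) for y in years]
    # Set 2: 2014 temperature paired with each historical precipitation
    set2 = [(obs_temp_2014, precips[y]) for y in years]
    return set1, set2

# Tiny hypothetical example with three historical years
temps = {1916: 7.0, 1917: 7.4, 1918: 6.8}       # degrees C
precips = {1916: 600.0, 1917: 550.0, 1918: 700.0}  # millimeters
set1, set2 = build_experiments(temps, precips,
                               obs_temp_2014=9.1, obs_precip_2014=320.0)
```

Running the model over each set and comparing the resulting snowpack, runoff, and soil moisture distributions is what isolates the contribution of temperature from that of precipitation.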
“This experimental approach allows us to model past situations and tease out the influence of temperature in preceding drought conditions,” said Chris Funk, a USGS scientist and a co-author of the investigation. “By crunching enough data over many, many simulations, the effect of temperature becomes more detectable. We can’t do the same in reality, the here and now, because then we only have a single sample.” Funk, an adjunct professor at UCSB, helps coordinate research at the university that supports USGS programs.
High heat has multiple damaging effects during drought, according to the study, increasing the vulnerability of California’s water resources and agricultural industry. Not only does high heat intensify evaporative stress on soil, it has a powerful effect in reducing snowpack, a key to reliable water supply for the state. In addition to decreased snowpack, higher temperatures can cause the snowpack to melt earlier, dramatically decreasing the amount of water available for agriculture in summer when it is most needed.
Although the study did not directly address the issue of long-term climate change, the implications of higher temperatures are clear.
“If average temperatures keep rising, we will be looking at more serious droughts, even if the historical variability of precipitation stays the same,” Shukla said. “The importance of temperature in drought prediction is likely to become only more significant in the future.”
The research was published online in Geophysical Research Letters, a journal of the American Geophysical Union.
For more information about drought in California, visit the USGS California Water Science Center online.
Photo caption: Drought effects at Trinity Lake, a major California reservoir located about 60 miles northwest of Redding, California (USGS photo by Tim Reed, February 2014; photo source: CA Water Science Center).
Heidi Koontz (Phone: 303-202-4763)
Newly released research from the U.S. Geological Survey describes U.S. hydraulic fracturing (frac) sand deposits and their locations, and provides estimates of frac sand production, consumption, and reserves. A companion map of producing and potential frac sand and resin-coated sand source units in the conterminous U.S. is also included.
The United States is the largest producer and consumer of frac sand in the world with nearly 70 percent of 2014 domestic production coming from the Great Lakes Region, primarily Wisconsin and Minnesota. The specialized silica sand, which consists of natural sand grains with strict mineralogical and textural properties, acts as a proppant (a granular substance that props open fractures) when added to fracking fluids that are injected into unconventional oil and gas wells during hydraulic fracturing.
“These new USGS compilations will provide comprehensive information about frac sand to mining companies, the petroleum industry, and land managers,” said USGS scientist Mary Ellen Benson, principal author of “Frac Sand Sources in the United States.”
Hydraulic fracturing in the U.S. significantly increased around 2004, and frac sand production rapidly grew to meet that demand. “Estimates of Hydraulic Fracturing (Frac) Sand Production, Consumption, and Reserves in the United States,” by USGS scientist Don Bleiwas, provides an overview of the frac sand industry, including production, consumption, reserves, and resources.
“Frac Sand Sources in the United States,” by USGS geologists Mary Ellen Benson and Anna Burack Wilson, describes the unique physical properties of frac sand and focuses on the geology and spatial relationships of frac sand sources in the U.S. It also tracks recent published efforts to examine the potential for less optimal frac sand sources, reviews current and future sources in Canada, discusses the emergence of alternative proppants, and provides geologic guidelines for identifying potential new sources.
The papers are contained in a special supplement, Frac Sand Insider Resource Guide, in the May 2015 issue of the magazine Rock Products. A USGS Open-File Report expanding on the geology and containing digital data is expected to be released later this year.
Map caption: Producing and potential frac sand and resin-coated sand source units in the conterminous United States.
Heidi Koontz (Phone: 303-202-4763)
Recently, U.S. Geological Survey researchers and partners working in California’s Channel Islands National Park discovered mammoth remains in uplifted marine deposits that date to about 80,000 years ago, confirming a long-held but never proven hypothesis that mammoths may have been on the Channel Islands long before the last glacial period, 25,000 to 12,000 years ago.
“These are the first confidently dated fossils from the California Channel Islands showing that mammoths had been on the islands a long time, not just during the last glacial period,” said lead author and USGS research geologist Dan Muhs. “It supports an older hypothesis that mammoths could have swum from the mainland to the islands any time that conditions were favorable for such a journey, when sea level was low.”
This discovery on Santa Rosa Island, detailed in the online and print journal editions of Quaternary Research, shows that mammoths likely ventured to the islands during at least one earlier glacial period, when sea level was lower than present and the swimming distance from the mainland to the islands was minimal.
The older age of mammoths also challenges the hypothesis that climate change and sea level rise at the close of the last glacial period (about 12,000 years ago) were the causes of mammoth extinction on the Channel Islands. Earlier mammoth populations also would have had to contend with climate change and sea level rise, but apparently survived.
The newly discovered fossil mammoth remains are likely Mammuthus exilis, the pygmy mammoth. The Columbian mammoth immigrated to the islands from the California mainland by swimming, and the pygmy mammoth evolved on the islands from this ancestral stock. Most mammoth remains previously reported on the Channel Islands date to the last glacial period, about 25,000 to 12,000 years ago.
Mammoths are iconic animals of the Pleistocene Ice Ages, both in North America and Eurasia. Fossil mammoths and other proboscideans (elephants and their relatives) have also been found on many islands of the Mediterranean.
The risk of extinction for the endangered Florida manatee appears to be lower than previously estimated, according to a new U.S. Geological Survey-led study.
Based on the data available in 2012, the long-term probability of the species surviving has increased compared to a 2007 analysis, as a result of higher aerial survey estimates of population size, improved methods of tracking survival rates, and better estimates of the availability of warm-water refuges.
USGS scientists, working with colleagues from several other agencies and universities, used the manatee Core Biological Model to analyze the long-term viability of the manatee population in Florida, and to evaluate the threats it faces. A similar analysis completed in 2007 was used by the U.S. Fish and Wildlife Service as part of its 5-year Review of the status of manatees.
“Our analysis using data from 2007 estimated that there was nearly a nine percent chance of Florida manatee numbers falling below 250 adults over the next 100 years on either the Atlantic or Gulf Coast,” said Michael Runge, a USGS research ecologist and lead author of the study. “The current analysis, using data available in 2012, has the estimate dropping to a fraction of one percent, but we need to be cautious in our conclusion, because the analysis did not include several mortality events that have occurred since then.”
The mortality events Runge was referencing were cold winters, loss of seagrass in prime habitat, and a red tide event, all of which affected the population.
“Although the estimated status in 2012 was better than in 2007, questions still remain about the population effects of the more recent cold-related mortality events in the winters of 2009-10 and 2010-11,” Runge said. “The 2012 analysis also does not account for the extensive loss of seagrass habitat in Indian River Lagoon in 2011 and 2012 nor the severe red tide event in the Southwest region of Florida in 2013.”
The potential effects of these events will be analyzed in the next update of the Core Biological Model, which is underway in collaboration with Florida Fish and Wildlife Research Institute and Mote Marine Laboratory, and is expected to be complete within the next year.
The major threats to long-term survival of Florida manatees remain boat-related deaths and loss of warm-water winter habitat. In the Southwest region, an increasing frequency of red-tide deaths also warrants concern.
Manatees are large, gentle, herbivorous, slow-moving mammals. They are entirely aquatic, and their range is limited by temperature. Manatees cannot survive for extended periods in water colder than about 17°C (63°F), and prefer temperatures warmer than 22°C (72°F). Manatees live in shallow fresh, brackish, and marine aquatic habitats, traveling readily among them. In Florida, they travel considerable distances during the winter to access warm water refuges, such as artesian springs and the heated discharges of power generating plants. Some individuals also travel long distances during the warm season.
The publication “Status and threats analysis for the Florida Manatee (Trichechus manatus latirostris), 2012,” USGS Open-File Report 2015-1083, by M. C. Runge, C. A. Langtimm, J. Martin, and C. J. Fonnesbeck is available online.
Several of the 812 new US Topo quadrangles for Louisiana now display public trails along with improved data layers. Other significant additions include Public Land Survey System (PLSS) information, redesigned map symbols, enhanced railroad information, and new road source data.
“I am very excited about the 2015 US Topo maps for Louisiana!” said R. Hampton Peele, GIS Coordinator for the Louisiana Geological Survey. “These maps will provide a great reference for our Cartographic Section as we compile our annual geologic map deliverables for the USGS.”
For Louisiana recreationalists and visitors who want to explore the diverse Gulf coast landscape by bicycle, on foot, on horseback, or by other means, the new trail features on the US Topo maps will come in handy. During the past two years the International Mountain Bicycling Association (IMBA), in partnership with the MTB Project, has been building a detailed national database of trails. This effort allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. The MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy, and confirm the trail is legal. This unique crowdsourcing venture has increased the trail data available through The National Map mobile and web apps and the revised US Topo maps.
Additionally, a widely anticipated addition to the new Louisiana US Topo maps is the inclusion of Public Land Survey System data. PLSS is a way of subdividing and describing land in the US. All lands in the public domain (lands owned by the federal government) are subject to subdivision by this rectangular system of surveys, which is regulated by the U.S. Department of the Interior.
“The US Topo maps provide an excellent instructional tool in our GIS Certification Program,” said Brent Yantis, Director of the University of Louisiana Lafayette Regional Application Center. “They orient students to their environment and provide a fundamental foundation in the development of geospatial concepts. We look forward to this new release.”
These new maps replace the first edition US Topo maps for the Pelican State and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.
To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.
For more information on US Topo maps: http://nationalmap.gov/ustopo/
Image captions: Updated 2015 version of the Saint Landry quadrangle with the orthoimage turned on, and with the orthoimage turned off to better show the contours (both 1:24,000 scale); scan of the 1935 USGS quadrangle of the Turkey Creek area, which covers the Saint Landry map, from the USGS Historical Topographic Map Collection (1:62,500 scale).
The U.S. Geological Survey National Geospatial Program is developing the 3D Elevation Program (3DEP) to respond to growing needs for high-quality topographic data and for a wide range of other three-dimensional (3D) representations of the Nation's natural and constructed features.
To expand awareness of 3DEP status and plans, as well as provide an open forum for 3DEP stakeholders to communicate and coordinate potential Broad Agency Announcement (BAA) proposals, the USGS is offering numerous state and regional coordination workshops. The meetings will be held throughout the US between early May and June 30. Locations, dates, times and registration information can be found at: http://1.usa.gov/1IMab1H. The workshops will include in-person and/or virtual participation options.
The primary goal of 3DEP is to systematically collect 3D elevation data in the form of light detection and ranging (lidar) data over the conterminous United States, Hawaii, and the U.S. territories, with data acquired over an 8-year period. Interferometric synthetic aperture radar (ifsar) data will be acquired for Alaska, where cloud cover and remote locations preclude the use of lidar in much of the State. The 3DEP initiative is based on the results of the National Enhanced Elevation Assessment that documented more than 600 business uses across 34 Federal agencies, all 50 States, selected local government and Tribal offices, and private and nonprofit organizations. A fully funded and implemented 3DEP would provide more than $690 million annually in new benefits to government entities, the private sector, and citizens.
3DEP is a "Call for Action" because no one entity can accomplish it independently. 3DEP presents a unique opportunity for collaboration between all levels of government, to leverage the services and expertise of private sector mapping firms that acquire the data, and to create jobs now and in the future. When partners work together, they can achieve efficiencies and lower costs so that 3DEP can become a reality. When 3D elevation data are available to everyone, new innovations will occur in forest resource management, alternative energy, agriculture, and other industries for years to come.
The annual Broad Agency Announcement (BAA) is a competitive solicitation issued to facilitate the collection of lidar and derived elevation data for 3DEP. Federal agencies, state and local governments, tribes, academic institutions and the private sector are eligible to submit proposals. The 3DEP public meetings will introduce this opportunity to the broadest stakeholder community possible and provide a forum for interested parties to discuss elevation data collection needs of mutual interest that could be addressed by a coordinated investment.
Map caption: The proposed body of work for 3DEP in Fiscal Year 2015; the BAA awards will add more than 95,000 square miles of 3DEP-quality lidar data to the national database.
Water contamination by hormone-disrupting pollutants is a concern for water quality around the world. Existing research has determined that elevated concentrations of Bisphenol-A (BPA), a chemical used in consumer products such as plastic food storage and beverage containers, have been deposited directly into rivers and streams by municipal or industrial wastewater. Now, researchers from the University of Missouri and the U.S. Geological Survey have assessed Missouri water quality near industrial sites permitted to release BPA into the air. As a result, scientists now believe that atmospheric releases may create a concern for contamination of local surface water leading to human and wildlife exposure.
“There is growing concern that hormone disruptors such as BPA not only threaten wildlife, but also humans,” said Chris Kassotis, a doctoral candidate in the Division of Biological Sciences in the College of Arts and Science at MU. “Recent studies have documented widespread atmospheric releases of BPA from industrial sources across the United States. The results from our study provide evidence that these atmospheric discharges can dramatically elevate BPA in nearby environments.”
Water sampling sites were selected based on their proximity to the Superfund National Priorities List (NPL) or locations with reported atmospheric discharges of BPA as identified by the Environmental Protection Agency. Current or historical municipal wastewater treatment sites, which have been shown in the past to contribute hormonally active chemicals to surface water from urban or industrial sources, were also tested. Finally, relatively clean sites were chosen to serve as the control group.
The water was then analyzed for concentrations of BPA, Ethinyl estradiol (EE2), an estrogen commonly used in oral contraceptive pills, and several wastewater compounds. Scientists also measured the total estrogen and androgen receptor activities of the water. This approach measures all chemicals present in the water that are able to bind to and activate (or inhibit) the estrogen or androgen receptors in wildlife and humans. Levels of chemicals were highest in samples with known wastewater treatment plant discharges.
“In addition, we were surprised to find that BPA concentrations were up to 10 times higher in the water near known atmospheric release sites,” said Don Tillitt, adjunct professor of biological sciences at MU, and biochemistry and physiology branch chief with the USGS Columbia Environmental Research Center. “This finding suggests that atmospheric BPA releases may contaminate local surface water, leading to greater exposure of humans or wildlife.”
Concentrations of BPA measured in surface water near these sites were well above levels shown to cause adverse health effects in aquatic species, Kassotis said.
The study, “Characterization of Missouri surface waters near point sources of pollution reveals potential novel atmospheric route of exposure for bisphenol A and wastewater hormonal activity pattern,” was published in the journal Science of the Total Environment, with funding from MU, the USGS Contaminants Biology Program (Environmental Health Mission Area), and a STAR Fellowship Assistance Agreement awarded by the U.S. EPA.
WELLSBORO, Pa. — A piece of the restoration puzzle to save populations of endangered freshwater mussels may have been found, according to a recent U.S. Geological Survey-led study. Local population losses in a river may not result in irreversible loss of mussel species; other mussels from within the same river could be used as sources to restore declining populations.
Though they serve a critical role in rivers and streams, freshwater mussels are threatened by habitat degradation such as dams, alteration to river channels, pollution and invasive species. Mussels filter the water and provide habitat and food for algae, macroinvertebrates, and even fish, which are necessary components of aquatic food webs.
“Few people realize the important role that mussels play in the ecosystem," said USGS research biologist Heather Galbraith, lead author of the study. "Streams and rivers with healthy mussel populations tend to have relatively good water quality which is good for the fish and insects that also inhabit those systems."
Mussels in general are poorly understood and difficult to study. Because of this lack of knowledge, population genetics has become a useful tool for understanding their ecology and guiding their restoration.
More than 200 of the nearly 300 North American freshwater mussel species are imperiled, with rapidly dwindling populations. Researchers are providing information to resource managers, who are working to reverse this trend. USGS-led research suggests that re-introducing mussels within the same river could reverse population declines without affecting the current genetic makeup of the population.
The research shows that patterns in the genetic makeup of a population occur within individual rivers for freshwater mussels, and that in the study area, mussels from the same river could be used for restoration.
“That genetic structuring is occurring within individual rivers is good news, because it may be a means of protecting rare, threatened and endangered species from impending extinction,” said Galbraith. “Knowing the genetic structure of a freshwater mussel population is necessary for restoring declining populations to prevent factors such as inbreeding, high mutation rates and low survivorship.”
Knowing that mussels in the same river are similar genetically opens up opportunities for augmenting declining populations or re-introducing mussels into locations where they were historically found. The genetics also highlight the importance of not mixing populations among rivers without additional studies to verify the genetic compatibility of mussels within those rivers.
The international team of researchers from Canada and the United States working to understand mussel genetics found similar genetic patterns among common and endangered mussel species. This is important information for mussel biologists because studying endangered species can be difficult, and researchers may be able to study the genetic structure of common mussels and generalize the patterns to endangered mussels.
Although understanding the genetic structure of mussel populations is important for restoration, genetic tools do have limitations. Researchers found that despite drastic reductions in freshwater mussel populations, there was little evidence of this population decline at the genetic level. This may be due to the extremely long lifespan of mussels, some of which can live to be more than 100 years old.
“Genetics, it turns out, is not a good indicator of population decline; by the time we observe a genetic change, it may be too late for the population,” said Galbraith.
By way of comparison, in fruit flies, which have short lifespans, genetic changes show up quickly within a few generations. Mussels, on the other hand, are long lived animals; therefore it may take decades to see changes in their genetic structure within a population.
The study examined six species of freshwater mussels in four Great Lakes tributaries in southwestern Ontario. The species are distributed across the eastern half of North America and range in status from presumed extinct to secure. The six mussels were the snuffbox, Epioblasma triquetra; kidneyshell, Ptychobranchus fasciolaris; mapleleaf, Quadrula quadrula; wavy-rayed lampmussel, Lampsilis fasciola; flutedshell, Lasmigona costata; and threeridge, Amblema plicata.
The study, “Comparative analysis of riverscape genetic structure in rare, threatened and common freshwater mussels” is available online in the journal Conservation Genetics.
For more information on freshwater mussels please visit Stranger than Fiction: The Secret Lives of Freshwater Mussels.
USGS has released a preliminary methodology to assess the population-level impacts of onshore wind energy development on birds and bats. This wind energy impacts assessment methodology is the first of its kind, evaluating national- to regional-scale impacts on the bats and birds that breed in and migrate through the United States. The methodology focuses primarily on the effects of collisions between wildlife and turbines.
Primary uses of this new methodology, which is complementary to and incorporates detailed studies and demographic models USGS conducts on key species, include:
- Quantitative measuring of the potential impacts to species’ populations through demographic modeling and the use of potential biological removal methods.
- Ranking species in terms of their relative direct and indirect risk from wind energy development.
- Recommending species for more intensive demographic modeling or study.
- Highlighting species for which the effects of wind energy development on their populations are projected to be small.
This new draft methodology is based on a robust quantitative and probabilistic framework used by the USGS in energy resource assessments. The assessment methodology also incorporates publicly available information on fatality incidents, population estimates, species range maps, turbine location data and biological characteristics.
The methodology includes a qualitative risk ranking component, as well as a generalized population modeling component. The USGS also repurposed a well-established marine mammal conservation method known as Potential Biological Removal. Potential Biological Removal identifies the maximum number of animals (not including natural deaths) that may be removed from a marine mammal population while allowing it to reach or maintain its optimum sustainable population. The USGS uses the Potential Biological Removal tool to compare the observed fatalities from collisions with wind turbines to the estimated number of fatalities that can occur before a population would decline.
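The Potential Biological Removal threshold described above is conventionally computed as the minimum population estimate times one-half the maximum net productivity rate times a recovery factor. The sketch below illustrates that standard formula and the kind of comparison the article describes; all numeric values are hypothetical and are not taken from the USGS assessment.

```python
def potential_biological_removal(n_min, r_max, recovery_factor):
    """Standard PBR formula: N_min * (R_max / 2) * F_r.

    n_min           -- minimum population estimate
    r_max           -- maximum net productivity rate (per year)
    recovery_factor -- F_r, typically between 0.1 and 1.0
    """
    return n_min * (r_max / 2.0) * recovery_factor

# Hypothetical bird population of at least 50,000 individuals,
# R_max of 0.10/yr, and a conservative recovery factor of 0.5.
pbr = potential_biological_removal(50_000, 0.10, 0.5)  # 1250.0

# Compare an (invented) annual turbine-fatality estimate to the threshold.
observed_turbine_fatalities = 800
within_threshold = observed_turbine_fatalities <= pbr  # True
```

The comparison in the last line mirrors the USGS use of the tool: observed collision fatalities are checked against the number of removals a population could sustain.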
This methodology also builds on previous USGS research on wind energy, for example, the USGS WindFarm map, released in early 2014, which shows the location of all land-based wind turbines in the United States.
Applying expertise in biology, ecology, mapping and resource assessment, the USGS has contributed to the Department of the Interior’s Powering Our Future Initiative with this methodology to quantify the impact of wind energy development on birds and bats.
Throughout the course of this project, USGS scientists have engaged in discussions with a variety of partners and stakeholders, such as the U.S. Fish and Wildlife Service, National Oceanic and Atmospheric Administration, the Bureau of Land Management, Department of Energy, and Department of Defense, as well as industry, non-governmental organizations, and universities. The USGS will now solicit technical comments on this methodology from an expert panel external to the USGS and will consider these comments in developing the final methodology.
Additional ongoing USGS research is focused on understanding potential impacts to wildlife species on a national, regional, and localized scale. Examples of these efforts include developing wildlife and mortality survey protocols, estimating causes and magnitude of fatalities, assessing population level effects, describing bird migration and movement patterns, understanding wildlife interactions with turbines, and developing technologies to reduce fatalities from interactions with turbines.
Marisa Lubeck (Phone: 303-526-6694)
New research can help water resource managers quantify critical groundwater resources and assess the sustainability of long-term water use in Minnesota.
U.S. Geological Survey scientists recently estimated annual rates of potential recharge, or the natural replenishment of groundwater, over 15 years across Minnesota. According to the study, the statewide mean annual potential recharge rate from 1996‒2010 was 4.9 inches per year (in/yr). Recharge rates increased from west to east across the state, and April generally had the highest potential recharge.
Improved estimates of recharge are necessary because approximately 75 percent of drinking water and 90 percent of agricultural irrigation water in Minnesota are supplied from groundwater.
“Resource managers in Minnesota can use this study to help inform water use or water conservation guidelines throughout the state,” said USGS scientist and lead author of the report, Erik Smith.
To maintain a stable supply of groundwater, recharge rates must be high enough to compensate for water that is lost to streams, lakes and other surface-water bodies, or removed for uses such as agriculture. The scientists used data about daily precipitation, minimum and maximum daily temperatures, land cover and soil to model Minnesota’s recharge rates.
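The daily bookkeeping performed by recharge models of this kind can be illustrated with a toy soil-water "bucket" balance: precipitation in excess of evapotranspiration fills soil storage, and any overflow beyond the soil's capacity becomes potential recharge. This is a deliberately simplified sketch, not the USGS model, and every parameter value below is invented for illustration.

```python
def daily_recharge(precip_in, et_in, soil_storage_in, capacity_in):
    """Toy bucket model for one day, all values in inches.

    Water left over after evapotranspiration tops up soil storage;
    anything above the storage capacity is counted as potential
    recharge. Returns (recharge, updated_storage).
    """
    storage = soil_storage_in + max(precip_in - et_in, 0.0)
    recharge = max(storage - capacity_in, 0.0)
    return recharge, min(storage, capacity_in)

# Hypothetical spring day: 1.2 in of rain, 0.2 in of
# evapotranspiration, soil already holding 3.5 of 4.0 in capacity.
recharge, storage = daily_recharge(1.2, 0.2, 3.5, 4.0)
# recharge = 0.5 in; storage is filled to its 4.0 in capacity
```

A real model of this type would also track land cover, snowmelt, and temperature-dependent evapotranspiration, which is why the inputs listed above (daily precipitation, temperature extremes, land cover and soil data) matter.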
During the study period, mean annual potential recharge estimates across Minnesota ranged from less than 0.1 to 17.8 in/yr. Other findings include:
- The highest annual mean recharge estimate across the state was in 2010 at 7 inches, and the lowest mean recharge estimate was 1.3 inches in 2003.
- Some of the lowest potential recharge rates were in the Red River of the North Basin in northwestern Minnesota, generally between 1 and 1.5 in/yr.
- The highest potential recharge rates were in northeastern Minnesota and the Anoka Sand Plain in central Minnesota.
- Eighty-eight percent of the mean annual potential recharge rates were between 2 and 8 in/yr.
- April had the greatest monthly mean at 30 percent of the yearly recharge.
The USGS partnered with the Minnesota Pollution Control Agency on the new study.
For more information on groundwater in Minnesota, please visit the USGS Minnesota Water Science Center website.
The latest coal resource assessment of the Powder River Basin showcases the newly revised USGS assessment methodology, which, for the first time, includes an estimate of the reserve base for the entire basin.
The coal reserve base includes those resources that are currently economic (reserves), but also may encompass those parts of a resource that have a reasonable potential for becoming economically available within planning horizons. The complete, final assessment results are available in two USGS publications released today: Professional Paper 1809 and Data Series 912.
The Powder River Basin contains one of the largest resources of low-sulfur, low-ash, subbituminous coal in the world and is the single most important coal basin in the United States.
The most important distinction between this Powder River Basin coal assessment and prior assessments was the inclusion of mining and economic analyses to develop an estimate of the portion of the total resource that is potentially recoverable, not just the original (in-place) resources. Prior resource assessments relied on net coal thickness maps for only selected beds, which provided only in-place resource estimates.
The key to performing the economic analyses was gathering and interpreting a sufficient amount of recent geological data from the extensive coal bed methane development over the past 20 years in the Powder River Basin. This wealth of new data was essential to enable modeling and mapping of all of the significant individual coal beds over the entire Powder River Basin for the first time.
The revised USGS assessment methodology resulted in an estimated original resource of about 1.16 trillion short tons in the Powder River Basin, of which 162 billion short tons are considered recoverable resources (coal reserve base) at a stripping ratio of 10:1 or less. An estimated 25 billion short tons of that coal reserve base met the definition of reserves. A 10:1 stripping ratio is approximately estimated by dividing the total thickness of rock mined by the total thickness of coal recovered.
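The stripping-ratio screen described above can be sketched as simple arithmetic. The bed thicknesses below are illustrative numbers, not values from the assessment; only the 10:1 cutoff comes from the article.

```python
def stripping_ratio(overburden_thickness_ft, coal_thickness_ft):
    """Approximate stripping ratio: total thickness of rock mined
    divided by total thickness of coal recovered."""
    return overburden_thickness_ft / coal_thickness_ft

# Illustrative bed: 400 ft of overburden above a 50-ft coal bed.
ratio = stripping_ratio(400, 50)   # 8.0
in_reserve_base = ratio <= 10      # True: within the 10:1 cutoff
```

Under this screen, the same 50-foot bed beneath 600 feet of overburden (ratio 12:1) would fall outside the reserve base, which is why deeper coal in the basin is counted separately.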
This reserve estimate does not mean that the total amount of coal left in the Powder River Basin could be produced by surface mining technologies. The costs of mining and coal sales prices are not static, as both tend to increase over time if supported by demand. If future market prices continue to exceed mining costs, portions of the coal reserve base would be elevated to reserve status (and the converse).
The estimate of the current reserves, along with the total coal reserve base, provides more meaningful resource information for use by energy planners from local to national perspectives than total in-place resource quantities alone.
Although no underground mining in the Powder River Basin is expected to occur in the foreseeable future, a substantial, deeper coal resource in beds 10–20 feet thick is estimated at 304 billion short tons in the region.
The USGS Energy Resources Program research efforts yield comprehensive, digital assessments of the quantity, quality, location, and accessibility of the Nation’s coal resources.
To learn more about this or other geologic assessments, please visit the USGS Energy Resources Program website. Stay up to date with USGS energy science by subscribing to our newsletter or by following us on Twitter.
Mining companies, land managers, and regulators now have a wealth of tools to aid in reducing potential mining impacts even before a mine gets started. USGS and various research partners released a special issue of papers providing up-to-date research on the environmental effects of modern mining techniques.
Minerals play an important role in the global economy, and, as rising standards of living have increased demand for those minerals, the number and size of mines have increased, leading to larger potential impacts from mining.
“Approaches to protecting the environment from mining impacts have undergone a revolution over the past several decades,” said USGS mineral and environmental expert Bob Seal. “The sustainability of that revolution relies on an evolving scientific understanding of how mines and their waste products interact with the environment.”
Many research conclusions are contained in the special issue, and some of the primary findings are listed here:
- USGS evaluated several tools for predicting pre-mining baseline conditions at a mine, even if no baseline was established. This will make it easier to remediate the mine after it closes.
- USGS also took tools used to screen mine waste for contaminants and tested them for predicting potential sources for contaminants before the mine even got started.
Mitigating while Mining
- Because slag is the byproduct of mineral processing, its physical and chemical properties depend largely on the original mined mineral material.
- Slag from copper, zinc, or nickel may be less attractive for reuse, since it has a higher potential to negatively impact the environment than slag that came from iron or steel production.
- Gold mining runs a lower risk of contaminating the environment with cyanide if mines allow enough time for the cyanide to safely evaporate and be broken down by sunlight.
- Mine drainage is incredibly complicated. It doesn’t come from a single source, but rather from complex interactions among water, air, and micro-organisms such as bacteria.
- Mine drainage is not just acid mine drainage—it can be basic, neutral, or even high in salts. All of these drainage types have their own impacts.
- Mine drainage concentrations in streams can vary with the time of day.
- USGS tested many of the existing techniques for determining which toxic contaminants wind up in stream sediments, so managers can choose the right tool for the job.
- USGS also evaluated a new technique for predicting how toxic certain metals will be in aquatic environments.
The research papers are contained in a special issue of the journal Applied Geochemistry. This research was conducted by scientists from USGS and several collaborating organizations, including the Geological Survey of Canada, InTerraLogic, Montana Bureau of Mines and Geology, Montana Tech, SUNY Oneonta, the University of Maryland, the University of Montana, and the University of Waterloo.
USGS minerals research can help to identify potential problems before they arise, or at the very least, help address the impacts that do exist. Learn more about USGS minerals research here, or follow us on www.twitter.com/usgsminerals.
A pine siskin stands on the branch of a northern conifer tree. Photo, USFWS National Digital Library. (High resolution image)
Weaving concepts of ecology and climatology, recent interdisciplinary research by USGS and several university partners reveals how large-scale climate variability appears to connect boom-and-bust cycles in the seed production of the boreal (northern conifer) forests of Canada to massive, irregular movements of boreal birds.
These boreal bird “irruptions” — extended migrations of immense numbers of birds to areas far outside their normal range — have been recorded for decades by birders, but the ultimate causes of the irruptions have never been fully explained.
“This study is a textbook example of interdisciplinary research, establishing an exciting new link between climate and bird migrations,” said USGS acting Director Suzette Kimball. “A vital strength of our organization is our ability to pursue scientific issues across the boundaries of traditional academic disciplines.”
The investigation was based on statistical analysis of two million observations of the pine siskin (a finch, Spinus pinus) recorded since 1989 by Project FeederWatch, a citizen science program managed by the Cornell Lab of Ornithology. By methodically counting the birds they see at their feeders from November through early April, FeederWatchers help scientists track continent-wide movements of winter bird populations.
One of several nomadic birds that breed during summer in Canadian boreal forests, pine siskins feed on seed crops of conifers and other tree species. When seed is abundant locally, pine siskins also spend the autumn and winter there. In other years, they may irrupt, migrating unpredictably hundreds or even thousands of kilometers to the south and east in search of seed and favorable habitat. “Superflights” is the term applied to winters (e.g., 1997–1998 and 2012–2013) when boreal species have blanketed bird feeders across the U.S.
The irruptions of pine siskins and other boreal species follow a lagging pattern of intermittent, but broadly synchronous, accelerated seed production (“masting”) by trees in the boreal forest. Widespread masting in pines, spruces, and firs is driven primarily by favorable climate during the two or three consecutive years required to initiate and mature seed crops. Leading up to masting events, the green developing cones and the promise of abundant seed stimulate higher reproductive rates in birds.
However, seed production is expensive for trees and tends to be much reduced in the years following masting. Consequently, meager seed crops in the years following masting drive boreal birds to search elsewhere for food and overwintering habitat.
The key finding of the new research is that the two principal pine siskin irruption modes – North to South and West to East – correlate closely with spatial patterns of climate variability across North America that are well understood by climatologists. Not surprisingly, severely cold winters tend to drive birds south during the irruption year.
More subtly, the researchers found that favorable and unfavorable climatic conditions of regularly juxtaposed regions called “climate dipoles” two years prior to the irruption also appear to push and pull bird migrations across the continent.
USGS co-author Julio Betancourt commented, “Our study underscores the value of continent-wide biological monitoring. In this case, avid birders across the U.S. and Canada have contributed sustained observations of birds at the same broad geographic scale in which weather and climate have also been observed and understood.”
The research study, authored by Court Strong (University of Utah), Ben Zuckerberg (University of Wisconsin-Madison), Julio Betancourt (USGS-Reston), and Walt Koenig (Cornell University), was published May 11 online in the Proceedings of the National Academy of Sciences.
Storage tanks for produced water from natural gas drilling in the Marcellus Shale gas play of western Pennsylvania. USGS photo, Doug Duncan. (High resolution image)
In a study of 13 hydraulically fractured shale gas wells in north-central Pennsylvania, USGS researchers found that the microbiology and organic chemistry of the produced waters varied widely from well to well.
The variations in these aspects of the wells followed no discernible spatial or geological pattern but may be linked to the time a well was in production. Further, the study highlighted the presence of some organic compounds (e.g., benzene) in produced waters that could present potential risks to human health if the waters are not properly managed.
Produced water is the term specialists use to describe the water brought to the land surface during oil, gas, and coalbed methane production. This water is a mixture of naturally occurring water and fluid injected into the formation deep underground to enhance production. A USGS Fact Sheet on produced water provides more background information and terminology definitions.
The USGS investigators found that the inorganic (noncarbon-based) chemistry of produced waters from the shale gas wells tested in the Marcellus region was fairly consistent from well to well and meshed with comparable results of previous studies (see USGS Energy Produced Waters Project). In contrast, the large differences in the organic geochemistry (carbon-based, including petroleum products) and microbiology (e.g., bacteria) of the produced waters were striking findings of the study.
“Some wells appeared to be hotspots for microbial activity,” observed Denise Akob, a USGS microbiologist and lead author of the study, “but this was not predicted by well location, depth, or salinity. The presence of microbes seemed to be associated with concentrations of specific organic compounds — for example, benzene or acetate — and the length of time that the well was in production.”
The connection between the presence of organic compounds and the detection of microbes was not, in itself, surprising. Many organic compounds used as hydraulic fracturing fluid additives are biodegradable and thus could have supported microbial activity at depth during shale gas production.
The notable differences in volatile organic compounds (VOCs) from the produced waters of the tested wells could play a role in the management of produced waters, particularly since VOCs, such as benzene, may be a health concern around the well or holding pond. In wells without VOCs, on the other hand, disposal strategies could concentrate on issues related to the handling of other hazardous compounds.
Microbial activity detected in these samples could turn out to be an advantage by contributing to the degradation of organic compounds present in the produced waters. Potentially, microbes could also serve to help mitigate the effects of organic contaminants during the disposal or accidental release of produced waters. Additional research is needed to fully assess how microbial activity can best be utilized to biodegrade organic compounds found in produced waters.
The research article can be found in the most recent edition of Applied Geochemistry, Special Issue on Shale Gas Geochemistry.