A new study based on Earth-observing satellite data comprehensively describes changes in the world's forests from the beginning of this century. Published in Science today, this unparalleled survey of global forests tracked forest loss and gain at the spatial granularity of an area covered by a baseball diamond (30-meter resolution).
Led by Matthew C. Hansen of the University of Maryland and assisted by USGS co-author Thomas R. Loveland, a team of scientists analyzed data from the Landsat 7 satellite to map changes in forests from 2000 to 2012 around the world at local to global scales. The uniform data from more than 650,000 scenes taken by Landsat 7 ensured a consistent global perspective across time, national boundaries, and regional ecosystems.
"Tracking changes in the world's forests is critical because forests have direct impacts on local and national economies, on climate and local weather, and on wildlife and clean water," said Anne Castle, Assistant Secretary of the Interior for Water and Science. "This fresh view of recent changes in the world’s forests is thorough, objective, visually compelling, and vitally important."
Overall, the study found that from 2000 to 2012 global forests experienced a loss of 888,000 square miles (2.3 million square kilometers), roughly the land area of the U.S. states east of the Mississippi River. During the study period, global forests also gained an area of 309,000 square miles (800,000 square kilometers), approximately the combined land area of Texas and Louisiana.
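As a quick back-of-the-envelope check of those figures (illustrative arithmetic only, not part of the study itself), the unit conversions and the net change work out as follows:

```python
# Back-of-the-envelope check of the reported forest-change areas.
SQ_KM_PER_SQ_MI = 2.58999  # square kilometers per square mile

loss_sq_mi = 888_000   # reported global forest loss, 2000-2012
gain_sq_mi = 309_000   # reported global forest gain, 2000-2012

loss_sq_km = loss_sq_mi * SQ_KM_PER_SQ_MI   # ~2.3 million sq km
gain_sq_km = gain_sq_mi * SQ_KM_PER_SQ_MI   # ~0.8 million sq km
net_loss_sq_km = loss_sq_km - gain_sq_km    # ~1.5 million sq km net loss

print(f"Loss: {loss_sq_km:,.0f} sq km, gain: {gain_sq_km:,.0f} sq km, "
      f"net loss: {net_loss_sq_km:,.0f} sq km")
```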
The global survey found that Russia experienced the most forest loss overall (in absolute numbers) over the study period. Brazil was the nation with the second highest level of forest loss, but other countries, including Malaysia, Cambodia, Cote d’Ivoire, Tanzania, Argentina and Paraguay, experienced a greater proportional loss of forest cover. Indonesia exhibited the largest increase in forest loss; its losses on an annual basis during 2011-12 were twice what they were during 2000-03.
Brazil is a global exception in terms of forest change during this timeframe, with a dramatic policy-driven reduction in the rate of deforestation in the Amazon Basin. Brazil's use of free Landsat data in documenting trends in deforestation was crucial to its policy formulation and implementation. To date, only Brazil produces and shares spatially explicit information on annual forest extent and change.
In the United States, the most intensive forest change was noted in the southeastern states, where pine plantations allow for cyclic tree harvesting for timber followed by immediate replanting. In this area, over 30 percent of the forest cover was either lost or regrown during the study period.
Deforestation and deliberate forest regrowth are the human factors that accounted for most of the forest change. Natural forces — for instance, wildfire, windstorms, insect infestations, and regrowth of abandoned agricultural areas — also caused forest changes, and these were likewise methodically mapped.
"Ever since the USGS made Landsat data free to anyone in 2008, Landsat imagery has served as a reliable common record, a shared vocabulary of trusted data about Earth conditions," Castle continued. "It's been said that the free data policy is like giving every person on the globe a free library card to the world's best library on Earth observations."
"With the free data policy, we have seen a remarkable revolution in the use of Landsat for documenting the changes in the Earth's land cover," said Tom Loveland, chief scientist at the USGS Earth Resources Observation and Science (EROS) Center and a co-author of the study. "This multi-organization project was only feasible with the existence of free Landsat data. The invaluable Landsat archive supplies high-quality, long-term, consistent global data at a scale appropriate for tracking forest gains and losses."
The research team included scientists from the University of Maryland, the U.S. Geological Survey, Google, the State University of New York, Woods Hole Research Center, and South Dakota State University. The collaborative study is published in the Nov. 15 issue of the journal Science.
Key to the project was collaboration with the Google Earth Engine team, who leveraged sophisticated cloud-computing technology to enable University of Maryland researchers to process the vast amount of data in a matter of days. Additional project funding was provided to the University of Maryland by the Gordon and Betty Moore Foundation.
Since 1972, the Landsat program has played a critical role in supplying continuous, objective data that can be used to monitor, understand, and manage the resources needed to sustain human life, such as food, water, and forests. The Landsat 8 satellite, launched in February 2013, is designed to extend the four-decade Landsat record of Earth observation. The Landsat program is jointly managed by NASA and USGS.
- "High-resolution global maps of 21st-century forest cover change," Science, 15 Nov 2013.
- Forest cover maps in Google Earth Engine
- Global Forest Change animation videos
- USGS Landsat
- NASA Landsat
Learn even more by participating
November 18 at 10:00 a.m. PT / 1:00 p.m. ET
The USGS, in cooperation with other Federal agencies, has posted new Ohio US Topo quadrangles (748 maps), which include partial Public Land Survey System (PLSS) data. Ohio is the first state east of the Mississippi River to have PLSS data added to US Topo maps, joining Wyoming and Colorado in the west.
"It is great to have these 748 updated US Topo maps for our state available online at no charge," said Charley Hickman, the Geospatial Liaison for Ohio. "We appreciate the continuing improvements in this product, including the availability of PLSS township, range, and section information."
The PLSS is a way of subdividing and describing land in the United States. All lands in the public domain are subject to subdivision by this rectangular system of surveys, which is regulated by the U.S. Department of the Interior. Other selected states will begin getting PLSS map data during their next revision cycles.
The new design for US Topo maps improves readability for online and printed use, while retaining the look and feel of the traditional USGS topo map. Map symbols remain easy to read when the digital aerial photograph layer is turned on.
Other re-design enhancements and new features:
- New shaded relief layer for enhanced view of the terrain
- Military installation boundaries, post offices and cemeteries
- New road classification
- A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
- New PDF legend attachment
- Metadata formatted to support multiple browsers
US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps. The new US Topo maps provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users.
The new digital topographic maps are published as PDF documents with geospatial extensions (the GeoPDF® format) and may be viewed using Adobe Reader, available as a no-cost download.
These new quads replace the first edition US Topo maps for Ohio. The replaced maps will be added to the USGS Historical Topographic Map Collection and are also available for free download from The National Map and the USGS Map Locator & Downloader website.
US Topo maps are updated every three years. The initial round of coverage for the 48 conterminous states was completed in September 2012. Hawaii and Puerto Rico maps have recently been added. More than 400 new US Topo maps for Alaska have been added to the USGS Map Locator & Downloader, but complete coverage of that state will take several years.
For more information, go to: http://nationalmap.gov/ustopo/
USGS scientists have determined that high-salinity groundwater found more than 1,000 meters (0.6 mi.) deep under the Chesapeake Bay is actually remnant water from the Early Cretaceous North Atlantic Sea and is probably 100-145 million years old. This is the oldest sizeable body of seawater to be identified worldwide.
Twice as salty as modern seawater, the ancient seawater was preserved like a prehistoric fly in amber, aided in part by the impact of a massive comet or meteorite that struck the area about 35 million years ago, creating Chesapeake Bay.
"Previous evidence for temperature and salinity levels of geologic-era oceans around the globe has been estimated indirectly from various types of evidence in deep sediment cores," said Ward Sanford, a USGS research hydrologist and lead author of the investigation. "In contrast, our study identifies ancient seawater that remains in place in its geologic setting, enabling us to provide a direct estimate of its age and salinity."
The largest crater discovered in the United States, the Chesapeake Bay impact crater is one of only a few oceanic impact craters that have been documented worldwide.
About 35 million years ago a huge rock or chunk of ice traveling through space blasted a 56-mile-wide hole in the shallow ocean floor near what is now the mouth of the Chesapeake Bay. The force of the impact ejected enormous amounts of debris into the atmosphere and spawned a train of gigantic tsunamis that probably reached as far as the Blue Ridge Mountains, more than 110 miles away.
The impact of the comet or meteorite would have deformed and broken up the existing arrangement of aquifers (water-bearing rocks) and confining units (layers of rock that restrict the flow of groundwater). Virginia's "inland saltwater wedge" is a well-known phenomenon that is thought to be related to the impact crater. The outer rim of the crater appears to coincide with the boundary separating salty and fresh groundwater.
"We knew from previous observations that there is deep groundwater in quite a few areas in the Atlantic Coastal Plain around the Chesapeake Bay that have salinities higher than seawater,” said Jerad Bales, acting USGS Associate Director for Water. "Various theories related to the crater impact have been developed to explain the origin of this high salinity. But, up to this point, no one thought that this was North Atlantic Ocean water that had essentially been in place for about 100 million years."
"This study gives us confidence that we are working directly with seawater that dates far back in Earth’s history,” Bales continued. “The study also has heightened our understanding of the geologic context of the Chesapeake Bay region as it relates to improving our understanding of hydrology in the region."
USGS Professional Paper 1612. "The Effects of the Chesapeake Bay Impact Crater on the Geological Framework and Correlation of Hydrogeologic Units of the Lower York-James Peninsula, Virginia."
New research by the U.S. Geological Survey conducted on the Delmarva Peninsula, which forms the Eastern Shore of the Chesapeake Bay, indicates it may take several decades for many water-quality management practices aimed at reducing nitrogen input to the Bay to achieve their full benefit due to the influence of groundwater.
The USGS findings provide critical information on how long it may take to see the water quality in the Bay improve as more stringent practices are implemented to reduce nutrients and sediment to tidal waters. Having established a calculation for the total nitrogen, phosphorus, and sediment pollutants that are allowable for the Chesapeake watershed, known as the total maximum daily load (TMDL), the U.S. Environmental Protection Agency (EPA) is working with Maryland, Pennsylvania, Virginia and the four other Bay watershed jurisdictions to ensure that all water-quality practices needed to reduce the flow of nutrients and sediment to the Bay are in place by 2025.
"This new understanding of how groundwater affects water-quality restoration in the Chesapeake Bay will help sharpen our focus as many agencies, organizations, and individuals work together to improve conditions for fish and wildlife," said Lori Caramanian, Department of the Interior Deputy Assistant Secretary for Water and Science. "In turn, improved environmental conditions will serve to further people’s enjoyment and promote the economic benefits of the Nation’s largest estuary."
The responses of watershed systems and ecosystems to environmental management actions at any location can vary from rapid changes (such as the swift beneficial effect of a wastewater treatment plant upgrade) to longer improvement intervals of several decades. In the Chesapeake Bay, "lag response time" refers to the time between implementing management actions and the resulting improvements in water quality. Lag times will vary for nitrogen, phosphorus, and sediment.
This USGS study focused on nitrogen. Some of the nitrogen will run off directly into a stream, but a large portion on the Delmarva (more than two thirds) is affected by the slow travel times of nutrients moving from their land source through underground aquifers to a receiving stream or estuary.
Sources of nitrogen include fertilizer and manure applications to agricultural land, wastewater and industrial discharges, runoff from urban areas, domestic septic drain fields, and air emissions. Excess nitrogen contributes to algal blooms that cause low dissolved oxygen in the Bay and related fish kills each summer and impact recreational activities.
For this study USGS scientist Ward Sanford developed a complex model for water, geology, and chemical interactions that he applied to seven separate watersheds on the Delmarva Peninsula. Based on the concept of nitrogen mass-balance regression, the model was able to reproduce the time history of nitrate concentrations in area streams and wells, including a recent slowdown in the rate of concentration increase in streams. The model was then also used to forecast future nitrogen delivery from the Delmarva Peninsula to the Bay under different nitrogen management scenarios.
The new study shows that ages of groundwater and associated nitrogen moving from the Delmarva Peninsula into the Chesapeake Bay range from less than a year to centuries, with median ages ranging from 20 to 40 years. These groundwater age distributions are markedly older than previously estimated for areas west and north of the Bay, which have a median age of 10 years. The older ages occur because the porous, sandy aquifers on the Delmarva yield longer groundwater return times than the fractured-rock areas of the Bay watershed.
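The study's own model is a nitrogen mass-balance regression; as a much-simplified illustration of why such age distributions delay the stream response, the sketch below convolves a hypothetical nitrogen-input history with an assumed travel-time distribution whose median falls in the reported 20-40 year range. All numbers here are illustrative, not from the study.

```python
# Simplified illustration (not the study's model) of how a groundwater
# age distribution delays the nitrogen signal reaching a stream.
import numpy as np

years = np.arange(1960, 2051)
# Hypothetical nitrogen input to the water table (arbitrary units):
# rising from 1960 through 1990, then held flat.
n_input = np.interp(years, [1960, 1990, 2050], [1.0, 3.0, 3.0])

# Hypothetical travel-time (age) distribution: an exponential with a mean of
# ~43 years has a median of ~30 years, within the 20-40 year range reported.
ages = np.arange(0, 200)
mean_age = 43.0
age_pdf = np.exp(-ages / mean_age)
age_pdf /= age_pdf.sum()

# Stream load = input history convolved with the age distribution
# (inputs before 1960 are treated as zero for simplicity).
stream_load = np.convolve(n_input, age_pdf)[: len(years)]

for y in (2000, 2013, 2030, 2050):
    print(f"{y}: simulated relative stream load = {stream_load[years == y][0]:.2f}")
# The simulated load keeps rising for decades even though the input
# stopped increasing in 1990 -- the lag effect described in the study.
```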
The USGS research found that in some areas of the Delmarva the groundwater currently discharging to streams is gradually transitioning to waters containing higher amounts of nitrate due to fertilizer used during the 1970s through the 1990s. Similarly, the total amount of nitrogen in the groundwater is continuing to rise as a result of the slow groundwater response times.
Without additional management practices being implemented, the study forecasts about a 12% increase in nitrogen loads from the Delmarva to the Bay by 2050. The study provides several scenarios for reducing nitrogen to the water table and the amount of time needed to see the reductions in groundwater discharging to streams. For example, the model predicts that a 25% reduction in the nitrogen load to the water table would be required to achieve a 13% reduction in the load to the Bay.
However, the results also indicate that nutrient management practices implemented over the past decade or so have begun to work, and they confirm that future nitrogen loading to streams will depend on the rigor of the water-quality practices implemented now to reduce nutrients.
This study highlights the complexities of environmental restoration of the Bay. The findings help refine the expectations of resource managers and citizens alike of how long it may take to see substantial water-quality improvements in the Bay, and they may provide additional insight into the effectiveness of different types of land management practices given the time lag created by local groundwater response times.
The study was done as part of increased federal efforts under the President’s 2009 Chesapeake Executive Order, which directs Federal agencies, including the EPA and the Department of the Interior, to "begin a new era" in protection and restoration of the Chesapeake Bay. With a watershed that spreads across six states and Washington, DC, the Chesapeake Bay is the largest estuary in the United States and one of the largest and most biologically productive estuaries in the world.
"Quantifying Groundwater's Role in Delaying Improvements to Chesapeake Bay Water Quality," Environmental Science & Technology
SACRAMENTO, Calif. — Although dousing the flames was foremost in people's minds during the recent Rim Fire in Stanislaus National Forest and Yosemite National Park, U.S. Geological Survey scientific work continues well after the fire is out. USGS scientists are continuing their critical research characterizing the hidden dangers faced after large wildfires.
While the fire was still smoldering in September, the multi-agency BAER (Burned Area Emergency Response) team developed a burn-severity map and shared it with USGS scientists. USGS scientists assessed the potential for debris flows, which tend to occur when winter rains soak steep slopes after a fire, by adding critical information on soil characteristics, terrain ruggedness, and typical rainfall for the area in order to model the likelihood and possible volumes of debris flows. The just-published Rim Fire debris-flow hazard assessment map will help land and emergency managers focus mitigation treatments where post-fire debris flows might do the greatest damage.
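The release does not spell out the form of the hazard model; post-fire assessments of this kind are commonly built on empirical logistic-regression models that combine burn severity, soil, terrain, and rainfall. The sketch below uses hypothetical predictors and coefficients only to show how such inputs can combine into a basin-scale probability.

```python
# Illustrative sketch of a post-fire debris-flow probability model.
# The predictors and coefficients here are hypothetical placeholders,
# not the fitted model used in the Rim Fire assessment.
import math

def debris_flow_probability(pct_high_severity_burn: float,
                            mean_basin_gradient_deg: float,
                            soil_clay_pct: float,
                            storm_rainfall_mm: float) -> float:
    """Return a 0-1 probability of a debris flow for one drainage basin."""
    x = (-4.0
         + 0.03 * pct_high_severity_burn
         + 0.05 * mean_basin_gradient_deg
         + 0.02 * soil_clay_pct
         + 0.04 * storm_rainfall_mm)
    return 1.0 / (1.0 + math.exp(-x))  # logistic link: inputs -> probability

# Example: a steep, severely burned basin hit by a 25 mm storm.
p = debris_flow_probability(pct_high_severity_burn=60,
                            mean_basin_gradient_deg=25,
                            soil_clay_pct=15,
                            storm_rainfall_mm=25)
print(f"Estimated debris-flow probability: {p:.0%}")
```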
As the Rim Fire growth slowed and the BAER team began their onsite evaluation of post-fire threats to life and property, the USGS California Water Science Center and the San Francisco Public Utilities Commission, operator of the Hetch Hetchy Regional Water System, assessed project facilities and USGS streamgages within the fire perimeter. Many of these streamgages were scorched and in need of repair, and the facilities are now subject to increased potential of debris flows and flash flooding.
"Our ongoing science research and the relationships and collaborations we have built with wildfire responders and land managers help mitigate and reduce the hazards of wildfire in the future," said USGS Pacific Regional Director Mark Sogge.
The most time-critical task for USGS scientists now is to work with the California Department of Water Resources, Modesto Irrigation District, and Turlock Irrigation District to establish a streamgage and extensive water-quality monitoring at a location downstream from the fire extent and upstream from Don Pedro Reservoir in the Sierra Nevada foothills. Documenting the quantity and quality of water entering the reservoir, and modeling streamflow changes in response to the fire, gives reservoir managers the tools to understand the cumulative effects of the Rim Fire on future water supplies.
Throughout the fire, USGS satellite imagery, such as Landsat's before-and-after images of the Rim Fire, provided firefighters with situational awareness and helped them mount their campaign against the intense, fast-moving flames. Maps and satellite imagery were a daily element of Integrated Fire Command briefings.
USGS continues to work with the California Department of Water Resources, the SFPUC, and the Don Pedro Reservoir managers to include the Rim Fire burn-area data in, and thereby increase the utility of, hydrologic models currently being developed. Additional remote sensing information will be incorporated into the updated hydrologic model to help the managers understand how runoff and streamflow change through the years after the fire.
USGS is also monitoring soil moisture and incorporating soil properties, burn severity, and expected rates of vegetation recovery into a computer model to understand the hydrologic processes, responses, and sensitivities of the burned areas to a range of potential climatic conditions in the next five years.
Although the challenging job of managing such fires rests with other agencies, USGS provides the underlying science for sound land management decisions, before, during, and after wildfires. The USGS role studying natural hazards such as floods, landslides, earthquakes and volcanoes is well known, but fewer people are aware of the USGS scientific work in major wildfire events, which are one of the most regular and sometimes most devastating natural hazards in the West.
Well before the Rim Fire, USGS fire ecologists routinely provided scientific information to help state and federal agencies manage wildfire as a critical and natural ecosystem process and as a potential hazard to human life, property and infrastructure in the Sierra Nevada as well as Southern California. A new video tells this story. USGS ecologists also monitor the effects of ash on wildlife.
These are only a few of the many ways USGS scientists and experts help to manage the tremendous potential for wildfire-related calamity in the West.
Map: Likelihood of significant debris-flow hazard within and downstream of the Rim Fire burn area. Red areas are most likely to generate debris flows. (High-resolution image, 36 MB PDF.)
Although recent declines in nitrate in the Illinois River are promising, increasing nitrate levels at other sites throughout the basin are a continuing cause for concern.
Nitrate levels in the Illinois River decreased by 21 percent between 2000 and 2010, marking the first time substantial, multi-year decreases in nitrate have been observed in the Mississippi River Basin since 1980, according to a new USGS study.
Unfortunately, similar signs of progress were not widespread. "Nitrate levels continue to increase in the Missouri and Mississippi Rivers, including the Mississippi’s outlet to the Gulf of Mexico," said Lori Sprague, USGS research hydrologist.
"These results show that solving the problem of the dead zone will not be easy or quick. We will need to work together with our federal and state partners to develop strategies to address nitrate concentrations in both groundwater and surface water," said Lori Caramanian, Department of the Interior Deputy Assistant Secretary for Water and Science.
"Expanded research and monitoring is absolutely essential to tracking progress in reducing nutrient pollution in the Mississippi River Basin," said Nancy Stoner, acting Assistant Administrator for Water at the U.S. Environmental Protection Agency and co-chair of the Hypoxia Task Force. "The federal agencies and states that are part of the Hypoxia Task Force greatly appreciate this work by USGS and how it advances the science in the Mississippi River Basin."
Excessive nitrate and other nutrients from the Mississippi River strongly influence the extent and severity of the hypoxic zone that forms in the northern Gulf of Mexico every summer. This hypoxic zone, known as the "dead zone," is characterized by extremely low oxygen levels in bottom or near-bottom waters, degraded water quality, and impaired marine life. The 2013 Gulf hypoxic zone encompassed 5,840 square miles, an area the size of Connecticut.
The reasons for increases or declines in annual nitrate levels are unknown. Reliable information on trends in contributing factors, such as fertilizer use, livestock waste, agricultural management practices, and wastewater treatment improvements, is needed to better understand what is causing increases or decreases in nitrate.
Nitrate trends from 1980 to 2010 were determined at eight long-term USGS monitoring sites in the Mississippi River Basin, including four major tributaries (Iowa, Illinois, Ohio and Missouri rivers) and four locations along the Mississippi River, using methodology that adjusts for year-to-year variability in streamflow conditions.
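The specific trend methodology is not described in this release; as a minimal sketch of the general idea of adjusting for streamflow variability, one can regress concentration on both streamflow and time and read the time coefficient as a flow-adjusted trend. This is illustrative only, not the USGS method.

```python
# Minimal sketch of flow adjustment for concentration trends (not the
# specific USGS methodology): regress concentration on streamflow and time,
# then read the time coefficient as the flow-adjusted trend.
import numpy as np

def flow_adjusted_trend(years, flows, concentrations):
    """Return the per-year concentration trend after accounting for flow."""
    # Design matrix: intercept, log(streamflow), and year.
    X = np.column_stack([np.ones_like(years, dtype=float),
                         np.log(flows),
                         years.astype(float)])
    coeffs, *_ = np.linalg.lstsq(X, concentrations, rcond=None)
    return coeffs[2]  # mg/L change per year at a fixed streamflow

# Tiny synthetic example: concentration rises with time and varies with flow.
rng = np.random.default_rng(0)
years = np.arange(1980, 2011)
flows = rng.lognormal(mean=8.0, sigma=0.3, size=years.size)
conc = (2.0 + 0.02 * (years - 1980)
        - 0.3 * np.log(flows / flows.mean())
        + rng.normal(0, 0.05, years.size))
print(f"Flow-adjusted trend: {flow_adjusted_trend(years, flows, conc):+.3f} mg/L per year")
```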
- Nitrate concentrations steadily decreased by 21 percent in the Illinois River from 2000 to 2010. Decreases were also noted in the Iowa River during this time, but the declines were not as large (10 percent).
- Consistent increases in nitrate concentrations occurred between 2000 and 2010 in the upper Mississippi River (29 percent) and the Missouri River (43 percent).
- Nitrate concentrations in the Ohio River are the lowest among the eight Mississippi River Basin sites and have remained relatively stable over the last 30 years.
- Nitrate concentrations increased at the Mississippi River outlet by 12 percent between 2000 and 2010.
Nitrate increased at low streamflows throughout the basin, except for the Ohio and Illinois Rivers. Groundwater is likely the dominant source of nitrate during low flows. It may take decades for nitrate to move from the land where it was applied, migrate through groundwater, and eventually flow into these rivers. Because of this lag time, it can take several years or decades before the full water-quality effects of either increased groundwater contamination, or of improved management practices, are evident in the rivers.
The USGS report and additional information on nitrate trends in concentration and flux for each of the eight sites are available online. Additional information on USGS long-term monitoring sites in the Mississippi River Basin is also available online (PDF).
Research on nitrate trends is conducted as part of the USGS National Water-Quality Assessment (NAWQA) program. This program provides an understanding of water-quality conditions, whether conditions are getting better or worse over time, and how natural features and human activities combine to affect water quality. Additional information on the NAWQA program can be found online.
USGS Long-term Nitrate Trends Monitoring Sites
The accompanying graphic depicts the Mississippi River drainage in the central U.S. and indicates the geographic locations of the eight permanent nitrate monitoring stations:
- Mississippi River at Clinton, IA
- Iowa River at Wapello, IA
- Illinois River at Valley City, IL
- Mississippi River below Grafton, IL
- Missouri River at Hermann, MO
- Mississippi River at Thebes, IL
- Ohio River near Grand Chain, IL
- Mississippi River above Old River Outflow Channel, LA
Americans place high value on butterfly royalty. A recent study suggests they are willing to support monarch butterfly conservation at high levels, up to about $6.5 billion if extrapolated to all U.S. households.
If even a small percentage of the population acted upon this reported willingness, the cumulative effort would likely translate into a large, untapped potential for conservation of the iconic butterfly.
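For a rough sense of scale, the arithmetic below back-calculates an implied per-household figure and shows what small participation rates would mean. The household count is an assumed round number, not a figure from the study.

```python
# Illustrative arithmetic only; the household count is an assumption.
total_wtp_usd = 6.5e9    # ~"about $6.5 billion" extrapolated total from the study
us_households = 115e6    # assumed round number of U.S. households

implied_per_household = total_wtp_usd / us_households
print(f"Implied average willingness to pay: about ${implied_per_household:.0f} per household")

# If only a small share of households acted on that stated willingness:
for participation in (0.01, 0.05, 0.10):
    print(f"{participation:.0%} participation -> "
          f"${total_wtp_usd * participation / 1e6:,.0f} million for conservation")
```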
Monarch butterfly populations have been declining across Mexico, California and other areas of the United States since 1999. A 2012 survey at the wintering grounds of monarchs in Mexico showed the lowest colony size ever recorded.
"The multigenerational migration of the monarch butterfly is considered one of the world’s most spectacular natural events," said Jay Diffendorfer, a USGS scientist and the study’s lead author. "However, managing migratory species is difficult because they can cross international borders and depend on many geographic areas for survival."
Much of the decline in monarch numbers has been blamed on the loss of milkweed, the native plants on which monarch caterpillars feed.
"While many factors may be affecting monarch numbers, breeding, migrating, and overwintering habitat loss are probably the main culprits," said Karen Oberhauser, a monarch biologist at the University of Minnesota and a co-author of the study. "In the U.S., the growing use of genetically-modified, herbicide-tolerant crops, such as corn and soybeans, has resulted in severe milkweed declines and thus loss of breeding habitat."
The authors suggest that the universal popularity of monarchs could encourage a market for monarch-friendly plants.
"This is the first nation-wide, published, economic valuation survey of the general public for an insect. The study indicates that economic values of monarch butterflies are potentially large enough to mobilize people for conservation planting and funding habitat conservation," said John Loomis, the lead economist on the study from Colorado State University.
"The life cycle of monarchs creates opportunities for untapped market-based conservation approaches," Diffendorfer continued. "Ordinary households, conservation organizations, and natural resource agencies can all plant milkweed and flowering plants to offset ongoing losses in the species’ breeding habitat."
According to the annual survey of the National Gardening Association, households that identify as "do-it-yourself lawn and gardeners" spent $29.1 billion in related retail sales in 2012.
"By reallocating some of those purchases to monarch-friendly plants, people would be able to contribute to the conservation of the species as well as maintain a flower garden," said Oberhauser. "Helping restore the monarch’s natural habitat, and potentially the species’ abundance, is something that people can do at home by planting milkweed and other nectar plants."
Unfortunately, many plants purchased by gardeners have been treated with systemic insecticides that can kill both pollinators that consume the nectar, and caterpillars, like monarchs, that eat the leaves.
"This study shows that not only might consumers pay more for monarch-friendly milkweeds grown without systemic insecticides in the potting soil, but also that consumers might be more interested overall in buying nectar-producing plants or milkweeds if they knew a small percentage of sales will be donated to habitat conservation," said Diffendorfer.
The study, released today in Conservation Letters, was authored by researchers with the USGS, Colorado State University, the University of Minnesota, and others, who participated in a USGS John Wesley Powell Center for Analysis and Synthesis working group.
About Monarch Butterflies
Monarchs are popular both in society at large and in education. The monarch butterfly is currently the official insect or butterfly of seven U.S. states and is celebrated in festivals held across North America. Monarchs have been the focus of many schools' science curricula as well as the subjects of multiple citizen-science projects.
- National Valuation of Monarch Butterflies Indicates an Untapped Potential for Incentive-based Conservation
- John Wesley Powell Center for Analysis and Synthesis
- Ecosystem Services
- Geosciences and Environmental Change Science Center
- Colorado State University, Agricultural Resource Economics
- University of Maryland, Department of Biology
- University of Arizona, School of Natural Resources and the Environment
- University of Arizona, Udall Center for Studies in Public Policy
- Scripps Institution of Oceanography
- National Gardening Association
- European Forest Institute
- Upper Midwest Environmental Sciences Center
- Wiley Online Library: Conservation Letters
- Monarch Butterfly Fund
- Monarch Joint Venture
Satellite Data Yield New Understanding Of How Galápagos Volcanoes Are Formed And May Erupt In The Future
HAWAI`I ISLAND, Hawaii — The chance transit of a satellite over the April 2009 eruption of Fernandina volcano — the most active volcano in Ecuador's famed Galápagos archipelago — has revealed for the first time the mechanism behind the characteristic pattern of eruptive fissures on the island chain's volcanoes, according to a new study by University of Miami (UM) and U.S. Geological Survey (USGS) scientists. Their model not only sheds light on how Galápagos volcanoes grow, which has been a subject of debate since Darwin’s time, but may also help in forecasting the locations of future eruptions, adding to the vast scientific knowledge acquired by study of this iconic island chain.
In the study, Marco Bagnardi, a doctoral candidate at the UM Rosenstiel School of Marine & Atmospheric Science (RSMAS) and visiting scientist at USGS' Hawaiian Volcano Observatory, analyzed surface deformations on Fernandina from European Space Agency (ESA) ENVISAT satellite images acquired just two hours before the 2009 eruption. He sought to explain why Hawaiian and Galápagos volcanoes, while similar in some respects, show different eruptive patterns. Eruptions from Hawai‘i's volcanoes tend to occur along narrow rift zones that extend radially from the summit, while the shields of western Galápagos volcanoes have eruptive fissures that circle the summit near the calderas but are oriented radially on the flanks below.
Bagnardi found that magma began moving away from Fernandina's reservoir as a sill, or horizontal feature, before rotating to vertical and erupting from a radially oriented fissure on the volcano's southwest flank. A sill also fed the 2005 eruption at Fernandina, but that magma moved upward to form a fissure circling the volcano's summit closer to the top. These data suggest that sills feed all eruptive activity in the Galápagos but that they rotate in different ways as they propagate toward the host volcano's surface.
"Our findings have literally turned our perspective by 90 degrees," Bagnardi said. "For decades, we thought that eruptions in the Galápagos were starting with the intrusion of vertical blade-like bodies. We now know this is not the case."
"Without knowing why the fissures are arranged the way they are, we had no way to infer the geologic history of the volcanoes and how they evolved over time," said USGS geophysicist Mike Poland, a co-author of the study. "This research allows scientists to better model how the volcanoes grow and behave, especially with respect to past and future activity," Poland said. "You can't move forward with solving the puzzle because you are missing a key part of the story. This work fills in a major missing piece and will allow better interpretations of a multitude of parameters about Galapagos volcanoes."
UM RSMAS is part of a long-term satellite-based monitoring program of the Galápagos volcanoes using Interferometric Synthetic Aperture Radar (InSAR), a technique that uses repeat satellite observations to measure millimeter-scale ground displacements. Bagnardi credited the "extremely fortunate" timing of data collected by the ENVISAT satellite, which passed over the island just when magma was already on its way toward the surface. Fernandina erupts every four to five years, on average, and the satellite passed over the archipelago only once every several days, he said.
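As general background on the technique (not the study's specific processing chain), interferometric phase converts to line-of-sight displacement through the radar wavelength; for ENVISAT's C-band radar, one fringe corresponds to roughly 2.8 cm of ground motion.

```python
# Minimal sketch of how InSAR phase converts to line-of-sight displacement.
# General background, not the processing used in the study.
import math

WAVELENGTH_M = 0.056  # ENVISAT ASAR C-band wavelength, ~5.6 cm

def los_displacement_m(delta_phase_rad: float) -> float:
    """Line-of-sight displacement for a given interferometric phase change.

    One full phase cycle (2*pi) corresponds to half a wavelength of motion,
    because the radar signal travels to the ground and back.
    """
    return delta_phase_rad * WAVELENGTH_M / (4.0 * math.pi)

# One full interferometric fringe:
print(f"One fringe = {los_displacement_m(2 * math.pi) * 100:.1f} cm of motion")
```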
The team theorizes that Hawaiian and Galápagos volcanoes behave differently because neighboring volcanoes of the Galápagos grew concurrently and did not affect one another as they formed. Hawaiian volcanoes, however, grow sequentially, meaning that older volcanoes control the structural development, including eruptive patterns, of younger edifices.
Based on the relations between deformation and eruptions at Fernandina in 1995 and 2005, the authors argue that the next eruption of Fernandina will be from a fissure circling the volcano's summit on the southwest side of the caldera in the area uplifted by the 2009 sill intrusion.
"Unfortunately, we are still not able to predict the timing of future eruptions in the Galápagos," Bagnardi said. "However, providing a forecast for the location and type of eruption is a step in the right direction."
Satellite imagery used in the study was also provided by the Japan Aerospace Exploration Agency (JAXA). This research was supported by the National Aeronautics and Space Administration (NASA) and the National Science Foundation (NSF).
The paper in Earth and Planetary Science Letters, "A New Model for the Growth of Basaltic Shields Based on Deformation of Fernandina Volcano, Galápagos Islands," by Marco Bagnardi, Falk Amelung and Michael P. Poland, is available online.
Since January 2009, more than 200 magnitude 3.0 or greater earthquakes have rattled Central Oklahoma, marking a significant rise in the frequency of these seismic events.
The U.S. Geological Survey and Oklahoma Geological Survey are conducting collaborative research quantifying the changes in earthquake rate in the Oklahoma City region, assessing the implications of this swarm for large-earthquake hazard, and evaluating possible links between these earthquakes and wastewater disposal related to oil and gas production activities in the region.
Studies show that one to three earthquakes of magnitude 3.0 or larger occurred yearly from 1975 to 2008, while the average grew to around 40 earthquakes per year from 2009 to mid-2013.
"We've statistically analyzed the recent earthquake rate changes and found that they do not seem to be due to typical, random fluctuations in natural seismicity rates," said Bill Leith, USGS seismologist. "These results suggest that significant changes in both the background rate of events and earthquake triggering properties needed to have occurred in order to explain the increases in seismicity. This is in contrast to what is typically observed when modeling natural earthquake swarms."
The analysis suggests that a contributing factor to the increase in earthquakes may be triggering by activities such as wastewater disposal, a phenomenon known as injection-induced seismicity. The OGS has examined the behavior of the seismicity throughout the state, assessing the optimal fault orientations and stresses within the region of increased seismicity, particularly the unique behavior of the Jones swarm just east of Oklahoma City. The USGS and OGS are now focusing on determining whether evidence exists for such triggering, which is widely viewed as having been demonstrated in recent years in Arkansas, Ohio and Colorado.
This "swarm" includes the largest earthquake ever recorded in Oklahoma, a magnitude 5.6 that occurred near Prague on Nov. 5, 2011. It damaged a number of homes as well as the historic Benedictine Hall at St. Gregory's University in Shawnee, Okla. Almost 60 years earlier, in 1952, a comparable magnitude 5.5 earthquake struck El Reno and Oklahoma City. More recently, earthquakes of magnitude 4.4 and 4.2 hit east of Oklahoma City on April 16, 2013, causing objects to fall off shelves.
Following the earthquakes that occurred near Prague in 2011, the agencies issued a joint statement, focusing on the Prague event and ongoing seismic monitoring in the region. Since then, the USGS and OGS have continued monitoring and reporting earthquakes, and have also made progress evaluating the significance of the swarm.
Important to people living in the Oklahoma City region is that earthquake hazard has increased as a result of the swarm. USGS calculates that ground motion probabilities, which relate to potential damage and are the basis for the seismic provisions of building codes, have increased in Oklahoma City as a result of this swarm. While it’s been known for decades that Oklahoma is "earthquake country," the increased hazard has important implications for residents and businesses in the area.
To more accurately determine the locations and magnitudes of earthquakes in Oklahoma, the OGS operates a 15-station seismic network. Data from this system, and from portable seismic stations installed in the Oklahoma City region, are sent in real-time to the USGS National Earthquake Information Center, which provides 24x7 reporting on earthquakes worldwide.
COOK, Wash. — The eggs of endangered Kootenai River white sturgeon are less likely to hatch on some of the surfaces that have been made more common by human, or anthropogenic, changes on the river, a new U.S. Geological Survey report has found.
The white sturgeon (Acipenser transmontanus), once common in much of North America, is a very large, slow-to-mature fish that has evolved little from its ancestors of some 175 million years ago. It has great cultural significance for the Kootenai Tribe of Idaho and many other Northwest Tribes. White sturgeon was harvested in many places for caviar, and dams and other development have altered its habitat in ways whose implications are still being studied. White sturgeon in Idaho and Montana’s Kootenai River basin were listed as endangered in 1994, and poor recruitment (the number of a species’ young that survive to maturity) in other West Coast populations is a concern.
"Sturgeons are imperiled across the globe. Our scientists are committed to working with partners, including tribes, to address sturgeon issues across the region," said Jill Rolland, director of the USGS Western Fisheries Research Center.
In the report, prepared in cooperation with the Kootenai Tribe of Idaho, USGS research fishery biologist Mike Parsley and biological science technician Eric Kofoot examined hatch success in the laboratory on various surfaces, such as clean rocks, algae-covered rocks and sand, that sturgeon eggs settle and adhere to in the wild while they develop into larvae. The scientists found sand to be a poor surface, because the developing sturgeon embryos failed to attach to it. River rocks covered in algae yielded poor results, in part because they were more hospitable to fungus that threatens sturgeon embryos, while waterlogged wood and clean rocks performed well.
The report notes that sand substrates, or surfaces, now dominate the highly altered Kootenai River in areas currently used by spawning sturgeon, and that dam operation for flood management and hydropower during the spawning season has largely eliminated spring scouring flows that typically would clean rocks of algae and other growth. Finally, the report raises several possibilities, based on the findings, for maximizing white sturgeon recruitment, including substrate-type recommendations for spawning-habitat restoration and the incorporation of scouring flows to clean spawning substrate prior to the spawning season.
"This is another piece in the puzzle of understanding why some white sturgeon populations in highly altered river systems succeed and others don’t," Parsley said.
The publication, "Hatch Success of White Sturgeon Embryos Incubated on Various Substrates," USGS Report Series 2013-5180, by Michael J. Parsley and Eric Kofoot, is available online.
New research provides insight on why the New Madrid Seismic Zone is unique and may continue to pose a higher earthquake risk than adjacent areas in the central United States.
Using innovative and sophisticated technology, scientists now have high-resolution imagery of the New Madrid Seismic Zone, allowing them to map the area in more detail than ever before. The maps allow for greater understanding of the weak rocks in this zone, which extend to greater depths in the Earth's mantle than in surrounding areas. Scientists also determined that earthquakes and their impacts are likely to be narrowly concentrated in this zone.
U.S. Geological Survey scientists led this research and recently published their findings in the journal Earth and Planetary Science Letters.
Some of the largest historical earthquakes in the nation occurred as a swarm in the New Madrid Seismic Zone; in particular, three earthquakes greater than magnitude 7 struck in 1811 and 1812. There have been several smaller, yet still significant, earthquakes in the area since then. The zone extends about 165 miles from Marked Tree, Ark., to Paducah, Ky., and its southern end is about 35 miles northwest of Memphis, Tenn.
"With the new high-resolution imagery, we can see in greater detail that the New Madrid Seismic Zone is mechanically weaker than surrounding areas and therefore concentrates movement and stress in a narrow area," said USGS scientist Fred Pollitz, who is the lead author of this research. "The structure beneath this zone is unique when compared to adjacent areas in the central and eastern United States. A more in-depth understanding of such zones of weakness ultimately helps inform decisions such as the adoption of appropriate building codes to protect vulnerable communities, while also providing insight that could be applied to other regions across the world."
Prior to this research, the USGS had mapped the New Madrid Seismic Zone as an area of high seismic hazard, but those assessments were based on a relatively short (about 4,500-year) earthquake record for the area.
This research specifically investigated the Reelfoot Rift area, which is a 500-million-year-old geologic feature that contains the New Madrid Seismic Zone in its northernmost part. Scientists imaged rocks deep beneath Earth’s surface to see their characteristics and understand their mechanical behavior, especially their ability to withstand the huge stresses constantly placed on them.
A surprising finding was that weak rocks underlie the fault lines in the crust of the Reelfoot Rift and extend more than 100 miles down into the mantle. In contrast, weak rocks in other ancient rift zones in the central and eastern United States bottom out at much shallower depths. These weak mantle rocks have low seismic velocity, meaning that they are more susceptible to concentration of tectonic stress and more mobile.
USGS scientists used data from USArray, a large network of seismometers that is a component of the National Science Foundation's EarthScope program. With the methods employed in this study, data from these seismometers yield images of the crust and mantle down to 120 miles (200 kilometers) beneath the surface.
"Our results are unexpected and significant because they suggest that large earthquakes remain concentrated within the New Madrid Seismic Zone," said USGS scientist Walter Mooney, the co-author of the report. "There are still many unknowns about this zone, and future research will aim to understand why the seismic zone is active now, why its earthquake history may be episodic over millions of years, and how often it produces large quakes."
In the future, USGS scientists plan to map the seismic structure of the entire nation using USArray. This effort started in California in 2004, is focusing on the East Coast next, and will then move to Alaska. All of the USArray and other EarthScope efforts will also help inform the USGS National Seismic Hazard Maps.
Podcast: Mercury and Global Change
Rising global temperatures and changing human actions will significantly affect the behavior and distribution of mercury worldwide, according to a recent article by the U.S. Geological Survey and Harvard University.
Mercury, especially in the form of methylmercury, is extremely toxic to all life forms. It occurs both naturally and as the result of human activities. The majority of mercury releases to the environment at present are atmospheric emissions from human activities and reemissions of previously deposited mercury from soils and the oceans. The largest sources of man-made mercury emissions are small-scale gold mining and the burning of coal for electrical generation.
"Studies like this help us better understand the overall effects of multiple impacts on the environment," said USGS Acting Director Suzette Kimball. "We are just beginning to understand many of the consequences of global climate change and how interrelated many environmental issues truly are."
Several seemingly unconnected aspects of climate change are expected to affect mercury at the global scale, according to the article. In the atmosphere, higher temperatures and weaker air circulation patterns from climate change will likely have significant impacts on the atmospheric lifetime and patterns of mercury deposition.
In most climate change scenarios, storms will be less frequent but more intense, resulting in larger amounts of mercury being released from the soil through erosion that may end up in rivers, lakes and oceans, the study said. When mercury reaches these surface waters, it can be processed by naturally occurring bacteria into the neuro-toxic methylmercury.
In addition, the article explained that climate change will likely lead to more frequent and intense forest fires, which release mercury from relatively stable and safe storage in the soil and allow it to be transported downwind and then redeposited and possibly converted into methylmercury.
"The intersection of the complex behavior of mercury in the environment with the myriad of aspects of global change provided a significant challenge to describe in this paper," said USGS scientist David Krabbenhoft, the article’s lead author. "Although the science behind mercury research has exponentially increased in the past couple decades, providing reliable information to resource managers and decision makers on such complex topics remains a significant research challenge."
Changes in human behavior will also have substantial impacts on global mercury, according to the article. Current human emissions of mercury total 2,000 metric tons per year. Under the best-case scenario of curbing human emissions and mitigating climate change, that number could fall to 800 metric tons per year by 2050. If no actions are taken and a business-as-usual approach is followed, the number will likely increase to 3,400 metric tons per year by 2050.
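A quick arithmetic comparison of those scenario figures:

```python
# Quick comparison of the emissions scenarios cited in the article.
current = 2000                 # metric tons of mercury per year today
best_case_2050 = 800           # with emissions curbs and climate mitigation
business_as_usual_2050 = 3400  # with no action taken

print(f"Best case: {(best_case_2050 - current) / current:+.0%} by 2050")
print(f"Business as usual: {(business_as_usual_2050 - current) / current:+.0%} by 2050")
```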
Human activity has already had a significant impact on the release of mercury emissions, the article explained. For example, soil records show a three- to five-fold increase in atmospheric mercury deposition since the 1880s, when the Industrial Revolution and mercury-emitting processes such as coal combustion for electric power generation became widespread, and a seven- to ten-fold increase since antiquity. During the 20th century, coal-fired power plants dominated human emissions of mercury.
However, with the current high price of gold and relatively inexpensive liquid mercury, small-scale gold mining has taken over as the primary source of human mercury emissions. Mercury is used to separate gold from rock deposits, often in a manner that exposes the miners and the local environment to toxic levels of mercury, according to the report.
Positive steps at controlling mercury emissions have been taken, though, Krabbenhoft noted. In 2011, the United States enacted the first-ever emissions regulations on coal-fired electricity-generating power plants. However, the United States only accounts for six to ten percent of global emissions.
To tackle global emissions, the United Nations Environment Programme brought together 140 countries to craft the Minamata Convention on Mercury, a binding resolution that includes emissions standards for mercury. It is scheduled to be signed in October 2013.
USGS provides information on mercury sources; mercury cycling in the atmosphere, land surface, lakes, streams and oceans; and bioaccumulation and toxicity of mercury. This information helps land and resource managers understand and reduce mercury hazards to people and wildlife. Learn more about this article online.
The USGS has developed a tiered chemical prioritization approach that can be used by local, state, and federal agencies to develop ambient water and sediment quality monitoring strategies. Over 2,450 chemicals were prioritized based on their likelihood of adverse effects on human health or aquatic life, combined with information on likelihood of environmental occurrence.
Ambient monitoring includes studies of the quality of environmental resources (such as water or sediment) that are conducted under typical conditions without a predisposition that contamination is present.
"The information and approaches described in this study can be used by water resource managers to develop improved strategies that focus limited monitoring dollars on chemicals that, if present at high enough concentrations, can adversely affect human or ecosystem health," said Lisa Olsen, USGS hydrologist who led the study.
Dr. Helen Goeden, a senior toxicologist with the Minnesota Department of Health (MDH), stated that staff within the MDH's Contaminants of Emerging Concern (CEC) program has used the information described in the USGS study to assist in identifying and ranking contaminants for assessment within the CEC program.
The range of chemicals reviewed includes organic compounds of human origin and chemicals from natural sources, such as radionuclides and trace elements from geologic materials. Of 2,451 chemicals prioritized, 1,081 were determined to be of the highest priority, or "Tier 1," for ambient monitoring, including 602 for water and 686 for sediment (some were evaluated for both matrices). Others were assigned to Tier 2, those having intermediate priority for monitoring on the basis of a lower likelihood of occurrence or lower likelihood of effects on human health or aquatic life, or to Tier 3, those having low or no priority for monitoring.
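The decision rules themselves are detailed in the report rather than in this summary; purely as a hypothetical sketch of how a tiering scheme of this kind might be encoded, consider:

```python
# Hypothetical sketch of the tiering logic described above. The actual USGS
# criteria are given in Scientific Investigations Report 2012-5218; the
# attributes and rules below are placeholders for illustration.
from dataclasses import dataclass

@dataclass
class Chemical:
    name: str
    likely_to_occur: bool          # likelihood of environmental occurrence
    likely_adverse_effect: bool    # likelihood of human-health or aquatic-life effects

def assign_tier(chem: Chemical) -> int:
    """Return 1 (highest priority), 2 (intermediate), or 3 (low or no priority)."""
    if chem.likely_to_occur and chem.likely_adverse_effect:
        return 1  # likely to occur and likely to cause adverse effects
    if chem.likely_to_occur or chem.likely_adverse_effect:
        return 2  # lower likelihood of occurrence or of effects
    return 3      # low or no priority for ambient monitoring

print(assign_tier(Chemical("example compound", True, True)))  # -> 1
```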
Groups of chemicals prioritized for this effort included:
- Volatile organic compounds in water – 85
- Pesticides and degradates in water – 615
- Pesticides and degradates in sediment – 605
- Pharmaceuticals and hormones in water or sediment – 405
- Trace elements and other inorganics in water – 38
- Trace elements in sediment – 10
- Cyanotoxins in surface water – 15
- Lipophilic organic compounds in sediment – 699
- Disinfection by-products in water – 93
- High-production-volume chemicals in water – 318
- Wastewater-indicator and industrial compounds in water – 470
- Radionuclides in water – 14
This study was done to help the National Water-Quality Assessment (NAWQA) Program prepare for its third decade of monitoring the Nation's surface and groundwater resources. For example, NAWQA monitoring of surface and groundwater resources is using analytical schedules that include 350 of the 602 Tier 1 chemicals recommended for ambient water-quality monitoring.
The study is based on information from NAWQA and other USGS programs, information on chemicals of human-health and aquatic-life concern compiled by other agencies and organizations, U.S. Environmental Protection Agency databases, and peer-reviewed literature, totaling over 800 references. Results of the prioritization effort and related analytical methods used by the NAWQA Program are compiled in USGS Scientific Investigations Report 2012-5218.
Surveying Ice and Fire: The First Map of All of Iceland's Glaciers and Subglacier Volcanic Calderas Released
For the first time, all of Iceland’s glaciers are shown on a single map, produced by the Icelandic Meteorological Office (IMO) in collaboration with the U.S. Geological Survey (USGS) and Iceland GeoSurvey. The map is the first to incorporate historical data and coverage from aerial photographs and remote-sensing satellites, such as Landsat and SPOT, to show the change in the areal extent of glaciers during the past century.
Iceland has about 300 glaciers throughout the country; altogether, 269 glaciers, outlet glaciers, and internal ice caps are named. The glaciers that lack names are small and largely newly revealed, exposed by melting of the snowpack due to warmer summer temperatures. The number of identified glaciers has nearly doubled since the beginning of the 21st century.
"Iceland's glaciers have also been revealed to be quite dynamic during the past century," said Oddur Sigurðsson, the senior author of the new map and a glaciologist with the IMO. "At the maximum of the Little Ice Age (about 1890 in Iceland), it's glaciers reached their greatest areal extent before receding to their present-day positions, interrupted with a few cooler periods during this century-long recession. Iceland's glaciers continue their retreat and lose volume; its ice caps are losing an average of 1 m of ice each year from their surfaces."
Subglacier volcanic calderas and their locations with respect to the glaciers are an important feature of the new map. Many of Iceland‘s glaciers lie over active volcanoes, including Eyjafjallajökull, the now well-known volcano an eruption from which in 2010 disrupted air travel between North America and Europe and within Europe.
Knowing which volcanic calderas lie beneath glaciers and their history of volcanic activity is important for disaster preparation and mitigation. When a volcano erupts beneath a glacier, it often results in the unleashing of very large volume floods known by scientists as a jökulhlaup ("Glacier-outburst flood").
This volcano-glaciological hazard is well known to Icelanders. The largest jökluhlaups occur when the Katla volcano under the Mýrdalsjökull ice cap (just to the east of Eyjafjallajökull; see graphic, an excerpt from the map) erupts, resulting in a flood that exceeds the normal flow of the Amazon River, Earth's largest river in terms of volume of water!
Richie Williams, emeritus senior research geologist with the USGS and collaborator on the map notes that, "The more than 40 years of scientific research in Iceland by USGS scientists, in collaboration with numerous Icelandic scientists and institutions, has produced many important scientific publications in volcanology, geothermal activity, volcanic geomorphology, glaciology, and geologic hazards."
Surge-type glaciers also make their debut on this cartographically unique map. Surge-type glaciers are those that, for reasons not completely understood scientifically, suddenly move forward, advancing several hundred meters or even several kilometers in a few months.
Brúarjökull, a surge-type glacier on the northern margin of Vatnajökull, Iceland's largest ice cap, surged forward 8 km in 1963/1964. Eyjabakkajökull, a surge-type glacier just to the east of Brúarjökull, surged about 2 km in 1972/1973, a change that was captured on the first Landsat images acquired of a surging glacier.
The map of Iceland's glaciers is the result of many decades of research and data collection from all across Iceland, an area about the same size as the Commonwealth of Virginia. Maps compiled by the Danish Geodetic Survey in the first third of the 20th century, aerial mapping missions for the U.S. Army Map Service at the end of World War II, and satellite images from Landsat and SPOT spanning four decades all contributed to the map's compilation.
The USGS and many Icelandic scientific institutions, including the Icelandic Meteorological Office, have a more than 40-year history of cooperative research, including a long-standing Memorandum of Understanding for research on a wide variety of subjects, including glaciers, volcanoes, tectonics, and geothermal energy. Iceland is the world leader in geothermal exploration and technology, a major source of "green" energy.
The map is entitled "Map of the Glaciers of Iceland" ("Jöklakort af Íslandi" in Icelandic); the map legend is in both Icelandic and English. The map is available from the Icelandic Meteorological Office or Iðnú ehf. More information on glacier research at the USGS can be found online.
Celebrate the second annual Geologic Map Day! On October 18, as a part of the Earth Science Week 2013 activities, join leading geoscience organizations in promoting awareness of the importance of geologic mapping to society.
Geologic maps are vital to education, science, business, and public policy concerns. Geologic Map Day will focus the attention of students, teachers, and the general public on the study, uses, and significance of these tools, by engaging audiences through educational activities, print materials, online resources, and public outreach opportunities.
Geologic map of the Holy Cross quadrangle, Colorado
In conjunction with Geologic Map Day, the USGS is promoting several new events. The first "Best Student Geologic Map Competition" will be held at the annual Geological Society of America meeting. University students (undergraduate or graduate) from around the world who have completed a geologic map are eligible to compete, and we will be posting short videos of university students who have been mapping in the field this summer. We hope these videos will inspire younger students in high school to consider the Earth sciences as their future major.
Be sure to check out the Geologic Map Day poster included in this year's Earth Science Week Toolkit. The poster and other materials in the kit show how geologic maps can be used to understand natural hazards and provide step-by-step instructions for a related classroom activity. Additional resources for learning about geologic maps can be found on the Geologic Map Day web page.
Geologic Map Day partners include the American Geosciences Institute (AGI), the Association of American State Geologists, the U.S. Geological Survey, the National Park Service, the Geological Society of America, and Esri.
Focusing on the theme of "Mapping Our World," Earth Science Week 2013 will be celebrated October 13-19. To learn more, please visit www.earthsciweek.org/. To order your Toolkits, please visit www.earthsciweek.org/materials/. You may also call AGI Publications to place your order at 703-379-2480.
For more information, go to: http://www.earthsciweek.org/geologicmap/
NASA and the U.S. Geological Survey kick off a quest for an innovative and affordable space-based system to extend the Landsat data record for decades to come, beginning with a public forum and call for ideas Wednesday, Sept. 18.
The Sustainable Land Imaging Architecture Study Industry and Partner Day will take place from 1-4:30 p.m. EDT in the NASA Headquarters Webb Auditorium, 300 E St. SW, Washington. Following this public forum, NASA will release a request for information to formally seek new ideas on the design of such a system.
In April the Obama Administration directed NASA to conduct the study as part of its initiative to create for the first time a long-term, sustainable spaceborne system to provide Landsat-quality global observations for at least the next 20 years. The Sustainable Land Imaging Program, announced in the President's proposed fiscal year 2014 budget, directs NASA to lead the overall system architecture study with participation from USGS.
Representatives of the White House Office of Science and Technology Policy, NASA and the USGS will present details of the study process and planning timeline during the public forum.
"We are looking for system design solutions that spur innovation and increase efficiencies, making use of aerospace expertise from across the government and commercial aerospace sector," said David Jarrett, study lead in the Earth Science Division in the Science Mission Directorate at NASA Headquarters. "We will evaluate a range of solutions, including large and small dedicated spacecraft, formation flying, hosted payloads, and international and private sector collaborations."
"Landsat data are used by a broad range of specialists to assess some of the world’s most critical issues — the food, water, forests, and other natural resources needed for a growing world population.” said Matt Larsen, USGS Associate Director for Climate and Land Use Change. "We are happy to participate in the NASA study to help develop and refine the long-term future of this program, while at the same time recognizing that it is vital that we maintain our Landsat observational capabilities over the short-term to ensure that no data gap occurs."
The objective of the Sustainable Land Imaging study is to design space-based systems that can provide continuous Landsat-quality data for at least 20 years and be sustained in a tight federal budget environment. The system is planned to continue the 41-year-old Landsat data record, which was assembled with a series of single satellites implemented one at a time.
The most recent addition to the long-running series, Landsat 8, launched in February 2013, is performing well. However, Landsat 7, launched in 1999 and now operating with limited redundancy and a waning fuel supply, could fail or run out of fuel in the next few years. Both satellites were developed and launched by NASA. The spacecraft are now operated by the USGS, which is responsible for generating, archiving, and distributing a range of standard products based on the space-borne measurements.
The Landsat program delivers continuous global, moderate-resolution measurements of land and coastal regions, providing mankind's longest record of our planet from space. Landsat data provide a consistent and reliable foundation for research on land use change, forest health and carbon inventories, and changes to our environment, climate, and natural resources.
The free and open availability of Landsat data enables the measurements to be used routinely by decision makers both inside and outside the government, for a wide range of natural resource issues including water resource management, wildfire response, agricultural productivity, rangeland management, and the effects of climate change.
Media interested in attending the public forum must contact Steve Cole no later than 11 a.m. EDT, Wednesday, Sept. 18.
Today the U.S. Geological Survey led a congressional briefing featuring state and regional water stakeholders who spoke about vital uses of comprehensive water information that would be met by the National Water Census called for by the SECURE Water Act of 2009.
Growing populations, increased energy development, and the uncertain effects of a changing climate magnify the need for an improved understanding of water use and water availability. However, no comprehensive and current national assessment of water resources exists.
A report released in April, Progress Toward Establishing a National Assessment of Water Availability and Use, fulfilled a requirement under the 2009 law for the Secretary of the Interior to report to Congress on progress made in implementing the national water availability and use assessment program, also referred to as a National Water Census.
"It’s true in other fields and no less so for water: you can’t manage what you don’t measure," said Anne Castle, Department of the Interior Assistant Secretary for Water and Science. "The Water Census will quantify water supply and demand consistently across the entire country, fill in gaps in existing data, and make that information available to anyone who needs it—and that represents a huge step forward on the path toward water sustainability."
The National Academy of Sciences applauded the concept of a Water Census in 2009, suggesting that it would be "an ongoing, effective tool, on a par with the social and economic censuses, that supports national decision making."
"As competition for water grows — for irrigation of crops, for the production of energy, for use by cities and communities, and for the environment — the need for information and tools to aid water-resource managers also grows," said Tony Willardson, Executive Director, Western States Water Council.
"The more accurately we can assess the quantity and quality of our water resources, the better we can know whether our strategies for conserving and improving those resources are actually having the desired beneficial effect," observed Bob Tudor, Deputy Director, Delaware River Basin Commission. Willardson and Tudor were speakers at the briefing.
A Water Census is a complex undertaking, which points to why national water availability and use have not been comprehensively assessed in more than 35 years. Since then, competition for water resources has increased greatly and, in addition to human use, considerably more importance is now attached to the availability of water for environmental and ecosystem needs.
The USGS envisions the Water Census to be a key ongoing activity that, like the population census mandated by the Constitution, supports national decision-making in many different ways.
The resources currently available for this census are finite, however. USGS foresees that estimates of flow at ungaged locations and estimates of evapotranspiration will be among the earliest products of the National Water Census. Providing complete water-use information and adequately assessing the Nation’s groundwater resources with respect to water availability will require additional time.
Although the existing data are limited and much work remains to be done, funding over the past two years has allowed important progress. The USGS will continue to work with partner agencies and organizations to maximize the utility of the information for a broad range of uses.
The Water Census is part of an overarching Department of the Interior (DOI) initiative known as WaterSMART (Sustain and Manage America’s Resources for Tomorrow). Through WaterSMART, the Department is working to achieve a sustainable water strategy to help meet the Nation’s water needs. The Water Census will help inform that strategy.
The USGS is developing plans for the Water Census in coordination with other Federal and non-Federal agencies, universities, and other organizations. Collaboration across agency boundaries ensures that information produced by the USGS can be aggregated with data on other types of physical, social, economic, and environmental factors that affect water availability. Both the USGS and the U.S. Bureau of Reclamation (USBR) have substantive responsibilities under the Department of the Interior WaterSMART initiative.
One of the geographic focus areas of the Water Census is the drainage basin of the Colorado River, which covers parts of seven states, delivers water to more than 30 million people, irrigates nearly 4 million acres of cropland in the U.S. and Mexico, and supplies hydropower plants that annually generate more than 10 billion kilowatt-hours. Increasing population, decreasing streamflows, and the uncertain effects of a changing climate amplify the need for an improved understanding of water use and water availability in this crucial watershed.
Keeping pace with rapid demand, the USGS has posted new US Topo quadrangles covering Colorado (1,794 maps) and Minnesota (1,689 maps). These new quads replace the first-edition US Topo maps for those states. The replaced maps will be added to the USGS Historical Topographic Map Collection and are also available for free download from The National Map and the USGS Map Locator & Downloader website.
The new design for US Topo maps improves readability of maps for online and printed use, while retaining the look and feel of the traditional USGS topo map. Also, map symbols are now easier to read over the digital aerial photograph layer whether the imagery is turned on or off.
Other re-design enhancements and new features:
- New shaded relief layer for enhanced view of the terrain
- Military installation boundaries, post offices and cemeteries
- New road classification
- A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
- New PDF legend attachment
- Metadata formatted to support multiple browsers
In addition, the new Colorado US Topo quads include recreational trails in National Forests, provided by the U.S. Forest Service. Although this first test of trails was successful, the Forest Service does not yet have comparable data in other states, and schedules for adding trails in all National Forests have not been set.
"We are excited to about these two updates that are part of our continual effort to improve US Topo maps for our users," said Vicki Lukas, USGS Chief of Partner and User Engagement. "First, the new design makes US Topo maps even easier to use, and the new Colorado maps include Forest Service trails as a new feature."
US Topo maps are updated every three years. The initial round of the 48 conterminous state coverage was completed last September. Hawaii and Puerto Rico maps are being completed this year. New US Topo maps for Alaska have started, but will take several years to complete.
US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps. The new US Topo maps provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users.
For more information, go to: http://nationalmap.gov/ustopo/
Figure showing the proposed US Topo production schedule. States that were updated in 2012 are in yellow; states that have been or will be updated in 2013 are in red; and states scheduled to be updated in 2014 are in blue.
ST. PETERSBURG, Fla. — Acidification of the Arctic Ocean is occurring faster than projected, according to new findings published in the journal PLOS ONE. The accelerated rate is attributed to rapidly melting sea ice, a process that may have important consequences for the health of the Arctic ecosystem.
Ocean acidification is the process by which the pH of seawater decreases as greater amounts of carbon dioxide are absorbed by the oceans from the atmosphere. Currently, oceans absorb about one-fourth of this greenhouse gas. Lower pH levels make water more acidic, and lab studies have shown that more acidic water decreases calcification rates in many calcifying organisms, reducing their ability to build shells or skeletons. These changes, in species ranging from corals to shrimp, have the potential to impact species up and down the food web.
The team of federal and university researchers found that the decline of sea ice in the Arctic summer has important consequences for the surface layer of the Arctic Ocean. As sea ice cover recedes to record lows, as it did late in the summer of 2012, the seawater beneath is exposed to carbon dioxide, which is the main driver of ocean acidification.
In addition, the freshwater melted from sea ice dilutes the seawater, lowering pH levels and reducing the concentrations of calcium and carbonate, which are the constituents, or building blocks, of the mineral aragonite. Aragonite and other carbonate minerals make up the hard part of many marine micro-organisms’ skeletons and shells. The lowering of calcium and carbonate concentrations may impact the growth of organisms that many species rely on for food.
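To make the chemistry concrete, carbonate mineral stability is often summarized by the aragonite saturation state, Omega = [Ca2+][CO3^2-] / Ksp', where values below 1 favor dissolution. The minimal sketch below, using purely illustrative concentrations rather than measurements from these cruises, shows how diluting seawater with meltwater pushes Omega downward:

```python
# Minimal sketch: how dilution by sea-ice meltwater lowers the aragonite
# saturation state Omega = [Ca2+][CO3^2-] / Ksp'. All numbers below are
# illustrative placeholders, not measurements from the Arctic cruises.

KSP_ARAGONITE = 6.8e-7  # stoichiometric solubility product (mol^2 kg^-2), rough surface-ocean value

def omega_aragonite(ca, co3, ksp=KSP_ARAGONITE):
    """Saturation state: values above 1 favor shell building, below 1 favor dissolution."""
    return (ca * co3) / ksp

ca_seawater = 0.0103   # mol/kg, typical open-ocean calcium
co3_seawater = 1.0e-4  # mol/kg, plausible polar carbonate-ion value (illustrative)

# Mix in 20% fresh meltwater (assumed to carry negligible Ca2+ and CO3^2-),
# which dilutes both ions; this sketch ignores the accompanying change in Ksp'.
fresh_fraction = 0.20
ca_mixed = ca_seawater * (1 - fresh_fraction)
co3_mixed = co3_seawater * (1 - fresh_fraction)

print(f"Omega before melt: {omega_aragonite(ca_seawater, co3_seawater):.2f}")
print(f"Omega after 20% meltwater: {omega_aragonite(ca_mixed, co3_mixed):.2f}")
```

A full treatment would also track how added carbon dioxide shifts the carbonate system itself; this sketch isolates only the dilution effect described above.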
The new research shows that acidification in surface waters of the Arctic Ocean is rapidly expanding into areas that were previously isolated from contact with the atmosphere due to the former widespread ice cover.
"A remarkable 20 percent of the Canadian Basin has become more corrosive to carbonate minerals in an unprecedented short period of time. Nowhere on Earth have we documented such large scale, rapid ocean acidification" according to lead researcher and ocean acidification project chief, U.S. Geological Survey oceanographer Lisa Robbins.
Globally, Earth's ocean surface is becoming acidified due to absorption of man-made carbon dioxide. Ocean acidification models show that with increasing atmospheric carbon dioxide, the Arctic Ocean will have critically low concentrations of dissolved carbonate minerals, such as aragonite, within the next decade.
"In the Arctic, where multi-year sea ice has been receding, we see that the dilution of seawater with melted sea ice adds fuel to the fire of ocean acidification" according to co-author, and co-project chief, Jonathan Wynn, a geologist from the University of the South Florida. "Not only is the ice cover removed leaving the surface water exposed to man-made carbon dioxide, the surface layer of frigid waters is now fresher, and this means less calcium and carbonate ions are available for organisms."
Researchers were able to investigate seawater chemistry at high spatial resolution during three years of research cruises in the Arctic, alongside joint U.S.-Canada research efforts aimed at mapping the seafloor as part of the U.S. Extended Continental Shelf (ECS) program. In addition to the NOAA-supported ECS ship time, the ocean acidification researchers were funded by the USGS, the National Science Foundation, and the National Oceanic and Atmospheric Administration.
Compared to other oceans, the Arctic Ocean has been rather lightly sampled. "It's a beautiful but challenging place to work," said Robert Byrne, a USF marine chemist. Using new automated instruments, the scientists were able to make 34,000 water-chemistry measurements from the U.S. Coast Guard icebreaker. "This unusually large data set, in combination with earlier studies, not only documents remarkable changes in Arctic seawater chemistry but also provides a much-needed baseline against which future measurements can be compared." Byrne credits scientists and engineers at the USF College of Marine Science with developing much of the new technology.
While scientists can't predict when a great earthquake producing a pan-Pacific tsunami will occur, new tools being developed by federal and state officials now allow them to offer more accurate insight into the likely impacts when tsunamis do occur. This knowledge can help officials and the public reduce the risk from future tsunamis that will affect California.
What are the potential economic impacts? Which marinas could be destroyed? Who needs to be prepared for evacuations? A newly published report looks at these factors and more.
A hypothetical yet plausible scenario was developed in which a tsunami generated by an earthquake offshore of the Alaska Peninsula reaches the California coast. The scenario is detailed in a new report, the U.S. Geological Survey's Science Application for Risk Reduction (SAFRR) Tsunami Scenario.
Some of the issues highlighted in the scenario include public safety and economic loss. In this scenario approximately 750,000 people would need to be evacuated, with 90,000 of those being tourists and visitors. Additionally, one-third of the boats in California's marinas could be damaged or completely sunk, resulting in $700 million in losses. It was concluded that neither of California's nuclear power plants would likely be damaged by this particular event.
Looking back to 2011, not only was Japan devastated by the magnitude-9.1 Tohoku earthquake, but the resulting tsunami also swept across the Pacific to California and caused $50-100 million in damage. This demonstrates that tsunamis from both nearby and distant sources can cause significant damage in California, and larger events could lead to billions of dollars in losses.
The SAFRR Tsunami Scenario is a collaborative effort between the USGS, the California Geological Survey, the California Governor's Office of Emergency Services, the National Oceanic and Atmospheric Administration, other agencies, and academic and other institutions.
"The good news is that three-quarters of California's coastline is cliffs, and thus immune to the harsher and more devastating impacts tsunamis could pose," said Lucy Jones, who is the USGS Science Advisor for Risk Reduction and leads the SAFRR Project. "The bad news is that the one-quarter at risk is some of the most economically valuable property in California."
"In order to effectively protect communities from tsunamis, we must first know what to plan for," continued Jones. "By starting with science, there is a clearer understanding on how tsunamis function and their potential impacts. The scenario will serve as a long-lasting resource to raise awareness and provide scientifically-sound and unbiased information to decision makers in California and abroad."
In this scenario, scientists specifically outline the likely inundation areas; current velocities in key ports and harbors; physical damage and repair costs; economic consequences; environmental impacts; social vulnerability; emergency management; and policy implications for California.
This scenario will also be the focus of discussion at a workshop series starting September 4 that is convened in partnership with the California Tsunami Hazard Mitigation Program. USGS scientists and partners will explain the scenario and results to stakeholders in the coastal communities of California. The workshops aim to establish a community of experts while fostering the use of science in decision-making.
The workshops will be hosted by the Cabrillo Marine Aquarium (September 4); Santa Barbara County Office of Emergency Management (September 5); San Diego County Office of Emergency Management (September 6); Santa Cruz County Office of Emergency Management (September 9); and the Port of San Francisco (September 10).
The SAFRR Project is the same USGS research group that produced the ShakeOut Scenario in 2008, examining the consequences of a probable major earthquake on the southern San Andreas Fault, and the ARkStorm Scenario in 2011, examining the risks associated with extreme rain events driven by atmospheric rivers.
The March 11, 2011, Tohoku-oki tsunami caused significant damage to ships and docks within Crescent City Harbor in California, sinking a number of ships within the harbor. Because of extensive sedimentation and potentially contaminated debris within the harbor, recovery efforts took over a year. Photo credit: Rick Wilson, California Geological Survey.
Maximum current speeds for the Port of Los Angeles (POLA) and the Port of Long Beach (POLB), according to the SAFRR Tsunami Scenario. In the POLA, currents are strongest at Angels Gate, the Cabrillo Marina, the Boat Yard, and the old Navy Yard. Once the tsunami event is underway, navigation through the Gate would be very dangerous. In the Cabrillo Marina and Boat Yard, currents are likely strong enough to break apart floating docks, damage piles, and pull small vessels from their mooring lines. The strongest currents are found in the old Navy Yard; however, there are no exposed floating assets in this immediate area. At the POLB, strong currents are again found at the Gate, and strong, jet-like currents are predicted at the entrance to the main cargo container area (Pier J). Currents here may be strong enough to damage, and possibly break, mooring lines. Image credit: SAFRR Tsunami Scenario.
Areas that would be inundated (in red) from the SAFRR Tsunami Scenario at Oakland Airport and Alameda in California. Large portions of Bay Farm Island and Oakland Airport would be flooded. Image credit: SAFRR Tsunami Scenario.
Areas that would be inundated (in red) from the SAFRR Tsunami Scenario in Newport Beach in Orange County, California, including complete and partial flooding of islands and near overtopping of the Balboa Peninsula. Image credit: SAFRR Tsunami Scenario.
A map of the California coast in the City of Long Beach showing areas predicted to be inundated (in red) by the SAFRR Tsunami Scenario, including the Long Beach Convention Center and many retail businesses. Image credit: SAFRR Tsunami Scenario.