The USGS has developed a tiered chemical prioritization approach that can be used by local, state, and federal agencies to develop ambient water and sediment quality monitoring strategies. Over 2,450 chemicals were prioritized based on their likelihood of adverse effects on human health or aquatic life, combined with information on likelihood of environmental occurrence.
Ambient monitoring includes studies of the quality of environmental resources (such as water or sediment) that are conducted under typical conditions without a predisposition that contamination is present.
"The information and approaches described in this study can be used by water resource managers to develop improved strategies that focus limited monitoring dollars on chemicals that, if present at high enough concentrations, can adversely affect human or ecosystem health," said Lisa Olsen, USGS hydrologist who led the study.
Dr. Helen Goeden, a senior toxicologist with the Minnesota Department of Health (MDH), said that staff in the MDH's Contaminants of Emerging Concern (CEC) program have used the information described in the USGS study to help identify and rank contaminants for assessment within the CEC program.
The range of chemicals reviewed includes organic compounds of human origin and chemicals from natural sources, such as radionuclides and trace elements from geologic materials. Of 2,451 chemicals prioritized, 1,081 were determined to be of the highest priority, or "Tier 1," for ambient monitoring, including 602 for water and 686 for sediment (some were evaluated for both matrices). Others were assigned to Tier 2, those having intermediate priority for monitoring on the basis of a lower likelihood of occurrence or lower likelihood of effects on human health or aquatic life, or to Tier 3, those having low or no priority for monitoring.
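As a loose illustration of this tiering logic (the actual screening criteria and scoring are documented in the USGS report cited at the end of this article), a minimal sketch in Python might look like this:

```python
# Hypothetical sketch of a tiered screening rule; the real USGS criteria
# are detailed in Scientific Investigations Report 2012-5218.

def assign_tier(effect_likelihood: str, occurrence_likelihood: str) -> int:
    """Map two qualitative ratings ('high', 'intermediate', 'low') to a tier.

    effect_likelihood: likelihood of adverse effects on human health
        or aquatic life.
    occurrence_likelihood: likelihood of environmental occurrence.
    """
    if effect_likelihood == "high" and occurrence_likelihood == "high":
        return 1  # highest priority for ambient monitoring
    if "low" in (effect_likelihood, occurrence_likelihood):
        return 3  # low or no priority for monitoring
    return 2      # intermediate priority

# Example: high toxicity but only intermediate likelihood of occurrence
print(assign_tier("high", "intermediate"))  # -> 2
```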
Groups of chemicals prioritized for this effort included:
- Volatile organic compounds in water – 85
- Pesticides and degradates in water – 615
- Pesticides and degradates in sediment – 605
- Pharmaceuticals and hormones in water or sediment – 405
- Trace elements and other inorganics in water – 38
- Trace elements in sediment – 10
- Cyanotoxins in surface water – 15
- Lipophilic organic compounds in sediment – 699
- Disinfection by-products in water – 93
- High-production-volume chemicals in water – 318
- Wastewater-indicator and industrial compounds in water – 470
- Radionuclides in water – 14
This study was done to help the National Water-Quality Assessment (NAWQA) Program prepare for its third decade of monitoring the Nation's surface and groundwater resources. For example, NAWQA monitoring of surface and groundwater resources is using analytical schedules that include 350 of the 602 Tier 1 chemicals recommended for ambient water-quality monitoring.
The study is based on information from NAWQA and other USGS programs, information on chemicals of human-health and aquatic-life concern compiled by other agencies and organizations, U.S. Environmental Protection Agency databases, and peer-reviewed literature, totaling over 800 references. Results of the prioritization effort and related analytical methods used by the NAWQA Program are compiled in USGS Scientific Investigations Report 2012-5218.
Surveying Ice and Fire: The First Map of All of Iceland's Glaciers and Subglacier Volcanic Calderas Released
For the first time, all of Iceland's glaciers are shown on a single map, produced by the Icelandic Meteorological Office (IMO) in collaboration with the U.S. Geological Survey (USGS) and Iceland GeoSurvey. The map is the first to combine historical data with coverage from aerial photographs and remote-sensing satellites, such as Landsat and SPOT, to show the change in the areal extent of glaciers during the past century.
Iceland has about 300 glaciers, and altogether 269 glaciers, outlet glaciers, and internal ice caps are named. The glaciers that lack names are small and largely newly revealed, exposed by melting snowpack due to warmer summer temperatures. The number of identified glaciers has nearly doubled since the beginning of the 21st century.
"Iceland's glaciers have also been revealed to be quite dynamic during the past century," said Oddur Sigurðsson, the senior author of the new map and a glaciologist with the IMO. "At the maximum of the Little Ice Age (about 1890 in Iceland), it's glaciers reached their greatest areal extent before receding to their present-day positions, interrupted with a few cooler periods during this century-long recession. Iceland's glaciers continue their retreat and lose volume; its ice caps are losing an average of 1 m of ice each year from their surfaces."
Subglacier volcanic calderas and their locations with respect to the glaciers are an important feature of the new map. Many of Iceland's glaciers lie over active volcanoes, including Eyjafjallajökull, the now well-known volcano whose 2010 eruption disrupted air travel between North America and Europe and within Europe.
Knowing which volcanic calderas lie beneath glaciers, and their history of volcanic activity, is important for disaster preparation and mitigation. When a volcano erupts beneath a glacier, it often unleashes a very large flood known to scientists as a jökulhlaup ("glacier-outburst flood").
This volcano-glaciological hazard is well known to Icelanders. The largest jökulhlaups occur when the Katla volcano under the Mýrdalsjökull ice cap (just to the east of Eyjafjallajökull; see graphic, an excerpt from the map) erupts, producing a flood that exceeds the normal flow of the Amazon River, Earth's largest river in terms of volume of water.
Richie Williams, emeritus senior research geologist with the USGS and collaborator on the map notes that, "The more than 40 years of scientific research in Iceland by USGS scientists, in collaboration with numerous Icelandic scientists and institutions, has produced many important scientific publications in volcanology, geothermal activity, volcanic geomorphology, glaciology, and geologic hazards."
Surge-type glaciers also make their debut on this cartographically unique map. Surge-type glaciers are those that, for reasons not completely understood scientifically, suddenly move forward, advancing several hundred meters or even several kilometers in a few months.
Brúarjökull, a surge-type glacier on the northern margin of Vatnajökull, Iceland's largest ice cap, surged forward 8 km in 1963/1964. Eyjabakkajökull, a surge-type glacier just to the east of Brúarjökull, surged about 2 km in 1972/1973, a change that was captured on the first Landsat images acquired of a surging glacier.
The map of Iceland's glaciers is the result of many decades of research and data collection from all across Iceland, an area about the same size as the Commonwealth of Virginia. Maps compiled by the Danish Geodetic Survey in the first third of the 20th century, aerial mapping missions for the U.S. Army Map Service at the end of World War II, and satellite images from Landsat and SPOT spanning four decades all contributed to the map's compilation.
The USGS and many Icelandic scientific institutions, including the Icelandic Meteorological Office, have a more than 40-year history of cooperative research, including a long-standing Memorandum of Understanding for research on a wide variety of subjects, including glaciers, volcanoes, tectonics, and geothermal energy. Iceland is the world leader in geothermal exploration and technology, a major source of "green" energy.
The map is entitled "Map of the Glaciers of Iceland" ("Jöklakort af Íslandi" in Icelandic); the map legend is in both Icelandic and English, and the map is available from the Icelandic Meteorological Office or Iðnú ehf. More information on glacier research at the USGS can be found online.
Celebrate the second annual Geologic Map Day! On October 18, as a part of the Earth Science Week 2013 activities, join leading geoscience organizations in promoting awareness of the importance of geologic mapping to society.
Geologic maps are vital to education, science, business, and public policy concerns. Geologic Map Day will focus the attention of students, teachers, and the general public on the study, uses, and significance of these tools by engaging audiences through educational activities, print materials, online resources, and public outreach opportunities.
(Image: Geologic map of the Holy Cross quadrangle, Colorado.)
In conjunction with Geologic Map Day, the USGS is promoting several new events. The first "Best Student Geologic Map Competition" will be held at the annual Geological Society of America meeting. University students (undergraduate or graduate) from around the world who have completed a geologic map are eligible to compete, and we will be posting short videos of university students who have been mapping in the field this summer. We hope these videos will inspire younger students in high school to consider the Earth sciences as their future major.
Be sure to check out the Geologic Map Day poster included in this year's Earth Science Week Toolkit. The poster and other materials in the kit show how geologic maps can be used to understand natural hazards and provide step-by-step instructions for a related classroom activity. Additional resources for learning about geologic maps can be found on the Geologic Map Day web page.
Geologic Map Day partners include the American Geosciences Institute (AGI), the Association of American State Geologists, the U.S. Geological Survey, the National Park Service, the Geological Society of America, and Esri.
Focusing on the theme of "Mapping Our World," Earth Science Week 2013 will be celebrated October 13-19. To learn more, please visit www.earthsciweek.org/. To order your Toolkits, please visit www.earthsciweek.org/materials/. You may also call AGI Publications to place your order at 703-379-2480.
For more information, go to: http://www.earthsciweek.org/geologicmap/
NASA and the U.S. Geological Survey kick off a quest for an innovative and affordable space-based system to extend the Landsat data record for decades to come with a public forum and call for ideas Wednesday, Sept. 18.
The Sustainable Land Imaging Architecture Study Industry and Partner Day will take place from 1-4:30 p.m. EDT in the NASA Headquarters Webb Auditorium, 300 E St. SW, Washington. Following this public forum, NASA will release a request for information to formally seek new ideas on the design of such a system.
In April the Obama Administration directed NASA to conduct the study as part of its initiative to create for the first time a long-term, sustainable spaceborne system to provide Landsat-quality global observations for at least the next 20 years. The Sustainable Land Imaging Program, announced in the President's proposed fiscal year 2014 budget, directs NASA to lead the overall system architecture study with participation from USGS.
Representatives of the White House Office of Science and Technology Policy, NASA and the USGS will present details of the study process and planning timeline during the public forum.
"We are looking for system design solutions that spur innovation and increase efficiencies, making use of aerospace expertise from across the government and commercial aerospace sector," said David Jarrett, study lead in the Earth Science Division in the Science Mission Directorate at NASA Headquarters. "We will evaluate a range of solutions, including large and small dedicated spacecraft, formation flying, hosted payloads, and international and private sector collaborations."
"Landsat data are used by a broad range of specialists to assess some of the world’s most critical issues — the food, water, forests, and other natural resources needed for a growing world population.” said Matt Larsen, USGS Associate Director for Climate and Land Use Change. "We are happy to participate in the NASA study to help develop and refine the long-term future of this program, while at the same time recognizing that it is vital that we maintain our Landsat observational capabilities over the short-term to ensure that no data gap occurs."
The objective of the Sustainable Land Imaging study is to design space-based systems that can provide continuous Landsat-quality data for at least 20 years and be sustained in a tight federal budget environment. The system is planned to continue the 41-year-old Landsat data record, which was assembled with a series of single satellites implemented one at a time.
The most recent addition to the long-running series, Landsat 8, launched in February 2013, is performing well. However, Landsat 7, launched in 1999 and now operating with limited redundancy and a waning fuel supply, could fail or run out of fuel in the next few years. Both satellites were developed and launched by NASA. The spacecraft are now operated by the USGS, which is responsible for generating, archiving, and distributing a range of standard products based on the space-borne measurements.
The Landsat program provides continuous global, moderate-resolution measurements of land and coastal regions, providing mankind's longest record of our planet from space. Landsat data provide a consistent and reliable foundation for research on land use change, forest health and carbon inventories, and changes to our environment, climate, and natural resources.
The free and open availability of Landsat data enables the measurements to be used routinely by decision makers both inside and outside the government, for a wide range of natural resource issues including water resource management, wildfire response, agricultural productivity, rangeland management, and the effects of climate change.
Media interested in attending the public forum must contact Steve Cole no later than 11 a.m. EDT, Wednesday, Sept. 18.
Today the U.S. Geological Survey led a congressional briefing featuring state and regional water stakeholders who spoke about vital uses of comprehensive water information that would be met by the National Water Census called for by the SECURE Water Act of 2009.
Growing populations, increased energy development, and the uncertain effects of a changing climate magnify the need for an improved understanding of water use and water availability. However, no comprehensive and current national assessment of water resources exists.
A report released in April, Progress Toward Establishing a National Assessment of Water Availability and Use, fulfilled a requirement under the 2009 law for the Secretary of the Interior to report to Congress on progress made in implementing the national water availability and use assessment program, also referred to as a National Water Census.
"It’s true in other fields and no less so for water: you can’t manage what you don’t measure," said Anne Castle, Department of the Interior Assistant Secretary for Water and Science. "The Water Census will quantify water supply and demand consistently across the entire country, fill in gaps in existing data, and make that information available to anyone who needs it—and that represents a huge step forward on the path toward water sustainability."
The National Academy of Sciences applauded the concept of a Water Census in 2009, suggesting that it would be "an ongoing, effective tool, on a par with the social and economic censuses, that supports national decision making."
"As competition for water grows — for irrigation of crops, for the production of energy, for use by cities and communities, and for the environment — the need for information and tools to aid water-resource managers also grows," said Tony Willardson, Executive Director, Western States Water Council.
"The more accurately we can assess the quantity and quality of our water resources, the better we can know whether our strategies for conserving and improving those resources are actually having the desired beneficial effect," observed Bob Tudor, Deputy Director, Delaware River Basin Commission. Willardson and Tudor were speakers at the briefing.
A Water Census is a complex undertaking, which points to why national water availability and use have not been comprehensively assessed in more than 35 years. Since then, competition for water resources has increased greatly and, in addition to human use, considerably more importance is now attached to the availability of water for environmental and ecosystem needs.
The USGS envisions the Water Census to be a key ongoing activity that, like the population census mandated by the Constitution, supports national decision-making in many different ways.
The resources currently available for this census are finite, however. USGS foresees that estimates of flow at ungaged locations and estimates of evapotranspiration will be among the earliest products of the National Water Census. Providing complete water-use information and adequately assessing the Nation’s groundwater resources with respect to water availability will require additional time.
Although the existing data are limited and much work remains to be done, funding over the past two years has allowed important progress. The USGS will continue to work with partner agencies and organizations to maximize the utility of the information for a broad range of uses.
The Water Census is part of an overarching Department of the Interior (DOI) initiative known as WaterSMART (Sustain and Manage America’s Resources for Tomorrow). Through WaterSMART, the Department is working to achieve a sustainable water strategy to help meet the Nation’s water needs. The Water Census will help inform that strategy.
The USGS is developing plans for the Water Census in coordination with other Federal and non-Federal agencies, universities, and other organizations. Collaboration across agency boundaries ensures that information produced by the USGS can be aggregated with data on other types of physical, social, economic, and environmental factors that affect water availability. Both the USGS and the U.S. Bureau of Reclamation (USBR) have substantive responsibilities under the Department of the Interior WaterSMART initiative.
One of the geographic focus areas of the Water Census is the drainage basin of the Colorado River, which covers parts of seven states, delivers water to more than 30 million people, irrigates nearly 4 million acres of cropland in the U.S. and Mexico, and supplies hydropower plants that annually generate more than 10 billion kilowatt-hours. Increasing population, decreasing streamflows, and the uncertain effects of a changing climate amplify the need for an improved understanding of water use and water availability in this crucial watershed.
In response to strong demand, the USGS has posted new US Topo quadrangles covering Colorado (1,794 maps) and Minnesota (1,689). These new quads replace the first-edition US Topo maps for those states. The replaced maps will be added to the USGS Historical Topographic Map Collection and are also available for free download from The National Map and the USGS Map Locator & Downloader website.
The new design for US Topo maps improves readability of maps for online and printed use, while retaining the look and feel of the traditional USGS topo map. Also, map symbols are now easier to read over the digital aerial photograph layer whether the imagery is turned on or off.
Other re-design enhancements and new features:
- New shaded relief layer for enhanced view of the terrain
- Military installation boundaries, post offices and cemeteries
- New road classification
- A slight screening (transparency) has been applied to some features to enhance visibility of multiple competing layers
- New PDF legend attachment
- Metadata formatted to support multiple browsers
In addition, the new Colorado US Topo quads include recreational trails in National Forests, provided by the U.S. Forest Service. Although this first test of trails was successful, the Forest Service does not yet have comparable data in other states, and schedules for adding trails in all National Forests have not been set.
"We are excited to about these two updates that are part of our continual effort to improve US Topo maps for our users," said Vicki Lukas, USGS Chief of Partner and User Engagement. "First, the new design makes US Topo maps even easier to use, and the new Colorado maps include Forest Service trails as a new feature."
US Topo maps are updated every three years. The initial round of the 48 conterminous state coverage was completed last September. Hawaii and Puerto Rico maps are being completed this year. New US Topo maps for Alaska have started, but will take several years to complete.
US Topo maps are created from geographic datasets in The National Map, and deliver visible content such as high-resolution aerial photography, which was not available on older paper-based topographic maps. The new US Topo maps provide modern technical advantages that support wider and faster public distribution and on-screen geographic analysis tools for users.
For more information, go to: http://nationalmap.gov/ustopo/
(Figure: proposed US Topo production schedule. States updated in 2012 are in yellow; states that have been or will be updated in 2013 are in red; and states scheduled to be updated in 2014 are in blue.)
ST. PETERSBURG, Fla. — Acidification of the Arctic Ocean is occurring faster than projected, according to new findings published in the journal PLOS ONE. The increased rate is attributed to rapidly melting sea ice, a process that may have important consequences for the health of the Arctic ecosystem.
Ocean acidification is the process by which the pH of seawater decreases as the oceans absorb greater amounts of carbon dioxide from the atmosphere. Currently, the oceans absorb about one-fourth of this greenhouse gas. Lower pH makes water more acidic, and laboratory studies have shown that more acidic water decreases calcification rates in many calcifying organisms, reducing their ability to build shells or skeletons. These changes, in species ranging from corals to shrimp, have the potential to affect species up and down the food web.
The team of federal and university researchers found that the decline of sea ice in the Arctic summer has important consequences for the surface layer of the Arctic Ocean. As sea ice cover recedes to record lows, as it did late in the summer of 2012, the seawater beneath is exposed to carbon dioxide, which is the main driver of ocean acidification.
In addition, the freshwater melted from sea ice dilutes the seawater, lowering pH levels and reducing the concentrations of calcium and carbonate, which are the constituents, or building blocks, of the mineral aragonite. Aragonite and other carbonate minerals make up the hard part of many marine micro-organisms’ skeletons and shells. The lowering of calcium and carbonate concentrations may impact the growth of organisms that many species rely on for food.
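For reference, the chemistry behind these statements is standard: dissolved carbon dioxide forms carbonic acid, which releases hydrogen ions (lowering pH) and shifts the balance of carbonate species, while shell stability depends on the aragonite saturation state. The equations below are a textbook summary, not taken from the study itself:

```latex
% CO2 dissolves and dissociates, releasing H+ and lowering pH:
\mathrm{CO_2(aq) + H_2O \rightleftharpoons H_2CO_3
  \rightleftharpoons H^+ + HCO_3^-
  \rightleftharpoons 2\,H^+ + CO_3^{2-}}

% Aragonite saturation state; dissolution is favored when \Omega < 1:
\Omega_{\mathrm{aragonite}} =
  \frac{[\mathrm{Ca^{2+}}]\,[\mathrm{CO_3^{2-}}]}{K'_{\mathrm{sp}}}
```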
The new research shows that acidification in surface waters of the Arctic Ocean is rapidly expanding into areas that were previously isolated from contact with the atmosphere due to the former widespread ice cover.
"A remarkable 20 percent of the Canadian Basin has become more corrosive to carbonate minerals in an unprecedented short period of time. Nowhere on Earth have we documented such large scale, rapid ocean acidification" according to lead researcher and ocean acidification project chief, U.S. Geological Survey oceanographer Lisa Robbins.
Globally, Earth's ocean surface is becoming acidified due to absorption of man-made carbon dioxide. Ocean acidification models show that with increasing atmospheric carbon dioxide, the Arctic Ocean will have critically low concentrations of dissolved carbonate minerals, such as aragonite, in the next decade.
"In the Arctic, where multi-year sea ice has been receding, we see that the dilution of seawater with melted sea ice adds fuel to the fire of ocean acidification" according to co-author, and co-project chief, Jonathan Wynn, a geologist from the University of the South Florida. "Not only is the ice cover removed leaving the surface water exposed to man-made carbon dioxide, the surface layer of frigid waters is now fresher, and this means less calcium and carbonate ions are available for organisms."
Researchers were able to investigate seawater chemistry at high spatial resolution during three years of research cruises in the Arctic, alongside joint U.S.-Canada research efforts aimed at mapping the seafloor as part of the U.S. Extended Continental Shelf (ECS) program. In addition to the NOAA-supported ECS ship time, the ocean acidification researchers were funded by the USGS, the National Science Foundation, and the National Oceanic and Atmospheric Administration.
Compared to other oceans, the Arctic Ocean has been rather lightly sampled. "It's a beautiful but challenging place to work," said Robert Byrne, a USF marine chemist. Using new automated instruments, the scientists were able to make 34,000 water-chemistry measurements from the U.S. Coast Guard icebreaker. "This unusually large data set, in combination with earlier studies, not only documents remarkable changes in Arctic seawater chemistry but also provides a much-needed baseline against which future measurements can be compared." Byrne credits scientists and engineers at the USF College of Marine Science with developing much of the new technology.
While scientists can't predict when a great earthquake will generate a pan-Pacific tsunami, new tools being developed by federal and state officials now let them offer more accurate insight into the likely impacts when tsunamis occur. This knowledge can help officials and the public reduce the risk from future tsunamis that will affect California.
What are the potential economic impacts? Which marinas could be destroyed? Who needs to be prepared for evacuations? A newly published report looks at these factors and more.
A hypothetical yet plausible scenario was developed in which a tsunami, created by an earthquake offshore of the Alaska Peninsula, reaches the California coast. The scenario is detailed in a new report, the U.S. Geological Survey's Science Application for Risk Reduction (SAFRR) Tsunami Scenario.
Some of the issues highlighted in the scenario include public safety and economic loss. In this scenario approximately 750,000 people would need to be evacuated, with 90,000 of those being tourists and visitors. Additionally, one-third of the boats in California's marinas could be damaged or completely sunk, resulting in $700 million in losses. It was concluded that neither of California's nuclear power plants would likely be damaged by this particular event.
Looking back to 2011, not only was Japan devastated by the magnitude 9.1 Tohoku earthquake, but the resulting tsunami also swept across the Pacific to California and caused $50-100 million in damage there. Tsunamis generated both near and far can thus lead to substantial losses in California.
The SAFRR Tsunami Scenario is a collaborative effort between the USGS, the California Geological Survey, the California Governor's Office of Emergency Services, the National Oceanic and Atmospheric Administration, other agencies, and academic and other institutions.
"The good news is that three-quarters of California's coastline is cliffs, and thus immune to the harsher and more devastating impacts tsunamis could pose," said Lucy Jones, who is the USGS Science Advisor for Risk Reduction and leads the SAFRR Project. "The bad news is that the one-quarter at risk is some of the most economically valuable property in California."
"In order to effectively protect communities from tsunamis, we must first know what to plan for," continued Jones. "By starting with science, there is a clearer understanding on how tsunamis function and their potential impacts. The scenario will serve as a long-lasting resource to raise awareness and provide scientifically-sound and unbiased information to decision makers in California and abroad."
In this scenario, scientists specifically outline the likely inundation areas; current velocities in key ports and harbors; physical damage and repair costs; economic consequences; environmental impacts; social vulnerability; emergency management; and policy implications for California.
This scenario will also be the focus of discussion at a workshop series starting September 4, convened in partnership with the California Tsunami Hazard Mitigation Program. USGS scientists and partners will explain the scenario and results to stakeholders in the coastal communities of California. The workshops aim to establish a community of experts while fostering the use of science in decision-making.
The workshops will be hosted by the Cabrillo Marine Aquarium (September 4); Santa Barbara County Office of Emergency Management (September 5); San Diego County Office of Emergency Management (September 6); Santa Cruz County Office of Emergency Management (September 9); and the Port of San Francisco (September 10).
The SAFRR Project is the same USGS research group that produced the ShakeOut Scenario in 2008, examining the consequences of a probable major earthquake on the southern San Andreas Fault, and the ARkStorm Scenario in 2011, examining the risks associated with extreme rain events driven by atmospheric rivers.
Figure captions from the scenario report:
- The March 11, 2011, Tohoku-oki tsunami caused significant damage to ships and docks within Crescent City Harbor in California, and a number of ships were sunk within the harbor. Because of extensive sedimentation and potentially contaminated debris within the harbor, recovery efforts took over a year. (Photo credit: Rick Wilson, California Geological Survey)
- Maximum current speeds for the Port of Los Angeles (POLA) and the Port of Long Beach (POLB), according to the SAFRR Tsunami Scenario. In the POLA, currents are strongest at Angels Gate, the Cabrillo Marina, the Boat Yard, and the old Navy Yard. Once the tsunami event is underway, navigation through the Gate would be very dangerous. In the Cabrillo Marina and Boat Yard, currents are likely strong enough to break apart floating docks, damage piles, and pull small vessels from their mooring lines. The strongest currents are found in the old Navy Yard; however, there are no exposed floating assets in this immediate area. At the POLB, strong currents are again found at the Gate, and strong, jet-like currents are predicted at the entrance to the main cargo container area (Pier J). Currents here may be strong enough to damage, and possibly break, mooring lines. (Credit: SAFRR Tsunami Scenario)
- Areas that would be inundated (in red) from the SAFRR Tsunami Scenario at Oakland Airport and Alameda in California. Large portions of Bay Farm Island and Oakland Airport would be flooded. (Credit: SAFRR Tsunami Scenario)
- Areas that would be inundated (in red) from the SAFRR Tsunami Scenario in Newport Beach in Orange County, California, including complete and partial flooding of islands and near overtopping of the Balboa Peninsula. (Credit: SAFRR Tsunami Scenario)
- A map of the California coast in the City of Long Beach showing areas predicted to be inundated (in red) by the SAFRR Tsunami Scenario, including the Long Beach Convention Center and many retail businesses. (Credit: SAFRR Tsunami Scenario)
Two recent images from the Landsat 8 satellite compare land conditions in the vicinity of Yosemite National Park before and during the Rim Fire. The images, from August 15 before the fire began and from August 31, can be contrasted and downloaded from the USGS Earth Resources Observation and Science (EROS) Center.
As of September 3, the Rim Fire, currently the fourth-largest wildfire in California history, has burned 235,841 acres (about 16 times the land area of Manhattan Island) and is 70 percent contained. The Rim Fire started August 17 on lands to the west of Yosemite National Park but spread quickly into western regions of the park.
Landsat imagery provides critical vegetation and fuels information that is used to model fire behavior and make tactical decisions. After a fire, scientists and land managers use Landsat imagery to determine the severity of the fire's effects and to monitor the recovery of the land.
Both images are false-colored to allow identification of critical vegetation and fuels information. In the images, fire appears bright red, vegetation is green, smoke is blue, clouds are white, and bare ground is tan.
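As a rough illustration of how such a composite is built, the sketch below stretches three Landsat 8 bands and stacks them into an RGB image. The band combination (7 = shortwave infrared, 5 = near infrared, 3 = green) is a common burn-visualization choice, and the file names are placeholders; neither is necessarily what was used for the published Rim Fire images:

```python
# Hypothetical false-color composite from three Landsat 8 band files,
# using rasterio and numpy. Band choice and file names are assumptions.
import numpy as np
import rasterio

def read_stretched(path: str) -> np.ndarray:
    """Read a single-band GeoTIFF and stretch it to 0-1 for display."""
    with rasterio.open(path) as src:
        band = src.read(1).astype("float32")
    lo, hi = np.percentile(band[band > 0], (2, 98))  # clip extremes
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

# In this mapping, active fire appears bright red and vegetation green.
rgb = np.dstack([
    read_stretched("LC8_rim_fire_B7.TIF"),  # SWIR2 -> red channel
    read_stretched("LC8_rim_fire_B5.TIF"),  # NIR   -> green channel
    read_stretched("LC8_rim_fire_B3.TIF"),  # green -> blue channel
])
```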
The USGS supports both the Department of the Interior and U.S. Forest Service wildfire response. Throughout the fire season, USGS regularly uploads images for wildfires from several satellites to the Hazard Data Distribution System. These remotely sensed data are used by wildfire responders to map potential risks to communities and determine immediate post-fire effects. So far in 2013, 2,156 images have been distributed for wildfires.
The USGS helps staff the Geospatial Multi-Agency Coordination Group (GeoMAC). The GeoMAC viewer is a mapping application that allows fire managers and the public to access online maps of current fire locations and fire perimeters. Currently, for 2013, GeoMAC is maintaining up-to-date perimeter information for 620 wildfires across the United States.
Landsat is a joint effort of the USGS and NASA. Landsat images are unique in that they provide complete global coverage, are available for free, and span more than 41 years of continuous Earth observation.
- USGS Southern California Wildfire Risk Scenario Project
- NASA 'Fire Towers' in Space
- USGS Landsat
- NASA Landsat
PASADENA, Calif. — Imagine the Delaware River abruptly rising toward Philadelphia in a tsunami-like wave of water. Scientists now propose that this might not be a hypothetical scenario. A newly published paper concludes that a modest (one-foot) tsunami-like event on the East Coast was generated in the past by a large offshore earthquake. This result may have potential ramifications for emergency management professionals, government officials, businesses and the general public.
Early in the morning of Jan. 8, 1817, earthquake shaking was felt along the Atlantic seaboard as far north as Baltimore, Md., and at least as far south as Charleston, S.C. Later that morning, a keen observer documented an abrupt rise in the tide on the Delaware River near Philadelphia, commenting on the earthquake felt earlier to the south, and remarking that the tidal swell was most likely "the reverberation or concussion of the earth operating on the watery element."
Scientists have previously interpreted this earthquake to have a magnitude around 6 and a location somewhere in the Carolinas or slightly offshore. In a new study, USGS research geophysicist Susan Hough and colleagues reconsider the accounts of shaking and, for the first time, consider in detail the Delaware River account. They show that the combined observations point to a larger magnitude and a location farther offshore than previously believed. In particular, they show that a magnitude-7.4 earthquake located 400-500 miles off South Carolina or Georgia could have generated a tsunami wave large enough to account for the tidal swell on the Delaware. Using new computer-assisted research techniques, they uncover first-hand accounts from newspapers and ships' logs that give a wider perspective on the 1817 event. Notably, the predicted timing of such a tsunami wave from this location matches the documented timing in the eyewitness account.
The USGS monitors earthquakes offshore, and in recent years has undertaken research to better understand shaking and tsunami hazard from offshore earthquakes and landslides. Scientific understanding of faults and geological processes in this part of the Atlantic is limited. Still, it has long been understood that large, infrequent offshore earthquakes may pose a tsunami hazard to the Atlantic coast. In 1978, a magnitude-6 earthquake occurred roughly 240 miles southwest of Bermuda, even farther offshore than the inferred location of the 1817 earthquake. In 1929, the magnitude-7.2 Grand Banks, Newfoundland, earthquake triggered a submarine landslide that generated a large tsunami. Waves 10-13 feet high struck the Newfoundland coast, killing 29 people and leaving 10,000 temporarily homeless.
The inferred 1817 tsunami was significantly smaller than the Newfoundland disaster. However, the new interpretation by Hough and colleagues highlights the potential earthquake and tsunami hazard along the Atlantic seaboard from the still poorly understood offshore earthquake faults. The new study highlights that there is still work to be done to characterize this hazard in the southeastern United States.
The study, "Reverberations on the Watery Element: A Significant, Tsunamigenic Historical Earthquake Offshore the Carolina Coast," by Susan E. Hough, Jeffrey Munsey, and Steven N. Ward, is published in the September/October issue of Seismological Research Letters.
More than 400 new topographic maps are now available for the state of Alaska. The new maps are part of the U.S. Geological Survey Alaska Mapping Initiative (AMI), which aims to update foundational data for the state and replace existing maps that are about 50 years old.
"These new digital maps of Alaska are elevating our visual record of the surface of the state to 21st century levels," said Anne Castle, Assistant Secretary of the Interior for Water and Science. "The associated advances in human safety, navigation, and natural resource management cannot be overestimated. The productive partnership between the State government and the USGS is facilitating acquisition of the necessary data to complete digital mapping of Alaska, which is a critical chapter in the history of our geographical knowledge of the North American continent."
The first 400-plus new US Topo maps for Alaska are now accessible and are the beginning of a multi-year project that will ultimately produce more than 11,000 new maps for the entire state. The goal of the AMI is a complete series of digital topographic maps at a scale of 1:25,000 to replace the 1:63,360-scale maps produced about 50 years ago. The maps are published in digital PDF format (GeoPDF©) and are available for free download and manipulation on a computer.
These new maps include several layers, with an option for the user to turn them on or off. Major updated features include:
- Satellite image layers that allow a recent view of the Earth's surface.
- Contours and shaded relief layers showing the lay of the land derived from newly acquired 5-meter radar elevation data.
- Surface water features from the USGS National Hydrography Dataset, which are updated by local stewards and USGS.
- Glaciers updated using Randolph Glacier Inventory data.
- Boundaries integrated from multiple sources, including Census and major Federal landholders.
- The Public Land Survey System layer from the Bureau of Land Management.
- Roads from a commercial vendor under a USGS contract.
- Railroads and the Trans-Alaska oil pipeline data from local sources.
- Important buildings, including police stations, schools, and hospitals.
- Airports, heliports and seaplane landing strips compiled by USGS from multiple sources.
- Feature names from the USGS-maintained Geographic Names Information System.
To ensure that the maps meet current accuracy specifications and standards, the maps will be made using newly acquired elevation and imagery data from multiple state, federal and commercial sources. The map-making process will be largely automated using software specially adapted by the USGS to create approximately 11,275 digital map quadrangles, covering the entire area of the state.
Mapping in Alaska did not keep pace with mapping in the rest of the Nation because of difficult terrain, remote locations, and vast distances. Modern mapping information does not exist for the majority of land in the state. Prior to this effort, topographic maps for much of Alaska were about 50 years out of date and not produced to current standards, which rely largely on high-resolution digital imagery and elevation data. As a consequence, essential public services have suffered, among them transportation planning and safety, urban and regional planning, economic development, natural resource management, conservation, and scientific research.
This new generation of digital topographic maps will continue the rich and valuable USGS cartographic history, and serve the Nation by providing reliable scientific information to describe and understand the Earth; minimize loss of life and property from natural disasters; manage water, biological, energy, and mineral resources; and enhance and protect quality of life.
For more information and download, go to: http://nationalmap.gov/alaska/
(Photo caption: Historically, Alaska has been a proving ground for many developments in modern surveying and mapping. Field surveying and topographic mapping of the Alaskan interior by the USGS began in the 1890s following the discovery of gold in the Yukon. Travel was often by dog sled and pack train, canoe, and walrus-skin kayak, as shown in this undated photo.)
(Image caption: Part of a new US Topo quadrangle map of Fairbanks, Alaska, southwest borough, with enhanced elevation contours, surface water, names, transportation, and structures data.)
Hydraulic fracturing fluids are believed to be the cause of the widespread death or distress of aquatic species in Kentucky's Acorn Fork, after spilling from nearby natural gas well sites. These findings are the result of a joint study by the U.S. Geological Survey and the U.S. Fish and Wildlife Service.
The Acorn Fork, a small Appalachian creek, is habitat for the federally threatened Blackside dace, a small, colorful minnow. The Acorn Fork is designated by Kentucky as an Outstanding State Resource Water.
"Our study is a precautionary tale of how entire populations could be put at risk even with small-scale fluid spills," said USGS scientist Diana Papoulias, the study's lead author. "This is especially the case if the species is threatened or is only found in limited areas, like the Blackside dace is in the Cumberland."
The Blackside dace typically lives in small, semi-isolated groups, so harmful events run the risk of completely eliminating a local population. The species is primarily threatened with loss of habitat.
After the spill of hydraulic fracturing fluid, state and federal scientists observed a significant die-off of aquatic life in Acorn Fork including the Blackside dace as well as several more common species like the Creek chub and Green sunfish. They had been alerted by a local resident who witnessed the fish die-off. The U.S. Fish and Wildlife Service and the Commonwealth of Kentucky are currently working towards restoration of the natural resources that were injured by the release.
To determine the cause of the fish die-off, the researchers collected water and fish samples immediately following the chemical release in 2007.
The sample analyses clearly showed that the hydraulic fracturing fluids degraded water quality in Acorn Fork to the point that fish developed gill lesions and suffered liver and spleen damage as well.
"This is an example of how the smallest creatures can act as a canary in a coal mine," said Tony Velasco, Ecologist for the Fish and Wildlife office in Kentucky, who coauthored the study, and initiated a multi-agency response when it occurred in 2007. "These species use the same water as we do, so it is just as important to keep our waters clean for people and for wildlife."
The gill lesions were consistent with exposure to acidic water and toxic concentrations of heavy metals. These results matched water quality samples from Acorn Fork that were taken after the spill.
After the fracturing fluids entered Acorn Fork Creek, the water's pH dropped from 7.5 to 5.6, and stream conductivity increased from 200 to 35,000 microsiemens per centimeter. The lower pH indicates that the creek had become more acidic, and the higher conductivity indicates higher levels of dissolved elements, including iron and aluminum.
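Because pH is the negative base-10 logarithm of hydrogen-ion concentration, the reported 1.9-unit drop corresponds to roughly an 80-fold increase in hydrogen-ion concentration (a back-of-the-envelope check on the reported numbers, not a figure from the study):

```latex
% pH = -log10[H+], so the change in [H+] between the two readings is:
\frac{[\mathrm{H^+}]_{\mathrm{after}}}{[\mathrm{H^+}]_{\mathrm{before}}}
  = \frac{10^{-5.6}}{10^{-7.5}} = 10^{1.9} \approx 79
```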
The Blackside dace is a species of ray-finned fish found only in the Cumberland River basin of Kentucky and Tennessee and the Powell River basin of Virginia. It has been listed as a federally threatened species by the U.S. Fish and Wildlife Service since 1987.
Hydraulic fracturing is the most common method for natural gas well-development in Kentucky.
The report is entitled "Histopathological Analysis of Fish from Acorn Fork Creek, Kentucky Exposed to Hydraulic Fracturing Fluid Releases," and is published in the scientific journal Southeastern Naturalist, in a special edition devoted to the Blackside dace.
To learn more about this study and other contaminants research, please visit the USGS Environmental Health web page, the USGS Columbia Environmental Research Center, or the U.S. Fish and Wildlife Service Environmental Contaminants web page.
Declining bighorn sheep populations may be vulnerable to some of the fatal diseases, including chronic wasting disease (CWD), that are found in their western U.S. habitats, according to a new U.S. Geological Survey study.
USGS National Wildlife Health Center (NWHC) research showed that bighorn sheep are likely susceptible to the deadly neurological diseases scrapie and CWD, which are occurring in or near natural bighorn sheep environments. These fatal diseases are caused by misfolded, infectious proteins called prions and are known to infect domestic sheep (scrapie) and non-domestic deer, elk, and moose (CWD). The USGS study is published in the journal BMC Veterinary Research and is available online.
"Bighorn sheep are economically and culturally important to the western U.S.," said Dr. Christopher Johnson, USGS scientist and senior author of the report. "Understanding future risks to the health of bighorn sheep is key to proper management of the species."
USGS laboratory tests found evidence that bighorn sheep could be vulnerable to CWD from either white-tailed deer or elk, and to a domestic sheep prion disease known as scrapie. However, none of a small number of bighorn sheep sampled in the study showed evidence of infection.
"Our results do not mean that bighorns get, or will eventually get, prion diseases," Johnson said. "However, wildlife species like bighorn sheep are increasingly exposed to areas where CWD occurs as the disease expands to new geographical areas and increases in prevalence."
The laboratory test results could be useful to wildlife managers because bighorn sheep habitats overlap with farms and ranches with scrapie-infected sheep and regions where CWD is common in deer, elk, and moose.
Bighorn sheep populations in western North America have declined from habitat loss and, more recently, epidemics of fatal pneumonia thought to be transmitted to them from domestic sheep. Prion diseases are another possible threat to this valuable species.
For more information on prion diseases such as CWD, please visit the USGS NWHC website.
Hurricane Sandy Eroded Half of Fire Island's Beaches and Dunes: New Report Quantifies Coastal Change
ST. PETERSBURG, Fla. – Beaches and dunes on Fire Island, New York, lost more than half of their pre-storm volume during Hurricane Sandy, leaving the area more vulnerable to future storms.
While the damage and destruction on Fire Island was immediately evident after the storm, a new U.S. Geological Survey study released today is the first to quantify the actual changes to the coast caused by the storm.
"The beaches and dunes of the island were severely eroded during Sandy," said Cheryl Hapke, a USGS research geologist and lead author of the study. "The island was breached in three locations, and there was widespread damage and destruction of coastal infrastructure, including private residences. The report shows that the beaches and dunes lost 54.4 percent of their pre-storm volume, and the dunes experienced overwash along 46.6 percent of the island, dramatically changing the island’s shape."
Field surveys conducted immediately after Sandy documented low, flat beaches and extensive dune erosion. Assessment of overwash deposits -- the material that was carried to the interior of the island -- indicates that most of the sand lost from the beaches and dunes during Hurricane Sandy was moved offshore, carried by waves and storm surge. Of the volume of sand that was lost from the beaches and dunes, 14 percent was deposited inland.
"The impact from Sandy was unprecedented in recent times," said Hapke. "It is important that efforts to rebuild on the island be guided by the science, which shows that Sandy profoundly altered the shape and position of the barrier island, shifting it landward and redistributing large amounts of sand. Storms like Sandy are part of the natural evolution of barrier islands, which ultimately result in islands that are more resilient to sea level rise."
The extreme erosion of the beach and loss of dunes made the island more vulnerable to subsequent winter storms. Over the following winter months, the shoreline shifted as much as 57.5 meters (189 feet) inland. Although several areas began to experience some recovery in the early spring, by the end of the survey period only a small fraction, 18 percent, of the pre-Sandy beach volume had returned.
"Barrier islands provide natural protection against storms, shielding coastlines from rising waves and tides," said Hapke. "The loss of so much sand increases the vulnerability of this area of coastline to future storms."
Fire Island is the longest of the barrier islands that lie along the south shore of Long Island, New York. The majority of the island is part of Fire Island National Seashore and not only provides the first line of defense against storms but is also a unique and important recreational and ecosystem resource. USGS research on Fire Island focuses on understanding the evolution of the form and structure of the barrier system on a variety of time scales, including storm-driven change in the region.
Scientists have detected magmatic water — water that originates from deep within the Moon's interior — on the surface of the Moon. These findings, published in the August 25 issue of Nature Geoscience, represent the first such remote detection of this type of lunar water, and were arrived at using data from NASA's Moon Mineralogy Mapper (M3).
The discovery represents an exciting contribution to the rapidly changing understanding of lunar water, said Rachel Klima, a planetary geologist at the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md., and lead author of the paper, "Remote detection of magmatic water in Bullialdus Crater on the Moon."
"For many years, researchers believed that the rocks from the Moon were 'bone dry' and that any water detected in the Apollo samples had to be contamination from Earth," said Klima, a member of the NASA Lunar Science Institute's (NLSI) Scientific and Exploration Potential of the Lunar Poles team. "About five years ago, new laboratory techniques used to investigate lunar samples revealed that the interior of the Moon is not as dry as we previously thought. Around the same time, data from orbital spacecraft detected water on the lunar surface, which is thought to be a thin layer formed from solar wind hitting the lunar surface."
"This surficial water unfortunately did not give us any information about the magmatic water that exists deeper within the lunar crust and mantle, but we were able to identify the rock types in and around Bullialdus crater," said co-author Justin Hagerty, of the U.S. Geological Survey. "Such studies can help us understand how the surficial water originated and where it might exist in the lunar mantle."
In 2009, the M3, aboard the Indian Space Research Organisation's Chandrayaan-1 spacecraft, fully imaged the lunar impact crater Bullialdus. "It's within 25 degrees latitude of the equator and so not in a favorable location for the solar wind to produce significant surface water," Klima explained. "The rocks in the central peak of the crater are of a type called norite that usually crystallizes when magma ascends but gets trapped underground instead of erupting at the surface as lava. Bullialdus crater is not the only location where this rock type is found, but the exposure of these rocks combined with a generally low regional water abundance enabled us to quantify the amount of internal water in these rocks."
After examining the M3 data, Klima and her colleagues found that the crater has significantly more hydroxyl — a molecule consisting of one oxygen atom and one hydrogen atom — compared to its surroundings. "The hydroxyl absorption features were consistent with hydroxyl bound to magmatic minerals that were excavated from depth by the impact that formed Bullialdus crater," Klima writes.
The internal magmatic water provides information about the Moon's volcanic processes and internal composition, Klima said. "Understanding this internal composition helps us address questions about how the Moon formed, and how magmatic processes changed as it cooled. There have been some measurements of internal water in lunar samples, but until now this form of native lunar water has not been detected from orbit."
The detection of internal water from orbit means that scientists can begin to test some of the findings from sample studies in a broader context, including in regions that are far from where the Apollo sites are clustered on the near side of the Moon. "Now we need to look elsewhere on the Moon and try to test our findings about the relationship between the incompatible trace elements (e.g., thorium and uranium) and the hydroxyl signature," Klima said. "In some cases this will involve accounting for the surface water that is likely produced by interactions with the solar wind, so it will require integration of data from many orbital missions."
"This impressive research confirms earlier lab analyses of Apollo samples, and will help broaden our understanding of how this water originated and where it might exist in the lunar mantle," said NLSI Director Yvonne Pendleton.
Along with Klima and Hagerty, Joshua Cahill and David Lawrence, both of APL, co-authored the paper. The research was supported by the NASA Lunar Advanced Science and Engineering Program, NLSI and the NASA Planetary Mission Data Analysis Program.
The Applied Physics Laboratory, a not-for-profit division of The Johns Hopkins University, meets critical national challenges through the innovative application of science and technology. For more information, visit the JHUAPL website.