This oblique aerial photograph from 2006 shows the Barter Island long-range radar station landfill threatened by coastal erosion. The landfill was subsequently relocated farther inland; however, the coastal bluffs continue to retreat.
ANCHORAGE, Alaska — In a new study published today, scientists from the U.S. Geological Survey found that the remote northern Alaska coast has some of the highest shoreline erosion rates in the world. Analyzing more than half a century of shoreline change data, they found the pattern to be highly variable, with most of the coast retreating at rates of more than 1 meter per year.
“Coastal erosion along the Arctic coast of Alaska is threatening Native Alaskan villages, sensitive ecosystems, energy and defense related infrastructure, and large tracts of Native Alaskan, State, and Federally managed land,” said Suzette Kimball, acting director of the USGS.
Scientists studied more than 1,600 kilometers of the Alaskan coast between the U.S.-Canadian border and Icy Cape and found that the average rate of shoreline change, taking into account beaches that are both eroding and expanding, was -1.4 meters per year. Among eroding beaches, the most extreme case exceeded 18.6 meters of retreat per year.
“This report provides invaluable objective data to help native communities, scientists and land managers understand natural changes and human impacts on the Alaskan coast,” said Ann Gibbs, USGS Geologist and lead author of the new report.
Coastlines change in response to a variety of factors, including changes in the amount of available sediment, storm impacts, sea-level rise and human activities. How much a coast erodes or expands in any given location is due to some combination of these factors, which vary from place to place.
"There is increasing need for this kind of comprehensive assessment in all coastal environments to guide managed response to sea-level rise and storm impacts," said Dr. Bruce Richmond of the USGS. "It is very difficult to predict what may happen in the future without a solid understanding of what has happened in the past. Comprehensive regional studies such as this are an important tool to better understand coastal change."
Compared to other coastal areas of the U.S., where four or more historical shoreline data sets are available, generally reaching back to the mid-1800s, shoreline data for the coast of Alaska are limited. The researchers used two historical data sources, maps and aerial photographs from the 1940s and 2000s, as well as modern data such as lidar, or “light detection and ranging,” to measure shoreline change at 26,567 locations.
There is no widely accepted standard for analyzing shoreline change. The impetus behind the National Assessment project was to develop a standardized method of measuring changes in shoreline position that is consistent on all coasts of the country. The goal was to facilitate the process of periodically and systematically updating the results in a consistent manner.
The report, titled “National Assessment of Shoreline Change: Historical Shoreline Change Along the North Coast of Alaska, U.S.-Canadian Border to Icy Cape,” is the 8th Long-Term Coastal Change report produced as part of the USGS’s National Assessment of Coastal Change Hazards project. A comprehensive database of digital vector shorelines and rates of shoreline change for Alaska, from the U.S.-Canadian border to Icy Cape, is presented along with this report. Data for all 8 long-term coastal change reports are also available on the USGS Coastal Change Hazards Portal.
ANCHORAGE, Alaska — Greenhouse gas emissions remain the primary threat to the preservation of polar bear populations worldwide. This conclusion holds true under both a reduced greenhouse gas emission scenario that stabilizes climate warming and another scenario where emissions and warming continue at the current pace, according to updated U.S. Geological Survey research models.
Under both scenarios, the outcome for the worldwide polar bear population will very likely worsen over time through the end of the century.
The modeling effort examined the prognosis for polar bear populations in the four ecoregions (see map) comprising their range, using current sea ice projections from the Intergovernmental Panel on Climate Change for two greenhouse gas (GHG) emission scenarios: one in which reduced GHG emissions stabilize climate warming by century’s end, and one in which unabated (unchanged) GHG emissions lead to increased warming by century’s end.
“Addressing sea ice loss will require global policy solutions to reduce greenhouse gas emissions and likely be years in the making,” said Mike Runge, a USGS research ecologist. “Because carbon emissions accumulate over time, there will be a lag, likely on the order of several decades, between mitigation of emissions and meaningful stabilization of sea ice loss.”
Under the unabated emission scenario, polar bear populations in two of four ecoregions were projected to reach a greatly decreased state about 25 years sooner than under the stabilized scenario. Under the stabilized scenario, GHG emissions peak around 2040, decline through 2080, then remain stable through the end of the century. In this scenario, USGS projected that all ecoregion populations will greatly decrease except for the Archipelago Ecoregion, located in the high-latitude Canadian Arctic, where sea ice generally persists longer in the summer. These updated modeling outcomes reinforce earlier suggestions of the Archipelago’s potential as an important refuge for ice-dependent species, including the polar bear.
The models, updated from 2010, evaluated specific threats to polar bears such as sea ice loss, prey availability, hunting, and increased human activities, and incorporated new findings on regional variation in polar bear response to sea ice loss.
“Substantial sea ice loss and expected declines in the availability of marine prey that polar bears eat are the most important specific reasons for the increasingly worse outlook for polar bear populations,” said Todd Atwood, research biologist with the USGS and lead author of the study. “We found that other environmental stressors, such as trans-Arctic shipping, oil and gas exploration, disease and contaminants, sustainable harvest, and defense-of-life takes, had only negligible effects on polar bear populations compared with the much larger effects of sea ice loss and associated declines in their ability to access prey.”
Additionally, USGS researchers noted that if the summer ice-free period lengthens beyond 4 months – as forecasted to occur during the last half of this century in the unabated scenario – the negative effects on polar bears will be more pronounced. Polar bears rely on ice as the platform for hunting their primary prey – ice seals – and when sea ice completely melts in summer, the bears must retreat to land where their access to seals is limited. Other research this year has shown that terrestrial foods available to polar bears during these land-bound months are unlikely to help polar bear populations adapt to sea ice loss.
USGS scientists’ research found that managing threats other than greenhouse gas emissions could slow the progression of polar bear populations to an increasingly worse status. The most optimistic prognosis for polar bears would require immediate and aggressive reductions of greenhouse gas emissions that would limit global warming to less than 2°C above preindustrial levels.
The U.S. Fish and Wildlife Service listed the polar bear as threatened under the Endangered Species Act in 2008 due to the threat posed by sea ice loss. The polar bear was the first species to be listed because of climate change. A plan to address recovery of the polar bear will be published in the Federal Register by the USFWS for public review on July 2, 2015.
The updated forecast for polar bears was developed by USGS as part of its Changing Arctic Ecosystems Initiative, together with collaborators from the U.S. Forest Service and Polar Bears International. The polar bear forecasting report is available online.

Polar Bear Ecoregions: In the Seasonal Ice Ecoregion (see map), sea ice melts completely in summer and all polar bears must be on land. In the Divergent Ice Ecoregion, sea ice pulls away from the coast in summer, and polar bears must be on land or move with the ice as it recedes north. In the Convergent Ice and Archipelago Ecoregions, sea ice is generally retained during the summer.
The amount of water required to hydraulically fracture oil and gas wells varies widely across the country, according to the first national-scale analysis and map of hydraulic fracturing water usage, detailed in a new USGS study accepted for publication in Water Resources Research, a journal of the American Geophysical Union. The research found that average water volumes for hydraulic fracturing within watersheds across the United States range from as little as 2,600 gallons to as much as 9.7 million gallons per well.

This map shows the average water use in hydraulic fracturing per oil and gas well in watersheds across the United States.
In addition, from 2000 to 2014, the median annual water volume estimate for hydraulic fracturing in horizontal wells increased from about 177,000 gallons per oil and gas well to more than 4 million gallons per oil well and 5.1 million gallons per gas well. Meanwhile, median water use in vertical and directional wells remained below 671,000 gallons per well. For comparison, an Olympic-sized swimming pool holds about 660,000 gallons.
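As a rough way to visualize these volumes, the medians quoted above can be converted into Olympic-pool equivalents using the 660,000-gallon figure from the release. This is a back-of-the-envelope sketch, not part of the study itself:

```python
# Express the per-well water volumes quoted above in Olympic-pool
# equivalents (one pool ~ 660,000 gallons, per the release).
OLYMPIC_POOL_GALLONS = 660_000

def pools(gallons):
    """Return the equivalent number of Olympic-size pools, rounded to 0.1."""
    return round(gallons / OLYMPIC_POOL_GALLONS, 1)

# Median volumes from the study (gallons per well)
horizontal_2000 = 177_000        # horizontal wells, circa 2000
horizontal_oil_2014 = 4_000_000  # horizontal oil wells, 2014
horizontal_gas_2014 = 5_100_000  # horizontal gas wells, 2014

print(pools(horizontal_2000))      # ~0.3 pools
print(pools(horizontal_oil_2014))  # ~6.1 pools
print(pools(horizontal_gas_2014))  # ~7.7 pools
```

In other words, the median horizontal well went from a fraction of a pool of water around 2000 to six to eight pools by 2014, while vertical and directional wells stayed at about one pool or less.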
“One of the most important things we found was that the amount of water used per well varies quite a bit, even within a single oil and gas basin,” said USGS scientist Tanya Gallegos, the study’s lead author. “This is important for land and resource managers, because a better understanding of the volumes of water injected for hydraulic fracturing could be a key to understanding the potential for some environmental impacts.”

This map shows the percentage of oil and gas wells that use horizontal drilling in watersheds across the United States.
Horizontal wells are those that are first drilled vertically or directionally (at an angle from straight down) to reach the unconventional oil or gas reservoir and then laterally along the oil or gas-bearing rock layers. This is done to increase the contact area with the reservoir rock and stimulate greater oil or gas production than could be achieved through vertical wells alone.
However, horizontal wells also generally require more water than vertical or directional wells. In fact, in 52 out of the 57 watersheds with the highest average water use for hydraulic fracturing, over 90 percent of the wells were horizontally drilled.
Although there has been an increase in the number of horizontal wells drilled since 2008, about 42 percent of new hydraulically fractured oil and gas wells completed in 2014 were still either vertical or directional. The ubiquity of the lower-water-use vertical and directional wells explains, in part, why the amount of water used per well is so variable across the United States.
The watersheds where the most water was used to hydraulically fracture wells on average coincided with parts of the following shale formations:
- Eagle Ford (within watersheds located mainly in Texas)
- Haynesville-Bossier (within watersheds located mainly in Texas & Louisiana)
- Barnett (within watersheds located mainly in Texas)
- Fayetteville (within watersheds located in Arkansas)
- Woodford (within watersheds located mainly in Oklahoma)
- Tuscaloosa (within watersheds located in Louisiana & Mississippi)
- Marcellus & Utica (within watersheds located in parts of Ohio, Pennsylvania, West Virginia and within watersheds extending into southern New York)
Shale gas reservoirs are often hydraulically fractured using slick water, a fluid type that requires a lot of water. In contrast, tight oil formations like the Bakken (in parts of Montana and North Dakota) often use gel-based hydraulic fracturing treatment fluids, which generally contain lower amounts of water.
This research was carried out as part of a larger effort by the USGS to understand the resource requirements and potential environmental impacts of unconventional oil and gas development. Prior publications include historical trends in the use of hydraulic fracturing from 1947 to 2010, as well as the chemistry of produced waters from hydraulically fractured wells.
The report is titled “Hydraulic fracturing water use variability in the United States and potential environmental implications” and has been accepted for publication in Water Resources Research. More information about this study and other USGS energy research can be found at the USGS Energy Resources Program. Stay up to date on USGS energy science by signing up for our quarterly newsletter or following us on Twitter!
Contact: Jon Campbell, 703-648-4180
The U.S. Geological Survey salutes the European Space Agency (ESA) on the successful June 23 launch of its Sentinel-2A satellite, the second satellite to be launched in Europe’s Copernicus environment monitoring program.
"We are very pleased to have such a talented new player join the team in watching Earth from space,” said Suzette Kimball, acting USGS Director. “The aptly named Sentinel mission will help sharpen our focus on changes in Earth systems and contribute further insight to a great many global challenges at international to local scales, including food security, forest and wildlife conservation, and disaster response."
Sentinel-2 imagery is expected to supply valuable parallels and counterparts to Landsat imagery provided by the United States. Before Sentinel-2A launched, USGS and ESA staff worked together at length to ensure that Sentinel-2 data would be as compatible as possible with Landsat data.
First launched by NASA in 1972, the Landsat series of satellites has produced the longest continuous record of Earth’s land surface as seen from space. Landsat images have been used by scientists and resource managers to monitor water quality, glacier recession, coral reef health, land use change, deforestation rates, and population growth.
Landsat is a joint effort of USGS and NASA. NASA develops remote-sensing instruments and spacecraft, launches the satellites, and validates their performance. USGS develops the associated ground systems and, since 2000, has owned and operated the satellites, as well as managing data reception, archiving, and distribution. Landsat data were made available to all users free of charge under a policy change by the U.S. Department of the Interior and USGS in late 2008.
“We are also pleased that a free and open data policy has been adopted for users of Sentinel data,” Kimball added. “Free, open access to Landsat and Sentinel-2 data together will create remarkable economic and scientific benefits for people around the globe.”
Designed as a two-satellite constellation – Sentinel-2A and -2B – the Sentinel-2 mission carries an innovative wide-swath, high-resolution multispectral imager with 13 spectral bands. However, it will not fully duplicate the Landsat data stream, which includes thermal measurements. Sentinel-1A, a satellite with radar-based instruments, was launched April 3, 2014.
Once it is fully operational following several months of on-orbit testing, Sentinel-2A alone could provide 10-day repeat coverage of Earth’s land areas. With Sentinel-2A data added to the 8-day coverage from Landsat 7/8 combined, users can look forward to better-than-weekly coverage at moderate resolution. Repeat coverage capabilities will further increase with the planned launch of a second Sentinel-2 satellite (Sentinel-2B) next year.
NASA has published an online comparison of Sentinel-2A and Landsat bandwidths.
Wading bird numbers in the Florida Everglades are driven by water patterns that play out over multiple years, according to a new study by the U.S. Geological Survey and Florida Atlantic University. Previously, current-year water conditions were seen as the primary factor affecting bird numbers, but this research shows that the preceding years’ water conditions and availability are equally important.
“We’ve known for some time that changes in water levels trigger a significant response by wading birds in the Everglades,” said James Beerens, the study’s lead author and an ecologist at USGS. “But what we discovered in this study is the importance of history. What happened last year can tell you what to expect this year.”
From 2000 to 2009, scientists examined foraging distribution and abundance data for wading bird populations, including Great Egrets, White Ibises, and threatened Wood Storks. To do the research, they conducted reconnaissance flights across the Greater Everglades system, an area that includes Big Cypress National Preserve and Everglades National Park. They found climate and water management conditions going as far back as three years influenced current bird population numbers and distribution.
“We know wading birds depend on small fish and invertebrates for food,” said Dale Gawlik, director of FAU’s Environmental Science Program and study coauthor. “What is interesting is the ‘lag effect’; wet conditions that build up invertebrate and fish numbers may not result in increased bird numbers until after several more wet years.”
This new information has allowed scientists to improve existing wading bird distribution models, providing a more accurate tool to estimate wading bird numbers under climate change scenarios and hydrological restoration scenarios proposed for the Everglades.
In the Everglades, food items such as small fish and crayfish are concentrated from across the landscape into pools as water levels recede throughout the dry season. It does not always work that way anymore due to a lack of water and loss of habitat in Everglades marshes. This new research shows that under the right dry season conditions following a water pulse in previous years, wading bird food is even further concentrated in near-perfect water depths, setting off a boom in the numbers of young wading birds that add to the population.
Beerens and computer scientists from the USGS have also developed publicly available software as an extension to this work that predicts wading bird numbers in the Everglades based on real-time, current conditions, in addition to historical settings. This new model allows managers to simulate the effect of various management strategies that can have an impact on future bird numbers. The number and distribution of wading birds serve as an important indicator of ecosystem health in the Everglades. Beerens further explained that “increased seasonal water availability in drier areas of the Everglades stimulates the entire ecosystem, as reflected in the wading birds.”
Altered water patterns resulting from land-use and water management changes have reduced wading bird numbers throughout the Everglades by about 90 percent since the turn of the 20th century. This research shows that how water is currently managed and used is equally important.
“Our findings also suggest that we can continue to improve the Everglades and its wading bird community by restoring water availability to areas that are over drained,” said Beerens. “There is increasing understanding that water availability and proper management make this entire ecological and economic engine work.”
Florida generates more than $3 billion in annual revenue from resident and nonresident wildlife watchers according to estimates from the U.S. Fish and Wildlife Service. Of the 1.9 million people who view wildlife in Florida while ‘away-from-home’ each year, more than 1.3 million watch wading birds and other water-dependent birds.
The study, “Linking Dynamic Habitat Selection with Wading Bird Foraging Distributions across Resource Gradients,” was published in the journal PLOS ONE and can be found online.
Scientists are expecting that this year’s Chesapeake Bay hypoxic low-oxygen zone, also called the “dead zone,” will be approximately 1.37 cubic miles – about the volume of 2.3 million Olympic-size swimming pools. While still large, this is 10 percent lower than the long-term average as measured since 1950.
The anoxic portion of the zone, which contains no oxygen at all, is predicted to be 0.27 cubic miles in early summer, growing to 0.28 cubic miles by late summer. Low river flow and low nutrient loading from the Susquehanna River this spring account for the smaller predicted size.
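The swimming-pool comparison above can be sanity-checked with standard unit conversions. This is a back-of-the-envelope sketch; the 660,000-gallon Olympic-pool figure is the one used elsewhere in these releases:

```python
# Verify that 1.37 cubic miles is roughly 2.3 million Olympic pools.
# 1 mi^3 = 5280^3 cubic feet; 1 cubic foot ~ 7.480519 U.S. gallons.
GALLONS_PER_CUBIC_MILE = 5280**3 * 7.480519
OLYMPIC_POOL_GALLONS = 660_000

hypoxic_volume_mi3 = 1.37  # predicted midsummer hypoxic zone volume
pools = hypoxic_volume_mi3 * GALLONS_PER_CUBIC_MILE / OLYMPIC_POOL_GALLONS
print(round(pools / 1e6, 1))  # -> 2.3 (million pools)
```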
This is the ninth year for the Bay outlook which, because of the shallow nature of large areas of the estuary, focuses on water volume or cubic miles, instead of square mileage as used in the Gulf of Mexico dead zone forecast announced last week. The history of hypoxia in the Chesapeake Bay since 1985 can be found at EcoCheck, a website from the University of Maryland Center for Environmental Science.
The Bay’s hypoxic and anoxic zones are caused by excessive nutrient pollution, primarily from human activities such as agriculture and wastewater. The nutrients stimulate large algal blooms that deplete oxygen from the water as they decay. The low oxygen levels are insufficient to support most marine life and habitats in near-bottom waters and threaten the Bay’s production of crabs, oysters and other important fisheries.
The Chesapeake Bay Program coordinates a multi-year effort to restore the Bay’s water and habitat quality and enhance its productivity. The forecast and oxygen measurements taken during summer monitoring cruises are used to test and improve our understanding of how nutrients, hydrology, and other factors affect the size of the hypoxic zone. They are key to developing effective hypoxia reduction strategies.
The predicted “dead zone” size is based on models that forecast three features of the zone to give a comprehensive view of expected conditions: midsummer volume of the low-oxygen hypoxic zone, early-summer oxygen-free anoxic zone, and late-summer oxygen-free anoxic zone. The models were developed by NOAA-sponsored researchers at the University of Maryland Center for Environmental Science and the University of Michigan. They rely on nutrient loading estimates from the U.S. Geological Survey.
"These ecological forecasts are good examples of the critical environmental intelligence products and tools that NOAA is providing to stakeholders and interagency management bodies such as the Chesapeake Bay Program," said Kathryn D. Sullivan, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “With this information, we can work collectively on ways to reduce pollution and protect our marine environments for future generations.”
The hypoxia forecast is based on the relationship between nutrient loading and oxygen. Aspects of weather, including wind speed, wind direction, precipitation and temperature also impact the size of dead zones. For example, in 2014, sustained winds from Hurricane Arthur mixed Chesapeake Bay waters, delivering oxygen to the bottom and dramatically reducing the size of the hypoxic zone to 0.58 cubic miles.
"Tracking how nutrient levels are changing in streams, rivers, and groundwater and how the estuary is responding to these changes is critical information for evaluating overall progress in improving the health of the Bay,” said William Werkheiser, USGS associate director for water. "Local, state and regional partners rely on this tracking data to inform their adaptive management strategies in Bay watersheds."
The USGS provides the nutrient runoff and river stream data that are used in the forecast models. USGS estimates that 58 million pounds of nitrogen were transported to the Chesapeake Bay from January to May 2015, which is 29 percent below average conditions. The Chesapeake data are funded through a cooperative agreement between USGS and the Maryland Department of Natural Resources. USGS operates more than 400 real-time stream gages and collects water quality data at numerous long-term stations throughout the Chesapeake Bay basin to track how nutrient loads are changing over time.
"Forecasting how a major coastal ecosystem, the Chesapeake Bay, responds to decreasing nutrient pollution is a challenge due to year-to-year variations and natural lags," said Dr. Donald Boesch, president of the University of Maryland Center for Environmental Science. "But we are heading in the right direction."
Later this year researchers will measure oxygen levels in the Chesapeake Bay. The final measurement in the Chesapeake will come in October following surveys by the Chesapeake Bay Program's partners from the Maryland Department of Natural Resources (DNR) and the Virginia Department of Environmental Quality. Bimonthly monitoring cruise updates on Maryland Bay oxygen levels can be found on DNR’s Eyes on the Bay website.
SPOKANE, Wash. — Significant amounts of undiscovered copper may be present in northeast Asia according to a new U.S. Geological Survey report. USGS scientists evaluated the potential for copper in undiscovered porphyry copper deposits in Russia and northeastern China as part of a global mineral resource assessment. The estimate of undiscovered copper is about 260 million metric tons, which is nearly 30 times the amount of copper identified in the two known porphyry deposits in northeast Asia.
Porphyry copper deposits are the main source of copper globally. Russia is an important source of copper, consistently ranking sixth, seventh, or eighth in world production since 2000, and ranking seventh in 2014. The study area includes only two known porphyry copper deposits: (1) the world-class Peschanka deposit in the Kolyma area of interior northeastern Russia, which contains more than 7 million metric tons of identified copper resources, and (2) the Lora deposit in the Magadan area along the Pacific margin of Russia, with about 1 million metric tons of identified copper.
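A quick back-of-the-envelope check of the "nearly 30 times" comparison, using the rounded figures quoted above. The identified totals are approximate ("more than 7" and "about 1" million metric tons), so the exact ratio should not be over-read:

```python
# Undiscovered-copper estimate versus identified resources in the
# two known deposits, all in million metric tons (from the release).
undiscovered_mt = 260
peschanka_mt = 7   # "more than 7 million metric tons"
lora_mt = 1        # "about 1 million metric tons"

ratio = undiscovered_mt / (peschanka_mt + lora_mt)
print(round(ratio))  # -> 32, i.e. on the order of 30x the identified copper
```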
Five mineral resource assessment regions with geology known to be conducive to hosting porphyry-type deposits (known as permissive tracts) are delineated in the new report. The largest tract evaluated, the Pacific Margin, extends across the entire Pacific Ocean margin of Russia (inboard of the Kamchatka Peninsula) and, in addition to the known Lora deposit, contains 53 significant porphyry copper prospects, including the recently discovered Malmyzh prospect in the western Sikhote-Alin region of southeastern Russia, and at least 50 other smaller copper prospects. The geologically youngest tract, the Kamchatka-Kuril, extends from the mainland area of the Kamchatka Peninsula through the Kuril island chain and encompasses 10 significant porphyry copper prospects, in addition to at least 17 other copper occurrences. The Pacific Margin tract is similar in tectonic setting, dimensions, geologic ages, and rock types to the parts of the North American Cordillera that host numerous world-class porphyry copper deposits.
The Kolyma tract, located in the interior regions of northeast Russia, contains the known Peschanka deposit, and hosts five significant porphyry copper prospects and at least 19 other copper occurrences. The Chukotka tract, extending along the Arctic Ocean margin of northeasternmost Russia, is extremely remote, not well explored, and best known for hosting deposit types other than porphyry copper, such as mercury and tin-tungsten deposits. The geologically oldest region, the Kedon tract, a small region located in the interior of northeast Russia, is deeply eroded and metamorphosed and hosts few porphyry copper prospects compared with most of the geologically younger regions evaluated.
The full report, USGS Scientific Investigations Report 2010-5090-W, “Porphyry Copper Assessment of Northeast Asia—Far East Russia and Northeasternmost China,” is available online and includes a summary of the data used in the assessment, a brief overview of the geologic framework of the area, descriptions of the mineral resource assessment tracts and known deposits, maps, and tables. A GIS database that accompanies this report includes the tract boundaries and known porphyry copper deposits, significant prospects, and other prospects. Assessments of adjacent areas are included in separate reports, which are also available online.
This report is part of a cooperative international effort to assess the world’s undiscovered mineral resources. In response to the growing demand for information on the global mineral-resource base, the USGS conducts national and global assessments of renewable and nonrenewable resources to support decision making. Mineral resource assessments provide a synthesis of available information about where mineral deposits are known and suspected to occur in the Earth’s crust, what commodities may be present, and how much undiscovered resource could be present.
On June 18, 2015 in Canberra, Australia, the U.S. Geological Survey and Geoscience Australia signed a comprehensive new partnership to maximize land remote sensing operations and data that can help to address issues of national and international significance.
"This partnership builds on a long history of collaboration between the USGS and Geoscience Australia and creates an exciting opportunity for us to pool resources across our organizations,” said Dr. Frank Kelly, USGS Space Policy Advisor and Director of the USGS Earth Resources Observation and Science Center. “We will work collaboratively to implement a shared vision for continental-scale monitoring of land surface change using time-series of Earth observations to detect change as it happens.”
Dr. Chris Pigram, Geoscience Australia’s Chief Executive Officer, also welcomed the agreement. “This new partnership elevates an already very strong relationship to a new level, and will see both organizations harness their respective skillsets to further unlock the deep understanding of our planet that the Landsat program provides.”
Dr. Kelly and Dr. Pigram both observed, “Our shared vision is to develop systems that enable us to monitor the Earth and detect change as it happens. The ability to do this will be critical to our ability to engage with major challenges like water security, agricultural productivity, and environmental sustainability.”
A key element of the partnership involves a major upgrade to Geoscience Australia’s Alice Springs satellite antenna which will see the station play a much more significant role in the international Landsat ground-station network. Following this $3 million (AUD) upgrade committed to by the Australian Government, the Alice Springs antenna will transmit command-and-control signals to the Landsat satellites and support downloading of satellite imagery for the broader South East-Asia and Pacific region. Alice Springs will be one of only three international collaborator ground stations worldwide playing such a vital role in the Landsat program.
Dr. Kelly noted, “We are very pleased to see such a commitment from Australia to the future success and sustainability of the Landsat program. We appreciate the essential role that Australia continues to play in ensuring that Landsat data for this region is collected and then made available for societal benefit.”
The partnership will also include a strong focus on applying new science and ‘big data’ techniques, such as Geoscience Australia’s Geoscience Data Cube and the USGS’s land change monitoring, assessment, and projection capability, to help users unlock the full value of the data from the Landsat program.
Dr. Suzette Kimball, acting Director of the USGS, recently noted, “We are now beginning to see that the combination of high performance computing, data storage facilities, data preparation techniques, and advanced systems can materially accelerate the value of Landsat data.”
Dr. Kimball added, “By lowering barriers to this technology, we can enable government, research and industry users in the United States and Australia, as well as the broader world, to realize the full benefits of this open-access and freely available data.”
Scientists expect this year’s Gulf of Mexico hypoxic zone, also called the “dead zone,” to cover approximately 5,483 square miles, about the size of Connecticut, the same as its average over the last several years.
The dead zone in the Gulf of Mexico affects nationally important commercial and recreational fisheries and threatens the region's economy. Hypoxic zones hold very little oxygen, and are caused by excessive nutrient pollution, primarily from activities such as agriculture and wastewater. The low oxygen levels cannot support most marine life and habitats in near-bottom waters.
This year marks the first time the results of four models were combined. The four model predictions ranged from 4,344 to 5,985 square miles, and had a collective predictive interval of 3,205 to 7,645 square miles, which takes into account variations in weather and oceanographic conditions.
The NOAA-sponsored Gulf of Mexico hypoxia forecast has improved steadily in recent years, a result of advancements of individual models and an increase in the number of models used for the forecast. Forecasts based on multiple models are called ensemble forecasts and are commonly used in hurricane and other weather forecasts.
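As a sketch of how an ensemble estimate can be formed, the snippet below averages several model predictions and reports their spread. The four point estimates are hypothetical, chosen only so that their range matches the 4,344 to 5,985 square mile spread reported above; simple averaging stands in for NOAA's actual combination method, which is not described in this release.

```python
# Illustrative ensemble combination: average several model point
# estimates and report the spread. A simplified sketch, not NOAA's
# actual methodology.

def ensemble_forecast(predictions):
    """Combine model predictions (square miles) into a mean estimate
    plus the low and high ends of the model range."""
    mean = sum(predictions) / len(predictions)
    return mean, min(predictions), max(predictions)

# Hypothetical point estimates from four models.
models = [4344, 5985, 5300, 5600]
estimate, low, high = ensemble_forecast(models)
print(f"Ensemble estimate: {estimate:.0f} sq mi (model range {low}-{high})")
```

The appeal of the ensemble approach, as with hurricane forecasting, is that the spread across models conveys uncertainty that no single model can.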
The ensemble models were developed by NOAA-sponsored modeling teams and researchers at the University of Michigan, Louisiana State University, Louisiana Universities Marine Consortium, Virginia Institute of Marine Sciences/College of William and Mary, Texas A&M University, North Carolina State University, and the U.S. Geological Survey (USGS). The hypoxia forecast is part of a larger NOAA effort to deliver ecological forecasts that support human health and well-being, coastal economies, and coastal and marine stewardship.
“NOAA, along with our partners, continues to improve our capability to generate environmental data that can help mitigate and manage this threat to Gulf fisheries and economies,” said Kathryn D. Sullivan, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “We are adding models to increase the accuracy of our dead zone forecast.”
The Gulf of Mexico hypoxia forecast is based on nutrient runoff and river stream data from the USGS. The USGS operates more than 3,000 real-time stream gauges, 50 real-time nitrate sensors, and collects water quality data at long-term stations throughout the Mississippi River basin to track how nutrient loads are changing over time.
The USGS estimates that 104,000 metric tons of nitrate and 19,300 metric tons of phosphorus flowed down the Mississippi and Atchafalaya rivers into the Gulf of Mexico in May 2015. This is about 21 percent below the long-term (1980-2014) average for nitrogen and 16 percent above the long-term average for phosphorus.
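The percent comparisons above are straightforward to reproduce. In the sketch below, the long-term average loads are back-calculated from the percentages in this release (they are not published figures here), so treat them as approximate.

```python
# Percent deviation of a monthly nutrient load from its long-term average.

def percent_deviation(load, long_term_avg):
    """Positive = above the long-term average, negative = below it.
    Loads in metric tons."""
    return 100.0 * (load - long_term_avg) / long_term_avg

# Long-term (1980-2014) averages back-calculated from the release's
# stated percentages; approximate values for illustration only.
print(percent_deviation(104_000, 131_600))  # nitrate, roughly -21%
print(percent_deviation(19_300, 16_600))    # phosphorus, roughly +16%
```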
“Real-time nitrate sensors are advancing our understanding of how nitrate is transported in small streams and large rivers, including the main stem of the Mississippi River,” said William Werkheiser, USGS associate director for water. “Long-term monitoring is critical for tracking how nutrient levels are changing in response to management actions and for improving modeling tools to estimate which sources and areas are contributing the largest amounts of nutrients to the Gulf.”
The confirmed size of the 2015 Gulf hypoxic zone will be released in early August, following a monitoring survey led by the Louisiana Universities Marine Consortium from July 28 to August 4.
A new GPS survey of Mount McKinley, the highest point in North America, will update the commonly accepted elevation of McKinley’s peak, 20,320 ft. The last survey was completed in 1953.
The USGS, along with NOAA’s National Geodetic Survey (NGS) and the University of Alaska Fairbanks (UAF), is supporting a Global Positioning System (GPS) survey of the Mount McKinley apex. Surveying technology and processes have improved greatly since the last survey, and it is now possible to establish a much more accurate height. With the acquisition of new interferometric synthetic aperture radar (ifsar) elevation data in Alaska as part of the 3D Elevation Program, there have been inquiries about the height of the summit. The survey party is being led by CompassData, a subcontractor for Dewberry on a task awarded under the USGS’ Geospatial Products and Services Contract (GPSC).
Using modern GPS survey equipment and techniques, along with better gravity data to improve the geoid model in Alaska, the partners will be able to report the summit elevation with a much higher level of confidence than has been possible in the past. It is anticipated the newly surveyed elevation will be published by the National Geodetic Survey in late August.
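The role of the geoid model can be made concrete with the standard relation between the two height systems: GPS measures height above the reference ellipsoid (h), while map elevations are orthometric heights (H) above the geoid, related by H = h - N, where N is the geoid undulation. A better geoid model gives a better N, and hence a more trustworthy published elevation. All values in the sketch below are hypothetical; none comes from the actual survey.

```python
# Convert a GPS ellipsoidal height to an orthometric (map) height
# using the standard relation H = h - N.

def orthometric_height(ellipsoidal_height_m, geoid_undulation_m):
    """H = h - N, all values in meters."""
    return ellipsoidal_height_m - geoid_undulation_m

h = 6205.0  # hypothetical GPS ellipsoidal height of the summit, meters
n = 13.0    # hypothetical geoid undulation from the geoid model, meters
H = orthometric_height(h, n)
print(f"Orthometric height: {H:.1f} m ({H / 0.3048:.0f} ft)")
```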
An experienced team of four climbers, one from UAF and three from CompassData, will start the precarious trek to the summit with the needed scientific instruments in tow in mid-June. They plan to return on or before July 7 and begin work with the University of Alaska Fairbanks and NGS processing the data to arrive at the new summit elevation.
At 20,320 feet, Mount McKinley is North America’s highest peak. (Photo courtesy of Todd Paris, UAF) (High resolution image)
Climbing Mount McKinley, North America’s highest peak, is a daunting task for even the most experienced mountaineers at Denali National Park in Alaska. (Photo courtesy of National Geographic) (High resolution image)
The Mount McKinley survey team, and their equipment, are expected to face temperatures well below zero, high winds and frequent snow. Current forecast, courtesy of NOAA. (Photo courtesy of Todd Paris, UAF) (High resolution image)
Are you a developer, firm, or organization using mobile or web applications to enable your users? The USGS has publicly available geospatial services and data to help your application development and enhancement.
The USGS’ National Geospatial Technical Operations Center (NGTOC) will be hosting a 30-minute webinar, “Using The National Map services to enable your web and mobile mapping efforts,” on June 16 at 9 a.m. Mountain Time.
This webinar will feature a brief overview of services, data and products that are publicly available, a quick overview on how AlpineQuest, a leading private firm, is leveraging this public data to benefit their users, and a Question & Answer session with a USGS developer to help you get the most out of the national geospatial services.
“This is an opportunity from NGTOC to bring developers and users together for some demonstrations and starting some dialogue,” said Brian Fox, the NGTOC Systems Development Branch Chief. “The webinar format allows us to improve awareness of USGS geospatial services and develop a better understanding of what users and developers need to make our data and services more available and usable.”
To access the webinar, you’ll need to activate Cisco WebEx and call into the conference number (toll free) 855-547-8255 and use the security code: 98212385. The webinar will display through WebEx, and you can access it via this address: http://bit.ly/1RHayxY
The session will be recorded, and a closed-caption option is available during the webinar at: https://recapd.com/w-a3c704
To find out more about this and other NGTOC webinar conferences, go to: http://ngtoc.usgs.gov/webinars/webinar_june2015.html
Screen shot of a mobile mapping service integrating USGS topographic data; hiking and biking trails south of Golden, Colo. Imagery with road and contour data overlaid via AlpineQuest. (high resolution image 631 KB)
Screen shot of a mobile mapping service integrating USGS topographic data; hiking and biking trails south of Golden, Colo. Trail data in KML/GPX overlaid via AlpineQuest. (high resolution image 613 KB)
North America may have once been attached to Australia, according to research just published in Lithosphere and spearheaded by U.S. Geological Survey geologist James Jones and his colleagues at Bucknell University and Colorado School of Mines.
Approximately every 300 million years, the Earth completes a supercontinent cycle wherein continents drift toward one another and collide, remain attached for millions of years, and eventually rift back apart. Geologic processes such as subduction and rifting aid in the formation and eventual break-up of supercontinents, and these same processes also help form valuable mineral resource deposits. Determining the geometry and history of ancient supercontinents is an important part of reconstructing the geologic evolution of Earth, and it can also lead to a better understanding of past and present mineral distributions.
North America is a key component in reconstructions of many former supercontinents, and there are strong geological associations between the western United States and Australia, which is one of the world’s leading mineral producers.
In this study, Jones and others synthesized mineral age data from ancient sedimentary rocks in the Trampas and Yankee Joe basins of Arizona and New Mexico. They found that the ages of many zircon crystals—mineral grains that were eroded from other rocks and embedded in the sedimentary deposits—were approximately 1.6 to 1.5 billion years old, an age range that does not match any known geologic age provinces in the entire western United States.
This surprising result actually mirrors previous studies of the Belt-Purcell basin (located in Montana, Idaho and parts of British Columbia, Canada) and a recently recognized basin in western Yukon, Canada, in which many zircon ages between 1.6 and 1.5 billion years old are common despite the absence of matching potential source rocks of this age.
However, the distinctive zircon ages in all three study locations do match the well-known ages of geologic provinces in Australia and, to a lesser extent, Antarctica.
This publication marks the first time a complete detrital mineral age dataset has been compiled to compare the Belt basin deposits to strata of similar age in the southwestern United States. “Though the basins eventually evolved along very different trajectories, they have a shared history when they were first formed,” said Jones. “That history gives us clues as to what continents bordered western North America 1.5 billion years ago.”
The tectonic model presented in this paper suggests that the North American sedimentary basins were linked to sediment sources in Australia and Antarctica until the breakup of the supercontinent Columbia. The dispersed components of Columbia ultimately re-formed into Rodinia, perhaps the first truly global supercontinent in Earth’s history, around 1.0 billion years ago. Continued sampling and analysis of ancient sedimentary basin remnants will remain a critical tool for further testing global supercontinent reconstructions.