The image shows one of many possible badge designs. The final design will be selected in the coming months.
Using crowdsourcing techniques, the USGS project known as The National Map Corps (TNMCorps) encourages volunteer “citizen scientists” to collect manmade structure data such as police stations, schools, hospitals and cemeteries, in an effort to provide more precise and authoritative spatial data for the USGS web-based mapping portal known as The National Map.
In celebration of these common passions and in honor of GIS Day and International Map Year, TNMCorps is encouraging volunteers to edit 2,016 features between GIS Day 2015 and GIS Day 2016. Each submitted edit is worth one point. Volunteers who contribute 2,016 edits, thus earning 2,016 points, between November 18, 2015, and November 16, 2016, will be awarded a special-edition collectible embroidered patch.
“We’re excited about this ambitious challenge to our current and new National Map Corps members,” said Julia Fields, Deputy Director of the USGS National Geospatial Program, “and we are looking forward to seeing the patches on backpacks and jackets!”
Volunteer map editors are a fundamental component of TNMCorps and are critical to the success of the project. The project started in 2012, and since that time, an increasing number of volunteers have verified, edited, deleted, and created more than 160,000 structure points.
Volunteering for TNMCorps is a great way for folks to get involved in building maps for their communities and the nation. Volunteers not only increase their geographic knowledge through the process, they make a significant contribution to the nation’s wealth of publicly available geographic information. TNMCorps volunteers are some of the many individuals who share a passion for geography, cartography and collaborative mapping initiatives.
"Having a patch to display my contribution to The National Map would be the perfect incentive for me to reach 2,016 submissions,” said Mattson Fields, a volunteer patch designer. “What a great way to break the ice and introduce The National Map Corps to friends and acquaintances."
All you need is access to the internet and willingness to learn. If you are interested in becoming a Volunteer Map Editor and/or participating in this initiative, please visit The National Map Corps for more information.
Follow progress and updates at The National Map Twitter #TNMCorps, @gisday, @mapyear
The fungus Ophidiomyces ophiodiicola is the definitive cause of the skin infections in snakes known as snake fungal disease, or SFD, according to U.S. Geological Survey research published today in the journal mBio.
Wild snakes are valuable because they consume pests that damage agricultural crops, prey on rodents that can carry disease and serve as food for many predatory animals. However, some snake populations in the midwestern and eastern United States have declined since 2006 as a result of SFD, which produces thickened skin, ulcers and blisters. New USGS research provides the first direct evidence that O. ophiodiicola causes SFD, documents how the disease progresses and reveals how snakes respond to the infection.
“The loss of certain snake species in eastern North America could have widespread negative impacts on ecosystems,” said Jeffrey Lorch, a USGS National Wildlife Health Center scientist and the lead author of the study. “Pinpointing the SFD-causing fungus can help conserve snake populations threatened by this disease.”
The scientists infected eight healthy captive-bred corn snakes with O. ophiodiicola in the laboratory. Within days after exposure to the fungus, all snakes developed swelling followed by lesions identical to those observed in wild snakes with SFD. These lesions contained the same fungus to which the animals were exposed. Snakes that were not infected in the laboratory did not develop lesions and did not harbor O. ophiodiicola.
Most snakes responded to the fungus by repeatedly molting 15 to 20 days after exposure, but the disease caused potentially lethal behaviors that could increase their risk for predation or starvation in the wild. For example, infected snakes rested in exposed areas of their cages and some snakes were reluctant to eat. The uninfected snakes acted normally.
“These behaviors are uncharacteristic of healthy snakes and demonstrate how SFD can put snakes at risk in the wild,” Lorch said. “Climate change could promote growth of O. ophiodiicola and hinder recovery from SFD because snake immunity is highly dependent on environmental conditions.”
O. ophiodiicola has consistently been found on snakes with SFD, but this new study is the first to prove that the fungus is the actual cause of the disease. The USGS has confirmed SFD in at least seven species of snakes in nine states: Illinois, Florida, Massachusetts, Minnesota, New Jersey, New York, Ohio, Tennessee and Wisconsin.
For more information on SFD, please visit the USGS National Wildlife Health Center website.
A photo from the study area acquired in August 2015 showing thermokarst development manifest as a network of troughs forming over degrading ice wedges (left). Comparison between two airborne LiDAR datasets showing permafrost terrain subsidence in the aftermath of a large and severe Arctic tundra fire.
ANCHORAGE, Alaska – Large and severe tundra fires cause top-down permafrost thaw, playing a major role in altering Arctic landscapes, according to a new study led by the U.S. Geological Survey.
The study documented widespread thermokarst formation, characterized by subsidence of the land surface as a result of melted ground-ice, in the years following a tundra fire event. Thaw of ice-rich permafrost is known to impact terrestrial and aquatic ecosystems by altering vegetation communities and hydrology as well as releasing carbon that was previously stored in the frozen ground below.
"Thermokarst development may sound like an esoteric topic, but when ground ice melts it affects everything at the surface: formation or drainage of lakes, how much water runs off the landscape, and what kind of plants can grow," said Philip Martin, Science Coordinator for the Arctic Landscape Conservation Cooperative. "This is an essential part of the puzzle for resource managers and Arctic residents to piece together how the land is changing."
Researchers, led by the USGS, used repeat airborne LiDAR data acquisitions, a remote sensing tool that allows for the creation of highly detailed topographic models of the landscape, to quantify thermokarst development in the aftermath of the 2007 Anaktuvuk River tundra fire. By comparing data obtained two years and again seven years post-fire, researchers determined that thermokarst affected more than 34 percent of the studied burned tundra area, compared with less than one percent in similar unburned tundra.
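At its core, the repeat-LiDAR comparison is an elevation-difference (DEM differencing) calculation. The sketch below is illustrative only: the synthetic grids, dates, and 10 cm subsidence threshold are assumptions for demonstration, not values from the study.

```python
import numpy as np

def subsided_fraction(dem_early, dem_late, threshold=-0.10):
    """Fraction of grid cells that dropped more than `threshold` meters
    between two co-registered elevation models (negative = surface lowering)."""
    diff = dem_late - dem_early   # per-cell elevation change in meters
    subsided = diff < threshold   # cells showing meaningful subsidence
    return subsided.mean()

# Synthetic DEMs standing in for the two- and seven-year post-fire surveys
rng = np.random.default_rng(0)
dem_early = rng.normal(100.0, 0.02, size=(500, 500))
dem_late = dem_early.copy()
dem_late[:200, :] -= 0.5          # impose 0.5 m of subsidence on 40% of cells

print(f"{subsided_fraction(dem_early, dem_late):.0%}")  # → 40%
```

In practice the two surveys would first be co-registered and differenced over the burned and unburned areas separately, yielding the kind of 34-percent-versus-one-percent contrast reported above.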
Arctic tundra fires are known to have an immediate and severe impact on the landscape through combustion of vegetation and soil organic layers. However, widespread thermokarst development in the aftermath of an Arctic tundra fire had not been previously measured in detail.
"With LiDAR data acquisitions, we are able to document landscape changes in a measurable way like never before," said lead author Benjamin Jones, a Research Geographer with the USGS. "It is likely that the impact of fires and other disturbances on permafrost-influenced terrain in the Arctic has been underestimated since highly precise elevation data, such as from LiDAR datasets, are not widely available in these regions."
The paper "Recent Arctic tundra fire initiates widespread thermokarst development" was published in the journal Scientific Reports, the online open access journal from the publishers of Nature.
The work was supported by the Land Change Science and Land Remote Sensing programs at the USGS, the USGS Alaska Science Center, the Arctic Landscape Conservation Cooperative, the University of Alaska Fairbanks, and the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research.
Heidi Koontz (Phone: 303-202-4763)
According to new U.S. Geological Survey research published today in the Proceedings of the National Academy of Sciences USA, springs and marshes in the desert outside Las Vegas expanded and contracted dramatically in response to past episodes of abrupt climate change, even disappearing altogether for centuries at a time when conditions became too warm. This new record, gleaned from dirt and rocks exposed in the desert just outside the city limits, provides an unprecedented look into how climate change can affect fragile desert ecosystems in the American Southwest.
Kathleen Springer, a geologist with the USGS and former Senior Curator at the San Bernardino County Museum, was the principal investigator and lead scientist for this study showing that desert wetlands are extremely sensitive to climate change.
“This is a story of water,” said Springer. “Water was plentiful in the desert at times in the past, but when climate warmed, springs and wetlands dried up, and the plants and animals living in the harsh desert environment were out of luck.”
During the Pleistocene, between approximately 100,000 and 10,000 years ago, wetlands dotted the landscape in the area just north of Las Vegas, attracting a plethora of ice age animals, including mammoths, sloths, saber-toothed cats, dire wolves, and extinct species of bison, horse, and camel, and later, the first human inhabitants, to the area.
Today, existing desert wetlands are home to a number of threatened and endangered species that rely on the ecosystem for water in an otherwise arid landscape. Their fate may lie in the hands of a rapidly changing climate.
“What we're seeing in the geologic record frames what we are observing today,” said Springer. "The drought that California is currently experiencing is extreme, but droughts are an inherent part of the climate system and have occurred repeatedly in the past."
The study was initiated by the Bureau of Land Management, which called for an integrative approach to studies that emphasize the geological age and context of fossils, as well as a comprehensive analysis of how local hydrologic systems responded to climate change in the past.
“Scientists collect fossils all the time,” said Scott Foss, a senior paleontologist with the BLM. "What is remarkable about this work is the vision that Kathleen had of making sure her team understood the intricacies of the deposits in incredible detail, which allowed them to determine how climate affected the local landscape. It was an immense undertaking, and one that will serve as a benchmark for generations to come for those interested in understanding the effects of climate change on desert ecosystems.”
Studies examining the effects of climate change on springs and desert wetlands will continue through the USGS’s Climate and Land Use Change Research and Development Program, and will build upon the investigations conducted in the Las Vegas Valley, a large portion of which is now protected as Tule Springs Fossil Beds National Monument.
When the monument was established in December 2014, the BLM turned stewardship over to the National Park Service, which will determine how to interpret the unique land and its former inhabitants for the public.
“The future of this newly designated national monument and what it can tell us about the effects of climate change is all about the past,” said Springer. “And the past is the key to the present.”
A cow in a pasture taken in the Central Oklahoma/Texas Plains Ecoregion (TX) in 2006. An abandoned mine shaft taken in the Mojave Basin and Range Ecoregion (NV) in August 2002.
The U.S. Geological Survey announced today that it has made part of a huge national repository of geographically referenced USGS field photographs publicly available. USGS geographers developed a simple, easy-to-use mapping portal called the Land Cover Trends Field Photo Map.
The entire collection contains over 33,000 geo-referenced field photos with associated keywords describing the land-use and land-cover change processes taking place. Initially, nearly 13,000 photos from across the continental U.S. will be available to the public, and the online collection will grow as more processed photos become available.
“This is a treasure trove of royalty and copyright-free photography collected using consistent procedures,” said Chris Soulard, project leader and USGS research geographer. “We envision that these photos will captivate general audiences and fulfill a myriad of scientific needs.”
Sharing these unique field photos provides an excellent resource for the scientific community, with potential to support future research such as repeat photography projects or applications in which photos validate remote sensing classifications. Serving USGS data interactively to the public is integral to the USGS mission and provides opportunities for future scientific collaboration by communicating USGS land change research to the broader public and scientific community.
“The benefit of these photos being hosted by the USGS is equal access for all, freedom from copyright concerns, and quality control,” said Jason Sherba, USGS geographer and project web developer.
The photography was collected as part of the USGS National Land Cover Trends Project, a research effort that spanned more than ten years and represented one of the USGS' largest cross-center research efforts. The project employed Landsat imagery from 1973 to 2000 to derive rates, causes, and consequences of contemporary land-use/land-cover change. Photos were collected between 1999 and 2007 to serve as an aid in Landsat-derived land-use/land-cover change analyses and assessments.
Screenshot of the Land Cover Trends Field Photo Map website.
An aerial view looking southeast of the treated-wastewater infiltration beds at Joint Base Cape Cod. The wastewater disposal beds (source of the nitrogen contamination) appear in the foreground. In the background is a freshwater pond that is receiving discharge of some of the groundwater contaminants. Toxic waste disposal at the site ended in 1995.
USGS scientists have conducted the first-ever field measurements of anammox activity in groundwater, demonstrating that nitrogen removal from groundwater can occur through the action of naturally occurring bacteria. This research was conducted in collaboration with partners from the Virginia Institute of Marine Science and the University of Connecticut.
Anammox, shorthand for anaerobic ammonium oxidation, is a process carried out by naturally occurring bacteria that can simultaneously remove ammonium and reduce nitrogen oxides (such as nitrate and nitrite), combining the two to produce harmless nitrogen gas.
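In chemical terms, the overall anammox stoichiometry pairs one ammonium ion with one nitrite ion to yield nitrogen gas and water:

```latex
\mathrm{NH_4^+} + \mathrm{NO_2^-} \longrightarrow \mathrm{N_2} + 2\,\mathrm{H_2O}
```

Both nitrogen atoms in the product gas come from the dissolved contaminants, which is why the process removes fixed nitrogen outright rather than merely converting it between forms.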
Over the past 100 years, humans have drastically altered the global nitrogen budget by fixing nitrogen gas from air to produce fertilizer in the form of ammonium and nitrate. Nitrate and ammonium are now prevalent fixed nitrogen contaminants that may be found in surface water and groundwater worldwide. Until fixed nitrogen is converted back to nitrogen gas, it remains as a potential water contaminant. Anammox and denitrification are the only two processes that can remove excess fixed nitrogen by chemically changing it back to nitrogen gas.
“Virtually all terrestrial and aquatic environments now contain extra fixed nitrogen from human activities, including groundwater, the planet’s primary freshwater resource,” said Richard Smith, a USGS research hydrologist and lead author of the investigation.
Discovered just 20 years ago in wastewater treatment systems, anammox has been studied since then in laboratory settings using enrichment cultures. Relatively recently, anammox was found to be ecologically important in marine and other surface water environments.
“Because anammox is a process that can supply its own organic carbon by fixing carbon dioxide,” Smith continued, “naturally occurring anammox bacteria are ideally suited for life in groundwater, where they could potentially be important for fixed nitrogen removal. While practical applications are still in the distant future, this process could be particularly important where groundwater is discharging to surface waters and coastal environments.”
Working at a carefully monitored USGS groundwater study site at Cape Cod, Massachusetts, the research team found that anammox was active in the subsurface in a variety of geochemical conditions, even where groundwater ammonium concentrations were low. The rates of activity were relatively low, but anammox could potentially affect inorganic nitrogen concentrations in situations where groundwater residence times are sufficiently long.
The detailed findings of the investigation were recently published in the journal Environmental Science and Technology.
The paper documents the competition between anammox and denitrification for nitrogen oxides and explores the effect of altered organic carbon concentrations on that competition. The results of this study indicate that anammox does occur in groundwater, that it can be an important mechanism for fixed nitrogen removal, and that it should be included when interpreting subsurface geochemistry and constructing groundwater nitrogen budgets.
Co-authors on the study include USGS scientist J.K. Böhlke; Bongkeun Song from the Virginia Institute of Marine Science; and Craig Tobias from the University of Connecticut, Department of Marine Science. The National Science Foundation provided additional research support.
Research article, Environmental Science and Technology
USGS Toxic Substances Hydrology Program
USGS Cape Cod Toxic Substances Hydrology Research Site
USGS National Research Program
According to the first-ever study of pesticide residues on field-caught bees, native bees are exposed to neonicotinoid insecticides and other pesticides. The study was conducted by the U.S. Geological Survey and published in the journal Science of the Total Environment.
This research focused on native bees, because there is limited information on their exposure to pesticides. In fact, little is known about how toxic these pesticides are to native bee species at the levels detected in the environment. This study did not look at pesticide exposure to honey bees.
“We found that the presence and proximity of nearby agricultural fields was an important factor resulting in the exposure of native bees to pesticides,” said USGS scientist Michelle Hladik, the report’s lead author. “Pesticides were detected in the bees caught in grasslands with no known direct pesticide applications.”
Although conservation efforts have been shown by other investigators to benefit pollinators, this study raises questions about the potential for unintended pesticide exposures where various land uses overlap or are in proximity to one another.
The research consisted of collecting native bees from cultivated agricultural fields and grasslands in northeastern Colorado, then processing the composite bee samples to test for 122 different pesticides, as well as 14 chemicals formed by the breakdown of pesticides. Scientists tested for the presence of pesticides both in and on the bees.
The most common pesticide detected was the neonicotinoid insecticide thiamethoxam, which was found in 46 percent of the composite bee samples. Thiamethoxam is used as a seed coating on a variety of different crops. Pesticides were not found in all bee samples, with 15 of the 54 total samples testing negative for the 122 chemicals examined.
Although this study did not investigate the effects of pesticide exposures to native bees, previous toxicological studies have shown that the chemicals do not have to kill the bees to have an adverse effect at the levels of exposure documented here. For example, neonicotinoids can cause a reduction in population densities and reproductive success, and impair the bees’ ability to forage. Follow-up research is now being designed to further investigate adverse effects at these exposure levels.
There are about 4,000 native species of bees in the United States. They pollinate native plants like cherries, blueberries and cranberries, and were here long before European honeybees were brought to the country by settlers. In addition, many native bees are quite efficient crop pollinators, a role that may become more crucially important if honey bees continue to decline.
This paper is a preliminary, field-based reconnaissance study that provides critical information necessary to design more focused research on exposure, uptake and accumulation of pesticides relative to land use, agricultural practices and pollinator conservation efforts on the landscape. Another USGS study published in August discovered neonicotinoids in a little more than half of both urban and agricultural streams sampled across the United States and Puerto Rico.
“This foundational study is needed to prioritize and design new environmental exposure experiments on the potential for adverse impacts to terrestrial organisms,” said Mike Focazio, program coordinator for the USGS Toxic Substances Hydrology Program. “This and other USGS research is helping support the overall goals of the White House Strategy to Promote the Health of Honey Bees and Other Pollinators by helping us understand whether these pesticides, particularly at low levels, pose a risk for pollinators.”
More information about this paper can be found here. USGS research on the occurrence, transport and fate of pesticides can be found on the USGS Toxic Substances Hydrology Program webpage or the USGS Pesticide Fate Research project in California. Stay up to date with USGS Environmental Health science by signing up for our GeoHealth Newsletter.
Soil acidification caused by acid rain, which is harmful to plant and aquatic life, has now begun to reverse in forests of the northeastern United States and eastern Canada, according to an American-Canadian collaboration of five institutions led by the U.S. Geological Survey.
The new research shows that these changes are strongly linked to acid rain decreases, although some results differ from expected responses.
"Reduced acid rain levels resulting from American and Canadian air-pollution control measures have begun to reverse soil acidification across this broad region," said Gregory Lawrence, a USGS soil and water chemist and lead author. "Prior to this study, published research on soils indicated that soil acidification was worsening in most areas despite several decades of declining acid rain. However, those studies relied on data that only extended up to 2004, whereas the data in this study extended up to 2014."
As acid rain acidifies soils, it depletes soil calcium reserves, which are important in preventing the formation of aluminum that is toxic to plants and aquatic life. Calcium is also a nutrient essential for healthy ecosystems. Results of this study show that soils are no longer being depleted of calcium and that toxic aluminum levels have substantially decreased.
The uppermost soil layers have shown a strong recovery response, but deeper layers are actually increasing in aluminum, which suggests further acidification. However, this may be part of the recovery process as aluminum moves downward in the soil to be stored in a non-toxic form.
"The start of widespread soil recovery is a key step to remedy the long legacy of acid rain impacts on terrestrial and aquatic ecosystems," according to Lawrence.
The results were obtained by resampling soils that had been originally sampled eight to 24 years earlier. The collaboration among the USGS, U.S. Department of Agriculture, U.S. Forest Service, University of Maine, Canadian Forest Service and the Quebec Ministry of Forests, Wildlife and Parks, was developed through the Northeast Soil Monitoring Cooperative, a group of scientists focused on how soils are responding to our rapidly changing environment.
The study is available online. Lawrence, G. B., P. W. Hazlett, I. J. Fernandez, R. Ouimet, S. W. Bailey, W. C. Shortle, K. T. Smith, and M. R. Antidormi. 2015. Declining Acidic Deposition Begins Reversal of Forest-Soil Acidification in the Northeastern U.S. and Eastern Canada. Environmental Science & Technology.
On average, streams in the Niobrara-Mowry Play of eastern Wyoming, the Fayetteville Play of Arkansas, and the Barnett Play of Texas ranked most vulnerable to unconventional oil and gas development, but for different reasons, according to recent research coauthored by the U.S. Geological Survey.
Streams in the Fayetteville and Barnett plays were vulnerable mostly because of existing man-made stressors, whereas streams in the Niobrara-Mowry were vulnerable largely due to their natural sensitivity to alteration. However, the study also shows that streams in all regions have the potential to be impacted by such development.
A team of academic, USGS, and private-sector researchers computed potential stream vulnerability to unconventional oil and gas development in six shale plays, including the Bakken, Barnett, Fayetteville, Hilliard-Baxter-Mancos, Marcellus and Utica, and Niobrara-Mowry. The newly developed vulnerability index shows that streams with the highest sensitivity and exposure to stressors may be most vulnerable to unconventional oil and gas development.
"Stream ecosystems show variation in potential vulnerability to unconventional oil and gas development across the contiguous United States," said Kelly Maloney, USGS research ecologist and coauthor of the study. "The index we developed incorporated a stream ecosystem's natural sensitivity to alterations and its exposure to man-made stressors, such as well pads, urbanization and agriculture."
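As a rough illustration of how such a composite index can be assembled (the paper's actual components and weighting are not reproduced here; min-max rescaling and equal weighting are assumptions), sensitivity and exposure scores might be combined as:

```python
import numpy as np

def vulnerability_index(sensitivity, exposure):
    """Illustrative composite index: rescale each component to 0-1
    across streams, then average them (equal weighting assumed)."""
    def rescale(x):
        x = np.asarray(x, dtype=float)
        return (x - x.min()) / (x.max() - x.min())
    return (rescale(sensitivity) + rescale(exposure)) / 2

# Hypothetical scores for four streams
sens = [0.2, 0.9, 0.5, 0.1]   # natural sensitivity to alteration
expo = [0.8, 0.7, 0.1, 0.2]   # man-made stressors (well pads, urbanization, ...)
print(vulnerability_index(sens, expo).round(2))
```

Under this scheme a stream scores highest when it is both naturally sensitive and heavily exposed, which mirrors the study's finding that the same overall ranking can arise from different underlying causes.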
What made areas potentially vulnerable varied across plays due to climatic, geologic and human-caused differences. Low annual precipitation in the drier regions of the western United States (Niobrara-Mowry, Hilliard-Baxter-Mancos, and Bakken) affected stream vulnerability to unconventional oil and gas development. In contrast, the steeper slopes in the watersheds of Appalachia made streams in the Marcellus-Utica play naturally sensitive. The Barnett and Marcellus regions had areas with greater urbanization than other plays.
"The indices developed in this paper can be used to identify streams where aquatic life is particularly vulnerable, and then help prioritize stream protection and monitoring efforts," said Maloney. "These findings can also be used to guide local development activities to help reduce potential environmental effects."
Research partners in this study included the University of Central Arkansas, Waterborne Environmental Inc., University of Arkansas and Wilkes University.
The paper "Stream vulnerability to widespread and emergent stressors: a focus on unconventional oil and gas" is available in PLOS ONE, which is an open-access, peer-reviewed scientific journal and can be downloaded free of charge online.
A female grizzly with a cub. Adult females are considered the most important segment of the Greater Yellowstone grizzly population and consequently are a major focus of USGS research and monitoring. USGS biologists collecting biological information from a grizzly bear they have captured. Biologists collect hair samples for genetic analysis, weigh the bear, and gather numerous measurements of the body, such as the head, paws, claws and teeth. Overall condition of the bear is assessed as well, including a body fat measurement. Grizzly bear hair on barbed wire at a hair corral. Bears climb over or under the corral and the hair collected provides information for genetic analysis.
BOZEMAN, Mont. – Genetic data show the grizzly bear population in the Greater Yellowstone Ecosystem has grown since the 1980s with no loss in genetic diversity, according to a report by the Interagency Grizzly Bear Study Team.
Results indicate that the effective population size of Yellowstone grizzly bears, or the number of individuals that contribute offspring to the next generation, has increased 4-fold over a 25-year period. This provides evidence that Yellowstone grizzly bears are approaching the effective size necessary for long-term genetic viability.
"The increase in effective size of the Yellowstone grizzly bear population over the past several decades, with no significant change in genetic diversity, supports evidence of population growth based on traditional surveys," said Pauline Kamath, USGS ecologist and lead author of the study. "This is a key genetic indicator of a population’s ability to respond to future environmental change."
Researchers used several newly available techniques to assess trends in effective population size from a sample of 729 grizzly bears in the Greater Yellowstone Ecosystem, a region slightly smaller than South Carolina. Based on one of several methods, they found estimates of effective population size increased from approximately 100 bears in the 1980s to 450 in the 2000s. These numbers are smaller than estimates of total population size because not all animals in the population breed. Although an isolated population, grizzly bear genetic diversity remained stable and inbreeding was relatively low, 0.2 percent, over the time period.
The application of these new methods to monitor trends in effective population size of wildlife has been limited because effective size is difficult to measure and requires long-term data on individuals in the population. The isolated and well-studied population of Yellowstone grizzly bears provided a rare opportunity to examine the usefulness of this technique for monitoring a threatened species because of the breadth of genetic and demographic (gender and age) data that have been collected over decades. Grizzly bear populations in the lower 48 states were listed as threatened in 1975 under the Endangered Species Act.
"For long-lived species such as grizzly bears, a concerted effort is required to collect long-term genetic and associated demographic data. Four decades of intensive research on Yellowstone grizzly bears presented a unique opportunity to evaluate and compare genetic estimators for monitoring of wildlife populations," said Frank van Manen, USGS wildlife biologist and Team Leader of the Interagency Grizzly Bear Study Team.
The study demonstrates how genetic monitoring can complement traditional demographic-based monitoring, providing valuable tools for wildlife managers for current and future studies. It also underscores the effectiveness of long-term studies that provide detailed data to support a variety of analyses, providing researchers and managers a better picture of the status of populations of interest.
The article "Multiple estimates of effective population size for monitoring a long-lived vertebrate: an application to Yellowstone grizzly bears" is published in Molecular Ecology.
The study is a collaborative effort between the USGS, Wildlife Genetics International, the University of Montana, and the federal, state and tribal partners of the Interagency Grizzly Bear Study Team.
More information about Yellowstone grizzly bear studies can be found on the USGS Northern Rocky Mountain Science Center website.
The invasive northern snakehead fish found in the mid-Atlantic area is now cause for more concern, potentially bringing diseases into the region that may spread to native fish and wildlife, according to a team of U.S. Geological Survey scientists.
The team found that a group of adult northern snakehead collected from Virginia waters of the Potomac River south of Washington D.C. were infected with a species of Mycobacterium, a type of bacteria known to cause chronic disease among a wide range of animals.
"Mycobacterial infections are not unusual among fish, but they are nonetheless noteworthy because they can have an impact at the population level and potentially even affect other fish and wildlife," said lead author Christine Densmore, a veterinarian with the USGS.
Many species of Mycobacterium have been identified in fish, including fish from the Potomac River and Chesapeake Bay area. Several years ago, mycobacterial infections were associated with severe disease, typified by ulcerative skin lesions and wasting, among wild striped bass from the Chesapeake Bay and its tributaries. Some species of Mycobacterium are also known to cause disease in other animals, including mammals. For instance, Mycobacterium tuberculosis causes human tuberculosis, and Mycobacterium paratuberculosis causes Johne’s disease in cattle.
The effect of this particular species of Mycobacterium on humans is not known.
Mycobacterial disease in fish is often called piscine mycobacteriosis, and it is associated with many different species of mycobacteria. In this instance, no external signs of disease were noted on the infected snakehead fish, and the disease was discovered microscopically as lesions associated with the bacteria that were visible within internal organs.
"Another interesting feature of this particular mycobacterial organism is that we have not been able to identify it in the available gene sequence database, so this may be a unique, undescribed species of Mycobacterium. However, more research is needed to further characterize the bacteria and its potential effects on the northern snakehead population and other native species," said Densmore.
The researchers plan to continue to work closely with other federal and state agencies to investigate the pathogens and diseases carried by the northern snakehead fish in mid-Atlantic waters such as the Potomac River. This study of Mycobacterial infection in Northern snakehead from the Potomac River catchment, conducted in collaboration with fisheries biologists from the Virginia Department of Game and Inland Fisheries, is available online through the Journal of Fish Diseases.
The Maumee River flows through northwest Ohio into Lake Erie. Nitrate concentrations in the Maumee River increased rapidly between 1945 and 1980 as nitrogen inputs from fertilizer and livestock increased. Since 1980, changes in nitrate levels have been much smaller as nitrogen inputs leveled off. (High resolution image)
During 1945 to 1980, nitrate levels in large U.S. rivers increased up to fivefold in intensively managed agricultural areas of the Midwest, according to a new U.S. Geological Survey study. In recent decades, nitrate changes have been smaller and levels have remained high in most of the rivers studied.
The greatest increases in river nitrate levels coincided with increased nitrogen inputs from livestock and agricultural fertilizer, which grew rapidly from 1945 to 1980. In some urbanized areas along the East and West coasts during the same period, river nitrate levels doubled. Since 1980, nitrate changes have been smaller as the increase in fertilizer use has slowed in the Midwest and large amounts of farmland have been converted to forest or urban land along the East coast.
"Long-term monitoring of 22 large U.S. rivers provides a rare glimpse into how water quality conditions have changed over the last 65 years," said Edward Stets, lead author of the study. "Although the greatest increases in nitrate concentrations occurred prior to 1980, levels have since remained high in most rivers. Unfortunately, there is no widespread evidence of improving conditions."
High nitrate levels can lead to the formation of zones of low oxygen in coastal waters, which harms fisheries, recreational use, and ecological habitat, causing major economic impacts. High nitrate levels also pose a threat to drinking-water supplies, sometimes resulting in high water treatment costs, and can harm aquatic life.
The USGS study, reported in the Journal of the American Water Resources Association, includes rivers flowing into the Great Lakes and coastal waters such as Long Island Sound, Delaware River estuary, Chesapeake Bay, San Francisco Bay, and the Gulf of Mexico.
Long-term monitoring of water quality is essential to track how changes in land use, climate, and water-quality management actions are impacting both local streams and rivers and valuable commercial and recreational fisheries in estuaries across the Nation. The USGS National Water-Quality Program is working on more detailed analysis of water quality trends within the past 10 to 50 years in small and large rivers across the Nation.
Information on USGS long-term water-quality monitoring can be accessed online.
Alligators and the Everglades go hand-in-hand, and as water conditions change in the greater Everglades ecosystem, gators are one of the key species that could be affected.
A recent study conducted by researchers from the U.S. Geological Survey, U.S. Fish and Wildlife Service and the University of Florida found the number of American alligators observed in the Arthur R. Marshall Loxahatchee National Wildlife Refuge dropped following dry years, and then appeared to recover in later non-dry years. The decrease in alligators appeared proportional to the intensity of the dry event. The refuge is located west of Boynton Beach, within the greater Everglades ecosystem.
"Alligator behavior and habitat use is linked to hydrology, and when that hydrology changes, alligator behavior and habitat can change," said USGS research ecologist Hardin Waddle, lead author of the study. "They don’t need it wet all the time, but if dry events increase in frequency and intensity, this could be problematic for alligator numbers in the greater Everglades ecosystem."
Ten years of spotlight night counts in marsh and canals were analyzed to better understand the effect of annual minimum water depth on annual population growth rate. Years were considered dry if they experienced a drop in water level to 6 inches above the marsh surface. At this water level, alligators have difficulty moving around.
Dry conditions can cause male alligators to use up more energy to locate mates, disrupt the ability of females to nest and could result in death due to over-competition for resources and even cannibalism in crowded areas, explained Waddle.
The Everglades is currently the site of one of the world’s largest wetland restoration efforts. The ecosystem, encompassing nearly 4 million acres from near Orlando to Florida Bay, is threatened by a number of disturbances including changes in hydrology and land use. Much of the remaining areas, including Water Conservation Areas, are now influenced by water management for water supply and flood control. Restoring the natural hydrology to improve ecosystem function is one focus of the Comprehensive Everglades Restoration Plan (CERP).
Because alligators are sensitive to hydrology, CERP is using alligator populations to help determine ecosystem response and the success of the restoration project. The study results have important implications for helping managers with restoration and water management planning.
"Long-term data sets like the one used in this study offer invaluable insight into what is happening in an ecosystem and provide us the knowledge to build flexibility into Everglades restoration and management," said U.S. Fish and Wildlife Service regional scientist and study co-author Laura Brandt. "Understanding both annual patterns and long-term trends helps us with water management decisions within the refuge and other natural areas throughout South Florida."
The study can be found online.
Badges awarded for submitting edits, starting at the top of the circle and proceeding clockwise, with the points needed for each level: Order of the Surveyor’s Chain (25-49), Society of the Steel Tape (50-99), Pedometer Posse (100-199), Circle of the Surveyor’s Compass (200-499), Stadia Board Society (500-999), Alidade Alliance (1,000-1,999), Theodolite Assemblage (2,000-2,999), Family of Floating Photogrammetrists (3,000-3,999), Flock of Winged Witnesses (4,000-4,999), Ring of Reconnaissance Rocketeers (5,000-5,999) and Squadron of Biplane Spectators (6,000+). Credit: TNMCorps. (High resolution image)
Volunteers are being recognized and earning custom badges for making significant contributions to the U.S. Geological Survey's ability to provide accurate and timely information to the public. Using crowdsourcing techniques, the USGS project known as The National Map Corps (TNMCorps) encourages volunteer “citizen scientists” to collect manmade structure data such as police stations, schools, hospitals and cemeteries, in an effort to provide more precise and authoritative spatial data for the USGS web-based mapping portal known as The National Map and updated US Topo map products.
"Each badge displays classic surveying tools that played a vital role in gathering geographic information in the history of the USGS," said Ricardo Oliveira, TNMCorps student contractor. "After all, our volunteers are modern surveyors working towards a better understanding of the nation’s geographic assets."
Volunteer map editors are a fundamental component of TNMCorps and are critical to the success of the project. The project started in 2012, and since that time, an increasing number of volunteers have verified, edited, deleted, and created more than 160,000 structure points. In appreciation for the efforts of these "free" mappers, those who reach certain milestones are celebrated in the form of virtual badges.
The newly designed badges showcase the same classic surveying tools and aerial data collection methods, but have been colorfully updated and highlight a variety of amazing landscapes across the United States. These badges are sure to be proudly displayed by any TNMCorps volunteer.
A second set of badges based on aerial data collection was introduced about a year ago as some extra-energetic volunteers quickly surpassed the first set of badge levels. Currently, there are 11 possible badges that can be earned, beginning with the Order of the Surveyor’s Chain (25 points) and ending with the Squadron of Biplane Spectators (6,000+ points). As volunteer map editors attain each level, a congratulatory email is sent to the awardee with a description of the badge and encouragement to achieve the next level. With permission, volunteer accomplishments are highlighted on the TNMCorps Recognition page and The National Map Twitter (#TNMCorps).
It is easy to become a USGS TNMCorps volunteer map editor. All you need is access to the internet and a willingness to learn. Visit The National Map Corps for more information.
General view of a 35-meter-high riverbank exposure of the ice-rich syngenetic permafrost (yedoma) containing large ice wedges along the Itkillik River in northern Alaska. Copyright-free photo courtesy Mikhail Kanevskiy; University of Alaska Fairbanks, Institute of Northern Engineering; 8/13/2011. (High resolution image)
Researchers from the U.S. Geological Survey and key academic partners have quantified how rapidly ancient permafrost decomposes upon thawing and how much carbon dioxide is produced in the process.
Huge stores of organic carbon in permafrost soils — frozen for hundreds to tens of thousands of years across high northern latitudes worldwide — are currently isolated from the modern-day carbon cycle. However, if thawed by changing climate conditions, wildfire, or other disturbances, this massive carbon reservoir could decompose and be emitted as the greenhouse gases carbon dioxide and methane, or be carried as dissolved organic carbon to streams and rivers.
"Many scientists worldwide are now investigating the complicated potential end results of thawing permafrost," said Rob Striegl, USGS scientist and study co-author. "There are critical questions to consider, such as: How much of the stored permafrost carbon might thaw in a future climate? Where will it go? And, what are the consequences for our climate and our aquatic ecosystems?"
At a newly excavated tunnel operated by the U.S. Army Corps of Engineers near Fairbanks, Alaska, a research team from USGS, the University of Colorado Boulder, and Florida State University set out to determine how rapidly the dissolved organic carbon from ancient (about 35,000 years old) “yedoma” soils decomposes upon soil thaw and how much carbon dioxide is produced.
Yedoma is a distinct type of permafrost soil found across Alaska and Siberia that accounts for a significant portion of the permafrost soil carbon pool. These soils were deposited as wind-blown silts in the late Pleistocene age and froze soon after they were formed.
"It had previously been assumed that permafrost soil carbon this old was already degraded and not susceptible to rapid decomposition upon thaw," said Kim Wickland, the USGS scientist who led the team.
The researchers found that more than half of the dissolved organic carbon in yedoma permafrost was decomposed within one week after thawing. About 50% of that carbon was converted to carbon dioxide, while the rest likely became microbial biomass.
Map of the northern circumpolar permafrost zone, highlighting the extent of the yedoma permafrost region (indicated in yellow and red). Map image and copyright permission courtesy of Macmillan Publishers Ltd, from NATURE, Schuur et al., 2015, Climate change and the permafrost carbon feedback, doi:10.1038/nature14338, copyright 2015. (High resolution image)
"What this study adds is that we show what makes permafrost so biodegradable," said Travis Drake, the lead author of the research. "Immediately upon thaw, microbes start using the carbon and then it is sent back into the atmosphere." Drake was both a USGS employee and a master’s degree student at the University of Colorado during the investigation.
The researchers attribute this rapid decomposition to high concentrations of low molecular weight organic acids in the dissolved organic carbon, which are known to be easily degradable and are not usually present at high concentrations in other soils.
These are among the fastest permafrost decomposition rates that have been documented, and this is the first study to link rapid microbial consumption of ancient permafrost soil-derived dissolved organic carbon to the production of carbon dioxide.
An important implication of the study for aquatic ecosystems is that dissolved organic carbon released by thawing yedoma permafrost will be quickly converted to carbon dioxide and emitted to the atmosphere from soils or small streams before it can be transported to major rivers or coastal regions.
This research was recently published in the Proceedings of the National Academy of Sciences. The National Science Foundation’s Division of Polar Programs provided essential support for the investigation.
Working throughout the Mississippi River basin, USGS scientists and collaborators from the University of Texas at Austin have established the river’s own potential to decrease its load of nitrate and identified how certain basic river management practices could increase that potential.
"Increasing nitrogen concentrations, mostly due to the runoff of agricultural fertilizers, in the world's major rivers have led to over-fertilization of waters downstream, diminishing their commercial and recreational values,” said William Werkheiser, USGS associate director for water. “Understanding the natural potential of rivers themselves to remove nitrogen from the water, and boosting that potential, is a promising avenue to help mitigate the problem."
Beneath all streams and rivers is a shallow layer of sediment that is permeated by water exchange across the sediment surface. This thin region in the sediment beneath and to the side of the stream is referred to by scientists as the "hyporheic" zone, from Greek words meaning "under the flow."
"We’ve found in previous studies,” said Jesus Gomez-Velez, lead author of the study, “that the flow of stream water through this thin zone of sediment enhances chemical reactions by microbes that perform denitrification, a reaction that removes nitrogen from the aquatic system by converting it to nitrogen gas.” A USGS post-doctoral scientist at the time of the study, Gomez-Velez is now an assistant professor at the New Mexico Institute of Mining and Technology.
The research team determined that, throughout the Mississippi River network, vertical hyporheic exchange (with sediments directly beneath streams and rivers) has denitrification potential that far exceeds lateral hyporheic exchange with bank sediments.
"Rivers with more vertical exchange are more efficient at denitrification, as long as the contact time with sediment is matched with a reaction time of several hours," observed co-author Jud Harvey, the USGS team leader for the study.
The study findings suggest that managing rivers to help avoid the sealing of streambeds with fine sediments, which decreases hyporheic flow, would help exploit the valuable natural capability of rivers to improve their own water quality. Other river management and restoration practices that protect permeable river bedforms, such as reducing fine-sediment runoff to rivers, could also boost efficiency.
However, typical river channel restoration strategies that realign channels to increase meandering would not be as effective, because a comparatively small amount of water and river nitrate are processed through river banks compared with river beds. Although not yet tested in the model, allowing natural flooding over river banks onto floodplains may also be an effective means of processing large amounts of river water to remove nitrogen before it reaches sensitive coastal waters.
Conducted by USGS and partners from the New Mexico Institute of Mining and Technology and the University of Texas at Austin, the research investigation was recently published in the journal Nature Geoscience.
The river corridor includes surface and subsurface sediments beneath and outside the wetted channel. Greater interaction between river water and sediment enhances important chemical reactions, such as denitrification, that improve downstream water quality. (high resolution image)
Stream and river water make many excursions through hyporheic flow paths. The metrics in the diagram key denote the number of excursions that water makes through hyporheic flow paths per kilometer of river distance. Vertical exchange through streambed hyporheic flow paths is much more efficient compared with exchange through lateral (stream bank) hyporheic flow paths. Also, hyporheic exchange is less efficient in the Upper Mississippi River sub-basin compared with the Missouri or Ohio sub-basins. The primary reasons for different hyporheic flow efficiencies are differences in river basin slope and sediment textures that permit greater hyporheic flow in some areas compared to others. (high resolution image)
During the historic October 2015 floods in South Carolina, 17 U.S. Geological Survey streamgages recorded the highest peak streamflow and/or river height (or stage) since those streamgages were installed. An additional 15 USGS streamgages recorded peaks in the top 5 for their periods of record.
One of these streamgages, located on the Black River at Kingstree, South Carolina, recorded its largest peak in the 87 years it has existed. The streamgage showed that the Black River reached a peak streamflow of 83,700 cubic feet per second and a stage of 22.65 feet. The previous maximum on the Black River occurred on June 14, 1973. Additional annual peak stage data collected by the National Weather Service at the gauge prior to USGS operation indicates this is likely the highest flood since 1893.
"This was absolutely an historic flood for South Carolina," said John Shelton, the USGS hydrologist who oversaw the agency’s field response and gauging operations in South Carolina. "Throughout the event we continued to monitor our network of about 170 real-time streamgages, and we sent dozens of teams out in the field to verify what we were seeing. Fortunately, we have quite a few long-standing streamgages in South Carolina, so we can put these floods into historical context."
One of the longest-running streamgages in South Carolina is the one on the Congaree River in Columbia, with annual records back to 1892 and even flood information for 1852. That means that there are 123 years of record to place the October 2015 floods into perspective.
The USGS streamgage on the Congaree River at Columbia peaked at 185,000 cubic feet per second at a peak stage of 31.8 feet on October 4, 2015. When compared to the historical flood record, this peak ranks eighth out of 123 years of record with the peak of record being 364,000 cubic feet per second at a peak stage of 39.8 feet on August 27, 1908.
However, the October 2015 flood on the Congaree River is the highest since April 8, 1936, when the river peaked at 231,000 cubic feet per second at a peak stage of 33.3 feet.
For comparison, an Olympic-sized swimming pool contains 88,000 cubic feet, so the October 4, 2015, peak on the Congaree River at Columbia would fill a little over two Olympic-sized swimming pools every second.
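The pool comparison above is simple unit arithmetic; a minimal sketch of the calculation, using only the two figures quoted in the release, confirms the "a little over two pools per second" claim:

```python
# Check the swimming-pool comparison using the figures from the release.
peak_flow_cfs = 185_000   # Congaree River peak, cubic feet per second (Oct. 4, 2015)
pool_volume_cf = 88_000   # volume of an Olympic-sized swimming pool, cubic feet

# Dividing flow (cubic feet / second) by pool volume (cubic feet)
# gives pools filled per second.
pools_per_second = peak_flow_cfs / pool_volume_cf
print(f"{pools_per_second:.1f} pools per second")  # → 2.1 pools per second
```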
Throughout the entire flood, the USGS deployed nearly 100 people who collected almost 250 distinct streamflow measurements in South Carolina, North Carolina, and Georgia; deployed and recovered storm-tide sensors and Rapid-Deployment Gauges; and flagged and determined the elevation of close to 600 high-water marks in support of response and recovery missions for FEMA. The effort, led by the USGS South Atlantic Water Science Center, which has offices in South Carolina, North Carolina, and Georgia, was supported by teams from other USGS offices in Alabama, Florida, Mississippi and Pennsylvania.
A total of eight streamgages were destroyed or damaged during the floods in South Carolina; five were replaced with Rapid-Deployment Gauges within hours of the outage.
In South Carolina, the teams made about 140 streamflow measurements at about 86 real-time streamgages to verify or update existing information on streamflow at that site. This information, along with a comparison of historic peak flows or stages and a chronology of major flood events in South Carolina since 1893, is available in a new USGS report entitled "Preliminary Peak Stage and Streamflow Data at Selected USGS Streamgaging Stations for the South Carolina Flood of October 2015."
A new approach by U.S. Geological Survey scientists to modeling water temperatures resulted in more realistic predictions of how climate change will affect fish habitat by taking into account effects of cold groundwater sources.
The study, recently published in the journal Ecological Applications, showed that groundwater is highly influential but also highly variable among streams and will lead to a patchy distribution of suitable fish habitat under climate change. This new modeling approach used brook trout, but can be applied to other species that require coldwater streams for survival.
"One thing that has been missing from other models is the recognition that groundwater moderates the temperature of headwater streams," said Nathaniel Hitt, a fish biologist and study coauthor. "Our paper helps to bring the effects of groundwater into climate change forecasts for fish habitat."
Climate change models predict that summer air temperatures will increase between 2.7 and 9 degrees Fahrenheit in the eastern United States over the next 50 to 100 years. Such increases in air temperatures will increase water temperatures of streams and rivers and pose a significant threat to fish like brook trout that have low resistance to warming water temperatures.
Brook trout are an important cultural and recreational species with specific restoration outcomes identified in the new Chesapeake Bay Agreement.
However, how these global and regional predictions regarding a changing climate translate to water temperatures in specific streams or stream reaches, a process called “downscaling”, remains an important and challenging question for scientists and natural resource managers.
Previous efforts to downscale global and regional estimates of air temperature change down to water temperatures in individual streams and river networks have relied on the assumption that the exchange of heat between the water and the surrounding air is the primary driver of water temperature within an individual section of a stream. However, the exchange of heat between cold groundwater and warmer surface water can also be very important, especially in headwater streams where the volume of water is relatively small.
"Our models help improve the spatial resolution of climate change forecasts in headwater streams," said Craig Snyder, a USGS research ecologist and lead author of the study. "This work will assist conservation and restoration efforts by connecting climate change models to places that matter for stream fishes."
The study is available online: Snyder, C.D., N.P. Hitt, and J.A. Young. 2015. Accounting for groundwater in stream fish thermal habitat responses to climate change. Ecological Applications 25:1397-1419.
Donyelle Davis (Phone: 626-202-2393)
Cimarron River bed operations in Cushing Oil Field, Oklahoma, looking southwest. Creek County, Oklahoma. December 2, 1915. Panorama in two parts. USGS archive photo. (High resolution image)
The rate of earthquakes has increased sharply since 2009 in the central and eastern United States, with growing evidence confirming that these earthquakes are primarily caused by human activity, namely the injection of wastewater in deep disposal wells. A new study by the U.S. Geological Survey released today presents evidence that, in addition to these recent earthquakes, most of the larger earthquakes in Oklahoma in the past century may have been induced by industrial activities.
This study explores the especially high rates of activity in Oklahoma, the background rate of natural earthquakes in the state and how much the earthquake rate has varied through the 20th century.
"In Oklahoma, seismicity rates since 2009 far surpass previously observed rates at any time during the 20th century," said Susan Hough, USGS seismologist and lead author of the study. "Several lines of evidence further suggest that most of the significant earthquakes in Oklahoma during the 20th century may also have been induced by oil production activities. Deep injection of wastewater, now recognized to potentially induce earthquakes, in fact began in the state in the 1930s."
The study uses archival reports at the Library of Congress and drill permit records showing the location of wells from the Oklahoma Corporation Commission to track how wastewater injection evolved over time, with an increase around 1950 due to a rise in secondary oil recovery in response to increasing depletion of fields.
"Wastewater injection has a strong correlation to the increase in earthquakes," said Morgan Page, USGS seismologist and co-author of the study. "The results further demonstrate that, while the rates seen in recent years are unprecedented, induced earthquakes are likely nothing new in Oklahoma."
Oil production in Oklahoma has been going on for over 100 years. Some activities related to oil production, particularly disposal of wastewater in deep injection wells, are known to potentially cause earthquakes. Prior to the 2011 magnitude 5.7 Prague, Oklahoma earthquake, the largest historical earthquake in the area was the 1952 magnitude 5.7 El Reno earthquake, which the study concludes was likely induced by activities related to oil production near Edmond, Oklahoma.
The complete research paper, "A Century of Induced Earthquakes in Oklahoma?" by Susan E. Hough and Morgan Page was released online in the journal Bulletin of the Seismological Society of America.
Meadow vole rests in its habitat. (High resolution image)
ANCHORAGE, Alaska — A new scientific study predicts that some of Alaska’s mammal species will respond to future climate warming by concentrating in northern areas such as the Arctic National Wildlife Refuge and the National Petroleum Reserve of Alaska. If true, for many species, this would be a significant northward shift into tundra habitats where they are currently absent.
“Small mammal species such as shrews and voles, and larger species like wolverine and marmots have been in Alaska for many thousands of years and have responded to past climate cycles just as they are responding to the current warming trend,” said Dr. Andrew Hope, lead author of the study and former researcher with the U.S. Geological Survey.
“Since these mammals experienced past climate cycles, we were able to interpret signatures of population level responses to those climate events recorded in their DNA, and then also use that information to predict likely shifts in animal distributions throughout Alaska into the future,” said Hope.
Researchers who participated in the study with Hope examined a total of 28 mammal species, including those from both northern tundra and relatively more southern boreal forest habitats. The scientists determined the northern movement by looking at the current geographic distribution of the animals, coupled with their historical range, and then interpreted genetic signatures of responses to past climate changes to predict where the species will likely be found in 2050 and 2080.
Hope worked on the project with researchers from the USGS, the U.S. Fish and Wildlife Service’s Arctic National Wildlife Refuge, the City College of New York, the University of Nevada-Reno, and the University of New Mexico. The study leveraged genetic data collected over the past several decades in multiple laboratories.
“This approach allowed us to examine the consistency among predicted changes to mammal distributions and determine if there are differences in management implications across regions," said David Payer, chair of the Arctic Landscape Conservation Cooperative and a co-author of the study.
This study highlights the value of analyzing many species associated with discrete ecological communities, each with a distinct evolutionary history.
Hope conducted the research as a post-doctoral fellow at the USGS.
The study was published in the Ecological Society of America’s journal Ecosphere.
Visit the USGS Alaska Science Center for more information.
Map shows land management status, ecoregions, and predictions for changes in small mammal biodiversity through time based on all 28 species. Lands highlighted in the map include the Arctic Network of National Parks administered by the National Park Service (NPS; square), Arctic National Wildlife Refuge (administered by the USFWS; circle) and National Petroleum Reserve – Alaska (NPR-A; administered by the BLM; triangle). Biodiversity predictions were based on Last Interglacial (LIG), Last Glacial Maximum (LGM), current (Now), 2050s, and 2070s climate projections. The color gradient reflects areas of low (blue) to high (red) species richness. (High resolution image)