WARREN, Mich. – Sunday is the final day to register for FEMA disaster assistance for Michigan residents affected by the August floods.
As the registration and application deadline nears, more than 125,000 residents in Macomb, Oakland and Wayne counties have registered for assistance, and more than $240 million in federal disaster assistance has been approved.
FEMA has approved nearly $139 million in grants, while the U.S. Small Business Administration (SBA) has approved more than $101 million in low-interest loans.
ANCHORAGE, Alaska — A polar bear capture-and-release research program had no adverse long-term effects on feeding behavior, body condition, or reproduction, according to a new study by the U.S. Geological Survey.
The study used more than 40 years of capture-based data collected by the USGS from polar bears in the Alaska portion of the southern Beaufort Sea. Scientists looked for short- and long-term effects of capture and release and of the deployment of various types of satellite transmitters.
"We dug deeply into one of the most comprehensive capture-based data sets for polar bears in the world looking for any signs that our research activities might be negatively affecting polar bears," said Karyn Rode, lead author of the study and scientist with the USGS Polar Bear Research Program.
The study found that, following capture, transmitter-tagged bears returned to near-normal rates of movement and activity within 2-3 days, and that the presence of tags had no effect on a bear's subsequent physical condition, reproductive success, or ability to successfully raise cubs.
"Importantly, we found no indication that neck collars, the primary means for obtaining critical information on polar bear movement patterns and habitat use, adversely affected polar bear health or reproduction," said Rode.
The study also found that repeated capture (three or more times) was not associated with adverse effects on health or reproduction.
"We care about the animals we study and want to be certain that our research efforts are not contributing to any negative effects," said Rode. "I expected we might find some sign that certain aspects of our studies, such as repeated capture, would negatively affect bears, and I was pleased that we could not find any negative implications."
Efforts to conserve polar bears will require a greater understanding of how populations are responding to the loss of sea ice habitat. Capture-based methods are required to assess individual bear health and to deploy transmitters that provide information on bear movement patterns and habitat use. These methods have been used for decades in many parts of the polar bear’s range. Newer, less invasive techniques have been developed to identify individuals via hair and biopsy samples, but these techniques do not provide complete information on bear health, movements, or habitat use. Capture is likely to remain an important technique for monitoring polar bears. This study provides reassurance that capture, handling, and tagging can be used as research and monitoring techniques without long-term effects on polar bear populations.
The paper "Effects of capturing and collaring on polar bears: findings from long-term research on the southern Beaufort Sea population" was published today in the journal Wildlife Research.
Visit the USGS Polar Bear Research website for more information.
HONOLULU – Three months after President Barack Obama approved supplemental federal aid to help local government agencies and eligible non-profit organizations recover from Tropical Storm Iselle, state and federal disaster recovery employees have:
Conducted a Joint Preliminary Damage Assessment;
Held four Applicant Briefings on Hawaii Island, Maui, and Oahu;
Received requests for FEMA public assistance from 16 applicants who were impacted by Tropical Storm Iselle, which affected the Hawaiian Islands Aug. 7-9, 2014;
The 2011 east coast earthquake felt by people from Georgia to Canada likely originated from a fault “junction” just outside of Mineral, Virginia, according to new U.S. Geological Survey research published in the Geological Society of America’s Special Papers.
Following the August 23, 2011 event, USGS scientists conducted low-altitude geophysical (gravity and magnetic) flight surveys in 2012 over the epicenter, located about eight miles from the quake’s namesake town. Maps of the earth’s magnetic field and gravitational pull show subtle variations that reflect the physical properties of deeply buried rocks. More research may reveal whether geologic crossroads such as this are conducive to future earthquakes in the eastern United States.
Caption: In map view, magnetic data were filtered (colors) to highlight geologic features near the earthquake depth. One contrast (blue dotted line) is aligned with aftershocks (black dots). The other crosses at an angle. They suggest that the earthquake (yellow star) occurred near a “crossroads,” or a complex intersection of different types of rock.
“These surveys unveiled not only one fault, which is roughly aligned with a fault defined by the earthquake’s aftershocks, but a second fault or contact between different rock types that comes in at an angle to the first one,” said USGS scientist and lead investigator, Anji Shah. “This visual suggests that the earthquake occurred near a ‘crossroads,’ or junction, between the fault that caused the earthquake and another fault or geologic contact.”
Deep imaging tools were specifically chosen because the earthquake occurred about five miles beneath the earth's surface. Looking at faults in this way can help scientists better understand earthquake hazards in the eastern United States.
The USGS and partner scientists are also interested in why seismic events occur in certain parts of the central and eastern United States, like the Central Virginia seismic zone, since there are no plate boundaries there, unlike the San Andreas Fault in California, or the Aleutian Trench in Alaska.
USGS scientists still have remaining questions: Could this happen elsewhere? How common are such crossroads? Shah and other scientists are also trying to understand whether and why a junction like this might be an origin point for earthquakes.
“Part of it might be the complex stress state that arises in such an area. Imagine you have a plastic water bottle in your hand, and it has a cut (fault) in it the long way. When you squeeze the bottle, it pops (ruptures) where the cut is. The long cut is comparable to an ancient fault – it’s an area of weakness where motion (faulting and earthquakes) is more likely to happen. Multiple intersecting cuts in that bottle produce zones of weakness where fault slip is more likely to happen, especially where two cuts intersect,” said Shah.
The situation near the fault on which the magnitude 5.8 Mineral earthquake occurred is more complex than that. For example, the fault may separate different types of rocks with varying densities and strengths, as suggested by the gravity data. This contributes to a complex stress field that could also be more conducive to slip.
Additional science data about the 2011 Mineral, Virginia, earthquake may be found online.
NASA, in partnership with the U.S. Geological Survey (USGS), is offering more than $35,000 in prizes to citizen scientists for ideas that make use of climate data to address vulnerabilities faced by the United States in coping with climate change.
The Climate Resilience Data Challenge, conducted through the NASA Tournament Lab, a partnership with Harvard University hosted on Appirio/Topcoder, kicks off Monday, Dec. 15 and runs through March 2015.
The challenge supports the efforts of the White House Climate Data Initiative, a broad effort to leverage the federal government’s extensive, freely available climate-relevant data resources to spur innovation and private-sector entrepreneurship in order to advance awareness of and preparedness for the impacts of climate change. The challenge was announced by the White House Office of Science and Technology Policy Dec. 9.
According to the recent National Climate Assessment produced by more than 300 experts across government and academia, the United States faces a number of current and future challenges as the result of climate change. Vulnerabilities include coastal flooding and weather-related hazards that threaten lives and property, increased disruptions to agriculture, prolonged drought that adversely affects food security and water availability, and ocean acidification capable of damaging ecosystems and biodiversity. The challenge seeks to unlock the potential of climate data to address these and other climate risks.
“Federal agencies, such as NASA and the USGS, traditionally focus on developing world-class science data to support scientific research, but the rapid growth in the innovation community presents new opportunities to encourage wider usage and application of science data to benefit society,” said Kevin Murphy, NASA program executive for Earth Science Data Systems in Washington. “We need tools that utilize federal data to help our local communities improve climate resilience, protect our ecosystems, and prepare for the effects of climate change.”
“Government science follows the strictest professional protocols because scientific objectivity is what the American people expect from us,” said Virginia Burkett, acting USGS associate director for Climate Change and Land Use. “That systematic approach is fundamental to our mission. With this challenge, however, we are intentionally looking outside the box for transformational ways to apply the data that we have already carefully assembled for the benefit of communities across the nation.”
The challenge begins with an ideation stage for data-driven application pitches, followed by storyboarding and, finally, prototyping of concepts with the greatest potential.
The ideation stage challenges competitors to imagine new applications of climate data to address climate vulnerabilities. This stage is divided into three competitive classes based on data sources: NASA data, federal data from agencies such as the USGS, and any open data. The storyboarding stage allows competitors to conceptualize and design the best ideas, followed by the prototyping stage, which carries the best ideas into implementation.
The Climate Resilience Data Challenge is managed by NASA's Center of Excellence for Collaborative Innovation at NASA Headquarters, Washington. The center was established in coordination with the Office of Science and Technology Policy to advance open innovation efforts for climate-related science and extend that expertise to other federal agencies.
For additional information and to register (beginning Dec. 15), visit the Climate Resilience Data Challenge website.
DENTON, Texas — A year-and-a-half after tornadoes and severe storms ripped through central Oklahoma, recovery efforts are still under way. Grants totaling nearly $7 million have recently been awarded to the Oklahoma Department of Emergency Management from the Federal Emergency Management Agency. The Public Assistance grants will fund the repair and replacement of numerous educational structures damaged and destroyed by the tornadoes.
Warren, Mich. – Disaster survivors in Southeast Michigan have until Sunday, Dec. 14 to register with the Federal Emergency Management Agency (FEMA). As the registration and application deadline nears, more than $230 million in disaster assistance has been approved for survivors.
Survivors from the August flooding who have delayed registration for any reason should apply for potential assistance that could include:
CHARLOTTESVILLE, Va. -- The majority of streams in the Chesapeake Bay region are warming, and that increase appears to be driven largely by rising air temperatures. These findings are based on new U.S. Geological Survey research published in the journal Climatic Change.
Researchers found an overall warming trend in air temperature of 0.023 C (0.041 F) per year, and in water temperature of 0.028 C (0.050 F) per year over 51 years. This means that air temperature has risen 1.1 C (1.98 F), and water temperature has risen 1.4 C (2.52 F) between 1960 and 2010 in the Chesapeake Bay region.
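The reported totals follow directly from the per-year trends. A minimal Python sketch of that arithmetic, assuming a 50-year span (1960 to 2010) and the rounding noted in the comments; neither calculation detail is stated in the release itself:

```python
# Sketch of the arithmetic behind the reported warming totals. The 50-year
# span and the rounding steps are assumptions, not details from the study.

def total_warming_c(trend_c_per_year: float, years: int) -> float:
    """Cumulative temperature change implied by a constant linear trend."""
    return trend_c_per_year * years

def c_to_f_delta(delta_c: float) -> float:
    """Convert a temperature *difference* (not an absolute reading) to Fahrenheit."""
    return delta_c * 9.0 / 5.0

air_c = total_warming_c(0.023, 50)    # ~1.15 C, reported rounded to 1.1 C
water_c = total_warming_c(0.028, 50)  # ~1.40 C, reported as 1.4 C

# The Fahrenheit figures in the release (1.98 F and 2.52 F) are exact
# conversions of the rounded Celsius totals: 1.1 * 9/5 and 1.4 * 9/5.
print(air_c, water_c)
print(c_to_f_delta(1.1), c_to_f_delta(1.4))
```

Note that a temperature difference converts to Fahrenheit by the 9/5 factor alone, without the +32 offset used for absolute readings.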
"Although this may not seem like much, even small increases in water temperatures can have an effect on water quality, affecting the animals that rely on the bay’s streams, as well as the estuary itself," said Karen Rice, USGS Research Hydrologist and lead author of the study.
One effect of warming waters is an increase in eutrophication, or an overabundance of nutrients. The issue has plagued the bay for decades and likely will worsen as temperatures of waters contributing to the bay continue to rise. Other effects of warming waters include shifts in plant and animal distributions in the basin’s freshwater rivers and streams. Upstream waters may no longer be suitable for some cool-water fish species, and invasive species may move into the warming waters as those streams become more hospitable.
Chesapeake Bay is the largest estuary in the United States, with a watershed covering 166,391 square kilometers (over 64,243 square miles) that includes parts of New York, Pennsylvania, Delaware, Maryland, Virginia, West Virginia and the District of Columbia. The watershed includes more than 100,000 streams, creeks and rivers that thread through it, and it supports more than 3,700 species of plants and animals. The states and DC are working with the federal government to improve conditions in the bay and its watershed and address the threats from climate change. Results from this USGS study will help inform adaptation strategies.
The study included examination of 51 years of data from 85 air-temperature sites and 129 stream-water temperature sites throughout the bay watershed. Though the findings indicated that overall both air and water temperatures have increased throughout the region, there was variability in the magnitude and direction of temperature changes, particularly for water.
"Our results suggest that water temperature is largely influenced by increasing air temperature, and features on the landscape act to enhance or dampen the level of that influence," said John Jastram, USGS Hydrologist and study coauthor.
At many of the sites analyzed, increasing trends were detected in both streamflow and water temperature, demonstrating that increasing streamflow dampens, but does not stop or reverse, warming. Water temperature at most of the sites examined increased from 1960-2010. There was wide variability in the physical characteristics of the stream-water sites, including:
- Watershed area
- Channel shape
- Thermal capacity (a measure of the resistance of a body of water to temperature change)
- The presence or absence of vegetation along the waterways
- Local climate conditions
- Land cover.
Warming temperatures in the Chesapeake Bay region’s streams will have implications for future shifts in water quality, eutrophication and water column layers in the bay. As air temperatures rise, so will water temperature in Chesapeake Bay, though mixing with ocean water may buffer it somewhat, cooling the warmer water entering from the watershed. "Rising air and stream-water temperatures in Chesapeake Bay region, USA," by K.C. Rice and J.D. Jastram in Climatic Change is available online.
More information about USGS science to help restore Chesapeake Bay can be found online.
[Access images for this release at: http://gallery.usgs.gov/tags/NR2014_12_08]
Ethan Alpern (Phone: 703-648-4406)
A newly released interactive California Drought visualization website aims to provide the public with atlas-like, statewide coverage of the drought and a timeline of its impacts on water resources.
Caption: Drought coverage of California. (High resolution image)
The U.S. Geological Survey developed the interactive website as part of the federal government's Open Water Data Initiative. The drought visualization page features high-tech graphics that illustrate the effect of drought on regional reservoir storage from 2011-2014.
For the visualization, drought data are integrated through space and time with maps and plots of reservoir storage. Reservoir levels can be seen to respond to seasonal drivers in each year. However, available water decreases overall as the drought persists. The connection between snowpack and reservoir levels is also displayed interactively. Current streamflow collected at USGS gaging stations is graphed relative to historic averages. Additionally, California’s water use profile is summarized.
California has been experiencing one of its most severe droughts in over a century, and 2013 was the driest calendar year in the state's 119-year recorded history. In January, California Governor Jerry Brown declared a State of Emergency to help officials manage the drought.
"USGS is determined to provide managers and residents with timely and meaningful data to help decision making and planning for the state's water resources," said Nate Booth, chief of USGS Water Information. "The drought affects streamflow across the state, which leads to reduced reservoir replenishment as well as groundwater depletion."
White House open data policies continue to provide opportunities for innovation at the nexus of water resource management and information technology. The Open Water Data Initiative promotes these goals with an initial objective of presenting valuable water data in a more user-friendly, easily accessible format.
"Ultimately, the initiative will allow us to better communicate the nation's water resources status, trends and challenges based on the most recent monitoring information," said Mark Sogge, USGS Pacific regional director. "By integrating a range of federal and state data to communicate the extreme circumstances of the water shortage in California and the southwest, USGS is providing for public use a rich and interactive collection of drought-related information."
Caption: Reservoir storage levels in California. (High resolution image)
"The state and federal data presented are publicly available, as is the open-source software that supports the application," said Emily Read, a USGS developer of the website. "The application allows the public to explore the drought not only as we’ve presented it, but because the software is open-source, anyone can easily open up the data and expand the story."