USGS Newsroom

Federal News

Summary: GAINESVILLE, Fla.—The U.S. Geological Survey Southeast Ecological Science Center has acquired a state-of-the-art genetic analysis machine that will help advance environmental DNA research efforts. The use of environmental DNA, or eDNA, could help resource managers nationwide conserve imperiled species and improve control of invasive species.


The new technology, a droplet digital polymerase chain reaction (ddPCR) machine, is the first of its kind to be acquired by a USGS facility. The machine can detect a single molecule of DNA from an environmental sample and offers greater throughput than traditional methods. From water samples, it is possible to detect rare species or those that are difficult to observe due to secretive behavior, camouflaged coloration, or a resemblance to other species. Species identification via the detection of eDNA can make the physical capture or sighting of the target species unnecessary.

“This new platform allows us to process samples efficiently and with greater precision. With just a few copies of genetic material from the aquatic environment, we can detect the presence of an animal that may not otherwise be seen,” commented USGS research geneticist Margaret Hunter, who leads the SESC Conservation Genetics Laboratory.

Environmental DNA comes from organisms shedding biological material into the aquatic environment via feces, mucus, saliva, or skin cells. This material can be used to determine the presence of species, establish range limits, and estimate occupancy and detection probabilities to inform management actions. The environmental DNA exists for approximately 20 days before it degrades, allowing researchers to detect animals, such as pythons, manatees, or Grass Carp, as they move throughout the environment. Compared with traditional laboratory techniques, ddPCR reduces time and laboratory costs and applies more rigorous statistical analyses to determine the number of DNA copies in a sample. While both techniques can detect and count molecules of DNA, ddPCR has been shown to enhance accuracy and precision.

To detect individual species, genetic researchers first design a species-specific genetic marker. Then filtered surface water samples are split into 20,000 PCR droplets, each containing the marker and, if present, a copy of the target species’ DNA. The droplets illuminate fluorescently if DNA of the targeted species is detected, with the number of illuminated droplets corresponding to the number of DNA molecules in the sample. Assessing the 20,000 droplets for positive detection of the species takes approximately two minutes.
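The step from a count of fluorescing droplets to an absolute DNA concentration relies on Poisson statistics, since a single droplet can contain more than one target molecule. The sketch below is an illustrative calculation, not the USGS laboratory's actual software; the ~0.85-nanoliter droplet volume is an assumed instrument parameter.

```python
import math

def ddpcr_copies_per_ul(positive, total=20000, droplet_volume_nl=0.85):
    """Estimate target DNA concentration from ddPCR droplet counts.

    ddPCR assumes target molecules distribute among droplets following a
    Poisson distribution, so the mean copies per droplet is
    lambda = -ln(1 - p), where p is the fraction of positive droplets.
    The droplet volume used here is an assumed value, not a USGS figure.
    """
    p = positive / total
    lam = -math.log(1.0 - p)           # mean copies per droplet
    copies_per_nl = lam / droplet_volume_nl
    return copies_per_nl * 1000.0      # copies per microliter of sample

# Hypothetical run: 500 of 20,000 droplets fluoresce
print(round(ddpcr_copies_per_ul(500), 1))
```

The logarithmic correction matters most at high positive fractions, where simply counting droplets would undercount molecules that share a droplet.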

“This technology can provide resource managers invaluable assistance in detecting and defining the habitat of imperiled and invasive species,” Hunter said. “For example, using eDNA and ddPCR can help to better delineate the spread of Burmese pythons in south Florida. Or, the habitat used by imperiled or rare species, such as elusive West Indian manatees, could be defined for research and conservation efforts.”

SACRAMENTO, Calif. – Individuals and business owners in Napa and Solano counties who had damages or losses as a result of the South Napa Earthquake have two weeks left to register for disaster assistance with the Federal Emergency Management Agency (FEMA).

Officials with FEMA and the California Governor’s Office of Emergency Services (Cal OES) urge anyone who still needs help to register before the deadline – Dec. 29, 2014.

Summary: Average chloride concentrations often exceed toxic levels in many northern United States streams due to the use of salt to deice winter pavement, and the frequency of these occurrences nearly doubled in two decades.


Chloride levels increased substantially in 84 percent of urban streams analyzed, according to a U.S. Geological Survey study of data collected as early as 1960 at some sites and as recently as 2011. Levels were highest during the winter, but increased during all seasons over time at the northern sites, including near Milwaukee, Wisconsin; Chicago, Illinois; Denver, Colorado; and other metropolitan areas. The report was published today in the journal Science of the Total Environment.

"Some freshwater organisms are sensitive to chloride, and the high concentrations that we found could negatively affect a significant number of species," said Steve Corsi, USGS scientist and lead author of the study. “If urban development and road salt use continue to increase, chloride concentrations and associated toxicity are also likely to increase.”

Twenty-nine percent of the sites exceeded the U.S. Environmental Protection Agency’s chronic water-quality criterion for chloride (230 milligrams per liter) an average of more than 100 days per year from 2006 through 2011, almost double the number of exceedance days from 1990 through 1994. This increase occurred at sites such as the Menomonee and Kinnickinnic Rivers near Milwaukee and Poplar Creek near Chicago.
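The exceedance-day statistic is a simple tally of days a stream's concentration tops the chronic criterion. The sketch below uses made-up sample values, not data from the study, to show the basic calculation against the 230 mg/L threshold.

```python
# EPA chronic water-quality criterion for chloride, in milligrams per liter
CHRONIC_CRITERION_MG_L = 230

def exceedance_days(daily_chloride_mg_l):
    """Count daily mean concentrations above the chronic criterion."""
    return sum(1 for c in daily_chloride_mg_l if c > CHRONIC_CRITERION_MG_L)

# Hypothetical winter record: deicing runoff spikes above the criterion
record = [180, 210, 450, 520, 240, 190, 300, 160]
print(exceedance_days(record))  # 4 of the 8 days exceed 230 mg/L
```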

The lowest chloride concentrations were in watersheds that had little urban land use or cities without much snowfall, such as Dallas, Texas.

In 16 of the streams, winter chloride concentrations increased over the study period.

In 13 of the streams, chloride concentrations increased over the study period during non-deicing periods such as summer. This finding suggests that chloride infiltrates the groundwater system during the winter and is slowly released to the streams throughout the year.

Chloride levels increased more rapidly than development of urban land near the study sites.

The rapid chloride increases were likely caused by increased salt application rates, increased baseline conditions (the concentrations during summer low-flow periods) and greater snowfall in the Midwest during the latter part of the study.

"Deicing operations help to provide safe winter transportation conditions, which is very important,” Corsi said. “Findings from this study emphasize the need to consider deicer management options that minimize the use of road salt while still maintaining safe conditions."

Road deicing by cities, counties and state agencies accounts for a significant portion of salt applications, but salt is also used by many public and private organizations and individuals to deice parking lots, walkways and driveways. All of these sources are likely to contribute to these increasing chloride trends.

Other major sources of salt to U.S. waters include wastewater treatment, septic systems, farming operations and natural geologic deposits. However, the new study found deicing activity to be the dominant source in urban areas of the northern U.S.

WARREN, Mich. – Sunday is the final day to register for FEMA disaster assistance for Michigan residents affected by the August floods.

As the registration and application deadline nears, more than 125,000 residents in Macomb, Oakland and Wayne counties have registered for assistance and more than $240 million in federal disaster assistance has been approved.

FEMA has approved nearly $139 million in grants, while the U.S. Small Business Administration (SBA) has approved more than $101 million in low-interest loans.

Summary: ANCHORAGE, Alaska — A polar bear capture and release-based research program had no adverse long-term effects on feeding behavior, body condition, and reproduction, according to a new study by the U.S. Geological Survey.


The study used over 40 years of capture-based data collected by USGS from polar bears in the Alaska portion of the southern Beaufort Sea. Scientists looked for short- and long-term effects of capture and release and of deploying various types of satellite transmitters.

"We dug deeply into one of the most comprehensive capture-based data sets for polar bears in the world looking for any signs that our research activities might be negatively affecting polar bears," said Karyn Rode, lead author of the study and scientist with the USGS Polar Bear Research Program.

The study found that, following capture, transmitter-tagged bears returned to near-normal rates of movement and activity within 2-3 days, and that the presence of tags had no effect on a bear's subsequent physical condition, reproductive success, or ability to successfully raise cubs.

"Importantly, we found no indication that neck collars, the primary means for obtaining critical information on polar bear movement patterns and habitat use, adversely affected polar bear health or reproduction," said Rode.

The study also found that being captured three or more times had no detectable effect on a bear's health and reproduction.

"We care about the animals we study and want to be certain that our research efforts are not contributing to any negative effects," said Rode. "I expected we might find some sign that certain aspects of our studies, such as repeated capture, would negatively affect bears, and I was pleased that we could not find any negative implications."

Efforts to conserve polar bears will require a greater understanding of how populations are responding to the loss of sea ice habitat. Capture-based methods are required to assess individual bear health and to deploy transmitters that provide information on bear movement patterns and habitat use. These methods have been used for decades in many parts of the polar bear’s range. Newer, less invasive techniques have been developed to identify individuals via hair and biopsy samples, but these techniques do not provide complete information on bear health, movements or habitat use. Capture is likely to remain an important technique for monitoring polar bears. This study provides reassurance that capture, handling, and tagging can be used as research and monitoring techniques with no long-term effects on polar bear populations.

HONOLULU – Three months after President Barack Obama approved supplemental federal aid to help local government agencies and eligible non-profit organizations recover from Tropical Storm Iselle, state and federal disaster recovery employees have:

Conducted a Joint Preliminary Damage Assessment;

Held four Applicant Briefings on Hawaii Island, Maui, and Oahu;

Received requests for FEMA public assistance from 16 applicants who were impacted by Tropical Storm Iselle, which affected the Hawaiian Islands Aug. 7-9, 2014;

Summary: The 2011 east coast earthquake felt by people from Georgia to Canada likely originated from a fault “junction” just outside of Mineral, Virginia, according to new U.S. Geological Survey research published in the Geological Society of America’s Special Papers.


Following the August 23, 2011 event, USGS scientists conducted low-altitude geophysical (gravity and magnetic) flight surveys in 2012 over the epicenter, located about eight miles from the quake’s namesake. Maps of the earth’s magnetic field and gravitational pull show subtle variations that reflect the physical properties of deeply buried rocks. More research may reveal whether geologic crossroads such as this are conducive to future earthquakes in the eastern United States.

Caption: In map view, magnetic data were filtered (colors) to highlight geologic features near the earthquake depth. One contrast (blue dotted line) is aligned with aftershocks (black dots). The other crosses at an angle. They suggest that the earthquake (yellow star) occurred near a “crossroads,” or a complex intersection of different types of rock.

“These surveys unveiled not only one fault, which is roughly aligned with a fault defined by the earthquake’s aftershocks, but a second fault or contact between different rock types that comes in at an angle to the first one,” said USGS scientist and lead investigator, Anji Shah. “This visual suggests that the earthquake occurred near a ‘crossroads,’ or junction, between the fault that caused the earthquake and another fault or geologic contact.”

Deep imaging tools were specifically chosen because the earthquake occurred about five miles beneath the earth. Looking at faults in this way can help scientists better understand earthquake hazards in the eastern United States.

The USGS and partner scientists are also interested in why seismic events occur in certain parts of the central and eastern United States, like the Central Virginia seismic zone, since there are no plate boundaries there, unlike the San Andreas Fault in California, or the Aleutian Trench in Alaska.

USGS scientists still have remaining questions: Could this happen elsewhere? How common are such crossroads? Shah and other scientists are also trying to understand whether and why a junction like this might be an origin point for earthquakes.

“Part of it might be the complex stress state that arises in such an area. Imagine you have a plastic water bottle in your hand, and it has a cut (fault) in it the long way. When you squeeze the bottle, it pops (ruptures) where the cut is. The long cut is comparable to an ancient fault – it’s an area of weakness where motion (faulting and earthquakes) is more likely to happen. Multiple intersecting cuts in that bottle produce zones of weakness where fault slip is more likely to happen, especially where two cuts intersect,” said Shah.

The situation near the fault on which the magnitude 5.8 Mineral earthquake occurred is more complex than that. For example, the fault may separate different types of rocks with varying densities and strengths, as suggested by the gravity data. This contributes to a complex stress field that could also be more conducive to slip.

Summary:
NASA in partnership with the U.S. Geological Survey (USGS) is offering more than $35,000 in prizes to citizen scientists for ideas that make use of climate data to address vulnerabilities faced by the United States in coping with climate change.


The Climate Resilience Data Challenge, conducted through the NASA Tournament Lab, a partnership with Harvard University hosted on Appirio/Topcoder, kicks off Monday, Dec. 15 and runs through March 2015.

The challenge supports the efforts of the White House Climate Data Initiative, a broad effort to leverage the federal government’s extensive, freely available climate-relevant data resources to spur innovation and private-sector entrepreneurship in order to advance awareness of and preparedness for the impacts of climate change. The challenge was announced by the White House Office of Science and Technology Policy Dec. 9.

According to the recent National Climate Assessment produced by more than 300 experts across government and academia, the United States faces a number of current and future challenges as the result of climate change. Vulnerabilities include coastal flooding and weather-related hazards that threaten lives and property, increased disruptions to agriculture, prolonged drought that adversely affects food security and water availability, and ocean acidification capable of damaging ecosystems and biodiversity. The challenge seeks to unlock the potential of climate data to address these and other climate risks.

“Federal agencies, such as NASA and the USGS, traditionally focus on developing world-class science data to support scientific research, but the rapid growth in the innovation community presents new opportunities to encourage wider usage and application of science data to benefit society,” said Kevin Murphy, NASA program executive for Earth Science Data Systems in Washington. “We need tools that utilize federal data to help our local communities improve climate resilience, protect our ecosystems, and prepare for the effects of climate change.”

“Government science follows the strictest professional protocols because scientific objectivity is what the American people expect from us,” said Virginia Burkett, acting USGS associate director for Climate Change and Land Use. “That systematic approach is fundamental to our mission. With this challenge, however, we are intentionally looking outside the box for transformational ways to apply the data that we have already carefully assembled for the benefit of communities across the nation.”

The challenge begins with an ideation stage for data-driven application pitches, followed by storyboarding and, finally, prototyping of concepts with the greatest potential.

The ideation stage challenges competitors to imagine new applications of climate data to address climate vulnerabilities. This stage is divided into three competitive classes based on data sources: NASA data, federal data from agencies such as the USGS, and any open data. The storyboarding stage allows competitors to conceptualize and design the best ideas, followed by the prototyping stage, which carries the best ideas into implementation.

The Climate Resilience Data Challenge is managed by NASA's Center of Excellence for Collaborative Innovation at NASA Headquarters, Washington. The center was established in coordination with the Office of Science and Technology Policy to advance open innovation efforts for climate-related science and extend that expertise to other federal agencies.

DENTON, Texas — A year-and-a-half after tornadoes and severe storms ripped through central Oklahoma, recovery efforts are still under way. Grants totaling nearly $7 million have recently been awarded to the Oklahoma Department of Emergency Management from the Federal Emergency Management Agency. The Public Assistance grants will fund the repair and replacement of numerous educational structures damaged and destroyed by the tornadoes.

Warren, Mich. – Disaster survivors in Southeast Michigan have until Sunday, Dec. 14 to register with the Federal Emergency Management Agency (FEMA). As the registration and application deadline nears, more than $230 million in disaster assistance has been approved for survivors.

Survivors from the August flooding who have delayed registration for any reason should apply for potential assistance that could include:

Summary: CHARLOTTESVILLE, Va. -- The majority of streams in the Chesapeake Bay region are warming, and that increase appears to be driven largely by rising air temperatures. These findings are based on new U.S. Geological Survey research published in the journal Climatic Change.


Researchers found an overall warming trend in air temperature of 0.023 C (0.041 F) per year, and in water temperature of 0.028 C (0.050 F) per year over 51 years. This means that air temperature has risen 1.1 C (1.98 F), and water temperature has risen 1.4 C (2.52 F) between 1960 and 2010 in the Chesapeake Bay region.
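The cumulative figures follow directly from the per-year trends multiplied across the 1960-2010 span; a quick arithmetic check (small differences from the reported totals reflect rounding in the published rates):

```python
# Quick check of the cumulative warming implied by the per-year trends
# reported for the Chesapeake Bay region (rates in degrees C per year).
AIR_TREND_C_PER_YR = 0.023
WATER_TREND_C_PER_YR = 0.028
YEARS = 2010 - 1960  # 50-year span

def total_change_c(rate_c_per_yr, years=YEARS):
    """Cumulative change implied by a constant linear trend."""
    return rate_c_per_yr * years

def delta_c_to_f(delta_c):
    """Convert a temperature *difference* (9/5 factor only, no offset)."""
    return delta_c * 9.0 / 5.0

print(round(total_change_c(AIR_TREND_C_PER_YR), 2))    # ~1.15 C air warming
print(round(delta_c_to_f(total_change_c(WATER_TREND_C_PER_YR)), 2))  # ~2.52 F water warming
```

Note that a temperature change converts to Fahrenheit with the 9/5 scale factor alone; the +32 offset applies only to absolute readings.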

"Although this may not seem like much, even small increases in water temperatures can have an effect on water quality, affecting the animals that rely on the bay’s streams, as well as the estuary itself," said Karen Rice, USGS Research Hydrologist and lead author of the study.

One effect of warming waters is an increase in eutrophication, or an overabundance of nutrients. The issue has plagued the bay for decades and likely will increase as temperatures of waters contributing to the bay continue to rise. Other effects of warming waters include shifts in plant and animal distributions in the basin’s freshwater rivers and streams. Upstream waters may no longer be suitable for some cool-water fish species, and invasive species may move into the warming waters as those streams become more hospitable.

Chesapeake Bay is the largest estuary in the United States, with a watershed covering 166,391 square kilometers (over 64,243 square miles) that includes parts of New York, Pennsylvania, Delaware, Maryland, Virginia, West Virginia and the District of Columbia. The watershed includes more than 100,000 streams, creeks and rivers that thread through it, and it supports more than 3,700 species of plants and animals. The states and DC are working with the federal government to improve conditions in the bay and its watershed and address the threats from climate change. Results from this USGS study will help inform adaptation strategies.

The study included examination of 51 years of data from 85 air-temperature sites and 129 stream-water temperature sites throughout the bay watershed. Though the findings indicated that overall both air and water temperatures have increased throughout the region, there was variability in the magnitude and direction of temperature changes, particularly for water.

“Our results suggest that water temperature is largely influenced by increasing air temperature, and features on the landscape act to enhance or dampen the level of that influence,” said John Jastram, USGS Hydrologist and study coauthor.

At many of the sites analyzed, increasing trends were detected in both streamflow and water temperature, demonstrating that increasing streamflow dampens, but does not stop or reverse warming. Water temperature at most of the sites examined increased from 1960-2010. There was wide variability in physical characteristics of the stream-water sites, including:

Watershed area

Channel shape

Thermal capacity (a measure of the resistance of a body of water to temperature change)

The presence or absence of vegetation along the waterways

Local climate conditions

Land cover.

Warming temperatures in the Chesapeake Bay region’s streams will have implications for future shifts in water quality, eutrophication and water column layers in the bay. As air temperatures rise, so will water temperature in Chesapeake Bay, though mixing with ocean water may buffer it somewhat, cooling the warmer water entering from the watershed. "Rising air and stream-water temperatures in Chesapeake Bay region, USA," by K.C. Rice and J.D. Jastram in Climatic Change is available online.

More information about USGS science to help restore Chesapeake Bay is available online.

Summary: A newly released interactive California Drought visualization website aims to provide the public with atlas-like, state-wide coverage of the drought and a timeline of its impacts on water resources.
USGS Releases Drought Visualization Website

The U.S. Geological Survey developed the interactive website as part of the federal government's Open Water Data Initiative. The drought visualization page features high-tech graphics that illustrate the effect of drought on regional reservoir storage from 2011-2014.

For the visualization, drought data are integrated through space and time with maps and plots of reservoir storage. Reservoir levels can be seen to respond to seasonal drivers in each year. However, available water decreases overall as the drought persists. The connection between snowpack and reservoir levels is also displayed interactively. Current streamflow collected at USGS gaging stations is graphed relative to historic averages. Additionally, California’s water use profile is summarized.

California has been experiencing one of its most severe droughts in over a century, and 2013 was the driest calendar year in the state's 119-year recorded history. In January, California Gov. Jerry Brown declared a State of Emergency to help officials manage the drought.

"USGS is determined to provide managers and residents with timely and meaningful data to help decision making and planning for the state's water resources," said Nate Booth, chief of USGS Water Information. "The drought affects streamflow across the state, which leads to reduced reservoir replenishment as well as groundwater depletion."

White House open data policies continue to provide opportunities for innovation at the nexus between water resource management and information technology. The Open Water Data Initiative promotes these goals with an initial objective of presenting valuable water data in a more user-friendly, easily accessible format.

"Ultimately, the initiative will allow us to better communicate the nation's water resources status, trends and challenges based on the most recent monitoring information," said Mark Sogge, USGS Pacific regional director. "By integrating a range of federal and state data to communicate the extreme circumstances of the water shortage in California and the southwest, USGS is providing for public use a rich and interactive collection of drought related information."

"The state and federal data presented are publicly available, as is the open-source software that supports the application," said Emily Read, a USGS developer of the website. "The application allows the public to explore the drought not only as we’ve presented it, but because the software is open-source, anyone can easily open up the data and expand the story."

DENTON, Texas - Nearly $1.4 million was awarded recently to the Pueblo of Acoma in New Mexico by the Federal Emergency Management Agency for repair of the Anzac Irrigation Channel System.

Nearly five miles of the concrete-lined channel received severe damage as a result of torrential rains and severe flooding in Cibola County in July and August 2010. Structural flaws and damages were revealed following the removal of accumulated silt from the channel.

WARREN, Mich. – The holiday season is here, and the state of Michigan and FEMA wish you all the best during this special time of the year. The heavy demands of the holiday season can be a busy time for all. Managing family obligations and handling seasonal preparations along with regular day-to-day activities can make disaster awareness and preparedness less of a priority.

Summary: The U.S. Geological Survey announced today that improved global topographic (elevation) data are now publicly available for North and South America, Pacific Islands, and northern Europe.
Enhanced elevation data for North and South America, Pacific Islands, northern Europe

Similar data for most of Africa were previously released by USGS in September.

The data have been released following the President’s commitment at the United Nations to provide assistance for global efforts to combat climate change. The broad availability of more detailed elevation data across the globe through the Shuttle Radar Topography Mission (SRTM) will improve baseline information that is crucial to investigating the impacts of climate change on specific regions and communities.

“We are pleased to offer improved elevation data to scientists, educators, and students worldwide. It’s free to whoever can use it,” said Suzette Kimball, acting USGS Director, at the initial release of SRTM data for Africa in September. “Elevation, the third dimension of maps, is critical in understanding so many aspects of how nature works. Easy access to reliable data like this advances the mutual understanding of environmental challenges by citizens, researchers, and decision makers around the globe.”

The enhanced SRTM datasets have a resolution of 30 meters and can be used worldwide to improve environmental monitoring, advance climate change research, and promote local decision support.

The National Aeronautics and Space Administration (NASA) and the National Geospatial-Intelligence Agency (NGA) worked collaboratively to produce the data, which have been extensively reviewed by relevant government agencies and deemed suitable for public release. The previously available global release of this data had a resolution of 90 meters.

The USGS, a bureau of the U.S. Department of the Interior, distributes the data free of charge via its user-friendly Earth Explorer website.

Enhanced 30-meter resolution data for the rest of the world will be released in coming months.

Improved topographic (elevation) data are now publicly available for North and South America, Pacific Islands, and northern Europe as shown in the diagram. Similar data for most of Africa were previously released by USGS in September. (High resolution image)

Summary: As part of the continued US Topo maps revision and improvement cycle, the USGS will be adding mountain bike trails to upcoming quadrangles on a state-aligned basis.
The USGS will show mountain bike trails on newly revised US Topo maps.

The 2014 edition of US Topo maps covering Arizona will be the first maps to feature the trail data, followed by Nebraska, Missouri, Nevada, California, Louisiana, New Hampshire, Mississippi, Vermont, Wyoming, Connecticut, Massachusetts, Illinois, Rhode Island, South Dakota, Florida, Alaska (partial), and the Pacific Territories in 2015.

The mountain bike trail data are provided through a partnership with the International Mountain Biking Association (IMBA) and the MTB Project. During the past two years, the IMBA has been building a detailed national database of mountain bike trails with the aid and support of MTB Project participants. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy and confirm that the trail is legal. This unique “crowdsourcing” project has made mountain bike trail data available through mobile and web apps, and soon, revised US Topo maps.

“IMBA is stoked to have MTB Project data included on US Topo maps as well as other USGS mapping products,” added Leslie Kehmeier, IMBA’s Mapping Specialist. “It’s a really big deal for us and reflects the success of the partnership we've developed with the MTB Project team to develop a valuable and credible resource for mountain bike trails across the country.”

The partnership between the USGS and the MTB Project is considered a big move towards getting high quality trail data on The National Map and US Topo quadrangles. The collaboration also highlights private and public sectors working together to provide trails data and maps to the public.

“This is a significant step for USGS,” said Brian Fox of the USGS NGTOC. “National datasets of trails do not yet exist, and in many areas even local datasets do not exist. Finding, verifying, and consolidating data is expensive. Partnering with non-government organizations that collect trails data through crowdsourcing is a great solution. The USGS-IMBA agreement is the first example of such a partnership for US Topo map feature content and we're looking forward to expanding the number of trails available as the MTB Project contributions grow.”

Disclaimer: Any use of trade, firm or product names does not imply endorsement by the U.S. Government. No warranty, expressed or implied, is made by the USGS or the U.S. Government as to the accuracy and functioning of the commercial software programs cited in this Technical Announcement, and the U.S. Government shall not be held liable for improper or incorrect use of the USGS National Map Topographic Data employing these software programs.

The Bumble Bee, Arizona US Topo map, showing the Black Canyon Trail in Arizona (dotted line, near center of the map, left of Crown King Road). USGS US Topo maps featuring IMBA trail data will be a valuable asset to recreational users, land managers, and scientists.

Summary: The U.S. Geological Survey National Geospatial Program is pleased to announce a new version of the USGS Lidar Base Specification that defines deliverables for nationally consistent lidar data acquisitions.

The U.S. Geological Survey National Geospatial Program is pleased to announce a new version of the USGS Lidar Base Specification that defines deliverables for nationally consistent lidar data acquisitions. The USGS Lidar Base Specification provides a common base specification for all lidar data acquired for the 3D Elevation Program (3DEP) component of The National Map. The primary goal of 3DEP is to systematically collect nationwide 3D elevation data over an 8-year period.

“Because we are acquiring data nationally for 3DEP with many partners, we need to have a way to ensure all of our requirements are being met, while minimizing the potential for problems with interoperability between various disparate data collections,” said Jason Stoker, Elevation Product and Services Lead for the USGS National Geospatial Program. “The USGS Lidar Base Specification helps everyone understand exactly what data we need and exactly how we need it, so we can be as efficient as possible. This new version incorporates many of the lessons we have learned since putting together version 1.0, and sets the stage for future quality 3DEP data collections.”

Originally released as a draft in 2010 and formally published in 2012, the USGS–NGP Lidar Base Specification Version 1.0 was quickly embraced as the foundation for numerous state, county, and foreign country lidar specifications. Lidar is a fast-evolving technology, and much has changed in the industry since the final draft of the Lidar Base Specification Version 1.0 was written.

Lidar data have improved in accuracy and spatial resolution, geospatial accuracy standards have been revised by the American Society for Photogrammetry and Remote Sensing (ASPRS), industry standard file formats have been expanded, additional applications for lidar have become accepted, and the need for interoperable data across collections has been realized. This revision to the Lidar Base Specification, known as Version 1.2, addresses those changes and provides continued guidance towards a nationally consistent lidar dataset.

WARREN, MICH. – The Michigan State Police, Emergency Management and Homeland Security Division and Federal Emergency Management Agency (FEMA) report more than 122,000 southeast Michigan residents affected by the August floods have registered for assistance and nearly $216 million in federal disaster assistance has been approved. Survivors are strongly encouraged to register for FEMA assistance by the Dec. 14 deadline.