Science and Technology Reports: News from the GAO
http://www.gao.gov/
Tue, 03 Mar 2015 15:31:28 -0500

Federal Research: DOE Is Addressing Invention Disclosure and Other Challenges but Needs a Plan to Guide Data Management Improvements, January 30, 2015
http://www.gao.gov/products/GAO-15-212
What GAO Found
The U.S. Department of Energy (DOE) provided at least $11 billion ($12 billion in fiscal year 2014 dollars) in research and development funding to contractors for fiscal years 2009 through 2013. Contractors reported about 5,800 inventions and 700 patents developed with DOE funding during this period. To ensure disclosure of these agency-funded inventions, DOE relies primarily on contractor self-reporting and on financial assistance award closeout procedures. Contractors are generally required to adhere to specific time frames for invention disclosure. Once a contractor discloses an invention, DOE patent counsel monitor it through the end of the financial assistance award to ensure the contractor complies with the time frame requirements for electing to retain ownership of the invention and for applying for patent protection.
DOE faces challenges in (1) ensuring that contractors disclose agency-funded inventions and (2) managing information related to these disclosures, and it is taking steps to address both.
Limited ability to ensure invention disclosure after funding ends: DOE does not have a documented process to ensure that contractors disclose inventions after financial assistance awards end. To address this, DOE recently began two pilot efforts to determine the extent of undisclosed inventions: one audits a sample of previously completed financial assistance awards, and the other cross-references U.S. Patent and Trademark Office data against DOE information on inventions it funded. DOE is still implementing these efforts but has reported identifying more than 100 potentially undisclosed inventions. According to DOE patent counsel, DOE will assess the results of the pilots to determine whether to continue them.
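The cross-referencing pilot described above amounts to a set comparison: patents that cite DOE funding but have no matching disclosure get flagged for follow-up. The sketch below is purely illustrative; the record fields, function name, and sample data are hypothetical and do not reflect actual DOE or USPTO data formats.

```python
# Hypothetical sketch of the cross-referencing pilot: compare patent records
# that cite DOE funding against DOE's own list of disclosed inventions, and
# flag any patent with no matching disclosure. All fields and data are invented.

def find_undisclosed(uspto_patents, doe_disclosures):
    """Return patents citing DOE funding that have no matching disclosure."""
    disclosed_ids = {d["patent_number"] for d in doe_disclosures}
    return [p for p in uspto_patents
            if p["cites_doe_funding"] and p["patent_number"] not in disclosed_ids]

uspto_patents = [
    {"patent_number": "US-1111111", "cites_doe_funding": True},
    {"patent_number": "US-2222222", "cites_doe_funding": True},
    {"patent_number": "US-3333333", "cites_doe_funding": False},
]
doe_disclosures = [{"patent_number": "US-1111111"}]

flagged = find_undisclosed(uspto_patents, doe_disclosures)
print([p["patent_number"] for p in flagged])  # ['US-2222222']
```

A production version would need fuzzy matching on assignee names and contract numbers rather than exact identifiers, which is presumably why DOE is running this as a pilot.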
Data management limitations: DOE faces a challenge in managing information related to agency-funded inventions because it relies on two different data systems that are outdated, cannot communicate with each other, and do not allow electronic reporting. Under federal internal control standards, information should be recorded and communicated to management and others within the entity who need it, in a form and within a time frame that enables them to carry out their responsibilities. DOE is updating its data systems and planning the development of an electronic reporting function, but it has not established an implementation plan with milestones against which it can track its progress. Developing such a plan would give DOE greater assurance that it is making timely progress on these efforts.
In addition, DOE faces challenges in its ability to monitor and influence the utilization and domestic manufacture of inventions it funded, which it must do to protect its interests in them. DOE has proposed regulatory changes to address these challenges that would (1) require contractors to report on the utilization and domestic manufacture of agency-funded inventions, (2) allow DOE to assess manufacturing plans as criteria for funding decisions, and (3) require contractors to obtain DOE authorization for changes in their control, including ownership, under certain circumstances. According to patent counsel, DOE expects to finalize these regulatory changes in fiscal year 2015.
Why GAO Did This Study
DOE provides funding to contractors for research and development of new technologies. To incentivize participation in federal research projects and promote the use of federally funded inventions, the 1980 Bayh-Dole Act and other laws and regulations allow contractors receiving federal research and development funds to retain ownership of inventions they create so long as they adhere to certain requirements, including disclosing inventions developed with agency funding. DOE's ability to protect its interests in these inventions—including their utilization and domestic manufacture—depends on its knowledge of their existence.
GAO was asked to review DOE's efforts to protect its interests in agency-funded inventions. This report examines (1) DOE funding for contractor research for fiscal years 2009 through 2013 and how DOE ensures that contractors disclose agency-funded inventions, (2) the challenges DOE faces in ensuring invention disclosure and the actions it is taking to address them, and (3) the challenges DOE faces in protecting its interests in these inventions and the actions it is taking to address them. GAO reviewed laws, regulations, and other documents and interviewed DOE patent counsel responsible for intellectual property issues, representatives of organizations that facilitate the development of federally funded technology, and others.
What GAO Recommends
GAO recommends that DOE develop an implementation plan with milestones for improving its data management systems. DOE agreed with this recommendation.
For more information, contact John Neumann at (202) 512-3841 or neumannj@gao.gov.
Mon, 02 Mar 2015 12:00:00 -0500 | Letter Report

Small Business Innovation Research: Change in Program Eligibility Has Had Little Impact, November 20, 2014
http://www.gao.gov/products/GAO-15-68
What GAO Found
Two of the 11 agencies participating in the Small Business Administration's (SBA) Small Business Innovation Research (SBIR) program, the Department of Health and Human Services (HHS) and the Department of Energy (DOE), opted to open part of their SBIR programs to small businesses that are majority-owned by multiple venture capital or similar firms (majority-owned portfolio companies), allowing such companies to apply for and receive SBIR awards. Specifically, HHS's National Institutes of Health (NIH) and DOE's Advanced Research Projects Agency-Energy (ARPA-E) opted to allow such companies to participate. For fiscal years 2013 and 2014, NIH and ARPA-E collectively received 20 applications from majority-owned portfolio companies and made 12 SBIR awards to them, totaling about $7.9 million. These applications and awards comprise less than 1 percent of NIH's and ARPA-E's SBIR applications and awards. NIH and ARPA-E officials said the change to allow majority-owned portfolio companies to apply for SBIR awards helps ensure that their SBIR programs receive the best research proposals.
For various reasons, the remaining nine agencies participating in SBIR have not submitted a written determination to allow them to make SBIR awards to majority-owned portfolio companies. According to officials from these agencies, they did not conduct any formal analysis but considered various factors, such as whether the change would significantly increase the number of applications, what administrative resources would be required to implement the change, and whether they had the evidence needed to prepare a written determination. All but one of the agencies told GAO that they may reevaluate their decision in the future, but did not have any specific plans for doing so. Officials from several agencies said that they wanted to see how the change in eligibility affected NIH and ARPA-E before implementing the change at their agencies.
GAO also found that some agencies viewed the written determination as a potentially stringent requirement. For their written determinations, NIH and ARPA-E did not conduct any independent research on majority-owned portfolio companies (nor were they specifically required to do so), although NIH cited related research. In contrast, six agencies viewed the written determination as potentially requiring independent analysis. Five agencies told GAO that they did not have the evidence or research needed to support a written determination, and another agency said it might consider opting in if doing so were easier. According to SBA, the written determination is a notification letter that SBA reviews but does not approve or deny. SBA officials said they meet routinely with SBIR program managers and that this issue has not been raised. SBA updated its SBIR Policy Directive to include the written determination requirement but essentially used the same language as the reauthorization act, without providing any specific guidance. In SBA's rule implementing the reauthorization act, SBA stated that the rule's potential benefit is to provide more businesses with access to the SBIR program, which could increase competition, improve the quality of proposals, and spur innovation. SBA is not responsible for encouraging or discouraging agencies' expansion of eligibility to include such companies, but it also has not discussed the issue with them. As a result, SBA could be missing an opportunity to help agencies better understand the evidence required for the written determination, which could inform the agencies' decisions about whether to expand their programs.
Why GAO Did This Study
The SBIR program provides grants and contracts to small businesses to develop and commercialize innovative technologies. The 2011 SBIR reauthorization included a provision that gave agencies the option to allow majority-owned portfolio companies to participate in SBIR. SBA issued a rule to implement the statutory provision, which became effective in January 2013. The reauthorization act requires agencies to submit a written determination to SBA and Congress, explaining how such awards will, among other things, significantly contribute to the agency's mission, before making SBIR awards to majority-owned portfolio companies.
The reauthorization act mandated that GAO review the impact of this provision every 3 years. This is the first report under the mandate, and it examines (1) the impact of allowing majority-owned portfolio companies to participate in agency SBIR programs and (2) the extent to which agencies have elected to expand their SBIR programs to include majority-owned portfolio companies. GAO reviewed agency rules, policies, and other documents; analyzed SBIR data; and interviewed program officials from SBA and the 11 participating agencies, industry associations, and majority-owned portfolio companies.
What GAO Recommends
GAO recommends that SBA discuss the written determination requirement with participating agencies and, if needed, provide additional guidance. SBA generally agreed with the recommendation and plans to discuss the written determination requirement at a future program managers meeting.
For more information, contact Cindy Brown Barnes at (202) 512-8678 or brownbarnesc@gao.gov.
Thu, 20 Nov 2014 12:00:00 -0500 | Letter Report

Technology Transfer: Federal Laboratory Consortium Should Increase Communication with Potential Customers to Improve Initiatives, October 03, 2014
http://www.gao.gov/products/GAO-15-127
What GAO Found
The Federal Laboratory Consortium for Technology Transfer (FLC) has taken steps to communicate with potential customers, including small businesses and entrepreneurs, but has not obtained feedback from them to assess their needs when designing and implementing technology transfer clearinghouse initiatives. This has resulted in missed opportunities to better meet potential customers' needs. For example, in 2012, when developing a web-based search tool to help potential customers identify relevant federal technology transfer opportunities across federal laboratories (labs), FLC discussed how to implement the tool with its federal member labs and agencies. However, FLC did not assess the information needs of potential customers to ensure the tool would provide relevant information in a format that customers consider useful, as called for by leading practices and federal internal control standards on communicating with and obtaining information from stakeholders. FLC officials said they conducted testing to ensure the new website functioned as intended before launching it, but they did not involve potential customers in these tests. Moreover, after developing the tool, FLC did not collect feedback from potential customers, consistent with leading practices, on the extent to which the tool met their needs or how it might be improved before implementing it. Potential customers of FLC's initiatives expressed concerns about the extent to which the recent web-based search tool would meet their needs, specifically noting that the tool:
provides limited information to facilitate personal interaction between federal researchers and customers, despite the importance of spontaneous idea sharing to facilitate technology transfer;
provides limited information on the full range of technology transfer opportunities, focusing instead on federally patented technologies;
affords customers limited ability to compare technologies across labs; and
provides limited information on the market relevance of a given technology.
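The shortcomings listed above can be illustrated with a toy model of a keyword-only search tool. Everything here is invented for illustration; the data model, entries, and function do not reflect FLC's actual tool or data.

```python
# Hypothetical illustration of the clearinghouse gaps listed above: a tool
# that returns per-lab records by keyword, but carries no market-relevance
# information and offers no cross-lab comparison. All data is invented.

records = [
    {"lab": "Lab A", "title": "Thin-film coating", "patented": True,  "market_notes": None},
    {"lab": "Lab B", "title": "Thin-film coating", "patented": True,  "market_notes": None},
    {"lab": "Lab C", "title": "Sensor algorithm",  "patented": False, "market_notes": None},
]

def search(keyword):
    """Keyword lookup over titles -- the kind of search the tool supports."""
    return [r for r in records if keyword.lower() in r["title"].lower()]

hits = search("thin-film")
print(len(hits))  # 2 matching records, returned lab by lab

# A customer who wants to compare the two labs' offerings or judge market
# relevance gets no support: that information simply isn't in the results.
print(all(r["market_notes"] is None for r in hits))  # True
```

The point of the sketch is that comparison and market-relevance features are data-model questions, not just interface questions, which is why GAO ties the gaps back to assessing customer information needs up front.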
Given the relatively small size of FLC's annual budget and staff, FLC faces challenges in communicating with potential customers without also engaging its agency and lab members. By working collaboratively with agency and lab members to collect feedback, FLC can enlist their help in enhancing the information provided through its initiatives.
FLC collects data on the use of its clearinghouse initiatives but has not developed and used performance goals and measures consistent with federal agency leading practices. For example, FLC collects data on the general use of its clearinghouse initiatives, such as the number of technology transfer inquiries it receives, the number of unique views of its web pages, and the average time spent on a web page. However, FLC has not developed performance goals or measures related to the key strategic goals to which its clearinghouse initiatives contribute. Without performance measures, FLC is unable to determine whether its initiatives are having their desired effect or how their performance might be improved. FLC also cannot fully demonstrate in its annual report to Congress its progress toward the achievement of its relevant strategic goals, limiting the information that the administration and Congress receive on the effectiveness of FLC's initiatives.
Why GAO Did This Study
The federal government spends about one-third of its annual $145 billion research and development budget at hundreds of federal agency labs. Technology generated by this research may have application beyond agencies' immediate goals if commercialized by the private sector. For example, federal research has contributed to innovative products, including antibiotics and the Internet. FLC—a nationwide consortium of federal labs—helps labs transfer technology to the private sector. In recent years, FLC created new initiatives to provide a clearinghouse—a central point for collecting and disseminating information—for technology transfer opportunities.
GAO was asked to review FLC's efforts to provide information on technology transfer opportunities. This report assesses (1) the extent to which FLC has communicated with potential customers when designing and implementing its clearinghouse initiatives and (2) how FLC has measured the results of those initiatives. GAO reviewed relevant laws and FLC guidance and interviewed, among others, a nonprobability sample of officials from four federal agencies with the highest research budgets and a spectrum of eight customer groups.
What GAO Recommends
GAO recommends, among other things, that FLC work collaboratively with agency and lab members to increase communication with potential customers to obtain feedback and improve its clearinghouse initiatives, and develop performance measures. FLC generally agreed with the report's findings and recommendations.
For more information, contact John Neumann at (202) 512-3841 or neumannj@gao.gov.
Fri, 03 Oct 2014 13:00:00 -0400 | Letter Report

Federally Funded Research Centers: Agency Reviews of Employee Compensation and Center Performance, August 11, 2014
http://www.gao.gov/products/GAO-14-593
What GAO Found
The 30 federally funded research and development centers (FFRDC) sponsored by the Department of Energy (DOE), Department of Defense (DOD), and National Science Foundation (NSF) received nearly $84 billion in total funding for fiscal years 2008 through 2012. Of these 30 centers, the 16 sponsored by DOE received about 79 percent of this funding, according to GAO's analysis of sponsoring agencies' responses to a GAO survey on FFRDC funding and compensation. During this period, DOE obligated about 34 percent of its budget to the FFRDCs it sponsored, while DOD and NSF devoted less than 1 percent and 4 percent of their budgets, respectively. FFRDCs sponsored by these agencies received approximately $15 billion of their total funding from sources other than the sponsoring agency, specifically other federal agencies, nonfederal entities such as state or local governments, and private entities.
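The funding figures above can be turned into rough derived numbers; since the report's totals are rounded, the results below are approximations only.

```python
# Illustrative arithmetic on the report's rounded FFRDC funding figures.

total_funding_bn = 84   # all 30 FFRDCs, FY2008-2012, in billions of dollars
doe_share = 0.79        # share received by the 16 DOE-sponsored centers
non_sponsor_bn = 15     # funding from sources other than the sponsoring agency

doe_funding_bn = total_funding_bn * doe_share
non_sponsor_share = non_sponsor_bn / total_funding_bn

print(f"DOE-sponsored centers: ~${doe_funding_bn:.0f} billion")   # ~$66 billion
print(f"Non-sponsor funding: ~{non_sponsor_share:.0%} of total")  # ~18% of total
```

So roughly two-thirds of all FFRDC funding in the period flowed through DOE-sponsored centers, and a bit under a fifth came from parties other than the sponsor.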
Many FFRDCs sponsored by DOE, DOD, and NSF spent over half of their total funding on employee compensation, and the three agencies had processes in place to review such compensation. For example, the agencies reviewed senior executive compensation to ensure that they did not reimburse FFRDC contractors in excess of the cap set in statute. All three agencies also have processes in place to document the total reimbursed compensation for senior executives against the cap, although DOE changed its policy during the course of GAO's work. In May 2014, DOE updated its policy on executive compensation to require documentation of compensation subject to the cap, a requirement that was not in place before that date. DOE officials noted that this change was due, in part, to congressional action in December 2013 that reduced the cap from $952,308 to $487,000.
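The cap check described above is mechanically simple: reimbursement of an executive's compensation is limited to the statutory cap, and anything above it is unreimbursable. A minimal sketch, using the December 2013 cap figure from the report; the salary inputs are hypothetical.

```python
# A minimal sketch of the statutory compensation-cap check. The $487,000 cap
# is the post-December-2013 figure cited in the report; salaries are invented.

CAP = 487_000  # statutory reimbursement cap after December 2013, in dollars

def reimbursable(compensation: int, cap: int = CAP) -> int:
    """Portion of an executive's compensation a sponsor may reimburse."""
    return min(compensation, cap)

print(reimbursable(450_000))  # 450000 (fully under the cap)
print(reimbursable(600_000))  # 487000 (capped; the remaining $113,000
                              # is not reimbursable with federal funds)
```

Documenting compensation against the cap, as DOE's May 2014 policy now requires, is what makes this check auditable rather than assumed.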
DOE, DOD, and NSF assess the performance of FFRDCs through three types of reviews: (1) comprehensive reviews, which the Federal Acquisition Regulation (FAR) requires at least every 5 years; (2) annual performance reviews; and (3) other review activities, such as day-to-day oversight. DOE, DOD, and NSF conducted timely comprehensive reviews of the use of and need for the FFRDCs they sponsored in most cases and, in all cases, recommended continuing the FFRDCs they sponsor. The FAR describes five elements the review should include, and DOE, DOD, and NSF generally included these elements, with varying levels of detail in keeping with the flexibilities the FAR provides. These agencies also have procedures to annually review and document the performance of the FFRDCs they sponsor, and many of these reviews use surveys of federal officials who interact with the centers. In addition, officials from DOE, DOD, and NSF told GAO that they engage in other day-to-day oversight activities, such as observing work and meeting with contractor employees, to help them assess FFRDC performance.
Why GAO Did This Study
Federal agencies sponsor 40 FFRDCs for research and development tasks that are integral to their missions. DOE, DOD, and NSF sponsor the largest number of FFRDCs—16, 10, and 4 centers, respectively. Federal agencies sponsor FFRDCs by contracting with nonprofit, university-affiliated, or private industry operators. Federal statute and regulations provide for reimbursements for compensation for FFRDC contractor employees and require that sponsoring agencies evaluate the use and need for their FFRDCs.
GAO was asked to review the management and oversight of FFRDCs. This report (1) describes funding for FFRDCs sponsored by DOE, DOD, and NSF for fiscal years 2008 through 2012; (2) examines compensation for FFRDC employees and these agencies' processes for review of compensation; and (3) determines how these agencies assess FFRDC performance. GAO surveyed the agency sponsors for the 30 FFRDCs, analyzed agency policies and reviews of these FFRDCs, and interviewed agency officials and contractor representatives.
What GAO Recommends
GAO is not making recommendations in this report. DOE, DOD, and NSF reviewed a draft of this report and did not provide formal comments. Technical comments provided by DOE, DOD, and NSF were incorporated, as appropriate.
For more information, contact John Neumann at (202) 512-3841 or jneumann@gao.gov.
Wed, 10 Sep 2014 13:00:00 -0400 | Letter Report

Department of Homeland Security: Actions Needed to Strengthen Management of Research and Development, September 09, 2014
http://www.gao.gov/products/GAO-14-865T
What GAO Found
In September 2012, GAO reported that the Department of Homeland Security (DHS) did not know the total amount its components had invested in research and development (R&D) and did not have policies and guidance for defining R&D and overseeing R&D resources across the department. According to DHS, its Science and Technology Directorate (S&T), Domestic Nuclear Detection Office (DNDO), and Coast Guard were the only components that conducted R&D, and GAO found that these were the only components that reported budget authority, obligations, or outlays for R&D activities to the Office of Management and Budget. However, GAO identified an additional $255 million in R&D obligations made by other DHS components. At the time of GAO's review, DHS did not have a department-wide policy defining R&D or guidance directing components how to report all R&D activities. GAO recommended that DHS develop policies and guidance to assist components in better understanding how to report R&D activities and to better position DHS to determine its R&D investments. DHS concurred with the recommendation and, as of September 2014, had updated its guidance to include a definition of R&D, but its efforts to develop a process for coordinating R&D with other offices remained ongoing. GAO will continue to monitor DHS's efforts to develop its approach for overseeing R&D at the department.
GAO also reported in September 2012 that S&T had taken some steps to coordinate R&D efforts across DHS, but the department's R&D efforts were fragmented and overlapping, which increased the risk of unnecessary duplication. GAO recommended that DHS develop a policy defining roles and responsibilities for coordinating R&D and establish a mechanism to track all R&D projects to help DHS mitigate existing fragmentation and overlap and reduce the risk of unnecessary duplication. DHS concurred with the recommendation. As of September 2014, S&T has not fully implemented new policy guidance but, according to S&T, is conducting portfolio reviews across the department, as directed by the fiscal year 2013 appropriations act, aimed at coordinating R&D activities. GAO will continue to monitor DHS's efforts to develop a policy to better coordinate and track R&D activities at the department.
In September 2013, GAO reported that DHS border and maritime R&D components reported producing 97 R&D deliverables from fiscal years 2010 through 2012 at an estimated cost of $177 million. GAO found that the type of border and maritime R&D deliverables produced by S&T, the Coast Guard, and DNDO varied, and R&D customers GAO met with had mixed views on the impact of the deliverables. These deliverables included knowledge products and reports, technology prototypes, and software. For example, S&T developed prototype radar and video systems for use by Border Patrol. However, GAO reported that S&T had not established time frames and milestones for collecting and evaluating feedback on the extent to which deliverables met customers' needs. GAO recommended that S&T establish time frames and milestones for collecting and evaluating such feedback from its customers to better determine the usefulness and impact of its R&D projects and make better-informed decisions regarding future work. As of September 2014, DHS had taken steps to address this recommendation, including making plans to gather customer feedback on a more consistent basis. GAO will continue to monitor DHS's efforts in this area.
Why GAO Did This Study
Conducting R&D on technologies for detecting, preventing, and mitigating terrorist threats is vital to enhancing the security of the nation. Since its creation, DHS has spent billions of dollars researching and developing technologies used to support its missions. Within DHS, S&T conducts R&D and is responsible for coordinating it across the department. Other components also conduct R&D to support their respective missions.
This statement discusses (1) how much DHS invests in R&D and the extent to which DHS has policies and guidance for defining and overseeing its R&D efforts across the department, (2) the extent to which R&D is coordinated across DHS, and (3) the results of DHS border and maritime security R&D efforts and the extent to which DHS has obtained and evaluated feedback on these efforts. This statement is based on GAO's previously issued work from September 2012 to July 2014, and selected updates conducted in September 2014 on the status of GAO's prior recommendations. To conduct the updates, GAO reviewed agency documentation.
What GAO Recommends
In its prior reports, GAO recommended, among other things, that DHS develop policies and guidance for defining, overseeing, coordinating, and tracking R&D activities across the department, and that S&T establish time frames and milestones for collecting and evaluating feedback from its customers. DHS concurred with GAO's recommendations and has actions underway to address them.
For more information, contact Dave Maurer at (202) 512-9627 or maurerd@gao.gov.
Tue, 09 Sep 2014 13:00:00 -0400 | Testimony

Department of Homeland Security: Continued Actions Needed to Strengthen Oversight and Coordination of Research and Development, July 31, 2014
http://www.gao.gov/products/GAO-14-813T
What GAO Found
In September 2012, GAO reported that the Department of Homeland Security (DHS) did not know the total amount its components invested in research and development (R&D) and did not have policies and guidance for defining R&D and overseeing R&D resources across the department. According to DHS, its Science and Technology Directorate (S&T), Domestic Nuclear Detection Office (DNDO), and Coast Guard were the only components that conducted R&D, and GAO found that these were the only components that reported budget authority, obligations, or outlays for R&D activities to the Office of Management and Budget. However, GAO identified an additional $255 million in R&D obligations made by other DHS components. At the time of GAO's review, DHS reported it was difficult to identify all R&D investments across the department because DHS did not have a department-wide policy defining R&D or guidance directing components how to report all R&D activities. GAO recommended that DHS develop policies to assist components in better understanding how to report R&D activities and to better position DHS to determine its R&D investments. DHS concurred with the recommendation and, as of July 2014, had updated its guidance to include a definition of R&D but had not yet determined the most effective path to guide R&D across the department. GAO will continue to monitor DHS's efforts to develop its approach for overseeing R&D at the department.
GAO also reported in September 2012 that S&T had taken some steps to coordinate R&D efforts across DHS, but the department's R&D efforts were fragmented and overlapping, which increased the risk of unnecessary duplication. GAO recommended that DHS develop a policy defining roles and responsibilities for coordinating R&D and establish a mechanism to track all R&D projects to help DHS mitigate existing fragmentation and overlap and reduce the risk of unnecessary duplication. DHS concurred with the recommendation. As of July 2014, S&T has not developed new policy guidance but is conducting portfolio reviews across the department, as directed by the fiscal year 2013 appropriations act, aimed at coordinating R&D activities. GAO will continue to monitor DHS's efforts to develop a policy to better coordinate and track R&D activities at the department.
In September 2013, GAO reported that DHS border and maritime R&D components reported producing 97 R&D deliverables from fiscal years 2010 through 2012 at an estimated cost of $177 million. GAO found that the type of border and maritime R&D deliverables produced by S&T, the Coast Guard, and DNDO varied, and R&D customers GAO met with had mixed views on the impact of the deliverables. These deliverables included knowledge products and reports, technology prototypes, and software. For example, S&T developed prototype radar and video systems for use by Border Patrol. However, GAO reported that S&T had not established time frames for collecting and evaluating feedback on the extent to which deliverables met customers' needs. GAO recommended that S&T collect such feedback from its customers to better determine the usefulness and impact of its R&D projects and deliverables and make better-informed decisions regarding future work. As of July 2014, DHS had taken steps to address this recommendation, including making plans to gather customer feedback. GAO will continue to monitor DHS's efforts in this area.
Why GAO Did This Study
Conducting R&D on technologies for detecting, preventing, and mitigating terrorist threats is vital to enhancing the security of the nation. Since its creation, DHS has spent billions of dollars researching and developing technologies used to support its missions, including securing the border and detecting nuclear material, among others. Within DHS, S&T conducts R&D and is responsible for coordinating it across the department. Other components also conduct R&D to support their respective missions.
This statement discusses (1) how much DHS invests in R&D and the extent to which DHS has policies and guidance for defining and overseeing its R&D efforts across the department, (2) the extent to which R&D is coordinated across DHS, and (3) the results of DHS border and maritime security R&D efforts and the extent to which DHS has obtained feedback on these efforts. This statement is based on GAO's previously issued work from September 2012 to September 2013, and selected updates conducted in July 2014 on the status of GAO's prior recommendations. To conduct the updates, GAO reviewed agency documentation.
What GAO Recommends
In its prior reports, GAO recommended, among other things, that DHS develop policies and guidance for defining, overseeing, coordinating, and tracking R&D activities across the department; and that S&T collect and evaluate feedback from its customers. DHS concurred with GAO's recommendations and has actions underway to address them.
For more information, contact Dave Maurer at (202) 512-9627 or maurerd@gao.gov.
Thu, 31 Jul 2014 13:00:00 -0400 | Testimony

Combating Nuclear Smuggling: Past Work and Preliminary Observations on Research and Development at the Domestic Nuclear Detection Office, July 29, 2014
http://www.gao.gov/products/GAO-14-783T
What GAO Found
GAO has reported on the Department of Homeland Security's (DHS) Domestic Nuclear Detection Office (DNDO) since 2006. GAO has identified challenges and made recommendations in the following areas:
DNDO's efforts to develop the Global Nuclear Detection Architecture (GNDA): In 2008, GAO recommended that DHS develop a strategic plan to guide the development of the GNDA, a framework for 74 independent programs, projects, or activities to detect and interdict nuclear smuggling. In 2010, DHS issued a plan, which GAO reviewed and found generally addressed GAO's recommendation.
DNDO's efforts to replace radiation detection equipment: GAO has found challenges in DNDO's efforts to develop and deploy radiation portal monitors, which scan for nuclear or radiological materials at ports of entry. GAO has made several recommendations throughout the history of these efforts, and DNDO has taken actions that have generally been responsive.
DHS's efforts to coordinate research and development (R&amp;D) across the agency: In 2012 and 2013, GAO made recommendations to help DHS oversee its R&amp;D investments and efforts, and in particular its border and maritime R&amp;D efforts. GAO's recommendations focused on strengthening coordination and defining R&amp;D across the agency. DHS concurred with GAO's recommendations and described actions it plans to take in response.
Preliminary observations from GAO's ongoing review are that DNDO has taken steps to manage R&amp;D and assess project outcomes, but that it may not be able to demonstrate how agency investments align with critical mission needs. DNDO officials told GAO that they discuss how research projects may contribute to critical mission needs but that they do not document these discussions. Once research projects are complete, DNDO officials told GAO they evaluate the success of individual research projects, but DNDO does not have a systematic approach to ensure its overall R&amp;D investments address gaps in the GNDA. As a result, DNDO may not be able to demonstrate to key stakeholders—including oversight organizations and potential users of new technologies—that its R&amp;D investments are aligned with critical mission needs.
GAO's ongoing work indicates that DNDO officials have taken some steps to coordinate R&amp;D efforts internally, with other federal agencies, and with end users, but preliminary analysis shows that not all of DNDO's end users are satisfied with DNDO's communication. DNDO directorates work closely to identify critical mission needs, and DNDO collaborates with other federal research agencies to leverage expertise. However, DNDO's end users varied in their satisfaction with DNDO's efforts to coordinate with them. Officials from two end user agencies told GAO that coordination was working well; however, officials from the largest end user agency stated that they were generally dissatisfied with DNDO's coordination because DNDO's research directorate does not provide them information directly and, in some cases, found that project requirements would not meet the agency's operational needs. This is consistent with GAO's 2010 finding that inadequate communication caused DNDO to pursue scanning technology that would not meet the operational requirements of the end user if it were deployed.
Why GAO Did This Study
Preventing terrorists from using nuclear or radiological material to carry out an attack in the United States is a top national priority. Within DHS, DNDO's mission is to (1) improve capabilities to deter, detect, respond to, and attribute attacks, in coordination with domestic and international partners, and (2) conduct R&amp;D on radiation and nuclear detection devices. GAO has reported on progress and challenges in DNDO's efforts since 2006 and is currently reviewing DNDO's planning and prioritization of its R&amp;D investments.
This testimony discusses GAO's past work on DNDO's efforts to develop the GNDA and deploy radiation detection equipment and DHS's efforts to coordinate R&amp;D across the agency, as well as preliminary observations from GAO's ongoing review of DNDO's research directorate's efforts to (1) manage its R&amp;D investments to align with critical mission needs and (2) coordinate its R&amp;D efforts internally, with other federal research agencies, and with end users of the technology it develops.
To conduct its ongoing review, GAO analyzed DHS documents and data related to how DNDO plans and prioritizes its R&amp;D program, and interviewed officials on coordinating R&amp;D.
GAO is not making any new recommendations in this statement. As GAO continues to complete its ongoing work, it will consider the need for any new recommendations as appropriate. DHS provided technical comments, which were incorporated as appropriate.
For more information, contact David C. Trimble at (202) 512-3841 or trimbled@gao.gov.
Tue, 29 Jul 2014 13:00:00 -0400
Testimony
Small Business Innovation Research: DOD's Program Has Developed Some Technologies that Support Military Users, but Lacks Comprehensive Data on Transition Outcomes, July 23, 2014
http://www.gao.gov/products/GAO-14-748T
What GAO Found
Transitioning technologies from defense research and technology development programs, such as through the Small Business Innovation Research (SBIR) program, to military users has been a long-standing challenge for the Department of Defense (DOD). Over the past decade, Congress and DOD have taken several steps to address transition challenges in DOD's SBIR program. For example, the military departments can offer additional SBIR funding to certain awardees to supplement or extend technology development projects in order to move them closer to transition. Additionally, each of the military departments has a network of transition facilitators who work directly with small businesses, military research laboratories, and the acquisition community to foster transition opportunities. Further, in fiscal year 2012, Congress provided federal agencies the opportunity to use more of their SBIR funding (up to 3 percent) for program administrative purposes, including activities that facilitate transition. However, at times, promising technologies are not adopted because their potential has not been adequately demonstrated, they do not meet military requirements, or users are unable to fund the final stages of development and testing.
GAO found that DOD's SBIR program has developed some technologies that successfully transitioned into acquisition programs or fielded systems, but the extent of transition is unknown because comprehensive and reliable transition data are not collected. The military departments collect information on selected transition “success stories” on a somewhat ad hoc basis from SBIR program officials, acquisition program officials, prime contractors, or directly from small businesses. In addition to these less formal transition tracking efforts, the military departments use, to varying degrees, two data systems—Company Commercialization Reports and the Federal Procurement Data System-Next Generation—to identify transition results program-wide. While these systems provide high-level commercialization information that the departments use to track progress in achieving overall program goals, the systems have significant gaps in coverage and data reliability concerns that limit their transition tracking capabilities. In addition, the systems are not designed to capture detailed information on acquisition programs, fielded systems, or on projects that did not transition.
The National Defense Authorization Act (NDAA) for fiscal year 2012 directed DOD to begin reporting the number and percentage of SBIR projects that transition into acquisition programs or to fielded systems, among other things. DOD acknowledged that it may need to modify its existing data systems or develop new tools to compile more complete and accurate technology transition data. At the end of 2013, DOD was still assessing how to comply with the new transition reporting requirements, and had not established a specific plan, as GAO had recommended, for how and when it would be able to meet the requirements. In a recent update, DOD officials confirmed that alternatives are still being evaluated and no plan for improving the tracking and reporting of technology transition has been completed. Without better information on technology transition outcomes, questions will remain as to whether the DOD SBIR program is providing the right technologies at the right time to users, using effective approaches to select, develop, and transition technologies, and providing tangible benefits.
Why GAO Did This Study
DOD relies on its research and development community to identify, pursue, and develop new technologies that improve and enhance military operations and ensure technological superiority over adversaries. The SBIR program is a key mechanism for DOD to use small businesses to meet its research and development needs; stimulate technological innovation; foster and encourage participation by minority and disadvantaged persons in technological innovation; and increase private sector commercialization of innovations derived from federal research and development funding. DOD is the largest SBIR participant in the federal government, with over $1 billion spent annually on the program.
This testimony is based primarily on a report GAO issued in December 2013 and addresses: (1) practices the military departments use to facilitate the transition of SBIR technologies, (2) the extent to which these technologies are successfully transitioning to military users, such as weapon system programs or warfighters in the field, and (3) DOD's efforts to meet fiscal year 2012 NDAA transition reporting requirements. This statement draws from the 2013 report and other work GAO has conducted on technology transition activities in DOD's science and technology programs.
For more information, contact Marie A. Mak at (202) 512-4841 or makm@gao.gov.
Wed, 23 Jul 2014 13:00:00 -0400
Testimony
Information Management: The National Technical Information Service's Dissemination of Technical Reports Needs Attention, July 23, 2014
http://www.gao.gov/products/GAO-14-781T
What GAO Found
The Department of Commerce's National Technical Information Service (NTIS) offers a variety of products and information-related services. Its products include a repository of scientific, technical, engineering, and business research reports, which it makes available individually as well as through subscriptions to its reports library. However, from fiscal year 2001 through 2011, costs for NTIS's products exceeded revenue for 10 of the 11 fiscal years, and the agency was financially sustained during this period by services it offered to other federal agencies, such as distribution and order fulfillment and various web-based services. (See figure.)
Net Earned Revenues and Net Costs for National Technical Information Service's Products and Services, Fiscal Years 2001–2011
In addition, about 62 percent of the reports added to NTIS's repository between 1990 and 2011 were older, with publication dates in 2000 or earlier, while about 38 percent were published from 2001 to 2011. However, demand was greater for more recent reports, those published in 2001 or later.
Further, GAO estimated that 74 percent of the reports added to NTIS's collection from fiscal year 1990 through 2011 were available elsewhere, and 95 percent of these were available for free. This calls into question the viability and appropriateness of NTIS's fee-based model for disseminating the reports it collects.
Why GAO Did This Study
NTIS was established by statute in 1950 to collect scientific and technical research reports, maintain a bibliographic record and repository of these reports, and disseminate them to the public. In addition, it provides various information-based services to other federal agencies. NTIS charges fees for its products and services and is required by law to be financially self-sustaining to the greatest extent possible.
GAO was asked to provide a statement summarizing its November 2012 report in which it examined (1) NTIS's operations; (2) the age of and demand for reports added to its repository; and (3) the extent to which these reports are readily available from other public sources. In preparing this statement, GAO relied primarily on its previously published work as well as related updates on actions needed to reduce fragmentation, overlap, and duplication in the federal government.
What GAO Recommends
In its 2012 report, GAO suggested that Congress reassess the appropriateness and viability of the fee-based model under which NTIS operates to determine whether this model should be continued. While the Department of Commerce stated that it did not plan to propose any changes to NTIS's fee-based model, legislation recently introduced in Congress may provide a vehicle for reassessing this model.
For more information, contact Valerie C. Melvin at (202) 512-6304 or melvinv@gao.gov.
Wed, 23 Jul 2014 13:00:00 -0400
Testimony
High-Containment Laboratories: Recent Incidents of Biosafety Lapses, July 16, 2014
http://www.gao.gov/products/GAO-14-785T
What GAO Found
No federal entity is responsible for strategic planning and oversight of high-containment laboratories. Since the 1990s, the number of high-containment laboratories has risen; however, this expansion was not based on a government-wide coordinated strategy. Instead, it was based on the perceptions of individual agencies about the capacity required for their individual missions and the high-containment laboratory activities needed to meet those missions, as well as the availability of congressionally approved funding. As a result of this mode of expansion, there was no research agenda linking all these agencies, even at the federal level, that would allow for a national needs assessment, strategic plan, or coordinated oversight. As GAO last reported in 2013, after more than 12 years it has not been able to find any detailed projections based on a government-wide strategic evaluation of research requirements driven by public health or national security needs. Without this information, there is little assurance of having facilities with the right capacity to meet the nation's needs.
GAO's past work has found a continued lack of national standards for designing, constructing, commissioning, and operating high-containment laboratories. As noted in a 2009 report, the absence of national standards means that the laboratories may vary from place to place because of differences in local building requirements or standards for safe operations. Some guidance exists about designing, constructing, and operating high-containment laboratories. Specifically, the Biosafety in Microbiological and Biomedical Laboratories guidance recommends various design, construction, and operations standards, but GAO's work has found it is not universally followed. The guidance also does not recommend an assessment of whether the suggested design, construction, and operational standards are achieved. As GAO has reported, national standards are valuable not only in relation to new laboratory construction but also in ensuring compliance for periodic upgrades.
No one agency is responsible for determining the aggregate or cumulative risks associated with the continued expansion of high-containment laboratories; according to experts and federal officials GAO interviewed for prior work, the oversight of these laboratories is fragmented and largely self-policing.
On July 11, 2014, the Centers for Disease Control and Prevention (CDC) released a report on the potential exposure to anthrax that described a number of actions CDC plans to take within its responsibilities to avoid another such incident. The June incident occurred when a laboratory scientist inadvertently failed to sterilize plates containing samples of anthrax, prepared with a new method, and transferred them to a facility with lower biosecurity protocols. This incident and the inherent risks of such work highlight the need for a national strategy to evaluate the requirements for high-containment laboratories, set and maintain national standards for such laboratories' construction and operation, and maintain a national strategy for the oversight of laboratories that conduct important work on highly infectious pathogens.
Why GAO Did This Study
Recent biosecurity incidents—such as the June 5, 2014, potential exposure of staff in Atlanta laboratories at the Centers for Disease Control and Prevention (CDC) to live spores of a strain of anthrax—highlight the importance of maintaining biosafety and biosecurity protocols at high-containment laboratories. This statement summarizes the results of GAO's past work on the oversight of high-containment laboratories, those designed for handling dangerous pathogens and emerging infectious diseases. Specifically, this statement addresses (1) the need for governmentwide strategic planning for the requirements for high-containment laboratories, including assessment of their risks; (2) the need for national standards for designing, constructing, commissioning, operating, and maintaining such laboratories; and (3) the oversight of biosafety and biosecurity at high-containment laboratories. In addition, it provides GAO's preliminary observations on the potential exposure of CDC staff to anthrax. For this preliminary work, GAO reviewed agency documents, including a report on the potential exposure, and scientific literature; and interviewed CDC officials.
What GAO Recommends
This testimony contains no new recommendations, but GAO has made recommendations in prior reports to responsible agencies.
For more information, contact Nancy Kingsbury at (202) 512-2700 or kingsburyn@gao.gov.
Wed, 16 Jul 2014 13:00:00 -0400
Testimony
NOAA Aircraft: Aging Fleet and Future Challenges Underscore the Need for a Capital Asset Plan, July 09, 2014
http://www.gao.gov/products/GAO-14-566
What GAO Found
The National Oceanic and Atmospheric Administration (NOAA) within the Department of Commerce has 34 efforts aimed at improving its aircraft asset planning and management, some of which are complete, while others are under way or planned; however, because these efforts are not yet fully implemented, it is too early to determine whether they will reflect the leading practices in capital asset management that have been identified by the Office of Management and Budget (OMB). Among NOAA's initiatives are efforts to enhance its process for scheduling aircraft use among NOAA offices and to develop new aircraft performance metrics. NOAA's efforts also include the development of multiple long-term plans that together are intended to constitute a capital asset plan for aircraft. OMB leading practices encourage agencies to have capital asset plans—which help provide agencies with information and analysis to make long-term decisions about acquiring and managing capital assets—as a part of their strategic planning efforts, but NOAA currently does not have such a plan for its aircraft. NOAA expects to complete its various improvement efforts related to aircraft asset planning and management by fiscal year 2017.
Examples of NOAA Aircraft
NOAA faces challenges in improving its aircraft asset planning and management. NOAA's complex approach to creating a capital asset plan for aircraft may present challenges because it will comprise multiple stand-alone plans, and critical planning information and analysis on different types of assets will be spread across different documents. NOAA is in the early stages of some of these planning efforts and has not yet determined how, or whether, it will link and integrate the plans with one another to ensure that they will serve as a comprehensive plan. NOAA has faced challenges in finalizing a capital planning effort in the past. In 2009, NOAA leadership suspended a planning effort intended to address the agency's future aircraft needs in order to incorporate additional aircraft-related information, according to NOAA officials; the agency subsequently began its current planning effort 4 years later, in 2013. The importance of a capital asset plan is underscored by the significant decisions NOAA faces regarding its aircraft fleet, particularly its two operating P-3 Orion aircraft that are in high demand for hurricane work (see fig.). For example, given that the P-3 Orion aircraft are nearly 40 years old, NOAA faces decisions on whether to invest in additional costly service life extensions or replace the aircraft. Linking and integrating its multiple planning efforts could help NOAA demonstrate that it has a capital asset plan consistent with OMB guidance. Without a capital asset plan in place, NOAA risks making decisions that will not allow the agency to effectively address future challenges.
Why GAO Did This Study
NOAA's aircraft play a critical role in collecting scientific data to help NOAA advance understanding of changes in the environment and manage ocean and coastal resources. NOAA uses its aircraft for a wide range of scientific missions. In fiscal year 2013, NOAA's aircraft flew hundreds of flights and logged about 3,900 flight hours. NOAA officials predict that expanding mission needs will lead to increased demand for aircraft services. To address such challenges, NOAA has been working to improve its capital asset planning and management for aircraft.
A House committee report on the Consolidated and Further Continuing Appropriations Act, 2013, mandated GAO to examine various issues regarding NOAA's aircraft. This report examines (1) the status of NOAA's efforts to improve its aircraft planning and management and the extent to which these efforts reflect leading practices and (2) challenges NOAA faces in improving its aircraft asset planning and management. GAO analyzed aircraft cost and flight hour data from fiscal year 2004 through fiscal year 2013, reviewed agency planning and management documents, and interviewed agency officials. GAO reviewed capital asset planning guidance from OMB to identify leading practices.
What GAO Recommends
GAO recommends that the NOAA Administrator ensure that the agency links and integrates its multiple planning efforts as it finalizes a comprehensive capital asset plan for aircraft. NOAA concurred with the recommendation.
For more information, contact Anne-Marie Fennell at (202) 512-3841 or FennellA@gao.gov.
Wed, 09 Jul 2014 13:00:00 -0400
Letter Report
Advanced Reactor Research: DOE Supports Multiple Technologies, but Actions Needed to Ensure a Prototype Is Built, June 23, 2014
http://www.gao.gov/products/GAO-14-545
What GAO Found
The Department of Energy's (DOE) Office of Nuclear Energy's (NE) approach to advanced reactor research and development (R&amp;D) focuses on three reactor technologies—high-temperature gas-cooled reactors, sodium-cooled fast reactors, and fluoride-salt-cooled high-temperature reactors—but NE is also funding research into other advanced reactor technologies. NE's approach is to conduct research in support of multiple advanced reactor technologies, while collaborating with industry and academia, with the ultimate goal for industry to take the results of NE's research to the next step of development and commercialization. This approach provides several advantages, including flexibility in responding to changes in future U.S. energy policy. Many representatives that GAO talked to from the nuclear power industry and the National Academy of Sciences agree with NE's approach, saying that current policies on controlling greenhouse gas emissions and disposing of nuclear waste do not make a compelling case for choosing a reactor technology to develop. However, others GAO talked to are critical of some of the reactor technologies NE chooses to research, citing economic and technological challenges. The Nuclear Energy Advisory Committee has criticized NE's approach, recommending that NE focus its efforts on a smaller number of technologies to help ensure that a reactor prototype is deployed. To remain aware of industry's R&amp;D needs and international nuclear energy developments, NE regularly collaborates with industry and international organizations.
NE uses internal and external reviews to set program and funding priorities for advanced reactor R&amp;D activities and to evaluate progress toward program goals. For example, NE conducts internal monthly and quarterly reviews to discuss project status, budgets, and technical highlights. Furthermore, NE's R&amp;D efforts are periodically reviewed by external entities, including the Nuclear Energy Advisory Committee. Among the advanced reactor technologies that NE's R&amp;D currently supports, the high-temperature gas-cooled reactor is the technology that is most likely to be deployed and commercialized in the near term, according to an NE planning document. NE officials said this likelihood is based on the wide range of potential industry market applications and because of substantial government investments in the technology's development. NE has been pursuing this technology under the Next Generation Nuclear Plant (NGNP) Project, as established by the Energy Policy Act of 2005 (EPAct 2005). Under EPAct 2005, DOE is to deploy a prototype reactor for NGNP by the end of fiscal year 2021. However, in 2011, DOE decided not to proceed with the deployment phase of this project, citing several barriers. For example, NE and industry have been unable to reach an agreement on a cost-share arrangement to fund the deployment phase because of a disagreement on the applicable cost-share levels and how and when the cost-share would be applied to specific activities or project phases. Although NE continues to conduct R&amp;D for the NGNP Project, it has not developed a strategy to overcome the cost-share issue and other barriers to resuming the deployment phase of the project. Furthermore, DOE has not selected initial reactor design parameters or reported to Congress on an alternative date for making this selection. 
Until it does so, it is unclear when NE will take this next step toward deploying the NGNP prototype reactor, and the project risks not being completed by the targeted date in 2021.
Why GAO Did This Study
NE conducts R&amp;D on advanced nuclear reactor technologies with multiple aims, including (1) improving the economic competitiveness of nuclear technology to ensure that nuclear power continues to play a role in meeting our nation's energy needs; (2) increasing safety; (3) minimizing the risk of nuclear proliferation and terrorism; and (4) addressing environmental challenges, such as reducing greenhouse gas emissions. External groups have been critical of NE for, among other things, how it prioritizes advanced reactor R&amp;D.
GAO was asked to review NE's advanced reactor R&amp;D efforts. This report (1) describes NE's approach to advanced nuclear reactor R&amp;D and (2) examines how NE plans and prioritizes its advanced reactor R&amp;D activities, including deploying an advanced reactor. GAO reviewed laws and reports concerning NE's efforts to develop advanced reactor technologies and interviewed NE officials and a nonprobability sample of companies developing such technology, selected because of their involvement with DOE's R&amp;D efforts.
What GAO Recommends
To better prepare DOE to meet the requirement of EPAct 2005 to deploy the NGNP prototype reactor, GAO recommends that DOE develop a strategy for resuming the NGNP Project and provide a report to Congress updating the status of the project. DOE agreed in principle with GAO's first recommendation and respectfully disagreed with the second. GAO believes these recommendations remain valid as discussed in the report.
For more information, contact Frank Rusco at (202) 512-3841 or ruscof@gao.gov.
Mon, 23 Jun 2014 13:00:00 -0400
Letter Report
Export Controls: NASA Management Action and Improved Oversight Needed to Reduce the Risk of Unauthorized Access to Its Technologies, June 20, 2014
http://www.gao.gov/products/GAO-14-690T
What GAO Found
Weaknesses in the National Aeronautics and Space Administration (NASA) export control policy and implementation of foreign national access procedures at some centers increase the risk of unauthorized access to export-controlled technologies. NASA policies provide Center Directors wide latitude in implementing export controls at their centers. Federal internal control standards call for clearly defined areas of authority and establishment of appropriate lines of reporting. However, NASA procedures do not clearly define the level of center Export Administrator (CEA) authority and organizational placement, leaving it to the discretion of the Center Director. GAO found that 7 of the 10 CEAs are at least three levels removed from the Center Director. Three of these 7 stated that their placement detracted from their ability to implement export control policies by making it difficult to maintain visibility to staff, communicate concerns to the Center Director, and obtain resources; the other four did not express concerns about their placement. However, in a 2013 meeting of export control officials, the CEAs recommended placing the CEA function at the same organizational level at each center for uniformity, visibility, and authority. GAO identified and the NASA Inspector General also reported instances in which two centers did not comply with NASA policy on foreign national access to NASA technologies. For example, during a 4-month period in 2013, one center allowed foreign nationals on a major program to fulfill the role of sponsors for other foreign nationals, including determining access rights for themselves and others. Each instance risks damage to national security. Due to access concerns, the NASA Administrator restricted foreign national visits in March 2013, and directed each center to assess compliance with foreign national access and develop corrective plans. 
By June 2013, six centers identified corrective actions, but only two set time frames for completion and only one planned to assess the effectiveness of actions taken. Without plans and time frames to monitor corrective actions, it will be difficult for NASA to ensure that actions are effective.
NASA headquarters export control officials and CEAs lack a comprehensive inventory of the types and location of export-controlled technologies and NASA headquarters officials have not addressed deficiencies raised in oversight tools, limiting their ability to take a risk-based approach to compliance. Export compliance guidance from the regulatory agencies of State and Commerce states the importance of identifying controlled items and continuously assessing risks. NASA headquarters officials acknowledge the benefits of identifying controlled technologies, but stated that current practices, such as foreign national screening, are sufficient to manage risk and that they lack resources to do more. Recently identified deficiencies in foreign national visitor access discussed above suggest otherwise. Three CEAs have early efforts under way to better identify technologies which could help focus compliance on areas of greatest risk. For example, one CEA is working with NASA's Office of Protective Services Counterintelligence Division to identify the most sensitive technologies at the center to help tailor oversight efforts. Such approaches, implemented NASA-wide, could enable the agency to better target existing resources to protect sensitive technologies.
Why GAO Did This Study
NASA develops sophisticated technologies and shares them with its international partners and others. U.S. export control regulations require NASA to identify and protect its sensitive technology; NASA delegates implementation of export controls to its 10 research and space centers. Recent allegations of export control violations at two NASA centers have raised questions about NASA's ability to protect its sensitive technologies. GAO was asked to review NASA's export control program.
This report assessed (1) NASA's export control policies and how centers implement them, and (2) the extent to which NASA Headquarters and CEAs apply oversight of center compliance with its export control policies. To do this, GAO reviewed export control laws and regulations, NASA export control policies, and State and Commerce export control compliance guidance. GAO also reviewed NASA information on foreign national visits and technical papers and interviewed officials from NASA and its 10 centers as well as from other agencies.
What GAO Recommends
In April 2014, GAO recommended that the NASA Administrator establish guidance to better define the CEA function, establish time frames to implement foreign national access corrective actions and assess results, and establish a more risk-based approach to oversight, among other actions. NASA concurred with all of our recommendations and provided information on actions taken or planned to address them.
For more information, contact Belva Martin at (202) 512-4841 or martinb@gao.gov.

Fri, 20 Jun 2014 13:00:00 -0400 | Testimony

Biosurveillance: Observations on the Cancellation of BioWatch Gen-3 and Future Considerations for the Program, June 10, 2014
http://www.gao.gov/products/GAO-14-267T
What GAO Found
In September 2012, GAO reported that the Department of Homeland Security (DHS) approved the Office of Health Affairs (OHA) acquisition of a next generation biosurveillance technology (Gen-3) in October 2009 without fully following its acquisition processes. For example, the analysis of alternatives (AoA) prepared for the Gen-3 acquisition did not fully explore costs or consider benefits and risk information in accordance with DHS's Acquisition Life-cycle Framework. To help ensure DHS based its acquisition decisions on reliable performance, cost, and schedule information, GAO recommended that before continuing the Gen-3 acquisition, DHS reevaluate the mission need and alternatives. DHS concurred with the recommendation and in 2012 decided to reassess mission needs and conduct a more robust AoA. Following the issuance of the AoA in December 2013, DHS decided in April 2014 to cancel the Gen-3 acquisition and move the technology development back to the Science and Technology Directorate (S&T). According to DHS's acquisition decision memorandum, the AoA did not confirm an overwhelming benefit to justify the cost of a full technology switch to Gen-3. Moreover, DHS officials said the decision to cancel the Gen-3 acquisition was a cost-effectiveness measure, because the system was going to be too costly to develop and maintain in its current form.
GAO's prior work on DHS research and development (R&D) highlights challenges DHS may face in shifting efforts back to S&T and acquiring another biodetection technology. In September 2012, GAO reported that while S&T had dozens of technology transition agreements with DHS components, none of these had yet resulted in a technology developed by S&T being used by a component. At the same time, other DHS component officials GAO interviewed did not view S&T's coordination practices positively. GAO recommended that DHS develop and implement policies and guidance for defining and overseeing R&D at the department, including a well-understood definition of R&D, to provide reasonable assurance of reliable accounting and reporting of R&D resources and activities for internal and external use. S&T agreed with GAO's recommendations, and efforts to address them are ongoing. Addressing these coordination challenges could help to ensure that S&T's technology development efforts meet the operational needs of OHA.
Cancellation of the Gen-3 acquisition also raises potential challenges that the currently deployed Gen-2 system could face going forward. According to DHS officials, DHS will continue to rely on its Gen-2 system as an early indicator of an aerosolized biological attack. However, in 2011, the National Academy of Sciences raised questions about the effectiveness of the currently deployed Gen-2 system. While Gen-2 has been used in the field for over a decade, the National Academy of Sciences reported that information about the technical capabilities of the system, including the limits of detection, is limited. In April 2014, DHS officials also indicated that they will soon need to replace laboratory equipment of the currently deployed Gen-2 system and readjust life cycle costs, since there will be no Gen-3 technology to replace it.
Why GAO Did This Study
DHS's BioWatch program aims to detect the presence of biological agents considered to be at a high risk for weaponized attack in major U.S. cities. Initially, development of a next generation technology (Gen-3) was led by DHS S&T, with the goal of improving upon the currently deployed technology (Gen-2). Gen-3 would have potentially enabled collection and analysis of air samples in less than 6 hours, unlike Gen-2, which can take up to 36 hours to detect and confirm the presence of biological pathogens. Since fiscal year 2007, OHA has been responsible for overseeing the acquisition of this technology. GAO has published a series of reports on biosurveillance efforts, including a report on DHS's Gen-3 acquisition.
In April 2014, DHS cancelled the acquisition of Gen-3 and plans to move development efforts for an affordable automated aerosol biodetection capability, or other enhancements to the BioWatch system, to DHS S&T. This statement addresses (1) observations from GAO's prior work on the acquisition processes for Gen-3, and the current status of the program; (2) observations from GAO's prior work related to DHS S&T and the impact it could have on the BioWatch program; and (3) future considerations for the currently deployed Gen-2 system.
This testimony is based on previous GAO reports issued from 2010 through 2014 related to biosurveillance and research and development, and selected updates obtained from January to June 2014. For these updates, GAO reviewed studies and documents and interviewed officials from DHS and the national labs, which have performed studies for DHS.
For more information, contact Chris Currie at (404) 679-1875 or curriec@gao.gov.

Tue, 10 Jun 2014 13:00:00 -0400 | Testimony

Science, Technology, Engineering, and Mathematics Education: Assessing the Relationship between Education and the Workforce, May 08, 2014
http://www.gao.gov/products/GAO-14-374
What GAO Found
Both the number of science, technology, engineering, and mathematics (STEM) degrees awarded and the number of jobs in STEM fields increased in recent years. The number of degrees awarded in STEM fields grew 55 percent from 1.35 million in the 2002-2003 academic year to over 2 million in the 2011-2012 academic year, while degrees awarded in non-STEM fields increased 37 percent. Since 2004, the number of STEM jobs increased 16 percent from 14.2 million to 16.5 million jobs in 2012, and non-STEM jobs remained fairly steady. The trends in STEM degrees and jobs varied across STEM fields. It is difficult to know if the numbers of STEM graduates are aligned with workforce needs, in part because demand for STEM workers fluctuates. For example, the number of jobs in core STEM fields, including engineering and information technology, declined during the recession but has grown substantially since then.
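As a quick sanity check, the growth rates cited above can be recomputed from the rounded counts in the text (a minimal sketch; the 2.09 million figure is an assumed value for "over 2 million" consistent with 55 percent growth):

```python
def pct_growth(start: float, end: float) -> float:
    """Percentage growth from start to end."""
    return (end - start) / start * 100

# Rounded counts from the text, in millions.
stem_degrees_2003, stem_degrees_2012 = 1.35, 2.09  # 2002-2003 vs. 2011-2012 academic years
stem_jobs_2004, stem_jobs_2012 = 14.2, 16.5

print(f"STEM degrees: +{pct_growth(stem_degrees_2003, stem_degrees_2012):.0f}%")  # ~55%
print(f"STEM jobs:    +{pct_growth(stem_jobs_2004, stem_jobs_2012):.0f}%")        # ~16%
```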
Science, Technology, Engineering, and Mathematics (STEM) Fields
Almost all of the 124 federal postsecondary STEM education programs that responded to GAO's survey reported that they considered workforce needs in some way. For example, the most common program objective was to prepare students for STEM careers. Some of these programs focused on occupations they considered to be in demand and/or related to their agency's mission. Many postsecondary programs also aimed to increase the diversity of the STEM workforce or prepare students for innovation. Most STEM programs reported having some outcome measures in place, but GAO found that some programs did not measure an outcome directly related to their stated objectives. As GAO recommended in 2012, the National Science and Technology Council recently issued guidance to help agencies better incorporate STEM education outcomes into their performance plans and reports. As agencies follow the guidance and focus on the effectiveness of the programs, more programs may measure outcomes directly related to their objectives.
Of the 30 kindergarten through 12th grade (K-12) STEM education programs responding to GAO's survey, almost all reported that they either directly or indirectly prepared students for postsecondary STEM education. For example, one program worked closely with students to provide math and science instruction and supportive services to prepare them for postsecondary STEM education, while another supported research projects intended to enhance STEM learning.
Why GAO Did This Study
Federal STEM education programs help enhance the nation's global competitiveness by preparing students for STEM careers. Researchers disagree about whether there are enough STEM workers to meet employer demand. GAO was asked to study the extent to which STEM education programs are aligned with workforce needs.
GAO examined (1) recent trends in the number of degrees and jobs in STEM fields, (2) the extent to which federal postsecondary STEM education programs take workforce needs into consideration, and (3) the extent to which federal K-12 STEM education programs prepare students for postsecondary STEM education. GAO analyzed trends in STEM degrees and jobs since 2002 using 3 data sets—the Integrated Postsecondary Education Data System, American Community Survey, and Occupational Employment Statistics—and surveyed 158 federal STEM education programs. There were 154 survey respondents (97 percent): 124 postsecondary and 30 K-12 programs. In addition, GAO conducted in-depth reviews—including interviews with federal officials and grantees—of 13 programs chosen from among those with the highest reported obligations.
What GAO Recommends
GAO makes no recommendations in this report. GAO received technical comments from the Departments of Education, Energy, and Health and Human Services; National Science Foundation; and Office of Management and Budget.
For more information, contact Melissa Emrey-Arras at (617) 788-0534 or EmreyArrasM@gao.gov.

Mon, 09 Jun 2014 13:00:00 -0400 | Letter Report

Small Business Research Programs: More Guidance and Oversight Needed to Comply with Spending and Reporting Requirements, June 06, 2014
http://www.gao.gov/products/GAO-14-431
What GAO Found
Agency data indicate that 8 of the 11 agencies participating in the Small Business Innovation Research (SBIR) program and 2 of the 5 agencies participating in the Small Business Technology Transfer (STTR) program complied with spending requirements in fiscal year 2012. Program managers for agencies that did not comply with the requirements identified reasons for noncompliance. For example, program managers at two of the agencies told GAO that they believe their agencies comply with spending requirements if the agencies spend the total amount reserved or budgeted for the program, regardless of the year the funding is spent. However, the authorizing legislation for the programs requires agencies to “expend” a certain amount of funding each year. This difference in the interpretation of spending requirements occurred, in part, because the Small Business Administration's (SBA) policy directives for the programs inaccurately state that the authorizing legislation requires agencies to “reserve” the minimum amount each year. Additionally, some officials told GAO their agencies did not comply with spending requirements because the recent reauthorization of the programs included an increased spending requirement in fiscal year 2012, but the reauthorization was enacted a full quarter into the fiscal year, after some agencies had planned their programs and made awards.
Participating agencies and SBA did not fully comply with certain reporting requirements for the SBIR and STTR programs. For example, participating agencies are required to submit reports to SBA describing their methodologies for calculating their budgets for extramural research or research and development (R&D)—which is generally conducted by nonfederal employees outside of federal facilities—within 4 months of the enactment of appropriations. However, all 11 participating agencies were late in submitting these reports because SBA allowed them to submit the reports later. As a result, SBA was unable to analyze the reports and provide timely feedback to assist agencies in accurately calculating these budgets.
Potential effects of basing each participating agency's spending requirement on its total R&D budget instead of its extramural R&D budget include an increase in the amount of the spending requirement—for some agencies more than others—and, if the thresholds for participation in the programs did not change, an increase in the number of agencies required to participate. Officials identified benefits of such a change, such as funding more projects, but they generally said the drawbacks could outweigh the benefits.
Little is known about the total amounts agencies spent administering the SBIR and STTR programs because agencies did not consistently collect such information for fiscal year 2012. Agencies are not required to track costs for administering the programs. Most agencies provided GAO with some data on such costs for fiscal year 2012, ranging from about $200,000 to about $8 million, but the data were wide-ranging, incomplete, and unverifiable. With the start of a pilot program in fiscal year 2013 that allows agencies to use up to 3 percent of SBIR program funds for administrative costs, agencies will be required to report to SBA on the amount spent for such activities. However, even with the pilot program, agencies will likely not identify or track all administrative costs.
Why GAO Did This Study
Federal agencies have awarded more than 156,000 contracts and grants, totaling nearly $40 billion, through the SBIR and STTR programs to small businesses to develop and commercialize innovative technologies. The Small Business Act requires that agencies with extramural R&D budgets meeting certain thresholds for participation—$100 million for SBIR and $1 billion for STTR—spend a percentage of these annual budgets on the programs. The agencies are to report on their activities to SBA and, in turn, SBA is to report to Congress.
The 2011 reauthorization of the programs mandated that GAO review compliance with spending and reporting requirements, among other program aspects. This report addresses, for fiscal year 2012, (1) the extent to which agencies complied with spending requirements, (2) the extent to which agencies and SBA complied with certain reporting requirements, (3) the potential effects of basing spending requirements on agencies' total R&D budgets, and (4) what is known about the amounts spent administering the programs. GAO reviewed agency spending data and required reports for fiscal year 2012 and interviewed program officials from SBA and the participating agencies.
What GAO Recommends
GAO recommends, among other things, that SBA revise program policy directives to accurately summarize spending requirements and request that agencies submit their methodology reports on time. SBA and participating agencies generally agreed with GAO's findings and recommendations.
For more information, contact Frank Rusco at (202) 512-3841 or ruscof@gao.gov.

Fri, 06 Jun 2014 13:00:00 -0400 | Letter Report

Fusion Energy: Actions Needed to Finalize Cost and Schedule Estimates for U.S. Contributions to an International Experimental Reactor, June 05, 2014
http://www.gao.gov/products/GAO-14-499
What GAO Found
Since the International Thermonuclear Experimental Reactor (ITER) Agreement was signed in 2006, the Department of Energy's (DOE) estimated cost for the U.S. portion of ITER has grown by almost $3 billion, and its estimated completion date has slipped by 20 years (see fig.). DOE has identified several reasons for the changes, such as increases in hardware cost estimates as designs and requirements have been more fully developed over time.
DOE's current cost and schedule estimates for the U.S. ITER Project reflect most characteristics of reliable estimates, but the estimates cannot be used to set a performance baseline because they are linked to factors that DOE can only partially influence. A performance baseline would commit DOE to delivering the U.S. ITER Project at a specific cost and date and provide a way to measure the project's progress. According to DOE documents and officials, the agency has been unable to finalize its cost and schedule estimates in part because the international project schedule the estimates are linked to is not reliable. DOE has taken some steps to help push for a more reliable international project schedule, such as providing position papers and suggested actions to the ITER Organization. However, DOE has not taken additional actions such as preparing formal proposals that could help resolve these issues. Unless such formal actions are taken to resolve the reliability concerns of the international project schedule, DOE will remain hampered in its efforts to create and set a performance baseline for the U.S. ITER Project.
DOE has taken several actions that have reduced U.S. ITER Project costs by about $388 million as of February 2014, but DOE has not adequately planned for the potential impact of those costs on the overall U.S. fusion program. The House and Senate Appropriations Committees have directed DOE to complete a strategic plan for the U.S. fusion program. GAO has previously reported that strategic planning is a leading practice that can help clarify priorities, and DOE has begun work on such a plan but has not committed to a specific completion date. Without a strategic plan for the U.S. fusion program, DOE cannot build a shared understanding among stakeholders of how it plans to balance the program's competing demands against limited available resources, or help Congress weigh the trade-offs of different funding decisions for the U.S. ITER Project and the overall U.S. fusion program.
Total Estimated Cost and Completion Date for the U.S. ITER Project
Why GAO Did This Study
ITER is an international research facility being built in France to demonstrate the feasibility of fusion energy. Fusion occurs when the nuclei of two light atoms collide and fuse together at high temperatures, which results in the release of large amounts of energy. The United States has committed to providing about 9 percent of ITER's construction costs through contributions of hardware, personnel, and cash, and DOE is responsible for managing those contributions, as well as the overall U.S. fusion program. In fiscal year 2014, the U.S. ITER Project received $199.5 million, or about 40 percent of the overall U.S. fusion program budget.
GAO was asked to review DOE's cost and schedule estimates for the U.S. ITER Project. This report examines (1) how and why the estimated costs and schedule of the U.S. ITER Project have changed since 2006, (2) the reliability of DOE's current cost and schedule estimates, and (3) actions DOE has taken to reduce U.S. ITER Project costs and plan for their impact on the overall U.S. fusion program. GAO reviewed documents; assessed DOE's current estimates against best practices; and obtained the perspectives of 10 experts in fusion energy and project management.
What GAO Recommends
GAO recommends, among other things, that DOE formally propose the actions needed to set a reliable international project schedule and set a date to complete the U.S. fusion program's strategic plan. DOE agreed with GAO's recommendations.
For more information, contact Frank Rusco at (202) 512-3841 or ruscof@gao.gov.

Thu, 05 Jun 2014 13:00:00 -0400 | Letter Report

DOE Loan Programs: DOE Has Made More Than $30 Billion in Loans and Guarantees and Needs to Fully Develop Its Loan Monitoring Function, May 30, 2014
http://www.gao.gov/products/GAO-14-645T
What GAO Found
The Department of Energy's (DOE) three loan programs have made more than $30 billion in loans and loan guarantees to support certain renewable or innovative energy technologies. For the Section 1703 Loan Guarantee Program (LGP), begun in 2006, borrowers generally must pay credit subsidy costs for the loans. In 2014, DOE guaranteed its first two loans under Section 1703 for a total of $6.2 billion. For those two loan guarantees, DOE calculated credit subsidy costs to be $0. The 1703 program has $28.7 billion in remaining loan guarantee authority. In contrast, under Section 1705 of the LGP, which expired September 30, 2011, Congress appropriated funds to pay credit subsidy costs. DOE guaranteed 31 Section 1705 loans for $15.7 billion; three of these loans have defaulted. The Advanced Technology Vehicles Manufacturing (ATVM) loan program, which was also given appropriations to pay credit subsidy costs, has made 5 loans for $8.4 billion. The ATVM program has $16.6 billion of remaining loan authority. Two of the five ATVM loans have defaulted. DOE has not made an ATVM loan since March 2011 and, as of February 2014, it had one active application. In April 2014, GAO reported that unless DOE can demonstrate a demand for new ATVM loans and viable applications, Congress may wish to consider rescinding all or part of the remaining $4.2 billion in credit subsidy appropriations.
Status of DOE Loan Programs
Dollars in billions

Program               Loan guarantees/   Loans        Loan guarantee/loan   Remaining
                      loans made         defaulted    amounts at closing    authority
LGP – Section 1703    2                  0            $6.2                  $28.7
LGP – Section 1705    31                 3            $15.7                 $0
ATVM loan program     5                  2            $8.4                  $16.6
Total                 38                 5            $30.3                 $45.3

Source: GAO analysis of DOE data.
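As a consistency check, the per-program figures reported above sum to the totals. The tuples below simply transcribe the table's rows; nothing here is new data.

```python
# (program, guarantees/loans made, loans defaulted,
#  amount at closing in $B, remaining authority in $B)
programs = [
    ("LGP - Section 1703", 2, 0, 6.2, 28.7),
    ("LGP - Section 1705", 31, 3, 15.7, 0.0),
    ("ATVM loan program", 5, 2, 8.4, 16.6),
]

made = sum(row[1] for row in programs)
defaulted = sum(row[2] for row in programs)
at_closing = round(sum(row[3] for row in programs), 1)
remaining = round(sum(row[4] for row in programs), 1)

print(made, defaulted, at_closing, remaining)  # 38 5 30.3 45.3, matching the Total row
```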
In May 2014, GAO reported that DOE had not fully developed or consistently adhered to loan monitoring policies for its loan programs. In particular, DOE has established policies for most loan monitoring activities, but policies for some of these activities—for example, for evaluating and mitigating program-wide risk—remain incomplete or outdated. Further, in some cases GAO examined, DOE generally adhered to its loan monitoring policies but, in others, DOE adhered to those policies inconsistently or not at all. For example, DOE did not adhere to its policy requiring it to evaluate the effectiveness of its loan monitoring. DOE did not consistently adhere to policies because the Loan Programs Office was still developing its organizational structure, including its staffing, management and reporting software, and procedures for implementing policies. As a consequence, during a period of significant program events (2009 to 2013), such as 5 borrower defaults, DOE was making loans and disbursing funds without a fully developed loan monitoring function.
Why GAO Did This Study
DOE's Loan Programs Office administers the LGP for certain renewable or innovative energy projects and the ATVM loan program for projects to produce more fuel-efficient vehicles and components. Both programs can expose the government and taxpayers to substantial financial risks if borrowers default. These risks are considered in calculating credit subsidy costs. Such costs represent the estimated net long-term cost of extending or guaranteeing credit, in present value terms, over the entire period the loans are outstanding (not including administrative costs). The law requires that the credit subsidy costs of DOE loans and loan guarantees be paid for by appropriations, borrowers, or some combination of both.
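The credit subsidy cost described above is, in essence, a net-present-value estimate of a loan's lifetime cash flows to the government. The sketch below illustrates only the arithmetic; the cash flows and the 3 percent discount rate are hypothetical, not DOE figures.

```python
def present_value(cash_flows, rate):
    """Discount annual net cash flows (years 1..n) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical net cash flows to the government over a 5-year guarantee, in $M:
# negatives are expected default payouts, positives are fees recovered.
expected_flows = [-5.0, -3.0, 2.0, 2.0, 2.0]

# A positive subsidy cost means the guarantee is a net long-term cost.
subsidy_cost = -present_value(expected_flows, rate=0.03)
print(f"Estimated credit subsidy cost: ${subsidy_cost:.2f}M")  # about $2.35M
```

Under the Federal Credit Reform Act framework the text alludes to, this estimated cost is what must be covered by appropriations, borrower payments, or both.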
This testimony focuses on (1) the status of DOE's loan programs and (2) the findings in GAO's May 2014 report on the extent to which DOE has developed and adhered to loan monitoring policies for its loan programs. This statement is based on a series of GAO reports from 2007 through 2014 that noted concerns about DOE's implementation of the loan programs.
What GAO Recommends
In May 2014, GAO recommended that DOE fully develop its loan monitoring function by: evaluating loan monitoring effectiveness, staffing key positions, updating management and reporting software, and completing policies and procedures for loan monitoring. DOE generally agreed with the recommendations. GAO is making no new recommendations in this testimony.
For more information, contact Frank Rusco at (202) 512-3841 or ruscof@gao.gov.

Fri, 30 May 2014 13:00:00 -0400 | Testimony

Defense Research: Improved Management of DOD's Technical Corrosion Collaboration Program Needed, May 29, 2014
http://www.gao.gov/products/GAO-14-437
What GAO Found
The Department of Defense's (DOD) Office of Corrosion Policy and Oversight (Corrosion Office) has documented some, but not all, key procedures for the Technical Corrosion Collaboration (TCC) program. For civilian institutions, the Corrosion Office documented procedures for selecting projects, but has not done so for approving these projects. In addition, for military academic institutions, the office has not documented procedures for selecting and approving projects. Corrosion Office officials stated that procedures for some aspects of the TCC program are not documented because the program is still evolving and they would like flexibility to enable innovation in determining how to manage the program. However, without fully documenting its decision-making procedures for selecting and approving projects, the Corrosion Office cannot demonstrate how projects were selected and approved for the TCC program.
Corrosion Office officials provided data on the amount of funds spent on the TCC program for fiscal years 2008 through 2013, but in some cases the data were not readily available and were inconsistent for the same time frame. As a result, it is unclear what the Corrosion Office has spent on the TCC program. Section 2228 requires the Corrosion Office to include a description of the amount of funds used for the TCC program in its annual corrosion budget report to Congress. However, because the Corrosion Office does not track and maintain accurate records, it is unable to determine the amount of funds spent. In the absence of fully documented funding data that are readily available for examination, Corrosion Office officials cannot ensure that they will accurately account for and report TCC costs in the annual budget report to Congress.
DOD has set goals for the TCC program, but has not developed a process to transition demonstrated results from projects to military departments. According to the DOD Corrosion Prevention and Mitigation Strategic Plan, TCC program goals are to: (1) develop individuals with education, training, and experience who will form the future core of the technical community within DOD and private industry; and (2) produce solutions that will reduce the effect of corrosion on DOD infrastructure and weapon systems. To track the goal of developing people, the Corrosion Office cited, among other things, the research papers that have been produced as a result of the TCC program. Section 2228 requires that the Corrosion Office coordinate a research and development program that includes a plan for the transition of new corrosion-prevention technologies to the military departments. To track the goal to produce solutions that will reduce corrosion, the Corrosion Office monitors TCC projects' results; however, the office has not established a process to transition demonstrated results of the research projects to the military departments. Corrosion Office officials stated that it is difficult to transition results because outputs of TCC research are in the early stages of technology evolution and thus are not mature enough to be used by the military departments. Nonetheless, Corrosion Office officials acknowledge the need to establish a process to transition TCC results to the military departments. Until the Corrosion Office establishes a process to study and determine what, if any, TCC results could transition to the military departments, DOD will not be able to demonstrate the success of the TCC program and the extent to which TCC results are helping to prevent or mitigate corrosion.
Why GAO Did This Study
According to DOD, corrosion can significantly affect maintenance cost, service life of equipment, and military readiness by diminishing the operations of critical systems and creating safety hazards. Pursuant to Section 2228 of Title 10 of the U.S. Code, DOD's Corrosion Office is responsible for prevention and mitigation of corrosion of military equipment and infrastructure. To help identify technology to prevent or mitigate corrosion and educate personnel about corrosion prevention and control, DOD funds universities and military labs in the TCC program.
GAO was asked to review DOD's TCC program and its goals. In this report, GAO addressed the extent to which DOD (1) has established procedures for managing the TCC program, (2) can provide information on the amount of funds spent on the program to date, and (3) has established goals for the TCC program and transitioned demonstrated results from projects to military departments. GAO reviewed DOD policies and plans and met with DOD corrosion officials and TCC participants.
What GAO Recommends
GAO recommends five actions to improve DOD's management of the TCC program. DOD partially agreed with two actions: to document procedures to select and approve labs, and to track and maintain accurate funding data. DOD did not agree with three recommendations to document procedures to select and approve projects, and to establish a process to transition project results to the military departments. GAO believes that these recommendations remain valid.
For more information, contact Zina Merritt at (202) 512-5257 or merrittz@gao.gov.

Thu, 29 May 2014 13:00:00 -0400 | Letter Report

Nanomanufacturing and U.S. Competitiveness: Challenges and Opportunities, May 20, 2014
http://www.gao.gov/products/GAO-14-618T
What GAO Found
Forum participants described nanomanufacturing as an emerging set of developments that will become a global megatrend: a technological revolution that is now in its formative phases but that many knowledgeable persons—in science, business, and government—expect to burgeon in the years ahead, bringing new opportunities, “disruptive innovation,” job creation, and diverse societal benefits. They said that the United States likely leads in sponsorship and overall quality of nanotechnology R&D today, as well as in some areas of nanomanufacturing—for example, nanotherapeutic drug development and the design of semiconductor devices. But they cautioned that the United States faces global-scale competition and is struggling to compete in some industry areas (notably, advanced batteries). Challenges facing U.S. nanomanufacturing include (1) a key U.S. funding gap in the middle stages of the manufacturing-innovation process, as illustrated below; (2) lack of commercial or environmental, safety, and health (EHS) standards; (3) lack of a U.S. vision for nanomanufacturing; (4) extensive prior offshoring in some industries, which may have had unintended consequences; and (5) threats to U.S. intellectual property.
Funding/Investment Gap in the U.S. Manufacturing-Innovation Process
Key actions identified by our experts to enhance U.S. nanomanufacturing competitiveness include one or more of the following: (1) strengthen U.S. innovation by updating current innovation-related policies and programs, (2) promote U.S. innovation in manufacturing through public-private partnerships, and (3) design a strategy for attaining a holistic vision for U.S. nanomanufacturing.
Key policy issues identified by our experts include the development of international commercial nanomanufacturing standards, the need to maintain support for basic research and development in nanotechnology, and the development of a revitalized, integrative, and collaborative approach to EHS issues.
Why GAO Did This Study
Nanotechnology has been defined as the control or restructuring of matter at the atomic and molecular levels in the size range of about 1–100 nanometers (nm); 100 nm is about 1/1000th the width of a hair.
The U.S. National Nanotechnology Initiative (NNI), begun in 2001 and focusing primarily on R&D, represents a cumulative investment of almost $20 billion, including the request for fiscal year 2014. As research continues and other nations increasingly invest in R&D, nanotechnology is moving from the laboratory to commercial markets, mass manufacturing, and the global marketplace. Today, burgeoning markets and nanomanufacturing activities are increasingly competitive in a global context—and the potential EHS effects of nanomanufacturing remain largely unknown.
GAO was asked to testify on challenges to U.S. competitiveness in nanomanufacturing and related issues. Our statement is based on GAO's earlier report on the Forum on Nanomanufacturing, which was convened by the Comptroller General of the United States in July 2013 (GAO 2014; also referred to as GAO-14-181SP). That report reflects forum discussions as well as four expert-based profiles of nano-industry areas, which GAO prepared prior to the forum and which are appended to the earlier report.
For more information, contact Timothy Persons, Chief Scientist, at (202) 512-6412 or personst@gao.gov.
Tue, 20 May 2014 13:00:00 -0400 | Testimony

Export Controls: NASA Management Action and Improved Oversight Needed to Reduce the Risk of Unauthorized Access to Its Technologies, April 15, 2014
http://www.gao.gov/products/GAO-14-315
What GAO Found
Weaknesses in the National Aeronautics and Space Administration (NASA) export control policy and implementation of foreign national access procedures at some centers increase the risk of unauthorized access to export-controlled technologies. NASA policies provide Center Directors wide latitude in implementing export controls at their centers. Federal internal control standards call for clearly defined areas of authority and establishment of appropriate lines of reporting. However, NASA procedures do not clearly define the level of Center Export Administrator (CEA) authority and organizational placement, leaving it to the discretion of the Center Director. GAO found that 7 of the 10 CEAs are at least three levels removed from the Center Director. Three of these 7 stated that their placement detracted from their ability to implement export control policies by making it difficult to maintain visibility to staff, communicate concerns to the Center Director, and obtain resources; the other 4 did not express concerns about their placement. However, in a 2013 meeting of export control officials, the CEAs recommended placing the CEA function at the same organizational level at each center for uniformity, visibility, and authority. GAO identified, and the NASA Inspector General also reported, instances in which two centers did not comply with NASA policy on foreign national access to NASA technologies. For example, during a 4-month period in 2013, one center allowed foreign nationals on a major program to fulfill the role of sponsors for other foreign nationals, including determining access rights for themselves and others. Each instance risks damage to national security. Due to access concerns, the NASA Administrator restricted foreign national visits in March 2013 and directed each center to assess compliance with foreign national access procedures and develop corrective action plans.
By June 2013, six centers identified corrective actions, but only two set time frames for completion and only one planned to assess the effectiveness of actions taken. Without plans and time frames to monitor corrective actions, it will be difficult for NASA to ensure that actions are effective.
NASA headquarters export control officials and CEAs lack a comprehensive inventory of the types and locations of export-controlled technologies, and NASA headquarters officials have not addressed deficiencies raised in oversight tools, limiting their ability to take a risk-based approach to compliance. Export compliance guidance from the Departments of State and Commerce, the agencies that administer export control regulations, stresses the importance of identifying controlled items and continuously assessing risks. NASA headquarters officials acknowledge the benefits of identifying controlled technologies but stated that current practices, such as foreign national screening, are sufficient to manage risk and that they lack resources to do more. The recently identified deficiencies in foreign national visitor access discussed above suggest otherwise. Three CEAs have early efforts under way to better identify technologies, which could help focus compliance on areas of greatest risk. For example, one CEA is working with NASA's Office of Protective Services Counterintelligence Division to identify the most sensitive technologies at the center to help tailor oversight efforts. Such approaches, implemented NASA-wide, could enable the agency to better target existing resources to protect sensitive technologies.
Why GAO Did This Study
NASA develops sophisticated technologies and shares them with its international partners and others. U.S. export control regulations require NASA to identify and protect its sensitive technology; NASA delegates implementation of export controls to its 10 research and space centers. Recent allegations of export control violations at two NASA centers have raised questions about NASA's ability to protect its sensitive technologies. GAO was asked to review NASA's export control program.
This report assessed (1) NASA's export control policies and how centers implement them, and (2) the extent to which NASA Headquarters and CEAs apply oversight of center compliance with its export control policies. To do this, GAO reviewed export control laws and regulations, NASA export control policies, and State and Commerce export control compliance guidance. GAO also reviewed NASA information on foreign national visits and technical papers and interviewed officials from NASA and its 10 centers as well as from other agencies.
What GAO Recommends
GAO recommends that the NASA Administrator establish guidance to better define the CEA function, establish time frames to implement foreign national access corrective actions and assess results, and establish a more risk-based approach to oversight, among other actions. NASA concurred with all of our recommendations and provided information on actions taken or planned to address them.
For more information, contact Belva Martin at (202) 512-4841 or martinb@gao.gov.
Thu, 15 May 2014 13:00:00 -0400 | Letter Report

Office of Personnel Management: Agency Needs to Improve Outcome Measures to Demonstrate the Value of Its Innovation Lab, March 31, 2014
http://www.gao.gov/products/GAO-14-306
What GAO Found
In March 2012, the Office of Personnel Management (OPM) opened its innovation lab, a distinct physical space with a set of policies for engaging people and using technology in problem solving. The goals of OPM's innovation lab are to provide federal workers with 21st century skills in design-led innovation, such as intelligent risk-taking to develop new services, products, and processes. OPM's lab was built at a reported cost of $1.1 million, including facility upgrades and construction, equipment and training, and other personnel costs. The lab employs approximately 6 full-time equivalents, including a director, and in fiscal year 2013, the lab's operating costs were approximately $476,000, including salaries.
View of OPM's Innovation Lab
OPM's innovation lab is similar in mission and design to other innovation labs GAO reviewed, and OPM has incorporated some of the prevalent practices that other labs use to sustain their operations. Specifically, OPM is using its lab for a variety of projects, including as a classroom for building the capacity to innovate in the federal government. Lab staff indicated that they plan to begin long-term immersion projects—complex projects with diverse users—within a few months. OPM plans to develop and implement evaluation plans specific to each immersion project that will help them track cost benefits or performance improvement benefits associated with the projects.
Starting in March 2013, OPM lab staff began work on a program evaluation framework to more systematically measure the lab's progress toward meeting its overarching goals. In addition, lab staff members are tracking lab activities, such as classes and workshops, and are surveying lab users about the quality of their experience in the lab. However, they have not developed performance targets or measures related to project outcomes, and without a rigorous evaluation framework that can help OPM track the lab's performance, it will be hard to demonstrate that the lab is operating as originally envisioned.
While labs provide a physical space where innovators can convene, federal agencies are not fully aware of their growing community. However, OPM is taking steps to ensure work done in the lab is shared across OPM and with other federal innovators—for example, by hosting weekly training sessions in the lab on best practices. Studies show that information sharing and interorganizational networks can be a powerful driver supporting innovation.
Why GAO Did This Study
Organizations from around the globe are emphasizing that strategies promoting innovation are vital to solving complex problems. To try to instill a culture of innovation in its agency, OPM followed the lead of a number of private sector companies, nonprofit organizations, and government bodies by creating an innovation lab. GAO was asked to examine the lab.
Specifically, GAO (1) described the lab's start-up costs, staffing and organization, activities, and policies governing the lab's use, and (2) assessed how OPM's innovation lab compares to other organizations' innovation labs, including how it uses benchmarks and metrics and how it addresses challenges to innovation. GAO reviewed cost, staffing, and performance information. GAO also reviewed relevant literature on innovation and interviewed officials from public, private, and nonprofit organizations with innovation facilities similar to OPM's lab.
What GAO Recommends
Among other things, GAO recommends that the Director of OPM direct lab staff to (1) develop a mix of performance targets and measures to help them monitor and report on progress toward lab goals, and (2) build on existing efforts to share information with other agencies that have innovation labs. OPM generally concurred with GAO's recommendations; in addition, it described the steps being taken and planned to refine its ongoing evaluation efforts and to further leverage other federal innovation labs.
For more information, contact Seto J. Bagdoyan at (202) 512-4749 or bagdoyans@gao.gov.
Wed, 30 Apr 2014 13:00:00 -0400 | Letter Report

Advanced Imaging Technology: TSA Needs Additional Information before Procuring Next-Generation Systems, March 31, 2014
http://www.gao.gov/products/GAO-14-357
What GAO Found
The Department of Homeland Security's (DHS) Transportation Security Administration (TSA) does not collect or analyze available information that could be used to enhance the effectiveness of the advanced imaging technology (AIT) with automated target recognition (ATR) system. Specifically, TSA does not collect or analyze available data on drills using improvised explosive devices (IED) at the checkpoint that could provide insight into how well screening officers (SO) resolve anomalies identified by AIT systems, including objects that could pose a threat to an aircraft, because it does not enforce compliance with its operational directive. TSA's operational directive requires personnel at airports to conduct drills to assess SO compliance with TSA's screening standard operating procedures and to train SOs to better resolve anomalies identified by AIT-ATR systems. GAO found that TSA personnel at about half of airports with AIT systems did not report any IED checkpoint drill results on those systems from March 2011 through February 2013. According to TSA, it does not ensure compliance with the directive at every airport because it is unclear which office should oversee enforcing the directive. Without data on IED checkpoint drills, TSA lacks insight into how well SOs resolve anomalies detected by AIT systems, information that could be used to help strengthen existing screening processes. TSA could address these potential weaknesses in the screening process by clarifying which office is responsible for overseeing its operational directive, directing that office to ensure enforcement of the directive in conducting these drills, and analyzing the resulting data. Further, when determining AIT-ATR system effectiveness, TSA uses laboratory test results that do not reflect the combined performance of the technology, the personnel who operate it, and the process that governs AIT-related security operations.
TSA officials agreed that it is important to analyze performance by including an evaluation of the technology, operators, and processes and stated that TSA is planning to assess the performance of all layers of security. By not measuring system effectiveness based on the performance of the technology and SOs who operate the technology or taking into account current processes and deployment strategies, DHS and TSA are not ensuring that future procurements meet mission needs.
TSA completed the installation of ATR software upgrades intended to address privacy concerns for all deployed AIT systems; however, it has not met proposed milestones for enhancing capabilities as documented in its AIT roadmap—a document that contains milestones for achieving enhanced capabilities to meet the agency's mission needs. For example, TSA began operational test and evaluation for Tier II upgrades 17 months after the expected start date. Moreover, TSA did not use available scientific research or information from experts at the national laboratories or vendors on the technological challenges it faces in developing requirements and milestones because, according to TSA, it relied on time frames proposed by vendors. Thus, TSA cannot ensure that its roadmap reflects the true capabilities of the next generation of AIT systems. Using scientific evidence and information from DHS's Science and Technology Directorate, the national laboratories, and vendors would help TSA develop a realistic schedule with achievable milestones that outlines the technological advancements, estimated time, and resources needed to achieve the enhanced capabilities outlined in its roadmap.
Why GAO Did This Study
TSA accelerated the deployment of AIT systems, or full-body scanners, in response to the December 25, 2009, attempted terrorist attack on Northwest Airlines Flight 253. Pursuant to the Federal Aviation Administration Modernization and Reform Act of 2012, TSA was mandated to ensure that AIT systems were equipped with ATR software, which displays generic outlines of passengers rather than actual images, by June 1, 2013. All deployed AIT systems were equipped with ATR software by the deadline. GAO was asked to evaluate TSA's AIT-ATR systems' effectiveness. This report addresses the extent to which (1) TSA collects and analyzes available information that could be used to enhance the effectiveness of the AIT-ATR system and (2) TSA has made progress toward enhancing AIT capabilities to detect concealed explosives and other threat items, and any challenges that remain. GAO analyzed testing results conducted by the Transportation Security Laboratory and TSA personnel at airports and interviewed DHS and TSA officials. This is a public version of a classified report that GAO issued in December 2013. Information DHS and TSA deemed classified or sensitive has been omitted, including information and recommendations related to improving AIT capabilities.
What GAO Recommends
GAO recommends that TSA, among other things, clarify which office should oversee its operational directive, better measure system effectiveness, and develop a realistic schedule before procuring future generations. TSA concurred with GAO's recommendations.
For more information, contact Stephen M. Lord at (202) 512-4379 or lords@gao.gov.
Wed, 30 Apr 2014 13:00:00 -0400 | Letter Report

Small Business Research Programs: Agencies Did Not Consistently Comply with Spending and Reporting Requirements, April 24, 2014
http://www.gao.gov/products/GAO-14-567T
What GAO Found
Using data agencies had reported to the Small Business Administration (SBA), GAO found in its 2013 report that 8 of the 11 agencies participating in the Small Business Innovation Research (SBIR) program and 4 of the 5 agencies participating in the Small Business Technology Transfer (STTR) program did not consistently comply with spending requirements for fiscal years 2006 through 2011. SBA, which oversees the programs, provided guidance in policy directives for agencies on calculating these requirements, but the directives did not address how to calculate the requirements when appropriations are late and spending is delayed. Some SBIR and STTR program managers told GAO that it can be difficult to spend the required amount because delays in receiving final appropriations can delay agencies' awarding of contracts. As GAO found in its 2013 report, when appropriations were received late in the year, agencies used differing methodologies to calculate their spending requirements, which made it difficult to determine whether agencies' calculations were correct. GAO found that, without further SBA guidance, agencies would likely continue calculating spending requirements in differing ways.
GAO also found in 2013 that the participating agencies and SBA had not consistently complied with certain program reporting requirements. For example, participating agencies did not itemize each program excluded from the calculation of their extramural research or research and development (R&amp;D) budgets and explain why the program was excluded, as required. (Extramural R&amp;D is generally conducted by nonfederal employees outside of federal facilities.) Also, SBA's annual reports to Congress that were available at the time of GAO's review contained limited analysis of the agencies' methodologies, often not including information on particular agencies. By providing more analysis of the agencies' reports, as GAO recommended in its 2013 report, SBA can provide information to Congress on the extent to which agencies were reporting what is required.
In 2013, GAO found that the potential effects of basing each participating agency's spending requirement on its total R&amp;D budget instead of its extramural R&amp;D budget would increase the amount of the spending requirement—for some agencies more than others, depending on how the change was implemented. Also, if the thresholds of the spending requirements for participation in the programs did not change, changing the base to an agency's total R&amp;D budget would increase the number of agencies required to participate.
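To illustrate why the choice of base matters, the sketch below applies a single rate to two hypothetical budget bases. The 2.5 percent rate and the budget figures are illustrative assumptions, not numbers from the report:

```python
# Illustrative comparison of the two possible bases for an SBIR-style
# spending requirement. All figures are hypothetical placeholders; the
# statutory percentage and budgets are NOT taken from the GAO report.
SBIR_RATE = 0.025  # assumed 2.5 percent rate, for illustration only

def spending_requirement(budget_millions: float, rate: float = SBIR_RATE) -> float:
    """Required small-business research spending for a given budget base."""
    return budget_millions * rate

extramural_rd = 800.0  # hypothetical extramural R&D budget, $ millions
total_rd = 1200.0      # hypothetical total R&D budget (adds intramural work)

req_extramural = spending_requirement(extramural_rd)
req_total = spending_requirement(total_rd)
print(f"Extramural base: ${req_extramural:.1f}M required")
print(f"Total-R&D base:  ${req_total:.1f}M required "
      f"(+${req_total - req_extramural:.1f}M)")
```

Because an agency's total R&amp;D budget is always at least as large as its extramural R&amp;D budget, holding the rate fixed while switching the base can only increase the required spending, and agencies with large intramural programs would see the largest increases.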
In addition, GAO found in 2013 that the agencies' cost of administering the programs could not be determined because the agencies had not consistently tracked those costs; the programs' authorizing legislation did not require them to do so. Estimates agencies provided to GAO indicated that the greatest administrative costs in fiscal year 2011 were for salaries and expenses, contract processing, outreach programs, technical assistance programs, support contracts, and other purposes. With the start of a pilot program allowing agencies to use up to 3 percent of SBIR program funds for administrative costs in fiscal year 2013, SBA planned to require participating agencies to track and report administrative costs paid from program funds.
Why GAO Did This Study
Federal agencies have awarded more than 156,000 contracts and grants, totaling nearly $40 billion, through the SBIR and STTR programs to small businesses to develop and commercialize innovative technologies. The Small Business Act mandates that agencies with extramural R&amp;D budgets that meet the thresholds for participation spend a percentage of these annual budgets on the two programs. The agencies are to report to SBA, and SBA is to report to Congress.
This testimony is based on a report GAO issued in September 2013 and addresses, for fiscal years 2006 through 2011, (1) the extent to which participating agencies complied with program spending requirements, (2) the extent to which participating agencies and SBA complied with certain reporting requirements, (3) the potential effects of basing the spending requirements for the SBIR and STTR programs on agencies' total R&amp;D budgets instead of their extramural R&amp;D budgets, and (4) what is known about the amounts participating agencies spent for administering the programs. For that report, GAO reviewed agency calculations of spending requirements and required reports, and interviewed SBA and participating agency officials.
What GAO Recommends
GAO is not making any new recommendations, but made several to SBA in GAO's 2013 report on this topic. SBA agreed with those recommendations and is taking steps to implement them.
For more information, contact John Neumann at (202) 512-3841 or neumannj@gao.gov.
Thu, 24 Apr 2014 13:00:00 -0400 | Testimony

Federal Vehicle Collisions and Aftermarket Collision Avoidance Technologies, April 24, 2014
http://www.gao.gov/products/GAO-14-408R
What GAO Found
Limited data are available on federal vehicle accidents and their associated costs. Available data sources do not allow for a comprehensive measurement of either the number of accidents involving federally owned or leased vehicles or the associated costs. Data on the number of accidents involving vehicles leased from the General Services Administration (GSA), representing about 29 percent of the federal fleet, show that from fiscal years 2008 through 2012, about 23,000 vehicles were involved in accidents in which the government was at fault; costs to the federal government for repaired and totaled GSA-leased vehicles amounted to about $53.6 million. Annual costs related to vehicle damage decreased between fiscal years 2008 and 2012, though GSA’s data do not show why because they do not include specific cost elements, such as parts and labor. Similarly, comprehensive data are not readily available for costs beyond those related to GSA-leased vehicles. For example, information on the type of injury or costs of injuries and fatalities is not available from GSA. Officials from selected civilian agencies that GAO contacted stated that the information necessary to determine the number and cost of accidents involving their leased and owned vehicles is either unavailable or would be time consuming to collect.
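As a rough derived figure, the reported totals imply an average cost per incident. The sketch below assumes, as a simplification, that each of the roughly 23,000 vehicles corresponds to one at-fault accident:

```python
# Rough averages derived from the summary figures above.
# Simplifying assumption: each counted vehicle maps to one at-fault accident.
AT_FAULT_VEHICLES = 23_000
TOTAL_COST_DOLLARS = 53_600_000  # repaired and totaled vehicles, FY2008-FY2012
FISCAL_YEARS = 5                 # FY2008 through FY2012

avg_cost = TOTAL_COST_DOLLARS / AT_FAULT_VEHICLES
per_year = TOTAL_COST_DOLLARS / FISCAL_YEARS
print(f"Average cost per at-fault accident: ${avg_cost:,.0f}")  # -> $2,330
print(f"Average annual cost, FY2008-FY2012: ${per_year:,.0f}")  # -> $10,720,000
```

These averages are only as good as the underlying GSA data, which, as noted above, cover about 29 percent of the federal fleet and omit cost elements such as parts and labor.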
Limited information also exists on the potential of aftermarket collision avoidance technologies to reduce vehicle accidents. For example, GAO’s literature review yielded no studies of the costs and benefits of aftermarket collision avoidance technologies, and GAO was unable to verify any claims of benefits associated with the technologies. In addition, officials from GSA and the Departments of Transportation, Agriculture, the Interior, and Veterans Affairs stated that their agencies are not presently using these technologies in their vehicle fleets. GSA does not offer aftermarket collision avoidance technologies as options for purchased or leased vehicles, citing a lack of demand from its customers for such technologies.
Why GAO Did This Study
The federal government can be liable for vehicular damages and the costs of injuries resulting from accidents involving vehicles that are owned or leased by federal agencies. Recently, new technologies have become available that use sensors, such as cameras and radar, to observe a vehicle’s surroundings and issue warnings to drivers when certain types of collisions may be imminent. These technologies, called collision avoidance technologies, may help reduce the frequency of accidents as well as the costs of accidents that do occur. While such technologies have increasingly become available as factory-installed options on vehicles, aftermarket collision avoidance technologies have also become available. They may offer similar benefits but cost less than factory-installed options. This report discusses what is known about (1) the extent to which federal vehicles were reported to have been involved in accidents from fiscal years 2008 through 2012 and the cost of those accidents to the government, and (2) the potential of aftermarket collision avoidance technologies to help reduce vehicle accidents.
To address these objectives, GAO analyzed government-wide data and interviewed officials from GSA, the Department of Transportation, selected civilian agencies, and industry. GAO also conducted a literature review of relevant studies published over the past 10 years. The Departments of Transportation and the Interior provided comments on a draft of this report, and GSA provided technical comments.
For more information, contact Lori Rectanus at (202) 512-2834 or rectanusl@gao.gov.
Thu, 24 Apr 2014 13:00:00 -0400 | Correspondence