The 2012 National Research Council report Disaster Resilience: A National Imperative highlighted the challenges of increasing national resilience in the United States. One finding of the report was that "without numerical means of assessing resilience, it would be impossible to identify the priority needs for improvement, to monitor changes, to show that resilience had improved, or to compare the benefits of increasing resilience with the associated costs." Although measuring resilience is a challenge, metrics and indicators to evaluate progress, and the data necessary to establish the metric, are critical for helping communities to clarify and formalize what the concept of resilience means for them, and to support efforts to develop and prioritize resilience investments. One of the recommendations from the 2012 report stated that government entities at federal, state, and local levels and professional organizations should partner to help develop a framework for communities to adapt to their circumstances and begin to track their progress toward increasing resilience.

To build upon this recommendation and begin to help communities formulate such a framework, the Resilient America Roundtable of the National Academies convened the workshop Measures of Community Resilience: From Lessons Learned to Lessons Applied on September 5, 2014, in Washington, D.C. The workshop's overarching objective was to begin to develop a framework of measures and indicators that could support community efforts to increase their resilience. The framework will be further developed through feedback and testing in pilot and other partner communities that are working with the Resilient America Roundtable. This report is a summary of the one-day workshop, which consisted of a keynote address and two panel sessions in the morning, followed by afternoon breakout sessions that began the discussion on how to develop a framework of resilience measures. (Office of Special Projects (OSP))

Eyewitnesses play an important role in criminal cases when they can identify culprits. Estimates suggest that tens of thousands of eyewitnesses make identifications in criminal investigations each year. Research on factors that affect the accuracy of eyewitness identification procedures has given us an increasingly clear picture of how identifications are made, and more importantly, an improved understanding of the principled limits on vision and memory that can lead to failure of identification. Factors such as viewing conditions, duress, elevated emotions, and biases influence the visual perception experience. Perceptual experiences are stored by a system of memory that is highly malleable and continuously evolving, neither retaining nor divulging content in an informational vacuum. As such, the fidelity of our memories to actual events may be compromised by many factors at all stages of processing, from encoding to storage and retrieval. Unknown to the individual, memories are forgotten, reconstructed, updated, and distorted. Complicating the process further, policies governing law enforcement procedures for conducting and recording identifications are not standard, and policies and practices to address the issue of misidentification vary widely. These limitations can produce mistaken identifications with significant consequences. What can we do to make certain that eyewitness identification convicts the guilty and exonerates the innocent? Identifying the Culprit makes the case that better data collection and research on eyewitness identification, new law enforcement training protocols, standardized procedures for administering line-ups, and improvements in the handling of eyewitness identification in court can increase the chances that accurate identifications are made. (Committee on Science, Technology, and Law (CSTL)/Division of Behavioral and Social Sciences and Education (DBASSE))

Emerging and Readily Available Technologies and National Security is a study on the ethical, legal, and societal issues relating to the research on, development of, and use of rapidly changing technologies with low barriers to entry that have potential military application, such as information technologies, synthetic biology, and nanotechnology. The report also considers the ethical issues associated with robotics and autonomous systems, prosthetics and human enhancement, and cyber weapons. These technologies are characterized by readily available knowledge access, technological advancements that can take place in months instead of years, the blurring of lines between basic research and applied research, and a high uncertainty about how the future trajectories of these technologies will evolve and what applications will be possible.

Emerging and Readily Available Technologies and National Security addresses topics such as the ethics of using autonomous weapons that may be available in the future; the propriety of enhancing the physical or cognitive capabilities of soldiers with drugs or implants or prosthetics; and what limits, if any, should be placed on the nature and extent of economic damage that cyber weapons can cause. This report explores three areas with respect to emerging and rapidly available technologies: the conduct of research; research applications; and unanticipated, unforeseen, or inadvertent ethical, legal, and societal issues. The report articulates a framework for policy makers, institutions, and individual researchers to think about issues as they relate to these technologies of military relevance and makes recommendations for how each of these groups should approach these considerations in its research activities. Emerging and Readily Available Technologies and National Security makes an essential contribution toward incorporating the full consideration of ethical, legal, and societal issues in situations where rapid technological change may outpace our ability to foresee consequences. (Committee on Science, Technology, and Law (CSTL)/Computer Science and Telecommunications Board (CSTB)/other units)

With the increasing frequency of natural and human-induced disasters and the increasing magnitude of their consequences, a clear need exists for governments and communities to become more resilient. The National Research Council's 2012 report Disaster Resilience: A National Imperative addressed the importance of resilience, discussed different challenges and approaches for building resilience, and outlined steps for implementing resilience efforts in communities and within government. Launching a National Conversation on Disaster Resilience in America is a summary of a one-day event in November 2012 to formally launch a national conversation on resilience. Nationally recognized experts in disaster resilience met to discuss developing a culture of resilience, implementing resilience, and understanding federal perspectives about resilience. This report includes a broad range of perspectives and experiences derived from many types of hazards and disasters in all parts of the country. (Committee on Science, Engineering, and Public Policy (COSEPUP))

Synthetic biology -- unlike any research discipline that precedes it -- has the potential to bypass the less predictable process of evolution to usher in a new and dynamic way of working with living systems. Ultimately, synthetic biologists hope to design and build engineered biological systems with capabilities that do not exist in natural systems -- capabilities that may ultimately be used for applications in manufacturing, food production, and global health. Importantly, synthetic biology represents an area of science and engineering that raises technical, ethical, regulatory, security, biosafety, intellectual property, and other issues that will be resolved differently in different parts of the world. As a better understanding of the global synthetic biology landscape could lead to tremendous benefits, six academies -- the United Kingdom's Royal Society and Royal Academy of Engineering, the United States' National Academy of Sciences and National Academy of Engineering, and the Chinese Academy of Sciences and Chinese Academy of Engineering -- organized a series of international symposia on the scientific, technical, and policy issues associated with synthetic biology. Positioning Synthetic Biology to Meet the Challenges of the 21st Century summarizes the symposia proceedings. (Committee on Science, Technology, and Law (CSTL))

When, in late 2011, it became public knowledge that two research groups had submitted for publication manuscripts that reported on their work on mammalian transmissibility of a lethal H5N1 avian influenza strain, the information caused an international debate about the appropriateness and communication of the researchers' work, the risks associated with the work, partial or complete censorship of scientific publications, and dual-use research of concern in general.

Recognizing that the H5N1 research is only the most recent scientific activity subject to widespread attention due to safety and security concerns, on May 1, 2012, the National Research Council's Committee on Science, Technology, and Law, in conjunction with the Board on Life Sciences and the Institute of Medicine's Forum on Microbial Threats, convened a one-day public workshop for the purposes of 1) discussing the H5N1 controversy; 2) considering responses by the National Institute of Allergy and Infectious Diseases (NIAID), which had funded this research, the World Health Organization, the U.S. National Science Advisory Board for Biosecurity (NSABB), scientific publishers, and members of the international research community; and 3) providing a forum wherein the concerns and interests of the broader community of stakeholders, including policy makers, biosafety and biosecurity experts, non-governmental organizations, international organizations, and the general public might be articulated.

No person or place is immune from disasters or disaster-related losses. Infectious disease outbreaks, acts of terrorism, social unrest, or financial disasters in addition to natural hazards can all lead to large-scale consequences for the nation and its communities. Communities and the nation thus face difficult fiscal, social, cultural, and environmental choices about the best ways to ensure basic security and quality of life against hazards, deliberate attacks, and disasters. Beyond the unquantifiable costs of injury and loss of life from disasters, statistics for 2011 alone indicate economic damages from natural disasters in the United States exceeded $55 billion, with 14 events costing more than a billion dollars in damages each.

One way to reduce the impacts of disasters on the nation and its communities is to invest in enhancing resilience--the ability to prepare and plan for, absorb, recover from and more successfully adapt to adverse events. Disaster Resilience: A National Imperative addresses the broad issue of increasing the nation's resilience to disasters. This book defines "national resilience", describes the state of knowledge about resilience to hazards and disasters, and frames the main issues related to increasing resilience in the United States. It also provides goals, baseline conditions, or performance metrics for national resilience and outlines additional information, data, gaps, and/or obstacles that need to be addressed to increase the nation's resilience to disasters. Additionally, the book's authoring committee makes recommendations about the necessary approaches to elevate national resilience to disasters in the United States.

Enhanced resilience allows better anticipation of disasters and better planning to reduce disaster losses--rather than waiting for an event to occur and paying for it afterward. Disaster Resilience confronts the topic of how to increase the nation's resilience to disasters through a vision of the characteristics of a resilient nation in the year 2030. Increasing disaster resilience is an imperative that requires the collective will of the nation and its communities. Although disasters will continue to occur, actions that move the nation from reactive approaches to disasters to a proactive stance where communities actively engage in enhancing resilience will reduce many of the broad societal and economic burdens that disasters can cause. (Committee on Science, Engineering, and Public Policy (COSEPUP))

Animals are widely used in neuroscience research to explore biological mechanisms of nervous system function, to identify the genetic basis of disease states, and to provide models of human disorders and diseases for the development of new treatments. To ensure the humane care and use of animals, numerous laws, policies, and regulations are in place governing the use of animals in research, and certain animal regulations have implications specific to neuroscience research.

To consider animal research regulations from a global perspective, the IOM Forum on Neuroscience and Nervous System Disorders, in collaboration with the National Research Council and the Institute for Laboratory Animal Research, held a workshop in Buckinghamshire, UK, July 26-27, 2011. The workshop brought together neuroscientists, legal scholars, administrators, and other key stakeholders to discuss current and emerging trends in animal regulations as they apply to the neurosciences. This document summarizes the workshop. (Committee on Science, Technology, and Law (CSTL)/Board on Health Sciences Policy (HSP)/Institute for Laboratory Animal Research (ILAR))

Even though it is only in its infancy, social networking appears likely to have a profound transformative effect on how people gain and share knowledge, collaborate with others, and identify innovative solutions to problems that have previously resisted traditional approaches. This impact will be felt across the broad spectrum of government, academic, and industry sectors. In the sciences, social networking may result in a “paradigm shift” in science education and the conduct of many kinds of research. The presentations at the October 4-5, 2011 GUIRR meeting examined the underlying nature of social networking and how it is affecting areas ranging from basic research to education, intelligence gathering, community-based programs, personal networking, business, and the workplace. (Government-University-Industry Research Roundtable (GUIRR))

Natural disasters are having an increasing effect on the lives of people in the United States and throughout the world. Every decade, property damage caused by natural disasters and hazards doubles or triples in the United States. More than half of the U.S. population lives within 50 miles of a coast, and all Americans are at risk from such hazards as fires, earthquakes, floods, and wind. The year 2010 saw 950 natural catastrophes around the world--the second highest annual total ever--with overall losses estimated at $130 billion. The increasing impact of natural disasters and hazards points to the increasing importance of resilience, the ability to prepare and plan for, absorb, recover from, or more successfully adapt to actual or potential adverse events, at the individual, local, state, national, and global levels.

Increasing National Resilience to Hazards and Disasters reviews the effects of Hurricane Katrina and other natural and human-induced disasters on the Gulf Coast of Louisiana and Mississippi and examines the resilience of those areas to future disasters. Topics explored in the workshop range from insurance, building codes, and critical infrastructure to private-sector issues, public health, nongovernmental organizations, and governance. This workshop summary provides a rich foundation of information to help increase the nation's resilience through actionable recommendations and guidance on the best approaches to reduce adverse impacts from hazards and disasters.

The Reference Manual on Scientific Evidence, Third Edition, assists judges in managing cases involving complex scientific and technical evidence by describing the basic tenets of key scientific fields from which legal evidence is typically derived and by providing examples of cases in which that evidence has been used.

First published in 1994 by the Federal Judicial Center, the Reference Manual on Scientific Evidence has been relied upon in the legal and academic communities and is often cited by various courts and others. Judges faced with disputes over the admissibility of scientific and technical evidence refer to the manual to help them better understand and evaluate the relevance, reliability, and usefulness of the evidence being proffered. The manual is not intended to tell judges what is good science and what is not. Instead, it serves to help judges identify issues on which experts are likely to differ and to guide the inquiry of the court in seeking an informed resolution of the conflict.

The core of the manual consists of a series of chapters (reference guides) on various scientific topics, each authored by an expert in that field. The topics have been chosen by an oversight committee because of their complexity and frequency in litigation. Each chapter is intended to provide a general overview of the topic in lay terms, identifying issues that will be useful to judges and others in the legal profession. They are written for a non-technical audience and are not intended as exhaustive presentations of the topic. Rather, the chapters seek to provide judges with the basic information in an area of science, to allow them to have an informed conversation with the experts and attorneys. (Committee on Science, Technology, and Law (CSTL))

Rapid advances in genetic research already have begun to transform clinical practice and our understanding of disease progression. Existing research has revealed a genetic basis or component for numerous diseases, including Parkinson's disease, Alzheimer's disease, diabetes, heart disease, and several forms of cancer. The availability of the human genome sequence and the HapMap, plummeting costs of high-throughput screening, and increasingly sophisticated computational analyses have led to an explosion of discoveries of linkages between patterns of genetic variation and disease susceptibility. While this research is by no means a straight path toward better public health, improved knowledge of the genetic linkages has the potential to change fundamentally the way health professionals and public health practitioners approach the prevention and treatment of disease. Realizing this potential will require greater sophistication in the interpretation of genetic tests, new training for physicians and other diagnosticians, and new approaches to communicating findings to the public. As this rapidly growing field matures, all of these questions require attention from a variety of perspectives. (Committee on Science, Technology, and Law (CSTL))

Globally, child labor and forced labor are widespread and complex problems. They are conceptually different phenomena, requiring different policy responses, though they may also overlap in practice. The Trafficking Victims Protection Act of 2000 (TVPA) was designed to reduce the use of child and forced labor in the production of goods consumed in the United States. The Act was reauthorized in 2003, 2005, and 2008. In response to provisions of TVPA, the Bureau of International Labor Affairs requested that the National Research Council organize a two-day workshop. The workshop, summarized in this volume, discusses methods for identifying and organizing a standard set of practices that will reduce the likelihood that persons will use forced labor or child labor to produce goods, with a focus on business and governmental practices. (Policy and Global Affairs (PGA))

This landmark report continues to influence legal decisions, including one by the Supreme Court and many others by various federal district courts. In a recent procedural order, Judge Nancy Gertner of Massachusetts stated that, “The NAS report suggests a different calculus – that admissibility of […forensic] evidence ought not to be presumed….” Several congressional hearings have been held on the report, the National Science and Technology Council established a working group, the Senate Judiciary Committee is working on draft legislation, and courses for judges and lawyers focusing on the report continue to be held across the country. The report concludes that the vitally important work of the forensic science community is often constrained by a lack of adequate resources, sound policies, and national support. Both systemic and scientific changes are needed to ensure the reliability of work, establish enforceable standards, and promote best practices with consistent application. The report recommends the upgrading of systems and organizational structures, better training, widespread adoption of uniform and enforceable best practices, mandatory certification and accreditation programs, and the creation of a new government entity, the National Institute of Forensic Science. (Committee on Science, Technology, and Law (CSTL))

While governments throughout the world have different approaches to how they make their public sector information (PSI) available and the terms under which the information may be reused, there appears to be a broad recognition of the importance of digital networks and PSI to the economy and to society. However, despite the huge investments in PSI and the even larger estimated effects, surprisingly little is known about the costs and benefits of different information policies on the information society and the knowledge economy. By understanding the strengths and weaknesses of the current assessment methods and their underlying criteria, it should be possible to improve and apply such tools to help rationalize the policies and clarify the role of the internet in disseminating PSI. This in turn can help promote the efficiency and effectiveness of PSI investments and management and improve their downstream economic and social results. The workshop that is summarized in this volume was intended to review the state of the art in assessment methods and to improve the understanding of what is known and what needs to be known about the effects of PSI activities. (Board on Research Data and Information (BRDI))

In 1993, the U.S. Supreme Court in Daubert v. Merrell Dow Pharmaceuticals, Inc., laid out a new test for federal trial judges to use when determining the admissibility of expert testimony. In Daubert, the Court ruled that judges should act as gatekeepers, assessing the reliability of the scientific methodology and reasoning that supports expert testimony. The resulting judicial screening of expert testimony has been particularly consequential. While the Supreme Court sought to bring better science into the courtroom, questions remain about whether the lower courts’ application of Daubert accords with scientific practices. This report summarizes discussions held by an ad hoc committee of the National Academies to consider the impact of Daubert and subsequent Supreme Court opinions and to identify questions for future study. (Committee on Science, Technology, and Law (CSTL))

The patenting and licensing of human genetic material and proteins represents an extension of intellectual property (IP) rights to naturally occurring biological material and scientific information, much of it well upstream of drugs and other disease therapies. This report concludes that IP restrictions rarely impose significant burdens on biomedical research, but there are reasons to be apprehensive about their future impact on scientific advances in this area. The report recommends 13 actions that policy-makers, courts, universities, and health and patent officials should take to prevent the increasingly complex web of IP protections from getting in the way of potential breakthroughs in genomic and proteomic research. It endorses the National Institutes of Health guidelines for technology licensing, data sharing, and research material exchanges and says that oversight of compliance should be strengthened. It recommends enactment of a statutory exception from infringement liability for research on a patented invention and raising the bar somewhat to qualify for a patent on upstream research discoveries in biotechnology. With respect to genetic diagnostic tests to detect patient mutations associated with certain diseases, the report urges patent holders to allow others to perform the tests for purposes of verifying the results. (Board on Science, Technology and Economic Policy (STEP) and Committee on Science, Technology, and Law (CSTL))

The National Cancer Policy Board and the Board on Science, Technology, and Economic Policy convened a workshop in January 2004 on “Economic Models of Colorectal Cancer (CRC) Screening in Average-Risk Adults”. The purpose of the workshop was to explore the reasons for differences among leading cost-effectiveness analysis (CEA) models of CRC screening, which public health policy makers increasingly rely on to help them sift through the many choices confronting them. Participants discussed the results of a collaborative pre-workshop exercise undertaken by five research teams that have developed and maintained comprehensive models of CRC screening in average-risk adults, to gain insight into each model’s structure and assumptions and possible explanations for differences in their published analyses. Workshop participants also examined the current state of knowledge on key inputs to the models with a view toward identifying areas where further research may be warranted. This document summarizes the presentations and discussion at the workshop. (Board on Science, Technology and Economic Policy (STEP))

The outlook for women with breast cancer has improved in recent years. Due to the combination of improved treatments and the benefits of mammography screening, breast cancer mortality has decreased steadily since 1989. Yet breast cancer remains a major problem, second only to lung cancer as a leading cause of death from cancer for women. To date, no means to prevent breast cancer has been discovered, and experience has shown that treatments are most effective when a cancer is detected early, before it has spread to other tissues. These two facts suggest that the most effective way to continue reducing the death toll from breast cancer is improved early detection and diagnosis. Building on the 2001 report Mammography and Beyond, this new book not only examines ways to improve implementation and use of new and current breast cancer detection technologies but also evaluates the need to develop tools that identify women who would benefit most from early detection screening. Saving Women's Lives: Strategies for Improving Breast Cancer Detection and Diagnosis encourages more research that integrates the development, validation, and analysis of the types of technologies in clinical practice that promote improved risk identification techniques. In this way, methods and technologies that improve detection and diagnosis can be more effectively developed and implemented. (Board on Science, Technology and Economic Policy (STEP))

The U.S. patent system is in an accelerating race with human ingenuity and investments in innovation. In many respects the system has responded with admirable flexibility, but the strain of continual technological change and the greater importance ascribed to patents in a knowledge economy are exposing weaknesses, including questionable patent quality, rising transaction costs, impediments to the dissemination of information through patents, and international inconsistencies. A panel including a mix of legal expertise, economists, technologists, and university and corporate officials recommends significant changes in the way the patent system operates. The report urges creation of a mechanism for post-grant challenges to newly issued patents, reinvigoration of the non-obviousness standard to qualify for a patent, strengthening of the U.S. Patent and Trademark Office, simplified and less costly litigation, harmonization of the U.S., European, and Japanese examination processes, and protection of some research from patent infringement liability. (Board on Science, Technology and Economic Policy (STEP))

The EPA commissioned The National Academies to provide advice on the vexing question of whether and, if so, under what circumstances EPA should accept and consider intentional human dosing studies conducted by companies or other sources outside the agency (so-called third parties) to gather evidence relating to the risks of a chemical or the conditions under which exposure to it could be judged safe. This report recommends that such studies be conducted and used for regulatory purposes only if all of several strict conditions are met, including the following:
• The study is necessary and scientifically valid, meaning that it addresses an important regulatory question that can’t be answered with animal studies or nondosing human studies;
• The societal benefits of the study outweigh any anticipated risks to participants. At no time, even when benefits beyond improved regulation exist, can a human dosing study be justified that is anticipated to cause lasting harm to study participants; and
• All recognized ethical standards and procedures for protecting the interests of study participants are observed.
In addition, EPA should establish a Human Studies Review Board (HSRB) to evaluate all human dosing studies – both at the beginning and upon completion of the experiments – if they are carried out with the intent of affecting the agency's policy-making. (Committee on Science, Technology, and Law (CSTL))

The National Academies Science, Technology, and Law Program convened three workshops focusing on specific aspects of OMB's "Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies." The workshops were intended to assist the agencies in developing their agency-specific implementation guidelines. This workshop report details the approaches agencies are considering using to implement the guidelines. (Committee on Science, Technology, and Law (CSTL))

This volume assembles papers commissioned by the National Research Council’s Board on Science, Technology, and Economic Policy (STEP) to inform judgments about the significant institutional and policy changes in the patent system made over the past two decades. The chapters fall into three areas. The first four chapters consider the determinants and effects of changes in patent “quality.” Quality refers to whether patents issued by the U.S. Patent and Trademark Office (USPTO) meet the statutory standards of patentability, including novelty, nonobviousness, and utility. The fifth and sixth chapters consider the growth in patent litigation, which may itself be a function of changes in the quality of contested patents. The final three chapters explore controversies associated with the extension of patents into new domains of technology, including biomedicine, software, and business methods. (Board on International Scientific Organizations (BISO))

This symposium brought together leading experts and managers from the public and private sectors who are involved in the creation, dissemination, and use of scientific and technical data and information (STI) to: (1) describe and discuss the role and the benefits and costs--both economic and other--of the public domain in STI in the research and education context, (2) identify and analyze the legal, economic, and technological pressures on the public domain in STI in research and education, (3) describe and discuss existing and proposed approaches to preserving the public domain in STI in the United States, and (4) identify issues that may require further analysis. (Board on International Scientific Organizations (BISO))

In the years since the Shelby Amendment, scientists, industry, and policy makers have struggled over how the public’s new right of access should be applied to scientific data. There is loose agreement that research data should be accessible, but wide disagreement over the “depth” to which the public has such a right. The National Academies’ Science, Technology, and Law Program held a workshop to explore the mounting tensions in the federal regulatory process between the need to provide access to research data and the need to protect the integrity of the research process. The workshop provided a picture of the debate arising from passage of the Shelby Amendment and the resulting OMB revisions of Circular A-110. This report is a summary of the workshop. (Committee on Science, Technology, and Law (CSTL))

A wave of new health care innovation and growing demand for health care, coupled with uncertain productivity improvements, could severely challenge efforts to control future health care costs. A committee of the National Research Council and the Institute of Medicine organized a conference to examine key health care trends and their impact on medical innovation. The conference addressed the following question: In an environment of renewed concern about rising health care costs, where can public policy stimulate or remove disincentives to the development, adoption and diffusion of high-value innovation in diagnostics, therapeutics, and devices? (Board on Science, Technology and Economic Policy (STEP))

Human reproductive cloning is an assisted reproductive technology that would be carried out with the goal of creating a newborn genetically identical to another human being. It is currently the subject of much debate around the world, involving a variety of ethical, religious, societal, scientific, and medical issues. Scientific and Medical Aspects of Human Reproductive Cloning considers the scientific and medical sides of this issue, plus ethical issues that pertain to human-subjects research. Based on experience with reproductive cloning in animals, the report concludes that human reproductive cloning would be dangerous for the woman, fetus, and newborn, and is likely to fail. The study panel did not address the issue of whether human reproductive cloning, even if it were found to be medically safe, would be or would not be acceptable to individuals or society. (Committee on Science, Engineering, and Public Policy (COSEPUP))

The federal courts are seeking ways to increase the ability of judges to deal with difficult issues of scientific expert testimony. This workshop summary explores the new environment that judges, plaintiffs, defendants, and experts face in light of Daubert and Kumho when presenting and evaluating scientific, engineering, and medical evidence. (Committee on Science, Technology, and Law (CSTL))