Starting in the mid-1990s, the United States economy experienced an unprecedented upsurge in economic productivity. Rapid technological change in communications, computing, and information management continues to promise further gains in productivity, a phenomenon often referred to as the New Economy. To better understand the sources of these gains and the policy measures needed to sustain these positive trends, the National Academies Board on Science, Technology, and Economic Policy (STEP) convened a series of workshops and commissioned papers on Measuring and Sustaining the New Economy. This workshop, entitled “The Telecommunications Challenge: Changing Technologies and Evolving Policies,” brought together leading industry representatives and government officials to discuss issues generated by the rapid technological change occurring in the telecommunications industry and the regulatory and policy challenges this creates. The workshop presented a variety of perspectives relating to developments in the telecommunications industry, such as the potential of and impediments to broadband technology.
The chemical sector is a key part of the national economy and has been designated by the Department of Homeland Security (DHS) as one of 17 sectors comprising the nation's Critical Infrastructure. Although its products represent only 2 percent of the U.S. gross domestic product, those products underpin most other manufactured goods. To assist DHS in characterizing and mitigating the vulnerabilities faced by the nation from the chemical industry, this study examines classes of chemicals and chemical processes that are critical to the nation's security, economy, and health. It identifies vulnerabilities and points of weakness in the supply chain for these chemicals and chemical processes; assesses the likely impact of a significant disruption in the supply chain; identifies actions to help prevent disruption in the supply chain and mitigate loss and injury should such disruption occur; identifies incentives and disincentives to preventative and mitigating actions; and recommends areas of scientific, engineering, and economic research and development. The report concludes that the consequences of a deliberate attack on the chemical infrastructure would be expected to be similar in nature to the accidents we have already experienced. Under limited circumstances, such an attack could cause catastrophic casualties and loss of life, but it would take several simultaneous events to cause catastrophic economic consequences. Poor communication could amplify societal response. Overall, the recommendations in this report emphasize the benefit of investments to improve emergency preparedness for and response to chemical events. They also highlight the potential to minimize the physical hazards through development of cost-effective, safer processes that reduce the volume, toxicity, or hazardous conditions under which chemicals are processed.
A biological warfare agent (BWA) is a microorganism, or a toxin derived from a living organism, that causes disease in humans, plants, or animals or that causes the deterioration of material. The effectiveness of a BWA is greatly reduced if the attack is detected in time for the target population to take appropriate defensive measures. Therefore, the ability to detect a BWA, in particular to detect it before the target population is exposed, will be a valuable asset to defense against biological attacks. The ideal detection system will have quick response and be able to detect a threat plume at a distance from the target population. The development of reliable biological standoff detection systems, therefore, is a key goal. However, testing biological standoff detection systems is difficult because open-air field tests with BWAs are not permitted under international conventions and because the wide variety of environments in which detectors might be used may affect their performance. This book explores the question of how to determine whether or not a biological standoff detection system fulfills its mission reliably if we cannot conduct open-air field tests with live BWAs.
The 1991 NRC decadal survey for astronomy and astrophysics included a project called the Millimeter Array (MMA). This instrument would be an array of millimeter-wavelength telescopes intended to capture images of star-forming regions and distant star-burst galaxies. With the addition of contributions from Europe, the MMA evolved into the Atacama Large Millimeter Array (ALMA), a proposed array of 64 12-meter antennas. The project is now part of the NSF Major Research Equipment and Facilities budget request. Increased costs, however, have forced the NSF to reconsider the number of antennas. To help with that review, NSF asked the NRC to assess the scientific consequences of reducing the number of active antennas from 60 to either 50 or 40. This report presents an assessment of the effect of downsizing on technical performance specifications, performance degradation, and the ability to perform transformational science, and of the minimum number of antennas needed.
Chemistry plays a key role in conquering diseases, solving energy problems, addressing environmental problems, providing the discoveries that lead to new industries, and developing new materials and technologies for national defense and homeland security. However, the field is currently facing a crucial time of change and is struggling to position itself to meet the needs of the future as it expands beyond its traditional core toward areas related to biology, materials science, and nanotechnology. At the request of the National Science Foundation and the U.S. Department of Energy, the National Research Council conducted an in-depth benchmarking analysis to gauge the current standing of the U.S. chemistry field in the world. The Future of U.S. Chemistry Research: Benchmarks and Challenges highlights the main findings of the benchmarking exercise.
There is a growing sense of national urgency about the role of energy in long-term U.S. economic vitality, national security, and climate change. This urgency is the consequence of many factors, including the rising global demand for energy; the need for long-term security of energy supplies, especially oil; growing global concerns about carbon dioxide emissions; and many other factors affected to a great degree by government policies both here and abroad. On March 13, 2008, the National Academies brought together many of the most knowledgeable and influential people working on energy issues today to discuss how we can meet the need for energy without irreparably damaging Earth's environment or compromising U.S. economic and national security, a complex problem that will require technological and social changes that have few parallels in human history. The National Academies Summit on America's Energy Future: Summary of a Meeting chronicles that 2-day summit and serves as a current and far-reaching foundation for examining energy policy. The summit is part of the ongoing project 'America's Energy Future: Technology Opportunities, Risks, and Tradeoffs,' which will produce a series of reports providing authoritative estimates and analysis of the current and future supply of and demand for energy; new and existing technologies to meet those demands; their associated impacts; and their projected costs. The National Academies Summit on America's Energy Future: Summary of a Meeting is an essential base for anyone with an interest in strategic, tactical, and policy issues. Federal and state policy makers will find this book invaluable, as will industry leaders, investors, and others willing to convert concern into action to solve the energy problem.
Although its importance is not always recognized, theory is an integral part of all biological research. Biologists' theoretical and conceptual frameworks inform every step of their research, affecting what experiments they do, what techniques and technologies they develop and use, and how they interpret their data. By examining how theory can help biologists answer questions like "What are the engineering principles of life?" or "How do cells really work?" the report shows how theory synthesizes biological knowledge from the molecular level to the level of whole ecosystems. The book concludes that theory is already an inextricable thread running throughout the practice of biology; but that explicitly giving theory equal status with other components of biological research could help catalyze transformative research that will lead to creative, dynamic, and innovative advances in our understanding of life.
Across the United States, municipalities, counties, and states grapple with issues of ensuring adequate amounts of water in times of high demand and low supply. Instream flow programs aim to balance ecosystem requirements and human uses of water, and try to determine how much water should be in rivers. With its range of river and ecosystem conditions, growing population, and high demands on water, Texas is representative of instream flow challenges across the United States, and its instream flow program may be a model for other jurisdictions. Three state agencies—the Texas Water Development Board (TWDB), the Texas Parks and Wildlife Department (TPWD), and the Texas Commission on Environmental Quality (TCEQ)—asked a committee of the National Research Council (NRC) to review the Programmatic Work Plan (PWP) and Technical Overview Document (TOD) that outline the state’s instream flow initiative. The committee suggested several changes to the proposed plan, such as establishing clearer goals, modifying the flow chart that outlines the necessary steps for conducting an instream flow study, and providing better linkages between individual studies of biology, hydrology and hydraulics, physical processes, and water quality.
The Climate Change Science Program (CCSP) and its predecessor U.S. Global Change Research Program have sponsored climate research and observations for nearly 15 years, yet the overall progress of the program has not been measured systematically. Metrics—a system of measurement that includes the item being measured, the unit of measurement, and the value of the unit—offer a tool for measuring such progress; improving program performance; and demonstrating program successes to Congress, the Office of Management and Budget, and the public. This report lays out a framework for creating and implementing metrics for the CCSP. A general set of metrics provides a starting point for identifying the most important measures, and the principles provide guidance for refining the metrics and avoiding unintended consequences.
Formed in 1995 by EPA, several other federal and state agencies, and several private organizations, the National Advisory Committee on Acute Exposure Guideline Levels for Hazardous Substances (referred to as the NAC) develops, reviews, and approves acute exposure guideline levels (AEGLs) for up to 400 extremely hazardous substances (EHSs). AEGLs have a broad array of potential applications for federal, state, and local governments and for the private sector. They are necessary for prevention and emergency-response planning for potential releases of EHSs, either from accidents or as a result of terrorist activities. This report includes an assessment of the NAC’s draft AEGLs documents for the following 10 chemicals: 1,4-dioxane; chloroform; carbon tetrachloride; sulfur dioxide; cis- and trans-1,2-dichloroethylene; monochloroacetic acid; carbon monoxide; fluorine; methanol; and phenol.
Today's world of rapid social, technological, and behavioral change provides new opportunities for communications with few limitations of time and space. Through these communications, people leave behind an ever-growing collection of traces of their daily activities, including digital footprints provided by text, voice, and other modes of communication. Meanwhile, new techniques for aggregating and evaluating diverse and multimodal information sources are available to security services that must reliably identify communications indicating a high likelihood of future violence. In the context of this changed and changing world of communications and behavior, the Board on Behavioral, Cognitive, and Sensory Sciences of the National Research Council presents this volume of three papers as one portion of the vast subject of threatening communications and behavior. The papers review the behavioral and social sciences research on the likelihood that someone who engages in abnormal and/or threatening communications will actually then try to do harm. The focus is on how the scientific knowledge can inform and advance future research on threat assessments, in part by considering the approaches and techniques used to analyze communications and behavior in the dynamic context of today's world. The papers in the collection were written within the context of protecting high-profile public figures from potential attack or harm. The research, however, is broadly applicable to U.S. national security, including potential applications for analysis of communications from leaders of hostile nations and public threats from terrorist groups. This work highlights the complex psychology of threatening communications and behavior, and it offers knowledge and perspectives from multiple domains that contribute to a deeper understanding of the value of communications in predicting and preventing violent behaviors.
TRB and the Board on Energy and Environmental Systems, part of the National Academies’ Division on Engineering and Physical Sciences (DEPS), have released Special Report 286, Tires and Passenger Vehicle Fuel Economy: Informing Consumers, Improving Performance. This report examines the contribution of tires to vehicle fuel consumption and the prospects for improving tire energy performance without adversely affecting tire life, traction capability, and retail prices. The report reviews the technical literature and analyzes energy performance data from nearly 200 passenger tires on the market today. National fuel savings from improving the energy efficiency of passenger tires by 10 percent are quantified and the implications for consumer spending on tires, motor vehicle safety, and scrap tire generation are considered. Observing that consumers are given little, if any, information on the fuel economy effects of tires, the report recommends that government and industry cooperate to fill this information gap.
Although more women than men participate in higher education in the United States, the same is not true when it comes to pursuing careers in science and engineering. To Recruit and Advance: Women Students and Faculty in Science and Engineering identifies and discusses better practices for recruitment, retention, and promotion for women scientists and engineers in academia. Seeking to move beyond yet another catalog of challenges facing the advancement of women in academic science and engineering, this book describes actions actually taken by universities to improve the situation for women. Serving as a guide, it examines the following: recruitment of female undergraduates and graduate students; ways of reducing attrition in science and engineering degree programs in the early undergraduate years; improving retention rates of women at critical transition points—from undergraduate to graduate student, from graduate student to postdoc, from postdoc to first faculty position; recruitment of women for tenure-track positions; increasing the tenure rate for women faculty; and increasing the number of women in administrative positions. This guide offers numerous solutions that may be of use to other universities and colleges, and it will be an essential resource for students, faculty, deans, provosts, presidents, and anyone else interested in improving the position of women in science and engineering.
Water is our most fundamental natural resource, a resource that is limited. Challenges to our nation's water resources continue to grow, driven by population growth, ecological needs, climate change, and other pressures. The nation needs more and improved water science and information to meet these challenges. Toward a Sustainable and Secure Water Future reviews the United States Geological Survey's (USGS) Water Resource Discipline (WRD), one of the nation's foremost water science organizations. This book provides constructive advice to help the WRD meet the nation's water needs over the coming decades. Of interest primarily to the leadership of the USGS WRD, many findings and recommendations also target the USGS leadership and the Department of the Interior (DOI), because their support is necessary for the WRD to respond to the water needs of the nation.
The U.S. Special Operations Command (SOCOM) was formed in response to the failed rescue attempt in 1980 of American hostages held by Iran. Among its key responsibilities, SOCOM plans and synchronizes operations against terrorist networks. Special operations forces (SOF) often operate alone in austere environments with only the items they can carry, which makes equipment size, weight, and power needs especially important. Specialized radios and supporting equipment must be carried by the teams for their radio-frequency (RF) operations. As warfighting demands on SOCOM have intensified, SOCOM's needs for significantly improved RF systems have increased. Toward a Universal Radio Frequency System for Special Operations Forces examines the current state of the art for both handheld and manpackable platform-mounted RF systems, and determines which frequencies could be provided by handheld systems. The book also explores whether or not a system that fulfills SOF's unique requirements could be deployed in a reasonable time period. Several recommendations are included to address these and other issues.
Demographic changes, immigration, economic upheavals, and changing societal mores are creating new and altered structures, processes, and relationships in American families today. As families undergo rapid change, family science is at the brink of a new and exciting integration across methods, disciplines, and epistemological perspectives. The purpose of The Science of Research on Families: A Workshop, held in Washington, DC, on July 13-14, 2010, was to examine the broad array of methodologies used to understand the impact of families on children's health and development. It sought to explore individual disciplinary contributions and the ways in which different methodologies and disciplinary perspectives could be combined in the study of families. Toward an Integrated Science of Research on Families documents the information presented in the workshop presentations and discussions. The report explores the idea of family research as being both basic and applied, offering opportunities for learning as well as intervention. It discusses research as being most useful when organized around particular problems, such as obesity or injury prevention. Toward an Integrated Science of Research on Families offers a problem-oriented approach that can guide a broad-based research program that extends across funders, institutions, and scientific disciplines.
Despite many advances, security and privacy often remain too complex for individuals or enterprises to manage effectively or to use conveniently. Security is hard for users, administrators, and developers to understand, making it all too easy to use, configure, or operate systems in ways that are inadvertently insecure. Moreover, security and privacy technologies originally were developed in a context in which system administrators had primary responsibility for security and privacy protections and in which the users tended to be sophisticated. Today, the user base is much wider--including the vast majority of employees in many organizations and a large fraction of households--but the basic models for security and privacy are essentially unchanged. Security features can be clumsy and awkward to use and can present significant obstacles to getting work done. As a result, cybersecurity measures are all too often disabled or bypassed by the users they are intended to protect. Similarly, when security gets in the way of functionality, designers and administrators deemphasize it. The result is that end users often engage in actions, knowingly or unknowingly, that compromise the security of computer systems or contribute to the unwanted release of personal or other confidential information. Toward Better Usability, Security, and Privacy of Information Technology discusses computer system security and privacy, their relationship to usability, and research at their intersection.
Observable changes with regional and global implications, such as warming temperatures and reduced sea ice, are taking place across the Arctic. However, the record of Arctic observations suffers from incomplete geographic coverage and limited duration, and measurements are not well coordinated. This makes it difficult to comprehensively describe current conditions in the Arctic, let alone understand the changes that are underway or their connections to the rest of the Earth system. The U.S. National Science Foundation asked for guidance to help design a pan-arctic observing network. This book outlines the potential scope, composition, and implementation strategy for an arctic observing network. Such an integrated, complete, and multidisciplinary environmental observing network will improve society's understanding of and ability to respond to ongoing systemic changes in the Arctic and its capability to anticipate, predict, and respond to future change both in the Arctic and around the globe. The network would build on and enhance existing national and international efforts and deliver easily accessible, complete, reliable, timely, long-term, pan-arctic observations. Because many potential components of the network already exist or are being planned, and because of the surge of activity during the International Polar Year, there is an immediate opportunity for major progress.
The National Weather Service (NWS) is responsible for providing flood forecasts and warnings in the United States. The agency established the Advanced Hydrologic Prediction Services (AHPS) program in 1997 to advance technology for hydrologic services, specifically to provide accurate forecasts that support timely warnings for all users of hydrologic predictions. AHPS strives to provide information at the right time to facilitate adequate responses to mitigate damages to life, livelihoods, and property. AHPS is slated to be fully implemented nationwide in 2013. With seven years still remaining in its development and implementation timeline, a review of the program now is critical to providing NWS with the information it needs to maximize the effectiveness of the AHPS program. This report assesses AHPS with respect to hydrologic science and technology research, river routing and mechanics, "systems" engineering aspects, and implementation. Overall, this report finds AHPS to be an ambitious program that promises to provide services and products that are timely and necessary. The report calls for AHPS to develop a detailed and comprehensive, multi-year implementation plan and for the program’s goals and budget to be brought into closer alignment.
In the last 20 years, there has been a remarkable emergence of innovations and technological advances that are generating promising changes and opportunities for sustainable agriculture, yet at the same time the agricultural sector worldwide faces numerous daunting challenges. Not only is the agricultural sector expected to produce adequate food, fiber, and feed, and contribute to biofuels to meet the needs of a rising global population, it is expected to do so under increasingly scarce natural resources and climate change. Growing awareness of the unintended impacts associated with some agricultural production practices has led to heightened societal expectations for improved environmental, community, labor, and animal welfare standards in agriculture. Toward Sustainable Agricultural Systems in the 21st Century assesses the scientific evidence for the strengths and weaknesses of different production, marketing, and policy approaches for improving agricultural sustainability and reducing the costs and unintended consequences of agricultural production. It discusses the principles underlying farming systems and practices that could improve their sustainability. It also explores how those lessons learned could be applied to agriculture in different regional and national settings, with an emphasis on sub-Saharan Africa. By presenting practices and how they could be combined in a systems approach to improving the sustainability of U.S. agriculture, this book can have a profound impact on the development and implementation of sustainable farming systems. Toward Sustainable Agricultural Systems in the 21st Century serves as a valuable resource for policy makers, farmers, experts in food production and agribusiness, and federal regulatory agencies.
In 2007, the National Research Council envisioned a new paradigm in which biologically important perturbations in key toxicity pathways would be evaluated with new methods in molecular biology, bioinformatics, computational toxicology, and a comprehensive array of in vitro tests based primarily on human biology. Although some considered the vision too optimistic with respect to the promise of the new science, no one can deny that a revolution in toxicity testing is under way. New approaches are being developed, and data are being generated. As a result, the U.S. Environmental Protection Agency (EPA) expects a large influx of data that will need to be evaluated. EPA also is faced with tens of thousands of chemicals on which toxicity information is incomplete and emerging chemicals and substances that will need risk assessment and possible regulation. Therefore, the agency asked the National Research Council to convene a symposium to stimulate discussion on the application of the new approaches and data in risk assessment. The symposium was held on May 11-13, 2009, in Washington, DC, and included presentations and discussion sessions on pathway-based approaches for hazard identification, applications of new approaches to mode-of-action analyses, the challenges to and opportunities for risk assessment in the changing paradigm, and future directions.
Toxicity testing in laboratory animals provides much of the information used by the Environmental Protection Agency (EPA) to assess the hazards and risks associated with exposure to environmental agents that might harm public health or the environment. The data are used to establish maximum acceptable concentrations of environmental agents in drinking water, set permissible limits of exposure of workers, define labeling requirements, establish tolerances for pesticide residues on food, and set other kinds of limits on the basis of risk assessment. Because the number of regulations that require toxicity testing is growing, EPA called for a comprehensive review of established and emerging toxicity-testing methods and strategies. This interim report reviews current toxicity-testing methods and strategies and near-term improvements in toxicity-testing approaches proposed by EPA and others. It identifies several recurring themes and questions in the various reports reviewed. The final report will present a long-range vision and strategic plan to advance the practices of toxicity testing and human health assessment of environmental contaminants.
Once dismissed by the medical profession as a purely cosmetic problem, obesity now ranks second only to smoking as a wholly preventable cause of death. Indeed, it's implicated in 300,000 deaths each year and is a major contributor to heart disease, diabetes, high blood pressure, high cholesterol, and depression. Even conservative estimates show that 15% of all children are now considered to be overweight--worldwide there are 22 million kids under five years old who are defined as fat. Supersized portions, unhealthy diets, and too little physical activity certainly contribute to what's making kids 'fat'. But that's not the whole story. Researchers are at a loss to explain why obesity rates rose so suddenly and so steeply in the closing decades of the 20th century. But head out to the beaches, playgrounds, and amusement parks, and it's obvious that overweight children are more numerous and conspicuous. We see it in our neighborhoods and we read it in the headlines. Our nation--indeed the world--is in crisis. But knowledge is power, and it's time to arm ourselves in the battle to win the war on obesity. Fed Up! is just what the doctor ordered. Based in part on the Institute of Medicine's ground-breaking report on childhood obesity, this new book from family physician and journalist Susan Okie provides in-depth background on the issue; shares heartrending but instructive case studies that illustrate just how serious and widespread the problem is; and gives honest, authoritative, science-based advice that constitutes our best weapon in this critical battle.
Combustion has provided society with most of its energy needs for millennia, from igniting the fires of cave dwellers to propelling the rockets that traveled to the Moon. Even in the face of climate change and the increasing availability of alternative energy sources, fossil fuels will continue to be used for many decades. However, they will likely become more expensive, and pressure to minimize undesired combustion by-products (pollutants) will likely increase. The trends in the continued use of fossil fuels and likely use of alternative combustion fuels call for more rapid development of improved combustion systems. In January 2009, the Multi-Agency Coordinating Committee on Combustion Research (MACCCR) requested that the National Research Council (NRC) conduct a study of the structure and use of a cyberinfrastructure (CI) for combustion research. The charge to the authoring committee of Transforming Combustion Research through Cyberinfrastructure was to: identify opportunities to improve combustion research through CI and the potential benefits to applications; identify necessary CI elements and evaluate the accessibility, sustainability, and economic models for various approaches; identify CI that is needed for education in combustion science and engineering; and identify human, cultural, institutional, and policy challenges and how other fields are addressing them. Transforming Combustion Research through Cyberinfrastructure also estimates the resources needed to provide stable, long-term CI for research in combustion and recommends a plan for enhanced exploitation of CI for combustion research.