Repowering existing condensing power stations with gas turbogenerators offers an important opportunity to considerably improve their energy efficiency. The Modernization Potential of Gas Turbines in the Coal-Fired Power Industry presents the methodology, calculation procedures and tools used to support enterprise planning for adapting power stations to dual-fuel gas-steam combined-cycle technologies. Both the conceptual and the practical aspects of converting existing coal-fired power plants are covered. Discussions of the feasibility, the advantages and disadvantages, and the possible methods are supported by chapters presenting energy efficiency equations for the conditions of repowering a power unit by installing a gas turbogenerator in a parallel system, together with the results of technical calculations involving the selection of heating structures for heat recovery steam generators. A methodology for analyzing the thermodynamic and economic effectiveness of selecting a heat recovery steam generator structure for the repowered power unit is also explained. The Modernization Potential of Gas Turbines in the Coal-Fired Power Industry is an informative monograph written for researchers, postgraduate students and policy makers in power engineering.
This brief introduces people with a basic background in probability theory to various problems in cancer biology that are amenable to analysis using methods of probability theory and statistics. The title mentions "cancer biology," and the specific illustrative applications reference cancer data, but the methods themselves are more broadly applicable to all aspects of computational biology. Aside from providing a self-contained introduction to basic biology and to cancer, the brief describes four specific problems in cancer biology that are amenable to the application of probability-based methods. Each method is illustrated by applying it to actual data from the biology literature. After reading the brief, engineers and mathematicians should be able to collaborate fruitfully with their biologist colleagues on a wide variety of problems.
Modeling of Thermo-Electro-Mechanical Manufacturing Processes with Applications in Metal Forming and Resistance Welding provides readers with a basic understanding of the fundamental ingredients in plasticity, heat transfer and electricity that are necessary to develop and properly utilize computer programs based on the finite element flow formulation. Computer implementation of a wide range of theoretical and numerical subjects related to mesh generation, contact algorithms, elasticity, anisotropic constitutive equations, solution procedures and parallelization of equation solvers is comprehensively described. Illustrated and enriched with selected examples obtained from industrial applications, Modeling of Thermo-Electro-Mechanical Manufacturing Processes with Applications in Metal Forming and Resistance Welding works to diminish the gap between the developers of finite element computer programs and the professional engineers with expertise in industrial joining technologies by metal forming and resistance welding.
Stress-reducing defects and subsequent microcracks are a central focus during micromachining processes. After establishing the central process of micromachining, Micromachining with Nanostructured Cutting Tools explains the underlying theories that describe chip formation and applies elementary cutting theory to machining at the microscale. The book is divided into three parts, and its second half builds on this introduction, explaining how the frictional interactions between uncoated micro tools and micro tools coated with nanostructured coatings can be characterized using the elementary micromachining theories initially developed for machining at the macroscale. Shaw's methods for calculating temperatures at the interaction zone and Merchant's methods for calculating mechanical interactions are well described and justified for machining steel in both the dry and wet states. Finally, the further development and use of micro tools coated with thin-film nanostructured diamond are shown. Micromachining with Nanostructured Cutting Tools is a resource for engineers and scientists working in this new field of micro- and nanotechnology. Its explanations of how to characterize, apply and adapt traditional approaches to the mechanics of practical machining to the machining of microproducts using nanostructured tools provide a reliable reference for researchers and practitioners alike.
Energy efficiency plays, and will continue to play, an important role in saving energy and mitigating greenhouse gas (GHG) emissions worldwide. However, little is known about how much additional capital should be invested to ensure that energy is used as efficiently as it should be, and even less about which sub-areas, technologies, and countries will achieve the greatest greenhouse gas emissions mitigation per dollar of investment in energy efficiency. Analyzing completed and slowly moving energy efficiency projects of the Global Environment Facility during 1991-2010, Closing the Gap: GEF Experiences in Global Energy Efficiency evaluates the impacts of multi-billion-dollar investments in world energy efficiency. It covers the following areas:
1. A review of world energy efficiency investment, disclosing the global energy efficiency gap and the market barriers that cause it;
2. The leveraging of private funds with public funds and other resources in energy efficiency investments, and the use of these funds in tangible and intangible asset investments;
3. Investment effectiveness in dollars per metric ton of CO2 emissions mitigation in 10 energy efficiency sub-areas;
4. Major barriers causing failure and abandonment of energy efficiency investments;
5. Quantification of direct and indirect CO2 emissions mitigation inside and outside a project boundary; and
6. Classification and estimation of CO2 emissions mitigation from tangible and intangible asset investments.
Closing the Gap: GEF Experiences in Global Energy Efficiency can serve as a handbook for policymakers, project investors and managers, and project implementation practitioners in need of benchmarks in energy efficiency project investments for decision-making. It can also be used by students, researchers and other professionals in universities and research institutions to develop methodologies for evaluating energy efficiency projects and programs.
Progressive reductions in vehicle emission requirements have forced the automotive industry to invest in research and development of alternative control strategies. Continual control action exerted by a dedicated electronic control unit ensures that the best performance in terms of pollutant emissions and power density is married with driveability and diagnostics. Gasoline direct injection (GDI) engine technology is a way to attain these goals. This brief describes in detail the functioning of a GDI engine equipped with a common rail (CR) system, and the devices necessary to run test-bench experiments. The text should prove instructive to researchers in engine control, and the brief is recommended to students as a first approach to this technology. Later chapters of the brief relate an innovative strategy designed to assist with the engine management system; injection pressure regulation for fuel pressure stabilization in the CR fuel line is proposed and validated by experiment. The resulting control scheme is composed of a feedback integral action and a static model-based feed-forward action, the gains of which are scheduled as a function of fundamental plant parameters. The tuning of closed-loop performance is supported by an analysis of the phase margin and the sensitivity function. Experimental results confirm the effectiveness of the control algorithm in regulating the mean-value rail pressure independently of engine working conditions (engine speed and time of injection) with limited design effort.
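The general shape of the control scheme described in that blurb (integral feedback plus a static, model-based feed-forward term, with gains scheduled on plant parameters) can be sketched as follows. This is a minimal illustrative sketch, not the brief's actual controller: the gain maps, coefficients and signal names below are invented placeholders.

```python
def rail_pressure_controller(p_ref, p_meas, engine_speed, t_inj, state, dt):
    """One step of a hypothetical CR rail-pressure controller: a static
    model-based feed-forward term plus integral feedback, with the
    integral gain scheduled on an operating-point parameter.
    All numeric maps here are illustrative placeholders."""
    # Static feed-forward: actuator command estimated from the operating
    # point (engine speed, injection time) via an assumed linear map.
    u_ff = 0.002 * engine_speed + 0.5 * t_inj
    # Integral feedback on the rail-pressure error.
    error = p_ref - p_meas
    ki = 0.01 + 1e-6 * engine_speed   # scheduled integral gain (illustrative)
    state["integral"] += ki * error * dt
    return u_ff + state["integral"]

# Example step: 100 MPa setpoint, 95 MPa measured, 2000 rpm, 1.5 ms injection.
state = {"integral": 0.0}
u = rail_pressure_controller(p_ref=100e6, p_meas=95e6, engine_speed=2000.0,
                             t_inj=1.5e-3, state=state, dt=0.01)
```

The split mirrors the design rationale in the blurb: the feed-forward term does the bulk of the work from a plant model, while the integral action removes the residual steady-state error.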
Finite Element Method in Machining Processes provides a concise study of the way the Finite Element Method (FEM) is used in manufacturing processes, primarily in machining. The basics of this kind of modeling are detailed to create a reference that provides guidelines both for those now beginning to study this method and for scientists already involved in FEM who want to expand their research. A discussion of FEM formulations and techniques currently in use is followed by machining case studies. Orthogonal cutting, oblique cutting, 3D simulations for turning and milling, grinding, and state-of-the-art topics such as high speed machining and micromachining are explained with relevant examples. This is all supported by a literature review and a reference list for further study. As FEM is a key method for researchers in the manufacturing and especially the machining sector, Finite Element Method in Machining Processes is a key reference for students studying manufacturing processes as well as for industry professionals.
Change detection using remotely sensed images has many applications, such as urban monitoring, land-cover change analysis, and disaster management. This work investigates two-dimensional change detection methods. The existing methods in the literature are grouped into four categories: pixel-based, transformation-based, texture analysis-based, and structure-based. In addition to testing existing methods, four new change detection methods are introduced: fuzzy logic-based, shadow detection-based, local feature-based, and bipartite graph matching-based. The latter two methods form the basis for a structural analysis of change detection. Three thresholding algorithms are compared, and their effects on the performance of change detection methods are measured. These tests on existing and novel change detection methods make use of a total of 35 panchromatic and multi-spectral Ikonos image sets. Quantitative test results and their interpretations are provided.
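Of the four method families that work surveys, the pixel-based one is the simplest to illustrate: difference two co-registered images and threshold the result. The sketch below is a generic toy version under that assumption; the fixed threshold is a placeholder, whereas the work itself compares automatic thresholding algorithms.

```python
import numpy as np

def pixel_based_change_map(img1, img2, threshold=30):
    """Toy pixel-based change detection: absolute difference of two
    co-registered single-band images, followed by global thresholding.
    Returns a boolean change mask (True = changed pixel)."""
    # Cast to a signed type first so the subtraction cannot wrap around
    # for unsigned 8-bit imagery.
    diff = np.abs(img1.astype(np.int32) - img2.astype(np.int32))
    return diff > threshold

# Tiny 2x2 example: two pixels change strongly, two barely change.
a = np.array([[10, 200], [10, 10]], dtype=np.uint8)
b = np.array([[12, 100], [10, 90]], dtype=np.uint8)
change = pixel_based_change_map(a, b)
```

The other families in the taxonomy (transformation-based, texture analysis-based, structure-based) replace the raw pixel difference with progressively richer per-pixel or per-region features before the decision step.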
This work proposes a complete sensor-independent visual system that provides robust target motion detection. First, the way sensors obtain images, in terms of resolution distribution and pixel neighbourhood, is studied, allowing a spatial analysis of motion to be carried out. Then, a novel background maintenance approach for robust target motion detection is implemented. Two different situations are considered: a fixed camera observing a constant background in which objects are moving, and a still camera observing objects in movement within a dynamic background. This distinction reflects the aim of developing a surveillance mechanism that does not require observing a scene free of foreground elements for several seconds while a reliable initial background model is obtained, as that situation cannot be guaranteed when a robotic system works in an unknown environment. Further problems, such as changes in illumination and the distinction between foreground and background elements, are also successfully addressed.
The network management community has been pushed towards the design of alternative management approaches able to support heterogeneity, scalability, reliability, and minimal human intervention. The employment of self-* properties and Peer-to-Peer (P2P) technologies is seen as a promising alternative, able to provide the sophisticated solutions required. Despite having been developed in parallel, with few direct connections perceived between them, self-* properties and P2P can be used concurrently. In Self-* and P2P for Network Management: Design Principles and Case Studies, the authors explore the issues behind the joint use of self-* properties and P2P, and present: a survey relating autonomic computing and self-* properties, P2P, and network and service management; the design of solutions that explore the parallel and cooperative behavior of management peers; and the shift in network management solution development from APIs, protocols, architectures, and frameworks to the design of management algorithms.
Since the advent of the Semantic Web, interest in the dynamics of ontologies (ontology evolution) has grown significantly. Belief revision presents a good theoretical framework for dealing with this problem; however, classical belief revision is not well suited for logics such as Description Logics. Belief Revision in Non-Classical Logics presents a framework which can be applied to a wide class of logics that includes, besides most Description Logics such as the ones behind OWL, Horn logic and intuitionistic logic, amongst others. The author also presents algorithms for the most important constructions in belief bases. Researchers and practitioners in theoretical computer science will find this an invaluable resource.
In Ambient Intelligence (AmI) systems, reasoning is fundamental for triggering actions or adaptations according to specific situations that may be meaningful and relevant to some applications. However, such reasoning operations may need to evaluate context data collected from distributed sources and stored in different devices, as usually not all context data is readily available to the reasoners within the system. Decentralized Reasoning in Ambient Intelligence proposes a decentralized reasoning approach for performing rule-based reasoning about context data targeting AmI systems. For this purpose, the authors define a context model assuming context data distributed over two sides: the user side, represented by the users and their mobile devices, and the ambient side, represented by the fixed computational infrastructure and ambient services. They formalize the cooperative reasoning operation -- in which two entities cooperate to perform decentralized rule-based reasoning -- and define a complete process to perform this operation.
Research in context-aware computing has produced a number of middleware systems for context management. However, development of ubiquitous context-aware applications is still a challenge because most current middleware systems remain focused on isolated and static context-aware environments. Context-aware environments are inherently dynamic as a result of occasional additions or upgrades of sensors, applications or context inference mechanisms. Context Management for Distributed and Dynamic Context-Aware Computing proposes a novel architecture for context management based on the concept of context domains, allowing applications to keep context interests across distributed context management systems. The authors describe a distributed middleware that implements the aforementioned concepts without compromising the scalability and efficiency of context access.
Description Logics (DLs) are a family of formalisms used to represent knowledge of a domain. They are equipped with a formal logic-based semantics. Knowledge representation systems based on description logics provide various inference capabilities that deduce implicit knowledge from the explicitly represented knowledge. A Proof Theory for Description Logics introduces Sequent Calculi and Natural Deduction for some DLs (ALC, ALCQ). Cut-elimination and Normalization are proved for the calculi. The author argues that such systems can improve the extraction of computational content from DL proofs for explanation purposes.
Entropy Guided Transformation Learning: Algorithms and Applications presents entropy guided transformation learning (ETL), a machine learning algorithm for classification tasks. ETL generalizes Transformation Based Learning (TBL) by solving the TBL bottleneck: the construction of good template sets. ETL automatically generates templates using decision tree decomposition. The authors also describe ETL Committee, an ensemble method that uses ETL as the base learner. Experimental results show that ETL Committee improves the effectiveness of ETL classifiers. The application of ETL to four Natural Language Processing (NLP) tasks is presented: part-of-speech tagging, phrase chunking, named entity recognition and semantic role labeling. Extensive experimental results demonstrate that ETL is an effective way to learn accurate transformation rules, and that it achieves better results than TBL with handcrafted templates on all four tasks. By avoiding the use of handcrafted templates, ETL extends transformation rule learning to a greater range of tasks. Suitable for both advanced undergraduate and graduate courses, Entropy Guided Transformation Learning: Algorithms and Applications provides a comprehensive introduction to ETL and its NLP applications.
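The transformation-rule mechanism that TBL and ETL share can be sketched in a few lines: a rule rewrites a token's label whenever a template-defined context matches. The rule format, tags and sentence below are invented for illustration; ETL's actual contribution, per the blurb, is learning the templates themselves via decision tree decomposition rather than hand-crafting them.

```python
def apply_rule(tags, words, rule):
    """Apply one toy transformation rule of the form
    (from_tag, to_tag, context_word): change from_tag to to_tag
    whenever the previous word equals context_word.
    Returns a new tag list; the input list is left untouched."""
    from_tag, to_tag, ctx = rule
    out = list(tags)
    for i in range(1, len(tags)):
        if tags[i] == from_tag and words[i - 1] == ctx:
            out[i] = to_tag
    return out

# Toy POS-tagging example: an initial tagger guesses "can" is a modal (MD);
# the rule corrects it to a noun (NN) after the determiner "the".
words = ["the", "can", "rusted"]
tags = ["DT", "MD", "VBD"]
fixed = apply_rule(tags, words, ("MD", "NN", "the"))
```

In TBL, rules like this are learned greedily, each chosen to maximally reduce the remaining error on a training corpus; ETL's entropy-guided templates determine which context slots (such as "previous word" above) a rule is allowed to test.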
Software similarity and classification is an emerging topic with wide applications. It is applicable to the areas of malware detection, software theft detection, plagiarism detection, and software clone detection. Extracting program features, processing those features into suitable representations, and constructing distance metrics to define similarity and dissimilarity are the key methods to identify software variants, clones, derivatives, and classes of software. Software Similarity and Classification reviews the literature on those core concepts, in addition to relevant literature in each application area, and demonstrates that treating these applied problems as similarity and classification problems enables techniques to be shared between areas. Additionally, the authors present in-depth case studies using the software similarity and classification techniques developed throughout the book.
Distributed-order differential equations, a generalization of fractional calculus, are of increasing importance in many fields of science and engineering, from the behaviour of complex dielectric media to the modelling of nonlinear systems. This Brief will broaden the toolbox available to researchers interested in modeling, analysis, control and filtering. It contains contextual material outlining the progression from integer-order, through fractional-order, to distributed-order systems. Stability issues are addressed with graphical and numerical results highlighting the fundamental differences between constant-, integer-, and distributed-order treatments. The power of the distributed-order model is demonstrated with work on the stability of noncommensurate-order linear time-invariant systems. Generic applications of the distributed-order operator follow: signal processing and viscoelastic damping of a mass-spring setup. A new general approach to discretization of distributed-order derivatives and integrals is described. The Brief is rounded out with a consideration of likely future research and applications and with a number of MATLAB® codes to reduce repetitive coding tasks and encourage new workers in distributed-order systems.
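For readers new to the area, the distributed-order operator underlying that progression is commonly defined (in one standard form, which may differ from the Brief's own notation) as a weighted superposition of fractional-order derivatives:

```latex
% Distributed-order derivative: fractional orders \alpha are blended by a
% nonnegative weight (order-density) function w(\alpha) over [0, m].
\mathcal{D}^{w(\alpha)} f(t) \;=\; \int_{0}^{m} w(\alpha)\, \mathcal{D}^{\alpha} f(t)\, \mathrm{d}\alpha ,
\qquad w(\alpha) \ge 0 .
```

When $w(\alpha)$ is a finite sum of Dirac impulses, the integral collapses to a multi-term constant-order (fractional or integer) operator, which is how the constant- and integer-order treatments compared in the Brief arise as special cases.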
Service provisioning in ad hoc networks is challenging given the difficulties of communicating over a wireless channel and the potential heterogeneity and mobility of the devices that form the network. Service placement is the process of selecting an optimal set of nodes to host the implementation of a service in light of a given service demand and network topology. The key advantage of active service placement in ad hoc networks is that it allows for the service configuration to be adapted continuously at run time. Service Placement in Ad Hoc Networks proposes the SPi service placement framework as a novel approach to service placement in ad hoc networks. The SPi framework takes advantage of the interdependencies between service placement, service discovery and the routing of service requests to minimize signaling overhead. The work also proposes the Graph Cost / Single Instance and the Graph Cost / Multiple Instances placement algorithms.
Since their inception, automatic reading systems have evolved substantially, yet the recognition of handwriting remains an open research problem due to its substantial variation in appearance. With the introduction of Markovian models to the field, a promising modeling and recognition paradigm was established for automatic handwriting recognition. However, no standard procedures for building Markov model-based recognizers have yet been established. This text provides a comprehensive overview of the application of Markov models in the field of handwriting recognition, covering both hidden Markov models and Markov-chain or n-gram models. First, the text introduces the typical architecture of a Markov model-based handwriting recognition system and familiarizes the reader with the essential theoretical concepts behind Markovian models. Then, the text reviews solutions proposed in the literature for open problems in applying Markov model-based approaches to automatic handwriting recognition.
Biomechanics of the Brain will present an introduction to brain anatomy for engineers and scientists. Experimental techniques such as brain imaging and brain tissue mechanical property measurement will be discussed, as well as computational methods for neuroimage analysis and modeling of brain deformations due to impacts and neurosurgical interventions. Differences in brain trauma between the sexes will also be analyzed. Applications will include prevention and diagnosis of traumatic injuries, such as shaken baby syndrome, neurosurgical simulation and neurosurgical guidance, as well as brain structural disease modeling for diagnosis and prognosis. This will be the first book devoted to brain biomechanics. It will provide a comprehensive source of information on this important field for students, researchers, and medical professionals in the fields of computer-aided neurosurgery, head injury, and basic biomechanics.
Morton Deutsch is considered the founder of modern conflict resolution theory and practice. His writing and research pioneered current efforts in conflict resolution and diplomacy. This volume showcases six of Deutsch's most notable and influential papers, together with complementary chapters written by other significant contributors working in these areas, who situate the original papers in the context of the current state of scholarship.
Life-Cycle Assessment of Semiconductors presents the first, and thus far only, transparent and complete life-cycle assessment of semiconductor devices. A lack of reliable semiconductor LCA data has been a major challenge to evaluating the potential environmental benefits of information technologies (IT). The analysis and results presented in this book will allow a higher degree of confidence and certainty in decisions concerning the use of IT in efforts to reduce climate change and other environmental effects. Coverage includes, but is not limited to: semiconductor manufacturing trends by product type and geography; life-cycle assessment with a focus on uncertainty and sensitivity analysis of energy and global warming emissions for CMOS logic devices; life-cycle assessment of flash memory; and life-cycle assessment of DRAM. The information and conclusions discussed here will be highly relevant and useful to individuals and institutions.
Twenty-five years have elapsed since the original publication of Helium Cryogenics. During this time, a considerable amount of research and development involving helium fluids has been carried out, culminating in several large-scale projects. Furthermore, the field has matured through these efforts so that there is now a broad engineering base to assist the development of future projects. Helium Cryogenics, 2nd edition brings these advances in helium cryogenics together in an updated form. As in the original edition, the author's approach is to survey the field of cryogenics with emphasis on helium fluids. This approach is more specialized and fundamental than that of other cryogenics books, which treat the full range of cryogenic fluids. As a result, the level of treatment is more advanced and assumes a certain knowledge of fundamental engineering and physics principles, including some quantum mechanics. The goal throughout the work is to bridge the gap between the physics and engineering aspects of helium fluids, providing a source for engineers and scientists that will enhance their usefulness in low-temperature systems. Dr. Van Sciver is a Distinguished Research Professor and John H. Gorrie Professor of Mechanical Engineering at Florida State University. He is also a Program Director at the National High Magnetic Field Laboratory (NHMFL). Dr. Van Sciver joined the FAMU-FSU College of Engineering and the NHMFL in 1991, initiating and teaching a graduate program in magnet and materials engineering and in cryogenic thermal sciences and heat transfer. He also led the NHMFL development efforts of the cryogenic systems for the NHMFL Hybrid and 900 MHz NMR superconducting magnets. Between 1997 and 2003, he served as Director of Magnet Science and Technology at the NHMFL. Dr. Van Sciver is a Fellow of the ASME and the Cryogenic Society of America and American Editor for the journal Cryogenics. He is the 2010 recipient of the Kurt Mendelssohn Award.
Prior to joining Florida State University, Dr. Van Sciver was Research Scientist and then Professor of Nuclear Engineering, Engineering Physics and Mechanical Engineering at the University of Wisconsin-Madison from 1976 to 1991. During that time he also served as the Associate Director of the Applied Superconductivity Center. Dr. Van Sciver received his PhD in Low Temperature Physics from the University of Washington-Seattle in 1976. He received his BS degree in Engineering Physics from Lehigh University in 1970. Dr. Van Sciver is author of over 200 publications and patents in low temperature physics, liquid helium technology, cryogenic engineering and magnet technology. The first edition of Helium Cryogenics was published by Plenum Press (1986). The present work is an update and expansion of that original project.
The purpose of this book is to provide an up-to-date review of the nature and consequences of epigenetic changes in cancer. Epigenetics literally means "above" genetics, and consists of heritable gene expression or other phenotypic states not accounted for by DNA base sequence. Epigenetic changes are now known to make a large contribution to various aspects of tumorigenesis. These changes include alterations in global and promoter-specific DNA methylation, activating and repressive histone modifications, and changes in higher-order chromatin structures. Each of these topics will be covered in this book.
Transesophageal echocardiography (TEE) is a valuable diagnostic modality now routinely used during cardiac surgery and in the intensive care unit. Increasingly, anesthesiologists trained in TEE provide the service in both settings, where they face the challenge of integrating numerous current TEE guidelines into day-to-day practice. Perioperative Two-Dimensional Transesophageal Echocardiography: A Practical Handbook has been designed to be a concise, portable guide for using TEE to recognize cardiac pathology during the perioperative period. This compact guide has a diverse appeal for anesthesiologists, cardiac surgeons, and cardiologists desiring comprehensive, up-to-date echocardiographic information at their fingertips. Features:
- More than 450 full-color, high-quality clinical images and illustrations
- Synopsis of cardiac pathology commonly encountered in cardiac surgery patients
- Convenient spiral binding
- On-the-spot reference for echocardiographers with a wide range of experience, from novice to expert