Browse Results

Showing 21,001 through 21,025 of 61,842 results

High-Performance Computing Using FPGAs

by Khaled Benkrid and Wim Vanderbauwhede

High-Performance Computing Using FPGAs covers the area of high-performance reconfigurable computing (HPRC), providing an overview of architectures, tools and applications for HPRC. FPGAs offer very high I/O bandwidth and fine-grained, custom and flexible parallelism; ever-increasing computational needs, coupled with the frequency/power wall, the increasing maturity and capabilities of FPGAs, and the acceptance of parallel computational models brought about by the advent of multicore processors, have made HPRC increasingly attractive. The part on architectures introduces different FPGA-based HPC platforms: attached co-processor HPRC architectures such as CHREC's Novo-G and EPCC's Maxwell systems; tightly coupled HPRC architectures, e.g. the Convey hybrid-core computer; reconfigurably networked HPRC architectures, e.g. the QPACE system; and standalone HPRC architectures such as EPFL's CONFETTI system. The part on tools focuses on high-level programming approaches for HPRC, with chapters on C-to-gates tools (such as Impulse-C, AutoESL, Handel-C, MORA-C++); graphical tools (MATLAB-Simulink, NI LabVIEW); and domain-specific languages and languages for heterogeneous computing (for example OpenCL, Microsoft's Kiwi and Alchemy projects). The part on applications presents case studies from several application domains where HPRC has been used successfully, such as bioinformatics and computational biology; financial computing; stencil computations; information retrieval; lattice QCD; astrophysics simulations; and weather and climate modeling.

Mathematics of Complexity and Dynamical Systems

by Robert A. Meyers

Mathematics of Complexity and Dynamical Systems is an authoritative reference to the basic tools and concepts of complexity, systems theory, and dynamical systems from the perspective of pure and applied mathematics. Complex systems are systems that comprise many interacting parts with the ability to generate a new quality of collective behavior through self-organization, e.g. the spontaneous formation of temporal, spatial or functional structures. These systems are often characterized by extreme sensitivity to initial conditions as well as emergent behavior that is not readily predictable or even completely deterministic. The more than 100 entries in this wide-ranging, single-source work provide a comprehensive explication of the theory and applications of mathematical complexity, covering ergodic theory, fractals and multifractals, dynamical systems, perturbation theory, solitons, systems and control theory, and related topics. Mathematics of Complexity and Dynamical Systems is an essential reference for all those interested in mathematical complexity, from undergraduate and graduate students up through professional researchers.

Terahertz Imaging for Biomedical Applications

by Brian W.-H. Ng, Xiaoxia Yin, and Derek Abbott

Terahertz biomedical imaging has become an area of interest due to its ability to simultaneously acquire both image and spectral information. Terahertz imaging systems are being commercialized, with increasing trials performed in a biomedical setting. As a result, advanced digital image processing algorithms are needed to assist screening, diagnosis, and treatment. "Pattern Recognition and Tomographic Reconstruction" presents these necessary algorithms, which will play a critical role in the accurate detection of abnormalities present in biomedical imaging. Terahertz tomographic imaging and detection technology contributes to the ability to identify opaque objects with clear boundaries, and would be useful in both in vivo and ex vivo environments, making this book a must-read for anyone in the field of biomedical engineering and digital imaging.

Undoing Ethics

by Natasha Whiteman

Over the past decade, researchers from different academic disciplines have paid increasing attention to the productivity of online environments. The ethical underpinnings of research in such settings, however, remain contested and often controversial. As traditional debates have been reignited by the need to respond to the particular characteristics of technologically-mediated environments, researchers have re-entered key debates regarding the moral, legal and regulative aspects of research ethics. A growing trend in this work has been towards the promotion of localized and contextualized research ethics - the suggestion that the decisions we make should be informed by the nature of the environments we study and the habits/expectations of participants within them. Despite such moves, the relationship between the empirical, theoretical and methodological aspects of Internet research ethics remains underexplored. Drawing from ongoing sociological research into the practices of media cultures online, this book provides a timely and distinctive response to this need. The book explores the relationship between the production of ethical stances in two different contexts: the ethical manoeuvring of participants within online media-fan communities and the ethical decision-making of the author as Internet researcher, manoeuvring, as it were, in the academic community. In doing so, the book outlines a reflexive framework for exploring research ethics at different levels of analysis: the empirical settings of research; the theoretical perspectives which inform the researcher's objectification of the research settings; and the methodological issues and practical decisions that constitute the activity as research. The analysis of these different levels develops a way of thinking about ethical practice in terms of stabilizing and destabilizing moves within and between research and researched communities. The analysis emphasizes the continuities and discontinuities between research practice and online media-fan activity, and between social activity in online and offline environments.

Geolocation Techniques

by Camillo Gentile, Ronald Raulefs, Nayef Alsindi, and Carole Teolis

Basics of Distributed and Cooperative Radio and Non-Radio Based Geolocation provides a detailed overview of geolocation technologies. The book covers the basic principles of geolocation, from ranging techniques to localization technologies, fingerprinting, and localization in wireless sensor networks. It also examines the latest algorithms and techniques, such as Kalman filtering, Gauss-Newton filtering, and particle filtering.

Localization in Wireless Networks

by Miodrag Potkonjak, Jessica Feng Sanford, and Sasha Slijepcevic

In a computational tour-de-force, this volume wipes away a host of problems related to location discovery in wireless ad-hoc sensor networks (WASNs). WASNs have recognized potential in many applications that are location-dependent, yet are heavily constrained by factors such as cost and energy consumption. Their "ad-hoc" nature, with direct rather than mediated connections between a network of wireless devices, adds another layer of difficulty. Basing this work entirely on data-driven, coordinated algorithms, the authors' aim is to present location discovery techniques that are highly accurate--and which fit user criteria. The research deploys nonparametric statistical methods and relies on the concept of joint probability to construct error (including location error) models and environmental field models. It also addresses system issues such as beacon broadcasting and scheduling. Reporting an impressive accuracy gain of almost 17 percent, and organized in a clear, sequential manner, this book represents a stride forward in wireless localization.

The Effects of Traffic Structure on Application and Network Performance

by Kevin Jeffay, Jay Aikat, and F. Donelson Smith

Over the past three decades, the Internet's rapid growth has spurred the development of new applications in mobile computing, digital music, online video, gaming and social networks. These applications rely heavily upon various underlying network protocols and mechanisms to enable, maintain and enhance their Internet functionality. The Effects of Traffic Structure on Application and Network Performance provides the necessary tools for maximizing the network efficiency of any Internet application, and presents ground-breaking research that will influence how these applications are built in the future. The book outlines how to design and run all types of networking experiments, and establishes the best practices in synthetic traffic generation for current and future researchers and practitioners to follow. It addresses some basic concepts and methods of traffic generation, but also details extensive empirical research in testing and evaluating network protocols and applications within a laboratory setting. The Effects of Traffic Structure on Application and Network Performance is designed as a reference book for networking professionals who must design, plan, test and evaluate their networks. Advanced-level students and researchers in computer science and engineering will find this book valuable as well.

Designing Sorting Networks

by Sherenaz W. Al-Haj Baddar and Kenneth E. Batcher

Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner. The book also outlines new ideas and techniques for designing faster sorting networks using Sortnet, and illustrates how these techniques were used to design faster 12-key and 18-key sorting networks through a series of case studies. Finally, it examines and explains the mysterious behavior exhibited by the fastest-known 9-step 16-key network. Designing Sorting Networks: A New Paradigm is intended as a reference book for advanced-level students, researchers and practitioners. Academics in the fields of computer science, engineering and mathematics will also find this book invaluable.

Social Semantics

by Harry Halpin

Social Semantics: The Search for Meaning on the Web provides a unique introduction to identity and reference theories of the World Wide Web through the academic lens of the philosophy of language and data-driven statistical models. The Semantic Web is a natural evolution of the Web, and this book covers the URL-based Web architecture and the Semantic Web in detail. It also has a robust empirical side with an impact on industry. The book discusses how the largest problem facing the Semantic Web is the problem of identity and reference, and how these are the results of a larger general theory of meaning. It hypothesizes that statistical semantics can solve these problems, as illustrated by case studies ranging from a pioneering study of tagging systems to using the Semantic Web to boost the results of commercial search engines. Social Semantics targets practitioners working in the related fields of the Semantic Web, search engines and information retrieval, as well as philosophers of language. Advanced-level students and researchers focusing on computer science will also find this book valuable as a secondary text or reference book.

Creating New Medical Ontologies for Image Annotation: A Case Study (SpringerBriefs in Electrical and Computer Engineering)

by Liana Stanescu, Dumitru Dan Burdescu, Marius Brezovan, and Cristian Gabriel Mihai

Creating New Medical Ontologies for Image Annotation focuses on the problem of automatic annotation of medical images, which the authors solve in an original manner. All the steps of this process are described in detail with algorithms, experiments and results. The original algorithms proposed by the authors are compared with other efficient similar algorithms. In addition, the authors treat the problem of creating ontologies in an automatic way, starting from Medical Subject Headings (MeSH). They present some efficient and relevant annotation models, as well as the basics of the annotation model used by the proposed system: Cross Media Relevance Models. Based on a text query, the system retrieves the images that contain objects described by the keywords.

Designing a New Class of Distributed Systems

by Rao Mikkilineni

Designing a New Class of Distributed Systems closely examines the Distributed Intelligent Managed Element (DIME) Computing Model, a new model for distributed systems, and provides a guide to implementing Distributed Managed Workflows with High Reliability, Availability, Performance and Security. The book also explores the viability of self-optimizing, self-monitoring autonomous DIME-based computing systems. Designing a New Class of Distributed Systems is designed for practitioners as a reference guide for innovative distributed systems design. Researchers working in a related field will also find this book valuable.

The Design of Cloud Workflow Systems

by Qiang He, Dahai Cao, Dong Yuan, Wenhao Li, Xiao Liu, Yun Yang, Gaofeng Zhang, and Jinjun Chen

Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high-quality cloud workflow applications. To address such an issue, this book presents a systematic investigation of the three critical aspects of the design of a cloud workflow system, viz. system architecture, system functionality and quality of service. Specifically, the system architecture for a cloud workflow system is designed based on the general four-layer cloud architecture, viz. application layer, platform layer, unified resources layer and fabric layer. The system functionality for a cloud workflow system is designed based on the general workflow reference model but with significant extensions to accommodate software services in the cloud. The support of QoS is critical for the quality of cloud workflow applications. This book presents a generic framework to facilitate a unified design and development process for software components that deliver lifecycle support for different QoS requirements. While the general QoS requirements for cloud workflow applications can have many dimensions, this book mainly focuses on three of the most important ones, viz. performance, reliability and security. In this book, the architecture, functionality and QoS management of our SwinDeW-C prototype cloud workflow system are demonstrated in detail as a case study to evaluate our generic design for cloud workflow systems. To conclude, this book offers a general overview of cloud workflow systems and provides comprehensive introductions to the design of the system architecture, system functionality and QoS management.

Simulation and Learning: A Model-Centered Approach

by Franco Landriscina

The main idea of this book is that to comprehend the instructional potential of simulation and to design effective simulation-based learning environments, one has to consider what happens both inside the computer and inside the students' minds. The framework adopted to do this is model-centered learning, in which simulation is seen as particularly effective when learning requires a restructuring of the individual mental models of the students, as in conceptual change. Mental models are themselves simulations, and thus simulation models can extend our biological capacity to carry out simulative reasoning. For this reason, recent approaches in cognitive science such as embodied cognition and the extended mind hypothesis are also considered in the book. A conceptual model called the "epistemic simulation cycle" is proposed as a blueprint for the comprehension of the cognitive activities involved in simulation-based learning and for instructional design.

Economics of Information Security and Privacy III

by Bruce Schneier

The Workshop on the Economics of Information Security (WEIS) is the leading forum for interdisciplinary scholarship on information security, combining expertise from the fields of economics, social science, business, law, policy and computer science. Prior workshops have explored the role of incentives between attackers and defenders, identified market failures dogging Internet security, and assessed investments in cyber-defense. Current contributions build on past efforts using empirical and analytic tools to not only understand threats, but also strengthen security through novel evaluations of available solutions. Economics of Information Security and Privacy III addresses the following questions: how should information risk be modeled given the constraints of rare incidence and high interdependence; how do individuals' and organizations' perceptions of privacy and security color their decision making; how can we move towards a more secure information infrastructure and code base while accounting for the incentives of stakeholders?

Enacting Electronic Government Success

by J. Ramon Gil-Garcia

Many countries around the world are investing a great amount of resources in government IT initiatives. However, few of these projects achieve their stated goals and some of them are complete failures. Therefore, understanding e-government success has become very important and urgent in recent years. In order to develop relevant knowledge about this complex phenomenon, researchers and practitioners need to identify and assess the main conditions, variables, or factors that have an impact on e-government success. However, before being able to evaluate these impacts, it is necessary to define what e-government success is and what some e-government success measures are. This book presents a review of both e-government success measures and e-government success factors. It also provides empirical evidence from quantitative analysis and two in-depth case studies. Although based on sound theory and rigorous empirical analysis, the book not only significantly contributes to academic knowledge, but also includes some practical recommendations for government officials and public managers. Theoretically, the book proposes a way to quantitatively operationalize Fountain's technology enactment framework. Based on the institutional tradition, the technology enactment framework attempts to explain the effects of organizational forms and institutional arrangements on the information technology used by government agencies. According to Fountain (1995; 2001), the technology enactment framework pays attention to the relationships among information technology, organizations, embeddedness, and institutions. This framework is very well known in the e-government field, but it is normally used for qualitative analysis and there has been no previous proposal of how to use it with quantitative data. The book proposes variables to measure each of the different constructs in this framework and also tests the relationships hypothesized by Fountain's theory. Finally, using the advantages of the selected quantitative analysis technique (Partial Least Squares), the study also proposes some adjustments and extensions to the original framework in a theory-building effort. Methodologically, the book reports on one of the first multi-method studies in the field of e-government in general and e-government success in particular. This study uses a nested research design, which combines statistical analysis with two in-depth case studies. The study begins with a statistical analysis using organizational, institutional, and contextual factors as the independent variables. An overall score representing e-government success in terms of the functionality of state websites is the dependent variable. Second, based on the statistical results, two cases are selected according to their relative fit to the model (residuals) and their position in the general ranking of website functionality (which includes four different measures). In order to complement the results of the statistical analysis, case studies were developed for the two selected states (New York and Indiana), using semi-structured interviews and document analysis. In terms of the statistical analysis, the book constitutes one of the first applications of Partial Least Squares (PLS) to an e-government success study. PLS is a structural equation modeling (SEM) technique and, therefore, allows estimating the measurement model and the structural model simultaneously. The use of this statistical strategy made it possible to test the relationships between e-government success and the different factors influencing it, as well as some of the relationships between several of the factors, thus also allowing the exploration of some indirect effects.

Theory, Analysis and Design of RF Interferometric Sensors

by Seoktae Kim and Cam Nguyen

Theory, Analysis and Design of RF Interferometric Sensors presents the theory, analysis and design of RF interferometric sensors. RF interferometric sensors are attractive for various sensing applications that require very fine resolution and accuracy as well as fast speed. The book also presents two millimeter-wave interferometric sensors realized using RF integrated circuits. The developed millimeter-wave homodyne sensor shows sub-millimeter resolution on the order of 0.05 mm without correction for the non-linear phase response of the sensor's quadrature mixer. The designed millimeter-wave double-channel homodyne sensor provides a resolution of only 0.01 mm, or 1/840th of the operating wavelength, and can inherently suppress the non-linearity of the sensor's quadrature mixer. The experimental results of displacement and velocity measurement are presented to demonstrate the sensing ability of RF interferometry and to illustrate its many possible applications in sensing. The book is succinct, yet the material is very much self-contained, enabling readers with an undergraduate background in electrical engineering or physics, together with some experience or graduate coursework in RF circuits, to understand it easily.

Handbook of Space Security

by Kai-Uwe Schrogl, Peter L. Hays, Jana Robinson, Denis Moura, and Christina Giannopapa

Space security involves the use of space (in particular communication, navigation, earth observation, and electronic intelligence satellites) for military and security purposes on earth, and also the maintenance of space (in particular the earth orbits) as safe and secure areas for conducting peaceful activities. The two aspects can be summarized as "space for security on earth" and the safeguarding of space for peaceful endeavors. The Handbook provides a sophisticated, cutting-edge resource on the space security policy portfolio and the associated assets, assisting fellow members of the global space community and other interested policy-making and academic audiences in keeping abreast of the current and future directions of this vital dimension of international space policy. The debate on coordinated space security measures, including relevant 'Transparency and Confidence-Building Measures,' remains at a relatively early stage of development. The book offers a comprehensive description of the various components of space security and how these challenges are being addressed today. It also provides a number of recommendations concerning how best to advance this space policy area, given the often competing objectives of the world's major space-faring nations. The critical role to be played by the United States and Europe as an intermediary and "middle diplomat" in promoting sustainable norms of behavior for space is likewise highlighted. In providing a global and coherent analytical approach to space security today, the Handbook focuses on four areas that together define the entire space security area: policies, technologies, applications, and programs. This structure assures an overall view of the subject from its political to its technical aspects. Internationally recognized experts in each of the above fields contribute, with their analytical synthesis assured by the section editors.

A Survey of Data Leakage Detection and Prevention Solutions (SpringerBriefs in Computer Science)

by Yuval Elovici, Asaf Shabtai, and Lior Rokach

SpringerBriefs present concise summaries of cutting-edge research and practical applications across a wide spectrum of fields. Featuring compact volumes of 50 to 100 pages (approximately 20,000-40,000 words), the series covers a range of content from professional to academic. Briefs allow authors to present their ideas and readers to absorb them with minimal time investment. As part of Springer's eBook collection, SpringerBriefs are published to millions of users worldwide. Information/data leakage poses a serious threat to companies and organizations, as the number of leakage incidents and the cost they inflict continue to increase. Whether caused by malicious intent or an inadvertent mistake, data loss can diminish a company's brand, reduce shareholder value, and damage the company's goodwill and reputation. This book aims to provide a structural and comprehensive overview of the practical solutions and current research in the DLP (data leakage prevention) domain. This is the first comprehensive book that is dedicated entirely to the field of data leakage and covers all important challenges and techniques to mitigate them. Its informative, factual pages will provide researchers, students and practitioners in the industry with a comprehensive, yet concise and convenient, reference source to this fascinating field. We have grouped existing solutions into different categories based on a described taxonomy. The presented taxonomy characterizes DLP solutions according to various aspects such as: leakage source, data state, leakage channel, deployment scheme, preventive/detective approaches, and the action upon leakage. In the commercial part we review solutions of the leading DLP market players based on professional research reports and material obtained from the vendors' websites. In the academic part we cluster the academic work according to the nature of the leakage and protection into various categories. Finally, we describe the main data leakage scenarios and present for each scenario the most relevant and applicable solution or approach that will mitigate and reduce the likelihood and/or impact of the leakage scenario.

Excel 2010 for Educational and Psychological Statistics

by Thomas J Quirk

Excel has become an important and nearly ubiquitous classroom and office resource for students and practitioners who are faced with solving statistical problems on an everyday basis. Despite this, there has yet to emerge a truly practical, "how-do-I-do-it" manual that teaches the various applications and processes/formulas for Excel in educational and psychological statistics. Quirk's Excel 2010 for Educational and Psychological Statistics fills this void, as it is designed to be a step-by-step, exercise-driven guide for education and psychology students who need to master Excel to create formulas and solve statistical problems. Each chapter first briefly explains the formulas that are included in the chapter, and then directs the student on how to use Excel commands and formulas to solve a specific educational or psychological statistics problem. Three practice problems are provided at the end of each chapter, along with their solutions in an appendix. At the end of the book, an additional practice exam allows the reader to test his or her understanding of each chapter by attempting to solve a specific educational or psychometric issue or problem using Excel (the solution to this problem is also given in an appendix). From the beginning of the book, readers/students are taught how to write their own formulas and then how to use Excel's drop-down formula menus for exercises involving one-way ANOVA, simple linear regression, and multiple correlation.

Disney Stories

by Newton Lee and Krystina Madej

Disney Stories: Getting to Digital explores how Disney, the man and the company, used technological innovation to create characters and stories that engage audiences in many different media, in particular in video games and on the Internet. Drawing on Disney films from the twenties and thirties, as well as the writings of historians, screenwriters and producers, the book explains how new film and animation techniques, many developed by Disney, worked together to evolve character and content development and produce entertaining stories that riveted audiences. Through an insider's perspective on Disney's legendary creation process, the book closely examines how the Disney Company moved its stories into the digital world in the 1990s and the virtual, online communities of the 2000s. By embracing the digital era, Disney led storytelling and technological innovation by granting its audience the unique opportunity to take part in its creation process through online games, including The Lion King Animated Story Book, Disney Blast and Toontown. Disney Stories: Getting to Digital is intended for Disney fans and current practitioners looking to study the creation process of one of the most famous animation studios in existence. Professors teaching courses in new media, animation and interactive storytelling will also find this book a valuable asset.

Basics of Computer Networking (SpringerBriefs in Electrical and Computer Engineering)

by Thomas Robertazzi

The Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks, covering technology for both wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. The book is written in a very accessible style for the interested layman by the author of a widely used textbook, drawing on many years of experience explaining concepts to the beginner.

Fault-Tolerant Design

by Elena Dubrova

This textbook serves as an introduction to fault tolerance, intended for upper-division undergraduate students, graduate-level students and practicing engineers in need of an overview of the field. Readers will develop skills in modeling and evaluating fault-tolerant architectures in terms of reliability, availability and safety. They will gain a thorough understanding of fault-tolerant computers, including both the theory of how to design and evaluate them and the practical knowledge of achieving fault tolerance in electronic, communication and software systems. Coverage includes fault-tolerance techniques through hardware, software, information and time redundancy. The content is designed to be highly accessible, including numerous examples and exercises. Solutions and PowerPoint slides are available for instructors.

Computational Modeling of Biological Systems: From Molecules to Pathways (Biological and Medical Physics, Biomedical Engineering)

by Nikolay V Dokholyan

Computational modeling is emerging as a powerful new approach to study and manipulate biological systems. Multiple methods have been developed to model, visualize, and rationally alter systems at various length scales, ranging from molecular modeling and design at atomic resolution to the modeling and analysis of cellular pathways. Processes at higher time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. This book provides an overview of the established computational methods used for modeling biologically and medically relevant systems.

MATLAB for Psychologists

by Alessandro Soranzo, Mauro Borgo, and Massimo Grassi

The matrix laboratory interactive computing environment--MATLAB--has brought creativity to research in diverse disciplines, particularly in designing and programming experiments. More commonly used in mathematics and the sciences, it also lends itself to a variety of applications across the field of psychology. For the novice looking to use it in experimental psychology research, though, becoming familiar with MATLAB can be a daunting task. MATLAB for Psychologists expertly guides readers through the component steps, skills, and operations of the software, with plentiful graphics and examples to match the reader's comfort level. Using an extended illustration, this concise volume explains the program's usefulness at any point in an experiment, without the limits imposed by other types of software. And the authors demonstrate the responsiveness of MATLAB to the individual's research needs, whether the task is programming experiments, creating sensory stimuli, running simulations, or calculating statistics for data analysis. Key features of the coverage include: thinking in a matrix way; handling and plotting data; guidelines for improved programming, sound, and imaging; statistical analysis and signal detection theory indexes; the graphical user interface; and the Psychophysics Toolbox. MATLAB for Psychologists serves a wide audience of advanced undergraduate and graduate-level psychology students, professors, and researchers, as well as lab technicians involved in programming psychology experiments.
