Probabilistic Graphical Models
by Luis Enrique Sucar
This accessible text/reference provides a general introduction to probabilistic graphical models (PGMs) from an engineering perspective. The book covers the fundamentals for each of the main classes of PGMs, including representation, inference and learning principles, and reviews real-world applications for each type of model. These applications are drawn from a broad range of disciplines, highlighting the many uses of Bayesian classifiers, hidden Markov models, Bayesian networks, dynamic and temporal Bayesian networks, Markov random fields, influence diagrams, and Markov decision processes. Features: presents a unified framework encompassing all of the main classes of PGMs; describes the practical application of the different techniques; examines the latest developments in the field, covering multidimensional Bayesian classifiers, relational graphical models and causal models; provides exercises, suggestions for further reading, and ideas for research or programming projects at the end of each chapter.
Probabilistic Graphical Models: Principles and Applications (Advances in Computer Vision and Pattern Recognition)
by Luis Enrique Sucar
This fully updated new edition of a uniquely accessible textbook/reference provides a general introduction to probabilistic graphical models (PGMs) from an engineering perspective. It features new material on partially observable Markov decision processes, graphical models, and deep learning, as well as an even greater number of exercises.
The book covers the fundamentals for each of the main classes of PGMs, including representation, inference and learning principles, and reviews real-world applications for each type of model. These applications are drawn from a broad range of disciplines, highlighting the many uses of Bayesian classifiers, hidden Markov models, Bayesian networks, dynamic and temporal Bayesian networks, Markov random fields, influence diagrams, and Markov decision processes.
Topics and features:
- Presents a unified framework encompassing all of the main classes of PGMs
- Explores the fundamental aspects of representation, inference and learning for each technique
- Examines new material on partially observable Markov decision processes and graphical models
- Includes a new chapter introducing deep neural networks and their relation to probabilistic graphical models
- Covers multidimensional Bayesian classifiers, relational graphical models, and causal models
- Provides substantial chapter-ending exercises, suggestions for further reading, and ideas for research or programming projects
- Describes classifiers such as Gaussian Naive Bayes, Circular Chain Classifiers, and Hierarchical Classifiers with Bayesian Networks
- Outlines the practical application of the different techniques
- Suggests possible course outlines for instructors
This classroom-tested work is suitable as a textbook for an advanced undergraduate or a graduate course in probabilistic graphical models for students of computer science, engineering, and physics.
Professionals wishing to apply probabilistic graphical models in their own field, or interested in the basis of these techniques, will also find the book to be an invaluable reference.
Dr. Luis Enrique Sucar is a Senior Research Scientist at the National Institute for Astrophysics, Optics and Electronics (INAOE), Puebla, Mexico. He received the National Science Prize in 2016.
Probabilistic Graphical Models: Principles and Techniques (Adaptive Computation and Machine Learning series)
by Daphne Koller and Nir Friedman
A general framework for constructing and using probabilistic models of complex systems that would enable a computer to use available information for making decisions.
Most tasks require a person or an automated system to reason—to reach conclusions based on available information. The framework of probabilistic graphical models, presented in this book, provides a general approach for this task. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter.
Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
Probabilistic Group Theory, Combinatorics, and Computing
by Alla Detinko, Dane Flannery, and Eamonn O'Brien
Probabilistic Group Theory, Combinatorics and Computing is based on lecture courses held at the Fifth de Brún Workshop in Galway, Ireland, in April 2011. Each course discusses computational and algorithmic aspects that have recently emerged at the interface of group theory and combinatorics, with a strong focus on probabilistic methods and results. The courses served as a forum for devising new strategic approaches and for discussing the main open problems to be solved in the further development of each area. The book represents a valuable resource for advanced lecture courses. Researchers at all levels are introduced to the main methods and the state-of-the-art, leading up to the very latest developments. One primary aim of the book's approach and design is to enable postgraduate students to make immediate use of the material presented.
Probabilistic Linguistic Two-Sided Matching Decision-Making Methods and Applications (Studies in Fuzziness and Soft Computing #436)
by Zeshui Xu and Bo Li
This book tackles the intricacies of decision-making processes where alternatives stem from distinct, finite sets. Representing the cutting edge of decision-making research, it introduces complex two-sided matching methods that harness the power of probabilistic linguistic term sets to enhance matching efficiency and practicality. It addresses the pressing question of how to navigate and optimize in scenarios with multifaceted matching challenges, offering an exploration into the psychological perceptions of agents through consistency checks and pairwise comparisons. It delves into the unknowns of static matching with multiple attribute weights, extends its scope to multi-sided agent sets in complex matching, and introduces dynamic screening mechanisms to refine the matching process. This book is not just a theoretical exploration. It lays the groundwork for intelligent matching algorithms and group mechanisms, providing actionable insights for technical supply and demand allocation, emergency personnel dispatch, and multi-stage medical management scheme selection. The effectiveness of these methods is backed by comparative analyses and simulation experiments, demonstrating their advantages in real-world applications. The book is a valuable resource for those seeking to master complex matching scenarios and unlock practical solutions.
Probabilistic Machine Learning for Civil Engineers
by James-A. Goulet
An introduction to key concepts and techniques in probabilistic machine learning for civil engineering students and professionals, with many step-by-step examples, illustrations, and exercises.
This book introduces probabilistic machine learning concepts to civil engineering students and professionals, presenting key approaches and techniques in a way that is accessible to readers without a specialized background in statistics or computer science. It presents different methods clearly and directly, through step-by-step examples, illustrations, and exercises. Having mastered the material, readers will be able to understand the more advanced machine learning literature from which this book draws.
The book presents key approaches in the three subfields of probabilistic machine learning: supervised learning, unsupervised learning, and reinforcement learning. It first covers the background knowledge required to understand machine learning, including linear algebra and probability theory. It goes on to present Bayesian estimation, which is behind the formulation of both supervised and unsupervised learning methods, and Markov chain Monte Carlo methods, which enable Bayesian estimation in certain complex cases. The book then covers approaches associated with supervised learning, including regression methods and classification methods, and notions associated with unsupervised learning, including clustering, dimensionality reduction, Bayesian networks, state-space models, and model calibration. Finally, the book introduces fundamental concepts of rational decisions in uncertain contexts and rational decision-making in uncertain and sequential contexts. Building on this, the book describes the basics of reinforcement learning, whereby a virtual agent learns how to make optimal decisions through trial and error while interacting with its environment.
Probabilistic Machine Learning for Finance and Investing: A Primer to Generative AI with Python
by Deepak K. Kanungo
There are several reasons why probabilistic machine learning represents the next-generation ML framework and technology for finance and investing. This generative ensemble learns continually from small and noisy financial datasets while seamlessly enabling probabilistic inference, retrodiction, prediction, and counterfactual reasoning. Probabilistic ML also lets you systematically encode personal, empirical, and institutional knowledge into ML models.
Whether they're based on academic theories or ML strategies, all financial models are subject to modeling errors that can be mitigated but not eliminated. Probabilistic ML systems treat uncertainties and errors of financial and investing systems as features, not bugs. And they quantify uncertainty generated from inexact inputs and outputs as probability distributions, not point estimates. This makes for realistic financial inferences and predictions that are useful for decision-making and risk management.
Unlike conventional AI, these systems are capable of warning us when their inferences and predictions are no longer useful in the current market environment. By moving away from flawed statistical methodologies and a restrictive conventional view of probability as a limiting frequency, you'll move toward an intuitive view of probability as logic within an axiomatic statistical framework that comprehensively and successfully quantifies uncertainty. This book shows you how.
Probabilistic Machine Learning: Advanced Topics (Adaptive Computation and Machine Learning series)
by Kevin P. Murphy
An advanced book for researchers and graduate students working in machine learning and statistics who want to learn about deep learning, Bayesian inference, generative models, and decision making under uncertainty.
An advanced counterpart to Probabilistic Machine Learning: An Introduction, this high-level textbook provides researchers and graduate students detailed coverage of cutting-edge topics in machine learning, including deep generative modeling, graphical models, Bayesian inference, reinforcement learning, and causality. This volume puts deep learning into a larger statistical context and unifies approaches based on deep learning with ones based on probabilistic modeling and inference. With contributions from top scientists and domain experts from places such as Google, DeepMind, Amazon, Purdue University, NYU, and the University of Washington, this rigorous book is essential to understanding the vital issues in machine learning.
- Covers generation of high-dimensional outputs, such as images, text, and graphs
- Discusses methods for discovering insights about data, based on latent variable models
- Considers training and testing under different distributions
- Explores how to use probabilistic models and inference for causal inference and decision making
- Features online Python code accompaniment
Probabilistic Machine Learning: An Introduction (Adaptive Computation and Machine Learning series)
by Kevin P. Murphy
A detailed and up-to-date introduction to machine learning, presented through the unifying lens of probabilistic modeling and Bayesian decision theory.
This book offers a detailed and up-to-date introduction to machine learning (including deep learning) through the unifying lens of probabilistic modeling and Bayesian decision theory. The book covers mathematical background (including linear algebra and optimization), basic supervised learning (including linear and logistic regression and deep neural networks), as well as more advanced topics (including transfer learning and unsupervised learning). End-of-chapter exercises allow students to apply what they have learned, and an appendix covers notation. Probabilistic Machine Learning grew out of the author's 2012 book, Machine Learning: A Probabilistic Perspective. More than just a simple update, this is a completely new book that reflects the dramatic developments in the field since 2012, most notably deep learning. In addition, the new book is accompanied by online Python code, using libraries such as scikit-learn, JAX, PyTorch, and TensorFlow, which can be used to reproduce nearly all the figures; this code can be run inside a web browser using cloud-based notebooks, and provides a practical complement to the theoretical topics discussed in the book. This introductory text will be followed by a sequel that covers more advanced topics, taking the same probabilistic approach.
Probabilistic Mapping of Spatial Motion Patterns for Mobile Robots (Cognitive Systems Monographs #40)
by Tomasz Piotr Kucner, Achim J. Lilienthal, Martin Magnusson, Luigi Palmieri, and Chittaranjan Srinivas Swaminathan
This book describes how robots can make sense of motion in their surroundings and use the patterns they observe to blend in better in dynamic environments shared with humans.
The world around us is constantly changing. Nonetheless, we can find our way and aren't overwhelmed by all the buzz, since motion often follows discernible patterns. Just like humans, robots need to understand the patterns behind the dynamics in their surroundings to be able to efficiently operate, e.g. in a busy airport. Yet robotic mapping has traditionally been based on the static world assumption, which disregards motion altogether. In this book, the authors describe how robots can instead explicitly learn patterns of dynamic change from observations, store those patterns in Maps of Dynamics (MoDs), and use MoDs to plan less intrusive, safer and more efficient paths. The authors discuss the pros and cons of recently introduced MoDs and approaches to MoD-informed motion planning, and provide an outlook on future work in this emerging, fascinating field.
Probabilistic Methods and Distributed Information: Rudolf Ahlswede’s Lectures on Information Theory 5 (Foundations in Signal Processing, Communications and Networking #15)
by Holger Boche, Ingo Althöfer, Christian Deppe, Ulrich Tamm, Alexander Ahlswede, Rudolf Ahlswede, Vladimir Blinovsky, Ulrich Krengel, and Ahmed Mansour
The fifth volume of Rudolf Ahlswede's lectures on Information Theory focuses on several problems that were at the heart of a lot of his research. One of the highlights of the entire lecture note series is surely Part I of this volume on arbitrarily varying channels (AVC), a subject in which Ahlswede was probably the world's leading expert. Appended to Part I is a survey by Holger Boche and Ahmed Mansour on recent results concerning AVC and arbitrarily varying wiretap channels (AVWC). After a short Part II on continuous data compression, Part III, the longest part of the book, is devoted to distributed information. This Part includes discussions on a variety of related topics; among them let us emphasize two which are famously associated with Ahlswede: "multiple descriptions", on which he produced some of the best research worldwide, and "network coding", which had Ahlswede among the authors of its pioneering paper. The final Part IV on "Statistical Inference under Communication Constraints" is mainly based on Ahlswede's joint paper with Imre Csiszar, which received the Best Paper Award of the IEEE Information Theory Society. The lectures presented in this work, which consists of 10 volumes, are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used either as the basis for courses or to supplement them in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
Probabilistic Topic Models: Foundation and Application
by Chen Zhang, Di Jiang, and Yuanfeng Song
This book introduces readers to the theoretical foundation and application of topic models. It provides readers with efficient means to learn about the technical principles underlying topic models. More concretely, it covers topics such as fundamental concepts, topic model structures, approximate inference algorithms, and a range of methods used to create high-quality topic models. In addition, this book illustrates the applications of topic models in real-world scenarios. Readers are shown how to select and apply suitable models for specific real-world tasks, making the book particularly useful for industry. Finally, the book presents a catalog of the most important topic models from the literature of the past decades, which can be referenced and indexed by researchers and engineers in related fields. We hope this book can bridge the gap between academic research and industrial application and help topic models play an increasingly effective role in both academia and industry. This book offers a valuable reference guide for senior undergraduate students, graduate students, and researchers, covering the latest advances in topic models, and for industrial practitioners, sharing state-of-the-art solutions for topic-related applications. The book can also serve as a reference for job seekers preparing for interviews.
Probability Collectives
by Ajith Abraham, Anand Jayant Kulkarni, and Kang Tai
This book presents an emerging computational intelligence tool, referred to as Probability Collectives, in the framework of collective intelligence for modeling and controlling distributed multi-agent systems. The modified Probability Collectives methodology incorporates a number of constraint handling techniques, which reduce computational complexity and improve convergence and efficiency. Numerous examples and real-world problems are used for illustration, which may also allow the reader to gain further insight into the associated concepts.
Probability Models
by John Haigh
The purpose of this book is to provide a sound introduction to the study of real-world phenomena that possess random variation. It describes how to set up and analyse models of real-life phenomena that involve elements of chance. Motivation comes from everyday experiences of probability, such as those involving dice or cards, the idea of fairness in games of chance, and the random ways in which, say, birthdays are shared or particular events arise. Applications include branching processes, random walks, Markov chains, queues, renewal theory, and Brownian motion. This popular second edition textbook contains many worked examples, and several chapters have been updated and expanded. Some mathematical knowledge is assumed. The reader should have the ability to work with unions, intersections and complements of sets; a good facility with calculus, including integration, sequences and series; and appreciation of the logical development of an argument. Probability Models is designed to aid students studying probability as part of an undergraduate course on mathematics or mathematics and statistics.
Probability Theory: An Introduction Using R
by Shailaja R. Deshmukh and Akanksha S. Kashikar
This book introduces Probability Theory with R software and explains abstract concepts in a simple and easy-to-understand way by combining theory and computation. It discusses conceptual and computational examples in detail, to provide a thorough understanding of basic techniques and an enjoyable read for students seeking suitable material for self-study. It illustrates fundamental concepts including fields, sigma-fields, random variables and their expectations, various modes of convergence of a sequence of random variables, laws of large numbers and the central limit theorem.
- Computational exercises based on R software are included in each chapter
- Includes a brief introduction to the basic functions of R software for beginners in R and serves as a ready reference
- Includes numerical computations, simulation studies, and visualizations using R software as easy tools to explain abstract concepts
- Provides multiple-choice questions for practice
- Incorporates self-explanatory R code in every chapter
This textbook is for advanced students, professionals, and academic researchers of Statistics, Biostatistics, Economics and Mathematics.
Probability and Random Processes for Electrical and Computer Engineers
by John A. Gubner
The theory of probability is a powerful tool that helps electrical and computer engineers to explain, model, analyze, and design the technology they develop. The text begins at the advanced undergraduate level, assuming only a modest knowledge of probability, and progresses through more complex topics mastered at graduate level. The first five chapters cover the basics of probability and both discrete and continuous random variables. The later chapters have a more specialized coverage, including random vectors, Gaussian random vectors, random processes, Markov chains, and convergence. Describing tools and results that are used extensively in the field, this is more than a textbook; it is also a reference for researchers working in communications, signal processing, and computer network traffic analysis. With over 300 worked examples, some 800 homework problems, and sections for exam preparation, this is an essential companion for advanced undergraduate and graduate students. Further resources for this title, including solutions (for Instructors only), are available online at www.cambridge.org/9780521864701.
Probability and Statistics for Data Science: Math + R + Data (Chapman & Hall/CRC Data Science Series)
by Norman Matloff
Probability and Statistics for Data Science: Math + R + Data covers "math stat"—distributions, expected value, estimation, etc.—but takes the phrase "Data Science" in the title quite seriously:
* Real datasets are used extensively.
* All data analysis is supported by R coding.
* Includes many Data Science applications, such as PCA, mixture distributions, random graph models, Hidden Markov models, linear and logistic regression, and neural networks.
* Leads the student to think critically about the "how" and "why" of statistics, and to "see the big picture."
* Not "theorem/proof"-oriented, but concepts and models are stated in a mathematically precise manner.
Prerequisites are calculus, some matrix algebra, and some experience in programming.
Norman Matloff is a professor of computer science at the University of California, Davis, and was formerly a statistics professor there. He is on the editorial boards of the Journal of Statistical Software and The R Journal. His book Statistical Regression and Classification: From Linear Models to Machine Learning was the recipient of the Ziegel Award for the best book reviewed in Technometrics in 2017. He is a recipient of his university's Distinguished Teaching Award.
Probability and Statistics for Machine Learning: A Textbook
by Charu C. Aggarwal
This book covers probability and statistics from the machine learning perspective. The chapters of this book belong to three categories:
1. The basics of probability and statistics: These chapters focus on the basics of probability and statistics, and cover the key principles of these topics. Chapter 1 provides an overview of the area of probability and statistics as well as its relationship to machine learning. The fundamentals of probability and statistics are covered in Chapters 2 through 5.
2. From probability to machine learning: Many machine learning applications are addressed using probabilistic models, whose parameters are then learned in a data-driven manner. Chapters 6 through 9 explore how different models from probability and statistics are applied to machine learning. Perhaps the most important tool that bridges the gap from data to probability is maximum-likelihood estimation, which is a foundational concept from the perspective of machine learning. This concept is explored repeatedly in these chapters.
3. Advanced topics: Chapter 10 is devoted to discrete-state Markov processes. It explores the application of probability and statistics to a temporal and sequential setting, although the applications extend to more complex settings such as graphical data. Chapter 11 covers a number of probabilistic inequalities and approximations.
The style of writing promotes the learning of probability and statistics simultaneously with a probabilistic perspective on the modeling of machine learning applications. The book contains over 200 worked examples in order to elucidate key concepts. Exercises are included both within the text of the chapters and at the end of the chapters. The book is written for a broad audience, including graduate students, researchers, and practitioners.
Probability and Statistics with Reliability, Queuing, and Computer Science Applications
by Kishor S. Trivedi
An accessible introduction to probability, stochastic processes, and statistics for computer science and engineering applications.
This updated and revised edition of the popular classic relates fundamental concepts in probability and statistics to the computer sciences and engineering. The author uses Markov chains and other statistical tools to illustrate processes in reliability of computer systems and networks, fault tolerance, and performance. This edition features an entirely new section on stochastic Petri nets, as well as new sections on system availability modeling, wireless system modeling, numerical solution techniques for Markov chains, and software reliability modeling, among other subjects. Extensive revisions take new developments in solution techniques and applications into account and bring this work totally up to date. It includes more than 200 worked examples and self-study exercises for each section.
Probability and Statistics with Reliability, Queuing and Computer Science Applications, Second Edition offers a comprehensive introduction to probability, stochastic processes, and statistics for students of computer science, electrical and computer engineering, and applied mathematics. Its wealth of practical examples and up-to-date information makes it an excellent resource for practitioners as well. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Probability for Information Technology
by Changho Suh
This book introduces probabilistic modelling and explores its role in solving a broad spectrum of engineering problems that arise in Information Technology (IT). Divided into three parts, it begins by laying the foundation of basic probability concepts such as sample space, events, conditional probability, independence, total probability law and random variables. The second part delves into more advanced topics including random processes and key principles like Maximum A Posteriori (MAP) estimation, the law of large numbers and the central limit theorem. The last part applies these principles to various IT domains like communication, social networks, speech recognition, and machine learning, emphasizing the practical aspect of probability through real-world examples, case studies, and Python coding exercises. A notable feature of this book is its narrative style, seamlessly weaving together probability theories with both classical and contemporary IT applications. Each concept is reinforced with tightly-coupled exercise sets, and the associated fundamentals are explored mostly from first principles. Furthermore, it includes programming implementations of illustrative examples and algorithms, complemented by a brief Python tutorial. Departing from traditional organization, the book adopts a lecture-notes format, presenting interconnected themes and storylines. Primarily tailored for sophomore-level undergraduates, it also suits junior and senior-level courses. While readers benefit from mathematical maturity and programming exposure, supplementary materials and exercise problems aid understanding. Part III serves to inspire and provide insights for students and professionals alike, underscoring the pragmatic relevance of probabilistic concepts in IT.
Probability for Statistics and Machine Learning
by Anirban Dasgupta
This book provides a versatile and lucid treatment of classic as well as modern probability theory, while integrating it with core topics in statistical theory and some key tools in machine learning. It is written in an extremely accessible style, with elaborate motivating discussions and numerous worked-out examples and exercises. The book has 20 chapters on a wide range of topics, 423 worked-out examples, and 808 exercises. It is unique in its unification of probability and statistics, its coverage, its superb exercise sets, its detailed bibliography, and its substantive treatment of many topics of current importance. This book can be used as a text for a year-long graduate course in statistics, computer science, or mathematics, for self-study, and as an invaluable research reference on probability and its applications. Particularly worth mentioning are the treatments of distribution theory, asymptotics, simulation and Markov chain Monte Carlo, Markov chains and martingales, Gaussian processes, VC theory, probability metrics, large deviations, the bootstrap, the EM algorithm, confidence intervals, maximum likelihood and Bayes estimates, exponential families, kernels and Hilbert spaces, and a self-contained complete review of univariate probability.
Probability in Electrical Engineering and Computer Science: An Application-Driven Course
by Jean Walrand
This revised textbook motivates and illustrates the techniques of applied probability by applications in electrical engineering and computer science (EECS). The author presents information processing and communication systems that use algorithms based on probabilistic models and techniques, including web searches, digital links, speech recognition, GPS, route planning, recommendation systems, classification, and estimation. He then explains how these applications work and, along the way, provides the readers with the understanding of the key concepts and methods of applied probability. Python labs enable the readers to experiment and consolidate their understanding. The book includes homework, solutions, and Jupyter notebooks. This edition includes new topics such as Boosting, Multi-armed bandits, statistical tests, social networks, queuing networks, and neural networks. For ancillaries related to this book, including examples of Python demos and also Python labs used in Berkeley, please email Mary James at mary.james@springer.com. This is an open access book.
Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling
by William J. Stewart
Probability, Markov Chains, Queues, and Simulation provides a modern and authoritative treatment of the mathematical processes that underlie performance modeling. The detailed explanations of mathematical derivations and numerous illustrative examples make this textbook readily accessible to graduate and advanced undergraduate students taking courses in which stochastic processes play a fundamental role. The textbook is relevant to a wide variety of fields, including computer science, engineering, operations research, statistics, and mathematics. The textbook looks at the fundamentals of probability theory, from the basic concepts of set-based probability, through probability distributions, to bounds, limit theorems, and the laws of large numbers. Discrete and continuous-time Markov chains are analyzed from a theoretical and computational point of view. Topics include the Chapman-Kolmogorov equations; irreducibility; the potential, fundamental, and reachability matrices; random walk problems; reversibility; renewal processes; and the numerical computation of stationary and transient distributions. The M/M/1 queue and its extensions to more general birth-death processes are analyzed in detail, as are queues with phase-type arrival and service processes. The M/G/1 and G/M/1 queues are solved using embedded Markov chains; the busy period, residual service time, and priority scheduling are treated. Open and closed queueing networks are analyzed. The final part of the book addresses the mathematical basis of simulation. Each chapter of the textbook concludes with an extensive set of exercises. An instructor's solution manual, in which all exercises are completely worked out, is also available (to professors only).
- Numerous examples illuminate the mathematical theories
- Carefully detailed explanations of mathematical derivations guarantee a valuable pedagogical approach
- Each chapter concludes with an extensive set of exercises
Probability, Random Variables, and Random Processes
by John J. Shynk
Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features:
- Several appendices include related material on integration, important inequalities and identities, frequency-domain transforms, and linear algebra. These topics have been included so that the book is relatively self-contained. One appendix contains an extensive summary of 33 random variables and their properties, such as moments, characteristic functions, and entropy.
- Unlike most books on probability, numerous figures have been included to clarify and expand upon important points. Over 600 illustrations and MATLAB plots have been designed to reinforce the material and illustrate the various characterizations and properties of random quantities.
- Sufficient statistics are covered in detail, as is their connection to parameter estimation techniques. These include classical and Bayesian estimation and several optimality criteria: mean-square error, mean-absolute error, maximum likelihood, method of moments, and least squares.
- The last four chapters provide an introduction to several topics usually studied in subsequent engineering courses: communication systems and information theory; optimal filtering (Wiener and Kalman); adaptive filtering (FIR and IIR); and antenna beamforming, channel equalization, and direction finding.
This material is available electronically at the companion website.Probability, Random Variables, and Random Processes is the only textbook on probability for engineers that includes relevant background material, provides extensive summaries of key results, and extends various statistical techniques to a range of applications in signal processing.
Probabilità, Statistica e Simulazione: Programmi applicativi scritti in R (UNITEXT #131)
by Alberto Rotondi, Paolo Pedroni and Antonio Pievatolo
The book presents, in compact form, the material covered in introductory statistics courses and treats several topics that are indispensable for research work, such as Monte Carlo simulation techniques, methods of statistical inference, best-fit methods, and the analysis of laboratory data. The topics are developed from the fundamentals, highlighting their applied aspects, up to the detailed description of many cases of particular relevance in scientific and technical settings. The text is aimed at university students in scientific degree programs and at all researchers who must solve concrete problems involving data analysis and simulation techniques. In this completely revised and corrected edition, important new topics have been added on hypothesis testing (to which an entirely new chapter is devoted) and on the treatment of systematic errors. For the first time, the R software has been adopted, with a rich library of original programs accessible to the reader.