Analysis of Images, Social Networks and Texts: 7th International Conference, AIST 2018, Moscow, Russia, July 5–7, 2018, Revised Selected Papers (Lecture Notes in Computer Science #11179)
by Wil M. P. van der Aalst, Vladimir Batagelj, Goran Glavaš, Dmitry I. Ignatov, Michael Khachay, Sergei O. Kuznetsov, Olessia Koltsova, Irina A. Lomazova, Natalia Loukachevitch, Amedeo Napoli, Alexander Panchenko, Panos M. Pardalos, Marcello Pelillo, and Andrey V. Savchenko. This book constitutes the proceedings of the 7th International Conference on Analysis of Images, Social Networks and Texts, AIST 2018, held in Moscow, Russia, in July 2018. The 29 full papers were carefully reviewed and selected from 107 submissions (of which 26 papers were rejected without being reviewed). The papers are organized in topical sections on natural language processing; analysis of images and video; general topics of data analysis; analysis of dynamic behavior through event data; optimization problems on graphs and network structures; and innovative systems.
Analysis of Images, Social Networks and Texts: 9th International Conference, AIST 2020, Skolkovo, Moscow, Russia, October 15–16, 2020, Revised Selected Papers (Lecture Notes in Computer Science #12602)
by Wil M. P. van der Aalst, Vladimir Batagelj, Dmitry I. Ignatov, Michael Khachay, Olessia Koltsova, Andrey Kutuzov, Sergei O. Kuznetsov, Irina A. Lomazova, Natalia Loukachevitch, Amedeo Napoli, Alexander Panchenko, Panos M. Pardalos, Marcello Pelillo, Andrey V. Savchenko, and Elena Tutubalina. This book constitutes revised selected papers from the 9th International Conference on Analysis of Images, Social Networks and Texts, AIST 2020, held during October 15-16, 2020. The conference was planned to take place in Moscow, Russia, but changed to an online format due to the COVID-19 pandemic. The 27 full papers and 4 short papers presented in this volume were carefully reviewed and selected from a total of 108 qualified submissions. The papers are organized in topical sections as follows: invited papers; natural language processing; computer vision; social network analysis; data analysis and machine learning; theoretical machine learning and optimization; and process mining.
Analysis of Images, Social Networks and Texts: 8th International Conference, AIST 2019, Kazan, Russia, July 17–19, 2019, Revised Selected Papers (Lecture Notes in Computer Science #11832)
by Wil M. P. van der Aalst, Vladimir Batagelj, Dmitry I. Ignatov, Michael Khachay, Valentina Kuskova, Andrey Kutuzov, Sergei O. Kuznetsov, Irina A. Lomazova, Natalia Loukachevitch, Amedeo Napoli, Panos M. Pardalos, Marcello Pelillo, Andrey V. Savchenko, and Elena Tutubalina. This book constitutes the post-conference proceedings of the 8th International Conference on Analysis of Images, Social Networks and Texts, AIST 2019, held in Kazan, Russia, in July 2019. The 27 full and 8 short papers were carefully reviewed and selected from 134 submissions (of which 21 papers were automatically rejected without being reviewed). The papers are organized in topical sections on general topics of data analysis; natural language processing; social network analysis; analysis of images and video; optimization problems on graphs and network structures; and analysis of dynamic behavior through event data.
Analysis of Images, Social Networks and Texts: 6th International Conference, AIST 2017, Moscow, Russia, July 27–29, 2017, Revised Selected Papers (Lecture Notes in Computer Science #10716)
by Wil M. P. van der Aalst, Dmitry I. Ignatov, Michael Khachay, Sergei O. Kuznetsov, Victor Lempitsky, Irina A. Lomazova, Natalia Loukachevitch, Amedeo Napoli, Alexander Panchenko, Panos M. Pardalos, Andrey V. Savchenko, and Stanley Wasserman. This book constitutes the proceedings of the 6th International Conference on Analysis of Images, Social Networks and Texts, AIST 2017, held in Moscow, Russia, in July 2017. The 29 full papers and 8 short papers were carefully reviewed and selected from 127 submissions. The papers are organized in topical sections on natural language processing; general topics of data analysis; analysis of images and video; optimization problems on graphs and network structures; analysis of dynamic behavior through event data; and social network analysis.
Analysis of Incidence Rates (Chapman & Hall/CRC Biostatistics Series)
by Peter Cummings. Incidence rates are counts divided by person-time; mortality rates are a well-known example. Analysis of Incidence Rates offers a detailed discussion of the practical aspects of analyzing incidence rates. Important pitfalls and areas of controversy are discussed. The text is aimed at graduate students, researchers, and analysts in the disciplines of epidemiology, biostatistics, social sciences, economics, and psychology. Features: Compares and contrasts incidence rates with risks, odds, and hazards. Shows stratified methods, including standardization, inverse-variance weighting, and Mantel-Haenszel methods. Describes Poisson regression methods for adjusted rate ratios and rate differences. Examines linear regression for rate differences with an emphasis on common problems. Gives methods for correcting confidence intervals. Illustrates problems related to collapsibility. Explores extensions of count models for rates, including negative binomial regression, methods for clustered data, and the analysis of longitudinal data. Also reviews controversies and limitations. Presents matched cohort methods in detail. Gives marginal methods for converting adjusted rate ratios to rate differences, and vice versa. Demonstrates instrumental variable methods. Compares Poisson regression with the Cox proportional hazards model. Also introduces Royston-Parmar models. All data and analyses are in online Stata files which readers can download. Peter Cummings is Professor Emeritus, Department of Epidemiology, School of Public Health, University of Washington, Seattle, WA. His research was primarily in the field of injuries. He used matched cohort methods to estimate how the use of seat belts and presence of airbags were related to death in a traffic crash. He is author or co-author of over 100 peer-reviewed articles.
Analysis of Industrial Clusters in China
by Zhu Yingming. Taking a close look at the national economic system of China, this book defines industrial clusters, then summarizes their measurement indices and identifies their methods. The author identifies 11 industrial clusters and analyses their structural relationships. He studies the relationships between structures and characters of industrial clusters u
Analysis of Infectious Disease Data
by N.G. Becker. The book gives an up-to-date account of various approaches available for the analysis of infectious disease data. Most of the methods have been developed only recently, and for those based on particularly modern mathematics, details of the computation are carefully illustrated. Interpretation is discussed at some length and the emphasis throughout is on making statistical inferences about epidemiologically important parameters. Niels G. Becker is Reader in Statistics at La Trobe University, Australia.
Analysis of Integrated Data (Chapman & Hall/CRC Statistics in the Social and Behavioral Sciences)
by Li-Chun Zhang and Raymond L. Chambers. The advent of "Big Data" has brought with it a rapid diversification of data sources, requiring analysis that accounts for the fact that these data have often been generated and recorded for different reasons. Data integration involves combining data residing in different sources to enable statistical inference, or to generate new statistical data for purposes that cannot be served by each source on its own. This can yield significant gains for scientific as well as commercial investigations. However, valid analysis of such data should allow for the additional uncertainty due to entity ambiguity, whenever it is not possible to state with certainty that the integrated source is the target population of interest. Analysis of Integrated Data aims to provide a solid theoretical basis for this statistical analysis in three generic settings of entity ambiguity: statistical analysis of linked datasets that may contain linkage errors; datasets created by a data fusion process, where joint statistical information is simulated using the information in marginal data from non-overlapping sources; and estimation of target population size when target units are either partially or erroneously covered in each source. Covers a range of topics under an overarching perspective of data integration. Focuses on statistical uncertainty and inference issues arising from entity ambiguity. Features state-of-the-art methods for analysis of integrated data. Identifies the important themes that will define future research and teaching in the statistical analysis of integrated data. Analysis of Integrated Data is aimed primarily at researchers and methodologists interested in statistical methods for data from multiple sources, with a focus on data analysts in the social sciences, and in the public and private sectors.
Analysis of Jazz: A Comprehensive Approach (American Made Music Series)
by Laurent Cugny. Analysis of Jazz: A Comprehensive Approach, originally published in French as Analyser le jazz, is available here in English for the first time. In this groundbreaking volume, Laurent Cugny examines and connects the theoretical and methodological processes that underlie all of jazz. Jazz in all its forms has been researched and analyzed by performers, scholars, and critics, and Analysis of Jazz is required reading for any serious study of jazz; but not just musicians and musicologists analyze jazz. All listeners are analysts to some extent. Listening is an active process; it may not involve questioning but it always involves remembering, comparing, and listening again. This book is for anyone who attentively listens to and wants to understand jazz. Divided into three parts, the book focuses on the work of jazz, analytical parameters, and analysis. In part one, Cugny aims at defining precisely what a jazz work is, offering suggestions based on the main features of definition and structure. Part two he dedicates to the analytical parameters of jazz through which a work is performed: harmony, rhythm, form, sound, and melody. Part three takes up the analysis of jazz itself, its history, issues of transcription, and the nature of improvised solos. In conclusion, Cugny addresses the issues of interpretation to reflect on the goals of analysis with regard to understanding the history of jazz and the different cultural backgrounds in which it takes place. Analysis of Jazz presents a detailed inventory of theoretical tools and issues necessary for understanding jazz.
Analysis of Kinetic Reaction Mechanisms
by Tamás Turányi and Alison S. Tomlin. Chemical processes in many fields of science and technology, including combustion, atmospheric chemistry, environmental modelling, process engineering, and systems biology, can be described by detailed reaction mechanisms consisting of numerous reaction steps. This book describes methods for the analysis of reaction mechanisms that are applicable in all these fields. Topics addressed include: how sensitivity and uncertainty analyses allow the calculation of the overall uncertainty of simulation results and the identification of the most important input parameters, the ways in which mechanisms can be reduced without losing important kinetic and dynamic detail, and the application of reduced models for more accurate engineering optimizations. This monograph is invaluable for researchers and engineers dealing with detailed reaction mechanisms, but is also useful for graduate students of related courses in chemistry, mechanical engineering, energy and environmental science and biology.
An Analysis of Knowing (Routledge Revivals)
by John Hartland-Swann. First published in 1958, this book focuses on the meaning, interpretation, and use of the verb ‘to know’. In our daily lives we are often claiming to know this or not to know that; and it is not therefore surprising that the verb has played a major role in philosophical speculation from Plato down to Bertrand Russell. This book analyses the varying meanings of ‘know’ in its different operational roles: knowing Jones seems to have a different sort of logic from knowing French or from knowing what to do – and equally from knowing that the earth is round and from knowing how to read music. Knowing something is also different from merely believing it. The main purpose of this book is to elucidate, in a new and original way, this whole question of the logical behaviour of ‘know’; but its further and no less important purpose is to show how, once we have grasped the way in which certain key ‘know’-statements function, a number of philosophical disputes may be discussed more fruitfully and settled more expeditiously. Some of the analyses offered will be regarded as controversial and will undoubtedly provoke discussion. The style is lucid and economical and technical terms are reduced to a minimum. This work is intended not only for the professional philosopher and the university student, but also for the general reader who is interested in the methods of modern philosophical analysis.
The Analysis of Knowledge (Routledge Library Editions: Epistemology)
by Ledger Wood. Originally published in 1940. Firstly, this book seeks to combine epistemology and the new developments of the time in psychology. It holds that no epistemology can be sound if it is psychologically defective, nor can a psychological analysis of knowledge be philosophically naïve. Secondly, it attempts to suggest a single structural pattern underlying every type of cognitive situation. Offering a significant reorientation to epistemological thought of its time, this work considers perception, sense and memory and examines the referential theory of knowledge. It is a lucid and precisely organised reading and analysis of knowledge.
Analysis of Large and Complex Data (Studies in Classification, Data Analysis, and Knowledge Organization #0)
by Adalbert F.X. Wilhelm and Hans A. Kestler. This book offers a snapshot of the state-of-the-art in classification at the interface between statistics, computer science and application fields. The contributions span a broad spectrum, from theoretical developments to practical applications; they all share a strong computational component. The topics addressed are from the following fields: Statistics and Data Analysis; Machine Learning and Knowledge Discovery; Data Analysis in Marketing; Data Analysis in Finance and Economics; Data Analysis in Medicine and the Life Sciences; Data Analysis in the Social, Behavioural, and Health Care Sciences; Data Analysis in Interdisciplinary Domains; Classification and Subject Indexing in Library and Information Science. The book presents selected papers from the Second European Conference on Data Analysis, held at Jacobs University Bremen in July 2014. This conference unites diverse researchers in the pursuit of a common topic, creating truly unique synergies in the process.
Analysis of Legal Argumentation Documents: A Computational Argumentation Approach (Translational Systems Sciences #29)
by Hayato Hirata and Katsumi Nitta. This book introduces methods to analyze legal documents such as negotiation records and legal precedents, using computational argumentation theory. First, a method to automatically evaluate argumentation skills from the records of argumentation exercises is proposed. In law school, argumentation exercises are often conducted and many records of them are produced. From each utterance in the record, a pattern of “speech act + factor” is extracted, and argumentation skills are evaluated from the sequences of the patterns, using a scoring prediction model constructed by multiple regression analyses between the appearance patterns and the scoring results. The usefulness of this method is shown by applying it to the example case “the garbage house problem”. Second, a method is proposed for extracting factors (elements that characterize precedents and cases) and legal topoi from individual precedents and using them as representations of precedents, in order to analyze how the pattern of factors and legal topoi appearing in a group of precedents affects the judgment (plaintiff wins/defendant wins). This method has been applied to a group of tax cases. Third, the logical structure of 70 labor cases is described in detail by using factors together with a bipolar argumentation framework (BAF) and an extended argumentation framework (EAF). The BAF describes the logical structure of the argument between plaintiff and defendant, and the EAF describes the decision of the judge. By incorporating legal topoi into the EAF of computational argumentation theory, the combined use of factored BAF and EAF made it possible not only to specify which argument the judge adopted, but also to determine what kind of value judgment was made and to verify its logic.
The analysis methods in this book demonstrate the application of logic-based AI methods to the legal domain, and they contribute to the education and training of law school students in logical ways of argumentation.
The Analysis of Legal Cases: A Narrative Approach (Law, Language and Communication)
by Flora Di Donato. This book examines the roles played by narrative and culture in the construction of legal cases and their resolution. It is articulated in two parts. Part I recalls epistemological turns in legal thinking as it moves from theory to practice in order to show how facts are constructed within the legal process. By combining interdisciplinary paradigms and methods, the work analyses the evolution of facts from their expression by the client to their translation within the lawyer-client relationship and the subsequent decision of the judge, focusing on the dynamic activity of narrative construction among the key actors: client, lawyer and judge. Part II expands the scientific framework toward a law-and-culture-oriented perspective, illustrating how legal stories come about in the fabric of the authentic dimensions of everyday life. The book stresses the capacity of laypeople, who in this activity are equated with clients, to shape the law, dealing not just with formal rules, but also with implicit or customary rules, in given contexts. By including the illustration of cases concerning vulnerable clients, it lays the foundations for developing a socio-clinical research programme, whose aims include enabling lay and expert actors to meet for the purposes of improving forms of collective narrations and generating more just legal systems.
The Analysis of Linear Economic Systems: Father Maurice Potron’s Pioneering Works (Routledge Studies In The History Of Economics #117)
by Christian Bidard, Guido Erreygers, and Paul A. Samuelson. Maurice Potron (1872-1942), a French Jesuit mathematician, constructed and analyzed a highly original, but virtually unknown economic model. This book presents translated versions of all his economic writings, preceded by a long introduction which sketches his life and environment based on extensive archival research and family documents. Potron had no education in economics and almost no contact with the economists of his time. His primary source of inspiration was the social doctrine of the Church, which had been updated at the end of the nineteenth century. Faced with the ‘economic evils’ of his time, he reacted by utilizing his talents as a mathematician and an engineer to invent and formalize a general disaggregated model in which production, employment, prices and wages are the main unknowns. He introduced four basic principles or normative conditions (‘sufficient production’, the ‘right to rest’, ‘justice in exchange’, and the ‘right to live’) to define satisfactory regimes of production and labour on the one hand, and of prices and wages on the other. He studied the conditions for the existence of these regimes, both on the quantity side and the value side, and he explored the way to implement them. This book makes it clear that Potron was the first author to develop a full input-output model, to use the Perron-Frobenius theorem in economics, to state a duality result, and to formulate the Hawkins-Simon condition. These are all techniques which now belong to the standard toolkit of economists. This book will be of interest to Economics postgraduate students and researchers, and will be essential reading for courses dealing with the history of mathematical economics in general, and linear production theory in particular. Paul A. Samuelson’s short foreword to the book may have been his last academic contribution.
Analysis of Longitudinal Data with Example
by You-Gan Wang, Liya Fu, and Sudhir Paul. Development in methodology on longitudinal data is fast. Currently, there is a lack of intermediate/advanced-level textbooks which introduce students and practicing statisticians to the updated methods on correlated data inference. This book presents a discussion of the modern approaches to inference, including the links between the theories of estimators and various types of efficient statistical models, including likelihood-based approaches. The theory is supported with practical examples of R code and R packages applied to interesting case studies from a number of different areas. Key Features: includes the most up-to-date methods; uses simple examples to demonstrate complex methods; uses real data from a number of areas; examples utilize R code.
Analysis of Machining and Machine Tools
by Steven Y. Liang and Albert J. Shih. This book provides readers with the fundamental, analytical, and quantitative knowledge of machining process planning and optimization based on an advanced and practical understanding of machinery, mechanics, accuracy, dynamics, monitoring techniques, and control strategies that they need to understand machining and machine tools. It is written for first-year graduate students in mechanical engineering, and is also appropriate for use as a reference book by practicing engineers. It covers topics such as single and multiple point cutting processes; grinding processes; machine tool components, accuracy, and metrology; shear stress in cutting; cutting temperature and thermal analysis; and machine tool chatter. The second section of the book is devoted to "Non-Traditional Machining," where readers can find chapters on electrical discharge machining, electrochemical machining, laser and electron beam machining, and biomedical machining. Examples of realistic problems that engineers are likely to face in the field are included, along with solutions and explanations that foster a didactic learning experience.
An Analysis of Manstein’s Winter Campaign on the Russian Front 1942-1943: A Perspective of the Operational Level of War and Its Implications
by Lt.-Col. Lawrence L. Izzo. This study is a historical analysis of the campaign waged by Field Marshal von Manstein on the Russian southern front during the winter of 1942-43. The study begins just after the 6th Army's encirclement in Stalingrad and describes the four principal phases of Manstein's campaign: the attempted relief of the 6th Army; the protection of Army Group A as it disengaged from the Caucasus; the prevention of Manstein's lines of communications from being cut; and the counterblow to regain the initiative. The lessons learned from the campaign provide a perspective of battle at the operational level of war. The factors leading to Manstein's success are discussed and include: superior generalship at the operational level; superior tactical maturity of the Germans; and German tactical and operational agility. The study describes the transition from the operational defensive to the operational offensive and how a defender can have the initiative. Manstein's use of depth is explained. The concepts of center of gravity and operational art as they pertain to this campaign are also described. The study concludes with the implications of the lessons learned for a NATO-Soviet conflict in a central European scenario. The study points out that Manstein demonstrated that victory is possible even when forced to react to the enemy's plan. The ability of NATO to replicate, today, the agility of Manstein's forces and the synchronization achieved by his commanders is questioned. The implications of NATO's lack of operational depth, in contrast to Manstein, are described. The impact of changes in force design since World War II is also explained.
The Analysis of Matter (Routledge Classics)
by Bertrand Russell. The Analysis of Matter is the product of thirty years of thinking by one of the twentieth century's best-known philosophers. An inquiry into the philosophical foundations of physics, it was written against the background of stunning new developments in physics earlier in the century, above all relativity, as well as the excitement around quantum theory, which was just being developed. Concerned to place physics on a stable footing at a time of great theoretical change, Russell argues that the concept of matter itself can be replaced by a logical construction whose basic foundations are events. He is careful to point out that this does not prove that matter does not exist, but it does show that physicists can get on with their work without assuming that matter does exist. Russell argues that fundamental bits of "matter", such as electrons and protons, are simply groups of events connected in a certain way and their properties are all that are required for physics. This Routledge Classics edition includes the 1992 Introduction by John G. Slater.
Analysis of Medical Modalities for Improved Diagnosis in Modern Healthcare
by Varun Bajaj and G. R. Sinha. In modern healthcare, various medical modalities play an important role in improving the diagnostic performance in healthcare systems for various applications, such as prosthesis design, surgical implant design, diagnosis and prognosis, and detection of abnormalities in the treatment of various diseases. Analysis of Medical Modalities for Improved Diagnosis in Modern Healthcare discusses the uses of analysis, modeling, and manipulation of modalities, such as EEG, ECG, EMG, PCG, EOG, MRI, and fMRI, for automatic identification, classification, and diagnosis of different types of disorders and physiological states. The analysis and applications for post-processing and diagnosis are much-needed topics for researchers and faculty members all across the world in the field of automated and efficient diagnosis using medical modalities. To meet this need, this book emphasizes real-time challenges in medical modalities for a variety of applications for analysis, classification, identification, and diagnostic processes of healthcare systems. Each chapter starts with the introduction, need and motivation of the medical modality, and a number of applications for the identification and improvement of healthcare systems. The chapters can be read independently or consecutively by research scholars, graduate students, faculty members, and practicing scientists who wish to explore various disciplines of healthcare systems, such as computer sciences, medical sciences, and biomedical engineering. This book aims to improve the direction of future research and strengthen research efforts of healthcare systems through analysis of behavior, concepts, principles, and case studies. This book also aims to overcome the gap between usage of medical modalities and healthcare systems.
Several novel applications of medical modalities have been unlocked in recent years, therefore new applications, challenges, and solutions for healthcare systems are the focus of this book.
Analysis of Membrane Lipids (Springer Protocols Handbooks)
by Rajendra Prasad and Ashutosh Singh. This book provides a timely overview of analytical tools and methodological approaches for studying membrane lipids. It outlines the ground-breaking advances that have been made over the last two decades in high-throughput lipidomics, and in studying lipid-protein interactions, signalling pathways and the regulation of lipid metabolism. This user-friendly laboratory handbook is an ideal companion for membrane biologists, researchers, students, and clinicians alike. It is also well suited for teaching biochemistry, microbiology and biotechnology courses, making it a must-have for everyone whose work involves lipid research.
Analysis of Messy Data Volume 1: Designed Experiments, Second Edition
by George A. Milliken and Dallas E. Johnson. A bestseller for nearly 25 years, Analysis of Messy Data, Volume 1: Designed Experiments helps applied statisticians and researchers analyze the kinds of data sets encountered in the real world. Written by two long-time researchers and professors, this second edition has been fully updated to reflect the many developments that have occurred since t
Analysis of Messy Data, Volume II: Nonreplicated Experiments
by Dallas E. Johnson and George A. Milliken. Researchers often do not analyze nonreplicated experiments statistically because they are unfamiliar with existing statistical methods that may be applicable. Analysis of Messy Data, Volume II details the statistical methods appropriate for nonreplicated experiments and explores ways to use statistical software to make the required computations feasible.