Browse Results

Showing 18,226 through 18,250 of 54,029 results

Awareness Systems: Advances in Theory, Methodology and Design (Human–Computer Interaction Series)

by Wendy Mackay and Panos Markopoulos

Includes contributions by some of the leading authorities in the field of Awareness Systems.

A Science of Operations: Machines, Logic and the Invention of Programming (History of Computing)

by Mark Priestley

Today, computers fulfil a dazzling array of roles, a flexibility resulting from the great range of programs that can be run on them. A Science of Operations examines the history of what we now call programming, defined not simply as computer programming, but more broadly as the definition of the steps involved in computations and other information-processing activities. This unique perspective highlights how the history of programming is distinct from the history of the computer, despite the close relationship between the two in the 20th century. The book also discusses how the development of programming languages is related to disparate fields which attempted to give a mechanical account of language on the one hand, and a linguistic account of machines on the other. Topics and features:
* Covers the early development of automatic computing, including Babbage's "mechanical calculating engines" and the applications of punched-card technology
* Examines the theoretical work of mathematical logicians such as Kleene, Church, Post and Turing, and the machines built by Zuse and Aiken in the 1930s and 1940s
* Discusses the role that logic played in the development of the stored-program computer
* Describes the "standard model" of machine-code programming popularised by Maurice Wilkes
* Presents the complete table for the universal Turing machine in the Appendices
* Investigates the rise of initiatives aimed at developing higher-level programming notations, and how these came to be thought of as 'languages' that could be studied independently of a machine
* Examines the importance of the Algol 60 language, and the framework it provided for studying the design of programming languages and the process of software development
* Explores the early development of object-oriented languages, with a focus on the Smalltalk project
This fascinating text offers a new viewpoint for historians of science and technology, as well as for the general reader. The historical narrative builds the story in a clear and logical fashion, roughly following chronological order.

Calculation and Computation in the Pre-electronic Era: The Mechanical and Electrical Ages (History of Computing)

by Aristotle Tympas

This text offers an introduction to the history of computing during the first (steam) and second (electricity) industrial revolutions.

Algorithms for Next Generation Networks (Computer Communications and Networks)

by Marina Thottan and Graham Cormode

Data networking now plays a major role in everyday life and new applications continue to appear at a blinding pace. Yet we still do not have a sound foundation for designing, evaluating and managing these networks. This book covers topics at the intersection of algorithms and networking. It builds a complete picture of the current state of research on Next Generation Networks and the challenges for the years ahead. Particular focus is given to evolving research initiatives, the architectures they propose, and the implications for networking. Topics: network design and provisioning, hardware issues, layer-3 algorithms and MPLS, BGP and inter-AS routing, packet processing for routing, security and network management, load balancing, oblivious routing and stochastic algorithms, network coding for multicast, overlay routing for P2P networking and content delivery. This timely volume will be of interest to a broad readership, from graduate students to researchers looking to survey recent research and its open questions.

The Definitive Guide to AdonisJs: Building Node.js Applications with JavaScript

by Christopher Pitt

Learn everything you need to master the AdonisJs framework, including topics such as interacting with a database, rendering templates, writing asynchronous code, and hosting sites with SSL. Along the way, you’ll see how to build a commerce application, which lists products and allows shoppers to register and purchase those products. The site will feature a product catalog, a shopping cart, user registration and login, and profile management. The Definitive Guide to AdonisJs also covers how to create a front-end build chain, so that you can use a modern front-end framework, such as React. You’ll discover how to connect your front end to the server, so that data and transactions can be shared between the two. Finally, you’ll see how to secure and deploy the application to a virtual private server, including how to apply for and install an SSL certificate and start accepting payments. After reading and using this book, you’ll know all you need about AdonisJs. You’ll have the tools to turn that side-project you’ve been thinking about into a real money-making product. It is written by a web expert and reviewed by the AdonisJs project lead. This is the complete start-to-finish guide you’ve been waiting for.
What You'll Learn:
* Set up Node.js and AdonisJs, so that you can start building your application
* Create and use views and template code
* Implement cooperative multitasking in JavaScript
* Represent eventual values with AdonisJs promises
* Organize and isolate your code in controllers and decorate them with middleware, to do things like authentication
* Build queries using the Lucid DSL, and package these database entities up into model classes
* Validate form data and respond with the appropriate error messages
* Respond to general framework errors with custom error pages
* Learn the deeper parts of sessions and cookies
* Update the state of the user interface with WebSockets
* Host AdonisJs applications in a modern hosting environment
Who This Book Is For: Readers should have a functional understanding of JavaScript.

Visual Quality Assessment for Natural and Medical Image

by Yong Ding

Image quality assessment (IQA) is an essential technique in the design of modern, large-scale image and video processing systems. This book introduces and discusses in detail topics related to IQA, including the basic principles of subjective and objective experiments, biological evidence for image quality perception, and recent research developments. In line with recent trends in imaging techniques and to explain the application-specific utilization, it particularly focuses on IQA for stereoscopic (3D) images and medical images, rather than on planar (2D) natural images. In addition, a wealth of vivid, specific figures and formulas help readers deepen their understanding of fundamental and new applications for image quality assessment technology. This book is suitable for researchers, clinicians and engineers as well as students working in related disciplines, including imaging, displaying, image processing, and storage and transmission. By reviewing and presenting the latest advances, and new trends and challenges in the field, it benefits researchers and industrial R&D engineers seeking to implement image quality assessment systems for specific applications or design/optimize image/video processing algorithms.

Computational Methods in Biometric Authentication: Statistical Methods for Performance Evaluation (Information Science and Statistics)

by Michael E. Schuckers

Biometrics, the science of using physical traits to identify individuals, is playing an increasing role in our security-conscious society and across the globe. Biometric authentication, or bioauthentication, systems are being used to secure everything from amusement parks to bank accounts to military installations. Yet developments in this field have not been matched by an equivalent improvement in the statistical methods for evaluating these systems. Addressing this need, this unique text/reference provides a basic statistical methodology for practitioners and testers of bioauthentication devices, supplying a set of rigorous statistical methods for evaluating biometric authentication systems. This framework of methods can be extended and generalized for a wide range of applications and tests. This is the first single resource on statistical methods for estimation and comparison of the performance of biometric authentication systems. The book focuses on six common performance metrics: for each metric, statistical methods are derived for a single system, incorporating confidence intervals, hypothesis tests, sample size calculations, power calculations and prediction intervals. These methods are also extended to allow for the statistical comparison and evaluation of multiple systems for both independent and paired data. Topics and features:
* Provides a statistical methodology for the most common biometric performance metrics: failure to enroll (FTE), failure to acquire (FTA), false non-match rate (FNMR), false match rate (FMR), and receiver operating characteristic (ROC) curves
* Presents methods for the comparison of two or more biometric performance metrics
* Introduces a new bootstrap methodology for FMR and ROC curve estimation
* Supplies more than 120 examples, using publicly available biometric data where possible
* Discusses the addition of prediction intervals to the bioauthentication statistical toolset
* Describes sample-size and power calculations for FTE, FTA, FNMR and FMR
Researchers, managers and decision makers needing to compare biometric systems across a variety of metrics will find within this reference an invaluable set of statistical tools. Written for an upper-level undergraduate or master's level audience with a quantitative background, readers are also expected to have an understanding of the topics in a typical undergraduate statistics course. Dr. Michael E. Schuckers is Associate Professor of Statistics at St. Lawrence University, Canton, NY, and a member of the Center for Identification Technology Research.

An Introduction to Object Recognition: Selected Algorithms for a Wide Variety of Applications (Advances in Computer Vision and Pattern Recognition)

by Marco Alexander Treiber

Rapid development of computer hardware has enabled usage of automatic object recognition in an increasing number of applications, ranging from industrial image processing to medical applications, as well as tasks triggered by the widespread use of the internet. Each area of application has its specific requirements, and consequently these cannot all be tackled appropriately by a single, general-purpose algorithm. This easy-to-read text/reference provides a comprehensive introduction to the field of object recognition (OR). The book presents an overview of the diverse applications for OR and highlights important algorithm classes, presenting representative example algorithms for each class. The presentation of each algorithm describes the basic algorithm flow in detail, complete with graphical illustrations. Pseudocode implementations are also included for many of the methods, and definitions are supplied for terms which may be unfamiliar to the novice reader. Supporting a clear and intuitive tutorial style, the usage of mathematics is kept to a minimum. Topics and features: presents example algorithms covering global approaches, transformation-search-based methods, geometrical model driven methods, 3D object recognition schemes, flexible contour fitting algorithms, and descriptor-based methods; explores each method in its entirety, rather than focusing on individual steps in isolation, with a detailed description of the flow of each algorithm, including graphical illustrations; explains the important concepts at length in a simple-to-understand style, with a minimum usage of mathematics; discusses a broad spectrum of applications, including some examples from commercial products; contains appendices discussing topics related to OR and widely used in the algorithms (but not at the core of the methods described in the chapters). Practitioners of industrial image processing will find this simple introduction and overview of OR a valuable reference, as will graduate students in computer vision courses. Marco Treiber is a software developer at Siemens Electronics Assembly Systems, Munich, Germany, where he is Technical Lead in Image Processing for the Vision System of SiPlace placement machines, used in SMT assembly.

Cloud Computing: Principles, Systems and Applications (Computer Communications and Networks)

by Lee Gillam and Nikos Antonopoulos

Cloud computing continues to emerge as a subject of substantial industrial and academic interest. Although the meaning and scope of "cloud computing" continues to be debated, the current notion of clouds blurs the distinctions between grid services, web services, and data centers, among other areas. Clouds also bring considerations of lowering the cost for relatively bursty applications to the fore. Cloud Computing: Principles, Systems and Applications is an essential reference/guide that provides a thorough and timely examination of the services, interfaces and types of applications that can be executed on cloud-based systems. The book identifies and highlights state-of-the-art techniques and methods for designing cloud systems, presents mechanisms and schemes for linking clouds to economic activities, and offers balanced coverage of all related technologies that collectively contribute towards the realization of cloud computing. With an emphasis on the conceptual and systemic links between cloud computing and other distributed computing approaches, this text also addresses the practical importance of efficiency, scalability, robustness and security as the four cornerstones of quality of service. Topics and features: explores the relationship of cloud computing to other distributed computing paradigms, namely peer-to-peer, grids, high performance computing and web services; presents the principles, techniques, protocols and algorithms that can be adapted from other distributed computing paradigms to the development of successful clouds; includes a Foreword by Professor Mark Baker of the University of Reading, UK; examines current practical cloud applications and highlights early deployment experiences; elaborates the economic schemes needed for clouds to become viable business models. This book will serve as a comprehensive reference for researchers and students engaged in cloud computing. Professional system architects, technical managers, and IT consultants will also find this unique text a practical guide to the application and delivery of commercial cloud services. Prof. Nick Antonopoulos is Head of the School of Computing, University of Derby, UK. Dr. Lee Gillam is a Lecturer in the Department of Computing at the University of Surrey, UK.

Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction (Human–Computer Interaction Series)

by Anton Nijholt and Desney S. Tan

For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical processes within the brain which correspond with certain forms of thought. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction broadly surveys research in the Brain-Computer Interface domain. More specifically, each chapter articulates some of the challenges and opportunities for using brain sensing in Human-Computer Interaction work, as well as applying Human-Computer Interaction solutions to brain sensing work. For researchers with little or no expertise in neuroscience or brain sensing, the book provides background information to equip them to not only appreciate the state-of-the-art, but also ideally to engage in novel research. For expert Brain-Computer Interface researchers, the book introduces ideas that can help in the quest to interpret intentional brain control and develop the ultimate input device. It challenges researchers to further explore passive brain sensing to evaluate interfaces and feed into adaptive computing systems. Most importantly, the book will connect multiple communities, allowing researchers to leverage each other's work and expertise and blaze into the future.

Data Mining: Concepts, Methods and Applications in Management and Engineering Design (Decision Engineering)

by Jiafu Tang, Yong Yin, Ikou Kaku, and Jianming Zhu

Data Mining introduces in clear and simple ways how to use existing data mining methods to obtain effective solutions for a variety of management and engineering design problems. Data Mining is organised into two parts: the first provides a focused introduction to data mining and the second goes into greater depth on subjects such as customer analysis. It covers almost all managerial activities of a company, including:
* supply chain design,
* product development,
* manufacturing system design,
* product quality control, and
* preservation of privacy.
Incorporating recent developments of data mining that have made it possible to deal with management and engineering design problems with greater efficiency and efficacy, Data Mining presents a number of state-of-the-art topics. It will be an informative resource for researchers, but will also be a useful reference work for industrial and managerial practitioners.

Decision Support Using Nonparametric Statistics

by Warren Beatty

This concise volume covers the nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.

Java Image Processing Recipes: With OpenCV and JVM

by Nicolas Modrzyk

Quickly obtain solutions to common Java image processing problems, learn best practices, and understand everything OpenCV has to offer for image processing. You will work with a JVM image wrapper to make it very easy to run image transformation through pipelines and obtain instant visual feedback. This book makes heavy use of the Gorilla environment, where code can be executed directly in the browser and image transformation results can also be visualized directly in the browser. Java Image Processing Recipes includes recipes on more advanced image manipulation techniques, such as image smoothing, cartooning, sketching, and mastering masks to apply changes only to parts of the image. You’ll see how OpenCV features provide instant solutions to problems such as edge detection and shape finding. Finally, the book contains practical recipes dealing with webcams and various video streams, giving you ready-made code with which to do real-time video analysis.
What You Will Learn:
* Create your personal real-time image manipulation environment
* Manipulate image characteristics with OpenCV
* Work with the Origami image wrapper
* Apply manipulations to webcams and video streams
Who This Book Is For: Developers who want to manipulate images and use other advanced imaging techniques, through code running in the JVM.

Application and Multidisciplinary Aspects of Wireless Sensor Networks: Concepts, Integration, and Case Studies (Computer Communications and Networks)

by Veljko Milutinović, Srdjan Krco, Roman Trobec, Liljana Gavrilovska, and Ivan Stojmenovic

It is a general trend in computing that computers are becoming ever smaller and ever more interconnected. Sensor networks - large networks of small, simple devices - are a logical extreme of this trend. Wireless sensor networks (WSNs) are attracting an increasing degree of research interest, with a growing number of industrial applications starting to emerge. Two of these applications, personal health monitoring and emergency/disaster recovery, are the focus of the European Commission project ProSense: Promote, Mobilize, Reinforce and Integrate Wireless Sensor Networking Research and Researchers. This hands-on introduction to WSN systems development presents a broad coverage of topics in the field, contributed by researchers involved in the ProSense project. An emphasis is placed on the practical knowledge required for the successful implementation of WSNs. Divided into four parts, the first part covers basic issues of sensors, software, and position-based routing protocols. Part two focuses on multidisciplinary issues, including sensor network integration, mobility aspects, georouting, medical applications, and vehicular sensor networks. The remaining two parts present case studies and further applications. Topics and features: presents a broad overview of WSN technology, including an introduction to sensor and sensing technologies; contains an extensive section on case studies, providing details of the development of a number of WSN applications; discusses frameworks for WSN systems integration, through which WSN technology will become fundamental to the Future Internet concept; investigates real-world applications of WSN systems in medical and vehicular sensor networks; with a Foreword by the Nobel Laureate Professor Martin Perl of Stanford University. Providing holistic coverage of WSN technology, this text/reference will enable graduate students of computer science, electrical engineering and telecommunications to master the specific domains of this emerging area. The book will also be a valuable resource for researchers and practitioners interested in entering the field.

Algorithms from and for Nature and Life: Classification and Data Analysis (Studies in Classification, Data Analysis, and Knowledge Organization)

by Berthold Lausen, Alfred Ultsch, and Dirk Van den Poel

This volume provides approaches and solutions to challenges occurring at the interface of research fields such as data analysis, data mining and knowledge discovery, computer science, operations research, and statistics. In addition to theory-oriented contributions, various application areas are included. Moreover, traditional classification research directions concerning network data, graphs, and social relationships, as well as statistical musicology, exemplify the current fields of interest tackled by the authors. The book comprises a total of 55 selected papers presented at the Joint Conference of the German Classification Society (GfKl), the German Association for Pattern Recognition (DAGM), and the Symposium of the International Federation of Classification Societies (IFCS) in 2011.

Conversations About Challenges in Computing

by Aslak Tveito and Are Magnus Bruaset

This text sheds light on how mathematical models and computing can aid the understanding and prediction of complicated physical processes; how communication networks should be designed and implemented to meet the increasingly challenging requirements from users; and how modern engineering principles can lead to better and more robust software systems. Through interviews with 12 internationally recognized researchers within these fields, conducted by the well-known science writer Dana Mackenzie and the science journalist Kathrine Aspaas, the reader gets views on recent achievements and future challenges.

An Intelligent Customer Complaint Management System with Application to the Transport and Logistics Industry (Springer Theses)

by Alireza Faed

This thesis addresses the issue of customer complaints in the context of Customer Relationship Management (CRM). After a comprehensive survey of the current literature on CRM, the thesis describes the development of a new intelligent CRM (I-CRM) framework, which integrates text analytics, type mapping, SPSS, structural equation modeling, and linear and fuzzy approaches. This new methodology, in contrast to previous ones, is able to handle customer complaints with respect to different variables, thus allowing organizations to find their key customers and key complaints, and to address and provide solutions to the major complaints of the key customers, hence promoting business development. The thesis also describes the successful application of the method to a real-world case, represented by the immeasurable truck drivers' complaints at the Fremantle port in Western Australia.

Compressed Sensing with Side Information on the Feasible Region (SpringerBriefs in Electrical and Computer Engineering)

by Mohammad Rostami

This book discusses compressive sensing in the presence of side information. Compressive sensing is an emerging technique for efficiently acquiring and reconstructing a signal. Interesting instances of Compressive Sensing (CS) can occur when, apart from sparsity, side information is available about the source signals. The side information can be about the source structure, distribution, etc. Such cases can be viewed as extensions of the classical CS. In these cases we are interested in incorporating the side information to either improve the quality of the source reconstruction or decrease the number of samples required for accurate reconstruction. In this book we assume availability of side information about the feasible region. The main applications investigated are image deblurring for optical imaging, 3D surface reconstruction, and reconstructing spatiotemporally correlated sources. The author shows that the side information can be used to improve the quality of the reconstruction compared to the classic compressive sensing. The book will be of interest to all researchers working on compressive sensing, inverse problems, and image processing.

Advanced Intelligent Computational Technologies and Decision Support Systems (Studies in Computational Intelligence #486)

by Roumen Kountchev and Barna Iantovics

This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare, assisting humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding of single and sequences of medical images; Medical Decision Support Systems; diagnosis of Down's syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks for real-life problem solving; the present and perspectives for Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images, etc.

Defying Reality: The Inside Story of the Virtual Reality Revolution

by David M. Ewalt

A fascinating exploration of the history, development, and future of virtual reality, a technology with world-changing potential, written by award-winning journalist and author David Ewalt, stemming from his 2015 Forbes cover story about the Oculus Rift and its creator Palmer Luckey. You’ve heard about virtual reality, seen the new gadgets, and read about how VR will be the next big thing. But you probably haven’t yet realized the extent to which this technology will change the way we live. We used to be bound to a physical reality, but new immersive computer simulations allow us to escape our homes and bodies. Suddenly anyone can see what it’s like to stand on the peak of Mount Everest. A person who can’t walk can experience a marathon from the perspective of an Olympic champion. And why stop there? Become a dragon and fly through the universe. But it’s not only about spectacle. Virtual and augmented reality will impact nearly every aspect of our lives—commerce, medicine, politics—the applications are infinite. It may sound like science fiction, but this vision of the future drives billions of dollars in business and is a top priority for such companies as Facebook, Google, and Sony. Yet little is known about the history of these technologies. In Defying Reality, David M. Ewalt traces the story from ancient amphitheaters to Cold War military laboratories, through decades of hype and failure, to a nineteen-year-old video game aficionado who made the impossible possible. Ewalt looks at how businesses are already using this tech to revolutionize the world around us, and what we can expect in the future. Writing for a mainstream audience as well as for technology enthusiasts, Ewalt offers a unique perspective on VR. With firsthand accounts and on-the-ground reporting, Defying Reality shows how virtual reality will change our work, our play, and the way we relate to one another.

Mathematica for Bioinformatics: A Wolfram Language Approach To Omics

by George Mias

This book offers a comprehensive introduction to using Mathematica and the Wolfram Language for Bioinformatics. The chapters build gradually from basic concepts and the introduction of the Wolfram Language and coding paradigms in Mathematica, to detailed worked examples derived from typical research applications using Wolfram Language code. The coding examples range from basic sequence analysis, accessing genomic databases, differential gene expression, and machine learning implementations to time series analysis of longitudinal omics experiments, multi-omics integration and building dynamic interactive bioinformatics tools using the Wolfram Language. The topics address the daily bioinformatics needs of a broad audience: experimental users looking to understand and visualize their data, beginner bioinformaticians acquiring coding expertise in providing biological research solutions, and practicing expert bioinformaticians working on omics who wish to expand their toolset to include the Wolfram Language.

Transforming Digital Worlds: 13th International Conference, iConference 2018, Sheffield, UK, March 25-28, 2018, Proceedings (Lecture Notes in Computer Science #10766)

by Gobinda Chowdhury, Julie McLeod, Val Gillet, and Peter Willett

This book constitutes the proceedings of the 13th International Conference on Transforming Digital Worlds, iConference 2018, held in Sheffield, UK, in March 2018. The 42 full papers and 40 short papers presented together with the abstracts of 3 invited talks in this volume were carefully reviewed and selected from 219 submissions. The papers address topics such as social media; communication studies and online communities; mobile information and cloud computing; data mining and data analytics; information retrieval; information behaviour and digital literacy; digital curation; and information education and libraries.

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy

by George Gilder

“Google’s algorithms assume the world’s future is nothing more than the next moment in a random process. George Gilder shows how deep this assumption goes, what motivates people to make it, and why it’s wrong: the future depends on human action.” — Peter Thiel, founder of PayPal and Palantir Technologies and author of Zero to One: Notes on Startups, or How to Build the Future.
“If you want to be clued in to the unfolding future, then you have come to the right place. For decades, George Gilder has been the undisputed oracle of technology’s future. Are giant companies like Google, Amazon, and Facebook the unstoppable monopolistic juggernauts that they seem, or are they dysfunctional giants about to be toppled by tech-savvy, entrepreneurial college dropouts?” — Nick Tredennick, Ph.D., Chief Scientist, QuickSilver Technology
Silicon Valley’s Nervous Breakdown
The Age of Google, built on big data and machine intelligence, has been an awesome era. But it’s coming to an end. In Life after Google, George Gilder—the peerless visionary of technology and culture—explains why Silicon Valley is suffering a nervous breakdown and what to expect as the post-Google age dawns. Google’s astonishing ability to “search and sort” attracts the entire world to its search engine and countless other goodies—videos, maps, email, calendars… And everything it offers is free, or so it seems. Instead of paying directly, users submit to advertising. The system of “aggregate and advertise” works—for a while—if you control an empire of data centers, but a market without prices strangles entrepreneurship and turns the Internet into a wasteland of ads. The crisis is not just economic. Even as advances in artificial intelligence induce delusions of omnipotence and transcendence, Silicon Valley has pretty much given up on security. The Internet firewalls supposedly protecting all those passwords and personal information have proved hopelessly permeable. The crisis cannot be solved within the current computer and network architecture. The future lies with the “cryptocosm”—the new architecture of the blockchain and its derivatives. Enabling cryptocurrencies such as bitcoin and ether, NEO and Hashgraph, it will provide the Internet a secure global payments system, ending the aggregate-and-advertise Age of Google. Silicon Valley, long dominated by a few giants, faces a “great unbundling,” which will disperse computer power and commerce and transform the economy and the Internet. Life after Google is almost here. For fans of “Wealth and Poverty,” “Knowledge and Power,” and “The Scandal of Money.”

Complexity in Financial Markets: Modeling Psychological Behavior in Agent-Based Models and Order Book Models (Springer Theses)

by Matthieu Cristelli

Tools and methods from complex systems science can have a considerable impact on the way in which the quantitative assessment of economic and financial issues is approached, as discussed in this thesis. First, it is shown that the self-organization of financial markets is a crucial factor in the understanding of their dynamics. In fact, using an agent-based approach, it is argued that financial markets' stylized facts appear only in the self-organized state. Secondly, the thesis points out the potential of so-called big data science for financial market modeling, investigating how web-driven data can yield a picture of market activities: it has been found that web query volumes anticipate trade volumes. As a third achievement, the metrics developed here for country competitiveness and product complexity are groundbreaking in comparison to mainstream theories of economic growth and technological development. A key element in assessing the intangible variables determining the success of countries in the present globalized economy is represented by the diversification of the productive basket of countries. The comparison between the level of complexity of a country's productive system and economic indicators such as the GDP per capita discloses its hidden growth potential.

Investigations in Computational Sarcasm (Cognitive Systems Monographs #37)

by Aditya Joshi, Pushpak Bhattacharyya, and Mark J. Carman

This book describes the authors’ investigations of computational sarcasm based on the notion of incongruity. In addition, it provides a holistic view of past work in computational sarcasm and the challenges and opportunities that lie ahead. Sarcastic text is a peculiar form of sentiment expression and computational sarcasm refers to computational techniques that process sarcastic text. To first understand the phenomenon of sarcasm, three studies are conducted: (a) how is sarcasm annotation impacted when done by non-native annotators? (b) How is sarcasm annotation impacted when the task is to distinguish between sarcasm and irony? And (c) can targets of sarcasm be identified by humans and computers? Following these studies, the book proposes approaches for two research problems: sarcasm detection and sarcasm generation. To detect sarcasm, incongruity is captured in two ways: ‘intra-textual incongruity’ where the authors look at incongruity within the text to be classified (i.e., target text) and ‘context incongruity’ where the authors incorporate information outside the target text. These approaches use machine-learning techniques such as classifiers, topic models, sequence labelling, and word embeddings. These approaches operate at multiple levels: (a) sentiment incongruity (based on sentiment mixtures), (b) semantic incongruity (based on word embedding distance), (c) language model incongruity (based on unexpected language model), (d) author’s historical context (based on past text by the author), and (e) conversational context (based on cues from the conversation). In the second part of the book, the authors present the first known technique for sarcasm generation, which uses a template-based approach to generate a sarcastic response to user input. This book will prove to be a valuable resource for researchers working on sentiment analysis, especially as applied to automation in social media.
