Browse Results

Showing 25,576 through 25,600 of 28,620 results

The Cooperative Enterprise: Practical Evidence for a Theory of Cooperative Entrepreneurship (Cooperative Management)

by George Baourakis, Gert van Dijk, and Panagiota Sergaki

This book presents a study of cooperatives as a two-layer entrepreneurial model, and analyzes cooperative enterprises. Above all, it explores how inducements (from the firm) and contributions (from its members, in their respective roles) are aligned, and seeks to answer the question of what this means for managing each cooperative as a firm as well as a group. The book is divided into three parts, the first of which begins with an analysis of specific aspects of cooperative enterprises, with a focus on the added value of cooperation, the weighing of interests, and a behavioral perspective on the imminent communities and their goals. In a structured approach, the book examines the various facets of relationships in cooperatives on a transactional, financial and control level. Further, a case study on the Dutch cooperative Rabobank illustrates what happens when members fail. In turn, part two concentrates on integrating the lessons learned with the existing economic literature on cooperatives, so as to contribute to a theory of cooperative management. Finally, the book links the theoretical approach to practice: in the third part, it reports on the outcomes of using a computerized simulation game to show members of cooperatives how to manage their business and the cooperative business at the same time, enabling them to understand and actively practice two-level entrepreneurship.

The Cooperstown Casebook: Who's in the Baseball Hall of Fame, Who Should Be In, and Who Should Pack Their Plaques

by Jay Jaffe

A revolutionary method for electing players to the Baseball Hall of Fame from Sports Illustrated writer Jay Jaffe, using his popular and proprietary “JAWS” ranking system. The National Baseball Hall of Fame and Museum, tucked away in upstate New York in a small town called Cooperstown, is far from any major media market or big league stadium. Yet no sports hall of fame’s membership is so hallowed, nor its qualifications so debated, nor its voting process so dissected. Since its founding in 1936, the Hall of Fame’s standards for election have been nebulous, and its selection processes arcane, resulting in confusion among voters, not to mention mistakes in who has been recognized and who has been bypassed. Numerous so-called “greats” have been inducted despite having not been so great, while popular but controversial players such as all-time home run leader Barry Bonds and all-time hits leader Pete Rose are on the outside looking in. Now, in The Cooperstown Casebook, Jay Jaffe shows us how to use his revolutionary ranking system to ensure the right players are recognized. The foundation of Jaffe’s approach is his JAWS system, an acronym for the Jaffe WAR Score, which he developed over a decade ago. Through JAWS, each candidate can be objectively compared on the basis of career and peak value to the players at his position who are already in the Hall of Fame. Because of its utility, JAWS has gained an increasing amount of exposure in recent years. Through his analysis, Jaffe shows why the Hall of Fame still matters and how it can remain relevant in the 21st century.
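For readers curious about the mechanics, the sketch below illustrates the kind of career-plus-peak comparison the blurb describes, assuming the commonly cited definition of JAWS as the average of a player's career WAR and the WAR of his seven best seasons; the player data and positional average are invented for illustration and are not taken from the book.

```python
# Illustrative sketch of a JAWS-style comparison (assumed definition: the average
# of career WAR and the sum of a player's seven best seasons). The season values
# and positional average below are made up for demonstration purposes only.

def jaws(seasonal_war: list[float]) -> float:
    """Average of career WAR and seven-year peak WAR."""
    career = sum(seasonal_war)
    peak7 = sum(sorted(seasonal_war, reverse=True)[:7])
    return (career + peak7) / 2

# Hypothetical candidate vs. the average JAWS of enshrined players at his position.
candidate_seasons = [6.1, 5.4, 7.2, 4.8, 3.9, 6.5, 2.2, 1.8, 5.0, 3.1]
positional_hall_average = 55.0  # placeholder value, not a real figure

print(f"Candidate JAWS: {jaws(candidate_seasons):.1f} "
      f"(position average: {positional_hall_average:.1f})")
```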

The Corona Problem

by Steven G. Krantz, Ronald G. Douglas, Eric T. Sawyer, Sergei Treil, and Brett D. Wick

The purpose of the corona workshop was to consider the corona problem in both one and several complex variables, in the context of function theory and harmonic analysis as well as that of operator theory and functional analysis. It was held in June 2012 at the Fields Institute in Toronto, and attended by about fifty mathematicians. This volume validates and commemorates the workshop, and records some of the ideas developed there. The corona problem dates back to 1941. It has exerted a powerful influence over mathematical analysis for nearly 75 years. There is material to help bring people up to speed on the latest ideas of the subject, as well as historical material to provide background. Particularly noteworthy is a history of the corona problem, authored by the five organizers, that provides a unique glimpse at how the problem and its many different solutions have developed. There has never been a meeting of this kind, and there has never been a volume of this kind. Mathematicians--both veterans and newcomers--will benefit from reading this book. This volume makes a unique contribution to the analysis literature and will be a valuable part of the canon for many years to come.

The Correctness-by-Construction Approach to Programming

by Bruce W. Watson and Derrick G. Kourie

The focus of this book is on bridging the gap between two extreme methods for developing software. On the one hand, there are texts and approaches that are so formal that they scare off all but the most dedicated theoretical computer scientists. On the other, there are some who believe that any measure of formality is a waste of time, resulting in software that is developed by following gut feelings and intuitions. Kourie and Watson advocate an approach known as "correctness-by-construction," a technique to derive algorithms that relies on formal theory, but that requires such theory to be deployed in a very systematic and pragmatic way. First they provide the key theoretical background (like first-order predicate logic or refinement laws) that is needed to understand and apply the method. They then detail a series of graded examples ranging from binary search to lattice cover graph construction and finite automata minimization in order to show how it can be applied to increasingly complex algorithmic problems. The principal purpose of this book is to change the way software developers approach their task at the programming-in-the-small level, with a view to improving code quality. Thus it coheres with both the IEEE's Guide to the Software Engineering Body of Knowledge (SWEBOK) recommendations, which identify themes covered in this book as part of the software engineer's arsenal of tools and methods, and with the goals of the Software Engineering Method and Theory (SEMAT) initiative, which aims to "refound software engineering based on a solid theory."
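As a flavor of the approach, here is a minimal sketch (not drawn from the book) of binary search written together with its loop invariant, the kind of formal-yet-pragmatic reasoning that correctness-by-construction relies on.

```python
# A minimal illustration (not from the book) of coding with an explicit invariant,
# in the spirit of correctness-by-construction: the loop body is written so that
# the stated invariant holds before and after every iteration.

def binary_search(a: list[int], key: int) -> int:
    """Return an index of key in the sorted list a, or -1 if absent."""
    lo, hi = 0, len(a)
    # Invariant: if key occurs in a, then key occurs in a[lo:hi].
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < key:
            lo = mid + 1      # key, if present, lies in a[mid+1:hi]
        elif a[mid] > key:
            hi = mid          # key, if present, lies in a[lo:mid]
        else:
            return mid        # the invariant plus a[mid] == key gives the result
    return -1                 # lo == hi: the candidate range is empty

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```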

The Courant–Friedrichs–Lewy (CFL) Condition

by Carlos A. de Moura and Carlos S. Kubrusly

This volume comprises a carefully selected collection of articles emerging from and pertinent to the 2010 CFL-80 conference in Rio de Janeiro, celebrating the 80th anniversary of the Courant-Friedrichs-Lewy (CFL) condition. A major result in the field of numerical analysis, the CFL condition has influenced the research of many important mathematicians over the past eight decades, and this work is meant to take stock of its most important and current applications. The Courant-Friedrichs-Lewy (CFL) Condition: 80 Years After its Discovery will be of interest to practicing mathematicians, engineers, physicists, and graduate students who work with numerical methods.
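For background, the condition being celebrated is usually stated, in its simplest one-dimensional form, as a bound on the time step of an explicit scheme; the formulation below is the standard textbook version, not a quotation from this volume.

```latex
% Standard one-dimensional CFL condition for an explicit scheme applied to the
% advection equation u_t + a u_x = 0, with time step \Delta t and mesh width \Delta x.
% The admissible constant C_max depends on the scheme (C_max = 1 for many explicit methods).
\[
  C \;=\; \frac{|a|\,\Delta t}{\Delta x} \;\le\; C_{\max}
\]
```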

The Cox Model and Its Applications

by Mikhail Nikulin and Hong-Dar Isaac Wu

This book will be of interest to readers active in the fields of survival analysis, genetics, ecology, biology, demography, reliability and quality control. Since Sir David Cox's pioneering work in 1972, the proportional hazards model has become the most important model in survival analysis. The success of the Cox model stimulated further studies in semiparametric and nonparametric theories, counting process models, study designs in epidemiology, and the development of many other regression models that could offer more flexible or more suitable approaches in data analysis. Flexible semiparametric regression models are increasingly being used to relate lifetime distributions to time-dependent explanatory variables. Throughout the book, various recent statistical models are developed in close connection with specific data from experimental studies in clinical trials or from observational studies.
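For reference, the proportional hazards model at the heart of the book is conventionally written as follows (standard notation, not quoted from this text): a nonparametric baseline hazard scaled by a log-linear term in the, possibly time-dependent, covariates.

```latex
% Standard form of the Cox proportional hazards model (general background, not a
% quotation from this book): h_0(t) is the unspecified baseline hazard and x(t)
% the vector of (possibly time-dependent) explanatory variables.
\[
  h\bigl(t \mid x(t)\bigr) \;=\; h_0(t)\,\exp\!\bigl(\beta^\top x(t)\bigr)
\]
```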

The Craft of Economics

by Edward E. Leamer

In this spirited and provocative book, Edward Leamer turns an examination of the Heckscher–Ohlin framework for global competition into an opportunity to consider the craft of economics: what economists do, what they should do, and what they shouldn't do. Claiming "a lifetime relationship with Heckscher–Ohlin," Leamer argues that Bertil Ohlin's original idea offered something useful though vague and not necessarily valid; the economists who later translated his ideas into mathematical theorems offered something precise and valid but not necessarily useful. He argues further that the best economists keep formal and informal thinking in balance. An Ohlinesque mostly prose style can let in faulty thinking and fuzzy communication; a mostly math style allows misplaced emphasis and opaque communication. Leamer writes that today's model- and math-driven economics needs more prose and less math. Leamer shows that the Heckscher–Ohlin framework is still useful, and that there is still much work to be done with it. But he issues a caveat about economists: "What we do is not science, it's fiction and journalism." Economic theory, he writes, is fiction (stories, loosely connected to the facts); data analysis is journalism (facts, loosely connected to the stories). Rather than titling the two sections of his book Theory and Evidence, he calls them Economic Fiction and Econometric Journalism, explaining, "If you find that startling, that's good. I am trying to keep you awake."

The Craft of Economics: Lessons from the Heckscher-Ohlin Framework (Ohlin Lectures)

by Edward E. Leamer

A review of the Heckscher–Ohlin framework prompts a noted economist to consider the methodology of economics. In this spirited and provocative book, Edward Leamer turns an examination of the Heckscher–Ohlin framework for global competition into an opportunity to consider the craft of economics: what economists do, what they should do, and what they shouldn't do. Claiming “a lifetime relationship with Heckscher–Ohlin,” Leamer argues that Bertil Ohlin's original idea offered something useful though vague and not necessarily valid; the economists who later translated his ideas into mathematical theorems offered something precise and valid but not necessarily useful. He argues further that the best economists keep formal and informal thinking in balance. An Ohlinesque mostly prose style can let in faulty thinking and fuzzy communication; a mostly math style allows misplaced emphasis and opaque communication. Leamer writes that today's model- and math-driven economics needs more prose and less math. Leamer shows that the Heckscher–Ohlin framework is still useful, and that there is still much work to be done with it. But he issues a caveat about economists: “What we do is not science, it's fiction and journalism.” Economic theory, he writes, is fiction (stories, loosely connected to the facts); data analysis is journalism (facts, loosely connected to the stories). Rather than titling the two sections of his book Theory and Evidence, he calls them Economic Fiction and Econometric Journalism, explaining, “If you find that startling, that's good. I am trying to keep you awake.”

The Craft of Model-Based Testing

by Paul C. Jorgensen

In his latest work, author Paul C. Jorgensen takes his well-honed craftsman’s approach to mastering model-based testing (MBT). To be expert at MBT, a software tester has to understand it as a craft rather than an art. This means a tester should have deep knowledge of the underlying subject and be well practiced in carrying out modeling and testing techniques. Judgment is needed, as well as an understanding of MBT tools. The first part of the book helps testers in developing that judgment. It starts with an overview of MBT and follows with an in-depth treatment of nine different testing models with a chapter dedicated to each model. These chapters are tied together by a pair of examples: a simple insurance premium calculation and an event-driven system that describes a garage door controller. The book shows how simpler models—flowcharts, decision tables, and UML Activity charts—express the important aspects of the insurance premium problem. It also shows how transition-based models—finite state machines, Petri nets, and statecharts—are necessary for the garage door controller but are overkill for the insurance premium problem. Each chapter describes the extent to which a model can support MBT. The second part of the book gives testers a greater understanding of MBT tools. It examines six commercial MBT products, presents the salient features of each product, and demonstrates using the product on the insurance premium and the garage door controller problems. These chapters each conclude with advice on implementing MBT in an organization. The last chapter describes six Open Source tools to round out a tester’s knowledge of MBT. In addition, the book supports the International Software Testing Qualifications Board’s (ISTQB®) MBT syllabus for certification.
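To make the transition-based modeling concrete, the toy sketch below shows a finite state machine for a garage-door-style controller. Its states and events are invented for illustration and do not reproduce the model used in the book, but walking paths through such a transition table is how model-based tests are typically generated.

```python
# A toy finite state machine for a garage door controller (illustrative only;
# the states and events do not reproduce the model used in the book).
# In model-based testing, paths through such a transition table become test cases.

TRANSITIONS = {
    ("closed", "button"): "opening",
    ("opening", "limit_reached"): "open",
    ("open", "button"): "closing",
    ("closing", "limit_reached"): "closed",
    ("closing", "obstacle"): "opening",   # safety reversal
}

def run(events, state="closed"):
    """Drive the machine through a sequence of events, returning visited states."""
    visited = [state]
    for event in events:
        state = TRANSITIONS.get((state, event), state)  # ignore invalid events
        visited.append(state)
    return visited

# One generated test path: open the door, start closing, hit an obstacle, reopen.
print(run(["button", "limit_reached", "button", "obstacle", "limit_reached"]))
```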

The Cramér–Lundberg Model and Its Variants: A Queueing Perspective (Springer Actuarial)

by Michel Mandjes and Onno Boxma

This book offers a comprehensive examination of the Cramér–Lundberg model, which is the most extensively researched model in ruin theory. It covers the fundamental dynamics of an insurance company's surplus level in great detail, presenting a thorough analysis of the ruin probability and related measures for both the standard model and its variants. Providing a systematic and self-contained approach to evaluate the crucial quantities found in the Cramér–Lundberg model, the book makes use of connections with related queueing models when appropriate, and its emphasis on clean transform-based techniques sets it apart from other works. In addition to consolidating a wealth of existing results, the book also derives several new outcomes using the same methodology. This material is complemented by a thoughtfully chosen collection of exercises. The book's primary target audience is master's and starting PhD students in applied mathematics, operations research, and actuarial science, although it also serves as a useful methodological resource for more advanced researchers. The material is self-contained, requiring only a basic grounding in probability theory and some knowledge of transform techniques.
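For orientation, the surplus process and ruin probability studied in the book are conventionally defined as below (the standard notation for the classical model, stated here only as background).

```latex
% Classical Cramér–Lundberg surplus process (standard definition, given only as
% background): initial capital u, premium rate c, i.i.d. claims X_i arriving
% according to a Poisson process N(t); psi(u) denotes the ruin probability.
\[
  U(t) \;=\; u + c\,t - \sum_{i=1}^{N(t)} X_i,
  \qquad
  \psi(u) \;=\; \mathbb{P}\Bigl(\,\inf_{t \ge 0} U(t) < 0 \,\Bigm|\, U(0) = u\Bigr)
\]
```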

The Crayon Counting Book (Jerry Pallotta's Counting Books)

by Pam Muñoz Ryan and Jerry Pallotta

Crayons aren't just for coloring anymore! This colorful rhyme teaches counting by twos–two different ways. First, use the even numbers to count up to 24. Then start over with the odd numbers. Along the way you'll learn unusual colors, like iguana and fiddlehead. Do any of them sound familiar? They should! They come from the pages of Jerry Pallotta's alphabet books. Counting has never been more fun or colorful!
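The two counting patterns the rhyme teaches can be summarized in a couple of lines (an illustrative snippet, not part of the book):

```python
# Counting by twos, the two ways described above (illustrative only):
# the even numbers up to 24, then starting over with the odd numbers.
print(list(range(2, 25, 2)))   # 2, 4, 6, ..., 24
print(list(range(1, 25, 2)))   # 1, 3, 5, ..., 23
```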

The Creativity Code: Art and Innovation in the Age of AI

by Marcus du Sautoy

Most books on AI focus on the future of work. But now that algorithms can learn and adapt, does the future of creativity also belong to well-programmed machines? To answer this question, Marcus du Sautoy takes us to the forefront of creative new technologies and offers a more positive and unexpected vision of our future cohabitation with machines.

The Credibility Gap: Evaluating and Improving Empirical Research in the Social Sciences

by Anna Dreber and Magnus Johannesson

Which scientific results can we trust? This question has been brought to the forefront of research in the social sciences in recent years with the movement towards open science practices and preregistration. Systematic replication studies of laboratory experiments in the social sciences have found that only about half of the “statistically significant” results published in top journals can be replicated in the sense that similar results are achieved with new data. Replicability may be even lower in studies based on observational data, as such studies have more degrees of freedom in the analysis of the data, leaving greater scope to selectively report more publishable findings. In this book, the authors provide a framework for evaluating reproducibility, replicability and generalizability of empirical research in the social sciences. They define different types of reproducibility and replicability and show how they can be measured to evaluate the credibility of published findings. Different approaches to improving the credibility of published findings, such as preregistration with detailed pre-analysis plans, Registered Report publications, and preregistered prospective meta-analysis, are also outlined and discussed. Even if published results are not systematically biased, the variation in results across populations, research designs, and analyses decreases the reliability and generalizability of published findings. The book shows how such heterogeneity in results can be measured and incorporated in the analysis to more accurately represent the uncertainty, and thereby the generalizability, of reported results.

The Crest of the Peacock

by George Gheverghese Joseph

From the Ishango Bone of central Africa and the Inca quipu of South America to the dawn of modern mathematics, The Crest of the Peacock makes it clear that human beings everywhere have been capable of advanced and innovative mathematical thinking. George Gheverghese Joseph takes us on a breathtaking multicultural tour of the roots and shoots of non-European mathematics. He shows us the deep influence that the Egyptians and Babylonians had on the Greeks, the Arabs' major creative contributions, and the astounding range of successes of the great civilizations of India and China. The third edition emphasizes the dialogue between civilizations, and further explores how mathematical ideas were transmitted from East to West. The book's scope is now even wider, incorporating recent findings on the history of mathematics in China, India, and early Islamic civilizations as well as Egypt and Mesopotamia. With more detailed coverage of proto-mathematics and the origins of trigonometry and infinity in the East, The Crest of the Peacock further illuminates the global history of mathematics.

The Crime Data Handbook

by Mark Mills, Rosemary Barberet, Nicholas Lord, Jude Towers, Alex Sutherland, Craig Bennell, Lisa Tompson, Jack Cunliffe, Henk Elffers, Kirsty Bennett, Stuart Thomas, Anthony Morgan, Scott Keay, Alexandru Cernat, Ian Brunton-Smith, Tim Verlaan, Sam Langton, Sophie Curtis-Ham, Sarah Czarnomski, Jesús C. Aguerri, Fernando Miró-Llinares, Tomas Diviak, Tori Semple, Bryce Jenkins, Angelo Moretti, Jose Pina-Sanchez, Thiago R. Oliveira, Leticia Couto, Marta Murrià Sangenís, Cristina Sobrino Garcés, Timothy I. Cubitt, Nico Trajtenberg, Olga Sanchez de Ribera de Castro, Carly Lighttowlers, Lucy Bryant, Olivia Horsefield, Francisco J. Castro-Toledo, Ana B. Gómez-Bellvís, Sara Correia-Hopkins, José María López Riba, Raquel Bartolomé Gutiérrez, and Esther Fernández-Molina

Crime research has grown substantially over the past decade, with a rise in evidence-informed approaches to criminal justice, statistics-driven decision-making and predictive analytics. The fuel that has driven this growth is data – and one of its most pressing challenges is the lack of research on the use and interpretation of data sources. This accessible, engaging book closes that gap for researchers, practitioners and students. International researchers and crime analysts discuss the strengths, perils and opportunities of the data sources and tools now available and their best use in informing sound public policy and criminal justice practice.

The Crossing of Heaven

by Karl Gustafson and Ioannis Antoniou

Among the group of physics honors students huddled in 1957 on a Colorado mountain watching Sputnik bisect the heavens, one young scientist was destined, three short years later, to become a key player in America's own top-secret spy satellite program. One of our era's most prolific mathematicians, Karl Gustafson was given just two weeks to write the first US spy satellite's software. The project would fundamentally alter America's Cold War strategy, and this autobiographical account of a remarkable academic life spent in the top flight tells this fascinating inside story for the first time. Gustafson takes you from his early pioneering work in computing, through fascinating encounters with Nobel laureates and Fields medalists, to his current observations on mathematics, science and life. He tells of brushes with death, being struck by lightning, and the beautiful women who have been a part of his journey.

The Cryptoclub: Using Mathematics to Make and Break Secret Codes

by Janet Beissinger and Vera Pless

Join the Cryptokids as they apply basic mathematics to make and break secret codes. This book has many hands-on activities that have been tested in both classrooms and informal settings. Classic coding methods are discussed, such as Caesar, substitution, Vigenère, and multiplicative ciphers, as well as the modern RSA. Math topics covered include addition and subtraction with negative numbers, decimals, and percentages; factorization; modular arithmetic; exponentiation; prime numbers; and frequency analysis. The accompanying workbook, The Cryptoclub Workbook: Using Mathematics to Make and Break Secret Codes, provides students with problems related to each section to help them master the concepts introduced throughout the book. A PDF version of the workbook is available at no charge on the download tab; a printed workbook is available for $19.95 (K00701). The teacher manual can be requested from the publisher by contacting the Academic Sales Manager, Susie Carlisle.
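As a taste of the classic methods listed above, here is a small Caesar cipher; this particular implementation is an illustrative sketch and is not taken from the book or its workbook.

```python
# A Caesar cipher along the lines of the classic methods the book introduces
# (illustrative implementation, not taken from the book): each letter is shifted
# by a fixed amount, wrapping around the alphabet with arithmetic mod 26.

def caesar(text: str, shift: int) -> str:
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)               # leave spaces and punctuation alone
    return "".join(result)

message = "MEET AT NOON"
secret = caesar(message, 3)                 # encode with a shift of 3
print(secret)                               # -> "PHHW DW QRRQ"
print(caesar(secret, -3))                   # decode by shifting back
```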

The Cultural Landscape: An Introduction to Human Geography

by James M. Rubenstein

The Cultural Landscape: An Introduction to Human Geography uses a structured learning path to examine the patterns and processes of Earth's human landscapes. Rubenstein explores the relationships between people and their locations, looking at where people and activities are located across Earth's surface and seeking to understand why they are located in particular places.

The Culture Transplant: How Migrants Make the Economies They Move To a Lot Like the Ones They Left

by Garett Jones

A provocative new analysis of immigration's long-term effects on a nation's economy and culture. Over the last two decades, as economists began using big datasets and modern computing power to reveal the sources of national prosperity, their statistical results kept pointing toward the power of culture to drive the wealth of nations. In The Culture Transplant, Garett Jones documents the cultural foundations of cross-country income differences, showing that immigrants import cultural attitudes from their homelands—toward saving, toward trust, and toward the role of government—that persist for decades, and likely for centuries, in their new national homes. Full assimilation in a generation or two, Jones reports, is a myth. And the cultural traits migrants bring to their new homes have enduring effects upon a nation's economic potential. Built upon mainstream, well-reviewed academic research that hasn't pierced the public consciousness, this book offers a compelling refutation of an unspoken consensus that a nation's economic and political institutions won't be changed by immigration. Jones refutes the common view that we can discuss migration policy without considering whether migration can, over a few generations, substantially transform the economic and political institutions of a nation. And since most of the world's technological innovations come from just a handful of nations, Jones concludes, the entire world has a stake in whether migration policy will help or hurt the quality of government and thus the quality of scientific breakthroughs in those rare innovation powerhouses.

The Curve Shortening Problem

by Kai-Seng Chou and Xi-Ping Zhu

Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results.

The Data Book: Collection and Management of Research Data (Chapman & Hall/CRC Interdisciplinary Statistics)

by Meredith Zozus

The Data Book: Collection and Management of Research Data is the first practical book written for researchers and research team members covering how to collect and manage data for research. The book covers basic types of data and fundamentals of how data grow, move and change over time. Focusing on pre-publication data collection and handling, the text illustrates use of these key concepts to match data collection and management methods to a particular study, in essence, making good decisions about data. The first section of the book defines data, introduces fundamental types of data that bear on methodology to collect and manage them, and covers data management planning and research reproducibility. The second section covers basic principles of and options for data collection and processing emphasizing error resistance and traceability. The third section focuses on managing the data collection and processing stages of research such that quality is consistent and ultimately capable of supporting conclusions drawn from data. The final section of the book covers principles of data security, sharing, and archival. This book will help graduate students and researchers systematically identify and implement appropriate data collection and handling methods.

The Data Detective: Ten Easy Rules to Make Sense of Statistics

by Tim Harford

From “one of the great (greatest?) contemporary popular writers on economics” (Tyler Cowen) comes a smart, lively, and encouraging rethinking of how to use statistics. Today we think statistics are the enemy, numbers used to mislead and confuse us. That’s a mistake, Tim Harford says in The Data Detective. We shouldn’t be suspicious of statistics—we need to understand what they mean and how they can improve our lives: they are, at heart, human behavior seen through the prism of numbers and are often “the only way of grasping much of what is going on around us.” If we can toss aside our fears and learn to approach them clearly—understanding how our own preconceptions lead us astray—statistics can point to ways we can live better and work smarter. As “perhaps the best popular economics writer in the world” (New Statesman), Tim Harford is an expert at taking complicated ideas and untangling them for millions of readers. In The Data Detective, he uses new research in science and psychology to set out ten strategies for using statistics to erase our biases and replace them with new ideas that use virtues like patience, curiosity, and good sense to better understand ourselves and the world. As a result, The Data Detective is a big-idea book about statistics and human behavior that is fresh, unexpected, and insightful.

The Data Game: Controversies in Social Science Statistics (Habitat Guides)

by Mark Maier and Jennifer Imazeki

This book introduces students to the collection, uses, and interpretation of statistical data in the social sciences. It would suit all social science introductory statistics and research methods courses. Separate chapters are devoted to data in the fields of demography, housing, health, education, crime, the economy, wealth, income, poverty, labor, business statistics, and public opinion polling, with a concluding chapter devoted to the common problem of ambiguity. Each chapter includes multiple case studies illustrating the controversies, an overview of data sources including websites, a chapter summary, and a set of case study questions designed to stimulate further thought.

The Data Industry: The Business and Economics of Information and Big Data

by Chunlei Tang

An introduction of the data industry to the field of economics. This book bridges the gap between economics and data science to help data scientists understand the economics of big data, and enable economists to analyze the data industry. It begins by explaining data resources and introduces the data asset. This book defines a data industry chain and enumerates data enterprises' business models, operating models, and developing models. The author describes five types of enterprise agglomerations, and multiple industrial cluster effects. A discussion on the establishment and development of data industry related laws and regulations is provided. In addition, this book discusses several scenarios on how to convert data driving forces into productivity that can then serve society. This book is designed to serve as a reference and training guide for data scientists, data-oriented managers and executives, entrepreneurs, scholars, and government employees. It defines and develops the concept of a "data industry" and explains the economics of data to data scientists and statisticians; includes numerous case studies and examples from a variety of industries and disciplines; and serves as a useful guide for practitioners and entrepreneurs in the business of data technology. The Data Industry: The Business and Economics of Information and Big Data is a resource for practitioners in the data science industry, government, and students in economics, business, and statistics. Chunlei Tang, Ph.D., is a research fellow at Harvard University. She is the co-founder of Fudan's Institute for Data Industry and proposed the concept of the "data industry". She received a Ph.D. in Computer and Software Theory in 2012 and a Master of Software Engineering in 2006 from Fudan University, Shanghai, China.
