High Speed VCSELs for Optical Interconnects
by Alex Mutig
The transmission speed of data communication systems is forecast to increase exponentially over the next decade. Development of both Si-based high-speed drivers and III-V-semiconductor-based high-speed vertical cavity surface emitting lasers (VCSELs) is a prerequisite for future ultrahigh data-rate systems. This thesis presents:
- a survey of the present state of the art of VCSELs
- a systematic investigation of the various effects limiting present VCSELs
- a catalogue of solutions to overcome present limits
- detailed progress in modelling, fabricating and testing the currently most advanced VCSELs at the two commercially most important wavelengths.
High Tech Heretic: Why Computers Don't Belong in the Classroom and Other Reflections by a Computer Contrarian
by Clifford Stoll
Interesting analysis of the use and misuse of technology in education.
High Tech Trash: Digital Devices, Hidden Toxics, and Human Health
by Elizabeth Grossman
The Digital Age was expected to usher in an era of clean production, an alternative to smokestack industries and their pollutants. But as environmental journalist Elizabeth Grossman reveals in this penetrating analysis of high tech manufacture and disposal, digital may be sleek, but it's anything but clean. Deep within every electronic device lie toxic materials that make up the bits and bytes, a complex thicket of lead, mercury, cadmium, plastics, and a host of other often harmful ingredients. High Tech Trash is a wake-up call to the importance of the e-waste issue and the health hazards involved. Americans alone own more than two billion pieces of high tech electronics and discard five to seven million tons each year. As a result, electronic waste already makes up more than two-thirds of the heavy metals and 40 percent of the lead found in our landfills. But the problem goes far beyond American shores, most tragically to the cities in China and India where shiploads of discarded electronics arrive daily. There, they are "recycled": picked apart by hand, exposing thousands of workers and community residents to toxics. As Grossman notes, "This is a story in which we all play a part, whether we know it or not. If you sit at a desk in an office, talk to friends on your cell phone, watch television, listen to music on headphones, are a child in Guangdong, or a native of the Arctic, you are part of this story." The answers lie in changing how we design, manufacture, and dispose of high tech electronics. Europe has led the way in regulating materials used in electronic devices and in e-waste recycling. But in the United States many have yet to recognize the persistent human health and environmental effects of the toxics in high tech devices. If Silent Spring brought national attention to the dangers of DDT and other pesticides, High Tech Trash could do the same for a new generation of technology's products.
High Tech and High Heels in the Global Economy: Women, Work, and Pink-Collar Identities in the Caribbean
by Carla Freeman
High Tech and High Heels in the Global Economy is an ethnography of globalization positioned at the intersection between political economy and cultural studies. Carla Freeman's fieldwork in Barbados grounds the processes of transnational capitalism--production, consumption, and the crafting of modern identities--in the lives of Afro-Caribbean women working in a new high-tech industry called "informatics." It places gender at the center of transnational analysis, and local Caribbean culture and history at the center of global studies. Freeman examines the expansion of the global assembly line into the realm of computer-based work, and focuses specifically on the incorporation of young Barbadian women into these high-tech informatics jobs. As such, Caribbean women are seen as integral not simply to the workings of globalization but as helping to shape its very form. Through the enactment of "professionalism" in both appearances and labor practices, and by insisting that motherhood and work go hand in hand, they re-define the companies' profile of "ideal" workers and create their own "pink-collar" identities. Through new modes of dress and imagemaking, the informatics workers seek to distinguish themselves from factory workers, and to achieve these new modes of consumption, they engage in a wide array of extra income earning activities. Freeman argues that for the new Barbadian pink-collar workers, the globalization of production cannot be viewed apart from the globalization of consumption. In doing so, she shows the connections between formal and informal economies, and challenges long-standing oppositions between first world consumers and third world producers, as well as white-collar and blue-collar labor. Written in a style that allows the voices of the pink-collar workers to demonstrate the simultaneous burdens and pleasures of their work, High Tech and High Heels in the Global Economy will appeal to scholars and students in a wide range of disciplines, including anthropology, cultural studies, sociology, women's studies, political economy, and Caribbean studies, as well as labor and postcolonial studies.
High Throughput Imaging Technology (Advances in Optics and Optoelectronics)
by Yutong Li Zhengjun Liu
This book provides a comprehensive introduction to high-throughput imaging, with a focus on its principles and methods. High-throughput imaging has become a research trend in the field of optics. It combines fast imaging, super-resolution imaging and large field-of-view imaging, improving the performance of the imaging system in many respects. The development of a fast, high-throughput imaging system requires the integration of optics, mathematics, programming, and other related science and technology; these disciplines bridge theory and system, realizing software-hardware integration and finally achieving high-performance imaging. An effective evaluation criterion for high-throughput imaging is the spatio-temporal bandwidth product, which provides guidance for research; imaging technology with better overall performance is the key target. New super-resolution and high-throughput imaging technologies have been emerging one after another, together with a number of new technical indicators. However, the integration and cascading of various technologies remains a very difficult challenge, and different technologies are hard to use in combination because of differences in physical space and technical means. Creating an imaging system with fast, high-throughput imaging capability is an urgent research task, with important economic and social benefits for practical applications such as observing the dynamic (transient) processes of large-size targets and on-line detection. High-throughput imaging is one of the major research goals of global research teams in optical imaging. This book summarizes the latest research advances and introduces a variety of imaging methods targeting key problems, bringing together new theories and technologies in the areas of high resolution, large field of view and fast imaging. The book provides a handy reference and systematic handbook for graduate students, researchers, and technicians engaged in the study, research and practice of optical imaging.
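To make the spatio-temporal bandwidth product criterion concrete, here is a minimal back-of-envelope Python sketch. It assumes a diffraction-limited Rayleigh resolution and the common convention that the space-bandwidth product counts resolvable spots across the field of view; the function names and example numbers are illustrative assumptions, not taken from the book.

```python
import math

def space_bandwidth_product(fov_mm: float, wavelength_um: float, na: float) -> float:
    """Approximate number of resolvable spots across a square field of view.

    Assumes a diffraction-limited Rayleigh resolution r = 0.61 * lambda / NA
    and defines the space-bandwidth product (SBP) as (FOV / r)^2.
    """
    r_um = 0.61 * wavelength_um / na   # resolvable spot size in micrometres
    fov_um = fov_mm * 1000.0
    return (fov_um / r_um) ** 2

def spatiotemporal_bandwidth_product(sbp: float, frame_rate_hz: float) -> float:
    """Fold in the temporal dimension by multiplying SBP with the frame rate."""
    return sbp * frame_rate_hz

if __name__ == "__main__":
    sbp = space_bandwidth_product(fov_mm=1.0, wavelength_um=0.532, na=0.3)
    print(f"SBP ~ {sbp:.2e} resolvable spots")
    print(f"Spatio-temporal throughput ~ {spatiotemporal_bandwidth_product(sbp, 100):.2e} spots/s")
```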
High Value Manufacturing: Proceedings of the 6th International Conference on Advanced Research in Virtual and Rapid Prototyping, Leiria, Portugal, 1-5 October, 2013
by Miguel Gaspar Flávio Craveiro Helena Maria Coelho da Rocha Terreiro Galha Bártolo Paulo Jorge da Silva Bártolo Nuno Manuel Fernandes Alves Carina Ramos Lina Durão Telma Ferreira Ana Cristina Soares de Lemos António Mário Henriques Pereira Cyril Dos Santos Artur Jorge dos Santos Mateus David Oliveira Elodie Pinto Henrique de Amorim Almeia Inês Sousa João Manuel Matias Pedro Carreira Tiago Marques
High Value Manufacturing is the result of the 6th International Conference on Advanced Research in Virtual and Rapid Prototyping, held in Leiria, Portugal, October 2013. It contains current contributions to the field of virtual and rapid prototyping (V&RP) and is also focused on promoting better links between industry and academia. It covers a wide range of topics, such as additive and nano manufacturing technologies, biomanufacturing, materials, rapid tooling and manufacturing, CAD and 3D data acquisition technologies, simulation and virtual environments, and novel applications. The book is intended for engineers, designers and manufacturers who are active in the fields of mechanical, industrial and biomedical engineering.
High-Bandwidth Memory Interface
by Chulwoo Kim Junyoung Song Hyun-Woo Lee
This book provides an overview of recent advances in memory interface design at both the architecture and circuit levels. Coverage includes signal integrity and testing; the TSV interface; the high-speed serial interface, including equalization, ODT and pre-emphasis; and the wide I/O interface, including crosstalk, skew cancellation, and clock generation and distribution. Trends for further bandwidth enhancement are also covered.
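As a toy illustration of one of the listed techniques, the sketch below implements pre-emphasis as a 2-tap FIR filter that boosts signal transitions before they enter a lossy channel. The tap weight and helper name are illustrative assumptions, not circuits from the book.

```python
import numpy as np

def pre_emphasize(bits: np.ndarray, boost: float = 0.5) -> np.ndarray:
    """2-tap FIR pre-emphasis: y[n] = x[n] - boost * x[n-1].

    Samples at transitions (x[n] != x[n-1]) come out with larger amplitude,
    partially compensating for the low-pass behaviour of the channel.
    """
    x = 2.0 * bits - 1.0                 # map {0,1} -> {-1,+1}
    y = np.empty_like(x)
    y[0] = x[0]
    y[1:] = x[1:] - boost * x[:-1]
    return y

if __name__ == "__main__":
    bits = np.array([0, 0, 1, 1, 1, 0, 1, 0])
    print(pre_emphasize(bits))   # transition samples have magnitude 1.5, steady-state 0.5
```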
High-Dimensional Covariance Estimation
by Mohsen Pourahmadi
Methods for estimating sparse and large covariance matrices.
Covariance and correlation matrices play fundamental roles in every aspect of the analysis of multivariate data collected from a variety of fields including business and economics, health care, engineering, and environmental and physical sciences. High-Dimensional Covariance Estimation provides accessible and comprehensive coverage of the classical and modern approaches for estimating covariance matrices as well as their applications to the rapidly developing areas lying at the intersection of statistics and machine learning. Recently, the classical sample covariance methodologies have been modified and improved upon to meet the needs of statisticians and researchers dealing with large correlated datasets. High-Dimensional Covariance Estimation focuses on the methodologies based on shrinkage, thresholding, and penalized likelihood with applications to Gaussian graphical models, prediction, and mean-variance portfolio management. The book relies heavily on regression-based ideas and interpretations to connect and unify many existing methods and algorithms for the task. High-Dimensional Covariance Estimation features chapters on:
- Data, Sparsity, and Regularization
- Regularizing the Eigenstructure
- Banding, Tapering, and Thresholding
- Covariance Matrices
- Sparse Gaussian Graphical Models
- Multivariate Regression
The book is an ideal resource for researchers in statistics, mathematics, business and economics, computer sciences, and engineering, as well as a useful text or supplement for graduate-level courses in multivariate analysis, covariance estimation, statistical learning, and high-dimensional data analysis.
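For a concrete feel of two of the approaches covered (shrinkage and thresholding), a minimal sketch using NumPy and scikit-learn's Ledoit-Wolf estimator might look as follows; the threshold value is an illustrative assumption and the snippet is not an excerpt from the book.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
n, p = 50, 100                      # fewer samples than variables
X = rng.standard_normal((n, p))

# Shrinkage: Ledoit-Wolf pulls the sample covariance toward a scaled identity.
lw = LedoitWolf().fit(X)
print("Ledoit-Wolf shrinkage intensity:", round(lw.shrinkage_, 3))

# Thresholding: zero out small off-diagonal entries of the sample covariance.
S = np.cov(X, rowvar=False)
tau = 0.2                            # illustrative threshold
S_thresh = np.where(np.abs(S) >= tau, S, 0.0)
np.fill_diagonal(S_thresh, np.diag(S))   # keep the diagonal intact
print("off-diagonal entries kept:", int((S_thresh != 0).sum() - p), "of", p * p - p)
```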
High-Dimensional Covariance Matrix Estimation: An Introduction to Random Matrix Theory (SpringerBriefs in Applied Statistics and Econometrics)
by Aygul Zagidullina
This book presents covariance matrix estimation and related aspects of random matrix theory. It focuses on the sample covariance matrix estimator and provides a holistic description of its properties under two asymptotic regimes: the traditional one, and the high-dimensional regime that better fits the big data context. It draws attention to the deficiencies of standard statistical tools when used in the high-dimensional setting, and introduces the basic concepts and major results related to spectral statistics and random matrix theory under high-dimensional asymptotics in an understandable and reader-friendly way. The aim of this book is to inspire applied statisticians, econometricians, and machine learning practitioners who analyze high-dimensional data to apply the recent developments in their work.
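A quick numerical illustration of the high-dimensional effect discussed above, under the simplest assumption of i.i.d. standard normal data (not an example from the book): even when the true covariance is the identity, the sample eigenvalues spread across the Marchenko-Pastur interval [(1 - sqrt(p/n))^2, (1 + sqrt(p/n))^2] rather than concentrating at 1.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 250                      # concentration ratio p/n = 0.5
X = rng.standard_normal((n, p))      # true covariance is the identity

S = (X.T @ X) / n                    # sample covariance matrix
eig = np.linalg.eigvalsh(S)

c = p / n
mp_lower, mp_upper = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(f"sample eigenvalues span [{eig.min():.2f}, {eig.max():.2f}]")
print(f"Marchenko-Pastur support [{mp_lower:.2f}, {mp_upper:.2f}] (true eigenvalues are all 1)")
```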
High-Dimensional and Low-Quality Visual Information Processing
by Yue Deng
This thesis primarily focuses on how to carry out intelligent sensing and understanding of high-dimensional and low-quality visual information. After exploring the inherent structures of the visual data, it proposes a number of computational models covering an extensive range of mathematical topics, including compressive sensing, graph theory, probabilistic learning and information theory. These computational models are also applied to address a number of real-world problems, including biometric recognition, stereo signal reconstruction, natural scene parsing, and SAR image processing.
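As a small, self-contained taste of one of the listed tools, the sketch below runs a generic compressive-sensing recovery with orthogonal matching pursuit; the dimensions are arbitrary and this is not a reconstruction of the thesis's models.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_features, n_measurements, sparsity = 256, 80, 5

# Sparse signal: only a few nonzero coefficients.
x_true = np.zeros(n_features)
support = rng.choice(n_features, sparsity, replace=False)
x_true[support] = rng.standard_normal(sparsity)

# Random measurement matrix and compressed measurements y = A x.
A = rng.standard_normal((n_measurements, n_features)) / np.sqrt(n_measurements)
y = A @ x_true

# Greedy recovery with Orthogonal Matching Pursuit.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity).fit(A, y)
print("recovered support matches:", set(np.flatnonzero(omp.coef_)) == set(support))
```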
High-Impact Content Marketing: Strategies to Make Your Content Intentional, Engaging and Effective
by Purna Virji
Create meaningful engagement, drive conversion rates and boost customer retention with this crucial resource to unlocking the true potential of your content marketing strategy. In an era of user-generated, human-generated and machine-generated content, mistakes are increasingly costlier to make. And more difficult to recover from. To succeed in the highly competitive creator economy of today and the future, content marketers need to rethink their approach or go the way of the dinosaurs. High-Impact Content Marketing shows how to succeed by taking a simplified yet strategic approach to standing out and driving revenue impact. It covers time-proven strategies to create video, audio, social media and longer-form content that audiences will actually want to consume and how to do so in a genuinely inclusive way. It also shows how to master content distribution across channels such as websites, blogs, email and social media networks to maximize reach, engagement and impact. What makes High-Impact Content Marketing unique is how it weaves in behavioral science and adult learning principles to maximize and measure impact. It features easy-to-implement frameworks and actionable guides throughout as well as examples of best-in-class content marketing from the likes of Patagonia, Microsoft, Spotify and Google plus interviews with top industry experts from across the globe. Guidance is also included on how to align content with various stages of the customer journey. This is an essential blueprint for ensuring the long-term success of your content marketing strategy to increase brand awareness, build relationships and boost conversions.
High-Impact Design for Online Courses: Blueprinting Quality Digital Learning in Eight Practical Steps
by Andrea Gregg Bethany Simunich Penny Ralston-Berg
High-Impact Design for Online Courses introduces higher education professionals to an eight-step course design model that leverages the unique considerations of online and hybrid modalities at each stage in the process. Though relevant to and informed by instructional designers and educational technologists, this book is specifically geared toward faculty who lack the administrative and technical supports they need to thrive in the new normal. Each chapter includes step-by-step guidance on learner analysis, course structure, appropriate activities and assessments, continuous improvement, and other key elements of a successful digital course. Teachers across disciplines and levels of experience will come away newly inspired and motivated with fresh insights into planning and drafting, practical tips for pedagogy and design, opportunities for self-reflection and course revision, and implications for learner-centered delivery.
High-Impact eportfolio Practice: A Catalyst for Student, Faculty and Institutional Learning
by Bret Eynon Laura M. Gambino
At a moment when over half of US colleges are employing ePortfolios, the time is ripe to develop their full potential to advance integrative learning and broad institutional change. The authors outline how to deploy the ePortfolio as a high-impact practice and describe widely applicable models of effective ePortfolio pedagogy and implementation that demonstrably improve student learning across multiple settings. Drawing on the campus ePortfolio projects developed by a constellation of institutions that participated in the Connect to Learning network, Eynon and Gambino present a wealth of data and revealing case studies. Their broad-based evidence demonstrates that, implemented with a purposeful framework, ePortfolios correlate strongly with increased retention and graduation rates, broadened student engagement in deep learning processes, and advanced faculty and institutional learning. The core of the book presents a comprehensive research-based framework, along with practical examples and strategies for implementation, and identifies the key considerations that need to be addressed in the areas of Pedagogy, Professional Development, Outcomes Assessment, Technology and Scaling Up. The authors identify how the ePortfolio experience enhances other high-impact practices (HIPs) by creating unique opportunities for connection and synthesis across courses, semesters and co-curricular experiences. Using ePortfolios to integrate learning across multiple HIPs enables students to reflect on and construct a cohesive signature learning experience. This is an invaluable resource for classroom faculty and educational leaders interested in transformative education for 21st century learners.
High-Level Models of Unconventional Computations: A Case Of Plasmodium (Studies in Systems, Decision and Control #159)
by Krzysztof Pancerz Andrew Schumann
This book shows that the plasmodium of Physarum polycephalum can be considered a natural labelled transition system, and based on this it proposes high-level programming models for controlling the plasmodium's behaviour. The presented programming is a form of pure behaviourism: the authors consider the possibility of simulating all basic stimulus–reaction relations. As the plasmodium is a good experimental medium for behaviouristic models, the book applies the programming tools to model plasmodia as unconventional computers in different behavioural sciences based on studying stimulus–reaction relations. The authors examine these relations within the framework of a bio-inspired game theory on plasmodia that they have developed, i.e. an experimental game theory where, on the one hand, all basic definitions are verified in experiments with Physarum polycephalum and Badhamia utricularis and, on the other hand, all basic algorithms are implemented in the object-oriented language for simulations of plasmodia. The results allow the authors to propose that the plasmodium can be a model for concurrent games and context-based games.
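To make the labelled-transition-system framing concrete, here is a toy LTS in Python in which stimulus labels drive state changes; the states and labels are invented for illustration and are not the authors' Physarum model.

```python
# A minimal labelled transition system (LTS): states, labels, and a
# transition relation mapping (state, label) pairs to successor states.
TRANSITIONS = {
    ("resting", "attractant"): "growing",
    ("resting", "repellent"): "resting",
    ("growing", "attractant"): "growing",
    ("growing", "repellent"): "retracting",
    ("retracting", "attractant"): "growing",
}

def run(initial: str, stimuli: list[str]) -> list[str]:
    """Apply a sequence of stimulus labels and record the visited states."""
    trace = [initial]
    state = initial
    for label in stimuli:
        state = TRANSITIONS.get((state, label), state)  # undefined pairs leave the state unchanged
        trace.append(state)
    return trace

if __name__ == "__main__":
    print(run("resting", ["attractant", "repellent", "attractant"]))
    # ['resting', 'growing', 'retracting', 'growing']
```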
High-Level Subject Access Tools and Techniques in Internet Cataloging
by Judith Ahronheim
Is your library's portal as efficient as it could be? High-Level Subject Access Tools and Techniques explores the potential and early development of high-level subject access. It examines Web tools and traditionally maintained library structures that facilitate the automated relation of resources to high-level subject categories based on the descriptive metadata that already exists in traditional library records. It includes a research study of high-level subject browse structures, as well as hands-on reports of actual projects and development activities and an examination of the environment in which demand for high-level subject access arises. From the editor: As the World Wide Web and graphic user interfaces developed in the 90s, libraries began to build gateways for their online resources. These gateways allowed library users to employ the browse, point, and click approach to resource discovery that they had come to expect from online tools. Most of these interfaces amounted to little more than hand-constructed lists of links. Today, many libraries offer access to users through a set list of broad topics, sometimes called a high-level browse display. Methods for populating these subject categories remain crude and their maintenance requires considerable resources. As a result, libraries have begun to look at ways of applying traditional techniques associated with cataloging to these new interfaces. Several goals are involved in these developments. Many hope to reuse data from library catalogs and thus limit maintenance burdens. Others seek to apply a more standard set of tools and principles to the construction of portals to allow greater cooperation among institutions that want to interoperate with each other. This pathbreaking book examines vital issues in high-level subject access, including:
- subject trees and their relationship to the structure inherent in Dewey Classification
- emerging patterns in the development of browsing services, including a hierarchy of subjects that is not based in classification, a map that relates data from catalog records to the subject hierarchy, and tools for extracting data from a catalog and storing it in a separate database to produce a more flexible display
- task-based (as opposed to materials-based) subject lists
- the social issues that are associated with choosing categories based on the nature and activity of an institution's library users
- the political issues involved in selecting disciplines or topics for a browsing service
And presents fascinating case studies of:
- Columbia University's efforts to build an automatically generated browsable display based on Library of Congress Classification as it occurs in catalog records
- the High-Level Thesaurus Project (HILT), in which a group of libraries, archives, and museums attempted to find a common method for high-level subject access via portal
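A minimal Python sketch of the kind of mapping a high-level browse display relies on; the category labels and the handful of Library of Congress class prefixes below are illustrative placeholders, not the schemes studied in the book.

```python
# Map the leading letters of an LC call number to a broad browse category.
# Prefixes and category names here are illustrative placeholders.
HIGH_LEVEL_MAP = {
    "QA": "Science & Mathematics",
    "Q": "Science & Mathematics",
    "T": "Technology & Engineering",
    "N": "Arts",
    "P": "Language & Literature",
}

def browse_category(call_number: str) -> str:
    """Return the broad category for an LC call number, longest prefix first."""
    prefix = "".join(ch for ch in call_number if ch.isalpha())[:2].upper()
    for key in (prefix, prefix[:1]):
        if key in HIGH_LEVEL_MAP:
            return HIGH_LEVEL_MAP[key]
    return "Other"

if __name__ == "__main__":
    for cn in ("QA76.9", "TK7871", "PS3558"):
        print(cn, "->", browse_category(cn))
```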
High-Level Verification
by Sorin Lerner Sudipta Kundu Rajesh K. Gupta
Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based testing. This book focuses on high-level verification, presenting a design methodology that relies upon advances in synthesis techniques as well as on incremental refinement of the design process. These refinements can be done manually or through elaboration tools. This book discusses verification of specific properties in designs written using high-level languages, as well as checking that the refined implementations are equivalent to their high-level specifications. The novelty of each of these techniques is that they use a combination of formal techniques to do scalable verification of system designs completely automatically. The verification techniques presented in this book include methods for verifying properties of high-level designs and methods for verifying that the translation from high-level design to a low-level Register Transfer Language (RTL) design preserves semantics. Used together, these techniques guarantee that properties verified in the high-level design are preserved through the translation to low-level RTL.
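As a tiny, generic illustration of equivalence checking between a high-level specification and a lower-level implementation, the sketch below uses the Z3 SMT solver (a stand-in, not the specific techniques presented in the book); the spec/implementation pair is invented.

```python
from z3 import BitVec, Solver, unsat

x = BitVec("x", 8)

spec = x * 3              # "high-level" specification: multiply by 3
impl = (x << 1) + x       # "RTL-like" implementation: shift-and-add

s = Solver()
s.add(spec != impl)       # search for an input where the two disagree

if s.check() == unsat:
    print("equivalent: no 8-bit input distinguishes spec and impl")
else:
    print("counterexample:", s.model())
```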
High-Order Finite Difference and Finite Element Methods for Solving Some Partial Differential Equations (Synthesis Lectures on Engineering, Science, and Technology)
by Ulziibayar Vandandoo Tugal Zhanlav Ochbadrakh Chuluunbaatar Alexander Gusev Sergue Vinitsky Galmandakh Chuluunbaatar
The monograph is devoted to the construction of high-order finite difference and finite element methods for the numerical solution of multidimensional boundary-value problems (BVPs) for different partial differential equations, in particular the linear Helmholtz and wave equations, the nonlinear Burgers' equations, and the elliptic (Schrödinger) equation. Despite a long history, especially in the development of the theoretical background of these methods, open questions remain in their constructive implementation for numerically solving multidimensional BVPs with additional requirements on physical parameters or desirable properties of their approximate solutions. Over the last two decades many papers on this topic have been published in which new constructive approaches to numerically solving multidimensional BVPs were proposed, and it is highly desirable to collect these results systematically. This motivated us to write this monograph, based on our research results obtained in collaboration with co-authors. Since the topic is important, we believe that this book will be useful to readers, graduate students and researchers interested in the fields of computational physics, applied mathematics, numerical analysis and the applied sciences.
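To give a flavour of what "high-order" means in this context, the following NumPy sketch applies the standard fourth-order central-difference stencil for u''(x) and checks its convergence on u = sin(x); it is a textbook example, not a scheme from the monograph.

```python
import numpy as np

def second_derivative_4th_order(u: np.ndarray, h: float) -> np.ndarray:
    """Fourth-order central difference for u'' at interior points:
    u''_i ~ (-u_{i-2} + 16 u_{i-1} - 30 u_i + 16 u_{i+1} - u_{i+2}) / (12 h^2)
    """
    return (-u[:-4] + 16 * u[1:-3] - 30 * u[2:-2] + 16 * u[3:-1] - u[4:]) / (12 * h**2)

if __name__ == "__main__":
    for n in (50, 100, 200):
        x = np.linspace(0.0, np.pi, n)
        h = x[1] - x[0]
        # Exact second derivative of sin(x) is -sin(x), so approx + sin should vanish.
        err = np.max(np.abs(second_derivative_4th_order(np.sin(x), h) + np.sin(x)[2:-2]))
        print(f"n = {n:4d}  max error = {err:.3e}")   # error shrinks roughly 16x when h halves
```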
High-Orders Motion Analysis: Computer Vision Methods
by Yan Sun
This book shows how different types of motion can be disambiguated into their components in a richer way than is currently possible in computer vision. Previous research on motion analysis has generally not considered the basic nature of higher orders of motion, such as acceleration. Hence, this book introduces an approximation of the acceleration field using established optical flow techniques. Further, acceleration is decomposed into radial and tangential components based on geometry and propagated as a general motion descriptor; the book shows the capability of differentiating different types of motion on both synthesized data and real image sequences. Beyond acceleration, the higher orders of motion flow and their constituent parts are investigated to further reveal chaotic motion fields. Naturally, it is possible to extend this notion further: to detect higher orders of image motion. In this respect, the book shows how jerk and snap can be obtained from image sequences. The derived results on test images and on heel strike detection in gait analysis illustrate the ability of higher-order motion analysis, providing a basis for future research and applications. We hope that the publication of this book will bring a new perspective to researchers and graduate students in the field of video analysis in computer vision.
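A minimal sketch of the kind of approximation described above: dense optical flow (OpenCV's Farneback method, used here as a generic stand-in) is computed between consecutive synthetic frames, and the flow fields are differenced to approximate an acceleration field. It illustrates the idea only and is not the author's algorithm.

```python
import cv2
import numpy as np

def make_frame(cx: float, size: int = 128) -> np.ndarray:
    """Synthetic grayscale frame: a bright square whose x-position is cx."""
    img = np.zeros((size, size), dtype=np.uint8)
    x = int(cx)
    img[48:80, x:x + 32] = 255
    return img

def dense_flow(f0: np.ndarray, f1: np.ndarray) -> np.ndarray:
    """Dense Farneback optical flow between two grayscale frames."""
    return cv2.calcOpticalFlowFarneback(f0, f1, None, 0.5, 3, 15, 3, 5, 1.2, 0)

if __name__ == "__main__":
    # Square accelerating to the right: displacements of 2 px, then 6 px.
    frames = [make_frame(20), make_frame(22), make_frame(28)]
    v01 = dense_flow(frames[0], frames[1])        # velocity field between frames 0-1
    v12 = dense_flow(frames[1], frames[2])        # velocity field between frames 1-2
    accel = v12 - v01                             # first-order approximation of acceleration
    mask = frames[1] > 0
    print("mean x-velocity  :", v01[..., 0][mask].mean(), "->", v12[..., 0][mask].mean())
    print("mean x-acceleration ~", accel[..., 0][mask].mean())
```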
High-Performance Algorithms for Mass Spectrometry-Based Omics (Computational Biology)
by Fahad Saeed Muhammad Haseeb
To date, processing of high-throughput Mass Spectrometry (MS) data is accomplished using serial algorithms. Developing new methods to process MS data is an active area of research, but there is no single strategy that focuses on the scalability of MS-based methods. Mass spectrometry is a diverse and versatile technology for high-throughput functional characterization of proteins, small molecules and metabolites in complex biological mixtures. In recent years the technology has rapidly evolved and is now capable of generating increasingly large (multiple terabytes per experiment) and complex (multiple species/microbiome/high-dimensional) data sets. This rapid advance in MS instrumentation must be matched by an equally rapid evolution of scalable methods developed for the analysis of these complex data sets. Ideally, the new methods should leverage the rich heterogeneous computational resources available in a ubiquitous fashion in the form of multicore, manycore, CPU-GPU, CPU-FPGA, and Intel Phi architectures. The absence of such high-performance computing algorithms now hinders scientific advancement in mass spectrometry research. In this book we illustrate the need for high-performance computing algorithms for MS-based proteomics and proteogenomics, and showcase our progress in developing these high-performance algorithms.
High-Performance Big Data Computing (Scientific and Engineering Computation)
by Dhabaleswar K. Panda Xiaoyi Lu Dipti Shankar
An in-depth overview of an emerging field that brings together high-performance computing, big data processing, and deep learning. Over the last decade, the exponential explosion of data known as big data has changed the way we understand and harness the power of data. The emerging field of high-performance big data computing, which brings together high-performance computing (HPC), big data processing, and deep learning, aims to meet the challenges posed by large-scale data processing. This book offers an in-depth overview of high-performance big data computing and the associated technical issues, approaches, and solutions. The book covers basic concepts and necessary background knowledge, including data processing frameworks, storage systems, and hardware capabilities; offers a detailed discussion of technical issues in accelerating big data computing in terms of computation, communication, memory and storage, codesign, workload characterization and benchmarking, and system deployment and management; and surveys benchmarks and workloads for evaluating big data middleware systems. It presents a detailed discussion of big data computing systems and applications with high-performance networking, computing, and storage technologies, including state-of-the-art designs for data processing and storage systems. Finally, the book considers some advanced research topics in high-performance big data computing, including designing high-performance deep learning over big data (DLoBD) stacks and HPC cloud technologies.
High-Performance Big-Data Analytics
by Pethuru Raj Anupama Raman Dhivya Nagaraj Siddhartha Duggirala
This book presents a detailed review of high-performance computing infrastructures for next-generation big data and fast data analytics. Features: includes case studies and learning activities throughout the book and self-study exercises in every chapter; presents detailed case studies on social media analytics for intelligent businesses and on big data analytics (BDA) in the healthcare sector; describes the network infrastructure requirements for effective transfer of big data, and the storage infrastructure requirements of applications which generate big data; examines real-time analytics solutions; introduces in-database processing and in-memory analytics techniques for data mining; discusses the use of mainframes for handling real-time big data and the latest types of data management systems for BDA; provides information on the use of cluster, grid and cloud computing systems for BDA; reviews the peer-to-peer techniques and tools and the common information visualization techniques, used in BDA.
High-Performance Computational Solutions in Protein Bioinformatics
by Dariusz Mrozek
Recent developments in computer science enable algorithms previously perceived as too time-consuming to now be efficiently used for applications in bioinformatics and life sciences. This work focuses on proteins and their structures, protein structure similarity searching at main representation levels and various techniques that can be used to accelerate similarity searches. Divided into four parts, the first part provides a formal model of 3D protein structures for functional genomics, comparative bioinformatics and molecular modeling. The second part focuses on the use of multithreading for efficient approximate searching on protein secondary structures. The third and fourth parts concentrate on finding 3D protein structure similarities with the support of GPUs and cloud computing. Parts three and four both describe the acceleration of different methods. The text will be of interest to researchers and software developers working in the field of structural bioinformatics and biomedical databases.
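A small sketch of the general idea of parallelising many pairwise structure comparisons, using Python's process pool and a naive coordinate RMSD as a stand-in similarity score; the book's actual representations, algorithms and GPU/cloud back-ends are far more sophisticated.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def naive_rmsd(pair):
    """Root-mean-square deviation between two pre-aligned coordinate sets."""
    a, b = pair
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    query = rng.standard_normal((200, 3))                                  # 200 pseudo-atoms, xyz
    database = [query + 0.1 * rng.standard_normal((200, 3)) for _ in range(1000)]

    # Compare the query against every database entry in parallel processes.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(naive_rmsd, ((query, s) for s in database), chunksize=50))

    best = int(np.argmin(scores))
    print(f"best match: structure {best} with RMSD {scores[best]:.3f}")
```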
High-Performance Computing Applications in Numerical Simulation and Edge Computing: ACM ICS 2018 International Workshops, HPCMS and HiDEC, Beijing, China, June 12, 2018, Revised Selected Papers (Communications in Computer and Information Science #913)
by Wen Yang Changjun Hu Congfeng Jiang Dong Dai
This book constitutes the refereed proceedings of two workshops held at the 32nd ACM International Conference on Supercomputing, ACM ICS 2018, in Beijing, China, in June 2018. This volume presents the papers accepted for the following workshops: the Second International Workshop on High Performance Computing for Advanced Modeling and Simulation in Nuclear Energy and Environmental Science, HPCMS 2018, and the First International Workshop on HPC Supported Data Analytics for Edge Computing, HiDEC 2018. The 20 full papers presented at HPCMS 2018 and HiDEC 2018 were carefully reviewed and selected from numerous submissions. The papers cover topics such as computing methodologies, parallel algorithms, simulation types and techniques, and machine learning.
High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production: 10th International Conference, HPCST 2020, Barnaul, Russia, May 15–16, 2020, Revised Selected Papers (Communications in Computer and Information Science #1304)
by Vladimir Jordan Nikolay Filimonov Ilya Tarasov Vladimir Faerman
This book constitutes selected revised and extended papers from the 10th International Conference on High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production, HPCST 2020, held in Barnaul, Russia, in May 2020. Due to the COVID-19 pandemic the conference was partly held in virtual mode. The 14 full papers presented in this volume were thoroughly reviewed and selected from 51 submissions. The papers are organized in topical sections on hardware for high-performance computing and its applications, and information technologies and computer simulation of physical phenomena.
High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production: 11th International Conference, HPCST 2021, Barnaul, Russia, May 21–22, 2021, Revised Selected Papers (Communications in Computer and Information Science #1526)
by Vladimir Jordan Ilya Tarasov Vladimir Faerman
This book constitutes selected revised and extended papers from the 11th International Conference on High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production, HPCST 2021, held in Barnaul, Russia, in May 2021. The 32 full papers presented in this volume were thoroughly reviewed and selected from 98 submissions. The papers are organized in topical sections on Hardware for High-Performance Computing and Signal Processing; Information Technologies and Computer Simulation of Physical Phenomena; Computing Technologies in Discrete Mathematics and Decision Making; Information and Computing Technologies in Automation and Control Science; and Computing Technologies in Information Security Applications.