Browse Results

Showing 2,101 through 2,125 of 82,593 results

COBOL From Pascal (Computer Science Series)

by A.J. Tyrrell

This book is concerned with language skills and language understanding rather than programming methodology. No mention is made of testing, and no attention is given to the environment in which programs must be entered or executed. It is assumed that a reader will be familiar with these matters.

COLT '89: Proceedings of the Second Annual Workshop, UC Santa Cruz, California, July 31 - August 2 1989

by Colt

Computational Learning Theory presents the theoretical issues in machine learning and computational models of learning. This book covers a wide range of problems in concept learning, inductive inference, and pattern recognition. Organized into three parts encompassing 32 chapters, this book begins with an overview of the inductive principle based on weak convergence of probability measures. This text then examines the framework for constructing learning algorithms. Other chapters consider the formal theory of learning, which is learning in the sense of improving computational efficiency as opposed to concept learning. This book discusses as well the informed parsimonious (IP) inference that generalizes the compatibility and weighted parsimony techniques, which are most commonly applied in biology. The final chapter deals with the construction of prediction algorithms in a situation in which a learner faces a sequence of trials, with a prediction to be given in each, and the goal of the learner is to make as few mistakes as possible. This book is a valuable resource for students and teachers.
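
The trial-by-trial prediction setting described in that final chapter can be sketched in a few lines of code. The weighted-majority scheme below is one well-known algorithm for this setting, offered purely as an illustration under assumed details (the function name, the beta parameter, and the toy data are not taken from the book):

```python
# Hedged sketch of prediction from expert advice over a sequence of trials.
# One well-known scheme: predict by weighted vote, then multiply the weight
# of every expert that erred by beta < 1. Names and data are hypothetical.
def weighted_majority(trial_predictions, outcomes, beta=0.5):
    """Return the number of mistakes made by the master predictor.

    trial_predictions: per-trial lists of 0/1 expert predictions.
    outcomes: the true 0/1 label revealed after each trial.
    """
    n_experts = len(trial_predictions[0])
    weights = [1.0] * n_experts
    mistakes = 0
    for preds, y in zip(trial_predictions, outcomes):
        vote_one = sum(w for w, p in zip(weights, preds) if p == 1)
        vote_zero = sum(w for w, p in zip(weights, preds) if p == 0)
        if (1 if vote_one >= vote_zero else 0) != y:
            mistakes += 1
        # demote every expert whose prediction was wrong on this trial
        weights = [w * beta if p != y else w
                   for w, p in zip(weights, preds)]
    return mistakes

# Three experts over four trials; expert 0 happens to be always right,
# so the weighted vote quickly tracks it and the master makes no mistakes.
trials = [[1, 0, 1], [0, 0, 1], [1, 1, 0], [1, 0, 0]]
truth = [1, 0, 1, 1]
print(weighted_majority(trials, truth))  # -> 0
```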

Computation of Language: An Essay on Syntax, Semantics and Pragmatics in Natural Man-Machine Communication (Symbolic Computation)

by Roland Hausser

The study of linguistics has been forever changed by the advent of the computer. Not only does the machine permit the processing of enormous quantities of text, thereby securing a better empirical foundation for conclusions, but also, since it is a modelling device, the machine allows the implementation of theories of grammar and other kinds of language processing. Models can have very unexpected properties, both good and bad, and it is only through extensive tests that the value of a model can be properly assessed. The computer revolution has been going on for many years, and its importance for linguistics was recognized early on, but the more recent spread of personal workstations has made it a reality that can no longer be ignored by anyone in the subject. The present essay, in particular, could never have been written without the aid of the computer. I know personally from conversations and consultations with the author over many months how the book has changed. If he did not have at his command a powerful typesetting program, he would not have been able to see how his writing looked and exactly how it had to be revised and amplified. Even more significant for the evolution of the linguistic theory is the easy testing of examples made possible by the implementation of the parser and the computer-held lexicon. Indeed, the rule set and lexicon grew substantially after the successes of the early implementations created the desire to incorporate more linguistic phenomena.

Computational Morphology: A Computational Geometric Approach to the Analysis of Form (ISSN #Volume 6)

by G. T. Toussaint

Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biology. This volume is concerned with a new approach to the study of shape and form in these areas. Computational morphology is thus concerned with the treatment of morphology from the computational geometry point of view. This point of view is more formal, elegant, procedure-oriented, and clear than many previous approaches to the problem and often yields algorithms that are easier to program and have lower complexity.

The Computer Animation Dictionary: Including Related Terms Used in Computer Graphics, Film and Video, Production, and Desktop Publishing

by Robi Roncarelli

Dr Alvy Ray Smith, Executive Vice President, Pixar. The polyglot language of computer animation has arisen piecemeal as a collection of terms borrowed from geometry, film, video, painting, conventional animation, computer graphics, computer science, and publishing - in fact, from every older art or science which has anything to do with pictures and picture making. Robi Roncarelli, who has already demonstrated his foresight by formally identifying a nascent industry and addressing his Computer Animation Newsletter to it, here again makes a useful contribution to it by codifying its jargon. My pleasure in reading his dictionary comes additionally from the many historical notes sprinkled throughout and from surprise entries such as the one referring to Zimbabwe. Just as Samuel Johnson's dictionary of the English language was a major force in stabilizing the spelling of English, perhaps this one will serve a similar purpose for computer animation. Two of my pets are "color" for "colour" and "modeling" for "modelling", under the rule that the shorter accepted spelling is always preferable. [Robi, are you reading this?] [Yes, Alvy!] Now I commend this book to you, whether you be a newcomer or an oldtimer.

Computer in Büro und Verwaltung: Psychologisches Wissen für die Praxis

by Michael Frese Felix C. Brodbeck

This is the first comprehensive and accessible introduction to the complex field of human-computer interaction in the office and in administration. The following questions are addressed: How can new communication technologies be integrated into everyday business practice? How can the criteria of humane work design be taken into account in doing so? How can software design be optimized to that end? What should training tailored specifically to human-computer interaction look like? What social changes are to be expected from the introduction of new communication technologies? Despite its textbook character, the book offers a strongly practice-oriented introduction to this subject area.

Computer Network Architectures and Protocols (Applications of Communications Theory)

by Carl A. Sunshine

This is a book about the bricks and mortar from which are built those edifices that will permeate the emerging information society of the future: computer networks. For many years such computer networks have played an indirect role in our daily lives as the hidden servants of banks, airlines, and stores. Now they are becoming more visible as they enter our offices and homes and directly become part of our work, entertainment, and daily living. The study of how computer networks function is a combined study of communication theory and computer science, two disciplines appearing to have very little in common. The modern communication scientist wishing to work in this area soon finds that solving the traditional problems of transmission, modulation, noise immunity, and error bounds in getting the signal from one point to another is just the beginning of the challenge. The communication must be in the right form to be routed properly, to be handled without congestion, and to be understood at various points in the network. As for the computer scientist, he finds that his discipline has also changed. The fraction of computers that belong to networks is increasing all the time. And for a typical single computer, the fraction of its execution load, storage occupancy, and system management problems that are involved with being part of a network is also growing.

Computer Presentation of Data in Science: a do-it-yourself guide, based on the Apple Macintosh, for authors and illustrators in the Sciences

by D. Simmonds L. Reynolds

Books about printing written for printers or would-be printers go back over 300 years. The earliest of them were almost exclusively concerned with books; this century, however, there has been more emphasis on other kinds of documents, and particularly their design. But no shift in document production has been more sudden than the one that has happened most recently. Consequently, the last five years have witnessed a substantial movement away from books written for professionals to ones whose aim is to help would-be authors produce their own documents. The opportunities for authors to do this have been opened up by the advent of desktop publishing (a term coined as recently as 1984). As most exponents of desktop publishing have come to realise, the term is something of a misnomer because the provision of facilities that allow authors to produce their own material for publishing is not quite the same thing as publishing. Nevertheless, it has been useful in focussing attention on author-produced documents, and what might be described as the democratisation of document production. This book is different from others in the field. Its target audience is the busy scientist engaged in teaching or research who uses computers in the ordinary course of work. The world of scientific publishing is rapidly moving towards the day when journals will expect contributions from authors on disc, or even by direct transfer of data from the author's computer to the output device of an editor via telephone and satellite.

Computer Programming and Architecture: The VAX

by Henry Levy Richard Eckhouse

Takes a unique systems approach to programming and architecture of the VAX. Using the VAX as a detailed example, the first half of this book offers a complete course in assembly language programming. The second half describes higher-level systems issues in computer architecture. Highlights include the VAX assembler and debugger, other modern architectures such as RISCs, multiprocessing and parallel computing, microprogramming, caches and translation buffers, and an appendix on the Berkeley UNIX assembler.

Computer Simulation and Computer Algebra: Lectures for Beginners

by Dietrich Stauffer Friedrich W Hehl Volker Winkelmann John G. Zabolitzky

The chapter on statistical-physics simulations has been enlarged, mainly by a discussion of multispin coding techniques for the Ising model (bit-by-bit parallel operations). In the chapter about Reduce, some details of the presentation have been corrected or clarified. The new operator MATEIGEN for the computation of eigenvectors of matrices is explained. The first chapter and the appendix remain unchanged. Needless to say, the field of computational science is advancing so quickly, for example with the development of parallel, as opposed to vectorized, algorithms, that it will not be too long before a further edition is called for. Cologne, March 1989. The authors. Preface to the First Edition: Computers play an increasingly important role in many of today's activities, and correspondingly physicists find employment after graduation in computer-related jobs, often quite remote from their physics education. The present lectures, on the other hand, emphasize how we can use computers for the purposes of fundamental research in physics. Thus we do not deal with programs designed for newspapers, banks, or travel agencies, i.e., word processing and storage of large amounts of data.
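
Multispin coding, mentioned above, packs one spin per bit of a machine word so that a single bitwise instruction acts on many lattice copies at once. A minimal sketch under assumed conventions (the 64-chains-per-word layout and all names are illustrative, not the book's code), measuring the energy of 64 independent one-dimensional Ising chains in parallel:

```python
import numpy as np

# Hypothetical multispin-coding layout: bit b of spins[i] is site i of
# Ising chain b, so each 64-bit word carries one site of 64 chains
# (0 = spin down, 1 = spin up). L sites with periodic boundary.
L = 128
rng = np.random.default_rng(0)
spins = rng.integers(0, 2**64 - 1, size=L, dtype=np.uint64, endpoint=True)

# XOR with the cyclic neighbour: bit b is set exactly where adjacent spins
# of chain b disagree, i.e. where that bond contributes +J to the energy.
disagree = spins ^ np.roll(spins, -1)

# Energy per chain in units of J: (#broken bonds) - (#satisfied bonds).
energy = [2 * int(np.count_nonzero(disagree & (np.uint64(1) << np.uint64(b)))) - L
          for b in range(64)]
print(energy[:4])  # energies of the first four chains
```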

Computers and Informatics in Developing Countries

by Mohan Munasinghe

Computers and Informatics in Developing Countries is a collection of papers documenting the conference of the Expert Group on Computers and Informatics for Development, which investigated how the international scientific and development community can assist developing countries in using computer and informatics technology to promote progress and growth. The papers address the need for developing countries to formulate and apply computer and informatics policies for development, as well as the role that an International Centre for Computers and Informatics (ICCI) should play in the development process. The ICCI would be based on the network principle, linking other regional and national computer centers; the advantages of this principle are lower startup costs, the avoidance of setting up a large organization, and localized servicing of needs. An international organization such as the ICCI could accelerate Third World development efforts, given the identification of the needs of developing countries as regards computers and informatics, the setting of clear objectives for the ICCI, and meetings with potential donors. The collection is suitable for heads of both non-government agencies and government departments involved in international aid, education, or development, as well as for administrators of educational institutions and philanthropic organizations.

Computers and Mathematics

by Erich Kaltofen Stephen M. Watt

Advances in computer technology have had a tremendous impact on mathematics in the last two decades. In June of 1989, an international conference was held at MIT, bringing together mathematicians and computer scientists, to survey the work that has been done in computational mathematics, to report recent results in this field, and to discuss research directions as well as educational issues. This book presents a fascinating collection of contributions on topics ranging from computational algebra and parallel computing to mathematics education. Mathematicians interested in the computational aspects of their discipline as well as computer scientists interested in mathematical applications will enjoy the integrative view provided by this book.

Computers, Brains and Minds: Essays in Cognitive Science (Studies in History and Philosophy of Science #7)

by P. Slezak W. R. Albury

The institutionalization of History and Philosophy of Science as a distinct field of scholarly endeavour began comparatively early, though not always under that name, in the Australasian region. An initial lecturing appointment was made at the University of Melbourne immediately after the Second World War, in 1946, and other appointments followed as the subject underwent an expansion during the 1950s and 1960s similar to that which took place in other parts of the world. Today there are major Departments at the University of Melbourne, the University of New South Wales and the University of Wollongong, and smaller groups active in many other parts of Australia and in New Zealand. "Australasian Studies in History and Philosophy of Science" aims to provide a distinctive publication outlet for Australian and New Zealand scholars working in the general area of history, philosophy and social studies of science. Each volume comprises a group of essays on a connected theme, edited by an Australian or a New Zealander with special expertise in that particular area. Papers address general issues, however, rather than local ones; parochial topics are avoided. Furthermore, though in each volume a majority of the contributors is from Australia or New Zealand, contributions from elsewhere are by no means ruled out. Quite the reverse, in fact: they are actively encouraged wherever appropriate to the balance of the volume in question.

Computers in Art, Design and Animation

by John Lansdown Rae Earnshaw

The collection of papers that makes up this book arises largely from the joint activities of two specialist groups of the British Computer Society, namely the Displays Group and the Computer Arts Society. Both these groups are now more than 20 years old and during the whole of this time have held regular, separate meetings. In recent years, however, the two groups have held a joint annual meeting at which presentations of mutual interest have been given, and it is mainly from the last two of these that the present papers have been drawn. They fall naturally into four classes: visualisation, art, design and animation, although, as in all such cases, the boundaries between the classes are fuzzy and overlap inevitably occurs. Visualisation: the graphic potential of computers has been recognised almost since computing was first used, but it is only comparatively recently that their possibilities as devices for the visualisation of complex and largely abstract phenomena have begun to be more fully appreciated. Some workers stress the need to be able to model photographic reality in order to assist in this task. They look to better algorithms and more resolution to achieve this end. Others, Alan Mackay for instance, suggest that it is "not just a matter of providing more and more pixels. It is a matter of providing congenial clues which employ to the greatest extent what we already know."

Conceptual and Numerical Analysis of Data: Proceedings of the 13th Conference of the Gesellschaft für Klassifikation e.V., University of Augsburg, April 10–12, 1989

by W. Gaul H. Schnelling P. O. Degens

The 13th conference of the Gesellschaft für Klassifikation e.V. took place at the Universität Augsburg from April 10 to 12, 1989, with the local organization by the Lehrstuhl für Mathematische Methoden der Wirtschaftswissenschaften. The broad subject of the conference, Conceptual and Numerical Analysis of Data, was meant to reflect the variety of concepts of data and information as well as the manifold methods of analysing and structuring them. Based on the announcements of papers received, four sections were arranged: 1. Data Analysis and Classification: Basic Concepts and Methods; 2. Applications in Library Sciences, Documentation and Information Sciences; 3. Applications in Economics and Social Sciences; 4. Applications in Natural Sciences and Computer Sciences. This classification is not strict, but it shows that theoretical and applied researchers from the most diverse disciplines were prepared to present a paper. In 60 survey and special lectures the speakers reported on developments in theory and applications, encouraging the interdisciplinary dialogue of all participants. This volume contains 42 selected papers grouped according to the four sections. We now give a short insight into the presented papers. Several problems of concept analysis, cluster analysis, data analysis and multivariate statistics are considered in 18 papers of section 1. The geometric representation of a concept lattice is a collection of figures in the plane corresponding to the given concepts in such a way that the subconcept-superconcept relation corresponds to the containment relation between the figures. R.
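
The geometric representation just described can be illustrated with a toy sketch (hypothetical names and figures, not the paper's construction): concepts are drawn as axis-aligned rectangles, and the subconcept relation holds exactly when one rectangle is contained in the other.

```python
# Toy illustration of a geometric concept-lattice representation:
# each concept is a rectangle in the plane; concept A is a subconcept
# of concept B exactly when A's rectangle lies inside B's rectangle.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, other: "Rect") -> bool:
        return (self.x0 <= other.x0 and self.y0 <= other.y0
                and other.x1 <= self.x1 and other.y1 <= self.y1)

# A tiny lattice: "polygon" above "quadrilateral" and "triangle",
# with "square" below "quadrilateral". All figures are made up.
figures = {
    "polygon":       Rect(0, 0, 10, 10),
    "quadrilateral": Rect(1, 1, 6, 9),
    "triangle":      Rect(7, 1, 9, 9),
    "square":        Rect(2, 2, 5, 5),
}

def is_subconcept(a: str, b: str) -> bool:
    return figures[b].contains(figures[a])

assert is_subconcept("square", "quadrilateral")
assert is_subconcept("quadrilateral", "polygon")
assert not is_subconcept("triangle", "quadrilateral")
```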

Connectionism in Perspective

by R. Pfeifer Z. Schreter F. Fogelman-Soulié L Steels

An evaluation of the merits, potential, and limits of Connectionism, this book also illustrates current research programs and recent trends. Connectionism (also known as Neural Networks) is an exciting new field which has brought together researchers from different areas such as artificial intelligence, computer science, cognitive science, neuroscience, physics, and complex dynamics. These researchers are applying the connectionist paradigm in an interdisciplinary way to the analysis and design of intelligent systems. In this book, researchers from the above-mentioned fields not only report on their most recent research results, but also describe Connectionism from the perspective of their own field, looking at issues such as the effects and utility of Connectionism for their field, its potential and limitations, and whether it can be combined with other approaches.

Constructive Methods in Computing Science: International Summer School directed by F.L. Bauer, M. Broy, E.W. Dijkstra, C.A.R. Hoare (NATO ASI Subseries F: #55)

by F. L. Bauer M. Broy E. W. Dijkstra C. A. R. Hoare

Computing Science is a science of constructive methods. The solution of a problem has to be described formally by constructive techniques if it is to be evaluated on a computer. The Marktoberdorf Advanced Study Institute 1988 presented a comprehensive survey of recent research in constructive methods in Computing Science. Some approaches to a methodological framework and to supporting tools for specification, development and verification of software systems were discussed in detail. Other lectures dealt with the relevance of the foundations of logic for questions of program construction and with new programming paradigms and formalisms which have proven to be useful for a constructive approach to software development. The construction, specification, design and verification, especially of distributed and communicating systems, was discussed in a number of complementary lectures. Examples of these approaches were given at several levels, such as semaphores, nondeterministic state transition systems with fairness assumptions, the decomposition of specifications for concurrent systems into liveness and safety properties, and functional specifications of distributed systems. The construction methods in programming that were presented range from type theory, the theory of evidence, and theorem provers for proving properties of functional programs, to category theory as an abstract and general concept for the description of programming paradigms.

Contending Approaches to the Political Economy of Taiwan

by Susan Greenhalgh Edwin A. Winckler

This work compares IT parks in China, India, Malaysia, Singapore, Taiwan, and Hawaii, in search of strategies that policy makers can employ to reduce the Global Digital Divide, advance distributional equity, and soften some of the negative effects of economic globalization.

Contending Approaches to the Political Economy of Taiwan (Taiwan In The Modern World Ser.)

by Susan Greenhalgh Edwin A. Winckler

This work compares IT parks in China, India, Malaysia, Singapore, Taiwan, and Hawaii, in search of strategies that policy makers can employ to reduce the Global Digital Divide, advance distributional equity, and soften some of the negative effects of economic globalization.

Coordinating User Interfaces for Consistency (Interactive Technologies)

by Jakob Nielsen

In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses of its benefits and some warnings about its possible dangers. Most of what you'll learn applies equally to hardware and software development, and all of it holds real benefits for both your organization and your users. Begins with a new preface by the collection's distinguished editor. Details a variety of methods for attaining interface consistency, including central control, user definitions, exemplary applications, shared code, and model analysis. Presents a cost-benefit analysis of organizational efforts to promote and achieve consistency. Examines and appraises the dimensions of consistency: consistency within an application, across a family of applications, and beyond. Makes the case for some unexpected benefits of interface consistency while helping you avoid the risks it can sometimes entail. Considers the consistency of interface elements other than screen design. Includes case studies of major corporations that have instituted programs to ensure the consistency of their products.

Critical Reflections on Distance Education

by Terry Evans Daryl Nation

This book suggests that apparently unrelated vignettes of Mikhail Gorbachev, Robert Mugabe, and Harold Wilson are closely connected, and illustrates that the concept of distance education may be seen as one of those innovations that were forged on the frontier of European expansion overseas.

Critical Reflections on Distance Education

by Terry Evans

This book suggests that apparently unrelated vignettes of Mikhail Gorbachev, Robert Mugabe, and Harold Wilson are closely connected, and illustrates that the concept of distance education may be seen as one of those innovations that were forged on the frontier of European expansion overseas.

Current Trends in Hardware Verification and Automated Theorem Proving

by Graham Birtwistle P. A. Subrahmanyam

This report describes the partially completed correctness proof of the Viper 'block model'. Viper [7,8,9,11,23] is a microprocessor designed by W. J. Cullyer, C. Pygott and J. Kershaw at the Royal Signals and Radar Establishment in Malvern, England (henceforth 'RSRE'), for use in safety-critical applications such as civil aviation and nuclear power plant control. It is currently finding uses in areas such as the deployment of weapons from tactical aircraft. To support safety-critical applications, Viper has a particularly simple design about which it is relatively easy to reason using current techniques and models. The designers, who deserve much credit for the promotion of formal methods, intended from the start that Viper be formally verified. Their idea was to model Viper in a sequence of decreasingly abstract levels, each of which concentrated on some aspect of the design, such as the flow of control, the processing of instructions, and so on. That is, each model would be a specification of the next (less abstract) model, and an implementation of the previous model (if any). The verification effort would then be simplified by being structured according to the sequence of abstraction levels. These models (or levels) of description were characterized by the design team. The first two levels, and part of the third, were written by them in a logical language amenable to reasoning and proof.

Data Organization in Parallel Computers (The Springer International Series in Engineering and Computer Science #67)

by Harry A.G. Wijshoff

The organization of data is clearly of great importance in the design of high performance algorithms and architectures. Although there are several landmark papers on this subject, no comprehensive treatment has appeared. This monograph is intended to fill that gap. We introduce a model of computation for parallel computer architectures, by which we are able to express the intrinsic complexity of data organization for specific architectures. We apply this model of computation to several existing parallel computer architectures, e.g., the CDC 205 and CRAY vector-computers, and the MPP binary array processor. The study of data organization in parallel computations was introduced as early as 1970. During the development of the ILLIAC IV system there was a need for a theory of possible data arrangements in interleaved memory systems. The resulting theory dealt primarily with storage schemes, also called skewing schemes, for 2-dimensional matrices, i.e., mappings from a 2-dimensional array to a number of memory banks. By means of the model of computation we are able to apply the theory of skewing schemes to various kinds of parallel computer architectures. This results in a number of consequences both for the design of parallel computer architectures and for applications of parallel processing.
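
To make the notion of a skewing scheme concrete, here is a minimal sketch (illustrative conventions and values, not the monograph's formalism): the classic skew that stores element (i, j) of an N x N matrix in bank (i + j) mod M lets rows, columns, and, when M is odd, the main diagonal each be fetched from N distinct banks without conflicts.

```python
# Illustrative skewing scheme with hypothetical parameters: element (i, j)
# of an N x N matrix is assigned to memory bank (i + j) mod M.
N, M = 8, 9  # matrix order and number of banks; M odd and M >= N

def bank(i: int, j: int) -> int:
    return (i + j) % M

row_banks = [bank(3, j) for j in range(N)]    # one full row
col_banks = [bank(i, 3) for i in range(N)]    # one full column
diag_banks = [bank(i, i) for i in range(N)]   # main diagonal

# Each access pattern hits N distinct banks, hence no bank conflicts.
assert len(set(row_banks)) == N
assert len(set(col_banks)) == N
assert len(set(diag_banks)) == N  # holds because gcd(2, M) == 1, i.e. M odd
print(row_banks, diag_banks)
```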
