Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes and of the 1974 European Meeting of Statisticians

Author : J. Kozesnik
Publisher : Springer Science & Business Media
Total Pages : 577
Release :
ISBN-10 : 9401099103
ISBN-13 : 9789401099103

Synopsis Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes and of the 1974 European Meeting of Statisticians by : J. Kozesnik

The Prague Conferences on Information Theory, Statistical Decision Functions, and Random Processes have been organized every three years since 1956. During the eighteen years of their existence the Prague Conferences developed from a platform for presenting results obtained by a small group of researchers into a probabilistic congress, as documented by the growing number of participants and of presented papers. The importance of the Seventh Prague Conference was underlined by the fact that it was held jointly with the eighth European Meeting of Statisticians. This joint meeting took place from August 18 to 23, 1974, at the Technical University of Prague. The Conference was organized by the Institute of Information Theory and Automation of the Czechoslovak Academy of Sciences and was sponsored by the Czechoslovak Academy of Sciences, by the Committee for the European Region of the Institute of Mathematical Statistics, and by the International Association for Statistics in Physical Sciences. More than 300 specialists from 25 countries participated in the Conference. In 57 sessions, 164 papers (including 17 invited papers) were read, 128 of which are published in the present two volumes of the Transactions of the Conference. Volume A includes papers related mainly to probability theory and stochastic processes, whereas the papers of Volume B concern mainly statistics and information theory.

A First Course in Information Theory

Author : Raymond W. Yeung
Publisher : Springer Science & Business Media
Total Pages : 440
Release :
ISBN-10 : 0306467917
ISBN-13 : 9780306467912

Synopsis A First Course in Information Theory by : Raymond W. Yeung

An introduction to information theory for discrete random variables. Classical topics and fundamental tools are presented along with three selected advanced topics. Yeung (Chinese U. of Hong Kong) presents chapters on information measures, zero-error data compression, weak and strong typicality, the I-measure, Markov structures, channel capacity, rate distortion theory, Blahut-Arimoto algorithms, information inequalities, and Shannon-type inequalities. The advanced topics included are single-source network coding, multi-source network coding, and entropy and groups. Annotation copyrighted by Book News, Inc., Portland, OR.
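
For orientation on the information measures mentioned above, the following is a minimal Python sketch (the toy joint distribution is an illustrative assumption, not taken from the book) of the two most basic quantities: the Shannon entropy H(X) = -Σ p(x) log2 p(x) and the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) of discrete random variables.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(x, y) over two binary variables (illustrative values only).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginals p(x) and p(y), obtained by summing the joint distribution.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())
h_y = entropy(py.values())
h_xy = entropy(joint.values())

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = h_x + h_y - h_xy
print(f"H(X) = {h_x:.3f} bits, H(Y) = {h_y:.3f} bits, I(X;Y) = {mi:.3f} bits")
```

For this symmetric toy distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, and the mutual information comes out to about 0.28 bits.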

Topics in Statistical Information Theory

Author : Solomon Kullback
Publisher : Springer Science & Business Media
Total Pages : 169
Release :
ISBN-10 : 1461580803
ISBN-13 : 9781461580805

Synopsis Topics in Statistical Information Theory by : Solomon Kullback

The relevance of information theory to statistical theory and its applications to stochastic processes is the unifying influence in these Topics. The integral representation of discrimination information is presented, with a review of the various approaches used in the literature, and is also developed here by intrinsically information-theoretic methods. Log likelihood ratios associated with various stochastic processes are computed by an application of minimum discrimination information estimates. Linear discriminant functionals are used in the information-theoretic analysis of a variety of stochastic processes. Sections are numbered serially within each chapter, with a decimal notation for subsections. Equations, examples, theorems, and lemmas are numbered serially within each section with a decimal notation: the digits to the left of the decimal point give the section and those to the right give the serial number within the section. When reference is made to a section, equation, example, theorem, or lemma within the same chapter, only the section or equation number, etc., is given; when the reference is to a different chapter, the chapter number is given as well. References to the bibliography are by the author's name followed by the year of publication in parentheses. The transpose of a matrix is denoted by a prime; thus one-row matrices are denoted by primes as the transposes of one-column matrices (vectors).
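
For readers new to the terminology, the discrimination information referred to above is what is now commonly called the Kullback-Leibler divergence; in Kullback's I(1:2) notation its usual integral representation reads as follows (generic notation, not necessarily the exact conventions of the book):

```latex
% Discrimination information between hypotheses H_1 and H_2 with densities
% f_1 and f_2 with respect to a common dominating measure \lambda:
I(1\!:\!2) \;=\; \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,d\lambda(x) \;\geq\; 0,
\qquad \text{with equality if and only if } f_1 = f_2 \ \text{a.e. } [\lambda].
```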

Abstract Methods In Information Theory

Author : Yuichiro Kakihara
Publisher : World Scientific
Total Pages : 265
Release :
ISBN-10 : 9814495417
ISBN-13 : 9789814495417

Synopsis Abstract Methods In Information Theory by : Yuichiro Kakihara

Information theory is studied from three viewpoints: (1) the theory of entropy as a measure of the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties examined, with the latter entropy extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various ways. Mixing and AMS channels are also considered in detail, with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.
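
For orientation, the two entropies named in the synopsis have the following standard definitions (generic notation; the book's own conventions may differ):

```latex
% Shannon entropy of a discrete distribution p = (p_1, \dots, p_n):
H(p) = -\sum_{i=1}^{n} p_i \log p_i .

% Kolmogorov-Sinai entropy of a measure-preserving transformation T on (X, \mathcal{B}, \mu),
% taken as the supremum over finite measurable partitions \alpha:
h_\mu(T) = \sup_{\alpha} \lim_{n \to \infty} \frac{1}{n}\,
           H_\mu\!\left(\bigvee_{k=0}^{n-1} T^{-k}\alpha\right).
```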

Information-Spectrum Methods in Information Theory

Author : Te Sun Han
Publisher : Springer Science & Business Media
Total Pages : 568
Release :
ISBN-10 : 3540435816
ISBN-13 : 9783540435815

Synopsis Information-Spectrum Methods in Information Theory by : Te Sun Han

From the reviews: "This book nicely complements the existing literature on information and coding theory by concentrating on arbitrary nonstationary and/or nonergodic sources and channels with arbitrarily large alphabets. Even with such generality the authors have managed to successfully reach a highly unconventional but very fertile exposition rendering new insights into many problems." -- MATHEMATICAL REVIEWS

Information Theory

Author : Defense Documentation Center (U.S.)
Publisher :
Total Pages : 72
Release :
ISBN-10 : MINN:31951000908766O
ISBN-13 :

Synopsis Information Theory by : Defense Documentation Center (U.S.)

Introduction to the Statistics of Poisson Processes and Applications

Author : Yury A. Kutoyants
Publisher : Springer Nature
Total Pages : 683
Release :
ISBN-10 : 3031370546
ISBN-13 : 9783031370540

Synopsis Introduction to the Statistics of Poisson Processes and Applications by : Yury A. Kutoyants

This book covers an extensive class of models involving inhomogeneous Poisson processes and deals with their identification, i.e. the solution of certain estimation or hypothesis testing problems based on the given dataset. These processes are mathematically easy to handle and appear in numerous disciplines, including astronomy, biology, ecology, geology, seismology, medicine, physics, statistical mechanics, economics, image processing, forestry, telecommunications, insurance and finance, reliability, queuing theory, wireless networks, and localisation of sources. Beginning with the definitions and properties of some fundamental notions (stochastic integral, likelihood ratio, limit theorems, etc.), the book goes on to analyse a wide class of estimators for regular and singular statistical models. Special attention is paid to problems of change-point type, in particular cusp-type change-point models; the focus then turns to the asymptotically efficient nonparametric estimation of the mean function, the intensity function, and some functionals of them. Traditional hypothesis testing, including some goodness-of-fit tests, is also discussed. The theory is then applied to three classes of problems: misspecification in regularity (MiR), corresponding to situations where the chosen change-point model and that of the real data have different regularity; optical communication with phase and frequency modulation of periodic intensity functions; and localisation of a radioactive (Poisson) source on the plane using K detectors. Each chapter concludes with a series of problems, and state-of-the-art references are provided, making the book invaluable to researchers and students working in areas which actively use inhomogeneous Poisson processes.
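
To make the book's central object concrete, here is a minimal Python sketch (the intensity functions, time horizon, and grid size are illustrative assumptions, not taken from the book) that simulates an inhomogeneous Poisson process on [0, T] by Lewis-Shedler thinning and evaluates the log-likelihood ratio of a periodic intensity against a constant one, echoing the optical-communication setting mentioned above.

```python
import math
import random

def simulate_inhomogeneous_poisson(intensity, t_max, lambda_max, rng=random):
    """Simulate event times of an inhomogeneous Poisson process on [0, t_max]
    by Lewis-Shedler thinning, given an upper bound lambda_max on the intensity."""
    times = []
    t = 0.0
    while True:
        # Candidate event time from a homogeneous process with rate lambda_max.
        t += rng.expovariate(lambda_max)
        if t > t_max:
            break
        # Accept the candidate with probability intensity(t) / lambda_max.
        if rng.random() < intensity(t) / lambda_max:
            times.append(t)
    return times

def log_likelihood_ratio(times, lam1, lam0, t_max, n_grid=10_000):
    """Log-likelihood ratio of intensity lam1 against lam0 for the observed event times:
    sum_i log(lam1(t_i)/lam0(t_i)) - integral_0^T (lam1 - lam0) dt (integral by a grid sum)."""
    dt = t_max / n_grid
    integral = sum((lam1(k * dt) - lam0(k * dt)) * dt for k in range(n_grid))
    return sum(math.log(lam1(t) / lam0(t)) for t in times) - integral

if __name__ == "__main__":
    # Illustrative periodic intensity (loosely echoing the optical-communication example)
    # tested against a constant intensity.
    lam1 = lambda t: 5.0 + 3.0 * math.sin(2 * math.pi * t)
    lam0 = lambda t: 5.0
    events = simulate_inhomogeneous_poisson(lam1, t_max=10.0, lambda_max=8.0)
    print(f"{len(events)} events; log-likelihood ratio = "
          f"{log_likelihood_ratio(events, lam1, lam0, 10.0):.2f}")
```

The thinning step accepts a candidate time t with probability λ(t)/λ_max, which is valid whenever λ_max bounds the intensity on [0, T]; the log-likelihood ratio uses the standard form Σ_i log(λ1(t_i)/λ0(t_i)) − ∫_0^T (λ1(t) − λ0(t)) dt.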