Social Entropy Theory

Author : Kenneth D. Bailey
Publisher : State University of New York Press
Total Pages : 333
Release :
ISBN-10 : 0791495612
ISBN-13 : 9780791495612
Rating : 4/5 (12 Downloads)

Synopsis Social Entropy Theory by : Kenneth D. Bailey

Social Entropy Theory illuminates the fundamental problems of societal analysis with a nonequilibrium approach, a new frame of reference built upon contemporary macrological principles, including general systems theory and information theory. Social entropy theory, using Shannon's H and the entropy concept, avoids the common (and often artificial) separation of theory and method in sociology. The hallmark of the volume is integration, as seen in the author's interdisciplinary discussions of equilibrium, entropy, and homeostasis. Unique features of the book are the introduction of the three-level model of social measurement, the theory of allocation, the concepts of global-mutable-immutable, discussion of order and power, and a large set of testable hypotheses.
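As a quick illustrative aside (not taken from the book), Shannon's H for a discrete distribution p_1, ..., p_k is H = -sum p_i log2 p_i. The sketch below, using a made-up allocation of a population across four categories, shows how a uniform allocation maximizes H while a concentrated one drives it toward zero.

import math

def shannon_h(probabilities, base=2):
    # Shannon entropy H = -sum(p * log(p)) over outcomes with p > 0.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Hypothetical allocation of a population across four social categories.
print(shannon_h([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximally dispersed
print(shannon_h([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: highly concentrated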

Social Entropy Theory

Author : Kenneth D. Bailey
Publisher : SUNY Press
Total Pages : 336
Release :
ISBN-10 : 0791400565
ISBN-13 : 9780791400562
Rating : 4/5 (65 Downloads)

Synopsis Social Entropy Theory by : Kenneth D. Bailey

Social Entropy Theory illuminates the fundamental problems of societal analysis with a nonequilibrium approach, a new frame of reference built upon contemporary macrological principles, including general systems theory and information theory. Social entropy theory, using Shannon's H and the entropy concept, avoids the common (and often artificial) separation of theory and method in sociology. The hallmark of the volume is integration, as seen in the author's interdisciplinary discussions of equilibrium, entropy, and homeostasis. Unique features of the book are the introduction of the three-level model of social measurement, the theory of allocation, the concepts of global-mutable-immutable, discussion of order and power, and a large set of testable hypotheses.

Social Theory and Education

Author : Raymond Allen Morrow
Publisher : SUNY Press
Total Pages : 540
Release :
ISBN-10 : 0791422526
ISBN-13 : 9780791422526
Rating : 4/5 (26 Downloads)

Synopsis Social Theory and Education by : Raymond Allen Morrow

This book summarizes and critiques theories of social and cultural reproduction as they relate to sociology of education.

Sociology and the New Systems Theory

Author : Kenneth D. Bailey
Publisher : State University of New York Press
Total Pages : 392
Release :
ISBN-10 : 0791495620
ISBN-13 : 9780791495629
Rating : 4/5 (29 Downloads)

Synopsis Sociology and the New Systems Theory by : Kenneth D. Bailey

This book provides current information about the many recent contributions of social systems theory. While some sociologists feel that the systems age ended with functionalism, in reality a number of recent developments have occurred within the field. The author makes these developments accessible to sociologists and other non-systems scholars, and begins a synthesis of the burgeoning systems field and mainstream sociological theory. The analysis shows not only that important points of rapprochement exist between systems theory and sociological theory, but also that systems theory has in some cases anticipated developments needed in mainstream theory.

Entropy and Information Theory

Author : Robert M. Gray
Publisher : Springer Science & Business Media
Total Pages : 346
Release :
ISBN-10 : 1475739826
ISBN-13 : 9781475739824
Rating : 4/5 (24 Downloads)

Synopsis Entropy and Information Theory by : Robert M. Gray

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
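To make the compound quantities named above concrete, here is a minimal sketch (my own, with a made-up joint distribution) computing joint entropy, conditional entropy, and mutual information for two binary variables.

import math

def entropy(p):
    # Entropy in bits of a probability vector.
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

H_xy = entropy(list(joint.values()))   # joint entropy H(X, Y)
H_x, H_y = entropy(px), entropy(py)    # marginal entropies
H_y_given_x = H_xy - H_x               # conditional entropy H(Y | X)
I_xy = H_x + H_y - H_xy                # mutual information I(X; Y)
print(H_x, H_y, H_xy, H_y_given_x, I_xy)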

A General Theory of Entropy

Author : Kofi Kissi Dompere
Publisher : Springer
Total Pages : 286
Release :
ISBN-10 : 3030181596
ISBN-13 : 9783030181598
Rating : 4/5 (98 Downloads)

Synopsis A General Theory of Entropy by : Kofi Kissi Dompere

This book presents an epistemic framework for dealing with information-knowledge and certainty-uncertainty problems within the space of quality-quantity dualities. It bridges theoretical concepts of entropy and entropy measurement, proposing the concept and measurement of fuzzy-stochastic entropy, which is applicable to all areas of knowing under human cognitive limitations over the epistemological space. The book builds on two previous monographs by the same author, on the theories of info-statics and info-dynamics, which deal with identification and transformation problems respectively. The theoretical framework is developed using toolboxes such as the principle of opposites, systems of actual-potential polarities, and negative-positive dualities, under different cost-benefit time-structures. Category theory and the fuzzy paradigm of thought, under the methodological constructionism-reductionism duality, are used in the fuzzy-stochastic and cost-benefit spaces to point toward directions of global application in knowing, knowledge, and decision-choice actions. The book is thus concerned with a general theory of entropy, showing how the fuzzy paradigm of thought is developed to handle qualitative-quantitative uncertainties over the fuzzy-stochastic space. The framework applies to conditions of soft and hard data, facts, evidence, and knowledge over the spaces of problem-solution dualities, and to decision-choice actions in the sciences, non-sciences, engineering, and planning sciences, in order to abstract acceptable information-knowledge elements.

Entropy Theory Of Aging Systems: Humans, Corporations And The Universe

Author : Daniel Hershey
Publisher : World Scientific
Total Pages : 275
Release :
ISBN-10 : 1908978651
ISBN-13 : 9781908978653
Rating : 4/5 (53 Downloads)

Synopsis Entropy Theory Of Aging Systems: Humans, Corporations And The Universe by : Daniel Hershey

Entropy is a measure of order and disorder. If left alone, aging systems go spontaneously from youthful, low entropy and order to old, high entropy and disorder. This book presents the commonality of entropy principles which govern the birth, maturation, and senescent history of aging humans, corporations, and the universe. Mainly we introduce an entropy theory of aging, based on the non-equilibrium thermodynamic ideas of Ilya Prigogine, leading to the thermodynamic concepts of Excess Entropy (EE) and Excess Entropy Production (EEP). We describe the aging process in humans in terms of the EE and EEP concepts. This book also describes the informational entropy theory and equations of Claude Shannon and the six Hershey parameters which trace and mark the lifecycle of corporations. To conclude, this volume uses classical and informational entropy concepts, equations and calculations to explain the birth, evolution, and death of our aging universe, and all of this in relation to the concept of Infinity.

The Entropy of Capitalism

Author : Robert Biel
Publisher : BRILL
Total Pages : 401
Release :
ISBN-10 : 9004204296
ISBN-13 : 9789004204294
Rating : 4/5 (94 Downloads)

Synopsis The Entropy of Capitalism by : Robert Biel

The project of applying general systems theory to social sciences is crucial in today’s crisis when social and ecological systems clash. This book concretely demonstrates the necessity of a Marxist approach to this challenge, notably in asserting agency (struggle) as against determinism. It similarly shows how Marxism can be reinvigorated from a systems perspective. Drawing on his experience in both international systems and low-input agriculture, Biel explores the interaction of social and physical systems, using the conceptual tools of thermodynamics and information. He reveals the early twenty-first century as a period when capitalism starts parasitising on the chaos it itself creates, notably in the link between the two sides of imperialism: militarism (the ‘war on terror’) and speculative finance capital.

Entropy Measures, Maximum Entropy Principle and Emerging Applications

Author : Karmeshu
Publisher : Springer
Total Pages : 300
Release :
ISBN-10 : 3540362126
ISBN-13 : 9783540362128
Rating : 4/5 (28 Downloads)

Synopsis Entropy Measures, Maximum Entropy Principle and Emerging Applications by : Karmeshu

The last two decades have witnessed enormous growth in applications of the information theoretic framework in areas of the physical, biological, engineering and even social sciences. In particular, growth has been spectacular in the fields of information technology, soft computing, nonlinear systems and molecular biology. Claude Shannon in 1948 laid the foundation of the field of information theory in the context of communication theory. It is indeed remarkable that his framework is as relevant today as it was when he proposed it. Shannon died on Feb 24, 2001. Arun Netravali observes: "As if assuming that inexpensive, high-speed processing would come to pass, Shannon figured out the upper limits on communication rates. First in telephone channels, then in optical communications, and now in wireless, Shannon has had the utmost value in defining the engineering limits we face." Shannon introduced the concept of entropy. The notable feature of the entropy framework is that it enables quantification of the uncertainty present in a system. In many realistic situations one is confronted only with partial or incomplete information, in the form of moments or bounds on these values, etc.; it is then required to construct a probabilistic model from this partial information. In such situations, the principle of maximum entropy provides a rational basis for constructing a probabilistic model. It is thus necessary and important to keep track of advances in the applications of the maximum entropy principle to ever expanding areas of knowledge.
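The maximum entropy principle mentioned in the blurb can be illustrated with a standard toy problem (my example, not from the book): among all distributions on the faces of a die with a prescribed mean, the maximum entropy distribution has the Gibbs form p_i proportional to exp(lam * x_i), and lam can be found by simple bisection.

import math

def maxent_given_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    # Maximum-entropy pmf on `values` subject to a fixed mean; the solution
    # is p_i proportional to exp(lam * x_i), with lam found by bisection
    # (the mean is monotone increasing in lam).
    def mean_for(lam):
        w = [math.exp(lam * x) for x in values]
        return sum(x * wi for x, wi in zip(values, w)) / sum(w)
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# A die constrained to have mean 4.5 instead of 3.5: probability tilts toward
# larger faces but stays as spread out as the constraint allows.
print(maxent_given_mean([1, 2, 3, 4, 5, 6], 4.5))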

New Foundations for Information Theory

Author : David Ellerman
Publisher : Springer Nature
Total Pages : 121
Release :
ISBN-10 : 3030865525
ISBN-13 : 9783030865528
Rating : 4/5 (28 Downloads)

Synopsis New Foundations for Information Theory by : David Ellerman

This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, measured directly by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. Because it is a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions, or bits, necessary to make all the distinctions of the random variable. Using a linearization method, all the set concepts in this logical information theory extend naturally to vector spaces in general, and to Hilbert spaces in particular, yielding a quantum logical information theory that provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can serve as a reference for researchers and graduate students working in information theory and maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
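As a small, self-contained illustration (mine, not the author's) of the two measures contrasted above: logical entropy is 1 - sum p_i^2, the probability that two independent draws fall in different blocks of the partition, while Shannon entropy sum p_i log2(1/p_i) is the average number of binary distinctions (bits) needed to make all the distinctions.

import math

def logical_entropy(p):
    # Probability that two independent draws from p are distinct (a "dit").
    return 1.0 - sum(x * x for x in p)

def shannon_entropy(p):
    # Average number of binary distinctions (bits) needed to separate outcomes.
    return sum(x * math.log2(1.0 / x) for x in p if x > 0)

# Hypothetical distribution induced by a partition / random variable.
p = [0.5, 0.25, 0.125, 0.125]
print(logical_entropy(p))   # 0.65625
print(shannon_entropy(p))   # 1.75 bits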