Information theory

Information theory was first applied [177] to chemical reactions in an attempt to compact and classify the energy distributions of reaction products. This is achieved by surprisal analysis, where the observed product energy distribution, say for vibration, P(v), is compared with a non-specific prior distribution P°(v). The surprisal, I(v), is then given by I(v) = -ln[P(v)/P°(v)]. [Pg.382]

In calculating the prior distribution, P°(v), it is usual to assume that all product quantum states have an equal probability of being populated at a given total energy. Many reactions show linearity of the surprisal when it is plotted against f_v, the fraction of the available energy in vibration, i.e. I(f_v) = λ0 + λ_v f_v, where the slope λ_v is the surprisal parameter. [Pg.382]

In addition to surprisal analysis of measured product energy distributions, surprisal synthesis has been applied [178] to the prediction of energy distributions, either by induction from more limited experimental data or by deduction from a dynamical calculation. In the inductive approach to surprisal synthesis, the available experimental data are used as constraints and the surprisal parameter, λ, is computed by maximising the entropy subject to those constraints. This surprisal parameter then determines a more detailed distribution. In a more modest way, this approach may be used to extend incomplete product energy distributions. For example, as mentioned before, infrared chemiluminescence measurements are incapable of determining the population of products in the vibrational ground state, v = 0, and this is often inferred from the surprisal analysis of the other vibrational levels. [Pg.382]
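To make the procedure concrete, the following short Python sketch carries out a linear surprisal analysis on a purely illustrative vibrational distribution. The populations, energy fractions and the (1 - f_v)^(3/2) statistical prior for an atom + diatom product pair are assumptions chosen for illustration, not data from the text; the sketch fits the surprisal of the measured levels to a straight line and extrapolates it to estimate the unobserved v = 0 population.

```python
import numpy as np

# A sketch of linear surprisal analysis.  All numbers below are assumed for
# illustration; the prior P0(fv) ~ (1 - fv)**1.5 is the usual statistical
# (equal-quantum-state) prior for an atom + diatom product pair.

fv    = np.array([0.0, 0.2, 0.4, 0.6, 0.8])          # fraction of available energy in vibration, v = 0..4
P_obs = np.array([np.nan, 0.12, 0.33, 0.38, 0.17])   # v = 0 unmeasured (as in IR chemiluminescence)

prior = (1.0 - fv) ** 1.5                            # all product quantum states equally probable
prior /= prior.sum()

measured = ~np.isnan(P_obs)

# Surprisal I(fv) = -ln[ P(fv) / P0(fv) ] for the measured levels.
I = -np.log(P_obs[measured] / prior[measured])

# Linear surprisal: I(fv) ~= lambda0 + lambda_v * fv; the slope is the
# surprisal parameter.
lam_v, lam0 = np.polyfit(fv[measured], I, 1)

# Extrapolate the line to v = 0, invert the surprisal to estimate P(v = 0),
# and renormalise the completed distribution.
P_full = P_obs.copy()
P_full[0] = prior[0] * np.exp(-(lam0 + lam_v * fv[0]))
P_full /= P_full.sum()

print(f"lambda_v = {lam_v:.2f}, inferred P(v=0) = {P_full[0]:.3f}")
```

With real data, the quality of the straight-line fit itself indicates whether a one-parameter surprisal description of the distribution is adequate.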

Other relevant applications of information theory include the estimation of branching ratios [179] for reactions in which several products are possible and methods for transforming calculated collinear product energy distributions into three-dimensional distributions for comparison with experiment [180]. The literature and detailed descriptions of the methods of information theory may be found in the recent reviews of Levine and Bernstein [181], Levine [182] and Levine and Kinsey [178]. [Pg.383]


Only a very short introduction to some basic ideas of information theory [189, 410, 412, 417, 420, 427, 430] is given here in order to define those terms which are used for the evaluation of classifiers. [Pg.129]

Information theory is the mathematical foundation of communication theory, but it can also be used to describe the output of a series of experiments. It deals with probabilities and is therefore not applicable to single experiments. [Pg.129]

The information of an event is considered to be a function only of the probability p_i of that event; the importance, content or meaning of an event to a human interpreter is completely neglected in information theory. The information I_i of an event i is defined, according to a proposal by Hartley [419], as I_i = -log2(p_i) (in bits). [Pg.129]

Consider an experiment A which may have n possible discrete results (outputs, events). All outputs are mutually exclusive and have probabilities p_i (i = 1...n). [Pg.129]

Experiment A can be considered as a source which randomly produces discrete signals. The expected value of the information is given by the weighted average of I_i over all possible outputs: H(A) = Σ_i p_i I_i = -Σ_i p_i log2(p_i). [Pg.129]
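The two definitions above can be illustrated with a small Python sketch; the four-outcome distribution used here is an assumed example, not one from the text.

```python
import math

# A small sketch of the quantities defined above.  The four-outcome
# distribution is an assumed example, not one from the text.
p = [0.5, 0.25, 0.125, 0.125]     # probabilities p_i of the mutually exclusive outcomes

# Information of a single event (Hartley): I_i = -log2(p_i), in bits.
I = [-math.log2(pi) for pi in p]  # 1.0, 2.0, 3.0, 3.0 bits

# Expected information of the source: the weighted average of I_i over all
# outcomes, H = -sum_i p_i * log2(p_i)  (the Shannon entropy).
H = sum(pi * Ii for pi, Ii in zip(p, I))

print(f"H = {H:.3f} bits")        # 1.750 bits here
```

For equal probabilities p_i = 1/n the expected information reaches its maximum value log2(n), which for the four outcomes above would be 2 bits.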

I = log2 N (bits), where log2 is the base-2 logarithm. [Pg.28]

Equation 2.29, giving the maximum information content of a system with N equally likely possible outcomes, may be generalized to the case of N possible outcomes with unequal probabilities p_i by defining the average information H = -Σ_i p_i log2(p_i) (bits). [Pg.28]

H may be considered as an average over the self-information, I_i = -log2(p_i) (bits), contained in outcome i. Note that this definition is an intuitive one: if we observe outcome i, whose a priori probability is known to be p_i ≈ 1, we do not gain much information since we already expected to see it; if p_i ≈ 0, on the other hand, the outcome is a rare event, an observation of which therefore yields a significant amount of information. If all N events are equally likely, p_i = 1/N for all i and H = I = log2 N, consistent with the information content defined in equation 2.29 above. [Pg.29]

Properties of H. We list a few basic properties of H without proof; most are either obvious or straightforward to verify (for details see [aczel75] and [guiasu77]). [Pg.29]


Y. Z. Tsypkin, Fundamentals of the Information Theory for Identification (In Russian), Nauka, Moscow, 1984... [Pg.126]

Hummer G, Garde S, Garcia A E, Pohorille A and Pratt L R 1996 An information theory model of hydrophobic interactions Proc. Natl Acad. Sci. 93 8951... [Pg.552]

Klug A and Crowther R A 1972 Three-dimensional image reconstruction from the viewpoint of information theory Nature 238 435-40... [Pg.1653]

Systems can possess different degrees of complexity. The information content of a system can be used as a measure of its complexity. Application of information theory is increasingly fruitful for modeling biological activities with regard to the symmetry of molecules. [Pg.207]

We have already mentioned that real-world data have drawbacks which must be detected and removed. We have also mentioned outliers and redundancy. So far, only intuitive definitions have been given. Now, armed with information theory, we are going from the verbal model to an algebraic one. [Pg.212]

L. Brillouin, Science and Information Theory, 2nd ed., Academic Press, New York, 1962. [Pg.226]

James L. Unmack, "A Comparison of Periodic Versus Random Sampling From an Information Theory Point of View," presented at CMA Exposure Assessment Workshop, Washington, D.C., 1986. [Pg.110]

K. Eckschlager and V. Stepanek, Information Theory as Applied to Chemical Analysis, Wiley-Interscience, New York, 1984. [Pg.398]

JE Gibrat, J Garnier, B Robson. Further developments of protein secondary structure prediction using information theory. New parameters and consideration of residue pairs. J Mol Biol 198 425-443, 1987. [Pg.347]

The remainder of the book is divided into eleven largely self-contained chapters. Chapter 2 introduces some basic mathematical formalism that will be used throughout the book, including set theory, information theory, graph theory, groups, rings and field theory, and abstract automata. It concludes with a preliminary mathematical discussion of one and two dimensional CA. [Pg.18]

On the other hand, we also have a set of desired probabilities P that we want the Boltzmann machine to learn. From elementary information theory, we know that the relative entropy... [Pg.535]
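As a reminder of how the relative entropy is computed, here is a minimal Python sketch comparing a desired distribution P with the distribution Q currently produced by a model such as a Boltzmann machine; the probabilities are assumed for illustration and are not taken from the text.

```python
import math

# A minimal sketch of the relative entropy (Kullback-Leibler divergence)
# D(P || Q) between a desired distribution P and the distribution Q that a
# model currently produces.  The numbers are assumed for illustration.
P = [0.4, 0.4, 0.2]   # desired probabilities
Q = [0.5, 0.3, 0.2]   # probabilities generated by the current model

# D(P || Q) = sum_i P_i * ln(P_i / Q_i); it is non-negative and vanishes only
# when P = Q, which is why it can serve as a learning objective to minimise.
D = sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

print(f"D(P||Q) = {D:.4f} nats")
```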

A third hint of a connection between physics and information theory comes from the thermodynamics of Black Holes, which is still a deep mystery that embodies principles from quantum mechanics, statistical mechanics and general relativity. [Pg.636]

W. B. Davenport, Jr., and W. L. Root, An Introduction to the Theory of Random Signals and Noise, McGraw-Hill Book Co., New York, 1958; P. M. Woodward, Probability and Information Theory with Applications to Radar, Pergamon Press, New York, 1957. [Pg.151]

Most of this chapter is concerned with discrete systems such as those in parts b, c, d of Fig. 4-1. These are systems in which the messages going into and out of the coder and decoder can be represented as sequences of symbols from a finite alphabet. While information theory can be generalized to deal with continuous processes, this generalization is best treated as a limiting operation on discrete processes. [Pg.192]

