
Information theory, description

Chanon, M.; Barone, R.; Baralotto, C.; Julliard, M.; Hendrickson, J.B. (1998) Information Theory Description of Synthetic Strategies in the Polyquinane Series. The Holosynthon Concept. [Pg.188]

Barone, R.; Petitjean, M.; Baralotto, C.; Piras, P.; Chanon, M. Information theory description of synthetic strategies: a new similarity index. 2003, 16, 9. [Pg.17]

The recent theoretical approach based on information theory (IT) to studying aqueous solutions and hydration phenomena [62-66] points in such a direction. IT is a probabilistic framework for reasoning about communication, introduced in 1948 by Shannon and subsequently developed [114]. It provides a quantitative description of information by defining entropy as a function of probability... [Pg.707]
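
A minimal sketch of that definition (added here for illustration, not taken from the cited works): the Shannon entropy of a discrete probability distribution, H = -sum_i p_i log2 p_i, measured in bits.

    import math

    def shannon_entropy(probabilities, base=2.0):
        """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.

        `probabilities` must be non-negative and sum to 1; zero-probability
        outcomes contribute nothing (0 * log 0 := 0).
        """
        if abs(sum(probabilities) - 1.0) > 1e-9:
            raise ValueError("probabilities must sum to 1")
        return -sum(p * math.log(p, base) for p in probabilities if p > 0.0)

    # A uniform distribution over 4 outcomes carries 2 bits of information,
    # while a strongly biased one carries less.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
    print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))   # ~0.62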

Chemical systems may store information either in an analog fashion, in the structural features (size, shape, nature and disposition of interaction sites, etc. [1.27]) of a molecule or a supermolecule, or in a digital fashion, in the various states or connectivities of a chemical entity. Information theory has been applied to the description of the features of molecular machines [10.2]. The evaluation of the information content of a recognition process based on structural sensing in receptor-substrate pairs... [Pg.199]

Other relevant applications of information theory include the estimation of branching ratios [179] for reactions in which several products are possible and methods for transforming calculated collinear product energy distributions into three-dimensional distributions for comparison with experiment [180]. The literature and detailed descriptions of the methods of information theory may be found in the recent reviews of Levine and Bernstein [181], Levine [182] and Levine and Kinsey [178]. [Pg.383]
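
Much of the branching-ratio and energy-disposal machinery referred to above is built around surprisal analysis, in which an observed product distribution P is compared with a statistical prior P0 through the surprisal I = -ln(P/P0). The sketch below only illustrates that comparison; the populations and prior are hypothetical placeholder numbers, not data from the cited reviews.

    import math

    def surprisal(observed, prior):
        """Surprisal I = -ln(P / P0) for each channel of a product distribution.

        Both inputs are normalized probability lists over the same channels.
        """
        return [-math.log(p / p0) if p > 0 and p0 > 0 else float("inf")
                for p, p0 in zip(observed, prior)]

    # Hypothetical vibrational-state populations versus a statistical prior:
    observed = [0.05, 0.15, 0.45, 0.35]   # measured P(v), placeholder values
    prior    = [0.40, 0.30, 0.20, 0.10]   # prior P0(v), placeholder values
    for v, I in enumerate(surprisal(observed, prior)):
        print(f"v={v}  I(v) = {I:+.2f}")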

It is the main purpose of this review to summarize all these novel developments in the combined DFT-information-theoretic approach to molecular and reactive systems. We shall begin, however, with a short survey of the key ideas in the theory of chemical reactivity, emphasizing the need for a truly two-reactant description. The conceptual advantages of DFT and information theory in making a connection to the classical ideas and concepts of chemistry are elucidated throughout. [Pg.123]

Molecular fragments are mutually open subsystems, which exhibit fluctuations in their electron densities and overall numbers of electrons. In chemistry one is interested both in the equilibrium distributions of electrons and in non-equilibrium processes characterized by rates. Recently it has been demonstrated [23] that information theory provides all the necessary tools for a local dynamical description of the density fluctuations and electron flows between molecular subsystems, one that closely follows the thermodynamic theory of irreversible processes [146]... [Pg.163]
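
A basic information-theoretic tool used in such analyses is the entropy deficiency (Kullback-Leibler missing information) of one electron density relative to a reference density. The following sketch is a generic numerical illustration on a one-dimensional grid, with hypothetical Gaussian densities; it is not code from Ref. [23].

    import math

    def entropy_deficiency(rho, rho_ref, dx):
        """Kullback-Leibler missing information, sum of rho*ln(rho/rho_ref)*dx,
        between two normalized one-dimensional densities on a uniform grid."""
        total = 0.0
        for p, p0 in zip(rho, rho_ref):
            if p > 0.0 and p0 > 0.0:
                total += p * math.log(p / p0) * dx
        return total

    def gauss(x, mu, sigma):
        """Normalized Gaussian density used as a stand-in electron density."""
        return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    # Two hypothetical normalized densities sampled on the same grid:
    dx = 0.01
    grid = [i * dx for i in range(-500, 501)]
    rho     = [gauss(x, 0.0, 1.0) for x in grid]
    rho_ref = [gauss(x, 0.5, 1.0) for x in grid]
    print(entropy_deficiency(rho, rho_ref, dx))   # small positive number (~0.125)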

As we have demonstrated in this survey, there is a wide range of problems in the theory of electronic structure and chemical reactivity, which can be tackled using concepts and techniques of the density functional and information theories. These two descriptions are complementary in character, providing the energy and entropy representations of molecular systems, respectively. Together they constitute the... [Pg.175]

This analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life. The approach generates hypotheses relevant to single individuals, types, and levels of living systems, or relevant across individuals, types, and levels. These hypotheses can be confirmed, disconfirmed, or evaluated by experiments and other empirical evidence. [Pg.361]

Another relevant concept within information theory, in some cases strongly related to the aforementioned measures, is the so-called complexity of a given system or process. There is no unique and universal definition of complexity for arbitrary distributions, but it can be roughly understood as an indicator of pattern, structure, and correlation associated with the system the distribution describes. Nevertheless, many different mathematical quantifications of this intuitive notion exist. This is the case of the algorithmic [19, 20], Lempel-Ziv [21] and Grassberger... [Pg.419]
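
Of the measures just named, the Lempel-Ziv complexity has a particularly simple operational definition: the number of distinct phrases produced when the sequence is parsed from left to right. A minimal sketch of that phrase-counting scheme (an illustrative addition, not taken from the cited references):

    def lz76_complexity(s):
        """Number of phrases in the Lempel-Ziv (1976) left-to-right parsing of s.

        Repeatedly extend the current phrase as long as it can be copied from
        the already-parsed prefix; start a new phrase (incrementing the count)
        when it cannot.
        """
        n = len(s)
        if n <= 1:
            return n
        c, l, i, k, k_max = 1, 1, 0, 1, 1
        while True:
            if s[i + k - 1] == s[l + k - 1]:
                k += 1
                if l + k > n:          # reached the end while still copying
                    c += 1
                    break
            else:
                k_max = max(k, k_max)
                i += 1
                if i == l:             # no earlier start position works: new phrase
                    c += 1
                    l += k_max
                    if l + 1 > n:
                        break
                    i, k, k_max = 0, 1, 1
                else:
                    k = 1
        return c

    # Regular sequences parse into few phrases, irregular ones into many:
    print(lz76_complexity("ababababababab"))    # low complexity (3 phrases)
    print(lz76_complexity("bdacbcdbabdcba"))    # higher complexity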

Within a given sensory modality, it is easy to understand that different stimuli place different demands or loads on information-processing systems. Thus, in order to properly interpret results of performance tests, it is necessary to describe the stimulus. While this remains a topic of ongoing research with inherent controversies, some useful working constructs are available. At issue is not simply a qualitative description, but the measurement of stimulus content (or complexity). Shannon's information theory [1948], which teaches how to measure the amount of information associated with a generalized information source, has been the primary tool used in these efforts. Thus, a stimulus can be characterized in terms of the amount of information present in it. Simple stimuli (e.g., a light that is on or off) possess less information... [Pg.1291]
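
As a small worked illustration of that last point (the numbers are generic, not from the cited handbook): an equiprobable stimulus with N distinguishable states carries log2(N) bits, so an on/off light carries 1 bit and an equally likely choice among eight symbols carries 3 bits.

    import math

    def stimulus_information(num_states):
        """Information (in bits) of a stimulus with equally likely states: log2(N)."""
        return math.log2(num_states)

    print(stimulus_information(2))  # on/off light: 1.0 bit
    print(stimulus_information(8))  # one of eight equally likely symbols: 3.0 bits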

First, informative theories can also reduce the complexity arising from descriptions by enabling us to see patterns with simple explanations. [Pg.55]

Because of the wealth of information available on this class of viruses (theory, descriptions, and sample code), many new boot sector viruses are being written. Taking the precautions listed here and checking your computers weekly with a virus checker should defeat this class of viruses. [Pg.145]

At present no microscopic theory exists which can answer questions like this, and there are few phenomenological descriptions either. What is badly needed is a common framework which will at least facilitate the development of incisive and informative phenomenological descriptions. I believe that polarizabilities can provide such a framework, and that they can be used to explain trends in chemical bonding and physical properties even when an absolute and rigorous connection is not demonstrated. For this to become so, however, it is necessary to understand the quantum mechanical meaning of polarizabilities in a more profound way than has generally been the case in the past. The story of how little this rather simple subject has been explored theoretically illustrates why it is that our understanding of the properties of materials is still at so primitive a level. [Pg.32]

Theory and Methods includes the equations, analytical descriptions, and data you need to understand the calculations. It deals with the science behind HyperChem calculations. Information on parameters and settings lets you modify and customize calculations. [Pg.1]





