Big Chemical Encyclopedia


Information-theoretic

Price category: production, departmental, institutional. Platforms: PC (Linux, Windows 98, Windows NT), UNIX. Contact information: Theoretical Chemistry Group, Dipartimento di Chimica IFM, Via Giuria 5, I-10125 Torino, Italy, crystal@ch.unito.it... [Pg.334]

As defined above, the Lyapunov exponents effectively determine the degree of chaos that exists in a dynamical system by measuring the rate of the exponential divergence of initially closely neighboring trajectories. An alternative and, from the point of view of CA theory, perhaps more fundamental interpretation of their numeric content is an information-theoretic one. It is, in fact, not hard to see that Lyapunov exponents are very closely related to the rate of information loss in a dynamical system (this point will be made more precise during our discussion of entropy in the next section). [Pg.205]
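The connection between the Lyapunov exponent and the rate of information loss can be made concrete numerically. The sketch below (the logistic map and all parameter choices are illustrative, not taken from the text) estimates the exponent as the trajectory average of log |f'(x)|; at r = 4 the exact value is ln 2, i.e. about one bit of information concerning the initial condition is lost per iteration.

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=100000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the trajectory average of log |f'(x)| = log |r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n_iter

# At r = 4 the map is fully chaotic; the exact exponent is ln 2.
print(lyapunov_logistic(4.0))
```

A positive estimate signals exponential divergence of nearby trajectories, matching the information-theoretic reading given above.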

Recalling our earlier discussion of the information-theoretic interpretation of... [Pg.214]

The relationships between thermodynamic entropy and Shannon's information-theoretic entropy and between physics and computation have been explored and hotly debated ever since. It is now well known, for example, that computers can, in principle, provide an arbitrary amount of reliable computation per kT of dissipated energy ([benu73], [fredkin82]; see also the discussion in section 6.4). Whether a dissipationless computer can be built in practice remains an open problem. We must also remember that computers are themselves physical (and therefore, ultimately, quantum) devices, so that any exploration of the limitations of computation will be inextricably linked with the fundamental limitations imposed by the laws of physics. [Pg.635]

It-from-bit embodies the central notion that every it - that is, every aspect of reality: electrons, protons, photons, fields of force, or even what we call space-time itself - is in the deepest sense a derivative of experimentally deduced answers to yes/no questions, that is, to bits. If we allow ourselves for a moment to go back to the roots of what it is that we by convention call reality, we see that it is something that is literally defined by a particular sequence of yes/no responses elicited from either a mechanical or (our own biological) sensory apparatus; in other words, reality's origin is fundamentally information-theoretic. [Pg.641]

The procedure for generating a decision tree consists of selecting the variable that gives the best classification as the root node. Each variable is evaluated for its ability to classify the training data using an information-theoretic measure of entropy. Consider a data set with K classes, C_i, i = 1, ..., K. Let M be the total number of training examples, and let... [Pg.263]
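The entropy-based variable selection described above can be sketched as an ID3-style information-gain computation; the toy attributes and class labels below are invented purely for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H = -sum_k p_k log2 p_k over the K class frequencies."""
    m = len(labels)
    return -sum((c / m) * math.log2(c / m) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Expected entropy reduction from splitting the data on attribute `attr`."""
    m = len(labels)
    split = {}
    for row, y in zip(rows, labels):
        split.setdefault(row[attr], []).append(y)
    remainder = sum(len(ys) / m * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder

# Toy training set: the root node is the attribute with the highest gain.
rows = [{"outlook": "sun",  "wind": "weak"},
        {"outlook": "sun",  "wind": "strong"},
        {"outlook": "rain", "wind": "weak"},
        {"outlook": "rain", "wind": "strong"},
        {"outlook": "sun",  "wind": "weak"}]
labels = ["yes", "yes", "no", "no", "yes"]
root = max(rows[0], key=lambda a: information_gain(rows, labels, a))
print(root)
```

Here "outlook" classifies the toy set perfectly (gain equals the full class entropy), so it is selected as the root; "wind" leaves almost all of the entropy unresolved.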

From the information-theoretical point of view, calibration corresponds to the coding of the input quantity into the output quantity and, vice versa, the evaluation process corresponds to decoding of output data. From the mathematical viewpoint, qin is the independent quantity in the calibration step and qout the dependent one. In the evaluation step, the situation is reversed: qout is the independent and qin the dependent quantity. From the statistical standpoint, qout is a random variable both in calibration and evaluation, whereas qin is a fixed variable in the calibration step and a random variable in the evaluation step. This rather complicated situation has some consequences, to which we will return in Sect. 6.1.2. [Pg.149]

The Kerridge-Bongard model of information is of great importance in quality assurance, in particular for the assessment of interlaboratory studies. Examples of the information-theoretical evaluation of analytical results within the context of interlaboratory comparisons have been given by Danzer et al. [1987, 2001], Wienke et al. [1991] and Danzer [1993]. [Pg.297]

On the other hand, not only the enormous number of signals in multicomponent methods but also the large number of species that can be detected in highly resolved spectra and chromatograms, respectively, influence the information amount. Therefore, Matherny and Eckschlager [1996] proposed the introduction of so-called relevancy coefficients, k, into the system of information-theoretical assessment. In analytical practice, the coefficients k can be considered weight factors of the information contents of the respective species, with which Eq. (9.21) becomes... [Pg.300]

Distribution analysis in atomic dimensions becomes structure analysis. But because of its specific methodology, it makes sense to consider structure analysis as a separate field of analytical chemistry; see Sect. 1.2. Therefore, the information-theoretical fundamentals of structure analysis are different from those of element analysis and have been presented by Danzer and Marx [1979a,b]. [Pg.303]

Based on the information flow, a number of information-theoretical performance quantities can be derived, and some important ones are compiled in Table 9.2. The information performance of analytical methods can be related to the information requirement of a given analytical problem. The resulting measures, information efficiency and information profitability, may be used to assess economic aspects of analytical chemistry. [Pg.303]

Table 9.2. Information-theoretical performance parameters, according to Danzer and Eckschlager [1978] and Eckschlager and Danzer [1994]... [Pg.304]

Pérez-Freire, L., Comesaña, P., Pérez-González, F. (2005). Information-theoretic analysis of security in side-informed data hiding. Proceedings of the 7th Information Hiding Workshop. [Pg.20]

B. Grocholsky, A. Makarenko, and H. Durrant-Whyte, Information-theoretic coordinated control of multiple sensor platforms, in Proceedings of the IEEE International Conference on Robotics and Automation, Taipei, Taiwan, September 2003, pp. 1521-1526. [Pg.117]

With the advent of radars capable of waveform agility, the design of optimal waveform libraries comes into question. The purpose of this section is to consider the design of such waveform libraries for radar tracking applications, from an information theoretic point of view. We note that waveform libraries will depend in general on the specific applications in which the systems are to be used. Airborne radars will require different libraries from ship-borne ones. Radars used in a tracking mode will require different optimal libraries than radars in a surveillance mode. [Pg.277]

In designing or improving a waveform library, certain questions arise. Firstly, it is important to establish the measure of effectiveness (MoE) for individual waveforms (cost function) and then to extend this to an MoE for the library. If a particular set of waveforms is added, will this improve the library in these terms, and, on the other hand, how much will removing some waveforms reduce the utility of the library? It is the purpose of this chapter to develop an information theoretic framework... [Pg.277]
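One way to make such an information-theoretic MoE concrete is to score each waveform by the mutual information its measurement would yield about the current track state. The sketch below is purely illustrative, not from the text: the waveform names, the noise variances, and the assumption of independent Gaussian range and doppler channels are all hypothetical.

```python
import math

def waveform_info_gain(prior_vars, noise_vars):
    """Mutual information (bits) gained about (range, doppler) from one
    measurement with independent Gaussian noise, given Gaussian priors:
    I = 0.5 * sum log2(1 + prior_var / noise_var) over the two channels."""
    return sum(0.5 * math.log2(1 + p / r) for p, r in zip(prior_vars, noise_vars))

# Hypothetical library: (range-noise var, doppler-noise var) per waveform.
library = {
    "wide_pulse":   (4.0, 0.1),   # poor range resolution, good doppler
    "narrow_pulse": (0.1, 4.0),   # good range resolution, poor doppler
    "chirp":        (1.0, 1.0),   # balanced
}
prior = (2.0, 0.2)  # track uncertain in range, well-localized in doppler
best = max(library, key=lambda w: waveform_info_gain(prior, library[w]))
print(best)
```

Because the track is uncertain in range, the waveform with the lowest range noise wins; the same scoring extends to a library-level MoE by maximizing over (or scheduling among) the member waveforms.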

Nalewajski, R. F. 2003. Electronic structure and chemical reactivity: Density functional and information theoretic perspectives. Adv. Quantum Chem. 43: 119-184. [Pg.477]

American quantum chemistry, then, was not entirely indigenous, nor was it entirely chemistry. Mayer wrote Lewis in 1930 about attending informal theoretical sessions at the recent meeting of the Physical Society. [Pg.270]

Proof with the Escape-Rate Theory; Markov Chains and Information Theoretic Aspects; Fluctuation Theorem for the Currents [Pg.83]

MacKay's textbook [114] offers not only comprehensive coverage of Shannon's theory of information but also probabilistic data modeling and the mathematical theory of neural networks. Artificial NNs can be applied to problems of data processing and analysis, prediction, and classification (data mining). The wide range of applications of NNs also comprises optimization issues. The information-theoretic capabilities of some neural network algorithms are examined, and neural networks are motivated as statistical models [114]. [Pg.707]

S.L. LUDDITE: an information theoretic library design tool. J. Chem. Inf. Comput. Sci. 2003, 43, 47-54. [Pg.196]

Med. Chem. 2002, 45, 4350-4358. Shanmugasundaram, V. and Maggiora, G.M. Characterizing property and activity landscapes using an information-theoretic approach. Abstr. Papers ACS, 32-CINF, Part 1, August 2001, 222. [Pg.331]

P. Moulin and J. A. O'Sullivan, Information-Theoretic Analysis of Information Hiding. Preprint, September 1999. [Pg.12]

M. S. Brown, L. Frommhold, and G. Birnbaum. About an information theoretical spectral shape proposed for the collision induced spectroscopies. Molec. Phys., 62:907, 1987. See also Molec. Phys.,... [Pg.408]

C. E. Shannon (1916-2001) developed an information-theoretic definition of entropy that (although not equivalent to the physical quantity) carries similar associations with microstates and probability theory. Shannon recognized that Boolean bit patterns (sequences of 1's and 0's) can be considered the basis of all methods for encoding information. ... [Pg.176]
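Shannon's entropy of such a bit pattern follows directly from the symbol frequencies; a minimal sketch (the example strings are illustrative):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum_i p_i log2 p_i, in bits per symbol."""
    m = len(symbols)
    return -sum((c / m) * math.log2(c / m) for c in Counter(symbols).values())

print(shannon_entropy("0101010101"))  # equal mix of 1's and 0's: 1 bit/symbol
print(shannon_entropy("0000000001"))  # highly biased pattern: well under 1 bit
```

A maximally uncertain bit stream carries one bit per symbol; any bias toward one symbol reduces the information content, which is exactly the association with microstates and probability mentioned above.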

Information-Theoretical Indices. Information theory has been employed to define topological indices based on the Shannon equation [17]: [Pg.33]

There are a vast number of possible recipes to derive information-theoretical indices. In-depth discussions of selected information-theoretical indices, also referred to as information content indices (ICI), appear elsewhere [17-19]. [Pg.33]
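As one concrete instance of such an index, the sketch below computes the mean and total information content of a partition of a molecular graph's atoms into equivalence classes (the standard Shannon-equation form; the neopentane example and function names are illustrative, not a specific index from refs. [17-19]):

```python
import math

def mean_information_content(class_sizes):
    """Mean information content (bits per element) of a partition of n
    elements into equivalence classes: I_mean = -sum (n_i/n) log2(n_i/n)."""
    n = sum(class_sizes)
    return -sum((ni / n) * math.log2(ni / n) for ni in class_sizes)

def total_information_content(class_sizes):
    """Total information content: I_tot = n log2 n - sum n_i log2 n_i
    (equal to n * I_mean)."""
    n = sum(class_sizes)
    return n * math.log2(n) - sum(ni * math.log2(ni) for ni in class_sizes)

# Example: the five carbons of neopentane, C(CH3)4, fall into two symmetry
# classes -- 1 central carbon and 4 equivalent methyl carbons.
print(mean_information_content([1, 4]))
print(total_information_content([1, 4]))
```

A fully symmetric partition (all atoms equivalent) gives zero information content, while maximal differentiation gives log2 n bits per atom, so these indices quantify structural diversity.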

I_mean: mean information content (information-theoretical index) [8] [Pg.218]

The continuous absorption with maximum near 1935 Å gives no useful information. Theoretically, it has been assigned to a charge-transfer transition leading to a state in which the dipolar structure (7) makes a major contribution (Nagakura, 1960). [Pg.410]

A 128-unit flexible-chain heteropolymer with an HP composition fixed at 1:1 was simulated for the condition when hydrophobic H monomers strongly attract each other, thus stabilizing a dense globular core, whereas the attraction energy ε_PP between hydrophilic P monomers was considered as a parameter (the interaction between H and P monomers is given by ε_HP = (ε_HH ε_PP)^(1/2)). For this model system, various conformation-dependent and sequence-dependent properties, including information-theoretic-based quantities, can be calculated. [Pg.27]

Information complexity of copolymer sequences. A common approach to the analysis of the complexity of a system is to use concepts from information theory and information-theoretic-based techniques. [Pg.27]
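One such information-theoretic measure of sequence complexity is the block (word) entropy. The sketch below (the HP sequences and block lengths are illustrative choices, not the simulated chains from the text) contrasts a strictly alternating 128-mer, whose block entropy saturates at one bit, with a less regular 1:1 sequence whose block entropy keeps growing with block length:

```python
import math
from collections import Counter

def block_entropy(seq, k):
    """Shannon entropy (bits per word) of the distribution of length-k
    words read along the sequence with a sliding window."""
    words = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    m = len(words)
    return -sum((c / m) * math.log2(c / m) for c in Counter(words).values())

periodic = "HP" * 64                    # strictly alternating 128-mer
irregular = "HHPHPPHPHHPPHPHP" * 8      # less regular 1:1 HP 128-mer
for k in (1, 2, 4):
    print(k, block_entropy(periodic, k), block_entropy(irregular, k))
```

The periodic chain has only two distinct words at every block length, so its complexity stays near one bit, whereas the irregular sequence accumulates many distinct words and hence a higher block entropy.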


See other pages where Information-theoretic is mentioned: [Pg.205]    [Pg.607]    [Pg.640]    [Pg.84]    [Pg.306]    [Pg.490]    [Pg.21]    [Pg.121]    [Pg.274]    [Pg.173]    [Pg.176]    [Pg.58]    [Pg.93]   





Communication processes information-theoretic approach

Harmonic oscillator information theoretical

Information Theoretic Approach to Statistical Mechanics

Information theoretic analysis

Information theoretic indices

Information theoretical uncertainty-like

Information-Theoretic Background

Information-Theoretic Interpretation

Information-Theoretic Security for Signers Introduction

Information-Theoretically Secure Symmetric Schemes

Information-theoretic analysis molecules

Information-theoretic approach

Information-theoretic methods

Information-theoretic sensitivity analysis

Information-theoretical indices

Lower Bounds on Information-Theoretically Secure Signature Schemes

Prior distributions Information-theoretic analysis

Profile information-theoretical

Schemes with Information-Theoretic Security

Security information-theoretic

Statistical Theories and the Information-Theoretic Approach

The Information-Theoretic Approach
