
Algorithmic information theory

Last, there is information theory and there is algorithmic information theory. The reader is encouraged to consult the classic text by Chaitin [15]. Wilf also presents a rigorous treatment of algorithms in information contexts [16]. Zurek has explored the subject in detail as well [17,18]. [Pg.120]

Chaitin, G. J. 1987. Algorithmic Information Theory, Cambridge University Press, New York. [Pg.123]

Charles H. Bennett is an IBM Fellow at IBM Research, where he has worked on various aspects of the relation between physics and information. He received his bachelor's degree from Brandeis University, majoring in chemistry, and his Ph.D. from Harvard in 1970 for molecular dynamics studies (computer simulation of molecular motion). His research has included work on quantum cryptography, algorithmic information theory, and quantum teleportation. He is a fellow of the American Physical Society and a member of the National Academy of Sciences. [Pg.177]

Robertson, D. S. (1999). Algorithmic information theory, free will, and the Turing test. Complexity, 4(3), 25-34. Available from http://cires.colorado.edu/~doug/philosophy/info8.pdf [Pg.122]

Gregory Chaitin formalized this idea in his algorithmic information theory. He pointed out that certain problems are unsolvable given a base set of axioms. [Pg.127]
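To make the central object of algorithmic information theory concrete: the algorithmic complexity K(x) of a string is uncomputable in general, but the length of any lossless compression of x is a computable upper bound on it. A minimal Python sketch of this idea (the function name and test strings are illustrative choices, not from the source; zlib stands in for an optimal description method):

```python
import os
import zlib

def complexity_upper_bound(x: bytes) -> int:
    """Length of a zlib-compressed encoding of x.

    K(x) itself is uncomputable; any lossless compressor gives a
    computable upper bound on the shortest description of x.
    """
    return len(zlib.compress(x, 9))

patterned = b"01" * 500        # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)  # almost surely has no short description

print(complexity_upper_bound(patterned))   # small
print(complexity_upper_bound(random_ish))  # at or slightly above 1000 bytes
```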

Another way of looking at it is that Shannon information is a formal equivalent of thermodynamic entropy, or the degree of disorder in a physical system. As such it essentially measures how much information is missing about the individual constituents of a system. In contrast, a measure of complexity ought to (1) refer to individual states and not ensembles, and (2) reflect how much is known about a system versus what is not. One approach that satisfies both of these requirements is algorithmic complexity theory. [Pg.616]
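A minimal sketch of the first half of this contrast, assuming nothing beyond the standard definition H = -Σ p_i log₂ p_i: Shannon entropy is computed from a distribution over outcomes, i.e. from the ensemble, not from any individual string (the message and variable names below are illustrative).

```python
import math
from collections import Counter

def shannon_entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)) in bits.

    H is a property of the whole distribution (the ensemble),
    not of any single outcome drawn from it.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Empirical symbol distribution of a short message
message = "abracadabra"
probs = [n / len(message) for n in Counter(message).values()]
print(f"{shannon_entropy(probs):.2f} bits per symbol")  # about 2.04
```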

Coifman, R. R., and Wickerhauser, M. V., Entropy-based algorithms for best basis selection, IEEE Trans. Inform. Theory 38(2), 713-718 (1992). [Pg.98]

MacKay, D. J. C. Information theory, inference and learning algorithms. Draft 3.1415, January 2003. www.inference.phy.cam.ac.uk/mackay/itprnn/book.html [Pg.713]

PoHe78 Stephen C. Pohlig, Martin E. Hellman: An Improved Algorithm for Computing Logarithms over GF(p) and its Cryptographic Significance; IEEE Transactions on Information Theory 24/1 (1978) 106-110. [Pg.382]

R. Coifman and V. Wickerhauser, Entropy-based Algorithms for Best-basis Selection, IEEE Transactions on Information Theory, 38 (1992), 713-718. [Pg.164]

J. Ziv and A. Lempel, A universal algorithm for sequential data compression, IEEE Transactions on Information Theory, 23 (1977), 337-343. [Pg.477]

Another relevant concept within information theory, in some cases strongly related to the aforementioned measures, is the so-called complexity of a given system or process. There is no unique, universal definition of complexity for arbitrary distributions, but it can be roughly understood as an indicator of pattern, structure, and correlation associated with the system the distribution describes. Nevertheless, many different mathematical quantifications exist under this intuitive description. This is the case for the algorithmic [19, 20], Lempel-Ziv [21] (sketched below) and Grassberger... [Pg.419]
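As one concrete instance, the Lempel-Ziv measure [21] counts the number of distinct phrases needed to parse a sequence. A minimal Python sketch of this phrase-counting idea (a simplified LZ parsing; the function and variable names are illustrative, not from the source):

```python
import random

def lempel_ziv_complexity(s: str) -> int:
    """Number of distinct phrases in a simple Lempel-Ziv parsing of s.

    Patterned sequences parse into few phrases; irregular ones into many.
    (A trailing phrase that repeats an earlier one is not counted twice,
    which is fine for this illustration.)
    """
    phrases, i = set(), 0
    while i < len(s):
        j = i + 1
        # grow the current phrase until it is one we have not seen before
        while j <= len(s) and s[i:j] in phrases:
            j += 1
        phrases.add(s[i:j])
        i = j
    return len(phrases)

random.seed(0)
periodic = "01" * 100
noisy = "".join(random.choice("01") for _ in range(200))
print(lempel_ziv_complexity(periodic))  # fewer phrases: strong pattern
print(lempel_ziv_complexity(noisy))     # more phrases: little structure
```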

Viterbi, A. J. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory 13, 2 (1967), 260-269. [Pg.600]

Discrete Systems Modeling; Evolutionary Algorithms and Metaheuristics; Information Theory. [Pg.59]

D.J.C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, Cambridge, UK, 2003), http://www.inference.phy.cam.ac.uk/mackay/itila/ [Pg.5]

Shannon, C.E. 1948. A mathematical theory of communication. Bell Syst. Tech. J. 27:379-423, 623-656. Ungerboeck, G. 1982. Channel coding with multilevel/phase signals. IEEE Trans. Inf. Theory (Jan.). Viterbi, A.J. 1967. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory IT-13:260-269. [Pg.1618]

