Big Chemical Encyclopedia


Claude Shannon

Shannon, Claude E. A Mathematical Theory of Communication, The Bell System Technical Journal, 1948, Vol. 27, pp. 379-423, 623-656. [Pg.285]

Shakhnovich Eugene I., 354 Sham Lu J., 688 Shannon Claude Elwood, 971, 990, 991 Shavitt Isaiah, 885 Shaw Graham, 835 Shingu Haruo, 174 Shirakawa Hideki, 505 Shnoll Simon... [Pg.1027]

Shannon Claude Elwood (1916-2001) U.S. mathematician, research on Boolean algebra, cryptography, pioneered information theory - a full statement of which appeared in The Mathematical Theory of Communication (1949)... [Pg.468]

Claude Shannon related the maximum rate of information transfer over a channel to its bandwidth using entropy. [Pg.37]

To a significant extent, the theoretical basis of modern communication theory arose from the work of Claude Shannon at Bell Labs [80]. In these seminal works, the concept of the information entropy associated with an arbitrary signal was introduced. In 1981, Watanabe realised the close association between entropy minimization and pattern recognition [81]. An association between entropy minimization and the principle of simplicity is also recognized [82]. The basic mathematical form of signal... [Pg.176]

Incidentally, Equation (1.15) is also called the Shannon formula for entropy. Claude Shannon was an engineer who developed his definition of entropy, sometimes called information entropy, as a measure of the level of uncertainty of a random variable. Shannon's formula is central in the discipline of information theory. [Pg.13]

The first important step towards modern scientific cryptology was Claude Shannon's work [Shan49]. There, for the first time, a precise (and, according to informal requirements, certainly sufficient) notion of security for any type of cryptologic scheme was defined: the information-theoretic security of secrecy schemes, sometimes called Shannon security. Roughly, the definition requires that a ciphertext provides an outsider with no additional information at all about the message. The information-theoretic notion means that the scheme is absolutely unbreakable, i.e., unbreakable even by attackers with unrestricted computing power and unrestricted memory. [Pg.12]
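The one-time pad is the classic scheme satisfying Shannon security: a uniformly random key as long as the message, used once, makes every plaintext of that length equally likely given the ciphertext. A minimal sketch (the function names are illustrative, not from any source cited here):

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    # The key must be uniformly random, as long as the message,
    # and never reused -- otherwise Shannon security is lost.
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

key = secrets.token_bytes(12)
ciphertext = otp_encrypt(b"ATTACK AT 9!", key)
assert otp_decrypt(ciphertext, key) == b"ATTACK AT 9!"
```

Because the ciphertext is itself uniformly distributed regardless of the message, even an attacker with unbounded computing power learns nothing beyond the message length.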

A similar work for authentication schemes was only published 15 years later: in [GiMS74], the information-theoretic, i.e., absolute, security of symmetric authentication schemes was defined. Schemes complying with this definition are often called authentication codes. Like Claude Shannon's work, [GiMS74] already contains both concrete constructions of authentication codes and lower bounds on the achievable efficiency, in particular the key length. In contrast to secrecy schemes, however, the upper and lower bounds are not identical; furthermore, the constructions are less trivial. Therefore, there has been further research in this field. [Pg.12]

Shan48 Claude E. Shannon A Mathematical Theory of Communication The Bell System Technical Journal 27 (1948) 379-423, 623-656. [Pg.384]

Shan49 Claude E. Shannon Communication Theory of Secrecy Systems The Bell System Technical Journal 28/4 (1949) 656-715. [Pg.384]

In 1948, a century after Clausius introduced the word "entropy" into the scientific literature, Claude Shannon published a paper which gave a precise definition and meaning for information (7). [Pg.277]

Schneider, T. D., 2006, Claude Shannon: Biologist, IEEE Eng. Med. Biol. Mag. 25(1): 30-33 (January/February). [Pg.678]

Chemistry is still at a stage in which the exchange of information is not considered quantitatively. Information did, however, become an object of quantitative research in telecommunication. As early as 1924, Harry Nyquist was studying the efficiency of information channels in telegraphy using a given number of electric potential levels. A few years later, Ralph Hartley published an article on measuring information. Twenty years after that, Claude E. Shannon introduced the notion of information entropy. [Pg.990]

Two important properties that can be used to help guide the design of a secure cryptosystem, identified by Claude Shannon in 1949, are confusion and diffusion. Confusion measures the complexity of the relationship between the key and the ciphertext. Diffusion measures the degree to which small changes in the plaintext produce large changes in the ciphertext. [Pg.65]
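Diffusion can be observed empirically as the avalanche effect: flipping even a few input bits should change roughly half of the output bits. A minimal sketch using SHA-256 as a stand-in primitive (the choice of hash and inputs is illustrative, not from the source):

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    # Count the number of differing bits between two equal-length byte strings.
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

h1 = hashlib.sha256(b"plaintext-0").digest()
h2 = hashlib.sha256(b"plaintext-1").digest()  # input differs in a few bits only
flipped = bit_diff(h1, h2)

# Good diffusion: roughly half of the 256 output bits change.
assert 64 < flipped < 192
```

A cipher with poor diffusion (say, one that encrypted each byte independently) would fail this test badly: a one-byte change in the plaintext would alter at most one byte of the ciphertext.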

Claude Elwood Shannon (1916-2001), American mathematician, professor at the Massachusetts Institute of Technology; his professional life was associated with the Bell Laboratories. His idea, now so obvious, that information may be transmitted as a sequence of "0" and "1" was shocking in 1948. It was said that Shannon used to understand problems "in zero time". [Pg.876]

Claude Shannon introduced the notion of the average information associated with all possible N results of an event in the usual way... [Pg.876]

Here k_B is Boltzmann's constant and prob(j) is the probability of observing the jth state of a system. The similarities of Equations (2.17) through (2.19) are not coincidental. It is apparent that information and entropy are related, if not alternate sides of the same coin. The inaugural properties and applications of Equation (2.17) were the brainchild of Claude Shannon, and thus I is commonly referred to as the Shannon information [4]. The term Shannon entropy is written almost as often on account of the ties to Equations (2.18) and (2.19). The mixing entropy of Equation (2.18) is visited several times in subsequent chapters. [Pg.22]
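The average information described above is the familiar Shannon entropy, H = -Σ p_j log2 p_j, measured in bits. A minimal sketch (function name and examples are illustrative):

```python
import math

def shannon_entropy(probs):
    # H = -sum_j p_j * log2(p_j); zero-probability outcomes contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
assert abs(shannon_entropy([0.5, 0.5]) - 1.0) < 1e-12

# Four equally likely outcomes: log2(4) = 2 bits.
assert abs(shannon_entropy([0.25] * 4) - 2.0) < 1e-12

# A certain outcome carries no information at all.
assert shannon_entropy([1.0]) == 0.0
```

Replacing log2 with the natural logarithm and multiplying by Boltzmann's constant k_B recovers the thermodynamic form, which is exactly the tie between the equations noted above.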

In his 1940 Massachusetts Institute of Technology master's thesis, Claude Elwood Shannon used symbolic Boolean algebra as a way to analyze relay and switching circuits. Boole's work thus became the foundation for the development of modern electronics and digital computer technology. [Pg.48]

With the publication of A Symbolic Analysis of Relay and Switching Circuits (1940) and A Mathematical Theory of Communication (1948), American mathematician Claude Elwood Shannon introduced a new area for the application of Boolean algebra. He showed that the basic properties of series and parallel combinations of electric devices such as relays could be adequately represented by this symbolic algebra. Since then, Boolean algebra has played a significant role in computer science and technology. [Pg.52]
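Shannon's correspondence is simple: relays wired in series conduct only when both are closed (Boolean AND), while relays in parallel conduct when either is closed (Boolean OR). A minimal sketch of this mapping (circuit layout chosen for illustration):

```python
# Series connection: current flows only if both contacts are closed (AND).
def series(a: bool, b: bool) -> bool:
    return a and b

# Parallel connection: current flows if either contact is closed (OR).
def parallel(a: bool, b: bool) -> bool:
    return a or b

# A relay circuit: contact x in series with the parallel pair (y, z),
# i.e., the Boolean expression x AND (y OR z).
def circuit(x: bool, y: bool, z: bool) -> bool:
    return series(x, parallel(y, z))

assert circuit(True, False, True) is True    # x closed, z provides a path
assert circuit(True, False, False) is False  # parallel branch open
assert circuit(False, True, True) is False   # series contact open
```

This is the insight that lets a truth table be read directly as a wiring diagram, and vice versa.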

The World War II antiaircraft project resulted not only in the development of hardware but also in extensive research on the theory behind what was being done as well as what else could be done. Out of this ferment eventually came such work as Norbert Wiener's cybernetics and Claude Shannon's information theory. Modern computer science is an outgrowth of all this work, which continues all around the world in industry, government, academia, and various organizations. [Pg.425]

Concepts of digital circuits and information theory (Claude Elwood Shannon): Shannon's most important contributions were electronic switching and using information theory to discover the basic requirements for data transmission. [Pg.2058]

The theoretical foundation of data compression comprises the source coding portion of information theory. The book by Pierce provides an elementary introduction to this field, although a more advanced, yet accessible, introduction can be found in the concise volume by Mansuripur; for a complete treatment, the reader is referred to the works by Cover and Thomas, Gallager, and others. Also worth mentioning is the still interesting book by Shannon and Weaver, in which the original paper on information theory by Claude Shannon appears, preceded by an introduction from the second author. The popular book by Lucky discusses data compression in the context of text, speech, and video compression. As for specific implementation details of the various coding methods, as well as historical perspective, the work by Bell, Cleary, and Witten is recommended. [Pg.1633]

Another breakthrough paper appeared twelve years afterwards, in 1948, again by Claude Shannon: A mathematical theory of communication [2]. In this paper, Shannon defined the unit of information, the binary digit, or bit, and established the theory which tells us the amount of information (i.e., the number of bits) which can be sent per unit time through a communication channel, and how this information can be fully recovered, even in the presence of noise in the channel. This work founded information theory. [Pg.1]

Claude E. Shannon is generally recognized as the founding father of information theory as we understand it today: a mathematical theory or framework to quantitatively describe the communication of data. Irrespective of their nature or type, data need to be transmitted over channels, and a focal point of Shannon's pioneering work has been that channels available for communicating data are generally noisy. Shannon demonstrated that data can be communicated over noisy channels with a small probability of error if it is possible to encode (and subsequently decode) the data in a way that communicates data at a rate below but close to channel capacity. [Pg.264]
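For the simplest noisy channel, the binary symmetric channel that flips each bit independently with probability p, Shannon's capacity is C = 1 - H(p), where H is the binary entropy function. A minimal sketch (function names are illustrative):

```python
import math

def binary_entropy(p: float) -> float:
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Capacity (bits per channel use) of a binary symmetric channel
    # with crossover probability p.
    return 1.0 - binary_entropy(p)

assert bsc_capacity(0.0) == 1.0   # noiseless channel: 1 bit per use
assert bsc_capacity(0.5) == 0.0   # pure noise: nothing gets through
assert bsc_capacity(1.0) == 1.0   # always flipped: deterministic, so invert
```

Shannon's noisy-channel coding theorem says reliable communication is possible at any rate strictly below C, which is why C = 0 at p = 0.5 matches intuition: a channel that flips fair coins conveys nothing.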





