
Self-information

We will refer to the combination of alphabet and probability measure as an ensemble. The self information of the mth symbol in the ensemble is now defined as... [Pg.195]
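The definition referred to is the standard one: for the mth symbol u_m occurring with probability P(u_m),

    I(u_m) = -log P(u_m),

so the less probable a symbol, the more self information its occurrence carries.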

The logarithm base in Eq. (4-5) determines the units of self information. The two most common units are bits (base 2) and nats (base e); the conversion between them is given by... [Pg.195]
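Since log_2 x = ln x / ln 2, the conversion is the standard one:

    I [bits] = I [nats] / ln 2,   i.e., 1 nat = log_2 e ≈ 1.443 bits.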

The most important characteristic of self information is that it is a discrete random variable; that is, it is a real-valued function of a symbol in a discrete ensemble. As a result, it has a distribution function, an average, a variance, and in fact moments of all orders. The average value of self information has such fundamental importance in information theory that it is given a special symbol, H, and the name entropy. Thus... [Pg.196]
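As a minimal Python sketch of these definitions (the function and variable names are ours, for illustration only): self information is a function of a symbol's probability, and entropy is its average over the ensemble.

    import math

    def self_information(p, base=2.0):
        # Self information of an outcome of probability p,
        # in bits (base 2) or nats (base e).
        return -math.log(p, base)

    def entropy(probs, base=2.0):
        # Entropy H: the average self information of the ensemble.
        return sum(p * self_information(p, base) for p in probs if p > 0)

    probs = [0.5, 0.25, 0.25]          # an example probability measure
    print(entropy(probs))              # 1.5 bits
    print(entropy(probs, math.e))      # ~1.0397 nats (= 1.5 * ln 2)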

As an example of self information and entropy, consider the ensemble, U, consisting of the binary digits 0 and 1. Then if we let p = Pr(1),... [Pg.197]
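With p = Pr(1) and 1 - p = Pr(0), the entropy works out to the familiar binary entropy function

    H(U) = -p log_2 p - (1 - p) log_2 (1 - p)  bits,

which vanishes at p = 0 and p = 1 (no uncertainty) and attains its maximum of 1 bit at p = 1/2.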

The self information of a point in this product space is given, as before, by... [Pg.198]
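That is, for a sequence u = (u_1,..., u_N) in the product space, I(u) = -log Pr(u), with Pr(u) the probability of the sequence as a whole.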

Thus, the self information of a sequence of N symbols from a discrete memoryless source is the sum of N independent random variables, namely the self informations of the individual symbols. An immediate consequence of this is that the entropy of the sequence, H(U_N), is the average value of a sum of random variables, so that... [Pg.198]
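Spelled out under the memoryless assumption: Pr(u) = Pr(u_1) Pr(u_2) ... Pr(u_N), so I(u) = I(u_1) + ... + I(u_N), and averaging both sides gives the standard result H(U_N) = N H(U).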

Note that if the self information of each sequence can be made equal to the number of binary digits in its code word, then Nb = NH(U); thus the self information of a sequence in bits is the number of binary digits that should ideally be used to represent the sequence. [Pg.203]
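As a purely illustrative example: a source with H(U) = 0.5 bits per symbol emitting N = 1000 symbols would ideally be encoded with Nb = 1000 × 0.5 = 500 binary digits, half the length of the raw symbol sequence.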

Mutual Information.—In the preceding sections, self information was defined and interpreted as a fundamental quantity associated with a discrete memoryless communication source. In this section we define, and in the next section interpret, a measure of the information being transmitted over a communication system. One might at first be tempted to simply analyze the self information at each point in the system, but if the channel output is statistically independent of the input, the self information at the output of the channel bears no connection to the self information of the source. What is needed instead is a measure of the information in the channel output about the channel input. [Pg.205]
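The measure meant here is the standard mutual information between an input event x and an output event y,

    I(x; y) = log [P(x|y) / P(x)] = log [P(x, y) / (P(x) P(y))],

which is zero whenever the output is statistically independent of the input, in line with the intuition above.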

These are referred to as the entropy (or average self information) of X given Y, the entropy of Y given X, and the entropy of X and Y. It follows immediately from these definitions that... [Pg.206]
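The standard identities that follow are

    H(XY) = H(X) + H(Y|X) = H(Y) + H(X|Y),
    I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X).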

It will now be proven that the average self information is always non-negative. [Pg.207]
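A sketch of the usual argument: every probability P satisfies 0 < P ≤ 1, so each self information -log P is non-negative, and the average of a non-negative random variable is itself non-negative.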

Let a discrete memoryless source have an M-letter alphabet, u1,..., uM, and a probability measure, Pr(u1),..., Pr(uM). Let Ts be the time interval between successive letters of a sequence from this source. Then we define the rate of the source as the average self information per unit time,... [Pg.215]
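In symbols, R = H(U)/Ts. As a hypothetical example, a source with H(U) = 2 bits per letter and Ts = 1 ms would have rate R = 2000 bits per second.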



The mutual information of an event with itself defines its self-information: I(i;i) = I(i) = log[P(i|i)/p_i] = -log p_i, since P(i|i) = 1. It vanishes when p_i = 1, i.e., when there is no uncertainty about the occurrence of a_i, so that the occurrence of this event removes no uncertainty and hence conveys no information. This quantity provides a measure of the uncertainty about the occurrence of the event (i.e., the information received when the event occurs). Shannon entropy can thus be interpreted as the mean value of the self-information in all individual events, S(p) = Σ_i p_i I(i). One similarly defines the average mutual information in two probability distributions as the probability-weighted mean value of the mutual information quantities for the individual joint events... [Pg.162]
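A short Python sketch (our own construction, not taken from the source) computes this average mutual information from a joint distribution given as a matrix; the binary-symmetric-channel numbers in the example are assumed purely for illustration.

    import math

    def mutual_information(joint, base=2.0):
        # Average mutual information
        # I(X;Y) = sum_ij P(i,j) * log[ P(i,j) / (p_i * q_j) ].
        px = [sum(row) for row in joint]        # marginal distribution of X
        py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
        return sum(
            pij * math.log(pij / (px[i] * py[j]), base)
            for i, row in enumerate(joint)
            for j, pij in enumerate(row)
            if pij > 0
        )

    # Joint distribution of a binary symmetric channel with
    # crossover probability 0.1 and equiprobable inputs:
    joint = [[0.45, 0.05],
             [0.05, 0.45]]
    print(mutual_information(joint))  # ~0.531 bits (= 1 - H(0.1))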





