Big Chemical Encyclopedia


Autocorrelation coefficients

Another approach employing the autocorrelation coefficients as descriptors was suggested by Gasteiger et al. [22]. They used neural networks as a working tool for solving a similarity problem. [Pg.311]

These first components of the autocorrelation coefficients of the seven physicochemical properties were put together with the other 15 descriptors, giving 22 descriptors. Pairwise correlation analysis was then performed: a descriptor was eliminated if the correlation coefficient was equal to or higher than 0.90, and four descriptors (molecular weight, the number of carbon atoms, and the first components of the 2D autocorrelation coefficients for atomic polarizability and π-charge) were removed. This left 18 descriptors. [Pg.499]

If the quantities x and y are different, then the correlation function is sometimes referred to as a cross-correlation function. When x and y are the same, the function is usually called an autocorrelation function. An autocorrelation function indicates the extent to which the system retains a memory of its previous values (or, conversely, how long it takes the system to lose its memory). A simple example is the velocity autocorrelation coefficient, whose value indicates how closely the velocity at a time t is correlated with the velocity at time zero. Some correlation functions can be averaged over all the particles in the system (as can the velocity autocorrelation function), whereas other functions are a property of the entire system (e.g. the dipole moment of the sample). The value of the velocity autocorrelation coefficient can be calculated by averaging over the N atoms in the simulation ... [Pg.392]
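The averaging over atoms and time origins described above can be sketched in a short routine. This is an illustrative implementation, not code from the cited text; it assumes velocities stored as a NumPy array of shape (frames, atoms, 3).

```python
import numpy as np

def velocity_autocorrelation(velocities):
    """Normalized velocity autocorrelation function C(t).

    velocities: array of shape (n_frames, n_atoms, 3), e.g. from an MD
    trajectory. The dot product v(t0) . v(t0 + t) is averaged over all
    N atoms and over all available time origins t0.
    """
    n_frames = velocities.shape[0]
    c = np.empty(n_frames)
    for t in range(n_frames):
        # per-atom dot products between frames separated by lag t
        dots = np.sum(velocities[:n_frames - t] * velocities[t:], axis=2)
        c[t] = dots.mean()
    return c / c[0]  # normalize so that C(0) = 1
```

By construction C(0) = 1, and for uncorrelated motion C(t) decays toward zero as the lag grows.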

Here τ is a correlation length, which grows as correlation in the data grows. The net effect of τ is to reduce the effective number of independent data points; τ is calculated from the autocorrelation coefficients for the series of data ... [Pg.22]
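The calculation of τ from the autocorrelation coefficients can be sketched as follows. Truncating the sum at the first non-positive coefficient is a common heuristic, not necessarily the rule used in the cited text, and the function name is my own.

```python
import numpy as np

def correlation_length(x, max_lag=None):
    """Integrated correlation length tau = 1 + 2 * sum_k rho_k, where rho_k
    is the sample autocorrelation coefficient at lag k. The effective number
    of independent points in the series is then n / tau."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = n // 2
    xc = x - x.mean()
    var = np.dot(xc, xc) / n
    tau = 1.0
    for k in range(1, max_lag):
        rho = np.dot(xc[:-k], xc[k:]) / (n * var)
        if rho <= 0.0:  # truncate once the correlation has decayed
            break
        tau += 2.0 * rho
    return tau
```

For uncorrelated data τ stays close to 1; for a strongly correlated series it approaches the number of steps over which the data remember their past.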

It can be seen that the series is correlated with itself at a lag of one: the series has a memory for the first preceding value, and all cases of the series are dependent on the preceding value. The statistical expression for a relationship between two variables is the correlation coefficient (see Section 2.4.2). The resulting correlation coefficient between the variables x(t) and x(t - 1) is the autocorrelation coefficient [KATEMAN, 1987], here for a lag of 1. If such a relationship between the values x(t) and x(t - 1) exists, we can formulate a linear regression expression ... [Pg.222]
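A minimal numerical illustration of the lag-1 autocorrelation coefficient and the associated linear regression; the function names and the test series are my own, not from the cited text.

```python
import numpy as np

def lag1_autocorrelation(x):
    """Correlation coefficient between x(t) and x(t - 1)."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[1:], x[:-1])[0, 1]

def lag1_regression(x):
    """Least-squares fit of x(t) = a + b * x(t - 1); returns (a, b)."""
    x = np.asarray(x, dtype=float)
    b, a = np.polyfit(x[:-1], x[1:], 1)
    return a, b
```

For a first-order autoregressive series x(t) = 0.8 x(t - 1) + noise, both the lag-1 autocorrelation coefficient and the fitted slope b recover a value close to 0.8.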

In the multivariate case, the significant cross-correlation or autocorrelation coefficients for each variable add up to the significant multivariate correlation value. [Pg.230]

Unfortunately, there is great scope for confusion, as two distinct techniques include the phrase maximum entropy in their names. The first technique, due to Burg [135], uses the autocorrelation coefficients of the time-series signal and is effectively an alternative means of calculating linear prediction coefficients. It has become known as the maximum-entropy method (MEM). The second technique, which is more directly rooted in information theory, estimates a spectrum with the maximum entropy (i.e. one that assumes the least about its form) consistent with the measured FID. This second technique has become known as maximum-entropy reconstruction (MaxEnt). The two methods will be discussed only briefly here; further details can be found in references 24, 99, 136 and 137. Note that Laue et al. [136] describe the MaxEnt technique, although they refer to it as MEM. [Pg.109]

Burg [135] derived a means of estimating a power spectrum from the limited number of autocorrelation coefficients of a short time series, without adding extra information. It can be shown that this approach leads to equations identical to those of an autoregressive LP method, with the power spectrum P(f) given by ... [Pg.109]
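Burg's recursion and the resulting autoregressive power spectrum can be sketched as below. This is a textbook formulation of the method, not the code from reference 135, and the function and parameter names are illustrative.

```python
import numpy as np

def burg(x, order):
    """Burg's method: AR coefficients a (with a[0] = 1) and the final
    prediction-error power e, estimated without windowing or zero-padding."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    e = np.dot(x, x) / len(x)
    ef, eb = x.copy(), x.copy()  # forward / backward prediction errors
    for _ in range(order):
        f, b = ef[1:], eb[:-1]
        # reflection coefficient minimizing the combined error power
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        ef, eb = f + k * b, b + k * f  # Levinson-style error update
        e *= 1.0 - k * k
    return a, e

def burg_spectrum(x, order, n_freq=512):
    """AR power spectrum P(f) = e / |A(f)|^2 on frequencies 0 .. 0.5 per sample."""
    a, e = burg(x, order)
    freqs = np.linspace(0.0, 0.5, n_freq)
    ks = np.arange(len(a))
    A = np.exp(-2j * np.pi * np.outer(freqs, ks)) @ a
    return freqs, e / np.abs(A) ** 2
```

Applied to a noisy sinusoid, the spectrum shows a sharp peak at the sinusoid's frequency even for short records, which is the main attraction of the method for truncated FIDs.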

Sadowski et al. [49] have described the use of 3D autocorrelation vectors that are based on the electrostatic potential measured on the molecular surface. The electrostatic potential was evaluated over 12 different distances, giving 12 autocorrelation coefficients per molecule. The vectors were calculated for the molecules in two different combinatorial libraries: a xanthene library and a cubane library. The compounds were then used to train a Kohonen network, which was able to separate the two libraries. [Pg.60]

Taking the inverse Fourier transform of the exact expression in Eq. (31) yields the autocorrelation coefficient [23]... [Pg.33]

Because we are working with stationary variables, the autocorrelation gives no information on the origin of time, so it can only depend on the time difference s. The autocorrelation coefficient is the correlation coefficient between the process at time t and at time t + s. [Pg.107]

Autocorrelation coefficients encode general shape and the distribution of atomic properties; their computation takes roughly 1 to 100 s per molecule. [Pg.569]

Autocorrelation coefficients capture similarity of shape and of possible interaction sites, and similarities between different chemical classes. [Pg.569]

Listing 1 Examples of descriptor sets of the drug molecule diclofenac: a) molecular fingerprints [21]; b) substructure descriptors [22]; c) autocorrelation coefficients, based on the three-dimensional interaction potential of the molecule [23]. [Pg.570]

Autocorrelation coefficients are used to transform a pattern of atom properties into a representation that allows comparison of molecules without needing to find the correct atom-by-atom superposition [60, 61]. Any atom property P (atomic charge, lipophilicity parameter, topological or electrotopological index, etc.) can be used as input ... [Pg.578]

The autocorrelation coefficient A describes the correlation of any atomic property over atom pairs (a, b) separated by a distance of i bonds ((a, b), d = i). The products of the properties of all atom pairs with the same distance i are summed and result in one autocorrelation coefficient ... [Pg.578]
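The summation just described can be sketched as follows, with bond distances obtained by breadth-first search over the molecular graph. The function names, and the inclusion of the d = 0 self term, are my own illustrative choices.

```python
from collections import deque

def bond_distances(n_atoms, bonds):
    """All-pairs shortest path lengths, counted in bonds, via BFS."""
    adj = [[] for _ in range(n_atoms)]
    for a, b in bonds:
        adj[a].append(b)
        adj[b].append(a)
    dist = [[-1] * n_atoms for _ in range(n_atoms)]
    for start in range(n_atoms):
        dist[start][start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if dist[start][v] == -1:
                    dist[start][v] = dist[start][u] + 1
                    queue.append(v)
    return dist

def topological_autocorrelation(props, bonds, max_lag):
    """A(d) = sum over atom pairs (a, b) separated by d bonds of P_a * P_b,
    for d = 0 .. max_lag; props[a] is the chosen atom property P_a."""
    n = len(props)
    dist = bond_distances(n, bonds)
    A = [0.0] * (max_lag + 1)
    for a in range(n):
        for b in range(a, n):
            d = dist[a][b]
            if 0 <= d <= max_lag:
                A[d] += props[a] * props[b]
    return A
```

For a three-atom chain with properties (1, 2, 3), A(1) = 1·2 + 2·3 = 8 and A(2) = 1·3 = 3: the vector records which property magnitudes occur at which bond separations.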

The set of all computed A values is a descriptor with the ability to identify typical patterns of properties in a molecule, e.g., the distance between two polar functional groups or the distance between lipophilic groups. Because of the importance of functional groups as interaction sites of drug molecules, the autocorrelation coefficients are very helpful in classifying or describing drugs. [Pg.579]

Because the autocorrelation function uses the number of bonds to describe the distance between two atoms in a molecule, autocorrelation coefficients are topological descriptors. In addition to the topology information, however, they include the atom properties considered. [Pg.579]

More time-consuming computations are needed to obtain autocorrelation coefficients based on molecular surface properties [88, 89]. Atom properties (such as... [Pg.585]

Autocorrelation coefficients are also calculated on the basis of three-dimensional molecular interaction fields (e.g., the MIP, CoMFA or CoMSIA field). These fields are generated by mapping atom properties onto the spatial neighborhood of the molecule [23]. Distances between grid points located in the space around the molecules are used as input for the autocorrelation algorithm. [Pg.586]

The invariance to translation and rotation is achieved by integration over all five independent modes of motion (i.e., translational movement in the x, y or z directions and rotation around two independent axes). The autocorrelation coefficient F(r) is calculated from all pairs of points x and y separated by a distance r, with the properties f(x) and f(y) ... [Pg.586]
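Because F(r) depends only on pairwise distances, it is unchanged by translating or rotating the point set. A binned sketch of the calculation follows; the bin layout and names are my own assumptions, not the scheme of the cited work.

```python
import numpy as np

def spatial_autocorrelation(coords, props, r_max, n_bins):
    """F(r): sum of f(x) * f(y) over all point pairs (x, y), collected into
    distance bins of width r_max / n_bins. Because only pair distances enter,
    the result is invariant to translation and rotation of the point set."""
    coords = np.asarray(coords, dtype=float)
    props = np.asarray(props, dtype=float)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    F = np.zeros(n_bins)
    for i in range(len(props) - 1):
        # distances and property products from point i to all later points
        d = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
        p = props[i] * props[i + 1:]
        idx = np.searchsorted(edges, d, side="right") - 1
        ok = (idx >= 0) & (idx < n_bins)
        np.add.at(F, idx[ok], p[ok])
    return F
```

Applying any rigid motion to the points (rotation plus translation) leaves every pair distance, and hence every bin of F, unchanged.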

The properties A of all atoms i, j at a distance R are multiplied and added up to give one component of the RBF for the distance R; the complete RBF is a function of R [93-95]. RBF descriptors show characteristics similar to those of autocorrelation coefficients: RBFs of different molecules can be compared directly without superimposing the molecular structures. [Pg.587]
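The RBF summation can be sketched as a Gaussian-smoothed radial function; the smoothing parameter B and the grid of R values below are illustrative choices, not values from references 93-95.

```python
import numpy as np

def rbf_descriptor(coords, props, r_values, smoothing=25.0):
    """g(R) = sum over atom pairs i < j of A_i * A_j * exp(-B * (R - r_ij)^2),
    evaluated on a grid of R values; r_ij is the 3D interatomic distance and
    B (smoothing) controls how sharply each pair contributes around r_ij."""
    coords = np.asarray(coords, dtype=float)
    props = np.asarray(props, dtype=float)
    r_values = np.asarray(r_values, dtype=float)
    g = np.zeros(len(r_values))
    n = len(props)
    for i in range(n):
        for j in range(i + 1, n):
            r_ij = np.linalg.norm(coords[i] - coords[j])
            g += props[i] * props[j] * np.exp(-smoothing * (r_values - r_ij) ** 2)
    return g
```

Like the autocorrelation coefficients, the result depends only on interatomic distances, so two molecules (or conformers) can be compared component by component without superposition.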

For this reason, molecular interaction potentials based on three-dimensional structures are calculated for the compounds of the test dataset. The potentials include all possibilities of interaction between the small molecule and the enzymes, as well as the shape of the molecule. Twenty autocorrelation coefficients are derived from these potentials for each molecule and used as a descriptor set. The new descriptors are mapped in the same way as the topological descriptors by means of a self-organizing map. [Pg.602]

A combined set of 312 substructure descriptors, topological autocorrelation coefficients and physico-chemical properties. [Pg.607]

A set of 15 autocorrelation coefficients, calculated from three-dimensional molecular interaction potentials. [Pg.607]

Fig. 26 Kohonen maps of the diverse screening libraries. Gray levels indicate the population of a cell: light gray cells contain only one compound; black cells indicate the highest population for each map (a: 141, b: 12, c: 8). The chemical space on the maps is defined by a) fingerprints, b) substructure descriptors, and c) autocorrelation coefficients (from three-dimensional structure).
