Big Chemical Encyclopedia


Algorithms adaptive methods

Riedmiller, M. and Braun, H. (1993). A direct adaptive method for faster backpropagation learning: the RPROP algorithm. In Proceedings of the IEEE International Conference on Neural Networks (ICNN 93) (ed. Ruspini, H.), pp. 586-91. [Pg.113]

Liu showed how to extend Davidson's method to solve for several roots simultaneously (ref. 174), leading to what is called the Simultaneous Expansion Method, the Davidson-Liu method, or the block Davidson method. The detailed Davidson-Liu algorithm, adapted from ref. 174, is presented in Figure 5. [Pg.183]
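Figure 5 itself is not reproduced in this excerpt. As a rough illustration of the idea only, the following is a minimal NumPy sketch of a block Davidson (Davidson-Liu) iteration for the lowest few eigenpairs of a symmetric matrix, using a simple diagonal preconditioner; the test matrix, block size, and tolerances are illustrative assumptions, and the sketch does not follow the detailed algorithm of ref. 174.

```python
import numpy as np

def block_davidson(A, k=3, tol=1e-8, max_iter=50):
    """Minimal block Davidson sketch: lowest k eigenpairs of a real
    symmetric matrix A (dense here purely for illustration)."""
    n = A.shape[0]
    diag = np.diag(A)
    # Initial guesses: unit vectors on the k smallest diagonal elements.
    V = np.eye(n)[:, np.argsort(diag)[:k]]
    for _ in range(max_iter):
        V, _ = np.linalg.qr(V)                  # orthonormalize the subspace
        H = V.T @ A @ V                         # project A into the subspace
        theta, s = np.linalg.eigh(H)            # small dense eigenproblem
        theta, s = theta[:k], s[:, :k]
        X = V @ s                               # Ritz vectors
        R = A @ X - X * theta                   # residuals (column-wise)
        if np.max(np.linalg.norm(R, axis=0)) < tol:
            break
        # Davidson correction vectors with a diagonal preconditioner.
        denom = theta[np.newaxis, :] - diag[:, np.newaxis]
        denom[np.abs(denom) < 1e-12] = 1e-12
        V = np.hstack([V, R / denom])           # expand the subspace
    return theta, X

# Toy usage on a diagonally dominant test matrix.
rng = np.random.default_rng(0)
A = np.diag(np.arange(1.0, 101.0)) + 1e-3 * rng.standard_normal((100, 100))
A = 0.5 * (A + A.T)
vals, vecs = block_davidson(A)
print(vals)
```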

A doubly-adaptive method [3,7,11,12] is both order and partition adaptive [16]. This scheme chooses either to apply the order-adaptive method to the current subinterval or to split the subinterval further, by detecting the local regularity of the integrand. QXG (QXGS), an improved version of QAG (QAGS) due to Favati, Lotti and Romani [5] (henceforth FLR), is a doubly-adaptive algorithm based on recursive monotone stable (RMS) formulas [4, 6]. Ninomiya's method is less effective for oscillatory integrands than the FLR method. [Pg.2]
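As a much simplified sketch of the order-versus-partition decision (it does not use the RMS formulas of FLR or the QUADPACK machinery), the fragment below estimates the error on a subinterval from nested Gauss rules and then either raises the order, when the integrand looks locally regular, or bisects; the rules, thresholds, and test integrand are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def gauss(f, a, b, m):
    """Fixed m-point Gauss-Legendre rule on [a, b]."""
    x, w = leggauss(m)
    return 0.5 * (b - a) * np.sum(w * f(0.5 * (b - a) * x + 0.5 * (a + b)))

def doubly_adaptive(f, a, b, tol=1e-10, m=5, m_max=40):
    """Toy doubly-adaptive scheme: raise the order on smooth subintervals,
    bisect when raising the order no longer pays off."""
    coarse, fine = gauss(f, a, b, m), gauss(f, a, b, 2 * m)
    err = abs(fine - coarse)
    if err < tol:
        return fine
    # Probe local regularity: if doubling the order shrinks the error
    # estimate strongly, stay on this subinterval with a higher order.
    finer = gauss(f, a, b, 4 * m)
    if abs(finer - fine) < 0.1 * err and 4 * m <= m_max:
        return doubly_adaptive(f, a, b, tol, 2 * m, m_max)
    # Otherwise split the subinterval and share the tolerance.
    c = 0.5 * (a + b)
    return (doubly_adaptive(f, a, c, 0.5 * tol, m, m_max)
            + doubly_adaptive(f, c, b, 0.5 * tol, m, m_max))

# Example with a mildly oscillatory integrand.
print(doubly_adaptive(lambda x: np.cos(40.0 * x) / (1.0 + x * x), 0.0, 2.0))
```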

The adaptive methods that use the Gauss-Lobatto algorithms have the following pros and cons. [Pg.40]

All the adaptive methods already discussed can be parallelized: in each subinterval, the values of the integrand required by the algorithm can be calculated simultaneously. [Pg.41]
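A minimal sketch of this parallelization, assuming an expensive pure-Python integrand: the nodes of all current subintervals are gathered, evaluated at once with a process pool, and the values are then fed back into a per-subinterval rule (a three-point Simpson rule is used here only as a placeholder).

```python
import math
from concurrent.futures import ProcessPoolExecutor

def integrand(x):
    """Stand-in for an expensive integrand evaluation."""
    return math.exp(-x * x) * math.cos(10.0 * x)

def simpson(fa, fm, fb, a, b):
    return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

def parallel_pass(intervals):
    """Evaluate the integrand at all nodes of all subintervals in parallel."""
    nodes = [x for (a, b) in intervals for x in (a, 0.5 * (a + b), b)]
    with ProcessPoolExecutor() as pool:
        values = list(pool.map(integrand, nodes))
    total = 0.0
    for i, (a, b) in enumerate(intervals):
        fa, fm, fb = values[3 * i: 3 * i + 3]
        total += simpson(fa, fm, fb, a, b)
    return total

if __name__ == "__main__":
    subintervals = [(0.25 * k, 0.25 * (k + 1)) for k in range(8)]  # cover [0, 2]
    print(parallel_pass(subintervals))
```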

The success of the adaptive strategy relies on effective application of the MH algorithm at each simulation level s, which requires that the target PDF at level s (which is approximated by that constructed at level s - 1) varies with a length scale similar to the PDF at level s - 1, for s = 1, 2, ..., sq. The choice of the sequence of intermediate PDFs is thus important to the success of the adaptive method. If the updated PDF with data V is of the form [18, 268] ...
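For context on the role of the MH algorithm at one simulation level, here is a minimal random-walk Metropolis-Hastings sketch for sampling a single intermediate PDF; the target density, proposal scale, and chain length are illustrative assumptions, and the sketch does not implement the book's adaptive level-to-level scheme.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings for one intermediate target PDF.
    The proposal scale `step` must match the length scale of the target,
    which is why neighbouring levels should have comparable length scales."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logp = log_target(x)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)
        logp_prop = log_target(prop)
        if np.log(rng.random()) < logp_prop - logp:   # accept/reject step
            x, logp = prop, logp_prop
        samples[i] = x
    return samples

# Toy intermediate PDF: an unnormalized two-dimensional Gaussian.
chain = metropolis_hastings(lambda x: -0.5 * np.sum(x * x), x0=[3.0, -2.0], step=0.8)
print(chain.mean(axis=0), chain.std(axis=0))
```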

In addition to basis set expansions, there are various numerical methods for parameterizing orbitals, including numerical basis sets of the form φ(r) = Y_lm(r) f(r), in which the radial function f(r) does not have an analytical form but is evaluated by a spline procedure [117]. Numerical orbitals may be more flexible than STO or GTO basis sets, but their use is more computationally demanding. Wavelet representations of orbitals [118] are exceptionally flexible as well and have an intriguing multiresolution property: wavelet algorithms adaptively increase the flexibility of the orbital in regions where the molecular energy depends sensitively on the precision of the orbital, and use coarser descriptions where precision is less essential. [Pg.269]
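As a small illustration of the numerical-basis idea (not of the wavelet machinery), the sketch below stores a radial function f(r) only on a grid and evaluates the orbital φ(r) = Y_lm f(r) anywhere by cubic-spline interpolation; the grid, the hydrogen-like shape chosen for f(r), and the quantum numbers are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.special import sph_harm

# Tabulate a radial function on a grid; a hydrogen-like 2p shape is used
# here only as a stand-in for a numerically optimized f(r).
r_grid = np.linspace(1e-6, 20.0, 400)
f_grid = r_grid * np.exp(-0.5 * r_grid)
f_spline = CubicSpline(r_grid, f_grid)        # numerical radial part f(r)

def numerical_orbital(r, theta, phi, l=1, m=0):
    """phi(r) = Y_lm(theta, phi) * f(r), with f(r) known only on a grid.
    Note: scipy's sph_harm takes (m, l, azimuthal, polar)."""
    return sph_harm(m, l, phi, theta) * f_spline(r)

print(numerical_orbital(1.5, 0.3, 1.0))
```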

Three solution algorithms based on the method of lines for systems of parabolic differential equations are tested by simulation of a reverse-flow reactor for exhaust air purification. The solutions are compared with regard to solution quality and the computing time needed. It will be shown that only the fully adaptive method guarantees sufficient solution quality. [Pg.51]
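As a generic illustration of the method-of-lines approach (not of the reverse-flow reactor model or of the adaptive schemes compared in the study), the sketch below semidiscretizes a one-dimensional diffusion equation on a fixed grid and passes the resulting ODE system to a stiff integrator; the equation, grid, and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of lines for u_t = D * u_xx on x in [0, 1] with u(0) = u(1) = 0.
D, n = 1e-3, 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

def rhs(t, u):
    """Central-difference second derivative turns the PDE into an ODE system."""
    dudt = np.zeros_like(u)
    dudt[1:-1] = D * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return dudt                                  # boundary nodes held fixed

u0 = np.exp(-200.0 * (x - 0.5) ** 2)             # initial concentration pulse
sol = solve_ivp(rhs, (0.0, 50.0), u0, method="BDF", rtol=1e-6, atol=1e-9)
print(sol.y[:, -1].max())                        # pulse has spread and decayed
```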

The adaptive estimation of the pseudo-inverse parameters a(n) consists of blocks C and E (Fig. 1) if the transformed noise has unknown properties. Block C performs the restoration of the posterior PDD function w(a,n) from the data a(n) corrupted by the transformed noise. It includes methods and algorithms for PDD function restoration from empirical data [8], which are based on empirical averaging. Because the noise is assumed to be a stationary process with zero mean value and the image parameters are constant, the PDD function w(a,n) converges, at least, to the real distribution. The posterior PDD function is used to build a feedback loop to block B and serves as a direct input for the estimator E. For the given estimation criteria f(a,d), an optimal estimate of a(n) can be found from the expression... [Pg.123]
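A tiny numerical illustration of the empirical-averaging idea behind block C, under the stated assumptions of zero-mean stationary noise and constant parameters: estimates built by averaging repeated noisy observations concentrate around the true value, and their histogram serves as an empirical density. The parameter value, noise level, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
a_true = 2.5        # constant parameter (assumed value, for illustration)
sigma = 0.4         # level of the zero-mean stationary noise (assumed)

def estimate(n_obs):
    """Empirical average of n_obs noisy observations of a_true."""
    return np.mean(a_true + sigma * rng.standard_normal(n_obs))

for n_obs in (10, 100, 1000):
    ests = np.array([estimate(n_obs) for _ in range(2000)])
    density, edges = np.histogram(ests, bins=30, density=True)  # empirical PDD
    print(n_obs, round(ests.mean(), 4), round(ests.std(), 4))
```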

Concomitantly with the increase in hardware capabilities, better software techniques will have to be developed. It will pay us to continue to learn how nature tackles problems. Artificial neural networks are still a far cry from the capabilities of the human brain, and the information processing of the human brain leaves plenty of room for developing more powerful artificial neural networks. Over millions of years, nature has developed efficient optimization methods for adapting to changes in the environment. The development of evolutionary and genetic algorithms will continue. [Pg.624]

There are two basic types of unconstrained optimization algorithms: (1) those requiring function derivatives and (2) those that do not. The nonderivative methods are of interest in optimization applications because these methods can be readily adapted to the case in which experiments are carried out directly on the process. In such cases, an actual process measurement (such as yield) can be the objective function, and no mathematical model for the process is required. Methods that do not require derivatives are called direct methods and include the sequential simplex (Nelder-Mead) method and Powell's method. The sequential simplex method is quite satisfactory for optimization with two or three independent variables, is simple to understand, and is fairly easy to execute. Powell's method is more efficient than the simplex method and is based on the concept of conjugate search directions. [Pg.744]
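A minimal SciPy illustration of both direct methods, using an analytic function as a stand-in for what would, in the process setting, be an experimentally measured yield; the objective, variable names, and starting point are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def measured_yield(x):
    """Stand-in objective; in practice this could be an actual plant
    measurement (negated, because minimize() minimizes)."""
    t, c = x                       # e.g. temperature and concentration settings
    return -(100.0 - (t - 3.0) ** 2 - 2.0 * (c - 1.5) ** 2)

x0 = np.array([0.0, 0.0])
for method in ("Nelder-Mead", "Powell"):          # both are derivative-free
    res = minimize(measured_yield, x0, method=method)
    print(method, res.x.round(4), round(-res.fun, 4), res.nfev)
```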


