
** Bayesian statistics and parameter estimation **

** Correcting Parameter Estimates for Statistical Bias **

** Parameter Estimation and Statistical Linearization **

** Parameter Estimation and Statistical Testing of Models **

Parameter estimation is a procedure for taking the unit measurements and reducing them to a set of parameters for a physical (or, in some cases, relational) mathematical model of the unit. Statistical interpretation tempered with engineering judgment is required to arrive at realistic parameter estimates. Parameter estimation can be an integral part of fault detection and model discrimination. [Pg.2572]

First, the parameter estimate may be representative of the mean operation for that time period or of an extreme, depending upon the set of measurements upon which it is based. This arises because of the normal fluctuations in unit measurements. Second, the statistical uncertainty, typically unknown, in the parameter estimate casts a confidence interval around the parameter estimate. Consequently, apparently large differences in mean parameter values for two different periods may be statistically insignificant. [Pg.2577]
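
The point above, that an apparently large difference between two periods' mean parameter values may not be statistically significant, can be checked with a standard two-sample test. This is a minimal sketch, not from the source; the parameter values are invented for illustration:

```python
import numpy as np

# Hypothetical parameter estimates from two operating periods
period_a = np.array([1.02, 0.98, 1.05, 0.97, 1.01])
period_b = np.array([1.10, 1.04, 1.12, 1.00, 1.09])

def welch_t(x, y):
    """Welch's t statistic for the difference of two sample means
    (unequal variances allowed)."""
    vx = x.var(ddof=1) / len(x)
    vy = y.var(ddof=1) / len(y)
    return (x.mean() - y.mean()) / np.sqrt(vx + vy)

t_stat = welch_t(period_a, period_b)
```

Comparing |t| against the appropriate t quantile (rather than eyeballing the means) is what makes the "large difference" claim defensible.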

Parameter estimation to fit the data is carried out with VARY YM Y1 Y2, FIT M, and OPTIMIZE. The result is optimized values for Ym (0.7835), Y1 (0.6346), and Y2 (1.1770). The statistical summary shows that the residual sum of squares decreases from 0.494 to 0.294 with the parameter optimization compared to that with the starting values (Ym = Y1 = Y2 = 1.0). The values after optimization of Ym, Y1, and Y2 are shown in Figure 2, which illustrates the anchor-pivot method and forced linearization with optimization of the initiator parameters through Y1 and Y2. [Pg.314]

Optimal parameter estimation, statistical inference, and model predictions [Pg.52]

Compute statistical properties of parameter estimates (see Chapter 11). [Pg.88]

Appendix B. Parameter estimation and statistical analysis of regression [Pg.539]

Once we have estimated the unknown parameter values in a linear regression model and the underlying assumptions appear to be reasonable, we can proceed and make statistical inferences about the parameter estimates and the response variables. [Pg.32]
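
The inference step described above, going from linear-regression point estimates to statements about their uncertainty, can be sketched as follows. This is a generic illustration (the data are synthetic, not from the source): the parameter covariance matrix follows from the residual variance and the design matrix.

```python
import numpy as np

# Synthetic data roughly following y = 1 + 2x (illustrative only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares estimates
resid = y - X @ beta
dof = len(y) - X.shape[1]                      # degrees of freedom
s2 = resid @ resid / dof                       # residual variance
cov = s2 * np.linalg.inv(X.T @ X)              # parameter covariance matrix
se = np.sqrt(np.diag(cov))                     # standard errors
```

Confidence intervals for each parameter are then `beta[i] ± t * se[i]`, with `t` the appropriate Student-t quantile for `dof` degrees of freedom.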

While many methods for parameter estimation have been proposed, experience has shown some to be more effective than others. Since most phenomenological models are nonlinear in their adjustable parameters, the best estimates of these parameters can be obtained from a formalized method which properly treats the statistical behavior of the errors associated with all experimental observations. For reliable process-design calculations, we require not only estimates of the parameters but also a measure of the errors in the parameters and an indication of the accuracy of the data. [Pg.96]
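
For a nonlinear phenomenological model, the combination asked for above, parameter estimates plus a measure of their errors, is what a nonlinear least-squares routine returns via the parameter covariance matrix. A minimal sketch with an assumed first-order response model (the model, parameter values, and noise level are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, k):
    """Hypothetical first-order response: y = a * (1 - exp(-k t))."""
    return a * (1.0 - np.exp(-k * t))

t = np.linspace(0.0, 10.0, 20)
rng = np.random.default_rng(0)
y = model(t, 2.0, 0.5) + rng.normal(0.0, 0.01, t.size)  # noisy "data"

# popt: best estimates; pcov: estimated parameter covariance matrix
popt, pcov = curve_fit(model, t, y, p0=[1.0, 1.0])
perr = np.sqrt(np.diag(pcov))   # standard errors of the parameters
```

The standard errors in `perr` provide exactly the "measure of the errors in the parameters" that reliable process-design calculations require.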

Given an EoS, the objective of the parameter estimation problem is to compute optimal values for the interaction parameter vector, k, in a statistically correct and computationally efficient manner. Those values are expected to enhance the correlational ability of the EoS without compromising its ability to predict the correct phase behavior. [Pg.229]

The application of optimisation techniques for parameter estimation requires a useful statistical criterion (e.g., least-squares). A very important criterion in non-linear parameter estimation is the likelihood or probability density function. This can be combined with an error model which allows the errors to be a function of the measured value. A simple but flexible and useful error model is used in SIMUSOLV (Steiner et al., 1986; Burt, 1989). [Pg.114]
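
To illustrate the idea of an error model in which the error scales with the measured value (this is a generic sketch, not the SIMUSOLV formulation): under a constant-relative-error assumption, maximizing the likelihood reduces to weighted least squares with weights proportional to 1/y². The model y = a·x and the data are assumed for illustration.

```python
import numpy as np

# Hypothetical measurements following y ~ a * x with errors
# proportional to the measured value (constant relative error).
x = np.array([1.0, 2.0, 4.0, 8.0])
y = np.array([2.1, 3.9, 8.2, 15.8])

w = 1.0 / y**2   # weights implied by the relative-error model

# Weighted least squares for the slope a in y = a * x:
# minimize sum_i w_i * (y_i - a * x_i)^2
a_hat = np.sum(w * x * y) / np.sum(w * x**2)
```

With a different error model (e.g., constant absolute error), the weights change and so does the estimate, which is why the choice of error model matters.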

The estimates are the maximum likelihood estimates determined by NONMEM. %RSE is the percent relative error calculated by dividing the asymptotic standard error by the parameter estimate. Statistical significance is the significance level as determined by the log likelihood difference. NT = not tested. [Pg.712]
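
The %RSE quantity defined above is a one-line computation; this sketch (with invented numbers) just makes the definition concrete:

```python
def percent_rse(estimate, standard_error):
    """%RSE: asymptotic standard error expressed as a percentage
    of the parameter estimate."""
    return 100.0 * standard_error / estimate

# e.g. a hypothetical estimate of 5.0 with standard error 0.4
rse = percent_rse(5.0, 0.4)
```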

While the statistical weighting is elegant and rigorous if the uncertainties are known, its applicability is limited because the uncertainties are seldom known. Commercial simulator models are as yet unable to iterate on the parameter estimates against the unit measurements. Moreover, the focus should be on a limited subset of the complete measurement set. [Pg.2573]

In the maximum-likelihood method used here, the "true" value of each measured variable is also found in the course of parameter estimation. The differences between these "true" values and the corresponding experimentally measured values are the residuals (also called deviations). When there are many data points, the residuals can be analyzed by standard statistical methods (Draper and Smith, 1966). If, however, there are only a few data points, examination of the residuals for trends, when plotted versus other system variables, may provide valuable information. Often these plots can indicate at a glance excessive experimental error, systematic error, or "lack of fit." Data points which are obviously bad can also be readily detected. If the model is suitable and if there are no systematic errors, such a plot shows the residuals randomly distributed with zero means. This behavior is shown in Figure 3 for the ethyl-acetate-n-propanol data of Murti and Van Winkle (1958), fitted with the van Laar equation. [Pg.105]
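
The basic residual checks described above, zero mean and no trend against other variables, can be computed directly. A minimal sketch with synthetic data (not the ethyl-acetate/n-propanol data from the source):

```python
import numpy as np

# Synthetic data for a straight-line fit (illustrative only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.0, 2.1, 2.9, 4.0])

b1, b0 = np.polyfit(x, y, 1)     # slope, intercept
resid = y - (b0 + b1 * x)        # residuals = observed - fitted

mean_resid = resid.mean()        # ~0 by construction for least squares
trend = np.corrcoef(resid, x)[0, 1]  # should be ~0 if no systematic error
```

A nonzero mean or a clear correlation of the residuals with a system variable is the numerical counterpart of the "trend at a glance" the plots reveal.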

This is all very straightforward in principle, but in practice there may be difficulties. These arise from two sources: the limited temperature range that is usually studied and the very long extrapolation to 1/T = 0 that is required to estimate ln A. It is, therefore, advisable to make use of statistical techniques in order to obtain the best parameter estimates, and we now consider this problem. We developed the necessary mathematics in Section 2.3. [Pg.247]
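
The extrapolation in question can be made concrete with a standard Arrhenius fit: regressing ln k on 1/T gives Ea from the slope and ln A from the intercept at 1/T = 0. This is a generic sketch; the rate constants below are generated from assumed values of Ea and A, not taken from the source.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Synthetic rate constants from assumed Ea = 80 kJ/mol, A = 1e10 (illustrative)
T = np.array([300.0, 320.0, 340.0, 360.0])          # limited temperature range
Ea_true, A_true = 80_000.0, 1.0e10
k = A_true * np.exp(-Ea_true / (R * T))

# Linear regression of ln k on 1/T: slope = -Ea/R, intercept = ln A
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea_est = -slope * R
lnA_est = intercept   # value at 1/T = 0, far outside the data range
```

Note how the sampled 1/T values (≈0.0028–0.0033) sit far from 0: the intercept is a long extrapolation, so small errors in the slope propagate into large uncertainty in ln A.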

In this paper we have endeavored to show some of the statistical principles underlying wavefront sensing. It is important to realize that the problem is inherently one of parameter estimation, because viewed as such it connects to a significant literature beyond traditional wavefront sensing. [Pg.394]

For the data, the squared correlation coefficient was 0.93 with a root mean square error of 2.2. The graph of predicted versus actual observed MS(1+4), along with the summary of fit statistics and parameter estimates, is shown in Figure 16.7. [Pg.494]

Models with insignificant overall regression, as indicated by the F-value, and with meaningless parameter estimates (judged from their confidence limits), as indicated by t-values, should be rejected. If rejecting a parameter does not lead to a physically nonsensical model structure, repeat the parameter estimation and statistical analysis. [Pg.550]
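
The parameter-level screening described above amounts to computing a t-value (estimate divided by its standard error) for each parameter and flagging those whose confidence interval spans zero. A minimal sketch with invented numbers; the |t| < 2 threshold is the common rule of thumb for roughly 95% confidence:

```python
import numpy as np

def t_values(estimates, standard_errors):
    """t-value of each parameter estimate; |t| below ~2 suggests the
    parameter is not significantly different from zero."""
    return np.asarray(estimates) / np.asarray(standard_errors)

# Hypothetical fit: one well-determined and one meaningless parameter
tv = t_values([4.0, 0.05], [0.5, 0.4])
insignificant = np.abs(tv) < 2.0
```

A parameter flagged here is a candidate for removal, after which the estimation and statistical analysis are repeated as the excerpt prescribes.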

The development of tools to ensure the use of such objective belief systems, and computational methods such as Monte Carlo simulation, have brought the Bayesian approach to the fore in recent years in areas such as parameter estimation, statistical learning, and statistical decision theory. Here, we focus our attention primarily upon parameter estimation, first restricting our discussion to single-response data. [Pg.382]

It was shown by Englezos et al. (1998) that use of the entire database can be a stringent test of the correlational ability of the EoS and/or the mixing rules. An additional benefit of using all types of phase equilibrium data in the parameter estimation database is the fact that the statistical properties of the estimated parameter values are usually improved in terms of their standard deviation. [Pg.258]

Measurements have been made in a static laboratory set-up. A simulation model for generating supplementary data has been developed and verified. A statistical data treatment method has been applied to estimate tracer concentration from detector measurements. Accuracy in parameter estimation in the range of 5-10% has been obtained. [Pg.1057]

The above assumes that the measurement statistics are known. This is rarely the case. Typically a normal distribution is assumed for the plant and the measurements. Since these distributions are used in the analysis of the data, an incorrect assumption will lead to further bias in the resultant troubleshooting, model, and parameter estimation conclusions. [Pg.2561]

The primary purpose for expressing experimental data through model equations is to obtain a representation that can be used confidently for systematic interpolations and extrapolations, especially to multicomponent systems. The confidence placed in the calculations depends on the confidence placed in the data and in the model. Therefore, the method of parameter estimation should also provide measures of reliability for the calculated results. This reliability depends on the uncertainties in the parameters, which, with the statistical method of data reduction used here, are estimated from the parameter variance-covariance matrix. This matrix is obtained as a last step in the iterative calculation of the parameters. [Pg.102]
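
The parameter variance-covariance matrix mentioned above can be approximated, after the iterative fit converges, from the Jacobian of the residuals with respect to the parameters (the Gauss-Newton form cov ≈ s²(JᵀJ)⁻¹). This is a generic sketch, not the specific data-reduction procedure of the source; the Jacobian and residuals below are invented:

```python
import numpy as np

def covariance_from_jacobian(J, residuals):
    """Approximate parameter covariance after a least-squares fit:
    cov = s^2 * (J^T J)^-1, with s^2 = SSR / (n - p)."""
    n, p = J.shape
    s2 = residuals @ residuals / (n - p)   # residual variance estimate
    return s2 * np.linalg.inv(J.T @ J)

# Toy example: 3 observations, 2 parameters (illustrative values)
J = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
r = np.array([0.1, -0.1, 0.0])
cov = covariance_from_jacobian(J, r)
```

The diagonal of `cov` gives parameter variances, and the off-diagonal terms quantify how uncertainty in one parameter is coupled to another, which is precisely the reliability information the excerpt says should accompany the calculated results.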

It would be of obvious interest to have a theoretically underpinned function that describes the observed frequency distribution shown in Fig. 1.9. A number of such distributions (symmetrical or skewed) are described in the statistical literature in full mathematical detail; apart from the normal and the t-distributions, none is used in analytical chemistry except under very special circumstances, e.g. the Poisson and the binomial distributions. Instrumental methods of analysis that have Poisson-distributed noise are optical and mass spectroscopy, for instance. For an introduction to parameter estimation under conditions of linked mean and variance, see Ref. 41. [Pg.29]
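
The "linked mean and variance" property of Poisson-distributed counts simplifies estimation considerably: the maximum-likelihood estimate of the rate is just the sample mean, and the same number estimates the variance. A minimal sketch with invented detector counts:

```python
import numpy as np

# Hypothetical photon/ion counts from repeated measurements
counts = np.array([98, 103, 95, 110, 101, 99])

lam_hat = counts.mean()                    # MLE of the Poisson rate
# Because mean = variance for a Poisson variable, the standard error
# of the estimated rate follows directly from the estimate itself:
se_lam = np.sqrt(lam_hat / counts.size)
```

This mean-variance link is also why counting instruments do not need a separately estimated noise level for weighting: the signal magnitude itself fixes the uncertainty.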


© 2019 chempedia.info