
Regression, multiple

Multiple regression can be used to develop a quantitative equation relating a dependent variable to several independent variables. In multiple linear regression, any number of independent variables may be considered. [Pg.136]

Assumptions of a multiple regression analysis are identical to those for linear regression, except that there are now p independent variables. To obtain the regression coefficient estimates b by the method of least squares, we again minimize the sum of squared errors between observed and predicted values, SSE = Σ(yi − ŷi)². [Pg.136]

The sum of squared errors between the observed and predicted values is minimized by taking the partial derivative ∂(SSE)/∂bj with respect to each parameter and setting each result equal to zero. [Pg.137]
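As an illustrative sketch (not from the cited text), the normal equations that result from setting these derivatives to zero can be solved directly with NumPy; the synthetic data and variable names here are assumptions for demonstration only.

```python
import numpy as np

# Synthetic data: 20 observations, intercept plus two independent variables.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.uniform(0, 10, 20), rng.uniform(0, 5, 20)])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(0, 0.3, 20)

# Setting each partial derivative d(SSE)/db_j to zero gives the normal
# equations (X'X) b = X'y; solving them yields the least-squares estimates.
b = np.linalg.solve(X.T @ X, X.T @ y)

residuals = y - X @ b
SSE = residuals @ residuals  # the minimized sum of squared errors
print(b, SSE)
```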

The sum in the central parentheses is the sum of squares due to regression, SSR, so that the total sum of squares partitions as SST = SSR + SSE. [Pg.137]

It is of particular importance to do statistical testing of the null hypothesis that the regression coefficients are zero, H0: b1 = b2 = ... = bp = 0. [Pg.138]
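A hedged sketch of such testing with the statsmodels package (synthetic data; the library and coefficient values are not part of the cited text): the t statistics test each individual H0: bj = 0, and the F statistic tests all slopes simultaneously.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))                     # three independent variables
y = 2.0 + X @ np.array([1.5, 0.0, -0.7]) + rng.normal(0, 0.5, 30)

fit = sm.OLS(y, sm.add_constant(X)).fit()
print(fit.tvalues)   # t statistics for the individual H0: bj = 0
print(fit.pvalues)   # corresponding p-values
print(fit.fvalue)    # overall F test of all slopes being zero
```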

In the previous section we saw how to study the dependence of an outcome variable on another variable measured at baseline. It could well be that there are several baseline variables which predict outcome and in this section we will see how to incorporate these variables simultaneously through a methodology termed multiple (linear) regression. [Pg.94]

Taking up the example from the previous section, it may be that time to disease progression depends potentially not just on size of the primary tumour, but also on other baseline characteristics, such as the age and sex of the patient. [Pg.94]

The extension of simple linear regression to deal with multiple baseline variables is somewhat difficult visually, but algebraically it is simply a matter of adding terms to the equation: time = a + b1 × size + b2 × age + b3 × sex. [Pg.95]

Now b1 measures the effect of size on time to disease recurrence, while the coefficients b2 and b3 measure the effects of age and sex respectively. More specifically, b1 and b2 are the changes in the average time to disease recurrence as size and age each increase by one unit, respectively, while b3 measures the sex effect: the average time to disease recurrence for females minus that for males. Each of these quantities, which we can estimate from the data again using the method of least squares, represents the contribution of each of the variables separately in the presence of the other variables. [Pg.95]

The questions of interest revolve around the values of the b coefficients. Do age and size of primary tumour predict time to disease recurrence? Is there a sex difference? We address these questions by formulating hypotheses of the form H0: b1 = 0, H0: b2 = 0 and H0: b3 = 0. [Pg.95]
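A minimal sketch of fitting this model in Python (entirely synthetic data; the true study data are not reproduced here), with sex coded 1 for female and 0 for male so that b3 is the female-minus-male difference:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
size = rng.uniform(1, 5, n)        # hypothetical tumour size
age = rng.uniform(40, 80, n)       # hypothetical age
sex = rng.integers(0, 2, n)        # 1 = female, 0 = male
time = 30 - 3*size - 0.1*age + 4*sex + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), size, age, sex])
a, b1, b2, b3 = np.linalg.lstsq(X, time, rcond=None)[0]
# b1, b2: change in mean time per unit increase in size and age;
# b3: mean time for females minus that for males.
print(b1, b2, b3)
```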

The highest integration of all variables and a certain combination of both sets of variables are possible by applying the method of simultaneous equations, also called the structural equations method or path analysis. In this method the correlation of the independent variables is explicitly accounted for. Furthermore, some independent variables may also be treated as dependent variables within the same system of equations. As a result of both advantages, the equations of the system are coupled, at least through common error variables. [Pg.196]

So far, the mentioned methods have one basic assumption in common: all start from linear models. [Pg.196]

For some years a family of nonlinear methods, called (artificial) neural networks, has gained some importance in chemistry in general [ZUPAN and GASTEIGER, 1993] and in analytical chemistry in particular [JANSON, 1991; KATEMAN, 1993]. These networks are black boxes that are trained in a learning phase to give the output that best fits the given responses. [Pg.196]
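For illustration only (not the networks described in the cited references), scikit-learn's MLPRegressor shows the black-box character: it is trained to reproduce the responses, but its fitted weights resist direct interpretation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, (200, 2))
y = np.sin(X[:, 0]) * X[:, 1] + rng.normal(0, 0.05, 200)   # nonlinear target

# Learning phase: fit the network so its output best matches the responses.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X, y)
print(net.score(X, y))   # fit quality; the weights themselves are opaque
```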

The principal advantage of modeling nonlinear dependencies is sometimes offset by the difficulty of interpreting the weights. [Pg.196]

From Section 2.4 we know that univariate regression models describe the dependence of one centered variable, y, on another variable, x1, e.g. by the model y = b1x1 + e. [Pg.196]


Understanding the distribution allows us to calculate the expected values of random variables that are normally and independently distributed. In least squares multiple regression, or in calibration work in general, there is a basic assumption that the error in the response variable is random and normally distributed, with a variance that follows a χ² distribution. [Pg.202]

Partial least-squares path modeling with latent variables (PLS), a newer, general method of handling regression problems, is finding wide application in chemometrics. This method allows the relations between many blocks of data, i.e., data matrices, to be characterized (32-36). Linear and multiple regression techniques can be considered special cases of the PLS method. [Pg.426]
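A minimal sketch of relating two blocks of data with scikit-learn's PLSRegression (synthetic blocks; the component count is an arbitrary choice for the example):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
X = rng.normal(size=(40, 6))                                   # predictor block
Y = X @ rng.normal(size=(6, 2)) + rng.normal(0, 0.1, (40, 2))  # response block

pls = PLSRegression(n_components=3)   # latent variables link the two blocks
pls.fit(X, Y)
print(pls.predict(X[:2]))
```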

Other chemometrics methods to improve calibration have been advanced. The method of partial least squares has been useful in multicomponent calibration (48-51). In this approach the concentrations are related to latent variables in the block of observed instrument responses. Thus PLS regression can solve the collinearity problem and provide all of the advantages discussed earlier. Principal components analysis coupled with multiple regression, often called principal component regression (PCR), is another calibration approach that has been compared and contrasted to PLS (52-54). Calibration problems can also be approached using the Kalman filter as discussed (43). [Pg.429]
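An illustrative PCR sketch (synthetic "spectra"; not from the cited calibration studies): projecting onto a few principal components removes the collinearity before the multiple regression step.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
spectra = rng.normal(size=(30, 100))   # many collinear response channels
conc = spectra[:, :3] @ np.array([0.5, 1.0, -0.2]) + rng.normal(0, 0.01, 30)

pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(spectra, conc)                 # regression on principal-component scores
print(pcr.score(spectra, conc))
```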

When experimental data is to be fit with a mathematical model, it is necessary to allow for the fact that the data has errors. The engineer is interested in finding the parameters in the model as well as the uncertainty in their determination. In the simplest case, the model is a linear equation with only two parameters, and they are found by a least-squares minimization of the errors in fitting the data. Multiple regression is just linear least squares applied with more terms. Nonlinear regression allows the parameters of the model to enter in a nonlinear fashion. The following description of maximum likelihood applies to both linear and nonlinear least squares (Ref. 231). If each measurement point yi has a measurement error Δyi that is independently random and distributed with a normal distribution about the true model y(x) with standard deviation σi, then the probability of a data set is... [Pg.501]
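A hedged sketch of this maximum-likelihood fitting with SciPy (the exponential model and data are assumptions for the example): supplying the per-point standard deviations makes least squares coincide with maximum likelihood for normal errors, and the covariance matrix gives the parameter uncertainties the text mentions.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)          # parameters enter nonlinearly

rng = np.random.default_rng(6)
x = np.linspace(0, 4, 25)
sigma = 0.05 * np.ones_like(x)         # per-point standard deviation
y = model(x, 2.0, 0.8) + rng.normal(0, sigma)

popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0], sigma=sigma,
                       absolute_sigma=True)
print(popt)                            # parameter estimates
print(np.sqrt(np.diag(pcov)))          # their uncertainties
```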

Multiple Regression. A general linear model is one expressed as a linear combination of known functions of the independent variables, y = a0 + a1 f1(x) + a2 f2(x) + ..., in which the parameters ai enter linearly. [Pg.502]
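A minimal sketch of such a model (the basis 1, x, x² and the data are assumptions, not from the handbook); the model is nonlinear in x but linear in the parameters, so ordinary least squares applies.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 3, 30)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(0, 0.1, 30)

# General linear model: y = a0*1 + a1*x + a2*x^2, linear in the a_i.
A = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)
```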

Multiple regression analysis can be executed by various programs. The one shown in the Appendix is from Mathcad 6 Plus, using the regress method. Taking the log of the rates first and averaging later gives a somewhat different result. [Pg.113]

The manual evaluation and the multiple regression results are in good agreement. The result from the log rates is still very close. The evaluation by multiple regression is shown in Appendix E. The reader is encouraged to do the manual evaluation of the effects of C and M (the m and n exponents) and compare those with the multiple regression results. [Pg.114]

Multiple regression techniques have been applied by investigators to determine the coefficients in a plume rise equation containing both of the above terms ... [Pg.296]

The following expressions can be used to estimate the temperature and enthalpy of steam. The expressions are based upon multiple regression analysis. The equation for temperature is accurate to within 1.5% at 1,000 psia. The expression for latent heat is accurate to within ±3% at 1,000 psia. The only input required is the steam pressure in psia. In the equations, t is the temperature in °F, the latent heat is in Btu/lb, and P is the pressure in psia. [Pg.494]

Miller first used Eq. (7-41) to correlate multiple variations, and this approach has more recently been subjected to considerable development. Many cross-interaction constants have been evaluated; multiple regression analysis is one technique, but Miller and Dubois et al. discuss other methods. Lee et al. consider ρxy to be a measure of the distance between groups x and y in the transition state... [Pg.332]

Indications of the relative efficiencies of transmission through either aza or sulfur in benzo derivatives can be obtained from Jaffé's empirical multiple regression approach. In these systems, just as in aza-naphthalenes (Section IV,C,1,d), knowledge of the dissociation... [Pg.349]

Bourgoyne, A. T., Jr., and F. S. Young, A multiple regression approach to optimal drilling and abnormal pressure detection, Society of Petroleum Engineers Journal, August 1974. [Pg.1380]

In the Yukawa-Tsuno equation (1959, reference 58) (equation 4), the sliding scale is provided by multiple regression on σ and (σ+ − σ), the latter term coming into play for reactions more electron-demanding than the ionization of benzoic acid. [Pg.496]

In the form of treatment developed by Taft and his colleagues since 1956 (references 60-62), the Hammett constants are analyzed into inductive and resonance parameters, and the sliding scale is then provided by multiple regression on these. Equations 5 and 6 show the basic relationships. [Pg.497]

Multiple regression on σI and σR parameters employs the dual substituent-parameter equation, which may be written as in equation 9 (reference 64). [Pg.497]
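As a sketch only (the substituent constants below are invented for illustration, not the tabulated σI and σR values), the dual substituent-parameter regression is an ordinary two-variable least-squares fit:

```python
import numpy as np

# Hypothetical substituent parameters and rate data.
sigma_I = np.array([0.0, 0.28, 0.47, 0.30, -0.01])
sigma_R = np.array([0.0, -0.55, 0.11, -0.20, -0.16])
log_k_ratio = (1.2 * sigma_I + 0.8 * sigma_R
               + np.random.default_rng(8).normal(0, 0.02, 5))

# Dual substituent-parameter equation: log(k/k0) = rhoI*sigmaI + rhoR*sigmaR.
A = np.column_stack([sigma_I, sigma_R])
(rho_I, rho_R), *_ = np.linalg.lstsq(A, log_k_ratio, rcond=None)
print(rho_I, rho_R)
```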

Very often empirical equations can be developed from plant data using multiple regression techniques. The main advantage of this approach is that the correlations are often linear, can be easily coupled to optimization algorithms, do not cause convergence problems and are easily transferred from one computer to another. However, there are disadvantages, namely,... [Pg.100]

Topliss JG, Costello RJ. Chance correlations in structure-activity studies using multiple regression analysis. J Med Chem 1972;15:1066-9. [Pg.490]

FIGURE 5.3 The Albumin in Acute Stroke (ALIAS) Phase II Trial. Data represent mean ± SEM. p-value according to multiple regression analysis. Dead patients have been censored. (a) Mean change in NIH Stroke Scale score over time since treatment in rt-PA and non-rt-PA cohorts receiving the three lowest doses (Tiers I, 0.34 mg/kg; II, 0.68 mg/kg; III, 1.03 mg/kg) and three highest doses of albumin (Tiers IV, 1.37 mg/kg; V, 1.71 mg/kg; VI, 2.03 mg/kg). [Pg.105]

Our basic methods have been detailed in previous reports (11, 12). In summary, however, our approach is basically the same as that used by Hansch and co-workers (20-22): a set of compounds, which can reasonably be expected to elicit their carcinogenic response via the same general mechanism, is chosen, and their relative biological activities, along with a set of molecular descriptors, are entered into a computer. The computer, using the relative biological response as the dependent variable, then performs stepwise multiple regression analyses (23) to select... [Pg.79]
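As a rough modern analogue (not the stepwise program of reference 23), scikit-learn's forward feature selection picks descriptors one at a time by fit quality; the descriptors and activities below are synthetic.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
descriptors = rng.normal(size=(60, 8))     # hypothetical molecular descriptors
activity = descriptors[:, [0, 3]] @ np.array([1.0, -0.8]) + rng.normal(0, 0.1, 60)

# Forward selection in the spirit of stepwise multiple regression.
sel = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                direction="forward")
sel.fit(descriptors, activity)
print(sel.get_support(indices=True))       # indices of selected descriptors
```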

Molecular Descriptors and Statistical Terms for Multiple Regression Analyses... [Pg.80]

Here, the notation R2(y | x1, x2) stands for the squared multiple correlation coefficient (or coefficient of determination) of the multiple regression of y on x1 and x2. The improvement is quite modest, suggesting once more that there is only a weak (linear) relation between the two sets of data. [Pg.319]

We can go one step further, however. Each of the above multiple regression relations is between a single variable (response) of one data set and a linear combination of the variables (predictors) from the other set. Instead, one may consider the multiple-multiple correlation, i.e. the correlation of a linear combination from one set with a linear combination of the other set. Such linear combinations of the original variables are variously called factors, components, latent variables, canonical variables or canonical variates (see also Chapters 9, 17, 29, and 31). [Pg.319]

This is already a considerable improvement. The natural question then is: which linear combination of Y-variables yields the highest R when regressed on the X-variables in a multiple regression? Canonical correlation analysis answers this question. [Pg.319]
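A brief sketch with scikit-learn's CCA (synthetic data blocks): the first pair of canonical variates attains the highest correlation available to any pair of linear combinations of the two sets.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(9)
X = rng.normal(size=(50, 4))
Y = X[:, :2] @ rng.normal(size=(2, 3)) + rng.normal(0, 0.5, (50, 3))

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)
# Correlation of the first pair of canonical variates.
print(np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1])
```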

The total residual sum of squares, taken over all elements of E, achieves its minimum when each column ej separately has minimum sum of squares. The latter occurs if each (univariate) column of Y is fitted by X in the least-squares way. Consequently, the least-squares minimization of E is obtained if each separate dependent variable is fitted by multiple regression on X. In other words, the multivariate regression analysis is essentially identical to a set of univariate regressions. Thus, from a methodological point of view nothing new is added, and we may refer to Chapter 10 for a more thorough discussion of the theory and application of multiple regression. [Pg.323]
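A quick numerical check of this equivalence (synthetic data; a sketch, not the chapter's own example): fitting all columns of Y at once gives the same coefficients as fitting each column separately.

```python
import numpy as np

rng = np.random.default_rng(10)
X = rng.normal(size=(25, 3))
Y = X @ rng.normal(size=(3, 4)) + rng.normal(0, 0.1, (25, 4))

# One multivariate least-squares fit...
B_all, *_ = np.linalg.lstsq(X, Y, rcond=None)
# ...equals the column-by-column univariate fits.
B_cols = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0]
                          for j in range(Y.shape[1])])
print(np.allclose(B_all, B_cols))   # True
```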





