Big Chemical Encyclopedia


Ordinary least squares regression independent variables

The expression xT(j)P(j - 1)x(j) in eq. (41.4) represents the variance of the predictions, y(j), at the value x(j) of the independent variable, given the uncertainty in the regression parameters P(j). This expression is equivalent to eq. (10.9) for ordinary least squares regression. The term r(j) is the variance of the experimental error in the response y(j). How to select the value of r(j) and its influence on the final result are discussed later. The expression between parentheses is a scalar; therefore, the recursive least squares method does not require the inversion of a matrix. When inspecting eqs. (41.3) and (41.4), we can see that the variance-covariance matrix only depends on the design of the experiments given by x and on the variance of the experimental error given by r, which is in accordance with the ordinary least-squares procedure. [Pg.579]
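The scalar-denominator property described above can be illustrated with a short recursive least squares sketch. The names b, P, x, y, and r mirror the text's symbols, but the straight-line model and all numerical values are invented for illustration:

```python
import numpy as np

def rls_update(b, P, x, y, r):
    """One recursive least-squares step for a new observation (x, y).

    b : current parameter estimates, shape (p,)
    P : current variance-covariance matrix of b, shape (p, p)
    x : regressor vector of the new observation, shape (p,)
    y : new response value
    r : variance of the experimental error in y
    """
    denom = r + x @ P @ x          # a scalar, so no matrix inversion is needed
    k = P @ x / denom              # gain vector
    b = b + k * (y - x @ b)        # correct b by the prediction error
    P = P - np.outer(k, x @ P)     # shrink the variance-covariance matrix
    return b, P

# Feed noise-free observations of y = 2 + 3*t one at a time
b = np.zeros(2)
P = np.eye(2) * 1e6                # large initial parameter uncertainty
for t in np.linspace(0.0, 1.0, 50):
    x = np.array([1.0, t])
    b, P = rls_update(b, P, x, 2.0 + 3.0 * t, r=1.0)
```

Note that P depends only on the x values and r, never on y, consistent with the observation that the variance-covariance matrix is fixed by the experimental design and error variance alone.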

The standard method for multivariate data of type 2a (Figure 2, X-matrix and one y-variable) is multiple linear regression (MLR), also called ordinary least squares regression (OLS). Only a few basic principles can be summarized here. The aim of data interpretation in this case is to build a linear model for the prediction of a response y from the independent variables (regressors, features) X1, X2, ..., Xp as given in equation (17) ... [Pg.353]
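A minimal sketch of fitting such a linear model y = b0 + b1*X1 + ... + bp*Xp by OLS. The data and coefficient values are invented for illustration:

```python
import numpy as np

# Simulated regressors and a response built from invented coefficients
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1]

# Prepend a column of ones so that b[0] is the intercept b0
A = np.column_stack([np.ones(len(X)), X])

# OLS solution of y = b0 + b1*X1 + b2*X2, computed stably via lstsq
b, *_ = np.linalg.lstsq(A, y, rcond=None)
```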

To model the relationship between PLA and PLR, we used each of these in ordinary least squares (OLS) multiple regression to explore the relationship between the dependent variables Mean PLR or Mean PLA and the independent variables (Berry and Feldman, 1985). OLS regression was used because the data satisfied the OLS assumptions for the model as the best linear unbiased estimator (BLUE): the distribution of errors (residuals) is normal, the errors are uncorrelated with each other and homoscedastic (constant variance among residuals), with a mean of 0. We also analyzed predicted values plotted against residuals, as they are a better indicator of non-normality in aggregated data, and found them also to be homoscedastic and independent of one another. [Pg.152]
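The residual checks described above can be sketched numerically. The simulated straight-line data and the split-half spread comparison are illustrative assumptions, not the study's actual PLA/PLR data:

```python
import numpy as np

# Simulated straight-line data with homoscedastic normal errors
rng = np.random.default_rng(2)
x = rng.uniform(0.0, 10.0, 200)
y = 4.0 + 1.5 * x + rng.normal(0.0, 1.0, 200)

# Fit by OLS and compute residuals against the predicted values
A = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ b
resid = y - fitted

mean_resid = resid.mean()          # ~0 whenever an intercept is fitted
# Crude homoscedasticity check: residual spread should be similar in the
# lower and upper halves of the predicted values
lo = resid[fitted < np.median(fitted)].std()
hi = resid[fitted >= np.median(fitted)].std()
```

In practice one would plot `fitted` against `resid` and look for funnel shapes (heteroscedasticity) or curvature, as the excerpt describes.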

Ridge regression analysis is used when the independent variables are highly interrelated and stable estimates for the regression coefficients cannot be obtained via ordinary least squares methods (Rozeboom, 1979; Pfaffenberger and Dielman, 1990). It is a biased estimator that gives estimates with smaller variance and better precision and accuracy. [Pg.169]
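A minimal sketch of the ridge estimator, using the standard closed form b = (X'X + lam*I)^-1 X'y on centered data; the nearly collinear regressors and all numerical values are invented for illustration:

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimate b = (Xc'Xc + lam*I)^-1 Xc'y on centered data."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    p = X.shape[1]
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)

# Two almost identical regressors make the OLS estimates unstable
rng = np.random.default_rng(3)
x1 = rng.normal(size=50)
x2 = x1 + rng.normal(scale=1e-3, size=50)
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.1, size=50)

b_ols = ridge(X, y, lam=0.0)       # plain least squares (lam = 0)
b_ridge = ridge(X, y, lam=1.0)     # biased but stable, smaller norm
```

The ridge solution always has a norm no larger than the OLS solution, which is the sense in which it trades a small bias for a large reduction in variance.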

The Matlab Simulink model was designed to represent the model structure and mass balance equations for SSF and is shown in Fig. 6. Shaded boxes represent the reaction rates, which have been lumped into subsystems. To solve the system of ordinary differential equations (ODEs) and to estimate unknown parameters in the reaction rate equations, the interface parameter estimation was used. This program allows the user to decide which parameters to estimate and which type of ODE solver and optimization technique to use. The user imports observed data as it relates to the input, output, or state data of the Simulink model. With the imported data as reference, the user can select options for the ODE solver (fixed step/variable step, stiff/non-stiff, tolerance, step size) as well as options for the optimization technique (nonlinear least squares/simplex, maximum number of iterations, and tolerance). With the selected solver and optimization method, the unknown independent, dependent, and/or initial state parameters in the model are determined within set ranges. For this study, nonlinear least squares regression was used with Matlab ode45, which is a Runge-Kutta (4,5) formula for non-stiff systems. The steps of nonlinear least squares regression are as follows ... [Pg.385]
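The nonlinear least squares iteration alluded to above can be sketched as a Gauss-Newton loop. A hypothetical first-order rate model stands in for the lumped SSF rate equations, and plain numpy replaces the Matlab/Simulink toolchain:

```python
import numpy as np

# Hypothetical first-order model y = a*(1 - exp(-k*t)), used here only
# as a stand-in for the SSF reaction rate equations
def model(theta, t):
    a, k = theta
    return a * (1.0 - np.exp(-k * t))

def jacobian(theta, t):
    a, k = theta
    e = np.exp(-k * t)
    return np.column_stack([1.0 - e, a * t * e])   # d/da, d/dk

# Synthetic "observed" data generated with true values a=10, k=0.5
t = np.linspace(0.0, 10.0, 40)
y_obs = model((10.0, 0.5), t)

# Gauss-Newton: linearize the model, solve a linear LS step, repeat
theta = np.array([5.0, 1.0])       # initial guess within a set range
for _ in range(20):
    r = y_obs - model(theta, t)    # residuals at the current estimate
    J = jacobian(theta, t)
    step, *_ = np.linalg.lstsq(J, r, rcond=None)
    theta = theta + step
```

In the study's actual workflow, evaluating `model` would involve integrating the ODE system with ode45 at each iteration rather than an analytical expression.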

In the case of multivariate modeling, several independent as well as several dependent variables may operate. Out of the many regression methods, we will learn about the conventional method of ordinary least squares (OLS) as well as methods that are based on biased parameter estimation and simultaneously reduce the dimensionality of the regression problem, that is, principal component regression (PCR) and the partial least squares (PLS) method. [Pg.231]
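A minimal sketch of the PCR idea mentioned above: project the centered X-matrix onto its leading principal components, regress y on the scores, and map the coefficients back to the original variables. The data and coefficients are invented for illustration:

```python
import numpy as np

def pcr(X, y, n_components):
    """Principal component regression: regress y on the leading
    principal components of centered X, then map the coefficients
    back to the original variables."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # retained loadings, shape (p, k)
    T = Xc @ V                         # scores of the retained components
    g, *_ = np.linalg.lstsq(T, yc, rcond=None)   # OLS in score space
    return V @ g                       # coefficients on the original X

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2]

b_full = pcr(X, y, n_components=3)     # all components kept: equals OLS
b_two = pcr(X, y, n_components=2)      # biased, reduced-dimension fit
```

Keeping all components reproduces the OLS solution; dropping the smallest-variance components introduces the bias that stabilizes the estimates, the same trade-off the excerpt describes.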

The goal of this study is to test hypotheses about the relationships between multiple independent variables and one dependent variable. As most of my latent constructs are measured on interval scales and I expect linear relationships between the variables, multiple linear regression analysis with ordinary least squares estimation was used (Cohen 2003; Tabachnick and Fidell 1989). The study had two thematically separate parts: the first part is focused on the antecedents of lead userness of employees in firms (n=83, hypotheses 1-3, dependent variable lead userness). In the second part, hypotheses about lead userness of employees and behavioral outcomes are tested (n=149, hypotheses 4-8, dependent variables innovative work behavior, internal boundary spanning behavior, external boundary spanning behavior, organizational... [Pg.136]



