Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Curve-fitting data visualization

A smooth function f(x) is used quite often to describe a set of data, (x1, y1), (x2, y2), ..., (xN, yN). Fitting of a smooth curve, f(x), through these points is usually done for interpolation or visual purposes (Sutton and MacGregor, 1977). This is called curve fitting. The parameters defining the curve are calculated by minimizing a measure of the closeness of fit, such as the sum of squared deviations S(k) = Σj [yj − f(xj)]². ... [Pg.2]
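A minimal sketch of this least-squares idea, assuming a quadratic form for f(x); the data values and the polynomial degree are invented for illustration and do not come from the excerpt:

```python
import numpy as np

# Hypothetical data points (x_j, y_j) generated from a known quadratic plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 25)
y = 2.0 + 0.5 * x - 0.03 * x**2 + rng.normal(0.0, 0.1, x.size)

# np.polyfit chooses the coefficients of f(x) by minimizing
# S = sum_j [y_j - f(x_j)]^2, exactly the closeness-of-fit measure above.
coeffs = np.polyfit(x, y, deg=2)
f = np.poly1d(coeffs)

residual_sum = float(np.sum((y - f(x)) ** 2))
```

With small noise, the minimized sum of squares is close to the noise variance times the number of points, and the fitted coefficients recover the generating quadratic.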

There was a significant negative correlation between (log) admission urinary PCP level and self-reported time since last PCP use (r= -0.53, p<0.001). Visual inspection of a graph of these two variables suggested a possible biphasic elimination curve, with the initial phase having a half-life of 5 to 7 days, and the later phase a half-life of about 30 days. However, formal curve fitting of these data to standard pharmacokinetic models (using BMDP... [Pg.234]
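The biphasic elimination curve described above can be sketched as a biexponential model. The concentrations and rate constants below are invented so that the two phases have half-lives of 6 and 30 days (within the ranges quoted); they are not the study's fitted values, and the log-linear tail fit is a generic stand-in for the formal pharmacokinetic fitting done with BMDP:

```python
import numpy as np

# Synthetic biphasic elimination: fast phase (t1/2 = 6 d) + slow phase (t1/2 = 30 d).
t = np.linspace(0.0, 120.0, 241)                 # days since last use
k_fast, k_slow = np.log(2) / 6.0, np.log(2) / 30.0
conc = 80.0 * np.exp(-k_fast * t) + 20.0 * np.exp(-k_slow * t)

# Late phase: after many fast half-lives the first term is negligible, so a
# log-linear fit on the tail recovers the terminal (slow-phase) half-life.
tail = t >= 60.0
slope, _ = np.polyfit(t[tail], np.log(conc[tail]), 1)
terminal_half_life = np.log(2) / -slope
```

This is the classic "curve-stripping" view of a biexponential: plot log concentration against time, and each linear segment's slope gives one phase's rate constant.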

Such a function exhibits peaks (Fig. 9C) that correspond to interatomic distances but are shifted to smaller values (recall the distance correction mentioned above). This finding was a major breakthrough in the analysis of EXAFS data since it allowed ready visualization. However, because of the shift to shorter distances and the effects of truncation, such an approach is generally not employed for accurate distance determination. This approach, however, allows for the use of Fourier filtering techniques which make possible the isolation of individual coordination shells (the dashed line in Fig. 9C represents a Fourier filtering window that isolates the first coordination shell). After Fourier filtering, the data is back-transformed to k space (Fig. 9D), where it is fitted for amplitude and phase. The basic principle behind the curve-fitting analysis is to employ a parameterized function that will model the... [Pg.283]
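The Fourier-filtering step can be illustrated schematically with a synthetic signal. This is a sketch of windowed filtering and back-transformation only, not a real EXAFS workflow: the two sinusoids standing in for coordination shells, the distance mapping, and the window limits are all invented:

```python
import numpy as np

# Synthetic "chi(k)" with two oscillation frequencies standing in for two shells.
k = np.linspace(0.0, 12.0, 512)
chi = 1.0 * np.sin(2 * 2.0 * k) + 0.4 * np.sin(2 * 4.5 * k)  # "shells" at R = 2.0, 4.5

spectrum = np.fft.rfft(chi)
freqs = np.fft.rfftfreq(k.size, d=(k[1] - k[0]))

# For sin(2*R*k), the conjugate variable maps to a distance-like axis R = pi * freq.
R = np.pi * freqs

# Rectangular Fourier-filtering window isolating the first "shell".
window = (R > 1.0) & (R < 3.0)
filtered = np.where(window, spectrum, 0.0)

# Back-transform to k-space: only the first-shell oscillation survives,
# and this isolated contribution is what would then be fitted for
# amplitude and phase.
chi_first_shell = np.fft.irfft(filtered, n=k.size)
```

Apart from truncation leakage at the ends of the data range, the back-transformed signal tracks the first component and suppresses the second.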

An interesting method of fitting was presented with the introduction, some years ago, of the model 310 curve resolver by E. I. du Pont de Nemours and Company. With this equipment, the operator chose between superpositions of Gaussian and Cauchy functions electronically generated and visually superimposed on the data record. The operator had freedom to adjust the component parameters and seek a visual best match to the data. The curve resolver provided an excellent graphic demonstration of the ambiguities that can result when any method is employed to resolve curves, whether the fit is visually based or firmly rooted in rigorous least squares. The operator of the model 310 soon discovered that, when data comprise two closely spaced peaks, acceptable fits can be obtained with more than one choice of parameters. The closer the blended peaks, the wider was the choice of parameters. The part played by noise also became rapidly apparent. The noisy data trace allowed the operator additional freedom of choice, when he considered the error bar that is implicit at each data point. [Pg.33]
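The ambiguity the model 310 operators discovered can be demonstrated numerically: a sum of two closely spaced unit Gaussians is nearly indistinguishable from a single broader Gaussian, so very different parameter choices reproduce almost the same record. All parameter values below are invented for the demonstration:

```python
import numpy as np

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

x = np.linspace(-5.0, 5.0, 401)
blend = two_gaussians(x, 1.0, -0.5, 1.0, 1.0, 0.5, 1.0)       # two close peaks
single = two_gaussians(x, 1.7655, 0.0, 1.118, 0.0, 0.0, 1.0)  # one wide peak

# The two very different parameter sets disagree by under ~2% of the peak
# height -- well inside typical noise, which is why more than one "acceptable
# fit" exists for closely spaced, blended peaks.
max_gap = float(np.max(np.abs(blend - single)))
```

Adding noise of a few percent to either curve would make the two models statistically indistinguishable, which is exactly the freedom of choice the noisy data trace gave the operator.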

Data analysis and visualization. InPlot for curve fitting. InStat for statistics. InTend for laboratory chemical calculations. PCs and Macintosh. [Pg.406]

Dinella et al. (2013) also proposed a methodology to assess product differences and monitor subject discrimination ability in the TDS context according to a stepwise approach. The first step consists of visualizing the TDS curves to identify attributes, product pairs and/or subjects that are worthy of investigation. Then, an ANOVA approach is proposed to assess product differences. Compared to the permutation approach described above, the ANOVA approach rests on more assumptions, which might not fit TDS data perfectly, but the authors confirm the validity of the approach since the distributions of the residuals matched their assumptions in the cases selected in the first step. [Pg.295]

This nine-dimensional data representation of the body pose is recorded at a 30-Hz rate in an XML hierarchical format, resulting in a sequence of poses that can represent the body movement. This data is then smoothed using filtering and curve-fitting techniques, as discussed in [4]. The resulting XML data can either be used in real time for applications such as real-time activity classification, or it can be saved to be visualized and processed offline later. [Pg.647]
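The specific filtering technique is in the cited reference [4] and not reproduced here; as a stand-in, this sketch smooths one hypothetical 30-Hz pose channel (a joint angle) with a simple moving-average filter:

```python
import numpy as np

# Hypothetical noisy joint-angle channel sampled at 30 Hz for 10 s.
rng = np.random.default_rng(1)
t = np.arange(0, 300) / 30.0
angle = np.sin(0.5 * t) + rng.normal(0.0, 0.2, t.size)

# Moving-average filter over a 0.5 s (15-sample) window.
win = 15
kernel = np.ones(win) / win
smoothed = np.convolve(angle, kernel, mode="same")
```

Averaging over 15 samples cuts the noise variance by roughly a factor of 15 while barely distorting the slow underlying movement, which is the usual trade-off such pre-filtering makes before further processing.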

Often a relationship is sought between corrosion performance and some controllable variable, such as exposure time. Random error can produce enough scatter in the data to make visual curve fitting imprecise. What is desired is the best data fit. One way frequently used to accomplish a good fit is to use regression analysis to minimize the sum of the squares of the data deviations about the fitted curve. This is the method of least squares. Many physical relationships are not linear functions. For example... [Pg.53]
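One common non-linear example in corrosion work is a power-law growth of mass loss with exposure time, w = k·tⁿ. Taking logarithms turns it into a straight line, so the same least-squares machinery still applies. The rate constants and noise level below are invented for illustration:

```python
import numpy as np

# Hypothetical mass-loss data following w = k * t**n with k = 3.0, n = 0.5,
# plus multiplicative scatter standing in for random measurement error.
rng = np.random.default_rng(2)
t = np.linspace(1.0, 24.0, 12)                               # exposure time, months
w = 3.0 * t**0.5 * np.exp(rng.normal(0.0, 0.02, t.size))     # mass loss

# log w = log k + n * log t is linear, so ordinary least squares on the
# log-transformed data recovers both parameters.
n_hat, log_k_hat = np.polyfit(np.log(t), np.log(w), 1)
k_hat = np.exp(log_k_hat)
```

The log transform also changes the error weighting (it minimizes relative rather than absolute deviations), which is often what is wanted when scatter grows with the measured value.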

After identification of the temporal compensation factors (m-factors), the time stamps for one system were compensated so that all measured values could be correlated over the whole measurement time. The compensated curves for both HRs are shown in Fig. 5b. Since the Vigilance system provides CO data, CO was also calculated from impedance to compare both values. Fig. 7 shows the results, including a linear fit to visualize the global trend. [Pg.42]

Higher-order polynomial fits were expected to be better models for this synthetic data. Fig. 23c shows a fourth-order polynomial applied to the synthetic data. Visually, the curve drawn through the data points of Fig. 23c appears ideal. However, when the corresponding residuals are examined (Fig. 23d), there is no question about the poorness of fit of the fourth-order polynomial applied to the three-line body of data. The fourth-order polynomial case is presented here for illustrative purposes only. The residuals for the second- and third-order polynomial fits appear as intermediate patterns to those shown in Figs. 23b and 23d for the linear and quartic cases, respectively. [Pg.210]
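The residual-inspection idea can be sketched generically. The three-line data set of Fig. 23 is not reproduced here; instead, synthetic quadratic data make the same point, that an underfit model leaves systematic residual structure while the adequate order leaves only noise:

```python
import numpy as np

# Hypothetical quadratic data with small noise (all values invented).
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.01, x.size)

def residuals(deg):
    fit = np.poly1d(np.polyfit(x, y, deg))
    return y - fit(x)

# deg=1 underfits (large, curved residual pattern); deg=2 matches the model
# (residuals ~ noise); deg=4 cannot do worse than deg=2 in least squares,
# even when the extra terms are not physically meaningful.
rms = {deg: float(np.sqrt(np.mean(residuals(deg) ** 2))) for deg in (1, 2, 4)}
```

Plotting the residuals, rather than the fitted curve, is what exposes the difference: the fitted curves for all three orders can look equally "ideal" to the eye.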

It must be noted, however, that a value of r close to either +1 or −1 does not necessarily confirm that there is a linear relationship between the variables. It is sound practice first to plot the calibration curve on graph paper and ascertain by visual inspection whether the data points could be described by a straight line or whether they may fit a smooth curve. [Pg.145]
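A short numerical illustration of this warning: a clearly curved "calibration" relationship can still yield a correlation coefficient very close to 1, which is why the plot-and-inspect step matters. The data here are a contrived y = x² example, not a real calibration:

```python
import numpy as np

# Contrived curved data: y = x^2 over a decade of x.
x = np.linspace(1.0, 10.0, 10)
y = x**2

# Correlation coefficient is near 1 despite the obvious curvature...
r = float(np.corrcoef(x, y)[0, 1])

# ...yet the residuals from a straight-line fit show a large, systematic
# U-shaped pattern that visual inspection would catch immediately.
line = np.polyfit(x, y, 1)
resid = y - np.polyval(line, x)
```

Here r is about 0.97, which many would read as confirming linearity, while the residuals swing systematically from positive to negative and back.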

Figure 6.2. Illustration of fitting Eq. (6-2, solid curve) to open-loop step test data representative of self-regulating and multi-capacity processes (dotted curve). The time constant estimation shown here is based on the initial slope and a visual estimation of dead time. The Ziegler-Nichols tuning relation (Table 6.1) also uses the slope through the inflection point of the data (not shown). Alternative estimation methods are provided on our Web Support.




© 2024 chempedia.info