Big Chemical Encyclopedia

Time series data

Katsouyanni, K., Touloumi, G., Spix, C., Schwartz, J., Balducci, F., Medina, S., Rossi, G., Wojtyniak, B., Sunyer, J., Bacharova, L., Schouten, J. P., Ponka, A., and Anderson, H. R. (1997). Short-term effects of ambient sulphur dioxide and particulate matter on mortality in 12 European cities: Results from time series data from the APHEA project. Brit. Med. J. 314, 1658-1663. [Pg.337]

Computer hardware and software used in 2DLC generally take care of three critical operations. These include real-time control of valves and sequencing functions such as autosampler control, formatting the time series data into a 2D data matrix, and analyzing the data. These will be described in some detail. [Pg.110]
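By way of illustration, the sketch below shows the second of those operations: folding the raw detector time series into a 2D data matrix whose columns correspond to successive second-dimension cycles. The sampling rate, modulation period, and function names are assumptions for the example, not part of any particular 2DLC package.

```python
import numpy as np

def fold_to_2d(signal, sampling_rate_hz, modulation_period_s):
    """Fold a 1D detector time series into a 2D data matrix.

    Each column holds one second-dimension separation cycle;
    rows correspond to elution time within that cycle.
    """
    points_per_cycle = int(round(sampling_rate_hz * modulation_period_s))
    n_cycles = len(signal) // points_per_cycle          # drop any incomplete final cycle
    trimmed = np.asarray(signal[: n_cycles * points_per_cycle])
    # reshape, then transpose so axis 0 = time within a cycle, axis 1 = cycle number
    return trimmed.reshape(n_cycles, points_per_cycle).T

# example: a 1 h run sampled at 10 Hz with a 30 s modulation period
raw = np.random.rand(36_000)
matrix = fold_to_2d(raw, sampling_rate_hz=10, modulation_period_s=30)
print(matrix.shape)   # (300, 120): 300 points per cycle x 120 cycles
```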

The reflection calibration method with the specialized filter block has the advantage that it does not require the sample to be moved in order to recalibrate. As a result, it might be particularly useful for time-series data collected over long timescales. [Pg.89]

The convolution integral and the Exponential Piston Flow Model (EPM) were used to relate measured tracer concentrations to historical tracer input. The tritium input function is based on tritium concentrations measured monthly since the 1960s near Wellington, New Zealand. CFC and SF6 input functions are based on measured and reconstructed data from southern hemisphere sites. The EPM was applied consistently in this study because statistical justification for selecting some other response function requires a substantial record of time-series tracer data, which is not yet available for the majority of NGMP sites; for those NGMP sites that do have the required time-series data, the EPM and other response functions yield similar groundwater ages. [Pg.77]
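As a rough illustration of the convolution step, the sketch below convolves a tracer input record with an EPM transit-time distribution, using the standard Maloszewski-Zuber parameterization (an assumption; the study's exact implementation is not given here) and optional radioactive decay for tritium. The synthetic input record and parameter values are purely illustrative.

```python
import numpy as np

def epm_response(ages, mean_age, eta):
    """Exponential piston flow model transit-time distribution
    (Maloszewski-Zuber form; eta = ratio of total to exponential volume)."""
    g = np.zeros_like(ages, dtype=float)
    mask = ages >= mean_age * (1.0 - 1.0 / eta)
    g[mask] = (eta / mean_age) * np.exp(-eta * ages[mask] / mean_age + eta - 1.0)
    return g

def convolve_input(c_in, mean_age, eta, dt_years=1 / 12.0, half_life=None):
    """Predicted output concentration at the end of the input record:
    C_out = sum over transit times of C_in(t - t') * g(t') * decay(t') * dt."""
    c_in = np.asarray(c_in, dtype=float)
    n = len(c_in)
    ages = np.arange(n) * dt_years                     # transit times in years
    g = epm_response(ages, mean_age, eta)
    decay = np.ones(n) if half_life is None else np.exp(-np.log(2) * ages / half_life)
    # reverse the input so index i pairs the input lagged by i steps with age ages[i]
    return np.sum(c_in[::-1] * g * decay) * dt_years

# example with a synthetic monthly tritium input record (TU), crude bomb-peak shape
tritium_in = 2.0 + 30.0 * np.exp(-((np.arange(720) - 60) / 40.0) ** 2)
print(convolve_input(tritium_in, mean_age=25.0, eta=1.2, half_life=12.32))
```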

For time-series data, the contiguous block method can provide a good assessment of the temporal stability of the model, whereas the Venetian blinds method can better assess nontemporal errors. For batch data, one can either specify custom subsets where each subset is assigned to a single batch (i.e., leave-one-batch-out cross-validation), or use Venetian blinds or contiguous blocks to assess within-batch and between-batch prediction errors, respectively. For blocked data that contain replicates, one must be very careful with the Venetian blinds and contiguous block methods to select parameters such that the replicate sample trap and the external subset trap, respectively, are avoided. [Pg.411]
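A minimal sketch of the two splitting schemes, assuming the samples are stored in time order; the function names and the choice of three splits are illustrative only.

```python
import numpy as np

def venetian_blinds_splits(n_samples, n_splits):
    """Assign every n_splits-th sample to the same test subset (interleaved splits)."""
    idx = np.arange(n_samples)
    return [(idx[idx % n_splits != k], idx[idx % n_splits == k]) for k in range(n_splits)]

def contiguous_block_splits(n_samples, n_splits):
    """Split the time-ordered samples into consecutive blocks."""
    blocks = np.array_split(np.arange(n_samples), n_splits)
    return [(np.concatenate([b for j, b in enumerate(blocks) if j != k]), blocks[k])
            for k in range(n_splits)]

# for 12 time-ordered samples and 3 splits:
for train, test in contiguous_block_splits(12, 3):
    print("test block: ", test)    # [0..3], [4..7], [8..11] -- probes temporal stability
for train, test in venetian_blinds_splits(12, 3):
    print("test blinds:", test)    # every 3rd sample -- probes nontemporal error
```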

This measure is likely to be a reasonable proxy for disease-specific health outcomes for two reasons. First, the proportion of deaths occurring above a certain age can be interpreted as the probability of survival until that age, for example, age 65 (Lichtenberg 2005b). Second, there is a statistically significant positive relationship between life expectancy at birth and the proportion of deaths occurring above a specific age, based on comparisons of time series data within a country or cross-sectional data across countries. For example, plotting life expectancy at birth on the vertical axis against the proportion of deaths occurring above age 65 for the whole population on the horizontal axis, using time series data from Taiwan for 1971-2002, shows a significantly positive relationship for both males and females (Fig. 13.4). Life expectancy at birth increases as the age at death increases. [Pg.250]

Change in aggregate cancer incidence is a poor substitute for individual-level data. Better medical treatment has meant better cancer detection, especially for the elderly. Death certificate data are more reliable than incidence data, but many researchers believe that elderly deaths have not been classified consistently over time. Finally, aggregate time-series data might tell more about exposure in the less relevant distant past than in the more relevant recent past. [Pg.14]

Data from external and internal sources are integrated, aggregated, or associated in time series. Data items may contain errors, or the data may be missing, imprecise, redundant, or contradictory. A language with operators and variables is required to establish models. Validity levels also have to be defined using suitable optimization and validation criteria. In addition, a search method is required to extract the data from the data warehouse and prepare it for analysis. [Pg.360]

Another recent trend is to show the importance of hydrophobic profiles rather than molecular hydrophobicity. Giuliani et al. (2002) suggested nonlinear signal analysis methods in the elucidation of protein sequence-structure relationships. The major algorithm used for analyzing hydrophobicity sequences or profiles was recurrence quantification analysis (RQA), in which a recurrence plot depicted a single trajectory as a two-dimensional representation of experimental time-series data. Examples of the global properties used in this... [Pg.311]
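To make the recurrence-plot idea concrete, here is a minimal sketch that delay-embeds a 1D hydrophobicity profile and thresholds pairwise distances to build a recurrence plot, then reports the recurrence rate, the simplest RQA descriptor. The embedding dimension, delay, and radius are illustrative choices, not the settings used by Giuliani et al.

```python
import numpy as np

def recurrence_plot(series, embed_dim=3, delay=1, radius=0.5):
    """Delay-embed a 1D profile and mark pairs of embedded points
    that fall within `radius` of each other (a recurrence plot)."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (embed_dim - 1) * delay
    embedded = np.column_stack([x[i * delay : i * delay + n] for i in range(embed_dim)])
    dists = np.linalg.norm(embedded[:, None, :] - embedded[None, :, :], axis=-1)
    return dists < radius

def recurrence_rate(rp):
    """Fraction of recurrent points in the plot."""
    return rp.mean()

# example: a synthetic hydrophobicity profile (values in a Kyte-Doolittle-like range)
profile = np.random.uniform(-4.5, 4.5, size=120)
rp = recurrence_plot(profile, embed_dim=3, delay=1, radius=2.0)
print("recurrence rate:", recurrence_rate(rp))
```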

[Brillinger, 1981] Brillinger, D. R. (1981). Time Series: Data Analysis and Theory. Holden-Day, expanded edition. [Pg.253]

Variations (usually decreases) in detection limits occur over time, affecting both long-running projects and the comparison of time-series data or of adjacent project areas surveyed at different times. Such variation arises as analytical methods improve, and it has its greatest impact on the trace elements, whose natural abundance is low in relation to the lowest measurable concentration. Such improvements can significantly increase the number of sample locations with measurable values in comparison to older data. The ability to make the best use of all the data acquired is important, particularly for national mapping programmes. The data can be levelled as described above, provided the standards used fall above the detection limit of the older method. [Pg.113]

Since this monograph is devoted only to the conception of mathematical models, the inverse problem of estimation is not treated in full detail. Nevertheless, estimating the parameters of the models is crucial for verification and applications. Any parameter in a deterministic model can be sensibly estimated from time-series data only by embedding the model in a statistical framework. This is usually done by assuming that, instead of exact measurements of concentration, we have values blurred by observation errors that are independent and normally distributed. The parameters in the deterministic formulation are then estimated by nonlinear least-squares or maximum likelihood methods. [Pg.372]
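A minimal sketch of this estimation approach, assuming a simple first-order decay model and SciPy's nonlinear least-squares routine; the model, noise level, and starting values are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# deterministic model: first-order decay, dC/dt = -k * C
def model(t_obs, k, c0):
    sol = solve_ivp(lambda t, c: -k * c, (t_obs[0], t_obs[-1]), [c0], t_eval=t_obs)
    return sol.y[0]

def residuals(params, t_obs, c_obs):
    k, c0 = params
    return model(t_obs, k, c0) - c_obs

# synthetic "measured" time series: true k = 0.3, c0 = 10, i.i.d. Gaussian noise
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 25)
c_obs = 10.0 * np.exp(-0.3 * t_obs) + rng.normal(0.0, 0.2, t_obs.size)

fit = least_squares(residuals, x0=[0.1, 5.0], args=(t_obs, c_obs))
print("estimated k, c0:", fit.x)   # nonlinear least-squares estimates
```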

Matis, J. and Hartley, H., Stochastic compartmental analysis: Model and least squares estimation from time series data, Biometrics, Vol. 27, 1971, pp. 77-102. [Pg.410]

A common time domain analysis involves computing the standard deviation of the potential (σE) and the standard deviation of the current (σI) from the time series data and taking their ratio to compute a noise resistance, Rn = σE/σI. [Pg.451]
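A short sketch of that calculation, assuming simultaneously sampled potential and current records; the mean-removal step and the synthetic data are illustrative choices.

```python
import numpy as np

def noise_resistance(potential, current, remove_mean=True):
    """R_n = sigma_E / sigma_I from simultaneously sampled potential
    and current noise records."""
    e = np.asarray(potential, dtype=float)
    i = np.asarray(current, dtype=float)
    if remove_mean:                  # subtract the mean as a simple form of drift removal
        e = e - e.mean()
        i = i - i.mean()
    return np.std(e, ddof=1) / np.std(i, ddof=1)

# example with synthetic ECN records (volts and amperes), 1024 points
rng = np.random.default_rng(1)
E = 1e-4 * rng.standard_normal(1024)      # potential noise, V
I = 1e-8 * rng.standard_normal(1024)      # current noise, A
print("R_n =", noise_resistance(E, I), "ohm")
```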

Wavelet analysis is a relatively new mathematical tool for the frequency analysis of nonstationary time series signals, such as ECN data. This approach simulates a complex time series by breaking the ECN data into different frequency components or wave packets, yielding information on the amplitude of any periodic signals within the time series data and on how this amplitude varies with time. This approach has been applied to the analysis of ECN data [v, vi]. Since electrochemical noise is 1/f (or flicker) noise, the newer technique of → flicker noise spectroscopy may also find increasing application. [Pg.451]
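For illustration, the sketch below decomposes a synthetic noise record into frequency bands with a discrete wavelet transform, assuming the PyWavelets package is available; the wavelet, decomposition level, and band-energy summary are illustrative choices, not the specific procedure used in refs. [v, vi].

```python
import numpy as np
import pywt   # PyWavelets

# synthetic non-stationary "ECN-like" record: slow drift plus a random-walk component
rng = np.random.default_rng(2)
t = np.arange(4096)
signal = 1e-8 * (0.001 * t + np.cumsum(rng.standard_normal(4096)))

# discrete wavelet decomposition into frequency bands ("wave packets")
coeffs = pywt.wavedec(signal, 'db4', level=5)
labels = ['approx (lowest band)'] + [f'detail level {k}' for k in range(5, 0, -1)]

# the energy in each band, and how the coefficients vary along the record,
# tracks the time-varying amplitude of periodic or transient components
for name, c in zip(labels, coeffs):
    print(f"{name:>22s}: {len(c):5d} coefficients, energy = {np.sum(c**2):.3e}")
```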

Data mining to extract information on dynamics from time series data produced by experiments and by simulations of molecular dynamics. [Pg.557]

The C-chart in Exhibit 52.1 shows time series data in defects per day, the average number of defects, and the upper and lower control limits for the process. Note that day 13 is above the UCL, indicating a day with an unusually high defect rate. [Pg.320]
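A minimal sketch of how the C-chart limits are computed (Poisson-based 3-sigma limits around the average defect count); the defect counts below are invented to mimic the exhibit, with day 13 deliberately out of control.

```python
import numpy as np

def c_chart_limits(defect_counts):
    """Center line and 3-sigma control limits for a C-chart
    (UCL/LCL = c_bar +/- 3*sqrt(c_bar), with the LCL floored at zero)."""
    c = np.asarray(defect_counts, dtype=float)
    c_bar = c.mean()
    ucl = c_bar + 3.0 * np.sqrt(c_bar)
    lcl = max(0.0, c_bar - 3.0 * np.sqrt(c_bar))
    return c_bar, ucl, lcl

# example: defects per day over a 20-day window (illustrative data)
defects = [4, 6, 3, 5, 7, 2, 4, 5, 6, 3, 4, 5, 16, 4, 3, 5, 6, 4, 5, 3]
c_bar, ucl, lcl = c_chart_limits(defects)
print(f"c_bar={c_bar:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
print("out-of-control days:", [d + 1 for d, x in enumerate(defects) if x > ucl or x < lcl])
```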

While it is possible to see seasonal signals in the mainstem data, e.g. rising versus falling water trends, a clearer seasonal picture is available from time series data (Fig. 15.11) obtained at the Marchanteria station (Fig. 15.1) 50 km downstream of... [Pg.288]

Tupas, L. (1993). Hawaii ocean time series. Data rep. No. 4. Univ. of Hawaii, SOEST. Tech. Rep. 93-14. [Pg.566]

Steinberg et al. (2001) Overview of BATS time-series data, including nitrogen nutrients and vertical fluxes... [Pg.605]


See other pages where Time series data is mentioned: [Pg.482]    [Pg.36]    [Pg.129]    [Pg.73]    [Pg.27]    [Pg.624]    [Pg.794]    [Pg.411]    [Pg.412]    [Pg.255]    [Pg.255]    [Pg.35]    [Pg.38]    [Pg.40]    [Pg.50]    [Pg.63]    [Pg.230]    [Pg.233]    [Pg.120]    [Pg.12]    [Pg.162]    [Pg.291]    [Pg.662]   
See also in source #XX -- [ Pg.108 ]



