Big Chemical Encyclopedia


Fundamental errors

Intensive, critical studies of a controversial topic also help to eliminate the possibility of errors. One of my favorite quotations is by George von Bekesy, a fellow Hungarian-born physicist who studied fundamental questions of the inner ear and hearing (Nobel Prize in medicine, 1961) ... [Pg.146]

There is a fundamental difference between such scientific controversies and what simply can be called scientific fraud, i.e., deliberate falsification or fudging of data. Sloppy experimental work or data keeping can also lead to questionable or incorrect conclusions, and, although these violate established scientific standards and must be corrected (as they will), they do not necessarily represent deliberate fraud. In all this, the professor has a strict personal responsibility. As he/she is getting most of the recognition for the accomplishment of the research, it is only natural that he/she must also shoulder the responsibility for any mistakes, errors, or even falsifications. It is not accepta-... [Pg.249]

The system of atomic units was developed to simplify mathematical equations by setting many fundamental constants equal to 1. This is a means for theorists to save on pencil lead and thus possible errors. It also reduces the amount of computer time necessary to perform chemical computations, which can be considerable. The third advantage is that any changes in the measured values of physical constants do not affect the theoretical results. Some theorists work entirely in atomic units, but many researchers convert the theoretical results into more familiar unit systems. Table 2.1 gives some conversion factors for atomic units. [Pg.9]
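
As a rough illustration of such unit conversion, the sketch below (not from the text) applies a few standard, rounded conversion factors; for precise work the values in Table 2.1 or a current CODATA compilation should be used.

    # Minimal sketch: converting results from atomic units to more familiar units.
    # Conversion factors are rounded standard values.

    HARTREE_TO_EV = 27.2114        # 1 hartree in electron volts
    HARTREE_TO_KJ_MOL = 2625.50    # 1 hartree in kJ/mol
    BOHR_TO_ANGSTROM = 0.529177    # 1 bohr in angstroms

    def hartree_to_ev(e_hartree):
        """Convert an energy from hartrees to electron volts."""
        return e_hartree * HARTREE_TO_EV

    def bohr_to_angstrom(r_bohr):
        """Convert a length from bohr to angstroms."""
        return r_bohr * BOHR_TO_ANGSTROM

    # Hypothetical computed values: 0.17 hartree and 2.1 bohr
    print(f"{hartree_to_ev(0.17):.2f} eV")          # about 4.63 eV
    print(f"{bohr_to_angstrom(2.1):.3f} angstrom")  # about 1.111 angstrom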

The normal distribution of measurements (or the normal law of error) is the fundamental starting point for analysis of data. When a large number of measurements are made, the individual measurements are not all identical and equal to the accepted value μ, which is the mean of an infinite population or universe of data, but are scattered about μ, owing to random error. If the magnitude of any single measurement is the abscissa and the relative frequencies (i.e., the probability) of occurrence of different-sized measurements are the ordinate, the smooth curve drawn through the points (Fig. 2.10) is the normal or Gaussian distribution curve (also the error curve or probability curve). The term error curve arises when one considers the distribution of errors (x - μ) about the true value. [Pg.193]
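
A minimal sketch of the Gaussian error curve described above, evaluated at a few points for a hypothetical mean and standard deviation (the values are illustrative, not from the text):

    # Relative frequency of measurements scattered about the population mean mu
    # with standard deviation sigma, from the normal (Gaussian) density.
    import math

    def normal_pdf(x, mu, sigma):
        """Probability density of the normal distribution at x."""
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

    mu, sigma = 50.0, 2.0          # hypothetical true value and spread
    for x in (44, 46, 48, 50, 52, 54, 56):
        print(f"x = {x:2d}  error (x - mu) = {x - mu:+5.1f}  density = {normal_pdf(x, mu, sigma):.4f}")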

The F statistic, along with the z, t, and χ² statistics, constitutes the group that are thought of as fundamental statistics. Collectively they describe all the relationships that can exist between means and standard deviations. To perform an F test, we must first verify the randomness and independence of the errors. If σ₁² = σ₂², then s₁²/s₂² will be distributed properly as the F statistic. If the calculated F is outside the confidence interval chosen for that statistic, then this is evidence that σ₁² ≠ σ₂². [Pg.204]
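
A minimal sketch of such an F test on two hypothetical sets of replicate results, using SciPy for the critical value; the data and the 95 % confidence level are illustrative choices, not values from the text.

    import numpy as np
    from scipy import stats

    method_a = np.array([10.2, 10.4, 10.1, 10.5, 10.3])   # hypothetical replicates
    method_b = np.array([10.0, 10.6, 9.8, 10.9, 10.2])

    s2_a = np.var(method_a, ddof=1)          # sample variances
    s2_b = np.var(method_b, ddof=1)

    # Put the larger variance in the numerator so F >= 1.
    F = max(s2_a, s2_b) / min(s2_a, s2_b)
    df1 = df2 = len(method_a) - 1

    # Two-tailed critical value at the 95 % confidence level.
    F_crit = stats.f.ppf(0.975, df1, df2)
    print(f"F = {F:.2f}, F_crit = {F_crit:.2f}")
    if F > F_crit:
        print("Variances differ significantly (sigma_1^2 != sigma_2^2).")
    else:
        print("No evidence that the variances differ.")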

The principal tool for performance-based quality assessment is the control chart. In a control chart the results from the analysis of quality assessment samples are plotted in the order in which they are collected, providing a continuous record of the statistical state of the analytical system. Quality assessment data collected over time can be summarized by a mean value and a standard deviation. The fundamental assumption behind the use of a control chart is that quality assessment data will show only random variations around the mean value when the analytical system is in statistical control. When an analytical system moves out of statistical control, the quality assessment data is influenced by additional sources of error, increasing the standard deviation or changing the mean value. [Pg.714]
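
A minimal sketch of the idea, using hypothetical quality-assessment data: the centre line and control limits are established from results collected while the system was in statistical control, and later results are checked against them.

    import numpy as np

    # Results gathered while the analytical system was in statistical control
    historical = np.array([99.8, 100.2, 100.1, 99.7, 100.4, 99.9, 100.0, 100.3])
    mean = historical.mean()
    s = historical.std(ddof=1)
    ucl, lcl = mean + 3 * s, mean - 3 * s     # upper/lower control (action) limits

    # Later quality-assessment results, checked in the order they are collected
    new_results = [100.1, 99.8, 101.2]
    print(f"centre = {mean:.2f}, limits = [{lcl:.2f}, {ucl:.2f}]")
    for i, x in enumerate(new_results, start=1):
        status = "in control" if lcl <= x <= ucl else "OUT OF CONTROL"
        print(f"result {i}: {x:6.1f}  {status}")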

The quantity G_N is an estimate of G, and the fundamental theorem of Monte Carlo guarantees that the expected value of G_N is G, if G exists (Ref. 161). The error in the calculation is given by... [Pg.479]
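
The sketch below is a generic illustration (not the handbook's expression) of a Monte Carlo estimate G_N and its estimated standard error, which shrinks roughly as 1/sqrt(N); the integrand is an arbitrary example.

    import numpy as np

    rng = np.random.default_rng(0)

    def mc_estimate(n):
        """Estimate the integral of exp(x) on [0, 1] from n uniform random samples."""
        x = rng.uniform(0.0, 1.0, n)
        fx = np.exp(x)
        g_n = fx.mean()                         # Monte Carlo estimate G_N
        err = fx.std(ddof=1) / np.sqrt(n)       # estimated standard error
        return g_n, err

    exact = np.e - 1.0                          # true value, 1.71828...
    for n in (10**2, 10**4, 10**6):
        g_n, err = mc_estimate(n)
        print(f"N = {n:>7d}  G_N = {g_n:.5f}  est. std error = {err:.5f}  true error = {abs(g_n - exact):.5f}")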

In concentrated systems the change in gas and liquid flow rates within the tower and the heat effects accompanying the absorption of all the components must be considered. A trial-and-error calculation from one theoretical stage to the next usually is required if accurate results are to be obtained, and in such cases calculation procedures similar to those described in Sec. 13 normally are employed. A computer procedure for multicomponent adiabatic absorber design has been described by Feintuch and Treybal [Ind. Eng. Chem. Process Des. Dev., 17, 505 (1978)]. Also see Holland, Fundamentals and Modeling of Separation Processes, Prentice-Hall, Englewood Cliffs, N.J., 1975. [Pg.1361]

For determination of the aerodynamic diameters of particles, the most commonly applicable methods for particle-size analysis are those based on inertia: aerosol centrifuges, cyclones, and inertial impactors (Lundgren et al., Aerosol Measurement, University of Florida, Gainesville, 1979; and Liu, Fine Particles—Aerosol Generation, Measurement, Sampling, and Analysis, Academic, New York, 1976). Impactors are the most commonly used. Nevertheless, impactor measurements are subject to numerous errors [Rao and Whitby, Am. Ind. Hyg. Assoc. J., 38, 174 (1977); Marple and Willeke, "Inertial Impactors," in Lundgren et al., Aerosol Measurement; and Fuchs, "Aerosol Impactors," in Shaw, Fundamentals of Aerosol Sci-... [Pg.1582]

Variations in measurable properties existing in the bulk material being sampled are the underlying basis for sampling theory. For samples that correctly lead to valid analysis results (of chemical composition, ash, or moisture, as examples), a fundamental theory of sampling is applied. The fundamental theory as developed by Gy (see references) employs descriptive terms reflecting material properties to calculate the minimum sample quantity needed to achieve a specified sampling error. Estimates of the minimum quantity assume completely mixed material, so that each quantity of equal mass withdrawn provides equivalent representation of the bulk. [Pg.1757]
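
As a hedged illustration, one common form of Gy's result relates the relative variance of the fundamental sampling error to the top particle size and the sample and lot masses, sigma^2 = f g c l d^3 (1/Ms - 1/ML); the sketch below rearranges it for a minimum sample mass, with purely illustrative factor values.

    # Hedged sketch of Gy's fundamental-sampling-error formula. All factor
    # values below are illustrative placeholders, not recommendations.

    def minimum_sample_mass(d_cm, rel_std_dev, f=0.5, g=0.25, c=50.0, l=1.0):
        """Minimum sample mass (g) for a target relative standard deviation,
        assuming the lot mass is much larger than the sample mass."""
        C = f * g * c * l                 # sampling constant, g/cm^3
        return C * d_cm ** 3 / rel_std_dev ** 2

    # e.g. 0.5 cm top particle size, 1 % relative standard deviation target
    print(f"{minimum_sample_mass(0.5, 0.01):.0f} g")   # roughly 7800 g here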

Model Development Preliminary modeling of the unit should be done during the familiarization stage. Interactions between database uncertainties and parameter estimates and between measurement errors and parameter estimates could lead to erroneous parameter estimates. Attempting to develop parameter estimates when the model is systematically in error will lead to systematic error in the parameter estimates. Systematic errors in models arise from not properly accounting for the fundamentals and for the equipment boundaries. Consequently, the resultant model does not properly represent the unit and is unusable for design, control, and optimization. Cropley (1987) describes the erroneous parameter estimates obtained from a reactor study when the fundamental mechanism was not properly described within the model. [Pg.2564]
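
The toy example below (not Cropley's reactor study) illustrates the same point: concentration data generated from a second-order mechanism are fitted with a first-order model; the fit converges, but the residuals show a systematic trend and the estimated parameter has no physical meaning for the true mechanism.

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0.0, 10.0, 11)
    c0, k2 = 1.0, 0.3
    c_true = c0 / (1.0 + k2 * c0 * t)                 # second-order decay (true mechanism)
    rng = np.random.default_rng(1)
    c_obs = c_true + rng.normal(0.0, 0.005, t.size)   # small random measurement error

    def first_order(t, k1):                           # wrong (assumed) mechanism
        return c0 * np.exp(-k1 * t)

    (k1_fit,), _ = curve_fit(first_order, t, c_obs, p0=[0.3])
    residuals = c_obs - first_order(t, k1_fit)
    print(f"fitted k1 = {k1_fit:.3f}")
    print("residuals:", np.round(residuals, 3))       # note the systematic curvature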

A methodical procedure is presented for estimating random and systematic errors by means of single-factor dispersion analysis (one-way analysis of variance). For this purpose a set of reference samples is used: X-ray analyses of the reference samples are performed, the mass fractions of the components are calculated, and the results are compared with the known chemical compositions. Metrological characteristics of X-ray fluorescence silicate analysis are established both for the α-correction method and for a simplified fundamental parameter method (FPM). It is found that the systematic error of the simplified FPM is smaller than that of the α-correction method, provided the zero-approximation correction for the simplified FPM is made using a previously established correlation between theoretical and experimental data sets. [Pg.234]
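
A hedged sketch of how single-factor dispersion analysis (one-way ANOVA) can separate the two error types: the within-sample scatter of the differences from the certified values estimates the random error, while a significant between-sample effect points to a sample-dependent systematic error. All numbers below are hypothetical, not results from the study described.

    import numpy as np
    from scipy import stats

    # Hypothetical differences (found - certified), in mass %, for replicate
    # XRF determinations on three reference samples.
    diff_rs1 = np.array([0.05, 0.08, 0.03, 0.06])
    diff_rs2 = np.array([0.12, 0.09, 0.11, 0.10])
    diff_rs3 = np.array([0.04, 0.07, 0.05, 0.06])
    groups = [diff_rs1, diff_rs2, diff_rs3]

    F, p = stats.f_oneway(*groups)                              # one-way ANOVA
    within_ms = np.mean([np.var(g, ddof=1) for g in groups])    # random-error variance
    print(f"F = {F:.2f}, p = {p:.4f}")
    print(f"random error (std dev) = {np.sqrt(within_ms):.3f} mass %")
    if p < 0.05:
        print("Between-sample effect significant: evidence of sample-dependent systematic error.")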

To facilitate the use of methanol synthesis in examples, the UCKRON and VEKRON test problems (Berty et al., 1989; Arva and Szeifert, 1989) will be applied. In the development of the test problem, methanol synthesis served as an example. The physical properties, thermodynamic conditions, technology, and average rate of reaction were taken from the literature of methanol synthesis. For the kinetics, however, an artificial mechanism was created that had a known and rigorous mathematical solution. It was fundamentally important to create a fixed basis of comparison with the various approximate mathematical models for the kinetics. These were derived by simulated experiments from the test problems with added random error. See Appendices A and B (Berty et al., 1989). [Pg.281]

The compensation of the voltage error amplifier should be a single-pole rolloff with a unity-gain frequency of 38 Hz. This is required to reject the fundamental line frequencies of 50 and 60 Hz. The feedback capacitor around the voltage error amplifier becomes... [Pg.230]
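
A hedged sketch of the single-pole calculation implied here: with only a capacitor in the feedback path, the error amplifier acts as an integrator whose gain crosses unity at f = 1/(2*pi*R_in*C), so C = 1/(2*pi*f*R_in). The input resistor value used below is a hypothetical placeholder, not a value from the text.

    import math

    def feedback_capacitor(f_unity_hz, r_in_ohms):
        """Feedback capacitance (farads) giving the chosen unity-gain frequency."""
        return 1.0 / (2.0 * math.pi * f_unity_hz * r_in_ohms)

    C = feedback_capacitor(38.0, 100e3)   # 38 Hz rolloff, hypothetical 100 kOhm input resistor
    print(f"C = {C * 1e6:.3f} uF")        # about 0.042 uF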

The choice of lubricant for a particular composition and process can be quite critical, but beyond stating that such materials will normally be fluid at the processing temperature and should have a solubility parameter at least 3 MPa^1/2 different from that of the polymer, little further fundamental guidance can be given, and selection is normally made on an empirical trial-and-error basis. [Pg.133]

Materials evaluation and selection are fundamental considerations in engineering design. If done properly, and in a systematic manner, considerable time and cost can be saved in design work, and design errors can be avoided. [Pg.18]

From the calibration point of view, manometers can be divided into two groups. The first, fluid manometers, are fundamental instruments, where the indication of the measured quantity is based on a simple physical factor: the hydrostatic pressure of a fluid column. In principle, such instruments do not require calibration. In practice they do, owing to contamination of the manometer itself or of the manometer fluid, and to modifications of the basic principle, such as the tilting of the manometer tube, which cause errors in the measurement result. The stability of high-quality fluid manometers is very good, and they tend to maintain their metrological properties for a long period. [Pg.1151]

This requirement clearly defines the purpose of a quality system, that of ensuring that products conform to specified requirements. One of the principal differences between ISO 9000 and ISO/TS 16949 is the emphasis placed on internal efficiency and effectiveness. Implementing the requirements of ISO/TS 16949 will cause the waste, errors, and internal costs to be minimized. Unlike ISO 9001, ISO/TS 16949 requires the system to enable the organization to implement its quality policy and achieve its quality objectives, which after all is its purpose. This fundamental shift in concept is also behind the changes being made to ISO 9000 in the year 2000 edition. [Pg.159]

Task analysis is a fundamental methodology in the assessment and reduction of human error. A very wide variety of different task analysis methods exist, and it would be impracticable to describe all these techniques in this chapter. Instead, the intention is to describe representative methodologies applicable to different types of task. Techniques that have actually been applied in the CPI will be emphasized. An extended review of task analysis techniques is available in Kirwan and Ainsworth (1993). [Pg.161]

The Complete Basis Set (CBS) methods were developed by George Petersson and several collaborators. The family name reflects the fundamental observation underlying these methods: the largest errors in ab initio thermochemical calculations result from basis set truncation. [Pg.154]

The LSDA approximation in general underestimates the exchange energy by 10%, thereby creating errors which are larger than the whole correlation energy. Electron correlation is furthermore overestimated, often by a factor close to 2, and bond strengths are as a consequence overestimated. Despite the simplicity of the fundamental assumptions, LSDA methods are often found to provide results with an accuracy similar to that obtained by wave mechanics HF methods. [Pg.184]

Vibrational Spectra Many of the papers quoted below deal with the determination of vibrational spectra. The method of choice is B3-LYP density functional theory. In most cases, MP2 vibrational spectra are less accurate. In order to allow for a comparison between computed frequencies within the harmonic approximation and anharmonic experimental fundamentals, calculated frequencies should be scaled by an empirical factor. This procedure accounts for systematic errors and improves the results considerably. The easiest procedure is to scale all frequencies by the same factor, e.g., 0.963 for B3-LYP/6-31G computed frequencies [95JPC3093]. A more sophisticated but still pragmatic approach is the SQM method [83JA7073], in which the underlying force constants (in internal coordinates) are scaled by different scaling factors. [Pg.6]
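
A minimal sketch of the uniform-scaling procedure, using the 0.963 factor quoted above applied to hypothetical harmonic frequencies:

    SCALE = 0.963                                     # empirical factor quoted above

    harmonic_cm1 = [3180.0, 1755.0, 1630.0, 1120.0]   # hypothetical computed frequencies
    scaled_cm1 = [SCALE * nu for nu in harmonic_cm1]  # estimates of the fundamentals

    for nu, nu_scaled in zip(harmonic_cm1, scaled_cm1):
        print(f"{nu:8.1f} cm-1  ->  {nu_scaled:8.1f} cm-1")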

While one is free to think of CA as being nothing more than formal idealizations of partial differential equations, their real power lies in the fact that they represent a large class of exactly computable models: since everything is fundamentally discrete, one need never worry about truncations or the slow accumulation of round-off error. Therefore, any dynamical properties observed to be true for such models take on the full strength of theorems [toff77a]. [Pg.6]
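
A minimal sketch of what "exactly computable" means in practice: a two-state elementary cellular automaton (rule 90, chosen arbitrarily) evolved with integer arithmetic only, so no truncation or round-off error can ever accumulate.

    RULE = 90
    WIDTH, STEPS = 31, 15

    def step(cells, rule):
        """Apply an elementary CA rule once, with periodic boundaries."""
        n = len(cells)
        new = []
        for i in range(n):
            # 3-bit neighborhood (left, center, right) selects a bit of the rule number
            neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
            new.append((rule >> neighborhood) & 1)
        return new

    cells = [0] * WIDTH
    cells[WIDTH // 2] = 1                      # single seed cell
    for _ in range(STEPS):
        print("".join(".#"[c] for c in cells))
        cells = step(cells, RULE)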

If several fundamentally different methods of analysis for a given constituent are available, e.g. gravimetric, titrimetric, spectrophotometric, or spectrographic, the agreement between at least two methods of essentially different character can usually be accepted as indicating the absence of an appreciable systematic error in either (a systematic error is one which can be evaluated experimentally or theoretically). [Pg.129]

Part A, dealing with the Fundamentals of Quantitative Chemical Analysis, has been extended to incorporate sections of basic theory which were originally spread around the body of the text. This has enabled a more logical development of theoretical concepts to be possible. Part B, concerned with errors, statistics, and sampling, has been extensively rewritten to cover modern approaches to sampling as well as the attendant difficulties in obtaining representative samples from bulk materials. The statistics has been restructured to provide a logical, stepwise approach to a subject which many people find difficult. [Pg.903]

