Big Chemical Encyclopedia


Before-the-Fact Measures

Before-the-fact measurements measure the results of supervisor actions before an incident occurs. For example, a periodic inspection is made of the supervisor's work area to measure how well he/she is maintaining physical conditions. This is a measure of whether things are wrong and, if so, how many are wrong. We can also measure how well a supervisor gets through to the people in the department by measuring the employees' work behavior [2]. [Pg.168]

One of the more recent and one of the best measurement tools is the perception survey. [Pg.168]

Traditionally, safety programs dealt with the physical environment. Later, we looked at management and attempted to build management principles into our safety programs. Today we recognize the need to look at the behavioral environment—the climate and culture in which the safety system must live [2]. [Pg.168]


Leading indicators are those measures that can be effective in predicting future safety performance (Dupont Corporation 2000). Leading indicators can be considered before-the-fact measures. They assess the outcomes of actions taken before accidents occur and are measures of proactive efforts designed to minimize losses and prevent accidents. Leading indicators can help uncover weaknesses in the organization's operations or employee behaviors before they develop into full-fledged problems. [Pg.14]

Rather than relying solely on these after-the-event measures, an audit provides a before-the-fact measure of control activities. A more valid comparison from year-to-year or site-to-site becomes possible. All elements of the plan, or the health and safety management system, can be critically reviewed so that priorities for future action may be determined. [Pg.225]

It follows that although the thermodynamic functions can be measured for a given distribution system, they cannot be predicted before the fact. Nevertheless, the thermodynamic properties of the distribution system can help explain the characteristics of the distribution and predict, quite accurately, the effect of temperature on the separation. [Pg.49]

Consistent with the experimental situation shown in Table V, with the exception of Mo2 and Ag2, the 4d dimers have received relatively little attention from non-empirical quantum chemistry. Given the increased possibilities of measuring the properties of these molecules provided by the new beam spectroscopy techniques, I think it would be very interesting to have more before-the-fact predictions from the various state-of-the-art quantum-chemical techniques. [Pg.502]

The use of unsafe behaviors as a metric provides the safety manager with an additional tool for measuring safety program effectiveness. While accidents and losses are after-the-fact metrics, safety performance based upon unsafe behaviors can be considered a proactive, before-the-fact activity, which, when used with other metrics, can provide the safety manager with an arsenal of safety metrics. [Pg.122]

The first type of behavior-related measures that can be used before the fact, or before an accident occurs, is the percentage of safe behaviors observed for an observation time. The greater the percentage of safe behaviors, the smaller the percentage of unsafe behaviors, and, thus, the fewer chances for an accident. Other types of measures related to the performance of safe behaviors involve a number of activity-based measures. Some of these may include monitoring the number of job tasks evaluated for potential hazards and compliance with various safety regulations, the number of job tasks for which safe job procedures have been established, and the amount of safety training provided to workers. [Pg.123]
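The percent-safe-behavior metric described above reduces to a simple ratio. A minimal sketch, with illustrative observation counts that are not from the text:

```python
# Percent-safe-behavior metric: share of observed behaviors that were
# safe during an observation period. Counts below are hypothetical.

def percent_safe(safe_count: int, total_observed: int) -> float:
    """Return the percentage of observed behaviors that were safe."""
    if total_observed == 0:
        raise ValueError("no observations recorded")
    return 100.0 * safe_count / total_observed

# Example: 188 safe behaviors out of 200 observed
rate = percent_safe(188, 200)
print(f"{rate:.1f}% safe, {100 - rate:.1f}% unsafe")  # 94.0% safe, 6.0% unsafe
```

The complement (100% minus the safe percentage) is the unsafe-behavior rate, which the passage treats as a proxy for accident likelihood.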

Indeed, AMS is a state-of-the-art ultrasensitive technique for trace analysis, but the success of AMS depends on many crucial steps before the actual measurement in the AMS system. In fact, expertise from several areas of science is sometimes required to make AMS experiments reliable. Some of the crucial points are sample collection, pretreatment, sample preparation for the ion source, development of separation methods for isobars, etc. As with all other analytical techniques, it is also necessary to measure a standard, a blank, and the background simultaneously with the sample. Below, some of the crucial steps are described in a nutshell. [Pg.2473]

Accidents and incidents are investigated so that measures can be taken to prevent a recurrence of similar events. Investigation represents an after-the-fact response as far as any particular mishap is concerned. However, a thorough investigation may uncover hazards or problems which can be eliminated before the fact for the future. After causes have been determined, prompt follow-up action is required to achieve the purpose of the investigation. [Pg.223]

The existence of an injury or property damage loss is no longer a necessary condition for appraising accident performance. It is now possible to identify and examine accident problems before the fact instead of after the fact in terms of their injury-producing or property-damaging consequences. This allows the safety professional to concentrate on measurement of loss potential or near misses and remove the necessity of relying on measurement techniques based on the probabilistic, fortuitous, rare-event, injurious accident (p. 319). [Pg.151]

As shown above, in biology information and thermodynamics meet most closely, and the problem of Shannon's entropy becomes especially acute [233]. Consider a nucleotide sequence with the sample-description space X = {A, T, C, G}, the available four-fold alphabet. If the chance of any base appearing at a locus is 1/4, the relation log2 4 = 2 measures its before-the-fact uncertainty in bits. This symbol uncertainty constitutes the basis for assigning an entropy to nucleotide sequences. Thus a DNA sequence 100 units long has, assuming symbol equiprobability, a Shannon entropy of 200 bits. If we happen to know what that sequence is, then that entropy becomes information, and such reasoning compromises the concept of entropy at any front. [Pg.189]
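The entropy arithmetic above (log 4 = 2 bits per base, hence 200 bits for a 100-base sequence) can be checked directly with Shannon's formula. A minimal sketch:

```python
import math

# Shannon entropy H = -sum(p * log2 p), in bits per symbol. With four
# equiprobable bases (A, T, C, G), each locus carries log2(4) = 2 bits,
# so a 100-base sequence has 200 bits of before-the-fact uncertainty.

def shannon_entropy_bits(probabilities):
    """Shannon entropy in bits per symbol; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

p = [0.25] * 4                          # A, T, C, G equiprobable
h_per_base = shannon_entropy_bits(p)
print(h_per_base, 100 * h_per_base)     # 2.0 200.0
```

Any departure from equiprobability lowers the per-symbol entropy below 2 bits, which is why the passage flags the equiprobability assumption explicitly.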

In this respect there is a strong call [231], once again, for a fully appropriate alternative to entropy in information theory, which can be complexity, so that what the Shannon formula actually measures is the complexity of a structural relationship. In fact, in such an algorithmic approach to information theory, the entropy of the symbol ensemble is replaced by the program information required to specify the sequence. The entropy of a thermodynamic system is a measure of the extent to which it is not tied down to a particular matter-energy distribution. A complex system, in contrast, requires a structured relationship among its elements, and such a system must have before-the-fact alternatives, such that one could not predict those structural relationships just from the properties of the elements concerned. Yet once these elements and relationships have been specified, the indeterminacy vanishes. [Pg.191]

The nomenclature of biochemical compounds is in large measure a part of organic nomenclature. However, it has its own special problems, arising partly from the fact that many biochemical compounds must be given names before their chemical structures have been fully determined, and partly from the interest in grouping them according to biological function as much as to chemical class. [Pg.119]

Analysis for Poly(Ethylene Oxide). Another special analytical method takes advantage of the fact that poly(ethylene oxide) forms a water-insoluble association compound with poly(acrylic acid). This reaction can be used in the analysis of the concentration of poly(ethylene oxide) in a dilute aqueous solution. Freshly prepared poly(acrylic acid) is added to a solution of unknown poly(ethylene oxide) concentration. A precipitate forms, and its concentration can be measured turbidimetrically. Using appropriate calibration standards, the precipitate concentration can then be converted to concentration of poly(ethylene oxide). The optimum resin concentration in the unknown sample is 0.2–0.4 ppm. Therefore, it is necessary to dilute more concentrated solutions to this range before analysis (97). Low concentrations of poly(ethylene oxide) in water may also be determined by viscometry (98) or by complexation with KI and then titration with Na2S2O3 (99). [Pg.343]
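The calibration step above (converting a turbidity reading to a poly(ethylene oxide) concentration via standards) is typically a linear fit. A minimal sketch; the standard concentrations and turbidity readings below are invented for illustration, not taken from reference (97):

```python
# Linear calibration sketch: turbidity readings of known PEO standards
# define a least-squares line, which converts an unknown sample's
# turbidity to concentration. All numeric values are hypothetical.

def fit_line(xs, ys):
    """Least-squares slope and intercept (pure Python)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

std_conc = [0.10, 0.20, 0.30, 0.40]   # ppm PEO standards (hypothetical)
std_turb = [0.05, 0.11, 0.16, 0.22]   # turbidity readings (hypothetical)
slope, intercept = fit_line(std_turb, std_conc)

unknown_turbidity = 0.14
print(f"~{slope * unknown_turbidity + intercept:.2f} ppm PEO")
```

Note the standards bracket the 0.2–0.4 ppm optimum range quoted in the text; samples outside it would first be diluted into range.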

Even below the condensation pressure the pressure-volume product was not perfectly constant. With measurements of sufficient accuracy and precision, we can see that the PV product of ammonia at 25°C is not really constant after all. It varies systematically from 24.45 at 0.1000 atmospheres to 23.10 at 9.800 atmospheres, just before condensation begins. Similar measurements on 28.0 grams of carbon monoxide at 0°C show that the PV product is 22.410 at 0.2500 atmospheres pressure, but if the pressure is raised to 4.000 atmospheres, the PV product becomes 22.308. This type of deviation is common. Careful measurements reveal the fact that no gas follows perfectly the generalization PV = a constant at all pressures. On the other hand, every gas follows this rule approximately, and the fit becomes better and better as the pressure is lowered. So we find that every gas approaches the behavior PV = a constant as pressure is lowered. [Pg.60]
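The carbon monoxide numbers above can be checked against the ideal-gas limit. A minimal sketch, using the PV products quoted in the text (in L·atm, for the 28.0 g = 1 mol CO sample at 0°C, where the ideal value is PV = nRT ≈ 22.41 L·atm):

```python
# Deviation of the PV product from the ideal-gas value for 1 mol CO
# at 0 degC. Measured PV products are those quoted in the text.

R = 0.082057                      # L*atm/(mol*K)
pv_ideal_co = R * 273.15          # ideal PV at 0 degC, about 22.41 L*atm

co_data = {0.25: 22.410, 4.0: 22.308}   # pressure (atm) -> measured PV

for p in sorted(co_data):
    dev = 100 * (co_data[p] - pv_ideal_co) / pv_ideal_co
    print(f"P = {p:.3f} atm: PV = {co_data[p]:.3f}, deviation {dev:+.2f}%")
```

The deviation at 0.25 atm is far smaller than at 4 atm, which is exactly the trend the passage describes: the PV product approaches a constant as pressure is lowered.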

[Author's comment] Because a general rendition of the Scientific Method cannot be cast in legally watertight wording, all possible outcomes of a series of measurements and pursuant actions must be in writing before the experiments are started. This includes, but is not limited to, the number of additional samples and measurements, and prescriptions on how to calculate and present final results. Off-the-cuff interpretations and decisions after the fact are viewed with suspicion. [Pg.277]

An important consideration for the direct physical measurement of adhesion via pull-off measurements is the influence of the precise direction of the applied force. In AFM the cantilever does not usually lie parallel to the surface, due to the risk that another part of the cantilever chip or chip holder will make contact with the surface before the tip. Another problem relates to the fact that the spot size in the optical beam deflection method is usually larger than the width of the lever. This can result in an interference effect between the reflection from the sample and the reflection from the cantilever. This is reduced if the cantilever and sample are not parallel. Most commercial AFM systems use an angle in the range of 10°-15° between the sample and the cantilever. Depending on this angle and the extent to which the cantilever is bent away from its equilibrium position, there can be a significant fraction of unintentional lateral forces applied to the contact. [Pg.30]

The v1 = 1 ← v″ = 1 transition will be at a different energy than the v1 = 0 ← v″ = 0 transition. We use this fact to measure the vibrational spectrum of V+(OCO) in a depletion experiment (Fig. 12a). A visible laser is set to the v1 = 0 ← v″ = 0 transition at 15,801 cm⁻¹, producing fragment ions. A tunable IR laser fires before the visible laser. Absorption of IR photons removes population from the ground state, which is observed as a decrease in the fragment-ion signal. This technique is a variation of ion-dip spectroscopy, in which ions produced by 1 + 1 REMPI are monitored as an IR laser is tuned. Ion-dip spectroscopy has been used by several groups to study vibrations of neutral clusters and biomolecules [157-162]. [Pg.358]
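The observable in the depletion experiment above is the fractional dip in the fragment-ion signal when the IR laser is on versus off. A minimal sketch of that arithmetic, with hypothetical signal values:

```python
# Depletion measurement sketch: the IR laser removes ground-state
# population, so the visible-laser fragment-ion signal drops; the
# fractional dip tracks the IR absorption. Signal counts are hypothetical.

def depletion_fraction(signal_ir_off: float, signal_ir_on: float) -> float:
    """Fraction of the fragment-ion signal removed by the IR laser."""
    if signal_ir_off <= 0:
        raise ValueError("baseline signal must be positive")
    return (signal_ir_off - signal_ir_on) / signal_ir_off

print(depletion_fraction(1000.0, 720.0))  # 0.28 -> 28% depletion
```

Scanning the IR laser and recording this fraction at each wavenumber traces out the ground-state vibrational spectrum.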

The current efficiencies for the different reaction products CO2, formaldehyde, and formic acid obtained upon potential-step methanol oxidation are plotted in Fig. 13.7d. The CO2 current efficiency (solid line) is characterized by an initial spike of up to about 70% directly after the potential step, followed by a rapid decay to about 54%, where it remains for the rest of the measurement. The initial spike in the calculated current efficiency for CO2 formation can be at least partly explained by an artifact similar to that discussed for formaldehyde oxidation before, caused by the fact that oxidation of the pre-formed COad contributes to the apparent CO2 current efficiency. The current efficiency for formic acid formation steps to a value of about 10% in the initial period of the measurement, and then decreases gradually to about 5% at the end of the measurement. Finally, the current efficiency for formaldehyde formation, which was not measured directly but was calculated from the difference between the total faradaic current and the partial reaction currents for CO2 and formic acid formation, shows an apparently slower increase during the initial phase and then remains about constant (final value about 40%). The initial increase is at least partly caused by the same artifact as discussed above for CO2 formation, only in the opposite sense. [Pg.441]
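The formaldehyde current efficiency above is obtained by difference: whatever fraction of the faradaic current is not accounted for by CO2 and formic acid is assigned to formaldehyde. A minimal sketch using the approximate steady-state values quoted in the text:

```python
# Current efficiency by difference: the HCHO efficiency is not measured
# directly but is what remains after CO2 and formic acid are accounted
# for. Values are the approximate steady-state numbers from the text.

def formaldehyde_efficiency(eff_co2: float, eff_hcooh: float) -> float:
    """Current efficiency (%) for formaldehyde by difference from 100%."""
    return 100.0 - eff_co2 - eff_hcooh

print(formaldehyde_efficiency(54.0, 5.0))  # 41.0, close to the ~40% final value
```

Because it is a difference quantity, any artifact in the CO2 efficiency (such as the initial COad-oxidation spike) appears in the formaldehyde efficiency with the opposite sign, as the passage notes.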





© 2024 chempedia.info