Big Chemical Encyclopedia


Detection anomaly

Measurement of exoenzymatic activities is potentially useful for detecting the effects of toxicants on heterotrophic biofilm communities. Their sensitivity and their direct relationship to organic matter use, and therefore to microbial growth, make extracellular enzyme activities a relevant tool for assessing the toxicity of specific compounds. Novel approaches that combine enzymatic and microscopic tools (e.g. ELF-phosphatase) may be extremely useful for detecting anomalies at the sub-cellular scale. [Pg.399]

Figure 5.4. Block scheme of a monitoring system to detect anomalies in the environment.
Another approach to detecting anomalies on the body is millimeter-wave technology, which uses non-ionizing, low-power radiation and can therefore be used on people to detect explosives, drugs, plastics, ceramics, wood, paper, metals, and other items concealed under clothing. [Pg.389]

The uterus is opened and the implants, live and dead fetuses, dead embryo-fetal primordia undergoing resorption and the respective placentas, as well as the corpora lutea in the ovaries, are counted and examined macroscopically. The fetuses are assessed for signs of life, sex, outward appearance and outwardly detectable anomalies, and their body weight and, optionally, crown-rump length are measured. The rat fetuses are then killed by CO2 asphyxia. The rabbit fetuses are immediately placed in an incubator for 24 hours to test their capacity to survive and are then also killed by CO2 asphyxia. [Pg.844]

Quality Control of Calibration Graphs. The main objective of any analytical method is to report reliable and accurate results that are informative in supporting the quality of products, be it as a part of problem-solving, detecting anomalies in samples,... [Pg.84]

Metal detectors have been in use since World War II and are still the most effective sensors for use against landmines and other UXO. There are two types of metal detectors. One detects anomalies in the earth's magnetic field caused by ferrous (iron-based) materials. The other creates an electromagnetic field that can detect both ferrous and nonferrous metals. Improvements made to metal detectors have reportedly been in processing sensor information, weight reduction, and improved sensitivity to disturbances in the magnetic field caused by metallic objects. [Pg.190]

For the given purpose, namely to detect anomalies and to trace typical sources of pollution ("hot spots"), the standard scheme, from sampling, sample preparation (with particular emphasis on grain size correction), chemical analysis (use of dry... [Pg.377]

In reconnaissance, measurements were generally made from the vehicle while it was driven at standard highway speeds. This was sufficient to detect anomalies greater than 100 ng/m³. Once a mercury anomaly was detected, the direction of the wind was noted, and the plume was traced upwind to its source wherever possible. Traverses were then run at right angles to the wind to obtain profiles across the plume at increasing distances from the source. [Pg.84]
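The quoted detection limit lends itself to a very simple screening rule. The sketch below is purely illustrative and not from the original survey; the readings, timestamps, and function names are invented. It flags any mercury reading above 100 ng/m³ so that the operator can note the wind direction and begin tracing the plume.

```python
# Illustrative sketch (hypothetical data): flag mercury readings that exceed the
# reconnaissance detection threshold quoted in the text (100 ng/m^3).

DETECTION_THRESHOLD_NG_M3 = 100.0

def flag_anomalies(readings):
    """Return (timestamp, value) pairs whose mercury level exceeds the threshold."""
    return [(t, v) for t, v in readings if v > DETECTION_THRESHOLD_NG_M3]

if __name__ == "__main__":
    survey = [(0, 12.0), (1, 35.0), (2, 240.0), (3, 180.0), (4, 20.0)]  # ng/m^3
    for t, v in flag_anomalies(survey):
        print(f"t={t}: {v} ng/m^3 -- possible mercury anomaly, note wind direction")
```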

There is also the awareness that sensitivity to environmental insult, and subsequent expression of that insult, does not cease with birth. The mammal at term is not a miniature adult; a partial list of systems still undergoing differentiation includes the nervous, endocrine, urogenital, digestive and immune systems. Expression of an insult incurred in utero may not develop until after birth, in the human up to ten years of age for most detected anomalies, but with a latency of 15-30 years for carcinogenic events. [Pg.116]

Total times of detection | Times of detected anomalies | Times of successful prediction | Times of turnover... [Pg.75]

Control Algorithm Flaws: After the ground launch personnel cutbacks, SSLS management did not create a master surveillance plan to define the tasks of the remaining personnel (the formal insight plan was still in draft). In particular, there were no formal processes established to check the validity of the I1 filter constants or to monitor attitude rates once the flight tape was loaded into the INU at Cape Canaveral Air Station (CCAS) prior to launch. SSLS launch personnel were provided with no documented requirement or procedures to review the data and no references with which to compare the observed data in order to detect anomalies. [Pg.484]
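As a rough illustration of the kind of check the report says was absent, the sketch below compares loaded constants and observed attitude rates against documented reference bounds. The parameter names and limits are hypothetical placeholders, not the actual Titan/Centaur values.

```python
# Hypothetical sketch: compare loaded constants and observed rates against
# documented reference bounds. Names and limits below are invented.

REFERENCE_BOUNDS = {
    "roll_rate_filter_constant": (-2.0, 2.0),   # assumed placeholder bounds
    "roll_attitude_rate_deg_s": (-0.5, 0.5),
}

def check_against_reference(observed):
    """Return (name, value, bounds) tuples that fall outside their reference bounds."""
    violations = []
    for name, value in observed.items():
        low, high = REFERENCE_BOUNDS.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            violations.append((name, value, (low, high)))
    return violations

if __name__ == "__main__":
    loaded = {"roll_rate_filter_constant": 10.0, "roll_attitude_rate_deg_s": 0.1}
    for name, value, bounds in check_against_reference(loaded):
        print(f"ANOMALY: {name}={value} outside reference range {bounds}")
```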

The MEDUSA has been used as a regular part of plant maintenance and operation since 1990, and it is useful for detecting anomalies and diagnosing their causes. As an example of vibration monitoring in JOYO, the vibration level of a cooling blower suddenly increased by a factor of ten in December 1992. This blower is placed in a nitrogen gas atmosphere and cannot be repaired while the reactor is operating. [Pg.54]
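A minimal sketch of this kind of vibration surveillance is given below; it is not the actual MEDUSA algorithm. It flags a channel whose current level exceeds a running baseline by a chosen factor (ten here, matching the blower incident); the baseline length and the factor are assumptions.

```python
# Minimal sketch (not the MEDUSA algorithm): flag a vibration channel whose
# current level exceeds a running baseline by a chosen factor.

from collections import deque

class VibrationMonitor:
    def __init__(self, baseline_len=100, factor=10.0):
        self.history = deque(maxlen=baseline_len)  # recent "normal" levels
        self.factor = factor

    def update(self, level):
        """Return True if the new level exceeds the baseline by the factor."""
        anomalous = False
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            anomalous = baseline > 0 and level > self.factor * baseline
        if not anomalous:
            self.history.append(level)  # keep the baseline free of anomalous samples
        return anomalous
```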

Variations in isotopic abundances that are caused by nuclear reactions induced by cosmic rays are most commonly utilized in cosmic ray exposure dating, but this employs isotopes that are measured by either accelerator or noble gas mass spectrometry [28, 29]. In fact, there are only a very limited number of elements that are suitable for the study of cosmogenic isotopic variations and that can be readily analyzed by either TIMS or MC-ICP-MS [28]. The most important application of these techniques is the study of the secondary neutron fluxes that are generated by (primary) cosmic rays. Such measurements aim to detect anomalies in Sm, Gd, and Cd isotopic abundances that are produced by (n,γ) reactions, for example ¹¹³Cd(n,γ)¹¹⁴Cd. Many of these investigations were conducted by TIMS [137-139], but some cosmogenic Cd isotope variations of lunar rocks and soils were evaluated based on MC-ICP-MS isotope ratio data that were originally acquired as part of a stable isotope study [134]. [Pg.306]
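Isotopic anomalies of this kind are conventionally reported as deviations of a measured isotope ratio R from a terrestrial standard ratio, commonly in epsilon units (parts per 10⁴). The expression below is the standard definition of that notation rather than anything specific to the cited studies:

```latex
\varepsilon = \left( \frac{R_{\text{sample}}}{R_{\text{standard}}} - 1 \right) \times 10^{4}
```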

Nowadays, different assets can be identified in our society whose compromise could have catastrophic consequences, e.g. energy production and distribution, telecommunications, water supply and others. For this reason, such assets are classified as critical [8]. Specifically, the term Critical Infrastructure (CI) is used to describe those assets that are essential for a society and must be available 365 days a year and 24 hours a day. Thus, CI monitoring is a very important task to avoid disasters. CI monitoring is often performed through IT solutions so as to allow CI operators to detect anomalies which could cause failures. However, the drawback of such solutions is that they are exposed to cyber attacks. [Pg.339]

With the advent of modern computer-based systems for monitoring pipelines, API developed an appropriate standard, Std 1130, Computational Pipeline Monitoring (First Edition, October 1995). This standard is particularly important for detecting anomalies which can be attributed to leaks, rupture of the line, etc. It covers algorithmic monitoring tools to help the person in charge of control and monitoring detect such anomalies. [Pg.561]
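One of the simplest algorithmic tools in this family is a line (volume) balance test. The sketch below illustrates the general idea only and is not taken from the standard's text: a persistent imbalance between metered inflow, metered outflow, and the change in line pack beyond a tolerance suggests a leak or rupture. The tolerance value is an assumption; in practice it depends on meter accuracy and operating conditions.

```python
# Illustrative volume-balance sketch (not from API Std 1130 itself).

def volume_imbalance(inflow_m3, outflow_m3, delta_linepack_m3):
    """Volume that entered the line but is not accounted for at the outlet."""
    return inflow_m3 - outflow_m3 - delta_linepack_m3

def leak_suspected(inflow_m3, outflow_m3, delta_linepack_m3, tolerance_m3=5.0):
    # tolerance_m3 is an assumed value chosen only for illustration
    return abs(volume_imbalance(inflow_m3, outflow_m3, delta_linepack_m3)) > tolerance_m3
```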

Unfortunately, IEC standards have not accepted the idea that basic software must not only be able to operate but also to detect anomalies introduced into memory by parasitic interference. [Pg.126]
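A common way for basic software to detect such memory anomalies is to keep a checksum of critical data and re-verify it periodically, so that corruption caused by interference is detected rather than silently used. The sketch below is a generic illustration; the use of CRC-32 and the notion of a "critical block" are assumptions, not requirements of any particular standard.

```python
# Generic sketch: detect memory-block corruption via a stored CRC-32 checksum.

import zlib

def crc_of(block: bytes) -> int:
    """Checksum recorded when the block is known to be good."""
    return zlib.crc32(block)

def memory_ok(block: bytes, stored_crc: int) -> bool:
    """Return False if the block no longer matches its recorded checksum."""
    return zlib.crc32(block) == stored_crc
```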

Run-time verification (RV) [7,16] entails adding pieces of code called monitors to a running application. The monitors scrutinise the system behaviour and check if it respects associated specifications. The monitors can detect anomalies during the execution of the application. That information may be logged and back... [Pg.66]
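A minimal, framework-independent sketch of such a monitor is shown below. The event names and the two example properties (no "send" before "init"; every "open" eventually matched by a "close") are hypothetical, chosen only to show how a monitor observes the execution and records violations.

```python
# Minimal run-time verification sketch: observe an event stream and record
# violations of two simple, hypothetical specifications.

class Monitor:
    def __init__(self):
        self.initialised = False
        self.open_count = 0
        self.violations = []

    def observe(self, event):
        """Feed one event of the running application to the monitor."""
        if event == "init":
            self.initialised = True
        elif event == "send" and not self.initialised:
            self.violations.append("send before init")
        elif event == "open":
            self.open_count += 1
        elif event == "close":
            self.open_count -= 1

    def at_end(self):
        """Check end-of-run obligations and return all recorded violations."""
        if self.open_count != 0:
            self.violations.append("unmatched open/close")
        return self.violations
```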

Diversity and fault tolerance. The system has several layers of fault tolerance - at operator, system, and unit levels. Since the operator obtains several variants of the data, (s)he can detect anomalies and initiate manual error recovery. At the system level, the system exceeds its fault tolerance limit only if all N modules fail at once. Finally, at the DPU level, even if all DPUs fail to produce fresh data, a DPU keeps displaying data based on the last good value for as long as that value remains fresh. At the same time, software diversity significantly contributes to achieving data integrity - it diminishes the possibility of a common processing error. [Pg.67]
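The freshness behaviour described for the DPU level can be sketched as follows; the class, the two-second freshness window, and the "STALE" marker are assumptions made for illustration, not details of the actual system.

```python
# Hypothetical sketch: keep showing the last good value while it is still
# within its freshness window, then mark the channel as stale.

import time

class Display:
    def __init__(self, freshness_window_s=2.0):
        self.freshness_window_s = freshness_window_s  # assumed window length
        self.last_value = None
        self.last_update = None

    def new_data(self, value):
        self.last_value = value
        self.last_update = time.monotonic()

    def render(self):
        if self.last_update is None:
            return "NO DATA"
        if time.monotonic() - self.last_update <= self.freshness_window_s:
            return str(self.last_value)          # last good value, still fresh
        return f"STALE ({self.last_value})"      # freshness window exceeded
```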

This term is used to describe the principle that a detected anomaly must first be confirmed over one or more cycles before being processed, thereby limiting the impact of false alarms and temporary faults. This of course depends on the severity of the anomaly. [Pg.290]
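A minimal sketch of this confirmation principle, assuming a severity-dependent number of confirmation cycles (the mapping below is invented), might look as follows:

```python
# Illustrative sketch: act on an anomaly only after it has persisted for a
# number of consecutive cycles that depends on its (assumed) severity.

CONFIRMATION_CYCLES = {"minor": 3, "major": 1}  # assumed values

class AnomalyConfirmer:
    def __init__(self, severity="minor"):
        self.required = CONFIRMATION_CYCLES[severity]
        self.count = 0

    def cycle(self, anomaly_present: bool) -> bool:
        """Return True only once the anomaly has persisted for enough cycles."""
        self.count = self.count + 1 if anomaly_present else 0
        return self.count >= self.required
```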

Because of the relatively short displacement time or length scales typically probed by NMR diffusometry, it is particularly well suited to detecting anomalies in the segment displacement behavior expected on a time scale shorter than the terminal relaxation time, that is, for root-mean-squared displacements smaller than the random-coil dimension. All models discussed above unanimously predict such anomalies (see Tables 1-3). Therefore, considering the exponents of anomalous mean squared displacement laws alone does not provide decisive answers. In order to obtain a consistent and objective picture, it is rather crucial to make sure that (i) the absolute values of the mean squared segment displacement or the time-dependent diffusion coefficient are compatible with the theory, (ii) the dependence on other experimental parameters such as the molecular weight is correctly rendered, and (iii) the values of the limiting time constants are not at variance with those derived from other techniques. [Pg.99]
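For reference, the anomalies in question are sub-diffusive mean-squared-displacement laws of the general form shown below; the exponent κ depends on the model and time regime (see Tables 1-3), and D(t) is the apparent time-dependent diffusion coefficient mentioned in point (i):

```latex
\langle r^{2}(t) \rangle \propto t^{\kappa}, \qquad \kappa < 1,
\qquad
D(t) = \frac{\langle r^{2}(t) \rangle}{6t}
```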


See other pages where Detection anomaly is mentioned: [Pg.355], [Pg.230], [Pg.238], [Pg.371], [Pg.199], [Pg.32], [Pg.2145], [Pg.2147], [Pg.327], [Pg.341], [Pg.421], [Pg.211], [Pg.57], [Pg.89], [Pg.49], [Pg.321], [Pg.262], [Pg.73], [Pg.300]
See also in source #XX -- [Pg.383]



