Big Chemical Encyclopedia


Standardization calibration standards

Calibration standard: Calibration standards used for inspection, measuring, and test equipment shall be traceable to national or international standards. [Pg.234]

Calibration standards. Calibration standards used for inspection, measuring and test equipment shall be traceable to national or international standards. If national or international standards are not practical or available, the manufacturer shall use an independent reproducible standard. If no applicable standard exists, the manufacturer shall establish and maintain an in-house standard. [Pg.262]

Analytical standard, calibration standard, drift correction standard, primary standard... [Pg.45]

This is the essential characteristic for every lubricant. The kinematic viscosity is most often measured by recording the time needed for the oil to flow down a calibrated capillary tube. The viscosity varies with pressure, but the influence of temperature is much greater: viscosity decreases rapidly with an increase in temperature, and there is abundant literature concerning the equations and graphs relating these two parameters. One can cite in particular the ASTM D 341 standard. [Pg.282]
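For reference, the ASTM D 341 viscosity-temperature charts rest on a Walther-type relation, log10(log10(v + 0.7)) = A - B*log10(T), with v the kinematic viscosity in mm²/s and T the absolute temperature. A minimal Python sketch; the coefficients A and B here are hypothetical (in practice they are fitted from two measured points):

    import math

    def astm_d341_viscosity(T_kelvin, A, B):
        # Walther-type relation underlying the ASTM D 341 charts:
        # log10(log10(nu + 0.7)) = A - B * log10(T)
        Z = 10.0 ** (10.0 ** (A - B * math.log10(T_kelvin)))
        return Z - 0.7  # kinematic viscosity, mm^2/s

    # hypothetical coefficients fitted from two (T, nu) calibration points
    print(astm_d341_viscosity(373.15, A=9.0, B=3.5))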

The viscosity is determined by measuring the time it takes for a crude to flow through a capillary tube of a given length at a precise temperature. This is called the kinematic viscosity, expressed in mm²/s. It is defined by the standards NF T 60-100 and ASTM D 445. Viscosity can also be determined by measuring the time it takes for the oil to flow through a calibrated orifice (standard ASTM D 88); it is then expressed in Saybolt seconds (SSU). [Pg.318]
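In working form, the capillary determination reduces to multiplying the measured efflux time by the constant supplied with the calibrated tube (v = C*t, the working equation of ASTM D 445). A minimal sketch; the constant and time values are hypothetical:

    def kinematic_viscosity(efflux_time_s, viscometer_constant):
        # nu = C * t; C is certified for the individual calibrated tube
        return viscometer_constant * efflux_time_s  # mm^2/s

    print(kinematic_viscosity(300.0, 0.01))  # 3.0 mm^2/s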

Fig. 2 Conventional flat-bottom hole calibration standard for corrosion detection.
Fig. 3 Corrosion calibration standard for eddy current instrument using the gradient method.
The operation is quite simple: one sets the frequency to the lowest value, adjusts the gain and phase to the desired sensitivity using a special calibration standard discussed below, and performs a zero-compensation on a defect-free zone of the standard. Now one is ready to test. As one slides the probe across the surface of an aluminum structure, a signal response indicates the presence of corrosion or of a subsurface edge. [Pg.286]

The calibration procedure is based on rope specimens and corresponds to Standard Practice ASTM 1574. A piece of the rope under test having the nominal metallic cross-sectional area (LMA = 0) is used to set the zero point of the instrument. A rope section with a known LMA value is used to set the second point of the LMA calibration characteristic. It is also possible to use air-point calibration, when there is no rope in the magnetic head (LMA = 100%). [Pg.337]

Zero setup segment: Sect. 1 (0% LMA), cal. standard N2. Flaw setup segment: Sect. 2 (12.4% LMA), cal. standard N2. LF calibration... [Pg.339]
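The two-point procedure above amounts to fitting a straight line through the instrument readings on the 0% LMA segment and on the segment of known LMA. A minimal sketch; the signal values are hypothetical:

    def lma_calibration(signal_zero, signal_known, lma_known_pct):
        # linear two-point characteristic: LMA = slope * (signal - zero offset)
        slope = lma_known_pct / (signal_known - signal_zero)
        return lambda signal: slope * (signal - signal_zero)

    to_lma = lma_calibration(signal_zero=0.02, signal_known=0.47, lma_known_pct=12.4)
    print(to_lma(0.25))  # LMA (%) for an arbitrary test reading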

To evaluate the image quality of the processing system, one can determine classical parameters such as spatial resolution, contrast resolution, dynamic range, and local and global distortion. Guidelines for film digitization procedures are now well described. Furthermore, a physical standard film for both equipment assessment and digitization calibration and control will be available in the near future (4). [Pg.501]

BAM produces and distributes calibrating films for the measurement of the standard deviation of the density. [Pg.554]

Documentation of the area scanned, top view, side view, and all calibration data of the ultrasonic instrument and system (provided in a standard three-page report form)... [Pg.776]

Quality in NDT depends upon a number of factors. Among these, the qualification of NDT personnel, the technical state and correct choice of testing equipment, the availability of approved working procedures of examination, and the calibration of NDT equipment are of decisive importance for an NDT laboratory. Assessment of NDT laboratory competence is provided through accreditation in compliance with the EN 45000 series standards. [Pg.953]

Flow measurements using tracers are performed in all piping systems carrying oil, gas, or water, including separators, compressors, injector systems, and flares. Calibration of flow meters that are otherwise difficult to access is regularly performed by tracer methods, which are based on international standards. Tracer flow measurements are also well suited for special purposes... [Pg.1053]

For calculation of the volumetric flow rate, only the cross-sectional area of the pipe needs to be known. To express the flow under standard conditions, the temperature and pressure must be measured, and for conversion to mass flow the composition or density of the gas must be determined. These process parameters are often monitored by calibrated instrumentation. [Pg.1054]
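As a sketch of these conversions (ideal-gas behaviour assumed; all input values hypothetical):

    def flow_conversions(velocity_m_s, area_m2, T_K, p_Pa, density_kg_m3,
                         T_std_K=288.15, p_std_Pa=101325.0):
        q_actual = velocity_m_s * area_m2                       # m^3/s at line conditions
        q_std = q_actual * (p_Pa / p_std_Pa) * (T_std_K / T_K)  # ideal-gas correction
        mass_flow = q_actual * density_kg_m3                    # kg/s at line conditions
        return q_actual, q_std, mass_flow

    print(flow_conversions(2.0, 0.05, T_K=310.0, p_Pa=5.0e5, density_kg_m3=4.1))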

Measurement of the conductivity can be carried out to high precision with specially designed cells. In practice, these cells are calibrated by first measuring the conductance of an accurately known standard, and then introducing the sample under study. Conductances are usually measured at about 1 kHz AC rather than with DC voltages in order to avoid complications arising from electrolysis at anode and cathode [8]. [Pg.571]
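The calibration yields the cell constant, which is then applied to the conductance measured for the sample. A minimal sketch, assuming the conductivity of the standard solution is known; the numerical values are illustrative only:

    def cell_constant(kappa_std_S_per_m, G_std_S):
        # K_cell = kappa_std / G_std, from the run with the standard solution
        return kappa_std_S_per_m / G_std_S  # units: 1/m

    def sample_conductivity(K_cell_per_m, G_sample_S):
        return K_cell_per_m * G_sample_S  # S/m

    K = cell_constant(1.288, 0.0129)  # e.g. a KCl standard solution
    print(sample_conductivity(K, 0.0042))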

The ppm scale is always calibrated relative to the appropriate resonance of an agreed standard compound, because it is not possible to detect the NMR of bare nuclei, even though absolute shieldings can be calculated... [Pg.1445]
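For reference, the conversion from measured resonance frequencies to the ppm scale is delta = 10^6 * (v_sample - v_ref) / v_ref, where v_ref is the resonance frequency of the agreed standard (TMS for 1H and 13C). A minimal sketch:

    def chemical_shift_ppm(nu_sample_Hz, nu_ref_Hz):
        # delta = 1e6 * (nu_sample - nu_ref) / nu_ref
        return 1e6 * (nu_sample_Hz - nu_ref_Hz) / nu_ref_Hz

    # a proton resonating 2170 Hz downfield of TMS on a 300 MHz instrument
    print(chemical_shift_ppm(300_000_000 + 2170, 300_000_000))  # ~7.23 ppm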

With most non-isothermal calorimeters, it is necessary to relate the temperature rise to the quantity of energy released in the process by determining the calorimeter constant, which is the amount of energy required to increase the temperature of the calorimeter by one degree. This value can be determined by electrical calibration using a resistance heater, or by measurements on well-defined reference materials [1]. For example, in bomb calorimetry, the calorimeter constant is often determined from the temperature rise that occurs when a known mass of a highly pure standard sample of, for example, benzoic acid is burnt in oxygen. [Pg.1902]
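The arithmetic behind the benzoic acid determination is direct: the calorimeter constant is the energy released divided by the observed temperature rise. A minimal sketch using the commonly cited specific energy of combustion of benzoic acid (about 26.43 kJ/g); the sample mass and temperature rise are hypothetical:

    def calorimeter_constant(mass_g, delta_T_K, q_comb_kJ_per_g=26.43):
        # C_cal = q / dT, with q = m * (specific energy of combustion)
        return mass_g * q_comb_kJ_per_g / delta_T_K  # kJ/K

    C_cal = calorimeter_constant(mass_g=1.000, delta_T_K=2.45)
    print(C_cal)  # ~10.8 kJ/K; subsequent runs then use q = C_cal * dT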

Provided that the balance is functioning correctly, the main source of error is in the weights themselves; these should be calibrated by one of the standard methods so that their relative values are known, and they should be carefully cleaned with tissue paper and checked from time to time. To make the best use of the balance, weighing should be carried out by the method of swings, but for this purpose it is necessary first to determine the sensitivity of the balance. [Pg.465]

Tetramethylsilane (TMS) (Section 13.4) The molecule (CH3)4Si, used as a standard to calibrate proton and carbon-13 NMR spectra. [Pg.1295]

The scatter of the points around the calibration line, i.e. the random errors, is important, since the best-fit line will be used to estimate the concentration of test samples by interpolation. The method used to calculate the random errors in the values for the slope and intercept is now considered. We must first calculate the standard deviation s_y/x, which is given by ... [Pg.209]
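For reference, the standard least-squares expression is s_y/x = sqrt( sum of (y_i - yhat_i)^2 / (n - 2) ), where the yhat_i are the values on the fitted line. A minimal sketch with hypothetical calibration data:

    import math

    def s_y_x(y_measured, y_fitted):
        # s_y/x = sqrt( sum of squared residuals / (n - 2) )
        ss_resid = sum((y - f) ** 2 for y, f in zip(y_measured, y_fitted))
        return math.sqrt(ss_resid / (len(y_measured) - 2))

    print(s_y_x([0.10, 0.21, 0.29, 0.41], [0.105, 0.200, 0.295, 0.405]))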

Table 8.33 Standard Solutions for Calibrating Conductivity Vessels 8.160...
The most visible part of the analytical approach occurs in the laboratory. As part of the validation process, appropriate chemical or physical standards are used to calibrate any equipment being used and any solutions whose concentrations must be known. The selected samples are then analyzed and the raw data recorded. [Pg.6]

Analytical chemists make a distinction between calibration and standardization. Calibration ensures that the equipment or instrument used to measure the signal is operating correctly by using a standard known to produce an exact signal. Balances, for example, are calibrated using a standard weight whose mass can be traced to the internationally accepted platinum-iridium prototype kilogram. [Pg.47]

Examine a procedure from Standard Methods for the Analysis of Waters and Wastewaters (or another manual of standard analytical methods), and identify the steps taken to compensate for interferences, to calibrate equipment and instruments, to standardize the method, and to acquire a representative sample. [Pg.52]

Suppose that you need to add a reagent to a flask by several successive transfers using a class A 10-mL pipet. By calibrating the pipet (see Table 4.8), you know that it delivers a volume of 9.992 mL with a standard deviation of 0.006 mL. Since the pipet is calibrated, we can use the standard deviation as a measure of uncertainty. This uncertainty tells us that when we use the pipet to repetitively deliver 10 mL of solution, the volumes actually delivered are randomly scattered around the mean of 9.992 mL. [Pg.64]
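Because the transfers are independent, the delivered means add while the uncertainties add in quadrature: n transfers deliver n x 9.992 mL with a standard deviation of sqrt(n) x 0.006 mL. A minimal sketch:

    import math

    def total_delivery(n_transfers, mean_mL=9.992, sd_mL=0.006):
        # means add; independent standard deviations add in quadrature
        return n_transfers * mean_mL, math.sqrt(n_transfers) * sd_mL

    print(total_delivery(3))  # (29.976, ~0.0104) mL for three successive transfers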

Signals are measured using equipment or instruments that must be properly calibrated if Smeas is to be free of determinate errors. Calibration is accomplished against a standard, adjusting Smeas until it agrees with the standard's known signal. Several common examples of calibration are discussed here. [Pg.105]
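As a sketch of the adjustment, a single-point calibration simply scales the measured signal so that the instrument reproduces the standard's known value (function name and values hypothetical):

    def single_point_calibration(S_meas_of_standard, S_known):
        # correction factor forcing agreement with the standard's known signal
        k = S_known / S_meas_of_standard
        return lambda S_meas: k * S_meas

    correct = single_point_calibration(S_meas_of_standard=0.98, S_known=1.00)
    print(correct(0.49))  # corrected sample signal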





ASTM standards temperature calibration

Adjustment and calibration DKD, PTB national standards

Analytical Standardization and Calibration

Broad-standard linear calibration

Cadmium calibration standards

Calibration Raman shift standards

Calibration Transfer and Instrument Standardization

Calibration analytical standards

Calibration and standardization

Calibration broad-standard

Calibration by standard addition

Calibration curves standardization) routines

Calibration external standard

Calibration external standards used

Calibration graphs standard addition method used

Calibration internal standards used

Calibration luminescence standards

Calibration of Internal Standard

Calibration of Internal Standard to Acyl-ACP

Calibration qualitative standard

Calibration standard additions

Calibration standard solutions

Calibration standard-additions method

Calibration standards

Calibration standards INDEX

Calibration standards commercial assay kits

Calibration standards for

Calibration standards frequently used

Calibration standards immunoassay development

Calibration standards levels

Calibration standards measurement

Calibration standards suppliers

Calibration surrogate internal standard

Calibration volumetric internal standard

Calibration weight distribution standards

Calibration with Standards

Calibration with an External Standard

Calibration with an Internal Standard

Calibration, generally control standards

Channels calibration with standards

Conductivity, electrical calibration standards

Continuing calibration verification standards

Cross-section, absolute calibration standards

Direct standard calibration

Drug development calibration standard matrix

External standard calibration errors

External standard mode of instrument calibration

High-frequency measurements calibration standards

Indirect Calibration by Fluid Standards

Inductively coupled plasma mass internal standard calibration

Inductively coupled plasma mass standard addition calibration

Initial Calibration Verification Standards

Internal standard calibration

Isotope-dilution mass spectrometry calibration standards

Lignins calibration standards

Mass calibration standards

Matrix-matched calibration standards

Mercury standard, calibration

Metal calibration standards

Metal calibration standards indium

Multivariate calibration models transfer standardization methods

Narrow standard calibration

Nonionic surfactants calibration standards

Polystyrene calibration standard

Primary Calibrants and Internal Standards

Primary calibration standard

Quantification of Analytical Data via Calibration Curves in Mass Spectrometry Using Certified Reference Materials or Defined Standard Solutions

Relative standard deviation calibration

Sampling, Standardization, and Calibration

Selenium calibration standard

Spin concentration calibration standards

Standard KCl Solutions for Calibrating

Standard KCl Solutions for Calibrating Conductivity Cells

Standard Operating Procedures calibration

Standard Salt Solutions for Humidity Calibration

Standard addition method calibration graphs using

Standard addition mode of instrument calibration

Standard calibration curve methods

Standard deviation calibration plot

Standard deviation calibration-curve detection

Standard error of calibration

Standard of calibration

Standard operating procedure calibration models

Standard-additions method, calibration disadvantages

Standard-additions method, calibration effects

Standardization methods Calibration

Standards and Calibration

Standards for Calibration of Explosive Detectors

Standards wavelength calibration

Standards, in calibration

Standards, light emission intensity calibration

State-of-the-Art Commercial Instruments, Standards, and Calibration

Validation calibration/standard curve

Working calibration standards

Working calibration standards using
