Big Chemical Encyclopedia


Inferential Properties

Process Control: A Practical Approach, Myke King, 2011, John Wiley & Sons Ltd. ISBN 978-0-470-97587-9 [Pg.197]

Inferentials comprise a mathematical function (f) using a number of independent variables (x) to predict the value of a dependent variable (y). [Pg.198]
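
As a minimal sketch of the regression-derived type (the single-variable form and data are invented for illustration), such a function can be fitted to historical process data by ordinary least squares:

```python
# Minimal sketch of a regression-type inferential: predict a dependent
# variable y (e.g. a lab result) from one independent variable x
# (e.g. a tray temperature) using ordinary least squares.

def fit_inferential(xs, ys):
    """Return (slope, bias) minimising the squared prediction error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    bias = mean_y - slope * mean_x
    return slope, bias

def predict(slope, bias, x):
    return bias + slope * x

# Toy data lying exactly on y = 2x + 1, so the fit recovers it exactly.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]
slope, bias = fit_inferential(xs, ys)
print(round(slope, 6), round(bias, 6))  # 2.0 1.0
```

Real inferentials typically use several independent variables, but the principle of fitting f to historical (x, y) pairs is the same.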

They fall into two groups - those derived from regression analysis of historical process data and first-principle types which rely on engineering calculations. First-principle techniques still require some historical data to calibrate the model and to check its accuracy. While the vendors of first-principle techniques might argue that the volume of data required is less, the key to the success of both techniques is the quality of the data. The use of routinely collected data, for example from a plant history database, can often cause inaccuracies in the end result. [Pg.199]

Another potential problem is that of time-stamping. The dependent variable is often a laboratory result which may not be available until several hours after the sample was taken. It is therefore necessary to associate it with the operating conditions at the time of the sample. However, sample times are not necessarily reliable. Most sites will sample according to a schedule, but the true sample time may be very different. It may have been delayed because there was an operating problem at the time, or the sample may have been taken early to fit in with the sampler's workload. Often all the samples on a process are scheduled for the same time, but they clearly could not all be taken simultaneously. It is a misconception that, if the process is steady, recording the exact sample time is unimportant. [Pg.199]
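
A hypothetical sketch of the alignment step: the lab result must be matched to the historised operating conditions at the true sample time, not the time the result was reported (all names and numbers here are invented):

```python
# Hypothetical sketch: associate a lab result with the process
# conditions at the (true) sample time rather than at the time the
# result was reported. Times are minutes since midnight for simplicity.

def conditions_at(history, sample_time):
    """Return the historised snapshot closest to the sample time."""
    return min(history, key=lambda rec: abs(rec["time"] - sample_time))

history = [
    {"time": 480, "temp": 101.2},   # 08:00 snapshot
    {"time": 540, "temp": 103.8},   # 09:00 snapshot
    {"time": 600, "temp": 102.1},   # 10:00 snapshot
]

# Result reported at 14:00, but the sample was actually drawn at 08:50.
snap = conditions_at(history, 530)
print(snap["temp"])  # 103.8
```

If the recorded sample time is wrong by an hour, the wrong snapshot is selected and the regression is trained on mismatched pairs.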

Relying on routinely collected data will often not provide sufficient scatter. With modern data collection systems it is a trivial exercise to assemble information collected over several years. Even if a laboratory sample is only taken daily, assembling a thousand or more sets of data should present no problem. However, unless the process is required to make multiple grades of the product, each with very different specifications, even without automatic quality control the process operator will have kept the quality very close to target. Any large deviations will usually be due to process upsets and may not provide any reliable steady-state information. [Pg.200]


Properties to be controlled may not be measurable online fast enough to allow for a timely action by the manipulated variable. Such properties may have to be inferred from other measured properties. A column product purity or composition, for example, could be inferred from measured column temperatures on a number of trays. The required property is related to the measurements by inferential property correlations whose parameters must be determined. In the composition-temperature example, the correlation parameters are evaluated from measured temperatures and laboratory composition analysis, and are updated every time laboratory analyses become available. [Pg.561]
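
A minimal sketch of such an update, assuming a simple filtered bias correction (the gain value is an illustrative assumption, not taken from the source):

```python
# Sketch of a filtered bias update: when a laboratory analysis
# arrives, move the inferential's bias by a fraction of the observed
# prediction error rather than the full amount, damping lab noise.

def update_bias(bias, predicted, lab_value, gain=0.3):
    """Move the bias a fraction `gain` of the prediction error."""
    return bias + gain * (lab_value - predicted)

bias = 0.0
predicted = 5.0     # inferential output at the sample time
lab_value = 6.0     # laboratory analysis of the same sample
bias = update_bias(bias, predicted, lab_value)
print(bias)  # 0.3
```

The choice of gain trades off how quickly a genuine model drift is corrected against how much laboratory measurement noise is passed into the controller.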

Another contributor to the lag between a disturbance and controller action is associated with product analyzers' response time. Inferential property models that correlate product properties to readily measurable column variables can cut that response time (Smith, 2002). [Pg.569]

Model development in drug development is usually empirical or exploratory in nature. Models are developed using experimental data and then refined until a reasonable balance is obtained between overfitting and underfitting. This iterative process in model selection results in models that have overly optimistic inferential properties because the uncertainty in the model is not taken into account. No universally accepted solution to this problem has been found. [Pg.56]

The lowest layer of process control applications is described as regulatory control. This includes all the basic controllers for flow, temperature, pressure and level. But it also includes control of product quality. Regulatory is not synonymous with basic. Regulatory controls are those which maintain the process at a desired condition, or SP, but that does not mean they are simple. They can involve complex instrumentation such as on-stream analysers. They can employ advanced techniques such as signal conditioning, feedforward, dynamic compensation, overrides, inferential properties etc. Such techniques are often described as advanced regulatory control (ARC). Generally they are implemented... [Pg.1]

Laboratory samples are often collected during plant tests. These are usually to support the development of inferential properties (as described in Chapter 9). Indeed, steady operation under conditions away from normal operation can provide valuable data 'scatter'. Occasionally a series of samples is collected to obtain dynamic behaviour, for example if an on-stream analyser is temporarily out of service or its installation delayed. The additional laboratory testing generated may be substantial compared to the normal workload. If the laboratory is not expecting this, then analysis may be delayed for several days with the risk that the samples may degrade. [Pg.12]

Figure 9.9 shows the performance of an inferential developed by the author. With an R² of 0.99 one would question why the developer is not a multi-billionaire. The reason is that it failed to predict the large falls in the value of the stock. The three occasions circled completely undermine the usefulness of the prediction. The same is true of an inferential property. If there is no change in the property then, no matter how accurate, the inferential has no value. If it then fails to respond to any significant change it may as well be abandoned. [Pg.205]

The Fenske Equation is normally used to estimate the minimum number of theoretical trays (N) for a column, i.e. the number necessary to achieve the required separation when operating with total reflux. While of little value in column design, it is a parameter used by others in inferential property calculations. We will cover this later in this chapter. [Pg.274]
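
A sketch of the calculation, assuming the common binary form of the Fenske equation with light-key mole fractions in the distillate and bottoms and an average relative volatility:

```python
import math

# Fenske equation (binary form assumed): minimum number of
# theoretical stages at total reflux, from the distillate and
# bottoms light-key fractions and the relative volatility.

def fenske_min_stages(x_dist, x_bot, alpha):
    """N = ln[(xD/(1-xD)) * ((1-xB)/xB)] / ln(alpha)."""
    sep = (x_dist / (1.0 - x_dist)) * ((1.0 - x_bot) / x_bot)
    return math.log(sep) / math.log(alpha)

# e.g. 95 % light key overhead, 5 % in bottoms, alpha = 2.5
n_min = fenske_min_stages(0.95, 0.05, 2.5)
print(round(n_min, 2))  # 6.43
```

The tighter the required separation (or the lower the relative volatility), the larger N becomes, which is what makes it a useful intermediate parameter in composition inferentials.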

There are several ways in which dT/dP might be determined. The first assumes that sufficient good quality historical data exists so that an inferential property can be developed in the form... [Pg.328]

Figure 12.98 uses the data collected during this test to develop an inferential property calculation for bottoms composition. Despite the highly nonlinear behaviour displayed in the previous figures, a strong linear relationship exists between the C3 content of bottoms and the reciprocal reboil ratio. [Pg.336]
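
The idea can be sketched with invented numbers: fitting the C3 content against the reciprocal of the reboil ratio turns a nonlinear relationship into a linear one.

```python
# Sketch with invented numbers: a nonlinear variable can often be
# linearised by a transformation. Here bottoms C3 content is fitted
# against the reciprocal of the reboil ratio rather than the ratio.

reboil_ratio = [2.0, 2.5, 4.0, 5.0]
c3_bottoms   = [5.2, 4.2, 2.7, 2.2]   # invented % values

x = [1.0 / r for r in reboil_ratio]   # linearising transform

# ordinary least squares on the transformed variable
n = len(x)
mx = sum(x) / n
my = sum(c3_bottoms) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, c3_bottoms)) \
        / sum((xi - mx) ** 2 for xi in x)
bias = my - slope * mx

# predict C3 content at a reboil ratio of 3.0
pred = bias + slope / 3.0
print(round(pred, 2))  # 3.53
```

Choosing the right transform from engineering insight is what distinguishes a robust inferential from a blind statistical fit.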

There are literally thousands of inferential properties, so-called 'soft sensors', in use today that are ineffective. Indeed many of them are so inaccurate that process profitability would be improved by decommissioning them. Chapter 9 shows how many of the statistical techniques that are used to assess their accuracy are flawed and can lead the engineer into believing that their performance is adequate. It also demonstrates that automatically updating the inferential bias with laboratory results will generally aggravate the problem. [Pg.411]
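
A toy simulation (all numbers invented) of why full automatic bias updating can aggravate matters: when the inferential's error is random scatter rather than a drifting bias, each correction injects the previous sample's noise into the next prediction, roughly doubling the error variance.

```python
import random

# Toy simulation: compare the prediction error of an inferential with
# a fixed (correct) bias against one whose bias is fully reset to the
# last observed error after every laboratory result.

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(5000)]

err_no_update = noise  # error when the bias is left alone

bias = 0.0
err_full_update = []
for e in noise:
    err_full_update.append(e - bias)   # prediction error seen
    bias = e                           # full bias correction applied

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

ratio = variance(err_full_update) / variance(err_no_update)
print(round(ratio, 2))  # close to 2.0 in this simulation
```

The differenced error sequence e(k) - e(k-1) has roughly twice the variance of e(k) when the errors are independent, which is why filtering (or not updating at all) is usually preferable.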

Chapter 16 discusses all the various tests that can be conducted on refractories to determine the thermal and mechanical material properties, as well as tests to determine such properties as spalling resistance and other inferential properties. Spalling tests are defined as inferential, because the refractory shape is not subjected to the complete stress-strain operating environment during testing. However, these tests are very valuable in ranking candidate refractory materials. [Pg.514]

Inferential sensors, also known as soft sensors, are models that use readily measurable variables to determine product properties critical to the prediction of product or process quality. Ideally the soft sensors are continuously monitored and controlled, or monitored on a relevant time scale. They need to make predictions quickly enough to be used for feedback control to keep process variability to a minimum. [Pg.536]

The shortcoming of all methods for predetermining cure cycles that regulate secondary variables is that they deal only in expectations and probabilities. No matter how many eventualities are anticipated, there is always one more that is unexpected. Unexpected variations in material properties, process equipment malfunctions, and changes to geometries of tool and part all contribute to the uncertainty of the outcome. As a result, in-process, inferential control is needed for the process environment as well as the boundary conditions and material state. Inferential control is relatively new to the polymer processing industry, especially in complex processes where good models are not yet common. [Pg.458]

There are numerous properties of materials which can be used as measures of composition, e.g. preferential adsorption of components (as in chromatography), absorption of electromagnetic waves (infra-red, ultra-violet, etc.), refractive index, pH, density, etc. In many cases, however, the property will not give a unique result if there are more than two components, e.g. there may be a number of different compositions of a particular ternary liquid mixture which will have the same refractive index or will exhibit the same infra-red radiation absorption characteristics. Other difficulties can make a particular physical property unsuitable as a measure of composition for a particular system, e.g. the dielectric constant cannot be used if water is present as the dielectric constant of water is very much greater than that of most other liquids. Instruments containing optical systems (e.g. refractometers) and/or electromechanical feedback systems (e.g. some infra-red analysers) can be sensitive to mechanical vibration. In cases where it is not practicable to measure composition directly, then indirect or inferential means of obtaining a measurement which itself is a function of composition may be employed (e.g. the use of boiling temperature in a distillation column as a measure of the liquid composition—see Section 7.3.1). [Pg.497]

The inferential method relies upon the availability of accurate concentration data and corresponding deposition velocities. However, knowledge of these properties alone does not permit the desired deposition data to be computed. As an extension of dry deposition research programs, a trial network has been set up to test the inferential method. Here, the scientific basis for the network operation will be discussed, and preliminary data will be presented. [Pg.196]

Without proper antecedent basis a claim is rendered indefinite. "A" or "an" introduces an element for the first time, except in a means-plus-function format. "Said" or "the" refers back to previously introduced elements or limitations, or refers to inherent properties (not required to be recited for antecedent purposes, for example "the surface of said element" when "surface" is not defined earlier). Inferential claiming, where the interconnectivity of elements is not certain, does not make clear whether the element is part of the combination or not. [Pg.49]

At just about the same time as Shannon's paper was being written, Richard T. Cox published an equally significant paper, "Probability, Frequency, and Reasonable Expectation" (9). In the same style as Shannon, Cox proceeded to define a problem in inferential logic and then to design a mathematical function suitable for the problem. Both Cox and Shannon defined their functions by requiring certain properties such as additivity, decomposability, consistency, etc., and then found the functional forms which met the requirements. [Pg.278]

With linear models, exact inferential procedures are available for any sample size. The reason is that, as a result of the linearity of the model parameters, the parameter estimates are unbiased with minimum variance when the assumption of independent, normally distributed residuals with constant variance holds. The same is not true with nonlinear models because, even if the residuals assumption is true, the parameter estimates are not necessarily unbiased, nor do they necessarily have minimum variance. Thus, inferences about the model parameter estimates are usually based on large sample sizes because the properties of these estimators are asymptotic, i.e., they hold as n → ∞. Only when n is large and the residuals assumption is true will nonlinear regression parameter estimates be approximately normally distributed and almost unbiased with minimum variance. As n increases, the bias and the estimation variability will decrease. [Pg.104]

Experimental evidence to date implies that regulation in SM is complex. Many of the mechanisms are proposed on inferential grounds and attempt to explain specific observations without an analysis of the capacity to predict the overall mechanical output and energetic properties of SM. We begin by consideration of the information required to confirm a role for potential regulatory mechanisms. [Pg.341]

Inferential analysis [20, 21] is not a spectroscopy but could have a bearing upon the use of all process analysis techniques. It is a term used to describe measurements that are not made directly but are inferred from other properties of the process under scrutiny. These methods rely upon process models being available for the process concerned. The value of this approach, quite apart from the fact that no expensive equipment is needed, is that it can give an indication of a measurement when it is impossible to extract a sample without it undergoing change, or where inserting a probe is impractical. Inferential methods can also be useful to provide values between the readings of the installed measurement devices, or indeed when the measurement devices are off-line for maintenance purposes. The quality of an intermediate or a product can in some instances be inferred from the values of temperature, pressure and flow rates in the area of the process under consideration. [Pg.873]

The laws of atomic heats and of isomorphism demonstrated that apparently unexciting empirical correlations of physical properties could have dramatic inferential applications to the microworld. Berzelius and others regularly used such tools in the construction of chemical atomism. And in fact, if one is interested in inference to the microworld, the study of physical properties has one important epistemological advantage over the study of chemical reactions. Namely, these properties reflect a stable substance being observed statically, rather than a substance in a dynamic state, undergoing a chemical transformation into another substance. [Pg.266]

On-line measurements of parameters such as ash, moisture, and sulfur use an inferential technique in which some property can be measured that in turn relates to the parameter in the material. [Pg.21]

Figure 9.6 shows the method favoured by suppliers of inferentials. Line charts tend to lead one to believe the correlation between the inferential and the actual property is better than it is. Presenting the same data in Figure 9.7 as a scatter plot gives a more precise measure. For example, if the true property is 50 %, the inferential will read between 30 % and 70 % - probably far too inaccurate to be of any value. [Pg.203]

This parameter will have a value of 1 when the inferential is perfect and 0 when its benefit is zero. But importantly it will be negative if the property controller performance is so bad that process performance would be improved by switching off the controller. Figure 9.10 trends this parameter for the stock price example. It confirms what we know, that the prediction will lose us money on several occasions. [Pg.206]

The values of this parameter in columns 2 and 3 of the table confirm, unlike R², that the inferential would be so poor that its use would cause control of the property to worsen. In column 4, where only a bias error is introduced, both R² and this parameter show that the inferential would be perfect, requiring just a once-off correction for it to be useful. [Pg.206]

Model-predictive control (MPC) uses process models derived from past process behaviour to predict future process behaviour. The predictions are used to control process units dynamically at optimum steady-state targets. MPC applications may also include the use of predicted product properties (inferential analyzers) and certain process calculations. Model-predictive controllers almost always include multiple independent variables. [Pg.249]

Although many process variables are easily measured, the lack of on-line sensors for key polymer properties renders quality control of polymer plants difficult. Process control schemes based on process variables (p, T, flow-rate and feedstock compositions) alone are no longer sufficient, because these cannot reveal all material property variations. Significant efforts are being spent on improvements to process control systems, as exemplified by the numerous attempts to monitor polymer properties during processing, such as composition, density, viscosity and dispersion of a minor phase, etc., all of which are somehow difficult to measure. The development of an on-line inferential system for polymer property is a very active research area of polymerisation reactor control [1]. A schematic of inferential systems is illustrated in Fig. 7.1. For highest quality... [Pg.663]



