Big Chemical Encyclopedia


Mean of a random variable

The confidence interval characterizes the range about the mean of a random variable in which an observation can be expected with a given probability P, or risk α = 1 − P. As a statistical factor, the t value from Student's distribution should be used in the case of a normal distribution (cf. Section 2.2). The confidence interval for the mean, x̄, is calculated for f degrees of freedom by... [Pg.23]

The expectation or mean of a random variable is the average value over the whole distribution. For a discrete random variable with density f and state space S, we have, denoting expectation by the symbol E,... [Pg.408]
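For a discrete random variable, the definition above amounts to a probability-weighted sum over the state space. A minimal sketch, where the density and state space (a fair six-sided die) are illustrative assumptions rather than an example from the text:

```python
def expectation(density):
    """E[X] = sum of x * f(x) over the state space S, for a discrete density f
    given as a dict mapping each state x to its probability f(x)."""
    return sum(x * p for x, p in density.items())

# Hypothetical example: a fair six-sided die, S = {1, ..., 6}, f(x) = 1/6.
die = {x: 1 / 6 for x in range(1, 7)}
print(expectation(die))  # approximately 3.5
```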

To illustrate the general procedure, suppose we wish to construct a 95% confidence interval for the mean of a random variable. Suppose the sample size is n. As described earlier, the sample mean x̄ and the sample variance are unbiased estimators for the mean and variance of the underlying distribution. For a symmetric confidence interval the requirement is... [Pg.2268]
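A sketch of the symmetric interval described above. The sample values are illustrative assumptions, and the large-sample normal approximation is used; for small n, the z value should be replaced by Student's t value for n − 1 degrees of freedom, as the earlier excerpt notes.

```python
import statistics
from statistics import NormalDist

def confidence_interval(sample, confidence=0.95):
    """Symmetric large-sample confidence interval for the mean."""
    n = len(sample)
    xbar = statistics.mean(sample)   # unbiased estimator of the mean
    s = statistics.stdev(sample)     # sample standard deviation (n - 1 divisor)
    # Normal approximation: z is about 1.96 for a 95% interval. For small n,
    # substitute the t value for n - 1 degrees of freedom.
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    half_width = z * s / n ** 0.5
    return xbar - half_width, xbar + half_width

# Hypothetical measurements with sample mean 70.0.
lo, hi = confidence_interval([68, 70, 72, 69, 71, 70, 73, 68, 70, 69])
# The interval is symmetric about the sample mean.
```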

Main Effect, n. A change in the character of the relationship (effect) between a parameter and the level of one factor, averaged across all levels of all other factors in the experiment. More precisely, the response to a treatment can be expressed as the relationship between the mean of a random variable, X, and the levels of factors A and B, which can be expressed as ... [Pg.986]

The expected value of a random variable X is also called "the mean of X" and is often designated by μ. The expected value of (X − μ)² is called the variance of X. The positive square root of the variance is called the standard deviation. The terms σ² and σ (sigma squared and sigma) represent variance and standard deviation, respectively. Variance is a measure of the spread or dispersion of the values of the random variable about its mean value. The standard deviation is also a measure of spread or dispersion. The standard deviation is expressed in the same units as X, while the variance is expressed in the square of these units. [Pg.559]

The mean μ and the variance σ² of a random variable are constants characterizing the random variable's average value and dispersion about its mean. The mean and variance can be derived from the pdf of the random variable. If the pdf is unknown, however, the mean and the variance can be estimated on the basis of a random sample of observations on the random variable. Let X₁, X₂, ..., Xₙ denote a random sample of n observations on X. [Pg.562]

Here, f(x) is the probability of x occurrences of an event that occurs on the average μ times per unit of space or time. Both the mean and the variance of a random variable X having a Poisson distribution are μ. [Pg.581]
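A numerical sketch of this property: summing x·f(x) and (x − μ)²·f(x) over the Poisson pmf f(x) = e^(−μ) μˣ / x! recovers μ for both the mean and the variance. The value μ = 3 is an illustrative assumption.

```python
import math

def poisson_pmf(x, mu):
    """Poisson probability f(x) = exp(-mu) * mu**x / x!."""
    return math.exp(-mu) * mu ** x / math.factorial(x)

mu = 3.0
xs = range(50)  # truncated sum; the tail mass beyond 50 is negligible for mu = 3
mean = sum(x * poisson_pmf(x, mu) for x in xs)
var = sum((x - mean) ** 2 * poisson_pmf(x, mu) for x in xs)
# Both mean and var come out equal to mu (here 3.0) up to rounding.
```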

The mean and variance of a random variable X having a log-normal distribution are given by... [Pg.589]

Estimates of the parameters α and β in the pdf of a random variable X having a log-normal distribution can be obtained from a sample of observations on X by making use of the fact that ln X is normally distributed with mean α and standard deviation β. Therefore, the mean and standard deviation of the natural logarithms of the sample observations on X furnish estimates of α and β. To illustrate the procedure, suppose the time to failure T, in thousands of hours, was observed for a sample of 5 electric motors. The observed values of T were 8, 11, 16, 22, and 34. The natural logs of these observations are 2.08, 2.40, 2.77, 3.09, and 3.53. Assuming that T has a log-normal distribution, the estimates of the parameters α and β in the pdf are obtained from the mean and standard deviation of the natural logs of the observations on T. Applying Eqs. (19.10.1) and (19.10.2) yields 2.77 as the estimate of α and 0.57 as the estimate of β. [Pg.590]
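The worked example can be reproduced directly: take logs of the five observations, then compute their mean and sample standard deviation.

```python
import math
import statistics

t_obs = [8, 11, 16, 22, 34]          # time to failure, thousands of hours
logs = [math.log(t) for t in t_obs]  # ln T is normal with mean alpha, sd beta

alpha_hat = statistics.mean(logs)    # estimate of alpha
beta_hat = statistics.stdev(logs)    # estimate of beta (n - 1 divisor)

print(round(alpha_hat, 2), round(beta_hat, 2))  # 2.77 0.57, as in the text
```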

The term Monte Carlo is often used to describe a wide variety of numerical techniques that are applied to solve mathematical problems by means of the simulation of random variables. The intuitive concept of a random variable is a simple one: it is a variable that may take a given value of a set, but we do not know in advance which value it will take in a concrete case. The simplest example at hand is that of flipping a coin. We know that we will get heads or tails, but we do not know which of these two cases will result in the next toss. Experience shows that if the coin is a fair one and we flip it many times, we obtain an average of approximately half heads and half tails. So we say that the probability p of obtaining a given side of the coin is ½. A random variable is defined in terms of the values it may take and the related probabilities. In the example we consider, we may write... [Pg.668]
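A minimal Monte Carlo sketch of the coin-flipping example: simulating many fair flips, the observed fraction of heads approaches p = ½. The seed and flip count are illustrative choices.

```python
import random

def fraction_heads(n_flips, seed=0):
    """Simulate n_flips fair coin tosses and return the fraction of heads."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

print(fraction_heads(100_000))  # close to 0.5
```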

Parameter Two distinct definitions for parameter are used. In the first usage (preferred), parameter refers to the constants characterizing the probability density function or cumulative distribution function of a random variable. For example, if the random variable W is known to be normally distributed with mean μ and standard deviation σ, the constants μ and σ are called parameters. In the second usage, parameter can be a constant or an independent variable in a mathematical equation or model. For example, in the equation Z = X + 2Y, the independent variables (X, Y) and the constant (2) are all parameters. [Pg.181]

In general statistics there is a difference between the parent population of a random variable, e.g. x (sometimes also characterized by capital letters), and a single realization of the parent population expressed, e.g., as single measurements, xᵢ, of the variable x. The parent population means an infinity of values which follow a certain distribution function. In the reality of the experimental sciences one always has single realizations, xᵢ, of the random variable x. [Pg.25]

When one or more input process variables and some process and non-process parameters are characterized by means of a random distribution (frequently a normal distribution), the class of non-deterministic models, or models with random parameters, is introduced. Many models with distributed parameters are at the same time models with random parameters. [Pg.24]

The more measurements of a random variable, the better the estimated value based on the sample mean. However, even with a huge number of measurements the sample mean is at best an approximation of the true value and could in fact be way off (e.g., if there is something wrong with the instruments or procedures used to measure X). [Pg.17]

Consider two sets of measurements of a random variable, X—for example, the percentage conversion in the same batch reactor measured using two different experimental techniques. Scatter plots of X versus run number are shown in Figure 2.5-1. The sample mean of each set is 70%, but the measured values scatter over a much narrower range for the first set (from 68% to 73%) than for the second set (from 52% to 95%). In each case you would estimate the true value of X for the given experimental conditions as the sample mean, 70%, but you would clearly have more confidence in the estimate for Set (a) than in that for Set (b). [Pg.18]

Three quantities—the range, the sample variance, and the sample standard deviation—are used to express the extent to which values of a random variable scatter about their mean value. The range is simply the difference between the highest and lowest values of X in the data set ... [Pg.18]
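The three scatter measures can be computed directly from a data set. The individual values below are illustrative assumptions chosen to have a mean of 70 and to span 68–73, echoing the first measurement set discussed above; the sample variance uses the usual n − 1 divisor.

```python
data = [68, 69, 69, 70, 70, 70, 71, 73]  # hypothetical conversions, %

n = len(data)
xbar = sum(data) / n                               # sample mean
data_range = max(data) - min(data)                 # range: highest minus lowest
s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)  # sample variance
s = s2 ** 0.5                                      # sample standard deviation
```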

For a sample size of n observations of a random variable, the sample mean, an estimator of the population mean, is calculated as ... [Pg.119]

For two independent groups 1 and 2, with a sample of n₁ observations of a random variable from group 1 and n₂ observations of a random variable from group 2, the sample means from each group are ... [Pg.120]

Fig. 2.3. Frequency distribution of a random variable, x ~ N(0, 1). Note that x is the deviation from the mean (zero), measured in standard deviations.
The variance of a random variable is its expected squared deviation from its mean, or... [Pg.2147]

In basic statistics we learn that probability density functions can be defined by certain constants called distribution parameters. These parameters in turn can be used to characterize random variables through measures of location, shape, and variability of random phenomena. The most important parameters are the mean μ and the variance σ². The parameter μ is a measure of the center of the distribution (an analogy is the center of gravity of a mass) while σ² is a measure of its spread or range (an analogy being the moment of inertia of a mass). Hence, when we speak of the mean and the variance of a random variable, we refer to two statistical parameters (constants) that greatly characterize or influence the probabilistic behavior of the random variable. The mean or expected value of a random variable x is defined as... [Pg.2242]
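For a discrete distribution the two parameters can be computed side by side, using both the defining form σ² = E[(X − μ)²] and the equivalent shortcut σ² = E[X²] − μ². The pmf below (number of heads in two fair coin flips) is an illustrative assumption.

```python
# Hypothetical pmf: number of heads in two fair coin flips.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

mu = sum(x * p for x, p in pmf.items())                       # mean, E[X]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())      # E[(X - mu)^2]
var_shortcut = sum(x * x * p for x, p in pmf.items()) - mu**2 # E[X^2] - mu^2
# Both variance expressions agree, as they must.
```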

Industrial engineers frequently use simulation experiments to compare the performance of alternative systems and, ideally, to optimize system performance. When a system is modeled as a stochastic process, the objective is often to optimize expected performance, where expected means the mathematical expectation of a random variable. This section describes methods for optimization via simulation, using the problem of selecting the inventory policy that minimizes long-run expected cost per period as an illustration. [Pg.2487]

A resulting optimal solution x̄ is sometimes called an expected value solution. Of course, this approach requires that the mean of the random variable D be known to the decision maker. In the present example, the optimal solution of this deterministic optimization problem is x̄ = μ. Note that the mean solution x̄ can be very different from the solution x* given in (7). It is well known that quantiles are much more stable to variations of the cdf F than the corresponding mean value. Therefore, the optimal solution x* of the stochastic optimization problem is more robust with respect to variations of the probability distributions than an optimal solution x̄ of the corresponding deterministic optimization problem. This should not be surprising, since the deterministic problem (9) can be formulated in the framework of the stochastic optimization problem (4) by considering the trivial distribution of D being identically equal to μ. [Pg.2628]
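A sketch contrasting the two kinds of solution for a newsvendor-style problem: the deterministic (expected value) approach orders the mean demand, while the stochastic approach orders a quantile of the demand distribution F. The normal demand distribution, the cost values, and the critical-fractile rule are illustrative assumptions, not details from the text.

```python
from statistics import NormalDist

demand = NormalDist(mu=100, sigma=20)  # hypothetical demand distribution F
holding, backlog = 1.0, 3.0            # hypothetical per-unit over/under costs

# Deterministic problem: replace D by its mean, so the order is mu.
x_bar = demand.mean

# Stochastic problem: under these costs the optimal order is the
# critical-fractile quantile F^{-1}(backlog / (backlog + holding)).
x_star = demand.inv_cdf(backlog / (backlog + holding))
# x_star exceeds x_bar here, since under-stocking is the costlier error.
```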

A time series is a set of values for a sequence of random variables over time. Let X₁, X₂, ..., Xₙ be random variables denoting demands for periods 1, 2, ..., n. The forecasting problem is to estimate the demand for period (n + 1) given the observed values of demands for the last n periods, D₁, D₂, ..., Dₙ. The forecast of demand for period (n + 1) is then the predicted mean of the random variable Xₙ₊₁. In other words, ... [Pg.31]
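One common way to produce such a forecast, shown here as a sketch, is a moving average: the predicted mean of Xₙ₊₁ is the mean of the last k observed demands. The window size and demand values are illustrative assumptions.

```python
def moving_average_forecast(demands, k):
    """Forecast for the next period: mean of the last k observed demands."""
    window = demands[-k:]
    return sum(window) / len(window)

observed = [120, 132, 101, 134, 110, 128]  # hypothetical D_1, ..., D_6
forecast = moving_average_forecast(observed, k=3)  # (134 + 110 + 128) / 3
```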


© 2024 chempedia.info