Big Chemical Encyclopedia


Random variable, distribution function

The physical and conceptual importance of the normal distribution rests on one unique property: the sum of n random variables distributed with almost any arbitrary distribution tends to be distributed as a normal variable when n → ∞ (the Central Limit Theorem). Most processes that result from the addition of numerous elementary processes can therefore be adequately parameterized with normal random variables. On any sort of axis that extends from −∞ to +∞, or when the density on the negative side is negligible, most physical or chemical random variables can be represented to a good approximation by a normal density function. The normal distribution can be viewed as a position distribution. [Pg.184]

Haring and Greenkorn (1970) developed an alternative statistical model for predicting dispersion in a network of randomly intersecting tubes. In this model, both l and r are assumed to be random variables distributed according to the beta probability distribution function, with parameters (al, bl) and (ar, br), respectively. Haring and Greenkorn's (1970) expression for K is ... [Pg.114]

In practice, in most cases, a renewal function cannot be expressed analytically. Therefore, it is necessary to determine its value using other methods, e.g. numerical integration, procedures described in the literature, or tables given in the specialized literature (Blischke & Murthy 1994, 1996; Rigdon & Basu 2000). Analytical calculation of the integral is only possible for certain types of distribution of random variables. The majority of random variable distributions require a calculation with the use of numerical methods. [Pg.1936]

Generally, for light sources other than mode-locked or single-mode lasers, the statistical features must be taken into account as well: the mean energies of the photon wave packets are regarded as random variables distributed according to a classical probability function of a characteristic width ΔE; the energy profile of these non-transform-limited, or chaotic, pulses then has a width of the order... [Pg.351]

Cumulative distribution function, cdf, of a random variable A function F defined for all real numbers such that F(x) = P(X ≤ x). F(x) is the cumulative sum of all probabilities assigned to real numbers less than or equal to x. [Pg.350]

Probability distribution function, pdf, of a continuous random variable A function that, when integrated over an interval, gives the probability that the random variable assumes a value in that interval. If f is the pdf of a continuous random variable X, then... [Pg.352]

The exponential distribution is a continuous random variable distribution that is widely used in the industrial sector, particularly in performing reliability studies [11]. The probability density function of the distribution is defined by... [Pg.22]
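
As an illustration (not part of the original text), the exponential density and its survivor function can be sketched in Python; the constant failure rate used here is a hypothetical value chosen for the example:

```python
import math

def exp_pdf(t, lam):
    """Exponential density f(t) = lam * exp(-lam * t) for t >= 0."""
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def exp_reliability(t, lam):
    """Survivor (reliability) function R(t) = exp(-lam * t):
    the probability that no failure occurs before time t."""
    return math.exp(-lam * t) if t >= 0 else 1.0

# Hypothetical constant failure rate of 0.002 failures per hour:
r_500 = exp_reliability(500.0, 0.002)  # exp(-1), about 0.368
```

The memoryless survivor function is what makes this distribution so convenient in reliability work: R(t) depends only on the elapsed time, not on prior history.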

The normal distribution is a widely used continuous random variable distribution, and sometimes it is called the Gaussian distribution after Carl Friedrich Gauss (1777-1855), a German mathematician. The probability density function of the distribution is expressed by... [Pg.24]
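
A minimal sketch of the Gaussian density, added here for illustration (the function name and test values are not from the text):

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian density with mean mu and standard deviation sigma."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# The peak of the density sits at the mean, with height 1/(sigma*sqrt(2*pi)):
peak = normal_pdf(0.0, 0.0, 1.0)  # about 0.3989
```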

Note that X is called a random variable. The function f(X) is a discrete probability distribution function. By definition... [Pg.13]

Fig. 10.19 The probability density of the extreme value distribution typical of the MSP scores for random sequences. The probability that a random variable with this distribution has a score of at least x is given by 1 − exp[−e^(−λ(x−u))], where u is the characteristic value and λ is the decay constant. The figure shows the probability density function (which corresponds to the distribution function's first derivative) for u = 0 and λ = 1.
Example 3 illustrated the use of the normal distribution as a model for time-to-failure. The normal distribution has an increasing hazard function, which means that the product is experiencing wearout. In applying the normal to a specific situation, the fact must be considered that this model allows values of the random variable that are less than zero, whereas obviously a life less than zero is not possible. This problem does not arise from a practical standpoint as long as μ/σ > 4.0. [Pg.10]
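
The claim that negative lifetimes are negligible when μ/σ > 4.0 can be checked numerically; the sketch below (an illustration, not from the text) evaluates the normal cdf at zero via the error function:

```python
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma), computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Probability of a (physically impossible) negative lifetime when mu/sigma = 4:
p_negative = normal_cdf(0.0, 4.0, 1.0)  # about 3e-5, negligible in practice
```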

The distribution function f(x) can be taken as constant, for example 1/H0. We choose variables X1, X2, ..., XN randomly from f(x) and form the arithmetic mean... [Pg.479]

Another consideration when using the approach is the assumption that stress and strength are statistically independent; in practical applications this is usually the case (Disney et al., 1968). The random variables in the design are assumed to be independent, linear and near-Normal to be used effectively in the variance equation. A high correlation between the random variables, or the use of non-Normal distributions in the stress governing function, are common sources of non-linearity, and transformation methods should then be considered. [Pg.191]

Monte Carlo simulation is a numerical experimentation technique to obtain the statistics of the output variables of a function, given the statistics of the input variables. In each experiment or trial, the values of the input random variables are sampled based on their distributions, and the output variables are calculated using the computational model. The generation of a set of random numbers is central to the technique, which can then be used to generate a random variable from a given distribution. The simulation can only be performed using computers due to the large number of trials required. [Pg.368]
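
The procedure described above can be sketched in a few lines of Python. The model and input distributions below are hypothetical examples (a stress = load/area calculation with Gaussian inputs), chosen only to show the sample-evaluate-summarize loop:

```python
import random
import statistics

def monte_carlo(model, samplers, trials=20_000):
    """In each trial, sample every input from its distribution, evaluate the
    computational model, and return the mean and std dev of the output."""
    outputs = [model(*(s() for s in samplers)) for _ in range(trials)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical model: stress = load / area, with normally distributed inputs.
random.seed(0)  # fix the random number stream for reproducibility
mu_out, sd_out = monte_carlo(
    lambda load, area: load / area,
    [lambda: random.gauss(1000.0, 50.0),   # load
     lambda: random.gauss(10.0, 0.2)],     # cross-sectional area
)
```

The generation of random numbers is indeed central: each `sampler` turns uniform pseudo-random numbers into draws from the required input distribution.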

In the introduction to this section, two differences between "classical" and Bayes statistics were mentioned. One of these was the Bayes treatment of failure rate and demand probability as random variables. This subsection provides a simple illustration of a Bayes treatment for calculating the confidence interval for demand probability. The direct approach taken here uses the binomial distribution (equation 2.4-7) for the probability density function (pdf). If p is the probability of failure on demand, then the confidence that p is less than a given value p1 is given by equation 2.6-30. [Pg.55]
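
Equation 2.6-30 is not reproduced here, but the cumulative binomial probability that underlies such demand-probability bounds can be sketched as follows; the failure counts and trial value of p are hypothetical:

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p): the probability of seeing
    at most k failures in n demands when the per-demand probability is p."""
    return sum(math.comb(n, i) * p**i * (1.0 - p)**(n - i)
               for i in range(k + 1))

# Hypothetical data: 2 failures observed in 100 demands.
# Cumulative probability of that few failures if p were as large as 0.08:
tail = binom_cdf(2, 100, 0.08)
```

A small value of `tail` indicates that p = 0.08 is implausibly high given the data, which is the logic behind the one-sided bound.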

Having available the table (or function in a calculator or spreadsheet program) for the N(0, 1) distribution, the probability that any N(μ, σ)-distributed random variable takes a value between the limits [a, b] can be evaluated. Table 12.2 gives values for the N(0, 1) distribution. As an example, the following probabilities can be evaluated from the table ... [Pg.1127]
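
The table lookup can be reproduced programmatically by standardizing to N(0, 1); this sketch (an illustration, not from the text) uses the error function in place of the printed table:

```python
import math

def std_normal_cdf(z):
    """Phi(z): cumulative distribution of the standard normal N(0, 1)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_between(a, b, mu, sigma):
    """P(a < X < b) for X ~ N(mu, sigma), by standardizing to N(0, 1)."""
    return std_normal_cdf((b - mu) / sigma) - std_normal_cdf((a - mu) / sigma)

# The familiar two-sided 95% interval of the standard normal:
p = prob_between(-1.96, 1.96, 0.0, 1.0)  # about 0.95
```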

The probability distribution of a random variable concerns the distribution of probability over the range of the random variable. The distribution of probability is specified by the pdf (probability distribution function). This section is devoted to general properties of the pdf in the case of discrete and continuous random variables. Special pdfs finding extensive application in hazard and risk analysis are considered in Chapter 20. [Pg.552]

Property 1 indicates that the pdf of a discrete random variable generates probability by substitution. Properties 2 and 3 restrict the values of f(x) to nonnegative real numbers whose sum is 1. An example of a discrete probability distribution function (approaching a normal distribution, to be discussed in the next chapter) is provided in Figure 19.8.1. [Pg.553]

Another function used to describe the probability distribution of a random variable X is the cumulative distribution function (cdf). If f(x) specifies the pdf of a random variable X, then F(x) is used to specify the cdf. For both discrete and continuous random variables, the cdf of X is defined by ... [Pg.555]
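
For the discrete case, the cdf is simply a running sum of the pdf; a small illustrative sketch (the fair-die example is hypothetical, not from the text):

```python
def discrete_cdf(pmf):
    """Build F(x) = P(X <= x) from a discrete pdf given as {value: probability}."""
    cdf = {}
    total = 0.0
    for x in sorted(pmf):
        total += pmf[x]
        cdf[x] = total
    return cdf

# Fair six-sided die: F(3) = 0.5 and F(6) = 1.0
F = discrete_cdf({i: 1 / 6 for i in range(1, 7)})
```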

The probability distribution of a random variable concerns the distribution of probability over the range of the random variable. The distribution of probability is specified by the pdf (probability distribution function). [Pg.567]

The moments describe the characteristics of a sample or distribution function. The mean, which locates the average value on the measurement axis, is the first moment of values measured about the origin. The mean is denoted by μ for the population and X̄ for the sample, and is given for a continuous random variable by... [Pg.92]
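
The first moment, the integral of x·f(x) over the support, can be approximated numerically; this sketch (an illustration, not from the text) uses the trapezoidal rule on a hypothetical uniform density:

```python
def first_moment(pdf, lo, hi, n=10_000):
    """Approximate the mean, integral of x*f(x) dx over [lo, hi],
    by the trapezoidal rule with n subintervals."""
    h = (hi - lo) / n
    total = 0.5 * (lo * pdf(lo) + hi * pdf(hi))
    for i in range(1, n):
        x = lo + i * h
        total += x * pdf(x)
    return total * h

# Uniform density on [0, 2]: f(x) = 0.5, so the mean should be 1.0
mu = first_moment(lambda x: 0.5, 0.0, 2.0)
```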

A bounded continuous random variable with uniform distribution has the probability function... [Pg.94]

The chi-square distribution gives the probability for a continuous random variable bounded on the left tail. The probability function has a shape parameter... [Pg.95]

The Poisson distribution can be used to determine probabilities for discrete random variables where the random variable is the number of times that an event occurs in a single trial (unit of time, space, etc.). The probability function for a Poisson random variable is... [Pg.102]
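
The Poisson probability function is short enough to state directly in code; the event rate below is a hypothetical example:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with mean lam:
    exp(-lam) * lam**k / k!"""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical rate of 2 events per unit time: probability of exactly 0 events
p0 = poisson_pmf(0, 2.0)  # exp(-2), about 0.135
```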

We begin our discussion of random processes with a study of the simplest kind of distribution function. The first-order distribution function Fx of the time function X(t) is the real-valued function of a real variable defined by... [Pg.102]

Random Variables.—An interesting and useful interpretation of the theorem of averages is to regard it as a means for calculating the distribution functions of certain time functions Y(t) that are related to a time function X(t) whose distribution function is known. More precisely, if Y(t) is of the form Y(t) = φ[X(t)], then the theorem of averages enables us to calculate the distribution function of Y(t)... [Pg.114]

The results derived at the beginning of this section can now be expressed by the statement: every random variable has a distribution function F, which is given by the equation... [Pg.118]

Note carefully that the same random variable (function) may have many different distribution functions depending on the distribution function of the underlying function X(t). We will avoid confusion on this point by adopting the convention that, in any one problem, and unless an explicit statement to the contrary is made, all random variables are to be used in conjunction with a time function X(t) whose distribution function is to be the same in all expressions in which it appears. With this convention, the notation F is just as unambiguous as the more cumbersome notation so that we are free to make use of whichever seems more appropriate in a given situation. [Pg.118]

The notion of the distribution function of a random variable is also useful in connection with problems where it is not possible or convenient to subject the underlying function X(t) to direct measurements, but where certain derived time functions of the form Y(t) = φ[X(t)] are available for observation. The theorem of averages then tells us what averages of X(t) it is possible to calculate when all that is known is the distribution function of Y. The answer is quite simple: if f denotes (almost) any real-valued function of a real variable, then all averages of X of the form... [Pg.118]

There is one further point that is worth mentioning in connection with the random variable concept. We have repeatedly stressed the fact that the theory of random processes is primarily concerned with averages of time functions and not with their detailed structure. The same comment applies to random variables. The distribution function of a random variable (or perhaps some other less complete information about averages) is the quantity of interest not its functional form. The functional form of the random variable is only of interest insofar as it enables us to derive its distribution function from the known distribution function of the underlying time function X(t). It is the relationship between averages of various time functions that is of interest and not the detailed relationship between the time functions themselves. [Pg.119]

The characteristic function of the random variable φ (or, equivalently, of the distribution function Fφ) is defined to be the expectation of the random variable e^(ivφ), where v is a real number. In symbols... [Pg.126]

This result checks with our earlier calculation of the moments of the gaussian distribution, Eq. (3-66). The characteristic function of a gaussian random variable having an arbitrary mean and variance can be calculated either directly or else by means of the method outlined in the next paragraph. [Pg.128]

We conclude this section by introducing some notation and terminology that are quite useful in discussions involving joint distribution functions. The distribution function Fφ of a random variable φ associated with the time increments τ1, ..., τn, τ′1, ..., τ′m is defined to be the first-order distribution function of the derived time function Z(t) = φ[X(t + τ1), ..., X(t + τn), Y(t + τ′1), ..., Y(t + τ′m)]; thus, Fφ(x) = Fz(x). This definition is a straightforward generalization of our earlier definition of the distribution function of a one-dimensional random variable. More generally, given a family of... [Pg.143]

A few minutes' thought should convince the reader that all our previous results can be couched in the language of families of random variables and their joint distribution functions. Thus, the second-order distribution function FX;τ1,τ2 is the same as the joint distribution function of the random variables φ1 and φ2 defined by... [Pg.144]

In this connection, we shall often abuse our notation somewhat by referring to FX;τ1,τ2 as the joint distribution function of the random variables X(t + τ1) and X(t + τ2) instead of employing the more precise but cumbersome language used at the beginning of this paragraph. In the same vein, the distribution function FX;τ1,...,τn;Y;τ′1,...,τ′m will be referred to loosely as the joint distribution function of the random variables X(t + τ1), ..., X(t + τn), Y(t + τ′1), ..., Y(t + τ′m). [Pg.144]

Once again, it should be emphasized that the functional form of a set of random variables is important only insofar as it enables us to calculate their joint distribution function in terms of other known distribution functions. Once the joint distribution function of a group of random variables is known, no further reference to their functional form is necessary in order to use the theorem of averages for the calculation of any time average of interest in connection with the given random variables. [Pg.144]

The conditional probability distribution function of the random variables φ1, ..., φn, given that the random variables φn+1, ..., φn+m have assumed the values xn+1, ..., xn+m respectively, can be defined, in most cases of interest to us, by means of the following procedure. To simplify the discussion, we shall only present the details of the derivation for the case of two random variables φ1 and φ2. We begin by using the definition, Eq. (3-159), to write... [Pg.151]

Our next result concerns the central limit theorem, which places in evidence the remarkable behavior of the distribution function of the sum when n is a large number. We shall now state and sketch the proof of a version of the central limit theorem that is pertinent to sums of identically distributed [p0i(x) = p01(x), i = 1, 2, ...], statistically independent random variables. To simplify the statement of the theorem, we shall introduce the normalized sum s defined by... [Pg.157]

The central limit theorem thus states the remarkable fact that the distribution function of the normalized sum of identically distributed, statistically independent random variables approaches the gaussian distribution function as the number of summands approaches infinity—... [Pg.157]
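
This convergence is easy to observe numerically. The sketch below (an illustration, not from the text) forms normalized sums of uniform random variables and checks that they have roughly zero mean and unit standard deviation, as the gaussian limit requires:

```python
import random
import statistics

def normalized_sum(n, sampler, mean, std):
    """s = (sum of n iid samples - n*mean) / (std * sqrt(n))."""
    total = sum(sampler() for _ in range(n))
    return (total - n * mean) / (std * n ** 0.5)

# Uniform(0, 1) summands: mean 0.5, standard deviation sqrt(1/12).
random.seed(1)  # fix the stream for reproducibility
samples = [normalized_sum(50, random.random, 0.5, (1 / 12) ** 0.5)
           for _ in range(20_000)]
m, s = statistics.mean(samples), statistics.stdev(samples)  # near 0 and 1
```

A histogram of `samples` would already look close to the standard gaussian bell curve at n = 50 summands.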


© 2024 chempedia.info