Mathematical expectation of a function of a random variable. Expected value

The mathematical expectation of a discrete random variable is the sum of the products of all its possible values and their probabilities.

Let a random variable $X$ take only the values $x_1, x_2, \dots, x_n$, whose probabilities are respectively equal to $p_1, p_2, \dots, p_n$. Then the mathematical expectation of the random variable $X$ is determined by the equality

$$M(X) = x_1p_1 + x_2p_2 + \dots + x_np_n.$$

If a discrete random variable takes a countable set of possible values $x_1, x_2, \dots$, then

$$M(X) = \sum_{i=1}^{\infty} x_ip_i.$$

Moreover, the mathematical expectation exists if the series on the right side of the equality converges absolutely.
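
For instance (a minimal sketch; the three-point distribution below is invented for illustration), the finite case of this formula is a one-line computation:

```python
# M(X) = sum of x_i * p_i over the distribution table.
# The values and probabilities are illustrative, not from the text.
values = [1, 2, 4]
probs = [0.5, 0.3, 0.2]

assert abs(sum(probs) - 1.0) < 1e-12  # a distribution must sum to 1

m = sum(x * p for x, p in zip(values, probs))
print(m)  # 1*0.5 + 2*0.3 + 4*0.2 = 1.9
```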

Comment. From the definition it follows that the mathematical expectation of a discrete random variable is a non-random (constant) quantity.

Definition of mathematical expectation in the general case

Let us define the mathematical expectation of a random variable whose distribution is not necessarily discrete. We begin with the case of non-negative random variables. The idea is to approximate such a random variable by discrete ones, for which the mathematical expectation has already been defined, and to set its mathematical expectation equal to the limit of the mathematical expectations of the approximating discrete random variables. This is a very useful general idea: a characteristic is first defined for simple objects, and then extended to more complex objects by approximating them with simpler ones.

Lemma 1. Let $X$ be an arbitrary non-negative random variable. Then there exists a sequence of discrete random variables $X_n$ such that:

1) $X_n \ge 0$;
2) $X_n \le X_{n+1}$ for all $n$;
3) $X_n \to X$ as $n \to \infty$.

Proof. Divide the semi-axis $[0, \infty)$ into segments of equal length $2^{-n}$ and set

$$X_n = \frac{k}{2^n} \quad \text{on the event} \quad \left\{\frac{k}{2^n} \le X < \frac{k+1}{2^n}\right\}, \qquad k = 0, 1, 2, \dots$$

Then properties 1 and 2 easily follow from this definition, and property 3 follows from the bound $0 \le X - X_n \le 2^{-n}$.
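
The dyadic construction from the proof is easy to try numerically. A minimal sketch (the sample value is arbitrary) showing that $X_n = \lfloor 2^n X \rfloor / 2^n$ is discrete, non-decreasing in $n$, and within $2^{-n}$ of $X$:

```python
import math

def dyadic_approx(x: float, n: int) -> float:
    """Lower dyadic approximation X_n = floor(2^n * x) / 2^n for x >= 0."""
    return math.floor(2**n * x) / 2**n

x = math.pi  # one realization of a non-negative random variable
for n in range(1, 7):
    xn = dyadic_approx(x, n)
    print(n, xn, x - xn)  # error 0 <= x - xn <= 2**-n, shrinking monotonically
```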

Lemma 2. Let $X$ be a non-negative random variable, and let $X_n$ and $Y_n$ be two sequences of discrete random variables possessing properties 1–3 from Lemma 1. Then

$$\lim_{n\to\infty} M(X_n) = \lim_{n\to\infty} M(Y_n).$$

Proof. Note that for non-negative random variables we allow the value $M(X) = +\infty$; by property 2 both sequences of expectations are non-decreasing, so both limits exist, finite or infinite.

Fix $m$. By virtue of property 3, it is easy to see that there is a sequence of positive numbers $\varepsilon_n \to 0$ such that

$$P(Y_m > X_n + \varepsilon_n) \to 0 \quad \text{as } n \to \infty.$$

It follows that, up to events of vanishing probability, $Y_m$ does not exceed $X_n + \varepsilon_n$ for large $n$. Using the properties of mathematical expectations for discrete random variables, we obtain

$$M(Y_m) \le \lim_{n\to\infty} M(X_n).$$

Passing to the limit as $m \to \infty$, we obtain $\lim_m M(Y_m) \le \lim_n M(X_n)$; interchanging the roles of the two sequences gives the reverse inequality and hence the statement of Lemma 2.

Definition 1. Let $X$ be a non-negative random variable and $X_n$ a sequence of discrete random variables possessing properties 1–3 from Lemma 1. The mathematical expectation of the random variable $X$ is the number

$$M(X) = \lim_{n\to\infty} M(X_n).$$

Lemma 2 guarantees that $M(X)$ does not depend on the choice of the approximating sequence.

Now let $X$ be an arbitrary random variable. Let us define

$$X_+ = \max(X, 0), \qquad X_- = \max(-X, 0).$$

From the definitions of $X_+$ and $X_-$ it easily follows that $X = X_+ - X_-$ and both variables are non-negative.

Definition 2. The mathematical expectation of an arbitrary random variable $X$ is the number

$$M(X) = M(X_+) - M(X_-),$$

provided at least one of the numbers on the right-hand side of this equality is finite.

Properties of mathematical expectation

Property 1. The mathematical expectation of a constant value is equal to the constant itself:

$$M(C) = C.$$

Proof. We regard a constant $C$ as a discrete random variable that has one possible value $C$ and takes it with probability $p = 1$; therefore, $M(C) = C \cdot 1 = C$.

Remark 1. Let us define the product of a constant $C$ and a discrete random variable $X$ as the discrete random variable $CX$ whose possible values are equal to the products of the constant $C$ by the possible values of $X$; the probabilities of the possible values of $CX$ are equal to the probabilities of the corresponding possible values of $X$. For example, if the probability of the possible value $x_1$ is $p_1$, then the probability that $CX$ takes the value $Cx_1$ is also $p_1$.

Property 2. The constant factor can be taken out of the sign of the mathematical expectation:

$$M(CX) = CM(X).$$

Proof. Let the random variable $X$ be given by the probability distribution law: the values $x_1, x_2, \dots, x_n$ with probabilities $p_1, p_2, \dots, p_n$.

Taking into account Remark 1, the random variable $CX$ takes the values $Cx_1, Cx_2, \dots, Cx_n$ with the same probabilities $p_1, p_2, \dots, p_n$, hence

$$M(CX) = Cx_1p_1 + Cx_2p_2 + \dots + Cx_np_n = C(x_1p_1 + x_2p_2 + \dots + x_np_n) = CM(X).$$

Remark 2. Before moving on to the next property, we point out that two random variables are called independent if the distribution law of one of them does not depend on which possible values the other variable has taken. Otherwise, the random variables are dependent. Several random variables are called mutually independent if the distribution laws of any number of them do not depend on which possible values the remaining variables have taken.

Remark 3. Let us define the product of independent random variables $X$ and $Y$ as the random variable $XY$ whose possible values are equal to the products of each possible value of $X$ by each possible value of $Y$; the probabilities of the possible values of the product are equal to the products of the probabilities of the possible values of the factors. For example, if the probability of the possible value $x_1$ is $p_1$ and the probability of the possible value $y_1$ is $g_1$, then the probability of the possible value $x_1y_1$ is $p_1g_1$.

Property 3. The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

$$M(XY) = M(X)M(Y).$$

Proof. Let the independent random variables $X$ and $Y$ be specified by their probability distribution laws: $X$ takes the values $x_1, x_2$ with probabilities $p_1, p_2$, and $Y$ takes the values $y_1, y_2$ with probabilities $g_1, g_2$.

Let us compile all the values that the random variable $XY$ can take. To do this, we multiply each possible value of $X$ by each possible value of $Y$; as a result we obtain $x_1y_1$, $x_2y_1$, $x_1y_2$, $x_2y_2$, and, taking into account Remark 3, we write the distribution law, assuming for simplicity that all possible values of the product are different (if this is not the case, the proof is carried out similarly): the values $x_1y_1,\ x_2y_1,\ x_1y_2,\ x_2y_2$ have the probabilities $p_1g_1,\ p_2g_1,\ p_1g_2,\ p_2g_2$.

The mathematical expectation is equal to the sum of the products of all possible values and their probabilities:

$$M(XY) = x_1y_1\,p_1g_1 + x_2y_1\,p_2g_1 + x_1y_2\,p_1g_2 + x_2y_2\,p_2g_2 = (x_1p_1 + x_2p_2)(y_1g_1 + y_2g_2) = M(X)M(Y).$$

Corollary. The mathematical expectation of the product of several mutually independent random variables is equal to the product of their mathematical expectations.

Property 4. The mathematical expectation of the sum of two random variables is equal to the sum of the mathematical expectations of the summands:

$$M(X + Y) = M(X) + M(Y).$$

Proof. Let the random variables $X$ and $Y$ be specified by the following distribution laws: $X$ takes the values $x_1, x_2$ with probabilities $p_1, p_2$, and $Y$ takes the values $y_1, y_2$ with probabilities $g_1, g_2$.

Let us compile all possible values of the quantity $X + Y$. To do this, we add each possible value of $X$ to each possible value of $Y$; we obtain $x_1 + y_1$, $x_1 + y_2$, $x_2 + y_1$, $x_2 + y_2$. Let us assume for simplicity that these possible values are different (if this is not the case, the proof is carried out similarly), and let us denote their probabilities, respectively, by $p_{11}$, $p_{12}$, $p_{21}$, $p_{22}$.

The mathematical expectation of the quantity $X + Y$ is equal to the sum of the products of the possible values and their probabilities:

$$M(X+Y) = (x_1 + y_1)p_{11} + (x_1 + y_2)p_{12} + (x_2 + y_1)p_{21} + (x_2 + y_2)p_{22}. \qquad (*)$$

Let us prove that $p_{11} + p_{12} = p_1$. The event that $X$ takes the value $x_1$ (the probability of this event is $p_1$) entails the event that $X + Y$ takes the value $x_1 + y_1$ or $x_1 + y_2$ (by the addition theorem, the probability of this event is $p_{11} + p_{12}$), and vice versa. Hence it follows that $p_{11} + p_{12} = p_1$. The equalities $p_{21} + p_{22} = p_2$, $p_{11} + p_{21} = g_1$ and $p_{12} + p_{22} = g_2$ are proved similarly.

Substituting the right-hand sides of these equalities into relation (*), we obtain

$$M(X + Y) = (x_1p_1 + x_2p_2) + (y_1g_1 + y_2g_2),$$

or finally

$$M(X + Y) = M(X) + M(Y).$$
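
Properties 3 and 4 can be verified by brute force on the joint distribution of two independent discrete random variables. A minimal sketch with invented two-point distributions:

```python
from itertools import product

# Two independent discrete random variables as (value, probability) pairs.
# The distributions are invented for the check.
X = [(0, 0.3), (1, 0.7)]
Y = [(2, 0.4), (5, 0.6)]

def expectation(dist):
    return sum(v * p for v, p in dist)

# Under independence the joint probabilities are the products p_i * g_j.
m_sum = sum((x + y) * px * py for (x, px), (y, py) in product(X, Y))
m_prod = sum((x * y) * px * py for (x, px), (y, py) in product(X, Y))

print(m_sum, expectation(X) + expectation(Y))   # both 4.5
print(m_prod, expectation(X) * expectation(Y))  # both 2.66
```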

Variance and standard deviation

In practice, it is often necessary to estimate the spread of the possible values of a random variable around its mean value. For example, in artillery it is important to know how closely the shells will fall near the target that is to be hit.

At first glance, it may seem that the easiest way to estimate the spread is to calculate all possible deviations of the random variable from its mean and then find their average value. However, this path gives nothing, since the average value of the deviation, i.e. $M[X - M(X)]$, is equal to zero for any random variable. This is explained by the fact that some possible deviations are positive while others are negative; as a result of their mutual cancellation, the average deviation is zero. These considerations suggest replacing the possible deviations by their absolute values or by their squares, and this is what is done in practice. True, when the possible deviations are replaced by their absolute values, one has to operate with absolute values, which sometimes leads to serious difficulties. Therefore, most often a different path is taken: one calculates the average value of the squared deviation, which is called the variance (dispersion).
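
The cancellation of deviations is easy to observe numerically; a minimal sketch with an invented distribution:

```python
values = [-2.0, 1.0, 4.0]
probs = [0.25, 0.5, 0.25]

m = sum(x * p for x, p in zip(values, probs))  # mean, here 1.0

mean_dev = sum((x - m) * p for x, p in zip(values, probs))
mean_sq_dev = sum((x - m) ** 2 * p for x, p in zip(values, probs))

print(mean_dev)     # 0.0: positive and negative deviations cancel
print(mean_sq_dev)  # 4.5: the variance, a genuine measure of spread
```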

Mathematical expectation and variance of a continuous random variable

The variance of a continuous random variable X, whose possible values belong to the entire Ox axis, is determined by the equality

D(X) = ∫_{-∞}^{+∞} (x − M(X))² f(x) dx,

where f(x) is the distribution density and M(X) = ∫_{-∞}^{+∞} x·f(x) dx is the mathematical expectation.


Example. A continuous random variable is specified by the probability density
f(x) = (x/σ²)·e^(−x²/(2σ²)), x ≥ 0
(the Rayleigh distribution law, used in radio engineering). Find M(X), D(X).
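
The text does not carry the solution, so as a hedged sketch the moments can be computed symbolically; it assumes the standard Rayleigh parametrization written above (the density formula itself is a reconstruction from the distribution's name):

```python
import sympy as sp

x = sp.symbols('x', nonnegative=True)
sigma = sp.symbols('sigma', positive=True)

f = x / sigma**2 * sp.exp(-x**2 / (2 * sigma**2))  # Rayleigh density

M = sp.integrate(x * f, (x, 0, sp.oo))       # M(X) = sigma*sqrt(pi/2)
M2 = sp.integrate(x**2 * f, (x, 0, sp.oo))   # M(X^2) = 2*sigma^2
D = sp.simplify(M2 - M**2)                   # D(X) = (2 - pi/2)*sigma^2

print(M, D)
```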

A random variable X is called continuous if its distribution function F(x) = P(X < x) is continuous and has a derivative.
The distribution function of a continuous random variable is used to calculate the probability of the random variable falling into a given interval:
P(α < X < β) = F(β) − F(α).
Moreover, for a continuous random variable it does not matter whether its boundaries are included in this interval or not:
P(α < X < β) = P(α ≤ X < β) = P(α ≤ X ≤ β).
The distribution density of a continuous random variable is the function
f(x) = F'(x),
the derivative of the distribution function.

Properties of the distribution density

1. The distribution density of a random variable is non-negative: f(x) ≥ 0 for all values of x.
2. Normalization condition:

∫_{-∞}^{+∞} f(x) dx = 1.

The geometric meaning of the normalization condition: the area under the distribution density curve is equal to unity.
3. The probability of the random variable X falling into the interval from α to β can be calculated by the formula

P(α < X < β) = ∫_α^β f(x) dx.

Geometrically, the probability of a continuous random variable X falling into the interval (α, β) is equal to the area of the curvilinear trapezoid under the distribution density curve based on this interval.
4. The distribution function is expressed in terms of the density as follows:

F(x) = ∫_{-∞}^x f(t) dt.
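
These properties can be checked numerically for a concrete density; a minimal sketch using the exponential density f(x) = λe^(−λx) (an illustrative choice, not from the text) and a midpoint Riemann sum:

```python
import math

lam = 2.0
f = lambda t: lam * math.exp(-lam * t) if t >= 0 else 0.0  # Exp(2) density

dt = 1e-4
# Normalization: integral of f over [0, 20] (the tail beyond is negligible).
total = sum(f((k + 0.5) * dt) * dt for k in range(200_000))
# Distribution function at x = 3: F(3) = 1 - exp(-6).
F3 = sum(f((k + 0.5) * dt) * dt for k in range(int(3 / dt)))

print(round(total, 4))                           # ≈ 1.0
print(round(F3, 4), round(1 - math.exp(-6), 4))  # both ≈ 0.9975
```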

The value of the distribution density at a point x is not equal to the probability of taking this value; for a continuous random variable we can speak only of the probability of falling into a given interval.

Statement 3. For any random variable X and any number a,

M[(X − a)²] = M[(X − M(X))²] + (a − M(X))².

To prove this, let us first consider a random variable that is constant, i.e. a function that maps the space of elementary events to a single point a. Since a constant factor can be taken out of the sign of the sum, M(a) = Σ_j a·P(ω_j) = a·Σ_j P(ω_j) = a·1 = a, i.e. the mathematical expectation of a constant is the constant itself.

If each term of a sum splits into two summands, then the whole sum splits into two sums, the first made up of the first summands and the second of the second. Therefore the mathematical expectation of the sum of two random variables X+Y, defined on the same space of elementary events, is equal to the sum of the mathematical expectations M(X) and M(Y) of these random variables:

M(X+Y) = M(X) + M(Y).

And therefore M(X-M(X)) = M(X) - M(M(X)). As shown above, M(M(X)) = M(X). Hence, M(X-M(X)) = M(X) - M(X) = 0.

Since (X − a)² = ((X − M(X)) + (M(X) − a))² = (X − M(X))² + 2(X − M(X))(M(X) − a) + (M(X) − a)², we have M[(X − a)²] = M[(X − M(X))²] + M[2(X − M(X))(M(X) − a)] + M[(M(X) − a)²]. Let us simplify the last equality. As shown at the beginning of the proof of Statement 3, the mathematical expectation of a constant is the constant itself, and therefore M[(M(X) − a)²] = (M(X) − a)². Since a constant factor can be taken out of the sign of the sum, M[2(X − M(X))(M(X) − a)] = 2(M(X) − a)·M(X − M(X)). The right-hand side of this equality is 0 because, as shown above, M(X − M(X)) = 0. Hence M[(X − a)²] = M[(X − M(X))²] + (a − M(X))², which was to be proved.

From the above it follows that M[(X − a)²] attains its minimum, equal to M[(X − M(X))²], at a = M(X), since the second term in the equality just proved is always non-negative and equals 0 only for this value of a.
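
A quick numerical illustration of this minimization (the distribution is invented): scanning a grid of values of a, the smallest mean squared deviation occurs at a = M(X):

```python
values = [6, 9, 12, 15]
probs = [0.3, 0.3, 0.1, 0.3]

m = sum(x * p for x, p in zip(values, probs))  # M(X) = 10.2

def mean_sq(a):
    return sum((x - a) ** 2 * p for x, p in zip(values, probs))

grid = [m + 0.1 * k for k in range(-30, 31)]   # grid around the mean
best = min(grid, key=mean_sq)
print(m, best, round(mean_sq(best), 4))  # minimizer coincides with M(X)
```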

Statement 4. Let the random variable X take the values x_1, x_2, …, x_m, and let f be some function of a numerical argument. Then

M[f(X)] = f(x_1)·P(X = x_1) + f(x_2)·P(X = x_2) + … + f(x_m)·P(X = x_m).

To prove this, let us group, on the right-hand side of equality (4) defining the mathematical expectation, the terms with equal values:

Using the fact that a constant factor can be taken out of the sign of the sum, and the definition of the probability of a random event (2), we obtain the required equality.

Q.E.D.
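
The grouping argument of Statement 4 translates directly into code; a minimal sketch (distribution and function invented) comparing the direct formula with the grouped one:

```python
from collections import defaultdict

values = [-1, 0, 1, 2]
probs = [0.1, 0.2, 0.3, 0.4]
f = lambda t: t * t

# Direct form of Statement 4: M[f(X)] = sum of f(x_i) * P(X = x_i).
direct = sum(f(x) * p for x, p in zip(values, probs))

# Grouped form: first build the distribution of Y = f(X), then average.
dist_y = defaultdict(float)
for x, p in zip(values, probs):
    dist_y[f(x)] += p          # terms with equal f-values are merged
grouped = sum(y * p for y, p in dist_y.items())

print(direct, grouped)  # both ≈ 2.0
```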

Statement 5. Let X and Y be random variables defined on the same space of elementary events, and let a and b be some numbers. Then M(aX + bY) = aM(X) + bM(Y).

Using the definition of the mathematical expectation, the additivity just proved and the rule for taking a constant factor out of the sum, we obtain the chain of equalities M(aX + bY) = M(aX) + M(bY) = aM(X) + bM(Y). The required has been proved.

The above shows how the mathematical expectation behaves under a change of reference point and of unit of measurement (the transition Y = aX + b), as well as under functions of random variables. These results are constantly used in technical and economic analysis, in assessing the financial and economic activities of an enterprise, in converting between currencies in foreign-trade calculations, in regulatory and technical documentation, etc. They allow the same calculation formulas to be used for various scale and shift parameters.


The mathematical expectation (average value) of a random variable X defined on a discrete probability space is the number m = M[X] = Σx_i·p_i, provided the series converges absolutely.


Properties of the mathematical expectation of a random variable

  1. The mathematical expectation of a constant value is equal to the constant itself: M[C]=C, where C is a constant;
  2. The constant factor can be taken out of the sign of the mathematical expectation: M[C·X]=C·M[X];
  3. The mathematical expectation of the sum of random variables is equal to the sum of their mathematical expectations: M[X+Y]=M[X]+M[Y];
  4. The mathematical expectation of the product of independent random variables is equal to the product of their mathematical expectations: M[X·Y]=M[X]·M[Y], if X and Y are independent.

Dispersion properties

  1. The variance of a constant value is zero: D(C)=0.
  2. The constant factor can be taken out from under the variance sign by squaring it: D(k·X)=k²·D(X).
  3. If the random variables X and Y are independent, then the variance of the sum is equal to the sum of the variances: D(X+Y)=D(X)+D(Y).
  4. If the random variables X and Y are dependent: D(X+Y)=D(X)+D(Y)+2·M[(X−M[X])·(Y−M[Y])].
  5. The following computational formula holds for the variance:
    D(X)=M(X²)−(M(X))².

Example. The mathematical expectations and variances of two independent random variables X and Y are known: M(X)=8, M(Y)=7, D(X)=9, D(Y)=6. Find the mathematical expectation and variance of the random variable Z=9X−8Y+7.
Solution. Based on the properties of the mathematical expectation: M(Z) = M(9X−8Y+7) = 9·M(X) − 8·M(Y) + M(7) = 9·8 − 8·7 + 7 = 23.
Based on the properties of the variance (the variances of independent terms add, and constant factors come out squared): D(Z) = D(9X−8Y+7) = D(9X) + D(8Y) + D(7) = 9²·D(X) + 8²·D(Y) + 0 = 81·9 + 64·6 = 1113.
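
Since the variance line above corrects a sign slip (variances of independent terms always add), a Monte Carlo sanity check is reassuring; the normal shape below is an arbitrary assumption, only the stated moments matter:

```python
import random

random.seed(0)
N = 200_000

# Independent X and Y with the stated moments; normality is assumed
# purely for sampling convenience.
xs = [random.gauss(8, 9 ** 0.5) for _ in range(N)]
ys = [random.gauss(7, 6 ** 0.5) for _ in range(N)]
zs = [9 * x - 8 * y + 7 for x, y in zip(xs, ys)]

m = sum(zs) / N
d = sum((z - m) ** 2 for z in zs) / N

print(round(m, 1), round(d))  # ≈ 23 and ≈ 1113
```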

Algorithm for calculating mathematical expectation

Properties of discrete random variables: all their values can be renumbered by natural numbers, and each value is assigned a non-zero probability.
  1. Multiply the pairs one by one: x_i by p_i.
  2. Add up the products x_i·p_i over all pairs.
    For example, for n = 4: m = Σx_i·p_i = x_1·p_1 + x_2·p_2 + x_3·p_3 + x_4·p_4.
The distribution function of a discrete random variable is stepwise; it jumps at those points whose probabilities are positive.

Example No. 1.

x_i: 1    3    4    7    9
p_i: 0.1  0.2  0.1  0.3  0.3

We find the mathematical expectation using the formula m = Σx_i·p_i.
Expectation M[X].
M[X] = 1·0.1 + 3·0.2 + 4·0.1 + 7·0.3 + 9·0.3 = 5.9
We find the variance using the formula d = Σx_i²·p_i − (M[X])².
Variance D[X].
D[X] = 1²·0.1 + 3²·0.2 + 4²·0.1 + 7²·0.3 + 9²·0.3 − 5.9² = 7.69
Standard deviation σ(X).
σ = sqrt(D[X]) = sqrt(7.69) ≈ 2.77
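
The same arithmetic as a reusable sketch in code:

```python
xs = [1, 3, 4, 7, 9]
ps = [0.1, 0.2, 0.1, 0.3, 0.3]

m = sum(x * p for x, p in zip(xs, ps))                # 5.9
d = sum(x ** 2 * p for x, p in zip(xs, ps)) - m ** 2  # 7.69
sigma = d ** 0.5                                      # ≈ 2.77

print(m, round(d, 2), round(sigma, 2))
```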

Example No. 2. A discrete random variable has the following distribution series:

X: −10   −5    0     5     10
p: a     0.32  2a    0.41  0.03
Find the value of a, the mathematical expectation and the standard deviation of this random variable.

Solution. The value of a is found from the relation Σp_i = 1:
Σp_i = a + 0.32 + 2a + 0.41 + 0.03 = 0.76 + 3a = 1,
whence 3a = 0.24 and a = 0.08. Then the mathematical expectation is m = Σx_i·p_i = −10·0.08 − 5·0.32 + 0·0.16 + 5·0.41 + 10·0.03 = −0.05, and since Σx_i²·p_i = 100·0.08 + 25·0.32 + 0 + 25·0.41 + 100·0.03 = 29.25, the standard deviation is σ = sqrt(29.25 − (−0.05)²) ≈ 5.41.

Example No. 3. Determine the distribution law of a discrete random variable if its variance is known and
x_1 = 6; x_2 = 9; x_3 = x; x_4 = 15
p_1 = 0.3; p_2 = 0.3; p_3 = 0.1; p_4 = 0.3
D(X) = 12.96

Solution.
We use the formula for the variance D(X):
D(X) = x_1²·p_1 + x_2²·p_2 + x_3²·p_3 + x_4²·p_4 − m(X)²,
where the expectation is m(X) = x_1·p_1 + x_2·p_2 + x_3·p_3 + x_4·p_4.
For our data:
m(X) = 6·0.3 + 9·0.3 + x_3·0.1 + 15·0.3 = 9 + 0.1·x_3,
12.96 = 6²·0.3 + 9²·0.3 + x_3²·0.1 + 15²·0.3 − (9 + 0.1·x_3)²,
or −(9/100)·(x_3² − 20·x_3 + 96) = 0.
We therefore need the roots of this quadratic equation, and there are two of them:
x_3 = 8, x_3 = 12.
We choose the one that satisfies the ordering x_2 < x_3 < x_4: x_3 = 12.

Distribution law of a discrete random variable
x_1 = 6; x_2 = 9; x_3 = 12; x_4 = 15
p_1 = 0.3; p_2 = 0.3; p_3 = 0.1; p_4 = 0.3
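
The same search for x_3 as a sketch with sympy (exact rational arithmetic avoids rounding noise):

```python
import sympy as sp

x3 = sp.symbols('x3', real=True)

xs = [6, 9, x3, 15]
ps = [sp.Rational(3, 10), sp.Rational(3, 10),
      sp.Rational(1, 10), sp.Rational(3, 10)]

m = sum(x * p for x, p in zip(xs, ps))
d = sum(x**2 * p for x, p in zip(xs, ps)) - m**2

roots = sp.solve(sp.Eq(d, sp.Rational(1296, 100)), x3)
print(roots)  # [8, 12]; the ordering x2 < x3 < x4 selects x3 = 12
```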

A random variable is a variable that, as a result of each trial, takes one previously unknown value depending on random causes. Random variables are denoted by capital Latin letters: $X,\ Y,\ Z,\ \dots$ By their type, random variables can be discrete and continuous.

A discrete random variable is a random variable whose set of values is at most countable, that is, either finite or countable. Countability means that the values of the random variable can be numbered.

Example 1. Here are examples of discrete random variables:

a) the number of hits on the target with $n$ shots; the possible values are $0,\ 1,\ \dots ,\ n$;

b) the number of heads obtained when tossing a coin $n$ times; the possible values are $0,\ 1,\ \dots ,\ n$;

c) the number of ships arriving in port (a countable set of values);

d) the number of calls arriving at a telephone exchange (a countable set of values).

1. Law of probability distribution of a discrete random variable.

A discrete random variable $X$ can take values $x_1,\dots ,\ x_n$ with probabilities $p\left(x_1\right),\ \dots ,\ p\left(x_n\right)$. The correspondence between these values and their probabilities is called the distribution law of the discrete random variable. As a rule, this correspondence is specified by a table whose first line lists the values $x_1,\dots ,\ x_n$, and whose second line contains the probabilities $p_1,\dots ,\ p_n$ corresponding to these values.

$\begin{array}{|c|c|c|c|c|}
\hline
X_i & x_1 & x_2 & \dots & x_n \\
\hline
p_i & p_1 & p_2 & \dots & p_n \\
\hline
\end{array}$

Example 2. Let the random variable $X$ be the number of points rolled when a die is tossed. Such a random variable $X$ can take the following values: $1,\ 2,\ 3,\ 4,\ 5,\ 6$. The probabilities of all these values are equal to $1/6$. Then the probability distribution law of the random variable $X$ is:

$\begin{array}{|c|c|c|c|c|c|}
\hline
1 & 2 & 3 & 4 & 5 & 6 \\
\hline
1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 \\
\hline
\end{array}$

Comment. Since in the distribution law of the discrete random variable $X$ the events $1,\ 2,\ \dots ,\ 6$ form a complete group of events, the sum of the probabilities must be equal to one, that is, $\sum{p_i}=1$.

2. Mathematical expectation of a discrete random variable.

The mathematical expectation of a random variable specifies its “central” value. For a discrete random variable, the mathematical expectation is calculated as the sum of the products of the values $x_1,\dots ,\ x_n$ and the probabilities $p_1,\dots ,\ p_n$ corresponding to these values, that is: $M\left(X\right)=\sum^n_{i=1}{p_ix_i}$. In the English-language literature the notation $E\left(X\right)$ is also used.

Properties of the mathematical expectation $M\left(X\right)$:

  1. $M\left(X\right)$ lies between the smallest and largest values of the random variable $X$.
  2. The mathematical expectation of a constant is equal to the constant itself, i.e. $M\left(C\right)=C$.
  3. The constant factor can be taken out of the sign of the mathematical expectation: $M\left(CX\right)=CM\left(X\right)$.
  4. The mathematical expectation of the sum of random variables is equal to the sum of their mathematical expectations: $M\left(X+Y\right)=M\left(X\right)+M\left(Y\right)$.
  5. The mathematical expectation of the product of independent random variables is equal to the product of their mathematical expectations: $M\left(XY\right)=M\left(X\right)M\left(Y\right)$.

Example 3. Let us find the mathematical expectation of the random variable $X$ from Example 2.

$$M\left(X\right)=\sum^n_{i=1}{p_ix_i}=1\cdot \frac{1}{6}+2\cdot \frac{1}{6}+3\cdot \frac{1}{6}+4\cdot \frac{1}{6}+5\cdot \frac{1}{6}+6\cdot \frac{1}{6}=3.5.$$

We can notice that $M\left(X\right)$ lies between the smallest ($1$) and largest ($6$) values of the random variable $X$.

Example 4. It is known that the mathematical expectation of the random variable $X$ is equal to $M\left(X\right)=2$. Find the mathematical expectation of the random variable $3X+5$.

Using the above properties, we get $M\left(3X+5\right)=M\left(3X\right)+M\left(5\right)=3M\left(X\right)+5=3\cdot 2+5=11$.

Example 5. It is known that the mathematical expectation of the random variable $X$ is equal to $M\left(X\right)=4$. Find the mathematical expectation of the random variable $2X-9$.

Using the above properties, we get $M\left(2X-9\right)=M\left(2X\right)-M\left(9\right)=2M\left(X\right)-9=2\cdot 4-9=-1$.

3. Dispersion of a discrete random variable.

Possible values of random variables with equal mathematical expectations can be spread differently around their average values. For example, in two student groups the average score on the probability theory exam turned out to be 4, but in one group everyone turned out to be a good student, while in the other there were only C students and excellent students. Therefore, there is a need for a numerical characteristic of a random variable that would show the spread of its values around its mathematical expectation. This characteristic is the variance.

The variance of a discrete random variable $X$ is equal to:

$$D\left(X\right)=\sum^n_{i=1}{p_i\left(x_i-M\left(X\right)\right)^2}.$$

In the English-language literature the notations $V\left(X\right)$ and $Var\left(X\right)$ are used. Very often the variance $D\left(X\right)$ is calculated using the formula $D\left(X\right)=\sum^n_{i=1}{p_ix^2_i}-\left(M\left(X\right)\right)^2$.

Properties of the variance $D\left(X\right)$:

  1. The variance is always greater than or equal to zero, i.e. $D\left(X\right)\ge 0$.
  2. The variance of the constant is zero, i.e. $D\left(C\right)=0$.
  3. The constant factor can be taken out of the sign of the dispersion provided that it is squared, i.e. $D\left(CX\right)=C^2D\left(X\right)$.
  4. The variance of the sum of independent random variables is equal to the sum of their variances, i.e. $D\left(X+Y\right)=D\left(X\right)+D\left(Y\right)$.
  5. The variance of the difference between independent random variables is equal to the sum of their variances, i.e. $D\left(X-Y\right)=D\left(X\right)+D\left(Y\right)$.

Example 6. Let us calculate the variance of the random variable $X$ from Example 2.

$$D\left(X\right)=\sum^n_{i=1}{p_i\left(x_i-M\left(X\right)\right)^2}=\frac{1}{6}\cdot \left(1-3.5\right)^2+\frac{1}{6}\cdot \left(2-3.5\right)^2+\dots +\frac{1}{6}\cdot \left(6-3.5\right)^2=\frac{35}{12}\approx 2.92.$$
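
The same variance with exact fractions (a sketch using only the standard library):

```python
from fractions import Fraction

p = Fraction(1, 6)
values = range(1, 7)

m = sum(Fraction(v) * p for v in values)             # 7/2
d = sum((Fraction(v) - m) ** 2 * p for v in values)  # 35/12

print(m, d, float(d))  # 7/2 35/12 2.9166...
```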

Example 7. It is known that the variance of the random variable $X$ is equal to $D\left(X\right)=2$. Find the variance of the random variable $4X+1$.

Using the above properties, we find $D\left(4X+1\right)=D\left(4X\right)+D\left(1\right)=4^2D\left(X\right)+0=16D\left(X\right)=16\cdot 2=32$.

Example 8. It is known that the variance of the random variable $X$ is equal to $D\left(X\right)=3$. Find the variance of the random variable $3-2X$.

Using the above properties, we find $D\left(3-2X\right)=D\left(3\right)+D\left(2X\right)=0+2^2D\left(X\right)=4D\left(X\right)=4\cdot 3=12$.

4. Distribution function of a discrete random variable.

The method of representing a discrete random variable in the form of a distribution series is not the only one, and most importantly, it is not universal, since a continuous random variable cannot be specified using a distribution series. There is another way to represent a random variable - the distribution function.

The distribution function of a random variable $X$ is the function $F\left(x\right)$ that determines the probability that the random variable $X$ takes a value less than some fixed value $x$, that is, $F\left(x\right)=P\left(X < x\right)$.

Properties of the distribution function:

  1. $0\le F\left(x\right)\le 1$.
  2. The probability that the random variable $X$ will take values ​​from the interval $\left(\alpha ;\ \beta \right)$ is equal to the difference between the values ​​of the distribution function at the ends of this interval: $P\left(\alpha< X < \beta \right)=F\left(\beta \right)-F\left(\alpha \right)$
  3. $F\left(x\right)$ - non-decreasing.
  4. $\lim_{x\to -\infty } F\left(x\right)=0,\ \lim_{x\to +\infty } F\left(x\right)=1$.

Example 9. Let us find the distribution function $F\left(x\right)$ for the distribution law of the discrete random variable $X$ from Example 2.

$\begin{array}{|c|c|c|c|c|c|}
\hline
1 & 2 & 3 & 4 & 5 & 6 \\
\hline
1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 \\
\hline
\end{array}$

If $x\le 1$, then obviously $F\left(x\right)=0$ (in particular, for $x=1$, $F\left(1\right)=P\left(X < 1\right)=0$).

If $1 < x\le 2$, then $F\left(x\right)=P\left(X=1\right)=1/6$.

If $2 < x\le 3$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)=1/6+1/6=1/3$.

If $3 < x\le 4$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)=1/6+1/6+1/6=1/2$.

If $4 < x\le 5$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)=1/6+1/6+1/6+1/6=2/3$.

If $5 < x\le 6$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)+P\left(X=5\right)=1/6+1/6+1/6+1/6+1/6=5/6$.

If $x > 6$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)+P\left(X=5\right)+P\left(X=6\right)=1/6+1/6+1/6+1/6+1/6+1/6=1$.

So $F(x)=\left\{\begin{matrix}
0, & \text{for } x\le 1,\\
1/6, & \text{for } 1 < x\le 2,\\
1/3, & \text{for } 2 < x\le 3,\\
1/2, & \text{for } 3 < x\le 4,\\
2/3, & \text{for } 4 < x\le 5,\\
5/6, & \text{for } 5 < x\le 6,\\
1, & \text{for } x > 6.
\end{matrix}\right.$
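
The same stepwise function in code; a minimal sketch of $F(x) = P(X < x)$ for the fair die:

```python
def F(x: float, values=(1, 2, 3, 4, 5, 6), p=1 / 6) -> float:
    """Distribution function F(x) = P(X < x) of a fair die."""
    return sum(p for v in values if v < x)  # strict inequality, as defined

for t in (0.5, 1, 1.5, 2.5, 6, 6.5):
    print(t, round(F(t), 4))
# 0.5 -> 0.0, 1 -> 0.0, 1.5 -> 0.1667, 2.5 -> 0.3333, 6 -> 0.8333, 6.5 -> 1.0
```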