Dependent and Independent Random Events. Total Probability Formula

The dependence of events is understood here in the probabilistic sense, not the functional one. This means that from the occurrence of one of two dependent events one cannot unambiguously judge whether the other has occurred. Probabilistic dependence means that the occurrence of one of the dependent events only changes the probability of the occurrence of the other. If the probability does not change, the events are considered independent.

Definition: Let (Ω, F, P) be an arbitrary probability space and let A, B be random events. The event A is said to be independent of the event B if its conditional probability coincides with the unconditional probability:

P(A|B) = P(A).

If P(A|B) ≠ P(A), then the event A is said to depend on the event B.

The concept of independence is symmetric: if the event A does not depend on the event B, then the event B does not depend on the event A. Indeed, let P(A|B) = P(A). Then P(B|A) = P(AB)/P(A) = P(A|B)P(B)/P(A) = P(B). Therefore one simply says that the events A and B are independent.

The following symmetric definition of the independence of events follows from the rule of multiplication of probabilities.

Definition: Events A and B defined on the same probability space are called independent if

P(AB) = P(A)P(B).

If P(AB) ≠ P(A)P(B), then the events A and B are called dependent.

Note that this definition is also valid when P(A) = 0 or P(B) = 0.

Properties of independent events.

1. If the events A and B are independent, then the following pairs of events are also independent: A and B̄, Ā and B, Ā and B̄.

▲ Let us prove, for example, the independence of the events A and B̄. Represent the event A as A = AB + AB̄. Since the events AB and AB̄ are incompatible, P(A) = P(AB) + P(AB̄), and due to the independence of the events A and B we get P(AB̄) = P(A) − P(A)P(B) = P(A)(1 − P(B)) = P(A)P(B̄). Hence P(AB̄) = P(A)P(B̄), which means the independence of A and B̄. ■

2. If the event A does not depend on the events B1 and B2, which are incompatible (B1B2 = ∅), then the event A does not depend on their sum B1 + B2.

▲ Indeed, using the axiom of additivity of probability and the independence of the event A from the events B1 and B2, we have:

P(A(B1 + B2)) = P(AB1 + AB2) = P(AB1) + P(AB2) = P(A)P(B1) + P(A)P(B2) = P(A)P(B1 + B2). ■

Relationship between the concepts of independence and incompatibility.

Let A and B be any events of non-zero probability: P(A) > 0 and P(B) > 0, so that P(A)P(B) > 0. If the events A and B are incompatible (AB = ∅), then P(AB) = 0, and therefore the equality P(AB) = P(A)P(B) can never hold. Thus, incompatible events are dependent.

When more than two events are considered simultaneously, their pairwise independence does not sufficiently characterize the connections between the events of the whole group. In this case the concept of independence in the aggregate is introduced.

Definition: Events A1, A2, …, An defined on the same probability space are called collectively independent (independent in the aggregate) if for any m, 2 ≤ m ≤ n, and any combination of indices 1 ≤ i1 < i2 < … < im ≤ n the following equality holds:

P(Ai1 Ai2 … Aim) = P(Ai1)P(Ai2)…P(Aim).

For m = 2, independence in the aggregate implies pairwise independence of the events. The converse is not true.


Example (S. N. Bernstein).

A random experiment consists in tossing a regular tetrahedron. The face on which it lands is observed. The faces of the tetrahedron are colored as follows: the 1st face is white, the 2nd face is black, the 3rd face is red, and the 4th face contains all three colors.

Consider the events:

A = {white appears}; B = {black appears};

C = {red appears}.

Then P(A) = P(B) = P(C) = 2/4 = 1/2, and P(AB) = P(AC) = P(BC) = 1/4, so P(AB) = P(A)P(B), P(AC) = P(A)P(C), P(BC) = P(B)P(C).

Therefore, the events A, B, and C are pairwise independent.

However, P(ABC) = 1/4 ≠ 1/8 = P(A)P(B)P(C).

Therefore, the events A, B, and C are not collectively independent.
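Bernstein's example can be verified by direct enumeration of the four equally likely faces (a small sketch; the face coloring follows the text):

```python
from fractions import Fraction

# The four equally likely faces and the colors each one carries.
faces = [
    {"white"},                     # 1st face
    {"black"},                     # 2nd face
    {"red"},                       # 3rd face
    {"white", "black", "red"},     # 4th face
]

def prob(*colors):
    """Probability that the landed face carries all the given colors."""
    favorable = sum(1 for face in faces if all(c in face for c in colors))
    return Fraction(favorable, len(faces))

p_a, p_b, p_c = prob("white"), prob("black"), prob("red")

pairwise = (prob("white", "black") == p_a * p_b and
            prob("white", "red") == p_a * p_c and
            prob("black", "red") == p_b * p_c)
collective = prob("white", "black", "red") == p_a * p_b * p_c

print(pairwise, collective)  # True False
```

The enumeration confirms P(ABC) = 1/4 while P(A)P(B)P(C) = 1/8.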

In practice, as a rule, the independence of events is not established by checking the definition, but the other way around: events are considered independent on the basis of external considerations or the circumstances of the random experiment, and this independence is used to find the probabilities of products of events.

Theorem (multiplication of probabilities for independent events).

If events A1, A2, …, An defined on the same probability space are independent in the aggregate, then the probability of their product is equal to the product of their probabilities:

P(A1A2…An) = P(A1)P(A2)…P(An).

▲ The proof of the theorem follows from the definition of independence in the aggregate, or from the general probability multiplication theorem, taking into account that in this case all conditional probabilities coincide with the unconditional ones. ■

Example 1 (a typical example on finding conditional probabilities, the concept of independence, and the probability addition theorem).

An electrical circuit consists of three independently operating elements. The failure probabilities of the elements are p1, p2, and p3, respectively.

1) Find the probability of circuit failure.

2) It is known that the circuit has failed.

What is the probability that the failure was caused by:

a) the 1st element; b) the 3rd element?

Solution. Consider the events Ak = {the k-th element failed}, k = 1, 2, 3, and the event A = {the circuit failed}. Then the event A is represented in the form:

A = A1 + A2 + A3.

1) Since the events A1, A2, A3 are not incompatible, the probability additivity axiom (P3) is not applicable, and to find the probability one should use the general probability addition theorem, according to which

P(A) = P(A1) + P(A2) + P(A3) − P(A1A2) − P(A1A3) − P(A2A3) + P(A1A2A3).
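Since the element failure probabilities appear only symbolically here, the sketch below uses hypothetical values p1 = 0.1, p2 = 0.2, p3 = 0.3 purely to illustrate both parts of the example:

```python
# Hypothetical failure probabilities (illustrative only, not from the text).
p1, p2, p3 = 0.1, 0.2, 0.3

# 1) General addition theorem; by independence P(AiAj) = P(Ai)P(Aj), etc.
p_fail = (p1 + p2 + p3
          - p1 * p2 - p1 * p3 - p2 * p3
          + p1 * p2 * p3)

# Cross-check: the circuit does not fail only if no element fails.
p_fail_check = 1 - (1 - p1) * (1 - p2) * (1 - p3)

# 2) Given that the circuit failed: since A1 is a subset of A,
#    P(A1 | A) = P(A1 A) / P(A) = P(A1) / P(A); likewise for A3.
p_elem1_given_fail = p1 / p_fail
p_elem3_given_fail = p3 / p_fail

print(round(p_fail, 3), round(p_elem1_given_fail, 3))  # 0.496 0.202
```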

Let the probability of the event B not depend on the occurrence of the event A.

Definition. The event B is called independent of the event A if the occurrence of the event A does not change the probability of the event B, i.e. if the conditional probability of the event B is equal to its unconditional probability:

P_A(B) = P(B). (2.12)

Substituting (2.12) into relation (2.11), we obtain

P(A)P(B) = P(B)P_B(A).

Hence

P_B(A) = P(A),

i.e. the conditional probability of the event A, assuming that the event B has occurred, is equal to its unconditional probability. In other words, the event A does not depend on the event B.

Lemma (on the mutual independence of events): if the event B does not depend on the event A, then the event A does not depend on the event B; that is, the property of independence of events is mutual.

For independent events the multiplication theorem P(AB) = P(A)P_A(B) takes the form

P(AB) = P(A)P(B), (2.13)

i.e. the probability of the joint occurrence of two independent events is equal to the product of the probabilities of these events.

Equality (2.13) is taken as the definition of independent events: two events are independent if the occurrence of one of them does not change the probability of the occurrence of the other.

Definition. Two events are called independent if the probability of their joint occurrence is equal to the product of the probabilities of these events; otherwise the events are called dependent.

In practice, the independence of events is concluded from the meaning of the problem. For example, the probability of hitting a target with each of two guns does not depend on whether the other gun hit the target, so the events "the first gun hit the target" and "the second gun hit the target" are independent.

Example. Find the probability that the target is hit by both guns if the probability of a hit by the first gun (event A) is 0.8, and by the second (event B) is 0.7.

Solution. The events A and B are independent; therefore, by the multiplication theorem, the desired probability is

P(AB) = P(A)P(B) = 0.8 × 0.7 = 0.56.

Remark 1. If the events A and B are independent, then the events A and B̄, Ā and B, Ā and B̄ are also independent. Indeed,

P(AB̄) = P(A) − P(AB) = P(A) − P(A)P(B) = P(A)(1 − P(B)).

Consequently,

P(AB̄) = P(A)P(B̄),

i.e. the events A and B̄ are independent.

The independence of the events Ā and B, and of Ā and B̄, is a consequence of the proved assertion.

The concept of independence can be extended to the case of n events.

Definition. Several events are called pairwise independent if every two of them are independent. For example, the events A, B, C are pairwise independent if the events A and B, A and C, B and C are independent.

In order to generalize the multiplication theorem to several events, we introduce the concept of independence of events in the aggregate.

Definition. Several events are called collectively independent (or simply independent) if every two of them are independent and each event is independent of all possible products of the others. For example, if the events A1, A2, A3 are independent in the aggregate, then the events A1 and A2, A1 and A3, A2 and A3 are independent, as are A1 and A2A3, A2 and A1A3, A3 and A1A2. From what has been said, it follows that if events are independent in the aggregate, then the conditional probability of the occurrence of any one of them, calculated on the assumption that any of the other events have occurred, is equal to its unconditional probability.



We emphasize that pairwise independence of several events does not yet imply their independence in the aggregate. In this sense, the requirement of independence in the aggregate is stronger than the requirement of pairwise independence.

Let us explain this with an example. Suppose an urn contains 4 balls, colored as follows: one red (A), one blue (B), one black (C), and one in all three of these colors (ABC). What is the probability that a ball drawn from the urn has red on it?

Since two of the four balls have red on them, P(A) = 2/4 = 1/2. Arguing similarly, we find P(B) = 1/2 and P(C) = 1/2. Now suppose the drawn ball has blue on it, i.e. the event B has already occurred. Does the probability that the drawn ball also has red change, i.e. does the probability of the event A change? Of the two balls with blue, one also has red, so the probability of the event A is still 1/2. In other words, the conditional probability of the event A, calculated under the assumption that the event B has occurred, is equal to its unconditional probability. Therefore, the events A and B are independent. Similarly, we conclude that the events A and C, B and C are independent. So the events A, B, and C are pairwise independent.

Are these events independent in the aggregate? It turns out they are not. Indeed, suppose the drawn ball has two colors, say blue and black. What is the probability that this ball also has red? Only one ball is colored in all three colors, so the drawn ball necessarily also has red. Thus, assuming that the events B and C have occurred, we conclude that the event A is certain to occur; hence its probability equals one. In other words, the conditional probability P_BC(A) = 1 of the event A is not equal to its unconditional probability P(A) = 1/2. So the pairwise independent events A, B, C are not collectively independent.

We now present a corollary of the multiplication theorem.

Corollary. The probability of the joint occurrence of several events that are independent in the aggregate is equal to the product of the probabilities of these events:

P(A1A2…An) = P(A1)P(A2)…P(An).

Proof. Consider three events: A, B, and C. The joint occurrence of the events A, B, and C is equivalent to the joint occurrence of the events AB and C; therefore

P(ABC) = P(AB·C).

Since the events A, B, and C are independent in the aggregate, the events AB and C are independent, as are the events A and B. By the multiplication theorem for two independent events, we have:

P(AB·C) = P(AB)P(C) and P(AB) = P(A)P(B).

So, finally, we get

P(ABC) = P(A)P(B)P(C).

For arbitrary n the proof is carried out by the method of mathematical induction.

Remark. If the events A1, A2, …, An are independent in the aggregate, then the opposite events Ā1, Ā2, …, Ān are also independent in the aggregate.

Example. Find the probability that heads appears on both coins when two coins are tossed once.

Solution. The probability of heads on the first coin (event A) is

P(A) = 1/2.

The probability of heads on the second coin (event B) is

P(B) = 1/2.

The events A and B are independent, so by the multiplication theorem the desired probability is

P(AB) = P(A)P(B) = 1/2 × 1/2 = 1/4.

Example. There are 3 boxes, each containing 10 parts. The first box contains 8 standard parts, the second 7, and the third 9. One part is drawn at random from each box. Find the probability that all three parts drawn are standard.

Solution. The probability that a standard part is drawn from the first box (event A) is

P(A) = 8/10 = 0.8.

The probability that a standard part is drawn from the second box (event B) is

P(B) = 7/10 = 0.7.

The probability that a standard part is drawn from the third box (event C) is

P(C) = 9/10 = 0.9.

Since the events A, B, and C are independent in the aggregate, the desired probability (by the multiplication theorem) is

P(ABC) = P(A)P(B)P(C) = 0.8 × 0.7 × 0.9 = 0.504.

Let us give an example of the joint application of the addition and multiplication theorems.

Example. The probabilities of occurrence of each of three independent events A1, A2, A3 are p1, p2, p3, respectively. Find the probability of the occurrence of exactly one of these events.

Solution. Note that, for example, the occurrence of only the first event A1 is equivalent to the occurrence of the event A1Ā2Ā3 (the first event occurred and the second and third did not). Let us introduce the notation:

B1 = only the event A1 occurred, i.e. B1 = A1Ā2Ā3;

B2 = only the event A2 occurred, i.e. B2 = Ā1A2Ā3;

B3 = only the event A3 occurred, i.e. B3 = Ā1Ā2A3.

Thus, to find the probability of the occurrence of exactly one of the events A1, A2, A3, we seek the probability P(B1 + B2 + B3) of the occurrence of one of the events B1, B2, B3, no matter which.

Since the events B1, B2, B3 are incompatible, the addition theorem applies:

P(B1 + B2 + B3) = P(B1) + P(B2) + P(B3). (*)

It remains to find the probabilities of each of the events B1, B2, B3. The events A1, A2, A3 are independent, hence so are the events A1, Ā2, Ā3, so the multiplication theorem applies to them:

P(B1) = P(A1)P(Ā2)P(Ā3) = p1q2q3, where qk = 1 − pk.

Likewise,

P(B2) = q1p2q3 and P(B3) = q1q2p3.

Substituting these probabilities into (*), we find the desired probability of the occurrence of exactly one of the events A1, A2, A3:

P(B1 + B2 + B3) = p1q2q3 + q1p2q3 + q1q2p3.
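The closed-form answer can be cross-checked against a brute-force sum over the eight occurrence patterns (a sketch with arbitrary illustrative probabilities, not values from the text):

```python
from itertools import product

def exactly_one(p1, p2, p3):
    """P(B1 + B2 + B3) by the derived formula."""
    q1, q2, q3 = 1 - p1, 1 - p2, 1 - p3
    return p1 * q2 * q3 + q1 * p2 * q3 + q1 * q2 * p3

def exactly_one_brute(p1, p2, p3):
    """Same probability by summing over all 2^3 occurrence patterns."""
    ps = (p1, p2, p3)
    total = 0.0
    for pattern in product((0, 1), repeat=3):
        if sum(pattern) == 1:          # exactly one event occurred
            weight = 1.0
            for occurred, p in zip(pattern, ps):
                weight *= p if occurred else 1 - p
            total += weight
    return total

# Illustrative values:
print(round(exactly_one(0.2, 0.5, 0.8), 4))  # 0.42
```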

Probability definitions

Classical definition

The classical "definition" of probability proceeds from the notion of equal possibility as an objective property of the phenomena being studied. Equal possibility is an undefinable concept and is established from general considerations of the symmetry of the phenomena under study. For example, when tossing a coin, it is assumed that, due to the supposed symmetry of the coin, the homogeneity of its material, and the randomness (unbiasedness) of the toss, there is no reason to prefer "tails" over "heads" or vice versa, that is, these two outcomes can be considered equally probable (equiprobable).

Along with the concept of equiprobability, the classical definition in the general case also requires the concept of an elementary event (outcome) that favors or does not favor the event A under study. We are speaking of outcomes whose occurrence excludes the possibility of the occurrence of other outcomes; these are incompatible elementary events. For example, when throwing a die, the roll of a specific number excludes the roll of the other numbers.

The classical definition of probability can be formulated as follows:

The probability of a random event A is the ratio of the number n of incompatible, equally probable elementary events that make up the event A to the number N of all possible elementary events:

P(A) = n/N.

For example, suppose two dice are thrown. The total number of equally possible outcomes (elementary events) is obviously 36 (6 possibilities on each die). Let us estimate the probability of getting 7 points. A total of 7 points can be obtained in the following ways: 1+6, 2+5, 3+4, 4+3, 5+2, 6+1. That is, there are 6 equally likely outcomes that favor the event A of getting 7 points. Therefore, the probability equals 6/36 = 1/6. For comparison, the probability of getting 12 points or 2 points is only 1/36, six times less.
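The 6/36 count can be reproduced by enumerating all ordered outcomes of the two dice (a small sketch):

```python
from fractions import Fraction
from itertools import product

# All 36 equally possible ordered outcomes of throwing two dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob_sum(points):
    """Classical probability of a given total: favorable over all outcomes."""
    favorable = sum(1 for a, b in outcomes if a + b == points)
    return Fraction(favorable, len(outcomes))

print(prob_sum(7), prob_sum(2), prob_sum(12))  # 1/6 1/36 1/36
```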

Geometric definition

Although the classical definition is intuitive and derived from practice, it cannot be directly applied when the number of equally possible outcomes is infinite. A vivid example of an infinite number of possible outcomes is a bounded geometric region G, for example on a plane, with area S. A randomly "thrown" "point" can land at any point of this region with equal possibility. The problem is to determine the probability that the point falls into some subregion g with area s. In this case, generalizing the classical definition, we arrive at the geometric definition of the probability of falling into the subregion g:

P = s/S.

In view of the equal possibility, this probability does not depend on the shape of the region g; it depends only on its area. This definition naturally generalizes to a space of any dimension, where the concept of "volume" is used instead of area. Moreover, it is this definition that leads to the modern axiomatic definition of probability. The concept of volume is generalized to the concept of a "measure" of an abstract set, subject to the same requirements that "volume" has in the geometric interpretation: first of all, non-negativity and additivity.
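The geometric definition P = s/S can be illustrated by a Monte Carlo sketch: a point thrown uniformly into the unit square lands in the inscribed quarter-disk with probability π/4 (the choice of region here is ours, purely for illustration):

```python
import math
import random

random.seed(0)  # fixed seed for reproducibility

# Region G: the unit square (area S = 1).
# Subregion g: the quarter-disk x^2 + y^2 <= 1 (area s = pi/4).
n = 200_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)
estimate = hits / n

print(abs(estimate - math.pi / 4) < 0.01)  # True
```

The hit frequency depends only on the area of g, not its shape, exactly as the definition asserts.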

Frequency (statistical) definition

When considering complex problems, the classical definition encounters insurmountable difficulties. In particular, in some cases it may not be possible to identify equally likely cases. Even for a coin, as is well known, there is the clearly not equally probable possibility of landing on its "edge", which cannot be estimated from theoretical considerations (one can only say that it is unlikely, and this consideration is rather a practical one). Therefore, at the dawn of probability theory, an alternative "frequency" definition of probability was proposed. Namely, formally, the probability can be defined as the limit of the relative frequency of occurrence of the event A, assuming the homogeneity of the observations (that is, the sameness of all observation conditions) and their independence from one another:

P(A) = lim(n→∞) m/n,

where n is the number of observations and m is the number of occurrences of the event A.

Although this definition rather indicates a way of estimating an unknown probability, by means of a large number of homogeneous and independent observations, it nevertheless reflects the content of the concept of probability. Namely, if a certain probability is attributed to an event as an objective measure of its possibility, this means that under fixed conditions and multiple repetitions we should obtain a frequency of its occurrence close to that probability (the closer, the more observations). This is, in fact, the original meaning of the concept of probability, based on an objectivist view of natural phenomena. Below we present the so-called laws of large numbers, which provide a theoretical basis (within the framework of the modern axiomatic approach presented below), including for the frequency estimate of probability.
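The frequency definition can be illustrated by simulation: the relative frequency m/n of heads in n tosses of a fair coin stabilizes near 0.5 as n grows (a sketch, not part of the original text):

```python
import random

random.seed(1)  # fixed seed for reproducibility

freqs = {}
for n in (100, 10_000, 1_000_000):
    m = sum(1 for _ in range(n) if random.random() < 0.5)  # count of "heads"
    freqs[n] = m / n
    print(n, freqs[n])  # m/n drifts toward 0.5 as n grows
```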

Axiomatic definition

In the modern mathematical approach, probability is given by Kolmogorov's axiomatics. It is assumed that some space of elementary events X is given. Subsets of this space are interpreted as random events. The union (sum) of subsets (events) is interpreted as the event consisting in the occurrence of at least one of these events. The intersection (product) of subsets (events) is interpreted as the event consisting in the occurrence of all of these events. Disjoint sets are interpreted as incompatible events (their joint occurrence is impossible). Accordingly, the empty set means the impossible event.

Probability (a probability measure) is a measure (numerical function) P defined on the set of events and having the following properties: non-negativity, P(A) ≥ 0; normalization, P(X) = 1; and additivity, P(A + B) = P(A) + P(B) for incompatible events A and B.

If the space of elementary events X is finite, then the stated additivity condition for two arbitrary incompatible events suffices; additivity for any finite number of incompatible events follows from it. However, in the case of an infinite (countable or uncountable) space of elementary events, this condition is not enough. What is required is so-called countable or sigma-additivity, that is, the fulfillment of the additivity property for any at most countable family of pairwise incompatible events. This is necessary to ensure the "continuity" of the probability measure.

The probability measure may not be defined for all subsets of the set X. It is assumed to be defined on some sigma-algebra of subsets of X. These subsets are called measurable with respect to the given probability measure, and they are precisely the random events. The triple consisting of the set X of elementary events, the sigma-algebra of its subsets, and the probability measure is called a probability space.

Continuous random variables. In addition to discrete random variables, whose possible values form a finite or infinite sequence of numbers that does not completely fill any interval, there are often random variables whose possible values fill a certain interval. An example of such a random variable is the deviation of some dimension of a part from its nominal value under a properly established technological process. Random variables of this kind cannot be specified by a probability distribution law p(x). However, they can be specified by the probability distribution function F(x). This function is defined in exactly the same way as in the case of a discrete random variable:

F(x) = P(X < x).

Thus, here too the function F(x) is defined on the whole number axis, and its value at the point x equals the probability that the random variable takes a value less than x. Formula (19) and properties 1° and 2° are valid for the distribution function of any random variable. The proof is carried out similarly to the case of a discrete variable. A random variable X is called continuous if there exists a non-negative piecewise-continuous function f(x)* (the distribution density) satisfying, for any value of x, the equality

F(x) = ∫_{−∞}^{x} f(t) dt.

Based on the geometric meaning of the integral as an area, we can say that the probability of fulfilling the inequalities x1 < X < x2 equals the area of the curvilinear trapezoid with base [x1, x2] bounded above by the curve y = f(x) (Fig. 6).

Since F(x) = ∫_{−∞}^{x} f(t) dt, then based on formula (22)

P(x1 ≤ X < x2) = F(x2) − F(x1) = ∫_{x1}^{x2} f(x) dx.

Note that for a continuous random variable the distribution function F(x) is continuous at every point x where the function f is continuous. This follows from the fact that F(x) is differentiable at these points. Based on formula (23), setting x1 = x, x2 = x + Δx, we have

P(x ≤ X < x + Δx) = F(x + Δx) − F(x).

Due to the continuity of the function F(x), letting Δx → 0 we get that P(x ≤ X < x + Δx) → 0.

Consequently,

P(X = x) = 0.

In this way, the probability that a continuous random variable takes any single value x is zero. It follows from this that the events consisting in the fulfillment of each of the inequalities

x1 ≤ X ≤ x2, x1 < X ≤ x2, x1 ≤ X < x2, x1 < X < x2

have the same probability, i.e.

P(x1 ≤ X ≤ x2) = P(x1 < X ≤ x2) = P(x1 ≤ X < x2) = P(x1 < X < x2).

Indeed, for example,

P(x1 ≤ X ≤ x2) = P(x1 < X ≤ x2) + P(X = x1) = P(x1 < X ≤ x2),

because P(X = x1) = 0.

Remark. As we know, if an event is impossible, then the probability of its occurrence is zero. In the classical definition of probability, when the number of trial outcomes is finite, the converse proposition also holds: if the probability of an event is zero, then the event is impossible, since in this case none of the trial outcomes favors it. In the case of a continuous random variable, the number of its possible values is infinite. The probability that this variable takes any particular value x1 is, as we have seen, equal to zero. However, it does not follow from this that this event is impossible, since as a result of the trial the random variable can, in particular, take the value x1. Therefore, in the case of a continuous random variable, it makes sense to speak of the probability that the random variable falls into an interval, and not of the probability that it takes a particular value. For example, in the manufacture of a roller we are not interested in the probability that its diameter equals the nominal value exactly; what matters to us is the probability that the diameter of the roller does not go out of tolerance.

Example. The distribution density of a continuous random variable is given as follows:

The graph of the function is shown in Fig. 7. Determine the probability that the random variable takes a value satisfying the given inequalities, and find the distribution function of this random variable.
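Since the particular density is not reproduced here, the sketch below uses a simple stand-in density f(x) = 2x on [0, 1] to show how such problems are solved: integrate the density over an interval and compare with the closed-form distribution function F(x) = x² on [0, 1]:

```python
def f(x):
    """Stand-in distribution density: f(x) = 2x on [0, 1], 0 elsewhere."""
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def F(x):
    """Corresponding distribution function: F(x) = x^2 on [0, 1]."""
    if x < 0.0:
        return 0.0
    if x > 1.0:
        return 1.0
    return x * x

def prob_interval(a, b, steps=100_000):
    """P(a < X < b) via midpoint-rule integration of the density."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

p = prob_interval(0.25, 0.5)
print(round(p, 4), F(0.5) - F(0.25))  # 0.1875 0.1875
```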

The next two sections are devoted to distributions of continuous random variables that are often encountered in practice: the uniform and normal distributions.

* A function is called piecewise continuous on the entire numerical axis if on any segment it is either continuous or has a finite number of discontinuity points of the first kind. ** The rule for differentiating an integral with a variable upper limit, derived for the case of a finite lower limit, remains valid for integrals with an infinite lower limit. Indeed,

F′(x) = d/dx [∫_{−∞}^{a} f(t) dt + ∫_{a}^{x} f(t) dt] = f(x),

since the integral

∫_{−∞}^{a} f(t) dt

is a constant value.

Dependent and independent events. Conditional Probability

We distinguish between dependent and independent events. Two events are called independent if the occurrence of one of them does not change the probability of the occurrence of the other. For example, if two automatic lines operate in a workshop and are not interconnected by production conditions, then stoppages of these lines are independent events.

Example 3. A coin is flipped twice. The probability of heads in the first trial (event A) does not depend on the appearance or non-appearance of heads in the second trial (event B). In turn, the probability of heads in the second trial does not depend on the result of the first trial. Thus, the events A and B are independent.

Several events are called collectively independent if any one of them does not depend on any other of these events or on any combination of the others.

Events are called dependent if one of them affects the probability of the occurrence of the other. For example, suppose two production plants are connected by a single technological cycle. Then the probability of failure of one of them depends on the state of the other. The probability of one event, calculated under the assumption that another event has occurred, is called the conditional probability of the event and is denoted P(B/A).

The condition of independence of the event B from the event A is written in the form P(B/A) = P(B), and the condition of its dependence in the form P(B/A) ≠ P(B). Let us consider an example of calculating the conditional probability of an event.

Example 4. A box contains 5 cutters: two worn and three new. Cutters are extracted twice in succession. Determine the conditional probability of the appearance of a worn cutter at the second extraction, given that the cutter removed the first time is not returned to the box.

Solution. Let A1 denote the extraction of a worn cutter the first time and A2 the extraction of a new one. Then P(A1) = 2/5, P(A2) = 3/5. Since the removed cutter is not returned to the box, the ratio between the numbers of worn and new cutters changes. Therefore, the probability of removing a worn cutter the second time depends on which event took place before.

Let B denote the event of extracting a worn cutter the second time. The probabilities of this event are:

P(B/A1) = 1/4, P(B/A2) = 2/4 = 1/2.

Therefore, the probability of the event B depends on whether the event A1 occurred or not.
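The conditional probabilities in Example 4 can be checked by direct counting over the box contents (a small sketch):

```python
from fractions import Fraction

# Box contents: 2 worn ("W") and 3 new ("N") cutters.
box = ["W", "W", "N", "N", "N"]

def p_second_worn_given(first):
    """P(worn at the 2nd draw | `first` was drawn 1st and not returned)."""
    remaining = box.copy()
    remaining.remove(first)
    return Fraction(remaining.count("W"), len(remaining))

print(p_second_worn_given("W"), p_second_worn_given("N"))  # 1/4 1/2
```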

Probability density is one of the ways to specify a probability measure on a Euclidean space. In the case when the probability measure is the distribution of a random variable, one speaks of the density of the random variable.

Probability density. Let P be a probability measure on ℝⁿ, that is, a probability space (ℝⁿ, B(ℝⁿ), P) is defined, where B(ℝⁿ) denotes the Borel σ-algebra on ℝⁿ. Let λ denote the Lebesgue measure on ℝⁿ.

Definition 1. The probability P is called absolutely continuous (with respect to the Lebesgue measure), written P ≪ λ, if every Borel set of zero Lebesgue measure also has probability zero: λ(A) = 0 implies P(A) = 0.

If the probability P is absolutely continuous, then according to the Radon-Nikodym theorem there exists a non-negative Borel function f such that

P(A) = ∫_A f(x) dx,

where the common abbreviation ∫_A f(x) dx = ∫_A f dλ is used, and the integral is understood in the sense of Lebesgue.

Definition 2. More generally, let (X, F) be an arbitrary measurable space, and let μ and ν be two measures on this space. If there is a non-negative f that allows expressing the measure ν in terms of the measure μ in the form

ν(A) = ∫_A f dμ,

then this function is called the density of the measure ν with respect to the measure μ, or the Radon-Nikodym derivative of ν with respect to μ, and is denoted

f = dν/dμ.

If, upon the occurrence of an event B, the probability of an event A does not change, then the events A and B are called independent.

Theorem: The probability of the joint occurrence of two independent events A and B (of the product AB) is equal to the product of the probabilities of these events.

Indeed, since the events A and B are independent, P(B/A) = P(B). In this case the formula for the probability of the product of the events A and B takes the form P(AB) = P(A)P(B).

Events A1, A2, …, An are called pairwise independent if any two of them are independent.

Events A1, A2, …, An are called collectively independent (or simply independent) if every two of them are independent and each event is independent of all possible products of the others.

Theorem: The probability of the product of a finite number of events A1, A2, …, An that are independent in the aggregate is equal to the product of the probabilities of these events:

P(A1A2…An) = P(A1)P(A2)…P(An).

Let us illustrate the difference in the application of the probability formulas for dependent and independent events using examples.

Example 1. The probability that the first shooter hits the target is 0.85, and the second 0.8. Each gun fired one shot. What is the probability that at least one projectile hit the target?

Solution: P(A + B) = P(A) + P(B) − P(AB). Since the shots are independent,

P(A + B) = P(A) + P(B) − P(A)·P(B) = 0.85 + 0.8 − 0.85·0.8 = 0.97.

Example 2. An urn contains 2 red and 4 black balls. Two balls are drawn from it in succession. What is the probability that both balls are red?

Solution: Case 1. Event A is the appearance of a red ball at the first draw, event B at the second. Event C is the appearance of two red balls.

P(C) = P(A)·P(B/A) = (2/6)·(1/5) = 1/15.

Case 2. The first ball drawn is returned to the urn before the second draw.

P(C) = P(A)·P(B) = (2/6)·(2/6) = 1/9.
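Both cases can be confirmed by enumerating the ordered pairs of draws (a sketch):

```python
from fractions import Fraction
from itertools import permutations, product

balls = ["R", "R", "B", "B", "B", "B"]  # 2 red, 4 black

# Case 1: without replacement -- ordered pairs of distinct balls.
pairs = list(permutations(range(len(balls)), 2))
p_no_replacement = Fraction(
    sum(1 for i, j in pairs if balls[i] == "R" and balls[j] == "R"),
    len(pairs))

# Case 2: with replacement -- the first ball goes back before the second draw.
pairs_repl = list(product(range(len(balls)), repeat=2))
p_with_replacement = Fraction(
    sum(1 for i, j in pairs_repl if balls[i] == "R" and balls[j] == "R"),
    len(pairs_repl))

print(p_no_replacement, p_with_replacement)  # 1/15 1/9
```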

Total Probability Formula.

Let the event A occur only together with one of the incompatible events H1, H2, …, Hn forming a complete group. For example, a store receives the same product from three enterprises, in different quantities. The probability of producing low-quality items differs across these enterprises. One item is selected at random. It is required to determine the probability that this item is of poor quality (event A). The events H1, H2, H3 here are the selection of an item produced by the corresponding enterprise.

In this case the event A can be represented as the sum of the products:

A = AH1 + AH2 + … + AHn.

By the addition theorem for the probabilities of incompatible events, we obtain

P(A) = P(AH1) + P(AH2) + … + P(AHn).

Using the probability multiplication theorem, we find

P(A) = P(H1)P(A/H1) + P(H2)P(A/H2) + … + P(Hn)P(A/Hn).

The resulting formula is called the total probability formula.
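A numerical sketch of the formula, with hypothetical supply shares P(Hi) and defect rates P(A/Hi) (no numbers are given in the text; these values are invented for illustration):

```python
from fractions import Fraction

# Hypothetical data: shares of the store's stock from the three enterprises
# and each enterprise's rate of low-quality output.
p_h = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]             # P(Hi)
p_a_given_h = [Fraction(1, 50), Fraction(3, 100), Fraction(1, 20)]  # P(A/Hi)

assert sum(p_h) == 1  # H1, H2, H3 form a complete group

# Total probability formula: P(A) = sum of P(Hi) * P(A/Hi).
p_a = sum(h * a for h, a in zip(p_h, p_a_given_h))
print(p_a)  # 29/1000
```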

Bayes formula

Let the event A occur together with one of the incompatible events H1, H2, …, Hn, whose probabilities P(Hi) (i = 1, 2, …, n) are known before the experiment (a priori probabilities). An experiment is performed, as a result of which the occurrence of the event A is registered, and it is known that this event had certain conditional probabilities P(A/Hi) (i = 1, 2, …, n). It is required to find the probabilities of the events Hi given that the event A has occurred (a posteriori probabilities).

The problem is that, having new information (the event A has occurred), we need to re-estimate the probabilities of the events Hi.

Based on the theorem on the probability of the product of two events, P(AHi) = P(A)P(Hi/A) = P(Hi)P(A/Hi), whence

P(Hi/A) = P(Hi)P(A/Hi) / P(A) = P(Hi)P(A/Hi) / [P(H1)P(A/H1) + … + P(Hn)P(A/Hn)].

The resulting formula is called the Bayes formula.
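A sketch of the Bayes formula on the same kind of hypothetical data (the priors P(Hi) and likelihoods P(A/Hi) are invented for illustration):

```python
from fractions import Fraction

def bayes(priors, likelihoods):
    """Posterior probabilities P(Hi/A) by the Bayes formula."""
    p_a = sum(h * a for h, a in zip(priors, likelihoods))  # total probability
    return [h * a / p_a for h, a in zip(priors, likelihoods)]

posterior = bayes([Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)],
                  [Fraction(1, 50), Fraction(3, 100), Fraction(1, 20)])
print(posterior)  # [Fraction(10, 29), Fraction(9, 29), Fraction(10, 29)]
```

Note how the first enterprise's large prior share (1/2) is revised downward once the low-quality item is observed, because its defect rate is the smallest.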

Basic concepts of combinatorics.

When solving a number of theoretical and practical problems, one has to form various combinations from a finite set of elements according to given rules and count the number of all possible such combinations. Such problems are called combinatorial.

When solving combinatorial problems, the rules of sum and product are used.

General statement of the problem: the probabilities of some events are known, and the probabilities of other events associated with them need to be calculated. Such problems require operations on probabilities: the addition and multiplication of probabilities.

For example, two shots were fired while hunting. Event A is a hit on a duck with the first shot, event B a hit with the second shot. Then the sum of the events A and B is a hit with the first shot, or with the second, or with both.

Problems of another type involve several events together. For example, a coin is tossed three times, and it is required to find the probability that heads comes up all three times, or that heads comes up at least once. These are problems on the multiplication of probabilities.

Addition of probabilities of incompatible events

Probability addition is used when it is necessary to calculate the probability of a combination or a logical sum of random events.

The sum of events A and B is denoted A + B (or A ∪ B). The sum of two events is the event that occurs if and only if at least one of them occurs. That is, A + B is the event that occurs if, during the observation, event A occurs, or event B occurs, or both A and B occur.

If events A and B are mutually incompatible and their probabilities are given, then the probability that one of these events occurs in a single trial is calculated using the addition of probabilities.

The theorem of addition of probabilities. The probability that one of two mutually incompatible events will occur is equal to the sum of the probabilities of these events:

P(A + B) = P(A) + P(B).

For example, two shots were fired while hunting. Event A is a hit on a duck with the first shot, event B a hit with the second shot, and the event A + B is a hit with the first or second shot, or with both. Thus, if events A and B are incompatible, then A + B is the occurrence of at least one of these events.

Example 1 A box contains 30 balls of the same size: 10 red, 5 blue and 15 white. Calculate the probability that a colored (not white) ball is taken without looking.

Solution. Let the event A be "a red ball is taken" and the event B be "a blue ball is taken". Then A + B is the event "a colored (not white) ball is taken". Find the probability of the event A:

P(A) = 10/30 = 1/3,

and of the event B:

P(B) = 5/30 = 1/6.

Events A and B are mutually incompatible, since only one ball is taken, and it cannot be of two colors at once. Therefore, we use the addition of probabilities:

P(A + B) = P(A) + P(B) = 1/3 + 1/6 = 1/2.
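Example 1 can also be checked with a short Monte Carlo simulation (a sketch; the seed is fixed only for reproducibility):

```python
import random

# Example 1: 10 red, 5 blue and 15 white balls; P(colored) = 1/3 + 1/6 = 1/2.
random.seed(0)
balls = ["red"] * 10 + ["blue"] * 5 + ["white"] * 15
trials = 100_000
colored = sum(random.choice(balls) != "white" for _ in range(trials))
print(colored / trials)  # close to 0.5
```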

The theorem of addition of probabilities for several incompatible events. If the events A1, A2, …, An make up a complete set of events, then the sum of their probabilities is equal to 1:

P(A1) + P(A2) + … + P(An) = 1.

The sum of the probabilities of opposite events is also equal to 1:

P(A) + P(Ā) = 1.

Opposite events form a complete set of events, and the probability of a complete set of events is 1.

The probabilities of opposite events are usually denoted by the small letters p and q. In particular,

p + q = 1,

from which the following formulas for the probabilities of opposite events follow:

q = 1 - p and p = 1 - q.

Example 2 The target in a shooting range is divided into 3 zones. The probability that a certain shooter hits the first zone is 0.15, the second zone 0.23, and the third zone 0.17. Find the probability that the shooter hits the target and the probability that the shooter misses the target.

Solution: The zones are incompatible, so the probability of hitting the target is

P(hit) = 0.15 + 0.23 + 0.17 = 0.55.

The miss is the opposite event, so

P(miss) = 1 - 0.55 = 0.45.
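Example 2 in code form: the zones are incompatible, so the probabilities add, and the miss is the opposite event of the hit.

```python
# Example 2: probabilities of hitting the three zones of the target.
zone_probs = [0.15, 0.23, 0.17]

p_hit = sum(zone_probs)   # addition theorem for incompatible events
p_miss = 1 - p_hit        # probability of the opposite event
print(p_hit, p_miss)      # 0.55 and 0.45 (up to floating-point rounding)
```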

More difficult tasks in which you need to apply both addition and multiplication of probabilities - on the page "Various tasks for addition and multiplication of probabilities" .

Addition of probabilities of mutually joint events

Two random events are said to be joint (compatible) if the occurrence of one does not preclude the occurrence of the other in the same observation. For example, when throwing a die, let the event A be the occurrence of the number 4 and the event B the occurrence of an even number. Since 4 is an even number, the two events are compatible. In practice, there are problems on calculating the probability that one of several mutually joint events occurs.

The theorem of addition of probabilities for joint events. The probability that one of two joint events will occur is equal to the sum of the probabilities of these events minus the probability of their joint occurrence, that is, the probability of the product of the events. The formula for the probabilities of joint events is as follows:

P(A + B) = P(A) + P(B) - P(AB).

Because the events A and B are compatible, the event A + B occurs if one of three incompatible events occurs: AB̄, ĀB, or AB. By the theorem of addition of incompatible events, we calculate as follows:

P(A + B) = P(AB̄) + P(ĀB) + P(AB). (5)

The event A occurs if one of two incompatible events occurs: AB̄ or AB. The probability of occurrence of one event out of several incompatible events is equal to the sum of the probabilities of all these events:

P(A) = P(AB̄) + P(AB), whence P(AB̄) = P(A) - P(AB). (6)

Similarly:

P(B) = P(ĀB) + P(AB), whence P(ĀB) = P(B) - P(AB). (7)

Substituting expressions (6) and (7) into expression (5), we obtain the probability formula for joint events:

P(A + B) = P(A) + P(B) - P(AB). (8)

When using formula (8), it should be taken into account that the events A and B can be:

  • mutually independent;
  • mutually dependent.

The probability formula for mutually independent events:

P(A + B) = P(A) + P(B) - P(A)·P(B).

The probability formula for mutually dependent events:

P(A + B) = P(A) + P(B) - P(A)·P(B|A).

If events A and B are incompatible, then their joint occurrence is an impossible event, and thus P(AB) = 0. The probability formula for incompatible events is then:

P(A + B) = P(A) + P(B).
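The die example from above illustrates formula (8). With A = "a 4 is rolled" and B = "an even number is rolled", the product AB coincides with A, so P(AB) = 1/6:

```python
from fractions import Fraction

# A = "a 4 is rolled", B = "an even number is rolled" on one die.
p_a = Fraction(1, 6)   # one face out of six
p_b = Fraction(3, 6)   # faces 2, 4 and 6
p_ab = Fraction(1, 6)  # AB = A, since 4 is even

p_a_or_b = p_a + p_b - p_ab  # formula (8) for joint events
print(p_a_or_b)  # 1/2
```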

Example 3 In auto racing, the probability of winning with the first car and the probability of winning with the second car are given. Find:

  • the probability that both cars will win;
  • the probability that at least one car will win;

1) The probability that the first car wins does not depend on the result of the second car, so the events A (the first car wins) and B (the second car wins) are independent events. Find the probability that both cars win:

P(AB) = P(A)·P(B).

2) Find the probability that at least one of the two cars wins:

P(A + B) = P(A) + P(B) - P(A)·P(B).

More difficult tasks in which you need to apply both addition and multiplication of probabilities - on the page "Various tasks for addition and multiplication of probabilities" .

Solve the problem of addition of probabilities yourself, and then look at the solution

Example 4 Two coins are tossed. Event A is heads on the first coin; event B is heads on the second coin. Find the probability of the event C = A + B.

Probability multiplication

Multiplication of probabilities is used when the probability of a logical product of events is to be calculated.

In this case, random events must be independent. Two events are said to be mutually independent if the occurrence of one event does not affect the probability of the occurrence of the second event.

Probability multiplication theorem for independent events. The probability of the simultaneous occurrence of two independent events A and B is equal to the product of the probabilities of these events and is calculated by the formula:

P(AB) = P(A)·P(B).

Example 5 A coin is tossed three times in a row. Find the probability that heads comes up all three times.

Solution. The probability that heads comes up is 1/2 on the first toss of the coin, 1/2 on the second, and 1/2 on the third. The probability that heads comes up all three times:

P = (1/2)·(1/2)·(1/2) = 1/8.
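Example 5 can also be verified by exhaustively enumerating all 2³ = 8 equally likely outcomes:

```python
from itertools import product

# All equally likely outcomes of three coin tosses.
outcomes = list(product(["heads", "tails"], repeat=3))
# Count the single favorable outcome: heads on every toss.
favorable = sum(all(side == "heads" for side in o) for o in outcomes)
print(favorable, "/", len(outcomes))  # 1 / 8
```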

Solve problems for multiplying probabilities yourself, and then look at the solution

Example 6 There is a box with nine new tennis balls. Three balls are taken for a game and put back afterwards; when the balls are chosen, played and unplayed balls are not distinguished. What is the probability that after three games there will be no unplayed balls left in the box?

Example 7 32 letters of the Russian alphabet are written on cut alphabet cards. Five cards are drawn at random, one after the other, and placed on the table in the order in which they appear. Find the probability that the letters will form the word "end".

Example 8 From a full deck of cards (52 cards), four cards are drawn at once. Find the probability that all four of these cards are of the same suit.

Example 9 The same problem as in example 8, but each card is returned to the deck after being drawn.

More complex tasks, in which you need to apply both addition and multiplication of probabilities, as well as calculate the product of several events - on the page "Various tasks for addition and multiplication of probabilities" .

The probability that at least one of several mutually independent events occurs can be calculated by subtracting the product of the probabilities of the opposite events from 1, that is, by the formula

P = 1 - q1·q2·…·qn, where qi = 1 - pi is the probability of the event opposite to the i-th event.
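This "at least one" rule can be sketched as a small helper function; the probabilities passed in below are arbitrary illustrative values:

```python
# P(at least one of n independent events) = 1 - q1*q2*...*qn, where qi = 1 - pi.
def p_at_least_one(probs):
    q = 1.0
    for p in probs:
        q *= 1 - p  # probability that none of the events occurs
    return 1 - q

# Sanity checks with simple values:
print(p_at_least_one([0.5]))       # 0.5
print(p_at_least_one([0.5, 0.5]))  # 0.75
```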