Probability theory and statistics. Basic concepts of probability theory. Laws of probability theory. The classical definition of probability

What is probability?

The first time I encountered this term, I did not understand what it meant. So I will try to explain it clearly.

Probability is the chance that the event we want will happen.

For example, you decided to go to a friend's house; you remember the entrance and even the floor on which he lives. But you forgot the number and location of the apartment. And now you are standing on the landing, and in front of you there are three doors to choose from.

What is the chance (probability) that if you ring the first doorbell, your friend will answer the door? There are only three apartments, and your friend lives behind only one of them. We can choose any door with equal chance.

But what is this chance?

There are three doors, and only one of them is the right one. The probability of guessing by ringing the first doorbell is 1/3. That is, one time out of three you will guess correctly.

We want to know, having called once, how often will we guess the door? Let's look at all the options:

  1. You ring the 1st door
  2. You ring the 2nd door
  3. You ring the 3rd door

Now let’s look at all the options where a friend could be:

a. Behind the 1st door
b. Behind the 2nd door
c. Behind the 3rd door

Let's compare all the options in a table. A checkmark marks the options where your choice coincides with your friend's location, a cross marks those where it does not:

                          Friend behind 1st   Friend behind 2nd   Friend behind 3rd
    You ring 1st door            ✓                   ✗                   ✗
    You ring 2nd door            ✗                   ✓                   ✗
    You ring 3rd door            ✗                   ✗                   ✓

As you can see, there are 9 possible combinations of your friend's location and your choice of which door to ring.

Out of these 9 outcomes, 3 are favorable. That is, by ringing the doorbell once you will guess right one time in three, i.e. the probability is 1/3.

This is probability: the ratio of the number of favorable outcomes (when your choice coincides with your friend's location) to the number of all possible outcomes.

The definition, written as a formula (probability is usually denoted by p):

p = (number of favorable outcomes) / (total number of outcomes)

It is not very convenient to write the formula out in words, so let us denote the number of favorable outcomes by m and the total number of outcomes by n. Then p = m/n.

The probability can also be written as a percentage; to do this, you need to multiply the result by 100%.
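
To make the formula concrete, here is a minimal Python sketch of the classical definition p = m/n (the function name and the doorbell numbers are illustrative, not something from the original text):

    # Classical probability: the ratio of favorable outcomes to all outcomes.
    def classical_probability(favorable: int, total: int) -> float:
        return favorable / total

    p = classical_probability(1, 3)   # one "friend's door" out of three
    print(p)                          # 0.333...
    print(f"{p * 100:.0f}%")          # as a percentage: multiply by 100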

The word “outcomes” probably caught your eye. Mathematicians call various actions (in our case, such an action is ringing a doorbell) experiments, and the result of such an experiment is usually called an outcome.

Well, there are favorable and unfavorable outcomes.

Let's go back to our example. Let's say we rang one of the doors, but a stranger opened it for us. We didn't guess right. What is the probability that if we ring one of the remaining doors, our friend will open it for us?

If you thought the probability is still 1/3, that is a mistake. Let's figure it out.

We have two doors left, so there are 2 possible choices:

1) Ring the 1st door
2) Ring the 2nd door

The friend, despite all this, is definitely behind one of them (after all, he wasn't behind the one we rang):

a) The friend is behind the 1st door
b) The friend is behind the 2nd door

Let's draw the table again:

                          Friend behind 1st   Friend behind 2nd
    You ring 1st door            ✓                   ✗
    You ring 2nd door            ✗                   ✓

As you can see, there are only 4 options, of which 2 are favorable. That is, the probability is 2/4 = 1/2.

Why not 1/3?

The situation we considered is an example of dependent events. The first event is the first doorbell ring, the second event is the second doorbell ring.

They are called dependent because each event influences the ones that follow. After all, if the friend had answered the first ring, what would be the probability that he is behind one of the other two doors? Right, 0.
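
A quick way to convince yourself is a short simulation. This is only a sketch under the assumptions of the example (3 doors, a friend behind exactly one of them); it estimates the conditional probability of guessing right on the second ring, given that the first ring was wrong, and the result comes out near 1/2 rather than 1/3:

    import random

    trials, hits, conditioned = 100_000, 0, 0
    for _ in range(trials):
        friend = random.randint(1, 3)         # the door the friend is actually behind
        first = random.randint(1, 3)          # the door we ring first
        if first == friend:
            continue                          # we only keep the "first guess was wrong" cases
        conditioned += 1
        remaining = [d for d in (1, 2, 3) if d != first]
        second = random.choice(remaining)     # ring one of the two remaining doors at random
        if second == friend:
            hits += 1

    print(hits / conditioned)                 # close to 0.5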

But if there are dependent events, then there must also be independent ones? That's right, they exist.

A textbook example is tossing a coin.

  1. Toss a coin once. What is the probability of getting heads? That's right, 1/2, because there are only 2 options (either heads or tails; we will neglect the probability of the coin landing on its edge), and only one of them suits us.
  2. Say it came up heads. Okay, let's toss it again. What is the probability of getting heads now? Nothing has changed, everything is the same. How many options? Two. How many suit us? One.

Even if it comes up heads a thousand times in a row, the probability of getting heads on the next toss will still be the same: 1/2. There are always 2 options, and 1 favorable one.

It is easy to distinguish dependent events from independent ones:

  1. If the experiment is carried out once (they throw a coin once, ring the doorbell once, etc.), then the events are always independent.
  2. If an experiment is carried out several times (a coin is tossed several times, the doorbell is rung several times), then the first event is always independent. After that, if the number of favorable outcomes or the total number of outcomes changes, the events are dependent; if not, they are independent.

Let's practice determining probability a little.

Example 1.

The coin is tossed twice. What is the probability of getting heads twice in a row?

Solution:

Let's consider all possible options:

  1. Heads-heads
  2. Heads-tails
  3. Tails-heads
  4. Tails-tails

As you can see, there are 4 options in total. Only 1 of them suits us. That is, the probability is 1/4 = 0.25.

If the problem simply asks you to find the probability, then the answer should be given as a decimal. If it specified that the answer should be given as a percentage, then we would multiply by 100% and get 25%.

Answer: 0.25.
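
If you like, you can check Example 1 by brute force. This is a small sketch that simply enumerates all outcomes of two tosses (the letters H and T are our own shorthand for heads and tails):

    from itertools import product

    outcomes = list(product("HT", repeat=2))            # HH, HT, TH, TT
    favorable = [o for o in outcomes if o == ("H", "H")]
    print(len(favorable) / len(outcomes))               # 0.25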

Example 2.

In a box of chocolates, all the chocolates are packaged in identical wrappers. However, among the sweets there are some with nuts, some with cognac, some with cherries, some with caramel and some with nougat.

What is the probability of taking one candy and getting a candy with nuts? Give your answer as a percentage.

Solution:

How many possible outcomes are there? As many as there are chocolates in the box.

That is, if you take one candy, it will be one of those available in the box.

How many favorable outcomes?

As many as there are chocolates with nuts in the box.

Answer:

Example 3.

A box contains white and black balls.

  1. What is the probability of drawing a white ball?
  2. We added more black balls to the box. What is now the probability of drawing a white ball?

Solution:

a) The box contains a certain total number of balls, of which some are white.

The probability is:

b) Now there are more balls in the box, while the number of white ones has not changed.

Answer:

Total probability

The probability of all possible events taken together is equal to 1 (100%).

Let's say there are red and green balls in a box. What is the probability of drawing a red ball? Green ball? Red or green ball?

Probability of drawing a red ball

Green ball:

Red or green ball: the sum of the two probabilities above, i.e. 1.

As you can see, the sum of the probabilities of all possible events is equal to 1 (100%). Understanding this point will help you solve many problems.

Example 4.

There are markers in the box: green, red, blue, yellow, black.

What is the probability of drawing NOT a red marker?

Solution:

Let's count the number of favorable outcomes.

NOT a red marker means a green, blue, yellow or black one.

The probability of all events together is 1, and we know the probability of the event we consider unfavorable (drawing the red marker).

Thus, the probability of drawing a NOT red marker is 1 minus the probability of drawing a red one.

Answer:

The probability that an event will not occur is equal to 1 minus the probability that the event will occur.

Rule for multiplying the probabilities of independent events

You already know what independent events are.

What if you need to find the probability that two (or more) independent events will occur in a row?

Let's say we want to know the probability that if we flip a coin twice, we will see heads both times.

We have already worked this out above: 1/4.

What if we toss a coin three times? What is the probability of getting heads three times in a row?

Total possible options:

  1. Heads-heads-heads
  2. Heads-heads-tails
  3. Heads-tails-heads
  4. Heads-tails-tails
  5. Tails-heads-heads
  6. Tails-heads-tails
  7. Tails-tails-heads
  8. Tails-tails-tails

I don't know about you, but I made mistakes several times when compiling this list. Wow! And only 1 option out of 8 (the first) suits us, so the probability is 1/8.

For 5 throws, you can make a list of possible outcomes yourself. But mathematicians are not as hardworking as you.

Therefore, they first noticed and then proved that the probability of a given sequence of independent events is obtained by multiplying the probabilities of the individual events.

In other words, P(A and B and C) = P(A) · P(B) · P(C).

Let's look at the example of the same ill-fated coin.

The probability of getting heads in one toss? 1/2. Now let's flip the coin five times.

What is the probability of getting heads five times in a row? 1/2 · 1/2 · 1/2 · 1/2 · 1/2 = 1/32.

This rule works not only when we are asked to find the probability that the same event will happen several times in a row.

If we wanted to find the probability of the sequence TAILS-HEADS-TAILS in three consecutive tosses, we would do the same.

The probability of landing tails is 1/2, and of heads is also 1/2.

The probability of getting the sequence TAILS-HEADS-TAILS: 1/2 · 1/2 · 1/2 = 1/8.

You can check it yourself by making a table.
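
Instead of a table, you can also check it with a short enumeration. This is a sketch for the three-toss sequence tails-heads-tails discussed above (H and T are our shorthand for heads and tails):

    from itertools import product

    outcomes = list(product("HT", repeat=3))
    target = ("T", "H", "T")
    print(sum(o == target for o in outcomes) / len(outcomes))   # 0.125
    print((1/2) ** 3)                                           # the multiplication rule gives the same 1/8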

The rule for adding the probabilities of incompatible events.

So stop! New definition.

Let's figure it out. Let's take our worn-out coin and toss it three times.
Possible options:

  1. Heads-heads-heads
  2. Heads-heads-tails
  3. Heads-tails-heads
  4. Heads-tails-tails
  5. Tails-heads-heads
  6. Tails-heads-tails
  7. Tails-tails-heads
  8. Tails-tails-tails

Each of these sequences is a separate event, and such events are incompatible: in one experiment only one of them can occur. For example, Heads-heads-heads and Heads-heads-tails cannot both happen in the same three tosses.

If we want to determine the probability that one of two (or more) incompatible events occurs, then we add the probabilities of these events.

You need to understand that getting heads on one toss and getting tails on another are two independent events.

If we want to determine the probability of a specific sequence occurring, then we use the rule of multiplying probabilities.
What is the probability of getting heads on the first toss and tails on the second and third? 1/2 · 1/2 · 1/2 = 1/8.

But if we want to know the probability of getting one of several sequences, for example, heads coming up exactly once, i.e. the options Heads-tails-tails, Tails-heads-tails and Tails-tails-heads, then we must add up the probabilities of these sequences.

In total, 3 of the 8 options suit us, so the probability is 3/8.

We can get the same thing by adding up the probabilities of occurrence of each sequence: 1/8 + 1/8 + 1/8 = 3/8.

Thus, we add probabilities when we want to determine the probability of one of several incompatible sequences of events.

There is a great rule to help you avoid getting confused when to multiply and when to add:

Let's go back to the example where we tossed a coin three times and wanted to know the probability of seeing heads exactly once.
What has to happen?

It should come up:
(heads AND tails AND tails) OR (tails AND heads AND tails) OR (tails AND tails AND heads).
This is how it works out: (1/2 · 1/2 · 1/2) + (1/2 · 1/2 · 1/2) + (1/2 · 1/2 · 1/2) = 3/8.
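
The same "exactly one heads in three tosses" case can be checked by enumeration. A sketch (H and T are our shorthand for heads and tails):

    from itertools import product

    outcomes = list(product("HT", repeat=3))
    favorable = [o for o in outcomes if o.count("H") == 1]
    print(len(favorable) / len(outcomes))      # 3/8 = 0.375
    print(3 * (1/2) * (1/2) * (1/2))           # adding the three products gives the same value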

Let's look at a few examples.

Example 5.

There are pencils in the box: red, green, orange, yellow and black ones. What is the probability of drawing a red or a green pencil?

Solution:

What has to happen? We have to draw a (red OR green) pencil.

Now it’s clear, let’s add up the probabilities of these events:

Answer:

Example 6.

If a die is thrown twice, what is the probability of getting a total of 8?

Solution.

How can we get 8 points?

(2 and 6) or (3 and 5) or (4 and 4) or (5 and 3) or (6 and 2).

The probability of getting any one particular face is 1/6.

We calculate the probability: 5 · (1/6 · 1/6) = 5/36.

Answer: 5/36.
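
For Example 6 the favorable pairs are easy to count programmatically. A small sketch that enumerates all 36 ordered rolls of two dice:

    from itertools import product

    pairs = [(a, b) for a, b in product(range(1, 7), repeat=2) if a + b == 8]
    print(pairs)                    # (2, 6), (3, 5), (4, 4), (5, 3), (6, 2)
    print(len(pairs) / 36)          # 5/36 ≈ 0.139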

Training.

I think you now understand when you need to calculate probabilities, when to add them and when to multiply them. Isn't that so? Let's practice a little.

Tasks:

Let's take a deck of 52 cards: 13 spades, 13 hearts, 13 clubs and 13 diamonds, from 2 to Ace in each suit.

  1. What is the probability of drawing two clubs in a row (we put the first card drawn back into the deck and shuffle it)?
  2. What is the probability of drawing a black card (spades or clubs)?
  3. What is the probability of drawing a picture (jack, queen, king or ace)?
  4. What is the probability of drawing two pictures in a row (we remove the first card drawn from the deck)?
  5. What is the probability, drawing two cards, of collecting the combination (jack, queen or king) and an ace? The order in which the cards are drawn does not matter.

Answers:

  1. In the deck there are 4 cards of each value, which means:
  2. The events are dependent, since after the first card is drawn the number of cards in the deck decreases (as does the number of “pictures”). There are 16 jacks, queens, kings and aces in the deck initially (4 of each), which means the probability of drawing a “picture” with the first card is 16/52 = 4/13.

    Since we remove the first card from the deck, there are 51 cards left in it, including 15 pictures. The probability of drawing a picture with the second card is 15/51 = 5/17.

    Since we are interested in the situation where we take a “picture” AND then a “picture” from the deck, we need to multiply the probabilities: (16/52) · (15/51) = 20/221 ≈ 0.09.

    Answer: 20/221 ≈ 0.09.

  3. After the first card is drawn, the number of cards in the deck decreases. Thus, two options suit us:
    1) The first card is an ace and the second is a jack, queen or king: 4/52 · 12/51.
    2) The first card is a jack, queen or king and the second is an ace: 12/52 · 4/51. So we need (ace AND (jack or queen or king)) OR ((jack or queen or king) AND ace): 4/52 · 12/51 + 12/52 · 4/51 = 8/221 ≈ 0.036. Don't forget that the number of cards in the deck decreases!
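
As a cross-check of the arithmetic in answers 2 and 3, here is a short sketch. It assumes the standard 52-card deck described above, with 16 "pictures" (4 jacks, 4 queens, 4 kings, 4 aces):

    # Two "pictures" in a row without returning the first card (dependent events).
    p_two_pictures = (16 / 52) * (15 / 51)
    print(p_two_pictures)           # ≈ 0.0905 (= 20/221)

    # (Jack, queen or king) and an ace, drawn in either order.
    p_combo = (4 / 52) * (12 / 51) + (12 / 52) * (4 / 51)
    print(p_combo)                  # ≈ 0.036 (= 8/221)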

If you were able to solve all the problems yourself, then you are great! Now you will crack probability theory problems in the Unified State Exam like nuts!

PROBABILITY THEORY. INTERMEDIATE LEVEL

Let's look at an example. Say we throw a die. What kind of thing is a die, do you know? It is a cube with numbers on its faces. As many faces as it has, that many numbers: from 1 to... how many? To 6.

So we roll the die and want one of two particular faces to come up, and one of them does.

In probability theory they say that a favorable event has occurred (not to be confused with a fortunate one).

If the other of those two faces had come up, the event would also have been favorable. In total, only two favorable events can happen.

How many are unfavorable? Since there are 6 possible events in total, the unfavorable ones number 4 (if any of the other four faces comes up).

Definition:

Probability is the ratio of the number of favorable events to the number of all possible events. That is, probability shows what proportion of all possible events are favorable.

Probability is denoted by the Latin letter p (apparently from the English word probability).

Probability is often expressed as a percentage; to do this, the probability value must be multiplied by 100%. In the example with the die, the probability is 2/6 = 1/3.

And as a percentage: approximately 33%.

Examples (decide for yourself):

  1. What is the probability of getting heads when tossing a coin? And what is the probability of getting tails?
  2. When throwing a die, what is the probability of getting an even number? An odd one?
  3. A box contains plain, blue and red pencils. We draw one pencil at random. What is the probability of getting a plain one?

Solutions:

  1. How many options are there? Heads and tails, just two. How many of them are favorable? Only one: heads. So the probability is 1/2.

    It's the same with tails: 1/2.

  2. Total options: 6 (the die has as many faces as there are options). Favorable ones: 3 (the even numbers 2, 4 and 6).
    The probability is 3/6 = 1/2. Of course, it's the same with odd numbers.
  3. Total: the number of pencils in the box. Favorable: the number of plain ones. The probability is their ratio.

Total probability

All the pencils in the box are green. What is the probability of drawing a red pencil? There is no chance: the probability is 0 (after all, there are 0 favorable events).

Such an event is called impossible.

What is the probability of drawing a green pencil? There are exactly as many favorable events as there are events in total (all events are favorable). So the probability is equal to 1, or 100%.

Such an event is called certain.

If a box contains green and red pencils, what is the probability of drawing a green or a red one? Once again, it is 1. Let's note this: there is a probability of drawing a green pencil and a probability of drawing a red one.

In sum, these probabilities are exactly equal to 1. That is, the sum of the probabilities of all possible events is equal to 1, or 100%.

Example:

A box contains pencils: some of them are blue, some red, some green, some plain, some yellow, and the rest are orange. What is the probability of not drawing a green one?

Solution:

We remember that all the probabilities add up to 1, and we know the probability of drawing a green pencil. This means that the probability of not drawing a green one is 1 minus that value.

Remember this trick: the probability that an event will not occur is equal to 1 minus the probability that the event will occur.

Independent events and the multiplication rule

You flip a coin twice and want it to come up heads both times. What is the probability of this?

Let's go through all the possible options and determine how many there are:

Heads-heads, tails-heads, heads-tails, tails-tails. What else? Nothing.

There are 4 options in total. Of these, only one suits us: heads-heads. So the probability is 1/4.

Fine. Now let's flip a coin three times. Do the math yourself. Did you get it? (The answer is 1/8.)

You may have noticed that with each additional toss the probability is halved. The general rule is called the multiplication rule:

The probabilities of independent events are multiplied.

What are independent events? Everything is logical: these are those that do not depend on each other. For example, when we throw a coin several times, each time a new throw is made, the result of which does not depend on all previous throws. We can just as easily throw two different coins at the same time.

More examples:

  1. A die is thrown twice. What is the probability of getting the same given number both times?
  2. A coin is tossed three times. What is the probability that it will come up heads the first time, and then tails twice?
  3. The player rolls two dice. What is the probability that the sum of the numbers on them will be equal to 12?

Answers:

  1. The events are independent, which means the multiplication rule works: 1/6 · 1/6 = 1/36.
  2. The probability of heads is 1/2. The probability of tails is the same. Multiply: 1/2 · 1/2 · 1/2 = 1/8.
  3. 12 can only be obtained if two sixes are rolled: 1/6 · 1/6 = 1/36.

Incompatible events and the addition rule

Events that complement each other up to the full probability are called incompatible. As the name suggests, they cannot happen simultaneously: for example, when we flip a coin, it comes up either heads or tails, never both at once.

Example.

A box contains pencils: some of them are blue, some red, some green, some plain, some yellow, and the rest are orange. What is the probability of drawing a green or a red one?

Solution .

The probability of drawing a green pencil is known, and so is the probability of drawing a red one.

The favorable events are all the green pencils plus all the red ones. This means that the probability of drawing a green or a red pencil is equal to the sum of those two probabilities.

The same probability can therefore be written in the form P(green) + P(red).

This is the addition rule: the probabilities of incompatible events add up.

Mixed type problems

Example.

The coin is tossed twice. What is the probability that the results of the two tosses will be different?

Solution .

This means that if the first result is heads, the second must be tails, and vice versa. It turns out that there are two pairs of independent events, and these pairs are incompatible with each other. How do you avoid getting confused about where to multiply and where to add?

There is a simple rule for such situations. Try to describe what is going to happen using the conjunctions “AND” or “OR”. For example, in this case:

It should come up (heads and tails) or (tails and heads).

Where there is the conjunction “and” there will be multiplication, and where there is “or” there will be addition: (1/2 · 1/2) + (1/2 · 1/2) = 1/2.

Try it yourself:

  1. What is the probability that if a coin is tossed twice, the coin will land on the same side both times?
  2. A die is thrown twice. What is the probability of getting a certain total number of points?

Solutions:

  1. (Heads and heads) or (tails and tails): 1/2 · 1/2 + 1/2 · 1/2 = 1/2.
  2. What are the options for the required total? There are three pairs of faces that give it. Then:
    It comes up (... and ...) or (... and ...) or (... and ...): 3 · (1/6 · 1/6) = 3/36 = 1/12.

Another example:

Toss a coin three times. What is the probability that heads will appear at least once?

Solution:

Oh, how I don't want to go through all the options... heads-tails-tails, heads-heads-tails... But there's no need! Let's remember the total probability. Do you remember? What is the probability that heads never comes up? It's simple: tails would have to come up every time, so that probability is 1/2 · 1/2 · 1/2 = 1/8. Then the probability of getting heads at least once is 1 - 1/8 = 7/8.
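
The complement trick is easy to verify by enumeration as well. A sketch for three tosses (H and T are our shorthand for heads and tails):

    from itertools import product

    outcomes = list(product("HT", repeat=3))
    at_least_one = sum("H" in o for o in outcomes) / len(outcomes)
    print(at_least_one)             # 0.875
    print(1 - (1/2) ** 3)           # 1 - P(no heads at all) = 7/8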

PROBABILITY THEORY. BRIEFLY ABOUT THE MAIN THINGS

Probability is the ratio of the number of favorable events to the number of all possible events.

Independent events

Two events are independent if the occurrence of one does not change the probability of the other occurring.

Total probability

The probability of all possible events taken together is equal to 1 (100%).

The probability that an event will not occur is equal to 1 minus the probability that the event will occur.

Rule for multiplying the probabilities of independent events

The probability of a certain sequence of independent events is equal to the product of the probabilities of each event.

Incompatible events

Incompatible events are those that cannot possibly occur simultaneously as a result of an experiment. A number of incompatible events form a complete group of events.

The probabilities of incompatible events add up.

Describe what should happen using the conjunctions “AND” and “OR”; then instead of “AND” put a multiplication sign, and instead of “OR” put an addition sign.


Mathematics for Programmers: Probability Theory

Ivan Kamyshan

Some programmers, after working on ordinary commercial applications, think about mastering machine learning and becoming data analysts. They often don't understand why certain methods work, and most machine learning methods seem like magic. In fact, machine learning is based on mathematical statistics, which in turn is based on probability theory. Therefore, in this article we will look at the basic concepts of probability theory: we will touch on the definitions of probability and distribution and analyze several simple examples.

You may know that probability theory is conventionally divided into two parts. Discrete probability theory studies phenomena that can be described by a distribution with a finite (or countable) number of possible outcomes (tossing dice or coins). Continuous probability theory studies phenomena distributed over some dense set, for example on a segment or in a circle.

The subject of probability theory can be illustrated with a simple example. Imagine yourself as the developer of a shooter. An integral part of developing games in this genre is the shooting mechanics. A shooter in which all weapons shoot with perfect accuracy will be of little interest to players, so spread must be added to the weapons. But simply randomizing the weapon's impact points will not allow fine tuning, so adjusting the game balance would be difficult. Using random variables and their distributions, however, you can analyze how a weapon will behave with a given spread and make the necessary adjustments.

Space of elementary outcomes

Let's say that from some random experiment that we can repeat many times (for example, tossing a coin), we can extract some formalized information (it came up heads or tails). This information is called an elementary outcome, and it is useful to consider the set of all elementary outcomes, often denoted by the letter Ω (Omega).

The structure of this space depends entirely on the nature of the experiment. For example, if we consider shooting at a sufficiently large circular target, the space of elementary outcomes will be a circle, for convenience, placed with the center at zero, and the outcome will be a point in this circle.

In addition, we consider sets of elementary outcomes, called events (for example, hitting the "ten" is a small concentric circle at the center of the target). In the discrete case everything is quite simple: we can build any event by including or excluding elementary outcomes in finite time. In the continuous case everything is much more complicated: we need a sufficiently good family of sets to work with, called an algebra by analogy with ordinary real numbers, which can be added, subtracted, divided and multiplied. Sets in an algebra can be intersected and united, and the result of the operation stays in the algebra. This is a very important property for the mathematics behind all these concepts. The minimal such family consists of only two sets: the empty set and the whole space of elementary outcomes.

Measure and probability

Probability is a way of making inferences about the behavior of very complex objects without understanding exactly how they work. Thus, probability is defined as a function of an event (taken from that good family of sets) that returns a number: some characteristic of how often such an event can occur in reality. For definiteness, mathematicians agreed that this number should lie between zero and one. In addition, this function must satisfy certain requirements: the probability of an impossible event is zero, the probability of the entire set of outcomes is one, and the probability of the union of two incompatible events (disjoint sets) equals the sum of their probabilities. Another name for probability is probability measure. Most often the Lebesgue measure is used, which generalizes the notions of length, area and volume to any dimension (n-dimensional volume) and is thus applicable to a wide class of sets.

Together, the set of elementary outcomes, the family of sets and the probability measure are called a probability space. Let's consider how we can construct a probability space for the target-shooting example.

Consider shooting at a large round target of radius R that is impossible to miss. As the set of elementary outcomes we take the disk of radius R centered at the origin. Since we are going to use area (the Lebesgue measure for two-dimensional sets) to describe the probability of an event, we take the family of measurable sets (those for which this measure exists).

Note: this is actually a technical point, and in simple problems the process of defining the measure and the family of sets does not play a special role. But it is necessary to understand that these two objects exist, because in many books on probability theory theorems begin with the words: “Let (Ω, Σ, P) be a probability space…”.

As mentioned above, the probability of the entire space of elementary outcomes must equal one. The area of the disk (its two-dimensional Lebesgue measure, which we denote λ₂(A), where A is an event), by the well-known school formula, is πR². Then we can introduce the probability P(A) = λ₂(A) / (πR²), and this value will lie between 0 and 1 for any event A.

If we assume that hitting any point of the target is equally probable, then finding the probability that the shooter hits some region of the target comes down to finding the area of that set (from this we can conclude that the probability of hitting a specific point is zero, because the area of a point is zero).

For example, suppose we want to find the probability that the shooter hits the "ten" (event A: the shooter hits the corresponding set). In our model the "ten" is represented by a circle centered at zero with radius r. Then the probability of hitting this circle is P(A) = λ₂(A)/(πR²) = πr²/(πR²) = (r/R)².

This is one of the simplest types of "geometric probability" problems - most of these problems require finding an area.
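
A Monte Carlo estimate is a natural sanity check here. The following Python sketch samples points uniformly in the disk of radius R and estimates the probability of landing in the inner circle of radius r; the values of R, r and the sample size are illustrative, not taken from the article:

    import random

    R, r, n = 1.0, 0.1, 200_000
    hits = 0
    for _ in range(n):
        # sample a point uniformly in the disk of radius R (rejection sampling)
        while True:
            x, y = random.uniform(-R, R), random.uniform(-R, R)
            if x * x + y * y <= R * R:
                break
        if x * x + y * y <= r * r:
            hits += 1

    print(hits / n, (r / R) ** 2)   # both values are close to 0.01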

Random variables

A random variable is a function that maps elementary outcomes to real numbers. For example, in the problem under consideration we can introduce the random variable ρ(ω), the distance from the point of impact to the center of the target. The simplicity of our model allows us to define the space of elementary outcomes explicitly: Ω = {ω = (x, y) : x² + y² ≤ R²}. Then the random variable is ρ(ω) = ρ(x, y) = √(x² + y²).

Abstracting from the probability space: the distribution function and density

It is good when the structure of the space is well known, but in reality this is not always the case. Even if the structure of a space is known, it can be complex. To describe random variables when their explicit expression is unknown, there is the concept of a distribution function, denoted F_ξ(x) = P(ξ < x) (the subscript ξ denotes the random variable). That is, it is the probability of the set of all elementary outcomes for which the value of the random variable ξ is less than the given parameter x.

The distribution function has several properties:

  1. Firstly, it is between 0 and 1.
  2. Secondly, it does not decrease when its argument x increases.
  3. Thirdly, when x is a very large negative number, the distribution function is close to 0, and when x is very large and positive, it is close to 1.

The meaning of this construction may not be very clear on first reading. One useful property is that the distribution function lets you find the probability that the variable takes a value in a half-open interval [a, b): P(a ≤ ξ < b) = F_ξ(b) - F_ξ(a). Based on this equality, we can study how this value changes when the boundaries a and b of the interval are close to each other.

Let d = b - a; then b = a + d, and therefore F_ξ(b) - F_ξ(a) = F_ξ(a + d) - F_ξ(a). For small values of d this difference is also small (if the distribution is continuous). It makes sense to consider the ratio p_ξ(a, d) = (F_ξ(a + d) - F_ξ(a)) / d. If, for sufficiently small values of d, this ratio differs little from some constant p_ξ(a) independent of d, then at that point the random variable has a density equal to p_ξ(a).

Note: readers who have already encountered the concept of a derivative may notice that p_ξ(a) is the derivative of the function F_ξ(x) at the point a. In any case, you can study the concept of a derivative in an article on this topic on the Mathprofi website.

Now the meaning of the distribution function can be described as follows: its derivative (the density p_ξ defined above) at a point a describes how often the random variable falls into a small interval centered at a (a neighborhood of a) compared to neighborhoods of other points. In other words, the faster the distribution function grows, the more likely such a value is to appear in a random experiment.

Let's go back to the example. We can calculate the distribution function for the random variable ρ(ω) = ρ(x, y) = √(x² + y²), which denotes the distance from the center to the random point of impact on the target. By definition, F_ρ(t) = P(ρ(x, y) < t), i.e. the set {ρ(x, y) < t} consists of those points (x, y) whose distance from zero is less than t. We already computed the probability of such an event when we calculated the probability of hitting the "ten": it equals t²/R². Thus F_ρ(t) = P(ρ(x, y) < t) = t²/R² for 0 ≤ t ≤ R.

We can find the density p_ρ of this random variable. Note right away that outside the interval [0, R] it is zero, because the distribution function is constant there. At the ends of this interval the density is not defined. Inside the interval it can be found using a table of derivatives (for example, from the Mathprofi website) and the elementary rules of differentiation. The derivative of t²/R² is 2t/R². Thus we have found the density on the whole real axis.
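
Here is a sketch that compares the empirical behaviour of ρ with the formulas F_ρ(t) = t²/R² and p_ρ(t) = 2t/R², using the same uniform-in-a-disk model as above; R, the grid of points and the step d are illustrative choices:

    import math
    import random

    R, n = 1.0, 200_000
    samples = []
    while len(samples) < n:
        x, y = random.uniform(-R, R), random.uniform(-R, R)
        if x * x + y * y <= R * R:
            samples.append(math.sqrt(x * x + y * y))

    for t in (0.25, 0.5, 0.75):
        empirical = sum(s < t for s in samples) / n
        print(t, empirical, t * t / R ** 2)            # empirical CDF vs. t^2/R^2

    # finite-difference estimate of the density at a point a, compared with 2a/R^2
    a, d = 0.5, 0.01
    emp_density = (sum(s < a + d for s in samples) - sum(s < a for s in samples)) / (n * d)
    print(emp_density, 2 * a / R ** 2)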

Another useful property of the density: the probability that the random variable takes a value in an interval is calculated as the integral of the density over this interval (you can find out what an integral is in the articles about definite, improper and indefinite integrals on the Mathprofi website).

On first reading, the integral of a function f(x) over an interval [a, b] can be thought of as the area of a curvilinear trapezoid. Its sides are the segment [a, b] of the Ox axis (the horizontal coordinate axis), the vertical segments connecting the points (a, f(a)) and (b, f(b)) on the curve with the points (a, 0) and (b, 0) on the Ox axis, and a fragment of the graph of the function f from (a, f(a)) to (b, f(b)). We can also speak of the integral over the interval (-∞; b]: for sufficiently large negative values of a, the value of the integral over [a, b] changes negligibly as a decreases further. Integrals over other unbounded intervals are defined in a similar way.
