Change in the entropy of an ideal system. School Encyclopedia
In the previous section, we proceeded from the basic assumption that for any system there is a parameter called entropy and denoted by S. For a small reversible thermal interaction, the corresponding differential change in entropy is dS = δQ/T. We use this definition below to calculate entropy changes in some simple and well-known processes.
Entropy change during ice melting. Suppose that on a hot summer day we bring a thermos filled with a mixture of ice and water to a picnic. Since the insulation of the thermos is not perfect, the ice will gradually melt. Because melting occurs slowly, the temperature in the thermos remains almost unchanged, equal to 0°C. Let us calculate the change in entropy corresponding to the melting of 1 mol (or 18 g) of ice. The tabulated heat of fusion of ice is 79.67 cal/g, which gives about 1434 cal/mol. Then one can write

ΔS = ∫ δQ/T = Q_melt/T_melt ≈ 5.27 cal/(mol·K).   (19)
As before, the integral sign simply means the summation of the infinitesimal quantities δQ/T corresponding to each small amount of heat. The integration is particularly simple here because the temperature T does not change during melting. The factor 1/T can therefore be taken outside the integral sign, becoming just a multiplier, and the remaining integral is simply the total heat of the phase transition (melting) of ice, 1434 cal/mol. Relation (19) means that the entropy of 1 mol of water at 273 K is 5.27 cal/K higher than the entropy of 1 mol of ice at the same temperature.
Thus, when ice melts, entropy increases.
Conversely, if enough heat is removed from water at a temperature of 273 K to form 1 mole of ice at 273 K, the entropy of the system will decrease by 5.27 cal/(mol·K).
Note that throughout this section we have used the absolute (Kelvin) temperature in the denominator of the ratio Q/T. The absolute Rankine scale could also be used, provided the amount of heat were measured in Btu. Temperatures on the Celsius or Fahrenheit scales obviously cannot be used in the denominator (as even trained students sometimes try to do). Using the Celsius scale in the case under consideration, for example, we would arrive at an absurd result: the denominator of the expression would vanish. Note also that the units of the change in entropy coincide with the units of molar heat capacity. The change in entropy when 1 mol of ice melts at its normal freezing point is 5.27 cal/(mol·K).
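The calculation above reduces to a single division, Q/T. As a minimal numerical sketch (using the rounded values quoted in the text), it can be reproduced as follows:

```python
# Entropy change of a phase transition at constant temperature: Delta_S = Q / T.
# The heat of fusion (1434 cal/mol) and T = 273 K are the text's rounded values.

def entropy_of_transition(q_cal_per_mol: float, t_kelvin: float) -> float:
    """Entropy change in cal/(mol*K) for a constant-temperature phase transition."""
    return q_cal_per_mol / t_kelvin

delta_s_ice = entropy_of_transition(1434.0, 273.0)
print(f"dS(melting) = {delta_s_ice:.2f} cal/(mol*K)")  # ~5.25, close to the 5.27 quoted above
```

The small difference from the 5.27 cal/(mol·K) quoted in the text comes from rounding of the tabulated heat of fusion.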
Entropy change during boiling of water. Another well-known process that takes place at a fixed temperature is the transition of liquid water to steam at a pressure of 1 atm. The temperature at which water boils under normal conditions is, by definition, 100°C, or 373 K. The heat of vaporization at this temperature is 539 cal/g, or 9702 cal/mol. Then the change in entropy corresponding to the evaporation of 1 mole of water under normal conditions is

ΔS = Q_vap/T_boil = 9702 cal / 373 K ≈ 26.0 cal/(mol·K).
This calculation turned out to be so simple because the temperature did not change during the process.
Note that the entropy change on evaporation is almost 5 times greater than the entropy change on melting. The value 26.0 cal/(mol·K) is slightly higher than is usual in similar situations and points to the unusual properties of a substance such as water. For many "normal" (non-polar) liquids, the change in entropy on evaporation is approximately 21 cal/(mol·K). This rule was obtained empirically by the physicist Frederick Trouton (1863-1922) and is called "Trouton's rule". It provides a way to estimate the heat of vaporization of a given substance if the temperature at which it boils under normal conditions is known.
To find an approximate value for the heat of vaporization, it is enough to multiply the boiling point (expressed in kelvins) by the Trouton constant.
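A hedged sketch of this estimate (the Trouton constant of about 21 cal/(mol·K) and the benzene figures below are commonly quoted approximate values, not from the text):

```python
# Trouton's rule: Delta_S_vap ~ 21 cal/(mol*K) for "normal" non-polar liquids,
# so the molar heat of vaporization can be estimated as L ~ 21 * T_boil.
TROUTON_CONST = 21.0  # cal/(mol*K), empirical

def heat_of_vaporization_estimate(t_boil_k: float) -> float:
    """Rough molar heat of vaporization, cal/mol, via Trouton's rule."""
    return TROUTON_CONST * t_boil_k

# Example: benzene boils at ~353 K; the tabulated heat of vaporization is ~7350 cal/mol.
print(heat_of_vaporization_estimate(353.0))  # estimate ~7400 cal/mol
```

Water, as the text notes, violates the rule (ΔS ≈ 26 rather than 21 cal/(mol·K)), so the estimate would be poor for strongly polar liquids.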
Entropy change during isothermal expansion of an ideal gas. There is one more constant-temperature process that we have already encountered more than once: the reversible isothermal expansion of an ideal gas. If, along with the thermal interaction, there is only ordinary mechanical interaction (so that the elementary work is δW = p dV), the first law of thermodynamics for 1 mole of an ideal gas can be written as

δQ = dU + p dV = p dV
(here it is taken into account that dU = 0 for an ideal gas at constant temperature). Using the equation pV = RT, for dT = 0 (the constant-temperature condition) we can write

δQ = p dV = RT dV/V.
We already had to integrate this expression in Chap. 4, so here is the result immediately: Q = RT ln(V2/V1).
Since the temperature T remains constant, the expression for the corresponding entropy change is

ΔS = Q/T = R ln(V2/V1).   (24)
As is known, the gas constant R has the dimension cal/(mol·K), and the factor containing the logarithm is a dimensionless number, so the dimensions of the left and right sides of relation (24) coincide. Thus, an increase in volume (i.e., expansion) at constant temperature is accompanied by an increase in entropy.
Let us return to the case of boiling water. Let 1 mole of water evaporate; 1 mole of an ideal gas, as we remember, occupies a volume of about 22,400 cm³ under normal conditions (pressure 1 atm and temperature 273 K). At 373 K the corresponding volume will be 22,400 × (373/273), or about 30,600 cm³. Before evaporation, 1 mol of liquid occupied a volume of about 18 cm³, so the ratio of volumes is 30,600/18 ≈ 1700. According to equation (24), the change in entropy corresponding to the change in volume on evaporation is R ln 1700. Since R is approximately equal to 2 cal/(mol·K), the desired change in entropy is approximately 14.88 cal/(mol·K).
Calculating above the total change in entropy during the entire process of evaporation of 1 mol of water, we obtained a value of 26.0 cal/(mol·K). As we have now seen, slightly more than half of this value is associated with the change in volume when the liquid turns into vapor.
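The volume-change contribution can be checked numerically. A sketch with the volumes assumed in the text (22,400 cm³ molar gas volume at 273 K, 18 cm³ for the liquid):

```python
import math

# Part of the vaporization entropy coming purely from the volume change:
# Delta_S = R * ln(V_gas / V_liquid), per equation (24).
R_CAL = 1.987  # gas constant, cal/(mol*K)

v_gas = 22400.0 * (373.0 / 273.0)  # cm^3, ideal-gas molar volume at 373 K
v_liq = 18.0                       # cm^3, molar volume of liquid water (assumed)
delta_s_volume = R_CAL * math.log(v_gas / v_liq)
print(f"dS(volume) = {delta_s_volume:.1f} cal/(mol*K)")  # ~14.8 of the total 26.0
```

With the more precise R = 1.987 cal/(mol·K) the result is about 14.8 rather than the 14.88 obtained in the text with R ≈ 2, but the conclusion is the same: slightly more than half the vaporization entropy is due to the volume change.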
Changes in entropy due to changes in temperature. So far, all of our calculations of entropy change have been carried out for thermal interactions at constant temperature. Let us now consider a more common and somewhat more complicated case, when reversible heating leads to a change in temperature. If heating occurs at constant volume, then, according to the definition of the specific heat at constant volume, we have δQ = C_V dT. Then

dS = C_V dT/T.
Integrating this expression over a finite temperature range, we obtain

ΔS = C_V ln(T2/T1).
It was assumed here that the heat capacity does not depend on temperature and can be taken outside the integral sign. It is significant that, by identifying the entropy change with the difference of a state function, ΔS = S(T2) − S(T1),
we remove the restriction on the reversibility of the heating process, as well as on the uniformity of the temperature during the heating process. We need to know the temperature of the system only at the beginning and at the end of the heating process. In other words, it is only essential that thermal equilibrium exist in the initial and final states: intermediate states do not play a role.
In the more common and practically much simpler case of heating at constant pressure, we have δQ = C_p dT. Repeating literally all the above reasoning, we get

ΔS = C_p ln(T2/T1).
Let us now apply these formulas to the transformation of 1 mol of ice at 273 K into steam at 373 K, which proceeds in three steps.

1. Melting of ice at 1 atm and 273 K:

ΔS1 = 1434/273 ≈ 5.27 cal/(mol·K).

2. Heating water at 1 atm from 273 K to 373 K:

ΔS2 = C_p ln(373/273) ≈ 18 × 0.312 ≈ 5.6 cal/(mol·K).
3. Water-steam transition at 1 atm and 373 K:

ΔS3 = 9702/373 ≈ 26.0 cal/(mol·K).
Thus, the resulting change in entropy during the transformation of 1 mol of ice at 273 K into steam at 373 K is

ΔS = ΔS1 + ΔS2 + ΔS3 ≈ 5.27 + 5.6 + 26.0 ≈ 36.9 cal/(mol·K).
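The three-step total can be sketched in a few lines (the heats of transition are the text's values; the molar heat capacity of water, 18 cal/(mol·K), i.e. 1 cal/(g·K) × 18 g/mol, is assumed constant over the range):

```python
import math

Q_MELT = 1434.0   # cal/mol, heat of fusion (text value)
Q_VAP = 9702.0    # cal/mol, heat of vaporization (text value)
CP_WATER = 18.0   # cal/(mol*K), assumed constant between 273 K and 373 K

ds_melt = Q_MELT / 273.0                       # phase transition at constant T
ds_heat = CP_WATER * math.log(373.0 / 273.0)   # Delta_S = Cp * ln(T2/T1)
ds_vap = Q_VAP / 373.0
total = ds_melt + ds_heat + ds_vap
print(f"{ds_melt:.2f} + {ds_heat:.2f} + {ds_vap:.2f} = {total:.1f} cal/(mol*K)")
```

The printed total is about 36.9 cal/(mol·K), dominated by the vaporization step.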
The concept of entropy is used in various sciences: physics, chemistry, mathematics, biology, sociology. The word itself comes from Greek and means "transformation, change." What is it in simple words? It is a measure of disorder, of randomness, in any system. The lower the order, the greater its value. Books standing on a shelf are less disordered than books lying in a pile.
The definition of this term depends on the field in which it is applied. In general terms, it is a measure of disorder and of the irreversible dissipation of energy. The more ordered a system, the more concentrated its energy. For example, if we place a hot object in cold water, it will gradually cool while the water heats up; the entropy of the final, equalized state is greater.
Important! Entropy characterizes disorder. The larger it is, the less the system is ordered.
Anything can act as a system. In physics or chemistry, this is usually a gas, liquid, solid, a set of a certain number of particles. In computer science it can be a text, in sociology a group of people.
The term entropy
In physics
This term is used in such branches of physics as thermodynamics and statistical physics. Thermodynamics studies the methods of transfer and transformation of energy; it deals with processes in which the concept of temperature can be used, and it was in thermodynamics that this concept was first introduced, by the German scientist Rudolf Clausius. Statistical mechanics studies the behavior of systems of a large number of particles, using for this the methods of probability theory.
In different branches of physics, this term means somewhat different things. In thermodynamics, this is a characteristic of the irreversible dissipation of energy. In statistical physics, this value indicates the probability of some state.
In thermodynamics
Entropy is the only quantity that shows the direction of physical processes. What does it mean?
- In an isolated system, that is, one that does not exchange either matter or energy with surrounding objects, processes always proceed in such a way that disorder increases. Having reached a maximum, it remains constant. This is the essence of the second law of thermodynamics.
- Reversible processes do not change disorder.
- Irreversible processes always proceed in such a way that disorder increases.
In an open system, this value can increase or remain constant, and such processes are possible in which disorder decreases. That is, by outside intervention, we can reduce disorder.
Any system that is in constant external conditions eventually comes to a state of equilibrium and cannot get out of it on its own. In this case, all its parts will have the same temperature. This is the zeroth law of thermodynamics.
At equilibrium, disorder is greatest. For example, consider a vessel divided by a partition, with one gas on one side and another gas on the other. If the partition is removed, the gases gradually mix and will not separate again on their own. Such a state is more disordered than the state in which the gases were separated.
In physics, this quantity is a function of the state of the system. This means that it depends on the system parameters:
- temperature;
- pressure;
- volume;
- internal energy.
In statistical mechanics
In statistical mechanics, this concept is associated with the probability of obtaining a certain state. For example, for several objects or particles, it depends on the number of ways to arrange them.
There are several definitions of this quantity. The simplest is Boltzmann's: the entropy equals the Boltzmann constant multiplied by the logarithm of the thermodynamic probability of the state (the number of its microstates): S = k·ln(W).
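Boltzmann's definition can be illustrated with a minimal sketch (the microstate counts here are arbitrary illustrative numbers):

```python
import math

# Boltzmann's formula: S = k * ln(W), with W the number of microstates.
K_B = 1.380649e-23  # J/K, Boltzmann constant

def boltzmann_entropy(w: int) -> float:
    """Entropy in J/K for a state realized by w microstates."""
    return K_B * math.log(w)

# A state with a single microstate has zero entropy; doubling the number
# of microstates adds exactly k*ln(2):
assert math.isclose(boltzmann_entropy(2) - boltzmann_entropy(1), K_B * math.log(2))
print(boltzmann_entropy(1), boltzmann_entropy(2))
```

Because the logarithm is taken, entropies of independent subsystems add, even though their microstate counts multiply.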
Absolute value
Entropy is a non-negative quantity. The closer the temperature is to absolute zero, the closer the entropy is to zero. This is the third law of thermodynamics; in this form it was formulated by Max Planck in 1911.
The third law of thermodynamics is also called the principle of the unattainability of absolute zero. This means that in any process associated with a change in entropy it is impossible to reach absolute zero (0 K, or −273.15°C); one can only approach this temperature ever more closely. By convention, at 0 K the entropy of a perfect crystal is 0.
Important! The absolute value of the entropy at a given temperature can be found by summing the heat received, divided by the temperature, over all steps from absolute zero up to that temperature.
In thermodynamics the absolute value usually does not matter; only the change is important. However, the absolute value can also be found; it is calculated by different formulas for the solid, liquid, and gaseous states of matter. It is measured in J/K, the same units as heat capacity. It is convenient to divide this value by the mass or by the number of moles of the substance, which gives units of J/(kg·K) or J/(mol·K).
In chemistry
What is entropy in chemistry? The concept is used in chemical thermodynamics, where it is the change in this quantity that matters: if the change is positive, the system becomes less ordered. Knowing this is important for determining the direction of chemical reactions and shifts of chemical equilibrium. The term is associated with the concept of enthalpy, the energy that can be converted into heat at a given constant pressure.
From the change in disorder one can determine whether a reaction can proceed spontaneously. This cannot be done from the energy change alone, since there are both reactions that absorb heat and reactions that release it. According to the second law of thermodynamics, the most disordered state is the most stable state of a closed system, and any closed system tends toward its least ordered state. Therefore, in spontaneous processes, disorder increases.
In information theory
Information entropy characterizes the unpredictability of a system. For example, it may describe the probability of occurrence of a character of the alphabet in a text; this function equals the amount of information carried by one character. Claude Shannon, the scientist who introduced the term into information theory, at first even wanted to call this quantity simply information.
Shannon suggested that by increasing the amount of information, we reduce uncertainty. By streamlining the system, we also reduce uncertainty.
Important! The more predictable an event, the less informative it is, and the lower its entropy.
With the help of this measure of uncertainty one can make predictions, for example about the outcome of an experiment: the events are divided into separate parts and the uncertainty of each part is considered.
Information entropy is related to the number of available states: the larger this number, the larger the entropy. For example, a chessboard in a position reached by playing chess according to the rules has a lower value than a board on which the pieces have been placed at random. The uncertainty for a coin, which can land on only one of two sides, is less than that of a die with 6 faces, and for a 20-sided die this value is greater still.
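The coin and dice comparison can be made precise with Shannon's formula, H = −Σ p·log₂(p), a short sketch of which follows:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5] * 2    # fair coin
d6 = [1 / 6] * 6    # six-sided die
d20 = [1 / 20] * 20 # twenty-sided die
print(shannon_entropy(coin), shannon_entropy(d6), shannon_entropy(d20))
# 1 bit for the coin, log2(6) ~ 2.58 bits for the die, log2(20) ~ 4.32 bits for d20:
# the more equally likely outcomes, the greater the uncertainty.
```

For equally likely outcomes the formula reduces to H = log₂(n), so the entropy grows with the number of available states, exactly as the text describes.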
There is also the entropy of a language. It is the amount of information per unit of text (one character) in that language, measured in bits per letter, and it differs from language to language.
In a language, some characters appear more often than others, and there are also certain frequently occurring combinations of characters. By analyzing the probability of occurrence of a particular character, it is possible to decode a ciphertext. Information entropy also helps to establish the necessary channel capacity for transmitting encrypted messages.
Information-entropy analysis is used for data analysis in various fields, from medicine to sociology. In simple words, by analyzing the increase or decrease of disorder, one can establish connections between phenomena.
The concept of "information entropy" is also used in mathematical statistics and statistical physics. These sciences also deal with the probability of various states and use the methods of probability theory.
In economics
In economics, the concept of the "entropy coefficient" is used. It is related to the concentration of sellers in a market: the greater the concentration, the lower this coefficient (index). It depends on the distribution of market shares among the firms: the greater the difference in the sizes of these shares, the lower the entropy coefficient.
Dividing this index by the logarithm of the number of firms in the market gives a relative indicator, denoted by the letter E, whose value lies between 0 and 1. The value E = 0 corresponds to monopoly, and E = 1 to perfect competition.
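A hedged sketch of one common convention for this relative index (the normalization by ln(n) is an assumption matching the 0-to-1 range described above):

```python
import math

def relative_entropy_index(shares):
    """Relative entropy index E of market shares: 0 = monopoly, 1 = equal shares."""
    n = len(shares)
    if n < 2:
        return 0.0  # a single firm is a monopoly by definition
    h = -sum(s * math.log(s) for s in shares if s > 0)  # entropy of the share distribution
    return h / math.log(n)                              # normalize by the maximum, ln(n)

print(relative_entropy_index([1.0]))       # monopoly -> 0
print(relative_entropy_index([0.25] * 4))  # four equal shares -> 1.0
```

Unequal shares give intermediate values: the more lopsided the distribution, the closer E is to 0.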
What does wikipedia say
On Wikipedia one can find various definitions of this concept. The most general is a measure of the irreversible dissipation of energy, of the deviation of a real process from an ideal one. Wikipedia also has articles about:
- entropy in classical thermodynamics;
- entropy in biological ecology;
- the entropy of the Universe;
- the entropy of a language;
- differential entropy;
- topological entropy;
- information entropy.
Conclusion
The term "entropy" was first used in thermodynamics by Rudolf Clausius, and from physics it spread to other sciences. The concept denotes disorder, randomness, unpredictability, and is closely related to probability. Entropy analysis helps to study data, find connections between phenomena, and determine the direction of physical and chemical processes.
Entropy
The change in the enthalpy of a system cannot serve as the sole criterion for the spontaneous occurrence of a chemical reaction, since many endothermic processes proceed spontaneously. An illustration is the dissolution of some salts (for example, NH4NO3) in water, accompanied by a noticeable cooling of the solution. One more factor must be taken into account: the tendency to pass spontaneously from a more ordered to a less ordered (more chaotic) state.
Entropy (S) is a thermodynamic state function that serves as a measure of the disorder (randomness) of a system. The possibility of endothermic processes is due to the change in entropy: in isolated systems the entropy of a spontaneously occurring process increases, ΔS > 0 (the second law of thermodynamics).
L. Boltzmann related entropy to the thermodynamic probability of a state (its disorder) W. Since the number of particles in a system is large (Avogadro's number N_A = 6.02·10²³), the entropy is proportional to the natural logarithm of the thermodynamic probability of the state of the system:

S = k ln W,

where k = R/N_A is the Boltzmann constant.
The dimension of the entropy of 1 mole of a substance coincides with the dimension of the gas constant R and is equal to J·mol⁻¹·K⁻¹. The change in entropy*) in irreversible and reversible processes is given by the relations ΔS > Q/T and ΔS = Q/T respectively. For example, the change in entropy on melting equals the heat (enthalpy) of melting divided by the melting temperature: ΔS_melt = ΔH_melt/T_melt. For a chemical reaction, the change in entropy is calculated analogously to the change in enthalpy:

ΔS°_reaction = Σ S°(products) − Σ S°(reactants).
*) term entropy was introduced by Clausius (1865) through the ratio Q/T (reduced heat).
Here ΔS° corresponds to the entropy of the standard state. The standard entropies of simple substances are not equal to zero. Unlike other thermodynamic functions, the entropy of a perfectly crystalline body at absolute zero is zero (Planck's postulate), since W = 1.
The entropy of a substance or system of bodies at a given temperature is an absolute value. Table 4.1 lists the standard entropies S° of some substances.
Table 4.1. Standard entropies of some substances.
From Table 4.1 it follows that the entropy depends on:
- The aggregate state of the substance: entropy increases in the transition from solid to liquid and especially to the gaseous state (water, ice, steam).
- The isotopic composition (H2O and D2O).
- The molecular weight of similar compounds (CH4, C2H6, n-C4H10).
- The structure of the molecule (n-C4H10, iso-C4H10).
- The crystal structure (allotropy): diamond, graphite.
Finally, fig. 4.3 illustrates the dependence of entropy on temperature.
Consequently, the tendency of a system toward disorder manifests itself more strongly the higher the temperature. The product of the temperature and the entropy change of the system, TΔS, quantifies this tendency and is called the entropy factor.
2. Standard entropy of substances. The change in entropy with a change in the state of aggregation of substances. Calculation of the standard entropy change in a chemical reaction.
The change in entropy during this (solid-liquid) phase transition can be found simply if the process is treated as an equilibrium one.
This is a perfectly acceptable approximation if we assume that the temperature difference between the system and the object supplying it with heat is not too large, much smaller than the melting temperature. Then we can use the thermodynamic definition of entropy: from the point of view of thermodynamics, entropy is a function of the state of the system whose change dS in an elementary equilibrium process equals the ratio of the portion of heat δQ that the system receives in this process to the temperature of the system T:

dS = δQ/T.
Since the temperature of the system does not change during this phase transition and is equal to the melting temperature, the factor 1/T can be taken outside the integral sign; the remaining integral is simply the total heat received, which is proportional to the mass m of the substance. Then

ΔS = λm/T_melt,

where λ is the specific heat of fusion.
It follows from this formula that entropy increases on melting and decreases on crystallization. The physical meaning of this result is quite clear: the region of phase space accessible to a molecule in a solid is much smaller than in a liquid, since in a solid each molecule has access only to a small region of space between neighboring sites of the crystal lattice, while in a liquid the molecules can roam over the entire volume. At the same temperature, therefore, the entropy of a solid is less than the entropy of a liquid: a solid is a more ordered, less chaotic system than a liquid.
The change in entropy in this (liquid-gas) transition can likewise be found simply by treating the process as an equilibrium one. Again, this is a perfectly acceptable approximation, provided that the temperature difference between the system and the "supplier" of heat is small, much lower than the boiling temperature. Then

ΔS = Lm/T_boil,

where L is the specific heat of vaporization.
It follows from the formula that entropy increases during evaporation, and decreases during condensation.
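Both per-mass formulas can be sketched together (the specific heats of fusion and vaporization of water below are commonly quoted approximate values, not taken from the text):

```python
# Delta_S = (heat per kg) * m / T for an equilibrium phase transition.
LAMBDA_FUS = 3.34e5  # J/kg, specific heat of fusion of ice (approximate)
L_VAP = 2.26e6       # J/kg, specific heat of vaporization of water (approximate)

def phase_entropy_change(heat_per_kg: float, mass_kg: float, t_kelvin: float) -> float:
    """Entropy change in J/K for melting or vaporizing mass_kg of substance at t_kelvin."""
    return heat_per_kg * mass_kg / t_kelvin

ds_melt = phase_entropy_change(LAMBDA_FUS, 1.0, 273.0)  # melting 1 kg of ice
ds_vap = phase_entropy_change(L_VAP, 1.0, 373.0)        # vaporizing 1 kg of water
print(ds_melt, ds_vap)  # vaporization changes the entropy several times more than melting
```

As in the molar calculation earlier in the document, the vaporization term dominates, reflecting the much larger gain in accessible volume.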
The physical meaning of this result is the difference between the accessible phase-space regions of a molecule in a liquid and in a gas. Although in both liquid and gas the entire region of space occupied by the system is available to each molecule, this region itself is much smaller for a liquid than for a gas. In a liquid, the attractive forces between molecules keep them at a certain distance from one another. Each molecule can therefore freely migrate through the volume of the liquid, but it cannot "break away from the collective" of other molecules: as soon as it breaks away from one molecule, another immediately attracts it. Hence the volume of a liquid depends on its quantity and is in no way related to the volume of the vessel.
Gas molecules behave differently. They have much more freedom, the average distance between them is such that the forces of attraction are very small, and the molecules "notice each other" only during collisions. As a result, the gas always occupies the entire volume of the vessel.
Therefore, at equal temperatures, the phase region of gas molecules is much larger than the phase region of liquid molecules, and the entropy of the gas is greater than the entropy of the liquid. A gas, compared to a liquid, is a much less ordered, more chaotic system.
The change in the standard molar entropy in a chemical reaction is given by the equation:

ΔS°_reaction = Σ S°(products) − Σ S°(reactants).
It should be noted that the change in entropy in the example under consideration turns out to be negative. This could be expected, considering that, according to the equation of the reaction, the total amount of gaseous reactants is 1.5 mol while the total amount of gaseous products is only 1 mol: the reaction reduces the total amount of gas. At the same time, combustion reactions are exothermic, so their occurrence results in a dissipation of energy, which makes us expect an increase in entropy, not a decrease. Further, the combustion of hydrogen gas at 25°C, once initiated, proceeds spontaneously and with great intensity. Should not the entropy change in this reaction then be positive, as the second law of thermodynamics requires? It turns out: no, or at least not necessarily. The second law requires that as a result of a spontaneous process the total entropy of the system and its surroundings increase. The entropy change calculated above characterizes only the chemical system under consideration, consisting of the reagents and products taking part in the combustion of hydrogen gas at 25°C.
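A hedged numerical sketch of such a calculation for hydrogen combustion, H2(g) + ½O2(g) → H2O(l). The standard entropies below are approximate literature values in J/(mol·K), not figures from the text:

```python
# Delta_S° = sum S°(products) - sum S°(reactants), with approximate tabulated values:
S0 = {
    "H2(g)": 130.7,   # J/(mol*K)
    "O2(g)": 205.2,   # J/(mol*K)
    "H2O(l)": 70.0,   # J/(mol*K)
}

delta_s = S0["H2O(l)"] - (S0["H2(g)"] + 0.5 * S0["O2(g)"])
print(f"dS° = {delta_s:.1f} J/(mol*K)")  # negative: 1.5 mol of gas becomes 1 mol of liquid
```

The result, roughly −163 J/(mol·K), is negative for exactly the reason given above: the amount of gas decreases, even though the reaction is spontaneous once the surroundings are included.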
9.9. Entropy. The physical meaning of entropy. Entropy and probability.
Considering the efficiency of a heat engine operating on the Carnot cycle, one can note that the ratio of the temperature of the refrigerator to the temperature of the heater equals the ratio of the amount of heat given by the working fluid to the refrigerator to the amount of heat received from the heater. This means that for an ideal heat engine operating on the Carnot cycle the following relation also holds:

Q1/T1 = Q2/T2,

where Q1 is the heat received from the heater at temperature T1 and Q2 is the heat given to the refrigerator at temperature T2.
Lorentz called the ratio Q/T the reduced heat. For an elementary process the reduced heat equals δQ/T. This means that during the Carnot cycle (a reversible cyclic process) the total reduced heat remains unchanged: it behaves as a function of state, while, as is known, the amount of heat itself is a function of the process.
Using the first law of thermodynamics for reversible processes,

δQ = dU + p dV = (m/μ)C_V dT + p dV,

and dividing both sides of this equation by the temperature, we get:

δQ/T = (m/μ)C_V dT/T + (p/T) dV.   (9-41)

From the Mendeleev-Clapeyron equation pV = (m/μ)RT we express p/T = (m/μ)R/V, substitute into equation (9-41), and get:

δQ/T = (m/μ)C_V dT/T + (m/μ)R dV/V.   (9-42)

Noting that dT/T = d(ln T) and dV/V = d(ln V), we substitute these into equation (9-42) and get:

δQ/T = (m/μ)[C_V d(ln T) + R d(ln V)].   (9-43)
The right side of this equality is a total differential; therefore, in reversible processes the reduced heat δQ/T is also a total differential, which is the sign of a function of state.
The function of state whose differential is δQ/T is called the entropy and is denoted S. Thus, entropy is a function of state. After the introduction of entropy, formula (9-43) takes the form:
dS = δQ/T,   (9-44)
where dS is the increase in entropy. Equality (9-44) is valid only for reversible processes and is convenient for calculating the entropy change in finite processes:
ΔS = S2 − S1 = ∫ from 1 to 2 of δQ/T.   (9-45)
If the system performs a circular process (a cycle) in a reversible way, then ∮δQ/T = 0, hence ΔS = 0 and S = const.
Expressing the amount of heat in terms of the increment of entropy for an elementary process, δQ = T dS, and substituting it into the equation of the first law of thermodynamics, we obtain a new form of that equation, commonly called the basic thermodynamic identity:
T dS = dU + p dV.   (9-46)
Thus, to calculate the change in entropy in reversible processes, it is convenient to use the reduced heat.
In the case of irreversible nonequilibrium processes dS > δQ/T, and for irreversible circular processes the Clausius inequality holds:

∮ δQ/T < 0.   (9-47)
Consider what happens to entropy in an isolated thermodynamic system.
In an isolated thermodynamic system, with any reversible change in state, its entropy will not change. Mathematically, this can be written as follows: S = const.
Let us consider what happens to the entropy of a thermodynamic system in an irreversible process. Suppose that the transition from state 1 to state 2 along the path L 1 is reversible, and from state 2 to state 1 along the path L 2 is irreversible (Fig. 9.13).
Then the Clausius inequality (9-47) is valid. Writing out the cycle integral for our example:

∮ δQ/T = ∫ over L1 (1→2) of δQ/T + ∫ over L2 (2→1) of δQ/T < 0.

The first term in this formula can be replaced by the entropy change S2 − S1, since that part of the cycle is reversible. Then the Clausius inequality can be written as:

(S2 − S1) + ∫ over L2 (2→1) of δQ/T < 0.

From here S1 − S2 > ∫ over L2 (2→1) of δQ/T. Since S1 − S2 is precisely the entropy change in the irreversible transition 2 → 1, we can finally write, for any irreversible process:

ΔS > ∫ δQ/T.   (9-48)
If the system is isolated, then δQ = 0, and inequality (9-48) takes the form:

ΔS > 0,   (9-49)

that is, the entropy of an isolated system increases during an irreversible process. The growth of entropy does not continue indefinitely, but only up to a certain maximum value characteristic of the given state of the system. This maximum value of the entropy corresponds to the state of thermodynamic equilibrium. The growth of entropy during irreversible processes in an isolated system means that the energy possessed by the system becomes less available for conversion into mechanical work. In the state of equilibrium, when the entropy reaches its maximum value, the energy of the system cannot be converted into mechanical work at all.
If the system is not isolated, then the entropy can both decrease and increase depending on the direction of heat transfer.
Entropy, as a function of the state of the system, can serve as a state parameter in the same way as temperature, pressure, or volume. Depicting a process on a (T, S) diagram, one can give a geometric interpretation of the amount of heat as the area of the figure under the curve depicting the process. Figure 9.14 shows a diagram for an isothermal process in entropy-temperature coordinates.
Entropy can be expressed in terms of the gas state parameters: temperature, pressure, volume. To do this, from the basic thermodynamic identity (9-46) we express the increment of entropy:

dS = dU/T + (p/T) dV = (m/μ)C_V dT/T + (m/μ)R dV/V.

Integrating this expression, we get:

ΔS = (m/μ)[C_V ln(T2/T1) + R ln(V2/V1)].   (9-50)
The change in entropy can also be expressed in terms of another pair of state parameters: pressure and volume. To do this, the temperatures of the initial and final states are expressed from the equation of state of an ideal gas through pressure and volume, T = pVμ/(mR), and substituted into (9-50):

ΔS = (m/μ)[C_V ln(p2/p1) + C_p ln(V2/V1)].   (9-51)
With the isothermal expansion of a gas into a vacuum, T1 = T2, so the first term in formula (9-50) vanishes and the change in entropy is determined only by the second term:

ΔS = (m/μ) R ln(V2/V1).   (9-52)
Despite the fact that in many cases it is convenient to use reduced heat to calculate the change in entropy, it is clear that reduced heat and entropy are different, not identical concepts.
Let us now find out the physical meaning of entropy.
To do this, we use formula (9-52) for an isothermal process, in which the internal energy does not change and all changes of characteristics are due only to the change in volume. Consider the relationship between the volume occupied by a gas in an equilibrium state and the number of spatial microstates of the gas particles. The number of microstates of the gas particles by which a given macrostate of the gas as a thermodynamic system is realized can be counted as follows. Divide the entire volume into elementary cubic cells with side d ≈ 10⁻¹⁰ m (of the order of the effective diameter of a molecule). The volume of such a cell is d³. In the first state the gas occupies volume V1; hence the number of elementary cells, that is, the number of places N1 that molecules can occupy in this state, is N1 = V1/d³.
Similarly, for the second state with volume V2 we get N2 = V2/d³.
Note that a change in the positions of the molecules corresponds to a new microstate, but not every change of microstate leads to a change of macrostate. Suppose the molecules can occupy N1 places; then an exchange of places of any molecules within these N1 cells does not lead to a new macrostate. However, a transition of molecules into other cells does change the macrostate of the system. The number of microstates of a gas corresponding to a given macrostate can be calculated by determining the number of ways in which the particles of this gas can be placed in the unit cells. To simplify the calculations, consider 1 mole of an ideal gas, for which formula (9-52) takes the form:
ΔS = R ln(V2/V1).   (9-53)
The number of microstates of the system occupying the volume V1 will be denoted Γ1; it is determined by counting the number of placements of the N_A molecules (Avogadro's number, the number of molecules in 1 mole of gas) over the N1 cells (places):

Γ1 = N1^N_A.

Similarly, the number of microstates Γ2 of the system occupying the volume V2 is

Γ2 = N2^N_A.
The number of microstates Γi by which the i-th macrostate can be realized is called the thermodynamic probability of this macrostate. The thermodynamic probability Γ ≥ 1.
Let us find the ratio Γ2/Γ1:

Γ2/Γ1 = (N2/N1)^N_A.

For ideal gases the number of free places is much larger than the number of molecules, that is, N1 >> N_A and N2 >> N_A. Then, expressing the numbers N1 and N2 through the corresponding volumes, we get Γ2/Γ1 = (V2/V1)^N_A. From here we can express the ratio of volumes through the ratio of the thermodynamic probabilities of the corresponding states:

V2/V1 = (Γ2/Γ1)^(1/N_A).   (9-54)
Substituting (9-54) into (9-53), we get:

ΔS = R ln[(Γ2/Γ1)^(1/N_A)].

Given that the ratio of the molar gas constant to Avogadro's number is the Boltzmann constant, k = R/N_A, and that the logarithm of a ratio of two quantities equals the difference of their logarithms, we get:

ΔS = k ln Γ2 − k ln Γ1.

From this we can conclude that the entropy of the i-th state Si is determined by the logarithm of the number of microstates by which this macrostate is realized:

Si = k ln Γi.   (9-55)
Formula (9-55) is called the Boltzmann formula, after the physicist who first obtained it and understood the statistical meaning of entropy as a function of disorder. The Boltzmann formula has a more general meaning than formula (9-53): it is applicable not only to ideal gases, and it reveals the physical meaning of entropy. The more ordered the system, the smaller the number of microstates by which the given macrostate is realized, and the smaller the entropy of the system. The growth of entropy in an isolated system, where irreversible processes occur, means the motion of the system toward the most probable state, which is the state of equilibrium. It can be said that entropy is a measure of the disorder of a system: the more disorder in it, the higher the entropy. This is the physical meaning of entropy.
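The chain of substitutions above can be verified numerically. The sketch below checks that the statistical expression k·ln(Γ2/Γ1), with ln(Γ2/Γ1) = N_A·ln(V2/V1) as derived, reproduces the thermodynamic result ΔS = R·ln(V2/V1) for 1 mole of gas:

```python
import math

R = 8.314        # J/(mol*K), molar gas constant
N_A = 6.022e23   # 1/mol, Avogadro's number
k = R / N_A      # J/K, Boltzmann constant, k = R/N_A

v_ratio = 2.0  # an isothermal volume doubling, V2/V1 = 2

# Statistical route: ln(G2/G1) = N_A * ln(V2/V1), so Delta_S = k * N_A * ln(V2/V1).
delta_s_statistical = k * (N_A * math.log(v_ratio))
# Thermodynamic route, formula (9-53): Delta_S = R * ln(V2/V1).
delta_s_thermo = R * math.log(v_ratio)

assert math.isclose(delta_s_statistical, delta_s_thermo)
print(delta_s_statistical)  # ~5.76 J/K for 1 mol doubling its volume
```

The agreement is exact because k·N_A = R by definition; the counting argument and formula (9-53) describe the same entropy change.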