What is Entropy?

In physics, entropy means the amount of disorder or the degree of randomness associated with a thermodynamic system. A system is not self-organized when its entropy is high. For instance, when a system is at an elevated temperature, its molecules are in random motion, vibrating and colliding with each other. In this scenario, the probability of finding a molecule at the same place twice is low. Under these circumstances, the system can be said to have high entropy.

Entropy is a point function, and its differential $dS$ is an exact differential. It is a thermodynamic property and denotes the amount of unavailable energy of a system. In engineering thermodynamics, entropy is a measure of the thermal energy that is unavailable for mechanical work. Entropy is represented by $S$ and its SI unit is J/K (joules per kelvin).

Entropy of a reversible process

A reversible process is associated with a closed system such that, after the process terminates, the system can be brought back to its original conditions.

Consider the image below. A system is undergoing a cyclic process. The system is initially at state 1 and by undergoing a certain process it moves to state 2, following path 1-a-2. The system later returns to its original state, state 1, following path 2-b-1.

Figure: P-V diagram showing a cyclic reversible process

For a thermodynamic system undergoing a cyclic process, the expression for Clausius inequality is given as,

$\oint \frac{\delta Q}{T} \le 0$, where $\delta Q$ is the heat interaction of the system with the surroundings and $T$ is the temperature of the surroundings.

The Clausius inequality for a system undergoing a cyclic reversible process is given as,

$\oint_{\text{rev}} \frac{\delta Q}{T} = 0$, where the subscript "rev" denotes that the cyclic integral is evaluated over a cyclic reversible path.

And, the Clausius inequality for a system undergoing a cyclic irreversible process is given as,

$\oint_{\text{irrev}} \frac{\delta Q}{T} < 0$, where the subscript "irrev" denotes that the cyclic integral is evaluated over a cyclic irreversible path.
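As a quick numerical illustration of these two statements (a sketch added here, with all heat and temperature values assumed rather than taken from the text), the cyclic sum of $\frac{\delta Q}{T}$ can be evaluated for a simple two-reservoir cycle:

```python
# Illustrative sketch (assumed values): evaluate the cyclic sum of dQ/T for a
# two-reservoir cycle. Each entry is (Q in J, T in K); Q > 0 for heat added
# to the system, Q < 0 for heat rejected.

def clausius_sum(heat_interactions):
    return sum(Q / T for Q, T in heat_interactions)

# Reversible (Carnot-like) cycle: Q_in/T_hot exactly balances Q_out/T_cold.
reversible = [(1000.0, 500.0), (-600.0, 300.0)]

# Irreversible cycle: more heat is rejected for the same heat input.
irreversible = [(1000.0, 500.0), (-700.0, 300.0)]

print(clausius_sum(reversible))    # 0.0    -> reversible cycle
print(clausius_sum(irreversible))  # -0.33  -> irreversible cycle (< 0)
```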

Now, apply the Clausius inequality for the reversible cyclic process, 1-a-2-b-1,

$\int_{1\text{-}a\text{-}2} \frac{\delta Q}{T} + \int_{2\text{-}b\text{-}1} \frac{\delta Q}{T} = 0$          …(1)

Assume that, instead of taking path 2-b-1, the system returns to its original state, state 1, following another reversible path, 2-c-1.

Hence, apply the Clausius inequality for the reversible cyclic process, 1-a-2-c-1,

$\int_{1\text{-}a\text{-}2} \frac{\delta Q}{T} + \int_{2\text{-}c\text{-}1} \frac{\delta Q}{T} = 0$          …(2)

Subtract equation (2) from equation (1),

$\int_{2\text{-}b\text{-}1} \frac{\delta Q}{T} = \int_{2\text{-}c\text{-}1} \frac{\delta Q}{T}$

The above expression shows that no matter which path the system follows, only the initial and final states matter. Hence, the quantity $\frac{\delta Q}{T}$ must represent the change of a property. This change is known as the entropy change, and the property is known as entropy. Therefore, for a reversible process, the final expression can be written as:

$\left(\frac{\delta Q}{T}\right)_{\text{rev}} = dS$, where $dS$ is the entropy change of the system.
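For a concrete illustration (a sketch with assumed numbers, not taken from the text), consider a reversible isothermal process, where the constant temperature allows $dS = \frac{\delta Q}{T}$ to integrate directly to $\Delta S = Q/T$:

```python
# Illustrative sketch (assumed values): entropy change of a reversible
# isothermal process, where dS = dQ/T integrates to delta_S = Q / T.

Q = 500.0   # heat added reversibly, J (assumed)
T = 300.0   # constant absolute temperature, K (assumed)

delta_S = Q / T
print(f"Entropy change = {delta_S:.3f} J/K")   # ~1.667 J/K
```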

Entropy of an irreversible process

A process is said to be irreversible when the system's initial conditions cannot be restored after the process has been executed, or when the system is left in a non-equilibrium condition after the process terminates.

Consider the same image above, but this time the system follows an irreversible path 2-c-1 (denoted by the dotted line) while returning to its initial state, state 1. Refer to the image below.

Figure: P-V diagram showing a cyclic process consisting of an irreversible path

It is known that for a closed system undergoing a cyclic irreversible process, the expression for Clausius inequality is given by,

$\oint_{\text{irrev}} \frac{\delta Q}{T} < 0$

where $\oint_{\text{irrev}} \frac{\delta Q}{T}$ represents the cyclic integral of $\frac{\delta Q}{T}$ over a closed irreversible cyclic path.

Apply the Clausius inequality condition for the reversible cyclic process, 1-a-2-b-1,

$\int_{1\text{-}a\text{-}2} \frac{\delta Q}{T} + \int_{2\text{-}b\text{-}1} \frac{\delta Q}{T} = 0$          …(3)

Apply Clausius inequality for the irreversible cyclic process, 1-a-2-c-1,

$\int_{1\text{-}a\text{-}2} \frac{\delta Q}{T} + \int_{2\text{-}c\text{-}1} \frac{\delta Q}{T} < 0$          …(4)

From equation (3),      

$\int_{1\text{-}a\text{-}2} \frac{\delta Q}{T} = -\int_{2\text{-}b\text{-}1} \frac{\delta Q}{T}$          …(5)

Substitute the value of equation (5) in equation (4),

$-\int_{2\text{-}b\text{-}1} \frac{\delta Q}{T} + \int_{2\text{-}c\text{-}1} \frac{\delta Q}{T} < 0$

$\int_{2\text{-}c\text{-}1} \frac{\delta Q}{T} < \int_{2\text{-}b\text{-}1} \frac{\delta Q}{T}$

$\left(\frac{\delta Q}{T}\right)_{\text{irrev}} < \left(\frac{\delta Q}{T}\right)_{\text{rev}}$

$\left(\frac{\delta Q}{T}\right)_{\text{irrev}} < dS$          …(6)

Hence, for an irreversible process, the quantity $\frac{\delta Q}{T}$ is always less than the entropy change $dS$.

Therefore, equation (6) can be rewritten as,

$dS = \left(\frac{\delta Q}{T}\right)_{\text{irrev}} + \delta S_{\text{gen}}$

where $\delta S_{\text{gen}}$ is always a positive quantity known as the entropy production or entropy generation.
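A classic example of entropy generation (sketched below with assumed values, not from the original text) is heat transfer across a finite temperature difference: the cold reservoir gains more entropy than the hot reservoir loses, so the net entropy production is positive.

```python
# Illustrative sketch (assumed values): entropy generated when heat Q leaks
# from a hot reservoir at T_h to a cold reservoir at T_c.

Q   = 1000.0   # heat transferred, J (assumed)
T_h = 500.0    # hot reservoir temperature, K (assumed)
T_c = 300.0    # cold reservoir temperature, K (assumed)

dS_hot  = -Q / T_h          # entropy lost by the hot reservoir
dS_cold =  Q / T_c          # entropy gained by the cold reservoir
S_gen   = dS_hot + dS_cold  # net entropy production

print(f"S_gen = {S_gen:.3f} J/K")   # 1.333 J/K > 0, as required
```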

Boltzmann entropy

Coined by Ludwig Boltzmann, this form of entropy finds its application in statistical mechanics. In classical statistical mechanics the number of possible microstates is, in principle, infinite, since matter is treated as continuous. Under such assumptions, the microstates of a system are characterized by the positions and momenta of all the atoms present in it. Boltzmann gave a microscopic definition of entropy, unlike Clausius, whose definition of entropy was given from a macroscopic point of view. According to Boltzmann, entropy is proportional to the logarithm of the number of microscopic states that share the physical quantities of the macroscopic state. The probabilities of the microscopic states were later introduced into the definition of entropy by Gibbs.

The Gibbs entropy is a function of the probability distribution over phase space. The definition of entropy by Gibbs remains meaningful even if the system is isolated and far away from equilibrium. The concept is extended to quantum systems by the von Neumann entropy. If an ensemble of classical systems evolves according to the Liouville equation (or a quantum ensemble according to the Liouville-von Neumann equation), the entropy is a constant of the motion.

The Boltzmann entropy is generally represented by the following equation, 

$S = k_B \ln W$

where $k_B$ is the Boltzmann constant and $W$ is the thermodynamic probability, that is, the number of microscopic states compatible with the macroscopic state of the system.
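As a small worked example (a sketch using assumed numbers), the Boltzmann formula can be evaluated for a toy system of $N$ distinguishable two-state particles, where $W$ is the number of microstates having exactly $n$ particles in the excited state:

```python
import math

# Illustrative sketch (assumed toy system): Boltzmann entropy S = k_B * ln(W)
# for N two-state particles with exactly n particles excited.

k_B = 1.380649e-23   # Boltzmann constant, J/K

N = 100              # number of particles (assumed)
n = 50               # particles in the excited state (assumed)

W = math.comb(N, n)     # number of microstates compatible with this macrostate
S = k_B * math.log(W)   # Boltzmann entropy, J/K

print(f"W = {W:.3e}, S = {S:.3e} J/K")
```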

Determination of the probability of the microstates

The probability of a given set of occupation numbers can be determined by a simple expression.

Consider an isolated system that contains $N$ molecules distributed over $m$ energy states $E_i$, $i = 1, 2, \ldots, m$. Let $\psi_i$ be the occupation number of the $i$-th energy state, where $i = 1, 2, 3, \ldots, m$. Define $\zeta_m = \{\psi_1, \psi_2, \ldots, \psi_m\}$ as the set of occupation numbers.

The probability distribution of the set of occupation numbers $\zeta_m$ is given as,

$P(\zeta_m) = W(\zeta_m) \prod_{i=1}^{m} \left(P_i^0\right)^{\psi_i}$

where $P_i^0$ is the prior probability of finding a molecule in the $i$-th energy state.
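A small sketch of this expression (with assumed numbers) is given below; the multiplicity $W(\zeta_m)$ is taken as the multinomial coefficient $N!/(\psi_1! \cdots \psi_m!)$, which counts the arrangements of the $N$ molecules over the $m$ energy states.

```python
import math

# Illustrative sketch (assumed values): probability of a set of occupation
# numbers, P(zeta) = W(zeta) * prod_i (P_i^0)**psi_i, with the multiplicity
# W(zeta) = N! / (psi_1! * ... * psi_m!).

def occupation_probability(psi, prior):
    """psi: occupation numbers psi_i; prior: prior probabilities P_i^0."""
    N = sum(psi)
    W = math.factorial(N)
    for n in psi:
        W //= math.factorial(n)
    P = W * math.prod(p ** n for p, n in zip(prior, psi))
    return W, P

# 4 molecules over 3 equally likely energy states (assumed).
W, P = occupation_probability([2, 1, 1], [1/3, 1/3, 1/3])
print(W, P)   # W = 12, P = 12 * (1/3)**4 ≈ 0.148
```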

Shannon entropy

Coined by Claude Shannon in 1948, Shannon entropy relates to the amount of uncertainty or the surprise in a random variable in information theory. Information theory is a branch of mathematics that deals with the transmission of data along a noisy channel.

This theory originated in the area of communication theory. According to communication theory, a data communication system consists of three parts, namely, a source of data, a communication channel, and a receiver. Shannon's aim was to determine what data was generated by the source, based on the signal received from the channel.

The concept of entropy in information theory is approximately analogous to entropy in statistical thermodynamics, where the values of the random variable designate the energies of the microstates. Entropy provides a means of measuring the average amount of information needed to represent an event drawn from the probability distribution of a random variable.

In mechanical engineering, machinery with rolling-element components, such as rolling-element bearings, generates highly random vibration signals. When a fault develops, the signal becomes more regular and concentrated within a certain interval. Shannon entropy is used in this case to quantify the uncertainty of these weak, random signals.

In medical research, entropy-based approaches have recently been adopted. Researchers have used the Shannon entropy formula to combine signaling networks with transcription data, and found that the entropy level of cancer samples was higher than that of normal samples. In the case of growing tumors, it has also been found that more advanced-stage tumors had higher entropies than lower-stage ones.

This concept has also been used during the COVID-19 pandemic: a higher entropy value signifies a more rapid spread of the virus, while a lower value indicates a slower spread.

Shannon denoted the entropy by H, after Boltzmann's H-theorem. The expression for H is given by,

$H(X) = -\sum_{i=0}^{N-1} P(x_i) \log_b P(x_i)$, where $X$ is a discrete random variable with $N$ possible outcomes and $P(x_i)$ is the probability of the single outcome $x_i$.

The entropy in information theory will have different units depending on the base of the logarithm. For example, if one is dealing with computers, the value of the base of the logarithm is b=2, and the unit would be bits.
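A minimal sketch (with assumed probability distributions) of the formula with $b = 2$ is shown below; a fair coin carries exactly one bit of entropy, while a heavily biased coin carries less.

```python
import math

# Illustrative sketch: Shannon entropy H(X) = -sum_i P(x_i) * log_b P(x_i),
# with b = 2 so that H is measured in bits. Zero-probability outcomes are
# skipped, following the convention 0 * log(0) = 0.

def shannon_entropy(probabilities, base=2):
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))                 # 1.0 bit   (fair coin)
print(shannon_entropy([0.9, 0.1]))                 # ~0.47 bit (biased coin)
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits  (fair 4-sided die)
```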

 

Context and Applications

The topic finds its application in the following areas:

  • Masters in Science (Mathematics)
  • Bachelors in Science (Mathematics)
  • Bachelors in Technology (Mechanical)
  • Bachelors in Commerce (Economics)
  • Masters in Commerce (Economics)

Practice Problems

1. Which of the following characteristics of a system represents entropy?

  1. A random event
  2. A disordered behavior
  3. Unorganized behavior
  4. All of the above

Correct option- d

Explanation: Entropy means the amount of randomness or disorder present within a system. When entropy is high, the self-organization of the system or its molecules is low.

2. Shannon entropy relates to the uncertainty of which of the following quantities?

  1. Random variable
  2. Boltzmann H-theorem
  3. Clausius inequality
  4. Microstate

Correct option- a

Explanation: Shannon entropy relates to the uncertainty or surprise of a random variable in information theory.

3. Which of the following gives the equation for Clausius inequality for an irreversible process?

  1. $\oint \frac{\delta Q}{T} < 0$
  2. $\oint \frac{\delta Q}{T} = 0$
  3. $\oint \frac{\delta Q}{T} > 0$
  4. $\oint \frac{\delta Q}{T} \le 0$

Correct option- a

Explanation: The expression for the Clausius inequality for an irreversible process is given by $\oint \frac{\delta Q}{T} < 0$.

4. What will happen to the entropy of a reversible system if the heat supplied to the system increases?

  1. Decrease
  2. Remains unchanged
  3. Increase
  4. May either increase or decrease

Correct option - c

Explanation: For a reversible process, the expression for the entropy change is $\left(\frac{\delta Q}{T}\right)_{\text{rev}} = dS$; if $\delta Q$ increases, the quantity $dS$ will also increase.

5. Which of the following is the nature of entropy production or generation in the expression of change in entropy for an irreversible process?

  1. Always negative
  2. Always positive
  3. Either negative or positive depending on system conditions
  4. Cannot be predicted as entropy is random behavior

Correct option - b

Explanation: For an irreversible process, the expression $dS = \left(\frac{\delta Q}{T}\right)_{\text{irrev}} + \delta S_{\text{gen}}$ holds true, where $\delta S_{\text{gen}}$ must be a positive quantity.
