The first law of thermodynamics tells us about energy conservation, but energy conservation alone does not tell us which processes are possible. For that, we need the Second Law. The Second Law introduces an important idea: irreversibility.
An irreversible heat engine is always less efficient than a reversible one operating between the same temperature limits. Similarly, an irreversible refrigerator or heat pump always has a lower COP than a reversible one operating between the same temperature limits.
To capture this mathematically, Rudolf Clausius introduced an inequality, called the Clausius inequality, which forms the foundation for defining entropy.
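In standard notation, the Clausius inequality states that for any cycle executed by a system, with δQ the heat transferred at a portion of the boundary at absolute temperature T:

```latex
\oint \frac{\delta Q}{T} \leq 0
```

The equality holds for internally reversible cycles, and the strict inequality for irreversible ones.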
Figure 1. The system considered for the Clausius inequality.
Thus, the cyclic integral of volume is zero. Conversely, any quantity whose cyclic integral is zero is a property, meaning it depends only on the state and not on the process path.
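For example, volume is a property, so its cyclic integral vanishes; and since the cyclic integral of δQ/T is zero for an internally reversible cycle, δQ/T along such a path must likewise be the differential of a property. Clausius named this property entropy, S:

```latex
\oint dV = 0,
\qquad
\oint \left(\frac{\delta Q}{T}\right)_{\text{int rev}} = 0
\quad\Longrightarrow\quad
dS = \left(\frac{\delta Q}{T}\right)_{\text{int rev}}
```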
Figure 2. The net change in volume during a cycle is always zero.
Since entropy is a property, it has a definite value at each state, just like all other thermodynamic properties. Therefore, the entropy change ΔS between two specified states is the same regardless of the path—whether reversible or irreversible—taken during the process (see Figure 3).
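Integrating the defining relation dS = (δQ/T) along an internally reversible path between the two states gives the entropy change:

```latex
\Delta S = S_2 - S_1 = \int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{\text{int rev}}
```

Because S is a property, this integral may be evaluated along any convenient internally reversible path connecting states 1 and 2.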
In practice, we are usually concerned with the change in entropy rather than its absolute value. The entropy of a substance can be assigned a value of zero at an arbitrarily chosen reference state, and the entropy at other states can then be determined using the preceding relation.
Figure 3. The entropy change between two specified states.
It is well established that not all quantities of heat possess the same potential for conversion into work. The concept of entropy provides a measure of this potential. Entropy is a fundamental thermodynamic property that characterizes the extent to which a given amount of heat can be transformed into useful work.
A useful way to understand entropy is through an analogy with money. If energy is thought of as money, then the ability to perform useful work corresponds to the amount of money that can actually be spent. However, just as financial transactions often involve fees or taxes, the conversion of energy into useful work is limited by entropy.
Useful work is like the portion of money that can be spent directly.
Entropy represents the “transaction fee” or “tax” that reduces the amount of spendable money.
The greater the entropy, the smaller the fraction of energy available for useful work.
Thus:
Low-entropy energy (such as high-temperature heat, electricity, or mechanical work) is comparable to receiving pure cash—fully usable and versatile.
High-entropy energy (such as low-temperature heat or waste heat) is more like receiving a gift card with strict conditions—energy is present, but its usability is restricted.
When heat is supplied at a higher temperature, the resulting increase in entropy is relatively small. In contrast, when the same amount of heat is added at a lower temperature, the increase in entropy is comparatively larger. Therefore:
A higher-entropy state signifies reduced availability of energy for work conversion.
A lower-entropy state indicates greater availability of energy for work conversion.
This demonstrates why high-temperature energy sources are more valuable for producing work than low-temperature energy sources.
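A small numerical sketch makes this concrete. Using the isothermal relation ΔS = Q/T, the same quantity of heat is supplied at two different temperatures (the function name and values below are illustrative, not from the text):

```python
# Entropy change dS = Q/T for heat Q transferred at (approximately) constant
# absolute temperature T. Values are illustrative.

def entropy_change(q_joules: float, t_kelvin: float) -> float:
    """Entropy change in J/K for heat q added at constant temperature t."""
    return q_joules / t_kelvin

Q = 1000.0  # the same amount of heat, in joules

dS_high = entropy_change(Q, 1000.0)  # heat added at 1000 K
dS_low = entropy_change(Q, 300.0)    # same heat added at 300 K

print(f"dS at 1000 K: {dS_high:.2f} J/K")  # prints 1.00 J/K
print(f"dS at 300 K:  {dS_low:.2f} J/K")   # prints 3.33 J/K
# The same energy carries more entropy at the lower temperature,
# so less of it is available for conversion into work.
```

The heat supplied at 300 K carries more than three times the entropy of the same heat supplied at 1000 K, which is why the high-temperature source is the more valuable one for producing work.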
An everyday example helps to illustrate these principles. Consider a hot cup of coffee left in a cooler environment:
Heat flows naturally from the coffee (a source of high-temperature, low-entropy energy) to the surrounding air (a sink at lower temperature, higher entropy).
In this process, the total energy is conserved, but the quality of the energy decreases because entropy increases.
Once the coffee cools, it is impossible for the process to reverse spontaneously; the coffee will not draw energy back from the room to reheat itself.
This irreversibility highlights the central role of entropy in determining the direction of natural processes and in limiting the potential to recover useful work from energy transfers.
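The coffee example can be sketched numerically. Treating the coffee and the room as thermal reservoirs at constant temperature (an idealization; the temperatures and heat quantity below are assumed for illustration), the entropy lost by the coffee is smaller than the entropy gained by the room, so total entropy increases:

```python
# Total entropy change when heat Q flows from hot coffee to cooler room air.
# Both bodies are idealized as constant-temperature reservoirs.
Q = 500.0          # heat lost by the coffee, J (illustrative)
T_coffee = 360.0   # coffee temperature, K (illustrative)
T_room = 295.0     # room temperature, K (illustrative)

dS_coffee = -Q / T_coffee       # coffee loses entropy
dS_room = Q / T_room            # room gains more entropy than the coffee loses
dS_total = dS_coffee + dS_room  # net entropy generation, > 0 when T_coffee > T_room

print(f"dS_total = {dS_total:.3f} J/K")  # prints dS_total = 0.306 J/K
```

The net entropy generation is positive for any heat flow from hot to cold; it would be negative (and therefore impossible) for the reverse, spontaneous reheating of the coffee.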
"The Clausius inequality is the mathematical way of saying: nature never wastes energy, but it always wastes the usefulness of energy. That loss of usefulness is entropy increase."