Entropy is a measure of the disorder, or randomness, prevalent in a dynamic system. In thermodynamics it is not simply the amount of thermal energy flowing in and out of a system; rather, the change in a system's entropy is defined as the heat the system absorbs in a reversible process divided by the absolute temperature at which that heat is transferred.
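The definition above can be written compactly in Clausius's standard notation (the symbols here are the conventional ones, not taken from this article): for an infinitesimal amount of heat absorbed reversibly, the entropy change is

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```

where \(\delta Q_{\mathrm{rev}}\) is the heat absorbed in a reversible process and \(T\) is the absolute temperature at which the transfer occurs.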
The German physicist Rudolf Clausius coined the term entropy in 1865. Entropy and the second law of thermodynamics are interlinked and explain why the universe tends toward disorder rather than order. Entropy is often referred to as the arrow of time: like time, it moves in one direction only.
According to the second law of thermodynamics, the total amount of entropy in an isolated system, and therefore in the universe as a whole, always remains constant or increases.
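The statement above is often written as a one-line inequality (again in standard notation, assumed rather than drawn from this article):

```latex
\Delta S_{\mathrm{universe}} \ge 0
```

where equality holds only for idealized reversible processes; every real, irreversible process strictly increases the total entropy.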