Entropy is perhaps the most cumbersome term in science – leading to some confusing interpretations at times – but the law associated with this term is very important, and has attracted scientists and enthusiasts across disciplines – from the basic and applied sciences to social science. The term – coined by German physicist Rudolf Clausius (1822 – 1888) – is enshrined in the 2nd Law of Thermodynamics. As with all accepted theories or laws of Nature, one law must agree with others to satisfy their compatibilities (compatibility theory) – which means that the 2nd law must agree with the other laws of thermodynamics – the 0th, 1st and 3rd. Therefore this piece will briefly highlight the other three laws before embarking on an explanation of the 2nd law – to show its applications and implications in simple and easily understandable terms – while weaving some traces of metaphysics into it. But before that, the meanings of some terms must be delved into by revisiting Physics 101 – because these terms are crucial for understanding the laws.
SYSTEM. A system is a collection of objects (or even a single object, if one considers the molecular activities within it) and the interactive processes within. It is selected, for the convenience of description and analysis, as an entity defined by its boundaries, through which it interacts with the surroundings. The surroundings can be other systems, or generally the environment. The system and its surroundings are collectively referred to as the universe (we hear this term often in astronomical contexts). An open system interacts with its surroundings by freely or spontaneously exchanging (receiving and giving) matter and energy. A closed system exchanges only energy with its surroundings. An isolated system – mostly manufactured – exchanges neither matter nor energy with its surroundings. One can define semi-enclosed, semi-isolated or other systems depending on the limit, restriction or filtration one applies to the exchange of energy, matter, or both. Also important is the state – it refers to the condition of a system in time – defined and measured in thermodynamics by temperature, pressure and density (mass and volume) – but generally it refers to the condition of energy.
ENERGY, WORK AND HEAT. Energy is the ability of a system to do work. Work is done by transferring energy when a system is acted upon by a force. Force refers to an external push or pull that causes accelerating motion of a system. The terms energy and work are dimensionally equivalent, and both represent scalar quantities (magnitude only), while force is a vector quantity (with both magnitude and direction). There are different forms of energy: potential (energy stored in a system, with the potential to do work when released); kinetic (energy of a system due to its motion); and internal (the total energy contained in a system; any change in it equals the difference between the heat transferred into the system and the work done by the system).
Heat refers to the internal energy in a system that is transferable to its surroundings when a temperature gradient exists (spontaneously one way, from hot to cold). Enthalpy is defined as the system's internal energy plus the energy imparted by work done on its volume by external pressure. Temperature is a measure of the average kinetic energy of the molecules in a system. In the SI system, the unit of force is the Newton (N = kg·m/s^2), and the unit of energy, heat and enthalpy is the Joule (J = N·m = kg·m^2/s^2). The rate of doing work or transferring energy is power, given in Watts (W = J/s). Temperature is measured on the thermodynamic Kelvin scale (in honor of British physicist WT Kelvin, 1824 – 1907), with absolute zero defined at 0 K = -273.15 °C (C = Celsius, in honor of Swedish astronomer A Celsius, 1701 – 1744). At absolute zero, thermal activities cease from the micro to the macro level – therefore at this stage they are all immeasurable.
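As a quick numerical aside – not part of the original piece, and with made-up input values – the unit relationships above (Newton, Joule, Watt, Kelvin) can be sketched in a few lines of Python. The helper names (work_done, power, celsius_to_kelvin) are my own illustrative choices:

```python
# Illustrative unit bookkeeping for force, energy and power (SI).
# All numerical inputs are made-up values for demonstration only.

def work_done(force_n: float, distance_m: float) -> float:
    """Work (J) = force (N) x displacement (m) along the force direction."""
    return force_n * distance_m

def power(work_j: float, time_s: float) -> float:
    """Power (W) = energy transferred (J) per unit time (s)."""
    return work_j / time_s

def celsius_to_kelvin(t_c: float) -> float:
    """Thermodynamic temperature: absolute zero is 0 K = -273.15 C."""
    return t_c + 273.15

# Example: lifting a 10 kg mass through 2 m against gravity (g ~ 9.81 m/s^2)
force = 10.0 * 9.81           # N
w = work_done(force, 2.0)     # J
p = power(w, 4.0)             # W, if the lift takes 4 s
print(w, p, celsius_to_kelvin(25.0))
```

The point of the sketch is simply that Joule and Watt are derived, compatible units – work divides cleanly into power once a time scale is attached.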
A few words on the processes involving the transfer of matter and energy. One of them is the spontaneous process – one that occurs on its own, without requiring an external energy input or work – driven by a downslope gradient. In the domain of physical processes, spontaneity is synonymous with the work of the ubiquitous gravitational force – in its capacity of Natural downslope restoration and balancing acts. A work process takes place when an external force is applied to displace a system against an upslope gradient (e.g. against gravity). Thermal processes of heat exchange are of four types – adiabatic (no transfer of heat across boundaries); isothermal (constant temperature); isobaric (constant pressure); and isochoric (constant volume). Equilibrium refers to the perfect balance of all processes in a system – at this state no spontaneous exchange of energy and matter takes place with the surroundings. Further, as we shall see – most of the processes in the systems of Nature (also in Social Systems) are Reversible. This means that if some work is done on such a system (without reaching the breaking point of failure) – it will work by itself (or with an infusion of external energy) to revert to its initial state (albeit perhaps never recovering fully in the contexts of space and time). In Engineering literature such systems are termed Elastic Systems. However, repeated work attempts on such systems push them toward the stage of irreversibility – by a process known as Fatigue. Some other systems of Nature work only one way – they are the ones recognized as Irreversible or Plastic processes. The 1st and 2nd Laws of Thermodynamics explain how these two contrasting processes define things.
LAWS OF THERMODYNAMICS. Thermodynamics is the study of the effects of work, heat and energy on a system in the domain of Thermodynamic Force Field (TDFF).
ENTROPY AND THE 2ND LAW. Having clarified all the terms, it is time now to understand entropy and the 2nd law. Entropy (S) characterizes the state of energy of a system; what is measured is its change (ΔS) during transformation. In thermodynamics, the heat transferred during a change of state (from initial to final, or from one level to the next), divided by the absolute temperature (in the Kelvin scale) at which the change occurs, is the measure of entropy change (ΔS = Q/T, in J/K). Before going further, let us attempt to see some major states of energy transformation – some in the arena of coastal processes. What are these? Some familiar examples are:
The 2nd law states that the entropy of the universe does not change in a reversible process (ΔS = 0), but always increases in an irreversible process (ΔS > 0). This means that in a reversible process there are continuous and spontaneous exchanges among the systems of the universe, aiming to achieve equilibrium – therefore the net change in entropy is zero. It indicates that if the entropy of a system decreases (ΔS < 0), the entropy of the surroundings must increase (ΔS > 0) by the same amount – so that the total entropy of the universe remains unchanged. This is, in a sense, a restatement of the 1st law.
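This system-plus-surroundings bookkeeping can be illustrated with a small Python sketch of my own – using the heat-divided-by-temperature measure of entropy discussed above, and an assumed example of ice melting in a warm room (the latent-heat value is a textbook approximation, not from the piece):

```python
# Entropy bookkeeping sketch for the 2nd law, using dS = Q / T
# (heat transferred divided by absolute temperature).
# The melting-ice example and latent-heat value are illustrative
# assumptions for this aside.

L_FUSION = 334_000.0   # J/kg, approximate latent heat of melting ice
T_MELT = 273.15        # K, melting point of ice

def entropy_change(q_joules: float, t_kelvin: float) -> float:
    """dS = Q / T, in J/K; Q > 0 when heat flows into the system."""
    return q_joules / t_kelvin

def total_entropy(mass_kg: float, t_surroundings_k: float) -> float:
    """Net dS of the 'universe' when ice melts on heat drawn from the room."""
    q = mass_kg * L_FUSION
    ds_system = entropy_change(+q, T_MELT)                   # ice gains heat
    ds_surroundings = entropy_change(-q, t_surroundings_k)   # room loses heat
    return ds_system + ds_surroundings

# Irreversible case: warm room (298.15 K) -> net entropy of the universe grows.
print(total_entropy(1.0, 298.15))   # positive (J/K)
# Reversible limit: surroundings held at the melting point -> net change is zero.
print(total_entropy(1.0, T_MELT))
```

The surroundings do lose entropy (ΔS < 0), but the colder system gains more than the warmer room loses – so the sum is positive whenever heat flows down a real temperature gradient, and zero only in the idealized reversible limit.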
But the second part of the law – dealing with the irreversibility of processes – is intriguing, in the sense that it adds a twisted but essential qualification to the 1st law. The twist comes in the form of turning the scalar entropy (like energy) into a vector-like quantity – by assigning it a sense of irreversible directionality. To understand the second part, let us consider two simple examples:
The arrow of time in one direction, and the inescapable growth of disorderliness or entropy in irreversible processes (another easily understandable term for this one-way Natural process is the aging or burning out of all entities in time), need a close interpretive look. Because if one tries to explain things of a Natural system at short time-scales, these two realities defined by the 2nd law of thermodynamics may not fit straightforwardly – rather, one will be inclined to note that Natural irreversibility is not what it seems. Let us attempt to understand this assertion:
Let me be very brief on the rest of this piece. Depending on what one focuses on, the definition of entropy can be stretched and distinguished in terms of the major processes responsible for energy flux and transformation. The key to such definitions is to identify a system and its major processes, including their characterization in terms of reversibility, irreversibility or quasi-reversibility (same as quasi-cyclic). In the literature, many different types of entropy have been defined and proposed by investigators. For example, S Hawking (1942 – 2018) defined Black Hole Entropy pertaining to the event horizon of a super-gravity mass. Following this argument, the entropy in thermodynamic processes of heat flow can be termed thermodynamic entropy. One can also define mind-and-matter entropy, biotic entropy, abiotic entropy, etc. In this piece, I would like to introduce and define hydrodynamic entropy and socioeconomic entropy. Let me first start with hydrodynamic entropy – because that is where my interest lies. In fact, during the developing stage of the discipline, French physicist NLS Carnot (1796 – 1832; considered the father of thermodynamics) used the downslope water-flow analogy (the steeper the downslope energy gradient, the more the power) to understand and define what became the 2nd law.
COASTAL SYSTEM AND HYDRODYNAMIC ENTROPY. What exactly is hydrodynamic entropy? It is the change in the state of hydrodynamic energy of a system during transformation. The hydrodynamic energies are: potential (proportional to the product of water mass, gravitational acceleration and height above a datum); flow-kinetic (proportional to the product of water mass and velocity squared); and wave-kinetic (proportional, per unit sea-surface area, to the product of water density, gravitational acceleration and wave height squared). Let us assume, for the sake of convenience, that hydrodynamics also includes the sediment transport dynamics of erodible beds – although this aspect can also be defined as a separate system of interest.
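The three energy terms just named can be written out explicitly. The following Python fragment is an illustrative sketch of mine – the formulas follow standard hydraulics and linear wave theory (wave energy density E = ρgH²/8 per unit sea-surface area), and the numerical inputs are assumed values:

```python
# Sketch of the three hydrodynamic energy terms named in the text.
# Formulas follow standard hydraulics / linear wave theory; the
# numerical inputs below are illustrative assumptions.

G = 9.81        # m/s^2, gravitational acceleration
RHO = 1025.0    # kg/m^3, approximate seawater density

def potential_energy(mass_kg: float, height_m: float) -> float:
    """PE = m * g * h, relative to a chosen datum (J)."""
    return mass_kg * G * height_m

def flow_kinetic_energy(mass_kg: float, velocity_ms: float) -> float:
    """KE = (1/2) * m * v^2 (J)."""
    return 0.5 * mass_kg * velocity_ms ** 2

def wave_energy_per_area(wave_height_m: float) -> float:
    """Linear-theory wave energy density E = rho * g * H^2 / 8 (J/m^2)."""
    return RHO * G * wave_height_m ** 2 / 8.0

# One cubic metre of seawater (~1025 kg is used as 1000 kg here for round numbers):
print(potential_energy(1000.0, 2.0))      # J, raised 2 m above datum
print(flow_kinetic_energy(1000.0, 1.5))   # J, moving at 1.5 m/s
print(wave_energy_per_area(2.0))          # J/m^2, for a 2 m high wave
```

Note the quadratic dependencies: doubling the flow velocity or the wave height quadruples the respective energy – which is why energetic storm events dominate the hydrodynamic entropy budget of a coast.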
SOCIOECONOMIC ENTROPY. How do socioeconomic entropy and the 2nd law work? Let me briefly touch on this question. In several pieces on the pages of NATURE and SOCIAL INTERACTIONS, I have highlighted positive and negative social energies. Social energy indicates the accumulation or depletion of societal wealth – wealth in this case referring to a multitude of factors such as the economy, social interactions, and favorable degrees of: cohesion/divisiveness, trust/mistrust, integrity/corruption, equality/inequality, peace/disturbance, etc. The nature of these factors is determined by good or bad governance.
The change in the state of societal wealth is socioeconomic entropy. A positive energy is an indication of accumulating wealth, and therefore represents good socioeconomic entropy. A bad socioeconomic entropy, on the other hand, is a representation of depleting societal wealth. The underlying assumption in such characterizations of society is that socioeconomy is a reversible process – but in a different interpretation of the term. The reversibility is ensured by the stream of work we put in to sustain socioeconomic progress and growth – and to turn the negative into the positive. But when a society becomes callous, incompetent and ill-motivated, the positive tends to veer toward the negative. The existence of positives and negatives suggests the reality that there exist kinks in the character of societal wealth – the kinks are long and short, high and low. The nature of the kinks is by itself an indication of the soundness or weakness of societal wealth.
It ended up being another long piece. I would like to finish it with a quote from the German philosopher Arthur Schopenhauer (1788 – 1860): journalists are like dogs, whenever anything moves they begin to bark. It is like what the Buddha (c. 563 – 483 BCE) – the Tathagata – said: it is difficult not to express an opinion about others. And journalists have a loud mouthpiece in the media outlets they serve – therefore it is all the more tempting for them to bark. But as true and responsible partners of democratic institutions, perhaps it makes sense for journalists and the media to make accurate, measured and thoughtful barking – for the sake of good socioeconomic entropy.
. . . . .
- by Dr. Dilip K. Barua, 18 May 2019