Thermodynamics and the Destiny of the Universe


Anil Mitra, PhD

Document created May 8, 2012



Contents

Introduction

The Second Law of Thermodynamics and Entropy

Entropy, order, and the approach to heat death

Heat Death of the Universe is loss of Useful Energy but not of Energy Per Se

The Second Law for Non-Isolated Systems

Entropy and the Origin of Life

Conservative and Dissipative Forces

Dissipative Forces Responsible for Entropy Increase

Why are there no Anti-dissipative Effects in our Cosmos

Statistical Character of the Second Law

Universality of the First Law?

Approaches to Non-necessity of the Second Law

Anti-dissipative Regimes

Considerations from Modern Cosmology

# Introduction.

The origin and evolution of life and organization in the Universe require what may be called energy potentials.

On one interpretation, the Second Law of Thermodynamics implies that the Universe is approaching a final state in which the energy potentials are spent. This idea was apparently first introduced by William Thomson, who referred to this ultimate fate as the ‘heat death of the Universe’.

If the laws of classical thermodynamics are universal, i.e. without exception, heat death is inevitable.

In the following we look at the reasoning that leads to the conclusion of heat death, question the necessity of the conclusion, and suggest alternative scenarios.

# The Second Law of Thermodynamics and Entropy.

There are a number of equivalent formulations of the Second Law of Thermodynamics. It is convenient to begin with a form that asserts that the entropy of an isolated system increases in time (an isolated system is one that is not in interaction with any other system).

Does entropy increase or does it merely not decrease; and does the increase occur (inexorably) with time or only with the kinds of process that are called dissipative? These questions will be addressed in what follows.

Other forms of the Second Law are more basic—but perhaps less revealing—in that they do not mention entropy but allow its existence and properties to be derived from those forms.

# Entropy, order, and the approach to heat death.

What is entropy? It can be defined quantitatively but here a qualitative explanation will be effective. Entropy may be seen as a measure of the disorder of a system or, equivalently, of the negative of the information in it. Thus more structure and more information are factors that make for lower entropy (and it is for this reason that information has been associated with ‘negentropy’).

The entropy of a system in a more organized state is less than the entropy of the system in a less organized state. Think of a system of a compressible gas. It is made of two compartments divided by a wall, and the pressure on one side is greater than the pressure on the other: the system is in a state referred to as disequilibrium; this is because a constraint, the wall, is required to maintain the state. The pressure difference could be used to extract useful work or energy: the wall could be movable, it could be a piston, and the piston could drive some mechanical device. If the dividing wall is removed the two parcels of gas mix and attain a uniform pressure: the system will then be in mechanical equilibrium. It is the approach to equilibrium that is responsible for the loss of useful energy. The system has become less structured, its entropy has risen, and some of what would otherwise have been useful work can no longer be gotten from the system because there is no pressure difference and, equivalently, no energy differential.

An increase in entropy is associated with decrease in order or structure, an approach to equilibrium, and a loss in useful energy—energy potential, and a loss in information (the combination of two compartments of the gas, each at a different pressure, represents more information than the combined parcels at the same pressure).

The argument has been plausible but it may be made rigorous. Additionally it generalizes to any compound isolated system.
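As a gesture toward that rigor, the gas-compartment argument can be made quantitative in a minimal sketch. The sketch assumes an ideal gas at a common temperature on both sides of the wall; the function name and the numbers are illustrative. The entropy change on removing the wall follows from the volume term of the ideal-gas entropy and is positive exactly when the pressures (i.e. the molar densities) differ.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n1, V1, n2, V2):
    """Entropy change (J/K) when a wall between two parcels of the
    same ideal gas at a common temperature is removed.
    Uses the volume term of the ideal-gas entropy, n R ln(V/n);
    the temperature-dependent terms cancel at fixed temperature."""
    n, V = n1 + n2, V1 + V2
    return R * (n * math.log(V / n)
                - n1 * math.log(V1 / n1)
                - n2 * math.log(V2 / n2))

# Unequal pressures (different n/V): entropy rises on mixing.
dS = mixing_entropy(2.0, 1.0, 1.0, 1.0)
# Equal pressures (same n/V): nothing to equalize, no entropy change.
dS_eq = mixing_entropy(1.0, 1.0, 2.0, 2.0)
```

The positive value in the unequal-pressure case is precisely the measure of the useful work that has become unavailable.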

# Heat Death of the Universe is loss of Useful Energy but not of Energy Per Se.

The First Law of Thermodynamics is a form of energy conservation law. It asserts that the energy of an isolated system cannot change.

In other words energy is not lost when entropy (of an isolated system) increases. Instead the organization of the system changes in such a way (organized to disorganized, disequilibrium to equilibrium) that the energy loses some of its usability.

The Universe (all that there is) is isolated and so according to the Second Law its entropy must increase in time.

In other words the Universe tends to an equilibrium state in which there are no differentials or disequilibrium and its entropy is a maximum and information content a minimum.

This situation, gloomy but hypothetical, has been called the heat death of the Universe (this kind of gloom appeals to a certain kind of thinker: yes it is gloomy but I have the authority to say so and the courage to face the gloom: I stand alone in my existential glory). The heat death of the Universe is its hypothetical final state in which entropy is a maximum, and information content and therefore structure are a minimum: there is no organization, no life, and no process except random processes.

This is said to be a consequence of the Second Law. Since the Second Law is an empirical law we may say, with rough equivalence, that the heat death is an expression or example of the tendencies entailed by the Second Law—i.e., the final or limiting case of its particular instances.

# The Second Law for Non-Isolated Systems.

Entropy is additive in the sense that the entropy of two systems is the sum of the entropies of the individual systems.

If two systems are in interaction with each other but not with any other systems then the entropy of the system made up of the two must increase (since the compound system is isolated). Therefore the entropy of one may decrease, but then the entropy of the other must increase and the net change must be greater than zero; i.e. the decrease of the decreasing-entropy system must be less than the increase of the increasing-entropy system.
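This bookkeeping can be illustrated for the simplest case: heat flowing between two bodies whose temperatures are roughly constant during the exchange. The function name and numbers are illustrative.

```python
def entropy_changes(Q, T_hot, T_cold):
    """Entropy bookkeeping when heat Q (J) flows from a hotter body
    at T_hot (K) to a colder body at T_cold (K), with both
    temperatures roughly constant during the exchange."""
    dS_hot = -Q / T_hot    # the hot body loses entropy
    dS_cold = Q / T_cold   # the cold body gains more than that
    return dS_hot, dS_cold, dS_hot + dS_cold

dS_hot, dS_cold, dS_net = entropy_changes(Q=1000.0, T_hot=400.0, T_cold=300.0)
```

One system's entropy falls, the other's rises by more, and the net change is positive because T_cold is less than T_hot.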

# Entropy and the Origin of Life.

This explains the possibility, even in view of the Second Law, of the origin of life on Earth: Earth is not an isolated system. High-information, low-entropy radiation is received from the sun; low-information, high-entropy radiation is radiated out to space; some fraction of the information and structure potential of the radiation is absorbed by atoms, chemicals, and organisms on Earth. Earth sheds entropy (disorganization) and gains information in the form of complex chemicals and then life.
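A rough sketch of this entropy budget: in a steady state Earth re-radiates about as much energy as it receives, but the incoming sunlight arrives at a high effective temperature and the outgoing infrared leaves at a low one, so each joule carries away far more entropy than it brought in. The temperatures are approximate effective emission temperatures, and refinements (such as the photon-gas factor of 4/3) are ignored.

```python
T_sun = 5800.0    # approximate effective emission temperature of the sun, K
T_earth = 255.0   # approximate effective emission temperature of Earth, K
Q = 1.0           # one joule in, one joule out (steady state)

S_in = Q / T_sun        # entropy received with the sunlight
S_out = Q / T_earth     # entropy carried away by infrared radiation
S_exported = S_out - S_in   # net entropy shed by Earth per joule
```

The export is positive, which is what permits local order (chemistry, life) to build up on Earth without violating the Second Law.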

# Conservative and Dissipative Forces.

Why does the entropy of the Universe increase in time? At the macroscopic level (one at which the particle structure of matter is not evident) there are two kinds of forces: conservative and dissipative. Elastic forces typify the conservative: energy must be supplied to compress an elastic structure; the energy is stored in the structure and may later be extracted. Friction forces are dissipative. Common friction, viscous shear forces, internal friction in solid matter, and a current flowing through a resistor are examples of dissipative forces at play. The characteristic that makes ‘dissipative’ forces dissipative is that their action does not result in a reversible store of energy: it results in heating (an increase in the energy of random molecular motion); energy is not lost but it loses its organization and usability. In contrast, the actions of conservative forces do not result in loss of information, increase in entropy, or loss of ‘available energy’, i.e. the ability to do or extract useful work.
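The contrast can be put in a minimal numerical sketch, with illustrative values: work done on an ideal spring is stored and fully recoverable, while work done against sliding friction is degraded to heat.

```python
def spring_work(k, x):
    """Work (J) stored in an ideal, conservative spring of stiffness
    k (N/m) compressed by x (m); recoverable when the spring relaxes."""
    return 0.5 * k * x**2

def friction_heat(mu, normal_force, distance):
    """Work (J) done against sliding friction with coefficient mu,
    normal force (N), over a sliding distance (m); it becomes random
    molecular motion (heat) and is not recoverable as work."""
    return mu * normal_force * distance

stored = spring_work(k=200.0, x=0.1)           # J, recoverable
dissipated = friction_heat(0.3, 50.0, 2.0)     # J, degraded to heat
```

Both quantities are energies and both are conserved in the First-Law sense; the difference is that only the first remains available to do work.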

# Dissipative Forces Responsible for Entropy Increase.

An improved version of the Second Law is that it is dissipative forces that result in the increase of entropy; i.e. the increase has to do with dissipation and not intrinsically with time. Although our phase of the Universe appears to be marked by active dissipative forces, it is conceivable, if unlikely, that there could be a phase in which dissipative forces are weaker than intrinsically structure-producing or anti-dissipative forces.

It is important that there are no dissipative forces at the fundamental level; i.e., the fundamental forces are all conservative (this is of course an empirical result but is thought to be universal; below we provide an explanation of why it is or appears to be universal). There is a macroscopic level—the level at which the dynamical contributions of individual particles are averaged out so that behavior is describable as if matter were not atomic in nature. It is at this level that forces that are conservative on the microscopic level manifest as dissipative. How does this happen? Friction is a mechanical force, but in friction the rubbing at a surface results in the conversion of organized energy into the random motion energy of atoms, etc., in which form there is a loss of useful energy (the increase in molecular energy of motion manifests as warming, which is how ‘heat’ is produced in friction). Because friction always opposes the relative motion of two sliding surfaces it is always dissipative and never ‘anti-dissipative’.

# Why are there no Anti-dissipative Effects in our Cosmos.

Why do we find, in our cosmos, no anti-dissipative forces—forces whose action results in entropy decrease of the Universe or of isolated systems? A partial answer—there are such anti-dissipative interactions. In gases such as the atmosphere there are local fluctuations that represent a decrease in total entropy. However these are small and temporary fluctuations and the large scale (macroscopic at the level of a relatively small though numerically large number of molecules) is one of net entropy increase; and the regions of fluctuation are too small and too transient for useful energy to be extracted, i.e. there is no net increase in organization over sensible periods of time. Another example might be that of self-organizing systems far from equilibrium (as seen, for example, in the work of Ilya Prigogine). The self-organizing systems are not truly entropy decreasing because the decrease of entropy for the system is at the expense of a greater increase of entropy associated with some kind of flow through the system (e.g. the energy from the sun flowing through Earth’s biosphere). Although the two examples of this paragraph are not true examples of entropy decrease in isolated systems of any significant scale they are significant in what they suggest.
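The improbability of even a modest fluctuation can be sketched with an idealized model in which each molecule is independently and equally likely to be found in either half of a container; the probability that all N molecules are in one chosen half is then (1/2)^N, which is astronomically small for any macroscopic N.

```python
import math

def log10_prob_all_in_half(N):
    """log10 of the probability that N independently placed molecules
    are all found in one chosen half of a container: (1/2)**N."""
    return -N * math.log10(2)

# Even 100 molecules give a probability of about 10**-30; a visible
# parcel of air (~10**19 molecules) gives a log10 of about -3 * 10**18.
tiny = log10_prob_all_in_half(100)
```

This is why fluctuations large enough to yield extractable work are never observed at sensible scales, even though nothing forbids them in principle.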

# Statistical Character of the Second Law.

We now know, as a result of the discipline of statistical mechanics developed initially by Ludwig Boltzmann, James Clerk Maxwell, and Josiah Willard Gibbs in the nineteenth century, that the Second Law is statistical. The likelihood that shuffling will return a pack of cards to its original arrangement is very small. On the scale of the Universe the likelihood of an increase in net order or information, i.e. a decrease in entropy, is immensely small though not zero. Even though it is not zero, the likelihood of a restructured Universe as an outcome of random molecular process over the age of the Universe may be regarded as effectively zero.
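The card-shuffling figure can be computed directly. Illustratively, take a ‘random shuffle’ to mean a uniformly random ordering of the 52 cards; the chance of recovering one given arrangement is then one in 52 factorial.

```python
import math

deck_orderings = math.factorial(52)   # number of arrangements of a pack
p_restore = 1 / deck_orderings        # chance a uniformly random shuffle
                                      # restores one given arrangement

# 52! is a 68-digit number (about 8 * 10**67), so p_restore
# is on the order of 10**-68.
digits = len(str(deck_orderings))
```

A pack of 52 cards is a minute system compared with any macroscopic body, which is why the corresponding probabilities for molecular aggregates are so much smaller still.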

Here is what Einstein thought (said) regarding the necessity of the Second Law.

“A law is more impressive the greater the simplicity of its premises, the more different are the kinds of things it relates, and the more extended its range of applicability. […] It is the only physical theory of universal content concerning which I am convinced that, within the framework of applicability of its basic concepts, it will never be overthrown.” (What’s the source? And notice the hedge ‘within the framework of applicability…’ that renders the observation less than necessarily universal).

In its statistical character the Second Law is different from the First Law. The Second Law is statistical; it is an expression of probabilities for aggregates of molecules (as we have seen the probabilities are such that they are effective though not necessary certainties). The First Law is a necessary consequence of behavior at the elementary level.

# Universality of the First Law?

It is thought that elementary (particle) processes are energy conserving—i.e., that at the most fundamental of levels the energy of isolated systems is constant; or, given two or more systems in interaction, the energy gains of all the systems in any process sum to zero. Consequently energy conservation at the macro level is a deductive or necessary consequence of fundamental behavior. On the other hand the Second Law which pertains especially to dissipative interactions is probabilistic. There is no dissipation at the particle level; probable dissipation and the increase in entropy are properties of the behavior of matter in aggregate form (again, the probabilities are so high as to constitute practical certainties).
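The deductive character of conservation at the elementary level can be illustrated with the textbook one-dimensional elastic collision, in which the energy gains of the two bodies sum exactly to zero. The masses and velocities below are illustrative.

```python
def elastic_collision_1d(m1, v1, m2, v2):
    """Final velocities in a 1-D elastic collision; the standard
    result of requiring conservation of both momentum and
    kinetic energy."""
    v1f = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1f, v2f

def kinetic_energy(m, v):
    return 0.5 * m * v**2

v1f, v2f = elastic_collision_1d(2.0, 3.0, 1.0, -1.0)
E_before = kinetic_energy(2.0, 3.0) + kinetic_energy(1.0, -1.0)
E_after = kinetic_energy(2.0, v1f) + kinetic_energy(1.0, v2f)
# The energy gains of the two bodies sum to zero: E_after equals E_before,
# and momentum is likewise unchanged.
```

No probability enters here: conservation at the macro level follows deductively from conservation in each elementary interaction, which is the contrast with the statistical Second Law.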

If we imagine spontaneous creation of cosmological systems from the Void, there will be a kind of selection of conservative systems: conservative forces are necessary for stability and perhaps for life.

I shall however not take this tack except to suggest that fundamentally non-conservative behavior might enter into the recycling of cosmological systems.

While I do not take up the idea of non-conservation at a fundamental level the idea is suggestive as an approach to the question of the fate of the Universe.

# Approaches to Non-necessity of the Second Law.

Can we be sure about the probability assertion? If the probability of something is greater than zero in some interval of time then, given enough time, the probability will become large. For the Universe the required time is enormous compared even to its age. Perhaps when heat death occurs there will be no process at all and so no possibility of restructuring to an equal or greater form. We do not know this; we do not know enough to know this; still, I do not think that this is a particularly productive line of thought. The authority that speaks here is the combined authority of probability and of established thought, and I am not entirely convinced by this authority. A first doubt regards probability: the tendency to equilibrium is not characteristic of all systems; however, it will be a good thing to take a more traditional approach—one that will not involve the colossal improbabilities of mere random restructuring and that will not involve hypothetical kinds of process. And second, my Universal Metaphysics shows the ultimate emptiness of the entropy arguments. However, I will not take that tack because it too does not address the question of what we might expect on times that are of the order of the age of the Universe (which for this purpose must be taken to be the known cosmos).

It will be useful to wonder why energy conservation bears the mark of necessity at all levels (except perhaps that there are quantum fluctuations that individually and only very transiently violate energy conservation). We can imagine a cosmos with dissipation at the fundamental level. The ‘problem’ with such a cosmos is that it would not have the stability that is necessary for the variety of forms of our cosmos (unless the dissipation were so small as to be ineffective over appreciable time) (it would be an implosive or POOFATIVE cosmos). We can imagine another cosmos with energy-generating interactions at the fundamental level. Such a cosmos might be too inflationary (e.g. exponential growth) for the stable humdrum of our cosmos (it would be an explosive or PUFFATIVE cosmos). This is a simple qualitative argument for conservation of energy but it could be made quantitative. It is interesting that it suggests that the First Law is not a necessary law; there might be ways in which it too has statistical origins, e.g. over myriads of cosmoses, but here we see it as an artifact of the kind of cosmos that permits our forms of Being and life.
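The qualitative argument can be sketched with a toy iteration; this is purely illustrative, where ‘rate’ stands for a hypothetical per-step fractional energy gain or loss at the fundamental level.

```python
def evolve(E0, rate, steps):
    """Toy model: a total 'energy' is multiplied by (1 + rate) at
    each step. rate < 0 models a fundamentally dissipative cosmos,
    rate > 0 an energy-generating one, rate == 0 the conservative case."""
    E = E0
    for _ in range(steps):
        E *= 1 + rate
    return E

poof = evolve(1.0, -0.01, 1000)   # implosive: decays toward nothing
puff = evolve(1.0, +0.01, 1000)   # explosive: grows exponentially
flat = evolve(1.0, 0.0, 1000)     # conservative: unchanged
```

Even a one-percent imbalance per step empties or inflates the toy cosmos by orders of magnitude within a thousand steps; only the conservative case supports a stable humdrum.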

It is easy to imagine how an energy-generating cosmos might also be entropy destroying. Lacking the relevant ‘physics’ and/or statistics of such systems, the argument remains qualitative and hypothetical.

# Anti-dissipative Regimes.

To argue against the universality of the Second Law we must go back to the idea of far-from-equilibrium behavior. We live in a regime of increasing entropy. Some scientists imagine that if the cosmos should stop expanding and contract then, when it goes through a big crunch, it would come out on the other side but, in view of the Second Law, with greater entropy; i.e., on that view, a persisting Universe would in the long run be bound for heat death. This bows too much to the authority of the Second Law, which after all we have never seen in action (or not) in a great gravitational collapse (a crunch that is more than just some black hole). It is not unreasonable to think that gravitational collapse would invoke an anti-dissipative Second Law regime via gravity as an organizing interaction.

The thought remains an idea and awaits quantitative or rigorous qualitative development, as well as some kind of support from data or the prediction of as yet unexplained data.

However, if we knew that the cosmos would go through a collapse and come out reordered, then some kind of ordering Second Law regime (in contrast to the present disordering or at least non-ordering one) would be near necessary: for the alternative would be sheer time for probability to take its course.

# Considerations from Modern Cosmology.

There are interesting but inconclusive considerations from modern physical cosmology.

Inflationary cosmology suggests that in the early Universe gravity was dominant, and a uniform distribution of gravitational energy is associated with low entropy.

In a closed universe there is recollapse, and heat death is expected with the universe approaching maximum entropy.

In an open universe the value of the maximum entropy may increase faster than the gain of entropy, and the universe would then move further away from heat death.