Anil Mitra, PhD, COPYRIGHT © 2001, REFORMATTED June 13, 2003


All matter has energy. The interest in energy lies in its usefulness; additionally, the concept of energy is important in understanding the physical universe. Not all energy is equally usable, and the science of energy also deals with the usability or availability of energy. Commonly utilizable energy is found in foods and in chemical and nuclear fuels. The energy in rivers, oceans, and the wind is also useful. Much of this energy comes indirectly from the sun, but solar energy can also be tapped directly. The sun’s energy comes from nuclear fusion

Energy sources provide warmth [heat] and the power to do physical work – to climb a mountain, to drive a heartbeat, to think… and to drive engines of various kinds: automobiles, jets, lawnmowers, and power plants

The science of energy interactions, called thermodynamics, is taught to scientists and professionals – engineers, nutritionists… Thermodynamics is a branch of physics. The detailed structure of substances is not of direct concern; it is the net energy transfers and transformations that are studied. For some simple systems, the measurable properties of concern are familiar ones like temperature [T], pressure [p] and density [ρ – the Greek letter “rho”]. A famous property, the entropy [S], is related to usable energy. When entropy is produced, the net available energy is reduced. In addition to the properties, heat [Q is the heat transferred to a system] and work [W is the work done on a system] are important concepts. Heat and work are forms of energy transfer: heat is energy transfer due to a difference in temperature, while work is a mechanical effect – energy transfer due to a force [F] moving through a distance. If the distance moved in the direction of the force is d, the work done is:

W = F x d

This formula assumes that the force remains constant. When the force is not constant, the mathematics is more complex – the work becomes an integral of force over distance – but the concept is the same
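Both cases can be illustrated with a short Python sketch; the forces and distances below are invented for the example, and the variable-force case uses a simple numerical approximation of the integral:

```python
# Computing work W = F * d for a constant force, and approximating
# W = integral of F(x) dx when the force varies. Numbers are illustrative.

def work_constant(force_newtons, distance_m):
    """Work done by a constant force acting through a distance."""
    return force_newtons * distance_m

def work_variable(force_at, x_start, x_end, steps=10000):
    """Approximate the integral of F(x) dx with the trapezoid rule."""
    dx = (x_end - x_start) / steps
    total = 0.0
    for i in range(steps):
        x = x_start + i * dx
        total += 0.5 * (force_at(x) + force_at(x + dx)) * dx
    return total

# Constant 50 N force over 2 m: W = 100 J
print(work_constant(50.0, 2.0))

# A spring-like force F(x) = k*x with k = 100 N/m, stretched from 0 to 1 m;
# the exact answer is (1/2) k x^2 = 50 J
print(round(work_variable(lambda x: 100.0 * x, 0.0, 1.0), 3))
```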

As noted, temperature, pressure and density are measurable properties for some systems. For other systems, there are other properties of concern in addition to or instead of pressure and density, but temperature is always part of thermodynamic understanding and calculation


THE ENERGY LAW: Energy cannot be created or destroyed. This law is also known as the first law of thermodynamics

The law has both physical and economic consequences. Since energy is not created or destroyed, the increase of energy of a system must be the energy transferred to it. In symbols:

Δ U = Q + W

Δ is the symbol for change. U is the symbol for energy, which, for technical reasons, is called the internal energy of a system. In words, the equation says that the increase [change] in energy is the sum of the heat transfer to and the work done on the system. Since the universe has nothing to interact with, the net work and heat for the universe are zero and the energy of the universe is a constant. The energy law is thought to be universal – obeyed in some form by all known physical processes except perhaps quantum mechanical fluctuations that manifest for insignificant periods
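The bookkeeping in the equation can be sketched in a few lines of Python; the sign convention follows the statement above [Q is heat transferred to the system, W is work done on the system], and the numbers are invented for the example:

```python
# First law of thermodynamics as bookkeeping: the change in internal
# energy equals heat transferred TO the system plus work done ON it.

def energy_change(heat_to_system, work_on_system):
    """Increase in internal energy, delta-U = Q + W."""
    return heat_to_system + work_on_system

# A gas receives 300 J of heat and has 120 J of work done on it:
print(energy_change(300.0, 120.0))   # 420.0 J increase in internal energy

# The same gas receives 300 J of heat but does 500 J of work on its
# surroundings [work done ON the system is then -500 J]:
print(energy_change(300.0, -500.0))  # -200.0 J: internal energy drops
```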


THE ENTROPY LAW: Every system has a property called entropy. In an isolated system – one that does not interact with other systems – the entropy cannot decrease. The entropy law is also called the second law of thermodynamics

In real processes the entropy of an isolated system increases; it is only in certain idealized processes that the entropy remains constant for such systems. The entropy of a system may decrease, but only at the expense of a greater net increase in other systems. This is how some systems absorb energy and create structure and order while “degrading” the energy
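This trade – a local entropy decrease paid for by a larger increase elsewhere – can be put in numbers using the constant-temperature relation ΔS = Q/T [the special case of the entropy formula with no lost work]. The temperatures and heat quantities below are invented for the example:

```python
# Illustrative sketch: a system can lose entropy, but only if the
# surroundings gain more, so the total never decreases.

def entropy_change(heat_to_system, temperature_kelvin):
    """Entropy change Q/T for heat transfer at a constant temperature."""
    return heat_to_system / temperature_kelvin

# 1000 J of heat leaves a cold system at 250 K...
ds_system = entropy_change(-1000.0, 250.0)        # -4.0 J/K, a decrease
# ...and is rejected to warm surroundings at 300 K; a real device must
# also dump the work it consumed, say 400 J, so the surroundings
# receive 1400 J in total.
ds_surroundings = entropy_change(1400.0, 300.0)   # about +4.67 J/K
print(ds_system + ds_surroundings > 0)            # True: net entropy rises
```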

The entropy law can be presented in a number of ways. In the original developments, the entropy law was connected to the performance of heat engines and, later, to refrigerators. Thus, the entropy law also has economic consequences. As stated above, the entropy is related to usable or available energy. There are limitations on the amount of energy that can be converted to useful work in heat engines such as those in power plants and automobiles. The entropy law can be used to calculate those limits. The statement of the law above is an adaptation that focuses on systems and properties rather than special types of mechanisms – in this formulation its application to physical processes in general is more direct

Since the universe has no other systems with which to interact, its entropy must always increase. That is, the net “usable” energy decreases as the universe proceeds toward its final death, at which there is no energy available for use and so nothing can happen. The universe is then “dead.” This last conclusion is often found to be depressing, but the entropy law is a statistical law and not a necessary one – even though significant violation is extremely unlikely in most circumstances. However, it is not known whether it will hold if universes cycle through birth, death and rebirth, or what will happen in unending time


Entropy is related to usable energy. The mathematical formulation spells out the relation. In real-world processes, various effects result in a loss of useful energy. The typical example is friction. Think of a car cruising down a level highway at a fixed speed. The energy – from gasoline – is used to overcome friction. The friction occurs at a number of places: with the air, at the contact of the tires with the road, and at the friction points in the engine, the transmission and so on. The friction results in a heating effect and some energy is no longer available to do useful work. The symbol LW refers to “lost work”. The connection to entropy is that when usable energy is lost, the entropy of a system increases. The effect on the entropy of a system of heat transfer to the system is similar to the effect of friction in the system. These arguments are meant to be intuitive, not precise. The objective is to give some meaning to the following mathematical formulation of the entropy law:

Δ S = [Q + LW] / T

In words: form the sum of the heat transfer to and the lost work in the system; divide this sum by the temperature to get the change in entropy of the system. There is one assumption in this formulation – that the temperature is constant. However, the equation can be used to formulate a mathematically – but not conceptually – more complex equation for the change in entropy when the temperature is not constant
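For example, for a body with constant heat capacity C [an assumed idealization] heated reversibly [no lost work], summing the heat transfer divided by temperature over small temperature steps gives the entropy change; the closed form of that sum is ΔS = C ln(T2/T1). The sketch below checks the step-by-step sum against the closed form, with illustrative numbers roughly corresponding to a kilogram of water:

```python
import math

# Entropy change for reversible heating at varying temperature: the
# constant-T formula is applied to many small steps and the results summed.

def entropy_change_heating(heat_capacity, t_start_kelvin, t_end_kelvin):
    """Closed form delta-S = C ln(T2/T1), constant heat capacity assumed."""
    return heat_capacity * math.log(t_end_kelvin / t_start_kelvin)

def entropy_change_numeric(heat_capacity, t_start, t_end, steps=100000):
    """The same quantity summed step by step: sum of (C dT) / T."""
    dt = (t_end - t_start) / steps
    total = 0.0
    t = t_start
    for _ in range(steps):
        total += heat_capacity * dt / (t + 0.5 * dt)  # midpoint temperature
        t += dt
    return total

# Heating about 1 kg of water [C roughly 4186 J/K] from 300 K to 350 K:
print(round(entropy_change_heating(4186.0, 300.0, 350.0), 1))
print(round(entropy_change_numeric(4186.0, 300.0, 350.0), 1))
```

The two printed values agree, which is the point: the varying-temperature case is conceptually the same constant-temperature formula applied to many small steps.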


The discussion so far has not depended on the molecular constitution of systems. That is both an advantage and a disadvantage. The advantage is that it is very general. A disadvantage is that it forgoes the insight a molecular-level understanding gives. These are not true advantages or disadvantages because both kinds of understanding can be used together: they complement each other

The use of molecular considerations in energy science results in theories called statistical mechanics and statistical thermodynamics. These are, mathematically, more complex than the thermodynamics of systems discussed here. I do not plan to develop any of the molecular theory here but thought it useful to mention that there is a well-developed body of knowledge of energy science based on the atomic and molecular constitution of matter

The theories that are based on the molecular constitution can be called microscopic. Our discussion is macroscopic or large scale. The microscopic theories lead to the following significant insight: the energy law is universally valid while the entropy law is statistically valid. Practically, this means that in normal circumstances we will [almost] never see exceptions to the entropy law. However, we cannot expect that the entropy law will apply universally – over all time and space
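A back-of-the-envelope calculation suggests why the law is statistical rather than absolute. The chance that all N molecules of a gas happen, by random motion, to gather in one half of their box – a spontaneous ordering that would lower the entropy – is (1/2) to the power N. The molecule counts below are illustrative:

```python
# For a handful of molecules, a spontaneous "violation" of the entropy
# law is observable; for realistic molecule counts it is effectively
# impossible, which is what "statistically valid" means in practice.

def prob_all_left(n_molecules):
    """Probability that every molecule is in one chosen half of the box."""
    return 0.5 ** n_molecules

for n in (4, 10, 100):
    print(n, prob_all_left(n))
# For n = 4 the probability is 0.0625; for n = 100 it is below 1e-30,
# and a real sample of gas has on the order of 1e23 molecules
```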

Another insight from microscopic thermodynamics is that entropy is a measure of molecular disorder. It follows that a loss in useful energy is associated with an increase in disorder. The microscopic theories have many other uses and applications – these will be developed later