This article is about **entropy** in physical chemistry.

- We'll start by learning the **definition of entropy** and its **units**.
- We'll then look at **entropy changes**, and you'll be able to practice calculating entropy changes of reaction.
- Finally, we'll explore the **second law of thermodynamics** and **feasible reactions**. You'll find out how entropy, enthalpy, and temperature determine the feasibility of a reaction through a value known as **Gibbs free energy**.

## Entropy definition

In the introduction to this article, we gave you one definition of entropy.

**Entropy (S)** is a measure of **disorder** in a **thermodynamic system**.

However, we can also describe entropy differently.

**Entropy (S)** is the number of possible ways that particles and their energy can be **distributed** in a system.

The two definitions seem very different. However, when you break them down, they start to make a little more sense.

Let's picture a simple 2x2 Rubik's cube. It starts off ordered - each face contains just one colour. The first time you twist it, you disrupt the order. The second time you twist it, you *might* undo your first move and restore the cube to its original, perfectly solved arrangement. But it is more likely that you will rotate a different side and disrupt the order even more. Each time you randomly twist the cube, you increase the number of possible configurations that your cube could take, decrease the chance of landing upon that perfectly solved arrangement, and get more and more disordered.

Now, imagine a 3x3 Rubik's cube. This complex cube has many more moving parts than the first, and so has more possible permutations. If you shut your eyes and twist the sides around blindly once more, the odds of chancing upon a solved cube when you open them again are even slimmer - it is extremely unlikely that your cube will have anything but a totally random, disordered configuration. **A larger cube with more individual pieces has a greater tendency to become disordered**, simply because there are so **many more ways that it can be arranged**. For example, a simple 2x2 Rubik's cube has over 3.5 million possible permutations. A standard 3x3 cube has 43 quintillion combinations - that's roughly 43 followed by 18 zeros! However, a 4x4 cube trumps them all with a mind-blowing 7.4 quattuordecillion combinations^{1}. Ever heard of a number that large before? It is 74 followed by 44 zeros! But for each of those cubes, there is only one solved arrangement, and so the odds of randomly stumbling across that perfect combination decrease as the cube gets bigger.

Notice something? As time goes on, the cube goes from solved to randomly arranged, **from a state of order to disorder**. In addition, as the **number of moving pieces increases**, the **tendency to become more disordered increases** because the cube has a **larger number of possible arrangements**.

Let’s now relate this to entropy. Imagine that each sticker represents a certain particle and amount of energy. The energy starts off neatly **arranged** and **ordered**, but quickly becomes **randomly arranged** and **disordered**. The larger cube has more stickers, and so has more particles and units of energy. As a result, there are more possible configurations of stickers and **more possible arrangements of particles and their energy**. In fact, it is a lot easier for the particles to move away from that perfectly ordered arrangement. With each move away from the starting configuration, the particles and their energy become more and more randomly dispersed, and **more and more disordered**. This fits with our two definitions of entropy:

- The larger cube has a **higher number of possible arrangements of particles and their energy** than the smaller cube, and so has a **greater entropy**.
- The larger cube tends to be **more disordered** than the smaller cube, and so has a **greater entropy**.

## Properties of entropy

Now that we have a bit of an understanding of entropy, let’s look at some of its properties:

- Systems with a **higher number of particles** or **more units of energy** have a **greater entropy** because they have more **possible distributions**.
- **Gases have a greater entropy than solids** because their particles can move around much more freely and so have more possible ways of being arranged.
- **Increasing the temperature of a system** increases its entropy because you supply the particles with more energy.
- **More complex species** tend to have a **higher entropy** than simple species because they have more energy.
- **Isolated systems tend towards a greater entropy**. This is given to us by the **second law of thermodynamics**.
- **Increasing entropy increases the energetic stability of a system** because the energy is more evenly distributed.

## Units of entropy

What do you think the **units of entropy** are? We can work them out by considering what entropy depends on. We know that it is a measure of **energy**, and is affected by **temperature** and the **number of particles**. Therefore, entropy takes the units **J K^{-1} mol^{-1}**.

Note that unlike **enthalpy**, entropy uses **joules**, not **kilojoules**. This is because entropy changes are typically orders of magnitude smaller than enthalpy changes. Head over to **Enthalpy Changes** to find out more.

### Standard entropy

To compare entropy values, we often use entropy under **standard conditions**. These conditions are the same as the ones used for **standard enthalpies**:

- A temperature of **298 K**.
- A pressure of **100 kPa**.
- All species in their **standard states**.

Standard entropy is represented by the symbol **S°**.

## Entropy changes: definition and formula

Entropy cannot be measured directly. However, we can measure the **change in entropy (ΔS)**. We typically do this using standard entropy values, which have already been calculated and verified by scientists.

**Entropy change (ΔS)** measures the change in disorder caused by a reaction.

Each reaction first causes an **entropy change within the system** - that is, within the reacting particles themselves. For example, a solid might turn into two gases, which increases the system's entropy. If the system is **completely isolated**, this is the only entropy change that takes place. However, isolated systems don't exist in nature; they are **purely hypothetical**. Instead, reactions also affect the **entropy of their surroundings**. For example, a reaction might be exothermic and release energy, which increases the entropy of the surroundings.

We’ll start by looking at the formula for the **entropy change within a system** (commonly simply known as the **entropy change of a reaction**, or just **entropy change**), before taking a deep dive into the **entropy change of the surroundings** and the **total entropy change**.

Most exam boards only expect you to be able to calculate the** entropy change of a reaction**, not the surroundings. Check *your* specification to find out what is required of you from your examiners.

### Entropy change of reaction

The **entropy change of a reaction** (which, you’ll remember, is also called the **entropy change of the system**) measures the **difference in entropy between the products and the reactants in a reaction**. For example, imagine your reactant is the perfectly solved Rubik’s cube, and your product is a randomly arranged cube. The product has a **much higher entropy** than the reactant, and so there is a **positive entropy change**.

We work out the standard entropy change of reaction, represented by **ΔS°**_{system} or just **ΔS°**, using the following equation:

$$\Delta S^\circ = \sum {S^\circ}_{products}-\sum {S^\circ}_{reactants}$$

1) Don’t worry - you aren’t expected to remember standard entropy values! You’ll be provided with them in your exam.

2) For examples of entropy changes, including the chance to calculate them yourself, check out **Entropy Changes**.
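To make the formula concrete, here is a minimal Python sketch. The reaction N2 + 3H2 → 2NH3 and its standard entropy values are approximate data-book figures; the function and variable names are our own invention:

```python
# A sketch of DeltaS = sum(S, products) - sum(S, reactants).
# S values below are approximate data-book figures in J K^-1 mol^-1.
standard_entropies = {
    "N2": 191.6,
    "H2": 130.7,
    "NH3": 192.5,
}

def entropy_change(reactants, products, s_values):
    """Return the entropy change in J K^-1 mol^-1; each side is a {species: moles} dict."""
    side_total = lambda side: sum(moles * s_values[sp] for sp, moles in side.items())
    return side_total(products) - side_total(reactants)

delta_s = entropy_change({"N2": 1, "H2": 3}, {"NH3": 2}, standard_entropies)
print(round(delta_s, 1))  # negative: four gas molecules become two, so disorder falls
```

The result is about -198.7 J K^{-1} mol^{-1}, which matches the intuition that turning four gas molecules into two decreases disorder.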

### Predicting entropy changes of reaction

Let’s now see how we can use what we know about entropy to predict the possible entropy change of a reaction. This is a quick way to estimate entropy changes without doing any calculations. We predict the entropy change of a reaction by looking at its equation:

A **positive entropy change of reaction** means the entropy of the system **increases** and the products have a **higher** entropy than the reactants. This could be caused by:

- A **change of state** from **solid to liquid** or **liquid to gas**.
- An **increase in the number of molecules**. In particular, we look at the **number of gaseous molecules**.
- An **endothermic reaction** that takes in heat.

A **negative entropy change of reaction** means that the entropy of the system **decreases**, and the products have a **lower** entropy than the reactants. This could be caused by:

- A **change of state** from **gas to liquid** or **liquid to solid**.
- A **decrease in the number of molecules**. Once again, we look closely at the **number of gaseous molecules**.
- An **exothermic reaction** that releases heat.

### Entropy change of surroundings

In real life, reactions don’t just result in an entropy change within the **system** - they also cause an entropy change in the **surroundings**. This is because the system isn’t isolated, and the heat energy absorbed or released during the reaction affects the surrounding environment’s entropy. For example, if a reaction is **exothermic**, it releases heat energy, which heats up the environment and causes a **positive** entropy change in the surroundings. If a reaction is **endothermic**, it absorbs heat energy, cooling the environment and causing a **negative **entropy change in the surroundings.

We calculate the standard entropy change of surroundings using the following formula:

$${\Delta S^\circ}_{surroundings}=\frac{{-\Delta H^\circ}_{reaction}}{T}$$

Note that here, T is the temperature that the reaction takes place at, in K. For standard entropy changes, this is always 298 K. However, you can also measure *non-standard* entropy changes - just make sure you use the right value for temperature!
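As a rough illustration, this Python sketch applies the formula; the ΔH° value is made up for demonstration, not taken from a data book:

```python
# Sketch of DeltaS(surroundings) = -DeltaH(reaction) / T.
def entropy_change_surroundings(delta_h_kj, temperature_k):
    """DeltaH in kJ mol^-1, T in K; returns DeltaS(surroundings) in J K^-1 mol^-1."""
    delta_h_j = delta_h_kj * 1000  # convert kJ to J, since entropy uses joules
    return -delta_h_j / temperature_k

# A hypothetical exothermic reaction (DeltaH = -100 kJ mol^-1) at 298 K:
ds_surr = entropy_change_surroundings(-100, 298)
print(round(ds_surr, 1))  # positive: releasing heat increases the surroundings' entropy
```

Watch the unit conversion: ΔH° is given in kJ mol^{-1}, but entropy works in J, so the enthalpy must be multiplied by 1000 first.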

### Total entropy change

Lastly, let's consider one final entropy change: the **total entropy change**. Overall, it tells us whether a reaction causes an **increase in entropy** or a **decrease in entropy**, taking into consideration the entropy changes of both the **system** and the **surroundings**.

Here’s the formula:

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}+{\Delta S^\circ}_{surroundings}$$

Using the formula for the entropy change of the surroundings that we found out above:

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}-\frac{{\Delta H^\circ}_{reaction}}{T}$$

The total entropy change is very useful because it helps us predict whether a reaction is **feasible** or not. Don’t worry if you haven’t heard of this term before - we’ll visit it next.
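Putting the two formulas together, here is a short Python sketch; both input values are illustrative, not data-book figures:

```python
# Sketch of DeltaS(total) = DeltaS(system) - DeltaH(reaction) / T.
def total_entropy_change(ds_system, delta_h_kj, temperature_k):
    """ds_system in J K^-1 mol^-1, DeltaH in kJ mol^-1, T in K."""
    ds_surroundings = -delta_h_kj * 1000 / temperature_k  # convert kJ to J
    return ds_system + ds_surroundings

# A hypothetical exothermic reaction whose system entropy *decreases*:
ds_total = total_entropy_change(-50.0, -100.0, 298)
print(round(ds_total, 1))  # still positive overall, thanks to the surroundings
```

Even though the system's entropy falls by 50 J K^{-1} mol^{-1}, the heat released raises the surroundings' entropy by about 335.6 J K^{-1} mol^{-1}, so the total change is positive.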

## Entropy and feasible reactions

We learned earlier that, according to the **second law of thermodynamics**, isolated systems tend towards a **greater entropy**. We can therefore predict that reactions with a **positive entropy change** happen of their own accord; we call such reactions **feasible**.

**Feasible** (or **spontaneous**) reactions are reactions that take place **by themselves**.

But many feasible day-to-day reactions *don’t* have a positive entropy change. For example, rusting has a negative entropy change, and yet it is an everyday occurrence! How can we explain this?

Well, like we explained above, it is because natural chemical systems *aren’t* isolated. Instead, they interact with the world around them and so affect the entropy of their surroundings. For example, **exothermic reactions release heat energy**, which **increases** their surrounding environment’s entropy, whilst **endothermic reactions absorb heat energy**, which **decreases** their surrounding environment’s entropy. Whilst the *total* entropy always increases in a feasible change, the entropy of the *system* doesn’t necessarily increase, provided the entropy change of the *surroundings* makes up for it.

So, reactions with a positive total entropy change are **feasible**. From looking at how a reaction affects the entropy of its surroundings, we can see that feasibility depends on a few different factors:

- The **entropy change of the reaction**, **ΔS°** (also known as the **entropy change of the system**, or just the **entropy change**).
- The **enthalpy change of the reaction**, **ΔH°**.
- The **temperature** at which the reaction takes place, in K.

The three variables combine to make something called the **change in** **Gibbs free energy**.

**The change in Gibbs free energy (ΔG)** is a value that tells us about the feasibility of a reaction. For a reaction to be feasible (or spontaneous), ΔG must be negative.

Here’s the formula for the change in standard Gibbs free energy:

$$\Delta G^\circ={\Delta H^\circ}-T\Delta S^{\circ}$$

Like enthalpy, it takes the units kJ·mol^{-1}.

You can also calculate Gibbs free energy changes for *non-standard* reactions. Make sure to use the right value for temperature!
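The formula can be sketched in a few lines of Python; the ΔH° and ΔS° values below are made-up illustrative numbers:

```python
# Sketch of DeltaG = DeltaH - T * DeltaS, watching the J -> kJ conversion.
def gibbs_free_energy_change(delta_h_kj, temperature_k, ds_j):
    """DeltaH in kJ mol^-1, T in K, DeltaS in J K^-1 mol^-1; returns DeltaG in kJ mol^-1."""
    return delta_h_kj - temperature_k * ds_j / 1000  # /1000 converts J to kJ

# A strongly exothermic reaction with a negative entropy change:
dg = gibbs_free_energy_change(-100.0, 298, -50.0)
print(round(dg, 1))  # negative, so the reaction is feasible despite DeltaS < 0
```

Here ΔG° = -100 - 298 × (-0.050) = -85.1 kJ mol^{-1}: the large negative ΔH° outweighs the unfavourable entropy term, so the reaction is feasible.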

The change in Gibbs free energy explains why many reactions with negative entropy changes are spontaneous. **An extremely exothermic reaction with a negative entropy change can still be feasible**, provided the magnitude of ΔH is large enough to outweigh TΔS. This is why reactions such as rusting take place.

You can practice calculating ΔG in the article **Free Energy**. There, you’ll also see how temperature affects the feasibility of a reaction, and you’ll be able to have a go at finding the temperature at which a reaction becomes spontaneous.

Feasibility all depends on the **total entropy change**. According to the second law of thermodynamics, **isolated systems tend towards a greater entropy**, and so the total entropy change for feasible reactions is always **positive**. In contrast, the value of Gibbs free energy change for feasible reactions is always negative.

We now know how to find both total entropy change and the change in Gibbs free energy. Can we use one formula to derive the other?

$${\Delta S^\circ}_{total}={\Delta S^\circ}_{system}-\frac{{\Delta H^\circ}_{reaction}}{T}$$

Multiply by T:

$$T{\Delta S^\circ}_{total}=T{\Delta S^\circ}_{system}-{\Delta H^\circ}_{reaction}$$

Divide by -1, then rearrange:

$$-T{\Delta S^\circ}_{total}={\Delta H^\circ}_{reaction}-T{\Delta S^\circ}_{system}$$

The units of entropy are J K^{-1} mol^{-1}, whilst the units of Gibbs free energy are kJ mol^{-1}.

Therefore, once we convert the units from J to kJ:

$$\Delta G^\circ = -T{\Delta S^\circ}_{total}$$

The change in Gibbs free energy is simply the total entropy change multiplied by -T. We've successfully linked the two equations!
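We can check the derivation numerically with a short Python sketch, using the same made-up illustrative values as before:

```python
# Numerical check that DeltaG equals -T * DeltaS(total), after converting J to kJ.
T = 298          # temperature, K
dh = -100.0      # DeltaH(reaction), kJ mol^-1 (illustrative value)
ds_sys = -50.0   # DeltaS(system), J K^-1 mol^-1 (illustrative value)

ds_total = ds_sys - dh * 1000 / T   # total entropy change, J K^-1 mol^-1
dg = dh - T * ds_sys / 1000         # Gibbs free energy change, kJ mol^-1
print(abs(dg - (-T * ds_total / 1000)) < 1e-9)  # True: the two expressions agree
```

Whatever values you pick, the two routes give the same answer, because the equations are algebraically equivalent.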

## Entropy - Key takeaways

- **Entropy (S)** has two definitions:
  - Entropy is a measure of disorder in a system.
  - It is also the number of possible ways that particles and their energy can be distributed in a system.
- The **second law of thermodynamics** tells us that **isolated systems always tend towards a greater entropy**.
- **Standard entropy values (S°)** are measured under **standard conditions** of **298 K** and **100 kPa**, with all species in their **standard states**.
- The **standard entropy change of a reaction** (also known as the **entropy change of the system**, or just the **entropy change**) is given by the formula \(\Delta S^\circ = \sum {S^\circ}_{products}-\sum {S^\circ}_{reactants}\)
- **Feasible** (or **spontaneous**) reactions are reactions that take place of their own accord.
- The entropy change of a reaction isn’t enough to tell us if a reaction is feasible or not. We need to consider the **total entropy change**, which takes enthalpy change and temperature into account. This is given to us by the **change in Gibbs free energy (ΔG)**.
- **Standard Gibbs free energy change (ΔG°)** has the formula \(\Delta G^\circ={\Delta H^\circ}-T\Delta S^{\circ}\)

## References

- 'How Many Possible Rubik’s Cube Combinations Are There? - GoCube'. GoCube (29/05/2020)


##### Frequently Asked Questions about Entropy

**What is an example of entropy?**

An example of entropy is a solid dissolving in solution or a gas diffusing around a room.

**Is entropy a force?**

Entropy is not a force, but rather a measure of the disorder of a system. However, the second law of thermodynamics tells us that isolated systems tend towards a greater entropy, which is an observable phenomenon. For example, if you stir sugar into boiling water, you can visibly see the crystals dissolve. Because of this, some people like to say that there is an 'entropic force' causing systems to increase in entropy. However, 'entropic forces' are not underlying forces at an atomic scale!

**What does entropy mean?**

Entropy is a measure of disorder in a system. It is also the number of possible ways that particles and their energy can be distributed in a system.

**Can entropy ever decrease?**

The second law of thermodynamics says that isolated systems always tend towards a greater entropy. However, no natural systems are ever perfectly isolated. Therefore, the entropy of an open system *can* decrease. However, if you look at the total entropy change, which includes the entropy change of the system's surroundings, entropy always increases as a whole.

**How do you calculate entropy?**

You calculate the entropy change of a reaction (also known as the entropy change of the system, ΔS°_{system}, or just the entropy change, ΔS°) using the formula ΔS° = ΣS°_{products} - ΣS°_{reactants}.

You can also calculate the entropy change of the surroundings with the formula ΔS°_{surroundings} = -ΔH°/T.

Finally, you can work out the total entropy change caused by a reaction using the formula ΔS°_{total} = ΔS°_{system} + ΔS°_{surroundings}
