The spectacular comeback of US president-elect Donald Trump has taken the world by surprise. No doubt people can point to various explanations for his election victory, but in my view, the science of information will pave the way towards deeper insights. Unless the Democrats – and their counterparts around the world – can develop a better understanding of how people receive and reject information, they will never fully understand what happened or successfully fight elections in the future.
There is a fundamental law of nature, known in physical science as the second law of thermodynamics. This says that, over time, noise will overwhelm information and uncertainties will dominate. Order will be swamped by confusion and chaos. From a single particle to the whole universe, every system known to science obeys this law. That includes political systems, or societies.
Whenever there is progress in communication technology, people circulate more and more inessential or inaccurate information. In a political system, this is what leads to the noise domination described by the second law.
In science, the quantity that measures the degree of uncertainty is known as entropy. The second law therefore says that entropy can only increase, at least on average.
While entropy does not decrease spontaneously, it is possible to reduce it by spending energy – that is, at a cost. This is exactly what life is about – we create internal order, thus reducing entropy, by consuming energy in the form of food.
For a biological system to survive, it has to reduce uncertainties about the state of its environment. So there are two opposing trends: we don’t like uncertainties and try to reduce them. But we live in a world dominated by growing uncertainties. Understanding the balance of these two forces holds the key to appreciating some of the most perplexing social phenomena – such as why people would vote for a man who has been convicted of multiple crimes and strongly signalled his autocratic tendencies.
The world is filled with uncertainties, and information technology is amplifying that uncertainty at an incredible pace. The development of AI is only accelerating this increase, and will continue to do so at an unimaginable scale.
In the unregulated wild west of the internet, tech giants have created a monster that feeds us with noise and uncertainty. The result is rapidly growing entropy – there is a sense of disorder at every turn.
Each of us, as a biological system, has the desire to reduce this entropy. That is why, for example, we instinctively avoid information sources that are not aligned with our views. They will create uncertainties. If you are a liberal or leftwing voter and have found yourself avoiding the news after Trump’s re-election, it’s probably linked to your desire to minimise entropy.
The Need for Certainty
People are often puzzled about why societies are becoming more polarised and information is becoming more segmented. The answer is simple – the internet, social media, AI and smartphones are pumping out entropy at a rate unseen in the history of Earth. No biological system has ever encountered such a challenge – even if it is a self-imposed one. Drastic actions are required to regain certainties, even if they are false certainties.
Trump has grasped the fact that people need certainty. He repeatedly offered words of reassurance – “I will fix it”. Whether he will is a more complex question, but thinking about that will only generate uncertainties – so it’s better avoided. The Democrats, in contrast, merely offered the assurance of a status quo of prolonged uncertainties.
Whereas Trump declared he would end the war in Gaza, Kamala Harris remarked that she would do everything in her power to bring an end to the war. But the Biden-Harris administration has been doing exactly that for some time, with little progress to show for it.
Whereas Trump declared he would end the war in Ukraine, Harris remarked that she would stand up against Putin. But the Biden-Harris administration has been merely sending weapons to Ukraine to prolong the war. If that is what “standing up against Putin” means, then most Americans would rather see the fall in grocery prices that an end to the war could bring.
Harris argued that Trump is a fascist. This may prove to be true, but what that means exactly is unclear to most Americans.
While Harris’s campaign message of hope was a good initiative, the Democrats failed to deliver certainty and assurance. By the same token, they failed to control the information space. Above all, they failed the American people because, while Trump may well bring an end to the wars in Ukraine and Gaza in some form, his climate policy will be detrimental to all Americans, with lasting impacts.
Without understanding the science of information, the blame game currently underway will not get the Democrats anywhere. And there are lessons to be learned for other centre-left governments, such as the UK’s Labour government.
It is not entirely inconceivable that the former prime minister Boris Johnson, encouraged by the events in the US, hopes for a dramatic return to power at the next general election. If so, prime minister Keir Starmer must find a way to avoid following in the footsteps of Biden and Harris. He must provide people with certainty and assurance.
____
1 These entries from a Reddit thread seemed helpful:
You can call it “disorder” to a layman, but it’s not very precise.
Entropy is a measure of how many “possibilities” (microstates, i.e. atomic configurations) correspond to something you can “see” (macrostate, i.e. temperature, pressure, etc).
I’ll use a world made of Legos as an example.
A single 2×4 yellow Lego piece has no entropy. It is exactly what you see.
A one-layer sheet of Legos has no entropy either, because you can see them all. The exception is if you cannot see the seams between the pieces. There is entropy in that case because you don’t know if it was a 2×4 or two 2×2s.
A large block of Legos has entropy because all you can see is the outside.
Mathematically, entropy is S = k*ln(W), where ln is the natural log, W is the number of possible microstates that fit with the given macrostate, and k is a constant conversion factor, depending on what type of entropy you are talking about.
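To see how that formula behaves, here is a minimal Python sketch of my own (not part of the thread); the microstate counts for the Lego examples are illustrative assumptions, not anything measured.

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

    def boltzmann_entropy(num_microstates):
        """Entropy S = k * ln(W) for W equally likely microstates."""
        return K_B * math.log(num_microstates)

    # A single visible Lego piece: one microstate, so zero entropy.
    print(boltzmann_entropy(1))        # 0.0

    # A seamless patch that could be one 2x4 or two 2x2s: W = 2.
    print(boltzmann_entropy(2))        # ~9.57e-24 J/K

    # A large block whose interior is hidden: W chosen arbitrarily large here.
    print(boltzmann_entropy(10**24))   # ~7.6e-22 J/K

Dropping the constant k and switching to a base-2 logarithm gives the same quantity measured in bits, which is how it reappears later in the thread.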
And:
Entropy is a convenient way to describe the state function of a system, which measures the number of ways you can rearrange a system and have it look “the same” (for some value of “the same”). The problem in thermodynamics is that you have a large-scale description of a system (like, say, a steam engine or a heat engine), and physics (particle collision theory) that describes systems like that in exquisite, impossible-to-measure detail. You want to extract the large scale physics from the system – how will it evolve on large, observable scales? (For example, will the steam condense, or will some mercury in contact with the system expand or contract?).
The state function is very useful in cases like that, because it tells you something about how well you understand the condition of the system. The state function is a measure of the number of different ways you could rearrange the unobservably small parts of your system (the water molecules in the steam boiler, for example) and still have it match your macroscopic observations (or hypothetical predictions). That is useful because you can use the state function to calculate, in a broad way, how the system is most likely to evolve, without actually cataloguing each of the myriad states it might be in and assigning a probability to each.
Entropy is just the logarithm of the state function. It’s more useful because then, instead of dealing with a number of order 10^1000, you’re dealing with a number of order 1000. Incidentally, the reason entropy tends to increase is that there are simply more ways to be in a high-entropy state. Many, many more ways, since entropy is a logarithm of a huge number to begin with. So if there’s roughly equal probability of a system evolving in each of many different ways, it’s vastly more likely to end up in a state you would call “high entropy” than one you would call “low entropy”.
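A quick way to convince yourself of that counting argument is to tally microstates for a toy system; the coin-toss example below is my own illustration, not the commenter’s.

    from math import comb, log

    n = 100  # number of coin tosses

    # Microstates (distinct sequences) per macrostate (total number of heads).
    all_heads = comb(n, n)        # exactly 1 sequence
    half_heads = comb(n, n // 2)  # about 1.01e29 sequences

    print(all_heads, half_heads)

    # Entropy (up to a constant) is the log of the microstate count, so the
    # balanced macrostate is overwhelmingly more likely to be observed.
    print(log(half_heads) - log(all_heads))   # ~66.8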
Thermodynamically, the reason it takes energy to reduce entropy of a system is that you have to overcome the random excitation of each portion of the system to force it into a known state. Since you don’t know what state the system started in (otherwise its entropy would already be low, since you would have enough knowledge to reduce the value of the state function), you have to waste some energy that wouldn’t technically be needed if you knew more about the system, pushing certain particles (you don’t know in advance which ones) that are already going in the correct direction for your entropy reducing operation.
Maxwell’s Daemon is a hypothetical omniscient gnome who can reduce entropy without wasting any energy, by sorting particles on-the-fly. But with the advent of quantum mechanics we know that knowledge always has an energy cost, and a hypothetical Maxwell’s Daemon couldn’t measure which particles to sort where, without spending some energy to get that knowledge. So Maxwell’s Daemon turns out to require just as much energy to reduce entropy as would any normal physicist.
Anyway, entropy is closely related both to physics and to information theory, since it measures the amount of knowledge (or, more accurately, amount of ignorance) you have about a system. Since you can catalog S^n different states with a string of n symbols out of an alphabet of size S (for example, 2^n different numbers with a string of n bits), the length of a symbol string (or piece of memory) in information theory is analogous to entropy in a physical system. Physical entropy measures, in a sense, the number of bits you would need to fully describe the system given the macroscopic knowledge you already have.
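That bit-counting claim is easy to check directly; the alphabet size and string length below are arbitrary choices of mine for illustration.

    from math import log2

    S = 26  # alphabet size (say, lowercase letters)
    n = 5   # length of the symbol string

    num_states = S ** n          # distinct strings you can write
    print(num_states)            # 11881376

    # Information needed to single out one string, in bits:
    print(n * log2(S))           # ~23.5
    print(log2(num_states))      # identical, since log2(S**n) = n * log2(S)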
Incidentally, in the 19th century, entropy was only determined up to an additive constant because nobody knew where the “small limit” was in the state function, and therefore where the 0 was on the entropy scale. After the advent of quantum mechanics, we learned where the 0 is — pure quantum states have 0 entropy, because a system in a pure quantum state has only one physical state available to it, and the logarithm (base anything) of 1 is 0.