“You should never be surprised by or feel the need to explain why any physical system is in a high entropy state.”

― Brian Greene, *The Fabric of the Cosmos*

**Adhyayam – 2**

In the previous part, I gave a brief introduction to Artificial Intelligence and where we currently stand in the field.

Heuristics are problem-specific techniques for finding solutions. They are a major part of the algorithms that instruct machines to solve problems and make decisions.

Entropy is the name people came up with to describe why nothing is 100% perfect or 100% efficient. Technically, entropy counts the number of ways stuff can be *arranged*. And orderliness is not a universal notion, is it? What seems ordered to one person can be disordered to another. So in my own formulation, an ordered system is one with a clear relationship among its various parameters or components. Suppose I throw 10 red balls into a box and jiggle them. After jiggling, the balls bear no consistent relationship to one another; there is no perfect generic rule describing their positions. But I can also arrange the same balls deliberately so that they follow some pattern, even if not a completely generic one. We can all agree that this arranged state is more ordered than the jiggled one. Now I pose a question: how many ways are there to achieve such a pattern with the given system, and how many messy ways are there?

For the sake of a simple argument, it is reasonable to assume that each time we jiggle the box we get a new disordered arrangement of the balls, and that I can jiggle the box 'n' times, producing a new arrangement every time, whereas the balls can be arranged into recognisable patterns only a certain number of times, far fewer than 'n'. Thus, the more ways a system can be arranged, the higher its entropy.
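To make the counting concrete, here is a toy calculation. The 5×5 grid and the choice of exactly one "patterned" arrangement are my own illustrative assumptions, not from the post; the point is only that the messy count dwarfs the ordered count, and that entropy grows with the count of arrangements.

```python
import math

# Toy model of the box of balls: 10 balls placed on a 5x5 grid of
# cells, one ball per cell. Treat "ordered" as the single arrangement
# that fills the top two rows; every other placement is "messy".
total = math.comb(25, 10)        # all ways to choose 10 occupied cells
ordered = 1                      # one designated patterned arrangement
messy = total - ordered

# Boltzmann-style entropy S = ln(W), with the constant set to 1:
# more arrangements means higher entropy.
s_ordered = math.log(ordered)    # ln(1) = 0
s_messy = math.log(messy)

print(total, s_ordered, round(s_messy, 2))
```

The messy arrangements outnumber the single ordered one by more than three million to one, so the "messy" macrostate carries essentially all of the entropy.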

A common misconception is that entropy is a measure of disorder, but it is not. It is true that for most systems the number of disordered states is much higher than the number of ordered states, so higher entropy usually means a more disordered state; this is what leads to the misconception. But consider the converse: if we build a system in which the number of ordered states is much higher than the number of disordered states, then high entropy would imply a highly ordered system. This is perfectly true, and scientists have recently come up with such systems. They are beyond the scope of this post, but I shall try to put up a post about them a little later.

After this brief description of entropy, we shall come back to AI. In most two-player games, the most favourable moves are the ones that leave the player with the maximum number of moves on the subsequent turn compared to the opponent. If a move yields 10 possible replies for your opponent but 15 moves for you on the next turn, it is a good move. Logically, it implies that there are more ways for you to move than for your opponent, giving you control of the game, and a series of controlled moves greatly improves your chances of winning. This is a general strategy followed by most two-player game algorithms. The most efficient one, to me, is the alpha-beta pruning algorithm, which follows a similar logic. The general statement is 'to maximise the diversity of future possible states'.
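The move-counting idea above can be sketched as a "mobility" heuristic plugged into alpha-beta search. This is a minimal sketch, not the exact algorithm of any real engine: the game hooks `moves` and `apply_move` below describe a made-up token-pile game chosen purely for illustration, and in a real Othello or chess engine you would supply those functions for the actual rules.

```python
# Mobility heuristic: my legal moves minus my opponent's, exactly the
# "more ways for you to move than your opponent" idea from the text.
def mobility(state, player, moves):
    return len(moves(state, player)) - len(moves(state, -player))

def alphabeta(state, depth, alpha, beta, player, moves, apply_move):
    """Negamax-style alpha-beta: best achievable score for `player`."""
    legal = moves(state, player)
    if depth == 0 or not legal:
        return mobility(state, player, moves)
    best = float("-inf")
    for m in legal:
        # The opponent's best score is the negation of ours.
        score = -alphabeta(apply_move(state, m), depth - 1,
                           -beta, -alpha, -player, moves, apply_move)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:  # prune: the opponent would never allow this line
            break
    return best

# Hypothetical toy game: a pile of `state` tokens; player +1 may remove
# 1 or 2 tokens per turn, player -1 may remove only 1 (an artificial
# asymmetry so that mobility actually favours one side).
def moves(state, player):
    if state <= 0:
        return []
    return [1, 2] if (player == 1 and state >= 2) else [1]

def apply_move(state, m):
    return state - m

# Player +1 searching 2 plies ahead from a pile of 4 tokens.
print(alphabeta(4, 2, float("-inf"), float("inf"), 1, moves, apply_move))
```

The pruning step is what makes alpha-beta efficient: whole subtrees are skipped once it is clear the opponent already has a better alternative elsewhere.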

In the above example, the number of moves is one heuristic for evaluating the *worthiness* of a particular move. It is used in games like Othello and chess. There can be many such heuristics, depending on the need and the accuracy required. Recently, people have used entropy itself as such a heuristic for making a system intelligent.

Having linked all the terms in the title, we shall proceed to the explanation. Alex Wissner-Gross proposed that entropy can be used as such a heuristic. There is a law in thermodynamics which I believe to be **the** most fundamental law in the whole universe: 'The entropy of a spontaneous process always increases in forward time.' Here, we need not worry about *thermodynamic spontaneity*; just get the idea. In our previous example, each time we jiggle the box of balls, the system has more arrangements available to it than before. This is the implication of the entropy law.

Now we see the connection between entropy and AI algorithms: both are about increasing the number of possible states. That is what AI needs, and that is exactly what entropy provides. Simply elegant!! Alex argues that entropy is the "underlying mechanism of intelligence". He describes a force that drives systems to increase their entropy (called the entropic force) as the required heuristic for AI.
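The next paragraph refers to an equation that appears to be missing here (it was likely an image in the original post). In Wissner-Gross and Freer's formulation of the causal entropic force, it takes the form:

```latex
F(X_0, \tau) = T_c \, \nabla_X \, S_c(X, \tau) \big|_{X_0}
```

where \(S_c\) is the entropy of the set of possible future paths of the system over the time horizon \(\tau\), and \(T_c\) is a "temperature" setting the strength of the force.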

The above equation states that the force F seeks to maximise the diversity of possible future states S with a strength T up to a given time horizon τ (tau). Alex and his colleagues built a software engine called 'Entropica' which simulates and visualises the above equation for a given system. The main characteristic of Entropica is that only the system and the equation are fed in, and the program by itself performs intelligent actions without any explicit goal or instruction, which is amazing and arguably a remarkably high level of intelligence. This is clearly demonstrated in the video below:

It is amazing how seemingly unrelated topics converge and produce an astonishing result. If entropy can produce intelligent behaviour in systems on its own, is it also the reason behind human intelligence? This is a fascinating question to investigate, and it might also provide answers as to why other animals are not as intelligent as humans.

Like, comment and share!!

Links & References:

Alex’s page: http://www.alexwg.org/

Entropica: http://www.entropica.com/

Alex’s TED talk: http://www.ted.com/talks/alex_wissner_gross_a_new_equation_for_intelligence

http://phys.org/news/2013-04-emergence-complex-behaviors-causal-entropic.html

http://botscene.net/2013/05/11/entropica-claims-powerful-new-kind-of-ai/

http://www.insidescience.org/content/physicist-proposes-new-way-think-about-intelligence/987