Information, Entropy, Complexity

Original question

If information is defined as 'the amount of newness introduced' or 'the amount of surprise involved', then chaotic behaviour implies maximum information and newness. Systems showing periodic or oscillating behaviour are said to 'freeze': nothing new emerges from them. New structure or patterns emerge from systems whose behaviour is just shy of chaos (the edge of chaos), not from systems showing either chaotic or oscillating behaviour. What, for lack of a better word, is the role of information in this emergent behaviour of complex adaptive systems (cas)?

Characterizing cas

Cas are generally associated with complex behaviour, which in turn is associated with emergent behaviour: the forming of observable patterns that are new to the system and not programmed into its constituent parts. Mechanically, cas are associated with large systems of complicated make-up, consisting of many hierarchically organised components whose interconnections are non-linear. These 'architectural' conditions are, however, not a sine qua non for complex behaviour: a system built this way may well not behave as described above, and may for that reason not be categorised as a cas, though it might become one if its parameter space is changed by some event at some point in time. Lastly, system behaviour is associated with energy usage (or cost), with entropy production and with information. However, confusion exists about how to perform the measurements and how to interpret their outcomes, and no conclusive definition exists for the meaning of any of the above. In other words: to date, to my knowledge, none of these properties, when derived from a cas, gives a decisive answer to the question whether the system at hand is in fact complex.

The statements above are obviously self-referential, unclear and inconclusive. It would be useful to have an objective principle by which to decide whether a given system shows complex behaviour and is therefore to be classified as a cas. The same goes for clear definitions of the terms energy, entropy (production) and information in this context, and for the relationships of these properties between themselves and between them and the presumed system characteristics. This would enable an observer to identify characteristics such as newness, surprise, reduction of uncertainty, meaning and information content, and their change.

Entropy and information

It appears to me (no more than that) that entropy and information are two sides of the same coin, or in my words: not separated within the system, but aspects of the same system at the same time, so to speak back to back, simultaneously influencing the mechanics (the interrelations of the constituent parts) and the dynamics (the interactions of the parts leading to overall behavioural change of the system over time). What is the 'role' of information when a cas changes, and how does it relate to the properties mentioned above?

The relation between information and entropy might then be this: structures, patterns and algorithms distributed in a cas enable it, in the long run, to increase its relative fitness by reducing the cost of the energy used in its 'daily activities'. The cost of energy is part of the fitness function of the agent, and stored information allows it to act 'fit'. Structures and information in a cas are distributed: the patterns are properties of the system as a whole and not of individual parts. Measurements must therefore lead to a system-level characteristic (i.e. overall, not stopping at individual agents) to get a picture of the learning/informational capacity of the entire cas as a 'hive'. This requires correlation between the interactions of the parts, allowing the system to 'organise itself'.
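One way to make 'correlation between the interactions of the parts' concrete is the mutual information shared by two parts: zero for independent parts, and up to a full bit per step for binary parts in lockstep. A minimal sketch, assuming each part's history has been recorded as a symbol sequence (the sequences here are illustrative):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X;Y) in bits from two equal-length symbol sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    # I(X;Y) = sum over joint outcomes of p(x,y) * log2(p(x,y) / (p(x)p(y)))
    return sum(c / n * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Two parts in lockstep share one full bit per step; an unvarying part shares none.
a = [0, 1] * 50
print(mutual_information(a, a))          # 1.0
print(mutual_information(a, [0] * 100))  # 0.0
```

Summing such pairwise terms over the parts would give one candidate system-level (rather than per-agent) characteristic.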

CAS as a TM

I suspect (no more than that) that it is in general possible to treat a cas as a Turing machine (TM) 'disguised' in some shape or, conversely, to treat complex adaptive systems as instances of a TM. That approach makes the logic of TMs available to the observer. Examples for which this classification is proven are the one-dimensional cellular automaton rule 110, a Wolfram class 4 rule, and the two-dimensional Game of Life. These limited proofs restrict the general applicability, because complex adaptive systems, unlike TMs in all aspects, are parallel, open and asynchronous.
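The proven case can be made concrete: Matthew Cook's proof of universality concerns elementary rule 110. A minimal sketch of one synchronous update step of that rule follows (the actual encoding of a TM into rule 110 configurations is far beyond a few lines; boundary handling is an illustrative choice):

```python
def rule110_step(cells):
    """One synchronous update of elementary CA rule 110, boundaries fixed at 0."""
    n = len(cells)
    out = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        # The 3-cell neighbourhood indexes one bit of the rule number 110.
        pattern = (left << 2) | (cells[i] << 1) | right
        out[i] = (110 >> pattern) & 1
    return out

# A single 1 at the right edge grows the characteristic class 4 structures leftward.
row = [0] * 31 + [1]
for _ in range(5):
    print(''.join('#' if c else '.' for c in row))
    row = rule110_step(row)
```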

Illustration

Perhaps illustrative of a possible outcome is to 'walk the process' of the logistic map by changing the parameter mu (misusing it, because no real complexity lives there). Start at the right, in the chaotic region: newness (reduction of uncertainty, surprise, information) is large, the bits are very many, but meaning (as in emerging patterns) is small. Travel left to any oscillating region: newness is small, bits are very few, meaning is small. In between, where the behaviour is complex: newness is high, the bits are fewer than in the chaotic region, and meaning is very high.
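This walk can be tried numerically. The sketch below assumes a crude symbolisation of the orbit (1 if x > 0.5, else 0) and uses the Shannon entropy of short blocks as a stand-in for 'newness'; the parameter values, block length and run lengths are illustrative choices, not canonical ones:

```python
import math
from collections import Counter

def logistic_bits(mu, n=5000, burn=500, x=0.3):
    """Iterate x -> mu*x*(1-x), discard a transient, symbolise at x > 0.5."""
    bits = []
    for i in range(burn + n):
        x = mu * x * (1 - x)
        if i >= burn:
            bits.append(1 if x > 0.5 else 0)
    return bits

def block_entropy(bits, k=4):
    """Shannon entropy, in bits, of the length-k blocks occurring in the sequence."""
    blocks = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return sum(c / n * math.log2(n / c) for c in counts.values())

# Walk mu: oscillating region, near the Feigenbaum point, fully chaotic.
for mu in (3.2, 3.5699456, 4.0):
    print(mu, round(block_entropy(logistic_bits(mu)), 3))
```

Under this symbolisation the oscillating region scores near zero, the chaotic region near the maximum, and the edge-of-chaos value in between, matching the walk described above.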

The logical underpinning of 'newness' or 'surprise' is this: if no bit in a sequence can be predicted from the rest of that sequence, the sequence is random. Each bit is then a 'surprise', or 'new', and the amount of information is highest. If a bit can be predicted, there is a pattern: an algorithm can be designed and, provided that algorithm is shorter than the sequence it describes, the surprise is less, as is the amount of information. The more pattern a sequence holds, the less surprise, and the more information appears to be stored 'for later use', such as the processing of a new external signal that the system has to deal with. What we observe in a cas is patterns, and so a limitation of this 'surprise'.
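A rough way to see this is to use compressed size as a proxy for the length of the shortest describing algorithm: a patterned sequence compresses far below its raw length, while a random one barely compresses at all. A sketch using zlib (a practical stand-in, not a true measure of algorithmic information):

```python
import random
import zlib

def compressed_size(bits):
    """zlib-compressed length in bytes: a rough proxy for algorithmic information."""
    return len(zlib.compress(bytes(bits), 9))

random.seed(1)
patterned = [0, 1] * 500                             # one short rule describes it all
rand = [random.randint(0, 1) for _ in range(1000)]   # no bit predictable from the rest

print(compressed_size(patterned))  # a handful of bytes
print(compressed_size(rand))       # on the order of 1000 bits, i.e. 125+ bytes
```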

A research project

I suggest the objective of such a project is to design and test meaningful measurements of the entropy production, energy cost and information processing of a complex adaptive system, so as to relate them to each other and to the system properties of a cas, in order to better recognise and understand them.

The suggested approach is to use a 2-dimensional CA structure, parameterised to show complex behaviour as per Wolfram class 4, as described in 'A New Kind of Science' by Stephen Wolfram.

The actual experiment is then to use this system to solve well-defined problems. As the theoretical characteristics of a TM (its processing and its storage) are known, this approach provides a reference for the information-processing and information-storage requirements, which can be compared to the actual processing and storage capacities of the system at hand.

Promising measurements are:

Measurement: Entropy
Description: the current state related to the possible states
Using: Gibbs entropy or similar

Measurement: Energy cost
Description: the theoretical energy cost required to solve a particular problem versus the energy the complex adaptive system at hand actually uses
Using: the slide inserted below, from the presentation e-mailed earlier: https://www.youtube.com/watch?v=9_TMMKeNxO0#t=649

[Screenshot from 2015-06-09 12:56:03]

Measurement: Information
Description: from an earlier discussion: "Using this approach, we could experimentally compute the bits of information that agents have learned resulting from the introduction of new information into the system." I suggest to add: "... compute the bits of information that agents have learned relating to the system ...": that subset of the information stored distributed in the system which represents the collective aspect of the system, i.e. distributed collective information; the amount of information contained in the co-evolving interfaces of the agents or parts of the system, equivalent to the labels suggested by Holland.
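For the entropy row, a minimal sketch of a Gibbs-type (Shannon) entropy estimated from observed system states, relating the states actually seen to the states possible; encoding each snapshot as a tuple of cell values is an illustrative assumption, not a prescribed method:

```python
import math
from collections import Counter

def shannon_entropy(states):
    """H = sum p * log2(1/p) over observed state frequencies (Gibbs form, k_B = 1, base 2)."""
    counts = Counter(states)
    n = len(states)
    return sum(c / n * math.log2(n / c) for c in counts.values())

# Hypothetical snapshots of a small system, each state encoded as a tuple of cell values.
frozen = [(0, 0, 0, 0)] * 8                              # one state ever seen: H = 0
spread = [(i % 2, i // 2 % 2, 0, 0) for i in range(8)]   # 4 equiprobable states: H = 2
print(shannon_entropy(frozen), shannon_entropy(spread))  # 0.0 2.0
```

A 'frozen' oscillating system scores near zero; the more of its possible states a system actually visits, and the more evenly, the higher the score.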

Published by

DP

Complexity Scientist