To account for increasingly large and complex systems, we must take an ensemble perspective.
1/n
Instead of thinking about what the system is going to do, start thinking about what the system could possibly do: what's the state space? What could it possibly do? What configurations could it possibly have? The complement of that is: which possibilities won't it manifest/actualise?
2/n
Emphasise the space of possibility
Observe and examine multiple contexts, the relation between the possible and the actual - whatever that means - there are many ways to do it.
3/n
There's some large space of possible things and somehow reality is only manifesting some subset/subspace of that.
4/n
The notion of variety is dead simple - counting the different/distinct states/elements that you observe some system showing.
5/n
Semaphore is still used, as a redundancy to communicate information.
Nice to have a backup in case things fail.
6/n
In semaphore the arms are equivalent
You do need to be careful counting variety - the resolution or resolving power of the observer comes into play and is pretty important.
7/n
Variety seems at first blush an almost objective property, but there is some observer dependence in the value of variety I get for a system.
8/n
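A minimal sketch of this observer dependence - the readings below are hypothetical, and "resolution" is modelled simply as the number of decimal places the observer can distinguish:

```python
# Variety = the number of distinct states an observer can tell apart.
# Hypothetical temperature readings from some system:
readings = [20.11, 20.14, 20.52, 20.48, 21.03, 20.13]

def variety(observations, decimals):
    """Count distinct states at a given observer resolution (decimal places)."""
    return len({round(x, decimals) for x in observations})

# A coarser observer sees fewer distinct states than a finer one:
print(variety(readings, 0))  # whole degrees only -> low variety
print(variety(readings, 2))  # full precision -> every reading distinct
```

The same observations yield a different variety depending on how finely the observer resolves them.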
There are instances where the properties of the thing and whole (set/space of possibilities) can be different from one another.
What's true for the whole can be false for the part or what's nonsense for one might not be for the other.
9/n
"England is a little weird"
10/n
Examples of the set vs the individual:
-Families have an average of 2.3 kids (in 1956)
-A tire going 80 km/h as a whole, but each piece of the wheel has its own trajectory and none is moving at exactly 80 km/h
11/n
Something that we see over and over in complex systems is that depending on the scope and resolution in some sense you get a different answer for what you're looking at.
12/n
Variety is crucial for communicating and carrying information.
The interplay and relationship between the instance (the message) and the set of possible messages is crucial.
13/n
The basic idea/natural instinct: if you shrink the possibility space, information can't be carried.
By contrast: as you have more possibilities you can attach those possibilities to different meanings & distinguish among them so you can then transfer information.
14/n
Variety is the reason something can carry information.
So variety and communication are tightly linked.
15/n
There need not be any inherent meaning in something but it can provide a structure on which to attach meaning and transmit it.
It's about having enough variety that you can transmit a message with some inherent nuance or complexity to it.
16/n
At the most basic level, if you can only say one thing, then a message or no message are the only possibilities.
17/n
But if you pull from a larger set of possibilities then that carries more information.
How large the possibility space is affects what a message actually means, and what information can be transmitted through it.
18/n
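A quick sketch of how the size of the possibility space sets the information one message can carry, assuming all messages are equiprobable:

```python
import math

# Information carried by one message drawn (equiprobably) from a set of
# possible messages: log2 of the size of the possibility space.
def capacity_bits(n_possible_messages):
    return math.log2(n_possible_messages)

print(capacity_bits(2))   # yes/no signal: 1 bit
print(capacity_bits(26))  # one letter of the alphabet: ~4.7 bits
```

With only one possible message the capacity is log2(1) = 0 bits - which is the "you can only say one thing" case above.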
The space of possibilities is interacting with the actuals and that gives the actuals a sort of extra quality that they don't have on their own.
19/n
When we can see that there's a difference between what's possible (the possibility space) and what is actual, we say that we observe a constraint on the system.
20/n
There can be internal and external impositions of constraints.
21/n
Another way to think about independence:
The components are independent if the total variety of the system (in bits) is the sum of the varieties of the components.
By implication, if the total is less, there is dependence.
22/n
Often in complex systems the constraint is something that the system itself is producing - some interaction among the components is reducing the number of possible states the system might take on.
23/n
For communication you can't get away from relational properties, those that involve an observer and a receiver.
24/n
For any given system with nontrivial interactions, you'll be able to observe the behaviour of the parts of the system and quickly discern that if you allowed those parts to vary independently, you would destroy some essential character of the system.
25/n
In some sense the constraints are what make it a system to begin with.
26/n
How we observe the system matters a lot: by varying the resolution you get different answers, and if you do that systematically you can look at the character of a system in different ways.
27/n
'Object' as constraint:
What we intuitively call an object is some kind of severe constraint on some systems.
(degrees of freedom = variety in continuous space.)
28/n
A much more robust way to look at things: is the variety of the whole (whatever it might be) less than the combined variety of the components?
29/n
You can't find everything through correlation coefficients.
Not everything is a binary dependency.
30/n
Surprisal: making a metric
Intuitively we're surprised by things when they're less probable.
The lower the probability the greater the surprise.
31/n
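Surprisal can be sketched in a couple of lines - the standard definition, with example probabilities chosen for illustration:

```python
import math

# Surprisal: the less probable an outcome, the more surprising it is.
def surprisal(p):
    return -math.log2(p)

print(surprisal(0.5))    # 1 bit: a fair coin flip
print(surprisal(0.125))  # 3 bits: a 1-in-8 event
```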
Entropy:
The average of surprisal across all the possibilities.
-A measure of 'disorder'
-How much you don't know before you look or
-How much you learn when you do look
32/n
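A minimal sketch of entropy as surprisal averaged over all the possibilities:

```python
import math

# Entropy: surprisal, probability-weighted, over every possibility.
def entropy(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1 bit
print(entropy([0.25] * 4))  # fair 4-sided die: 2 bits
print(entropy([1.0]))       # certain outcome: 0 bits, nothing to learn
```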
Joe experiences a high entropy event.
33/n
"It's like negative numbers on google sheets."
(NameError: Mary missing?)
(We are Guinea pigs)
34/n
Entropy gets you every time.
35/n
If things are 50/50 I learn a full bit.
As I become more certain of an outcome I learn less on average.
36/n
When things aren't equiprobable you have to have a weighted average.
37/n
Where the possibilities are evenly spread you get maximum entropy, because you know the least before you look; as you become biased towards one outcome or another there's less entropy, because you learn less on average.
38/n
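A sketch of the weighted average for a biased coin, showing entropy peaking at 50/50 and shrinking as one outcome becomes near-certain:

```python
import math

# Weighted-average entropy of a biased coin: maximal at 50/50,
# shrinking as the outcome becomes more certain.
def coin_entropy(p):
    return sum(-q * math.log2(q) for q in (p, 1 - p) if q > 0)

for p in (0.5, 0.7, 0.9, 0.99):
    print(f"P(heads)={p}: {coin_entropy(p):.3f} bits")
```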
When we are talking about self-organisation we are talking about systems that decrease in entropy.
39/n
A pattern is the same thing as a constraint, it's the same thing as redundancy, compressibility etc.
Same idea wrapped up in different words and slightly different angles.
Interactions self-impose constraints on the state space of the system.
40/n
Bit parity system:
If the variety of the whole < the sum of the varieties of the parts, there are constraints - the parts are not independent.
Or if the entropy of the whole is < the sum of the entropies of the parts, again not independent.
41/n
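A toy sketch of the parity idea: the third bit is fixed by the first two (c = a XOR b), so the whole has less entropy than the sum of its parts:

```python
import math
from itertools import product
from collections import Counter

# Toy 3-bit parity system: bits a and b are free, c = a XOR b.
states = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]

def entropy(counts):
    total = sum(counts)
    return sum(-c / total * math.log2(c / total) for c in counts if c > 0)

# Entropy of the whole: 4 equiprobable joint states -> 2 bits.
whole = entropy(Counter(states).values())

# Each part alone looks like a fair coin -> 1 bit each, 3 bits in total.
parts = sum(entropy(Counter(s[i] for s in states).values()) for i in range(3))

print(whole, parts)  # whole (2.0) < sum of parts (3.0): a constraint
```

The gap between the two numbers is exactly the constraint the parity interaction imposes.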
This is more general than ideas like correlation (which is a pairwise thing and depends on the magnitude).
It is also a more general statement about dependency structures, one that can be deployed much more widely.
(so a powerful idea)
42/n
If something is not independent:
You can learn something about one part of the system by knowing enough about another part.
43/n
Complexity paradox - the whole being less than the sum of its parts seems at odds with "the whole is greater than the sum of its parts".
When we talk in terms of entropy/information/variety it switches: the whole is less than the sum of its parts for complex systems (or systems with dependencies).
44/n
Another 'seeming' paradox:
Computational complexity - how long is the shortest program I can write that will reproduce this system?
How long does the description need to be?
(For perfect randomness you can't shorten it.)
45/n
For perfect randomness you can't - the truly random has no pattern; it's not compressible or ordered.
In a random scenario the program would have to be as long as the string itself.
In terms of computational complexity, perfectly independent randomness is maximally complex.
46/n
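A rough sketch of this, using zlib compression as a practical stand-in for "shortest description" (with a fixed seed so the illustration is reproducible):

```python
import random
import zlib

# Compressibility as a proxy for description length:
# an ordered string compresses well, a random one barely at all.
random.seed(42)  # fixed seed so the sketch is reproducible

ordered = b"ab" * 500                                      # 1000 bytes of pattern
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # 1000 random bytes

print(len(zlib.compress(ordered)))  # tiny: the pattern is the "program"
print(len(zlib.compress(noisy)))    # ~1000: no pattern to exploit
```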
This is basically the opposite of what we've been saying that when things are independent they are the very antithesis of complexity.
47/n
Complexity is about things in interaction - things being non-independent - being interdependent.
Whereas now, that which seems most complex - at least in terms of computational complexity - is that which is perfectly random, with lots of independent random components.
48/n
Is this a complex system or a simple system?
Solution:
Depends on my resolution.
49/n
Multiscale variety: (Assuming fixed scope)
Instead of looking at variety and entropy at one resolution instead look at many.
50/n
Simplicity: how does it look across scales?
Two kinds of simplicity:
-everything independent and random
-everything doing the same thing.
-The first maintains its variety over scales less: it washes out as you coarse-grain.
-The second maintains over scales more: the coherence survives coarse-graining.
51/n
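A toy sketch (hypothetical setup) contrasting the two kinds of simplicity under coarse-graining - here "scale" is just the block size over which components are averaged:

```python
import random

# Multiscale variety sketch: coarse-grain a system by block-averaging
# its components and count distinct coarse states over many observations.
random.seed(1)
N, samples = 64, 200

def variety_at_scale(configs, block):
    """Distinct coarse-grained states when averaging over blocks of `block`."""
    coarse = set()
    for c in configs:
        blocks = tuple(round(sum(c[i:i + block]) / block, 1)
                       for i in range(0, N, block))
        coarse.add(blocks)
    return len(coarse)

# Independent randomness: every component flips its own coin.
indep = [[random.choice([0, 1]) for _ in range(N)] for _ in range(samples)]
# Total coherence: every component copies one shared coin flip.
coherent = [[s] * N for s in (random.choice([0, 1]) for _ in range(samples))]

for block in (1, 16, 64):
    print(block, variety_at_scale(indep, block), variety_at_scale(coherent, block))
# Random variety washes out as blocks grow; coherent variety stays at 2.
```

The random system has enormous variety at the finest scale and almost none at the coarsest; the coherent system has the same small variety at every scale.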
Scale is the objective size of resolution at which the thing can act.
52/n
This is a place in complex systems science where scientists are constantly talking past one another due to terminology issues, people assume a resolution and it gets treated as self-evident.
53/n
Total independence - spec ops, immune system
Total coherence - tank battalion, musculoskeletal system
Multiscale variety - truly complex - military, your body.
54/n
The amount of variety a system has at a given scale is really crucial for it to behave effectively in a given environment.
There are many important implications for how systems function.
55/n
If you have a system of a given scope there is a trade-off curve: if you want to scale the behaviour of the system up, maybe you lose fine grain (via a constraint) in order to gain scale.
56/n
There are really important trade-offs, but if you need to gain in one dimension without losing in another then you need to add mass - add scope.
57/thread
Determinism can only take us so far - whether it's a fundamental "randomness" or an epistemological limitation doesn't really matter; probabilistic processes become necessary to start dealing with systems & thinking about their future paths/trajectories/possible trajectories.
1/n
Epistemological limitations like chaotic dynamics or computational irreducibility introduce uncertainty about the future states of a system.
2/n
At a certain point in time it has to have a paradigm shift or it doesn't work anymore:
Different materials, different tools, etc. - just different
45/n
Something has got to give; something has got to be different.
A whale can grow so large compared to an elephant because it is in (and has to be in) the ocean.
46/n
Instead maybe try a smallish system, then run a new smallish system in a similar way, e.g. cells.
Sometimes duplication allows things to evolve differently.
47/n
There are tools that people bring to bear that become selective mechanisms over the possible objects of study and so we get this hugely biased sample of what we consider normal/regular/typical.
2/n