Determinism can only take us so far - whether it's fundamental "randomness" or an epistemological limitation doesn't really matter: probabilistic processes become necessary once we start dealing with systems & thinking about their possible future trajectories.
1/n
Epistemological limitations like chaotic dynamics or computational irreducibility introduce uncertainty about the future states of a system.
2/n
In general - we just have to make room for uncertainty in our models
3/n
There are really different levels of uncertainty.
'Uncertainty' is an ambiguous word - 'tame' uncertainty is where we can still say something definitive about what we are going to experience.
4/n
In the real world there are obviously different degrees of uncertainty: some things are deeply, deeply uncertain, while others are known within some bounds. They're not certain, but we can carve out some space & say: here is the space of possible futures.
5/n
One of the really important things in moving from deterministic processes to stochastic processes: once uncertainty is introduced, rather than thinking about a single trajectory, you have to start thinking about ensembles.
6/n
Think not just one trajectory but the space of all possible trajectories for a system or for all agents in a system.
7/n
Axioms of probability
8/n
The canonical stochastic process is the Bernoulli process.
A key part is that the process involves independent events.
(again very hard to be intentionally random)
9/n
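A Bernoulli process is easy to sketch in code - a minimal illustration (parameters arbitrary), where independence of trials is the key property:

```python
import random

def bernoulli_process(p, n, seed=0):
    """n independent Bernoulli(p) trials: 1 = success, 0 = failure.
    Independence is the key property - no trial affects any other."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

trials = bernoulli_process(0.5, 10_000)
print(sum(trials) / len(trials))  # close to 0.5 by the law of large numbers
```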
Markov Property:
The state transition depends ONLY on the current state of the system and no prior history.
10/n
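A minimal sketch of the Markov property (states & probabilities invented): the next state is sampled from a table indexed only by the current state, never by the history.

```python
import random

# Transition table indexed ONLY by the current state - that's the Markov
# property: the history of how you got here carries no extra information.
P = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state given only the current one."""
    r, acc = rng.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against floating-point rounding

rng = random.Random(1)
state, path = "sunny", ["sunny"]
for _ in range(5):
    state = step(state, rng)
    path.append(state)
print(path)
```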
Again, a deep assumption that's baked into most/all of physics: if you have the state of a system & a dynamical law for the system, then you don't need any other information - everything happening next is a consequence of what's happening now.
11/n
Drunkard's Walk
There is a state space & some possible next states - which one you go to is somehow random.
The simplest way is with a Bernoulli process.
12/n
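A sketch of the drunkard's walk driven by a fair Bernoulli process (±1 step sizes assumed):

```python
import random

def random_walk(n_steps, seed=0):
    """1-D drunkard's walk: each step is +1 or -1 from a fair Bernoulli trial."""
    rng = random.Random(seed)
    pos, path = 0, [0]
    for _ in range(n_steps):
        pos += 1 if rng.random() < 0.5 else -1
        path.append(pos)
    return path

path = random_walk(100)
print(path[-1])  # the endpoint is itself random
```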
Aggregation of independent steps:
There are more ways to find yourself in the middle than there are at the edges.
The centre is more densely populated than the periphery.
13/n
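The counting argument can be made exact: an endpoint reached by k down-steps out of n has C(n, k) paths leading to it, so the centre is far more populated than the edges. A small check (n = 10 chosen arbitrarily):

```python
from math import comb

n = 10  # number of ±1 steps (arbitrary)
# Endpoint n - 2k is reached by choosing which k of the n steps go down:
# C(n, k) distinct paths. The centre gets the most, the edges the fewest.
counts = {n - 2 * k: comb(n, k) for k in range(n + 1)}
print(counts[0], counts[10], counts[-10])  # 252 1 1
```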
Space (ensemble) averages (ergodicity) vs time (trajectory) averages (non-ergodicity)
Time average itself is not converging, it behaves like a random variable.
14/n
You know what you're going to get as you space average.
You don't know what you're going to get as you time average.
15/n
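A sketch of the contrast (all sizes arbitrary): averaging the final position across many independent walks is well-behaved, while the time average along one trajectory is itself a random variable.

```python
import random

def walk_positions(n, rng):
    """Positions visited by a simple ±1 random walk of n steps."""
    pos, out = 0, []
    for _ in range(n):
        pos += 1 if rng.random() < 0.5 else -1
        out.append(pos)
    return out

rng = random.Random(42)
n, walkers = 1000, 500

# Ensemble (space) average: mean final position over many independent walks.
finals = [walk_positions(n, rng)[-1] for _ in range(walkers)]
ensemble_avg = sum(finals) / walkers  # concentrates near 0

# Time average along ONE trajectory: differs from walk to walk and
# does not settle down as n grows - it behaves like a random variable.
time_avg_1 = sum(walk_positions(n, rng)) / n
time_avg_2 = sum(walk_positions(n, rng)) / n
print(round(ensemble_avg, 2), round(time_avg_1, 2), round(time_avg_2, 2))
```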
Statistical self-similarity: wherever you happen to start, choose some subsection of the system & it behaves the same for however long you watch the system evolve.
Such as the same bell-curve pattern spreading out over time.
Bell curves inside of bell curves.
16/n
More intuitively, a trajectory has a look to it - rough, not too dissimilar from a fractal - the same intuitive feel when looking at any subsection of the trajectory.
17/n
If you didn't have the axes to tell you the frame of reference, you wouldn't know which of two graphs was bigger - they both have the same kind of feel to them; they have a statistical self-similarity.
18/n
A simple random walk is clearly bounded at the smallest scales by its discreteness, so the self-similarity only shows up with enough steps; its endpoints follow a binomial distribution.
(Brownian motion is the rough equivalent in continuous space.)
19/n
That's the idea of statistical self-similarity.
You don't know - is this a mountain or a molehill?
20/n
There are random walks in more dimensions - e.g. Lévy flight
You can have random walks in arbitrary numbers of dimensions.
21/n
If you have a random walk with fat-tails you start to get different kinds of characteristics to your random walk.
22/n
The Lévy flight starts to give some intuitive sense of how you can have random walks with these larger tails (steps larger than +1/-1, obviously).
23/n
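A 1-D caricature of a Lévy-style walk (not the full model - just Pareto-distributed step lengths with random sign, alpha chosen arbitrarily):

```python
import random

def fat_tailed_walk(n, alpha=1.5, seed=0):
    """Random walk whose step lengths are Pareto(alpha) distributed.
    Smaller alpha = fatter tail = rare huge jumps that dominate the path."""
    rng = random.Random(seed)
    pos, path = 0.0, [0.0]
    for _ in range(n):
        length = rng.paretovariate(alpha)  # always >= 1, heavy right tail
        pos += length if rng.random() < 0.5 else -length
        path.append(pos)
    return path

path = fat_tailed_walk(1000)
steps = [abs(b - a) for a, b in zip(path, path[1:])]
print(max(steps))  # typically a handful of jumps dwarf all the others
```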
Some minimal context Joe quotes:
"...obviously there's no tail there's no nothing."
"These are all fat tails by the way but fatter and fatter as we go down."
(Joe likes big tails?)
24/n
It can be a single or small number of large extreme events that dominate the evolution of the system.
25/n
You might spend centuries going around and then the asteroid hits.
26/n
In fat-tailed distributions those extreme events really dominate where you end up.
In the thin-tailed distribution there are no large events that can dominate where you're at.
It's all an aggregation of these very small events that puts you where you are.
27/n
The birth of J. W. Norman meteorology LLC
29/n
This evolves to a long-run distribution that's a stable equilibrium & completely disconnected from the here & now.
I don't need to know the state now to know that, far enough into the future, these are the probabilities of a given state on an arbitrary day.
30/n
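A sketch with a made-up two-state chain: push any starting distribution through the transition matrix & it converges to the same long-run (stationary) distribution, regardless of the here & now.

```python
def evolve(dist, P, steps):
    """Push a probability distribution through transition matrix P repeatedly."""
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

# Toy weather chain (numbers invented): row = current state, column = next.
P = [[0.8, 0.2],   # sunny -> sunny / rainy
     [0.4, 0.6]]   # rainy -> sunny / rainy

from_sunny = evolve([1.0, 0.0], P, 100)
from_rainy = evolve([0.0, 1.0], P, 100)
print([round(x, 4) for x in from_sunny])  # [0.6667, 0.3333] from either start
```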
But then factor in an asteroid.
The asteroid sucks up all the probability over a long enough time scale.
31/n
There's not much of a chance of getting an asteroid, but over time it sucks up all the probability of the other states.
In a long time horizon there's a probability 1 that you end up in an absorbing state.
32/n
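A sketch with a toy three-state chain and a tiny invented "asteroid" probability: the absorbing state slowly soaks up all the probability mass, however small its per-step chance.

```python
def evolve(dist, P, steps):
    """Push a probability distribution through transition matrix P repeatedly."""
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

eps = 0.001  # tiny per-step chance of catastrophe (invented)
P = [[0.8 - eps, 0.2, eps],    # sunny
     [0.4, 0.6 - eps, eps],    # rainy
     [0.0, 0.0, 1.0]]          # asteroid: transitions only to itself (absorbing)

for steps in (100, 1000, 10_000):
    dist = evolve([1.0, 0.0, 0.0], P, steps)
    print(steps, round(dist[2], 3))  # mass in the absorbing state -> 1
```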
When it doesn't matter which state you're in - this is a special kind of chain:
an ergodic chain
33/n
A more general definition of ergodicity
Aperiodic & positive recurrent
(Some stochastic element, & for every state in the system, the system will return to it in a finite expected amount of time.)
34/n
That which is nonergodic is not positive recurrent.
(You won't return to every state)
35/n
A simple random walk is recurrent but not positive recurrent - it returns with probability 1, but the expected return time is infinite.
36/n
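A quick empirical sketch (sample sizes arbitrary, excursions capped for runtime): the walk does return to 0, but the return-time distribution is so heavy-tailed that its mean blows up - exactly "recurrent but not positive recurrent".

```python
import random

def return_time(rng, cap=100_000):
    """Steps until a simple ±1 walk first returns to 0 (capped for runtime)."""
    pos, t = 0, 0
    while t < cap:
        pos += 1 if rng.random() < 0.5 else -1
        t += 1
        if pos == 0:
            return t
    return cap  # excursion still hadn't returned after `cap` steps

rng = random.Random(7)
times = sorted(return_time(rng) for _ in range(1000))
print(times[len(times) // 2], times[-1])
# Typical returns are quick, but rare enormous excursions dominate the mean.
```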
*Redacted*
Secret knowledge (sorry lindy table)
"This is esoteric stuff"
37/n
Attractor states can be cycles - ergodic,
or nonergodic
38/n
The essence of ergodicity is that you have your set of states and at some point you revisit all of them (and you never stop visiting them).
39/n
Ergodicity can be a positive quality if you think about it in terms of, say, the economic class you are in - maybe it should be that people visit all the classes, & getting into one class doesn't mean you never get out of it.
40/n
If you go broke you don't get stuck there. If you become a billionaire, you don't necessarily get stuck there.
You have risk you still have to face.
41/n
Both ergodicity & nonergodicity involve kinds of attractors.
But the difference is in an ergodic attractor you are still visiting all the other states.
When there's this absorbing state you just get eventually pulled into that totally & can't visit other states at all.
42/n
This also pulls together the idea that these ergodic chains evolve to some stationary distribution - but that doesn't mean the dynamics that underlie the system (the microdynamics) have stopped.
43/n
The agents can all still be moving even if the greater system finds an equilibrium.
44/n
Scale dependent observation:
Two levels we can look at are the microdynamics (in this case stochastic), where things are moving around,
&
the macrostate, where things can look static.
45/n
Related is how you build a reliable system with unreliable components, say a transistor.
The electrical microdynamics are stochastic - sometimes the connection is made, sometimes it's not - but with enough components acting independently, reliably some proportion lets the current flow.
46/n
Further points:
Self-limiting systems can come back to a state that is not extreme.
47/n
An absorbing state concentrates the system into a smaller number of states.
48/n
Extremes need something to be (obviously) bigger than some other thing.
49/n
Nonergodic chains don't always have to have an absorbing state.
But if you have an absorbing state you definitely have a nonergodic chain.
50/n
Uses the school to job/trade/profession/uni example.
"Just making stuff up here - maybe a plumber sometimes moonlights as a carpenter"
"you go to college, you become a professional nerd, and sometimes an economist"
51/n
Where you end up depends on where you started.
52/n
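A made-up career chain sketching this: trades mix with trades & academics with academics, but there's no path between the blocks, so the long-run distribution depends entirely on the starting state (a reducible, nonergodic chain).

```python
def evolve(dist, P, steps):
    """Push a probability distribution through transition matrix P repeatedly."""
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

# States (invented): 0 = plumber, 1 = carpenter, 2 = economist, 3 = professor.
# The two blocks never communicate, so the chain is reducible.
P = [[0.7, 0.3, 0.0, 0.0],
     [0.5, 0.5, 0.0, 0.0],
     [0.0, 0.0, 0.6, 0.4],
     [0.0, 0.0, 0.2, 0.8]]

print([round(x, 2) for x in evolve([1, 0, 0, 0], P, 200)])  # stays in the trades
print([round(x, 2) for x in evolve([0, 0, 1, 0], P, 200)])  # stays in academia
```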
Most things we are interested in are actually non-ergodic.
53/n
Again that's this general/special inversion where things have been treated by default as ergodic and this is exactly what @ole_b_peters entire project is about.
54/n
Things are treated by default as if they're ergodic, but it turns out that in the real world it's a very special situation to be ergodic and very general to be nonergodic.
55/n
Many things don't revisit all the possible states of the system.
Look at the distribution of careers - do I spend 10% of my time being a plumber, 10% being a programmer, 10% being a school teacher, or do I tend to not travel through all those different careers?
56/n
Ergodicity is a very special condition that generally isn't met and it's met in systems that are often highly contrived.
57/n
Or: this all comes out of looking at things like ideal gases, where each individual particle is independently doing its thing - everything's kind of the same, there's no difference between things.
58/n
Once you have a difference between things - this is an epithelial cell, this is a neuron - very nonergodic.
It's not like a cell spends half its time being a skin cell and half its time being a neuron - it's got a pathway, it's got a history.
Nonergodicity is the norm.
59/n
The space average being different than the time average is actually a special case of this more general idea where ergodicity is this visiting of all states and seeing them again and again vs not.
60/n
Preferential attachment is a stochastic process where things are not independent events, but interdependent events.
61/n
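The simplest sketch of interdependent draws is a Pólya urn (a stand-in for preferential attachment; numbers arbitrary): each draw changes the odds of every later draw.

```python
import random

def polya_urn(n_draws, seed=0):
    """Polya urn: draw a colour with probability proportional to its count,
    then add another ball of that colour. The rich get richer - draws are
    interdependent, not independent."""
    rng = random.Random(seed)
    counts = {"red": 1, "blue": 1}
    for _ in range(n_draws):
        total = counts["red"] + counts["blue"]
        colour = "red" if rng.random() < counts["red"] / total else "blue"
        counts[colour] += 1
    return counts

print(polya_urn(1000))  # often ends far from 50/50 - early luck compounds
```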
Per Bak sandpile model:
Power law behaviour
Avalanches
Self-organised criticality
(and in real life avalanches you also observe these power laws.)
62/n
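A toy 1-D caricature of the sandpile (not the real 2-D Bak-Tang-Wiesenfeld model; threshold & sizes invented): most grain drops do nothing, while a few trigger cascades of topplings.

```python
import random

def sandpile_1d(n_sites, n_grains, seed=0):
    """Drop grains at random sites; any site holding 2+ grains topples,
    sending one grain to each neighbour (grains fall off the ends).
    Returns the number of topplings each drop triggered."""
    rng = random.Random(seed)
    h = [0] * n_sites
    avalanches = []
    for _ in range(n_grains):
        site = rng.randrange(n_sites)
        h[site] += 1
        topples, stack = 0, [site]
        while stack:
            i = stack.pop()
            while h[i] >= 2:           # topple until this site is stable
                h[i] -= 2
                topples += 1
                for j in (i - 1, i + 1):
                    if 0 <= j < n_sites:
                        h[j] += 1
                        stack.append(j)
        avalanches.append(topples)
    return avalanches

sizes = sandpile_1d(30, 3000)
print(sorted(sizes)[-5:])  # a few drops set off disproportionately large avalanches
```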
If any agent in the system doesn't visit all the states it's not ergodic.
A system with a lot of mobility is closer to ergodic
63/n
Differences between Nonergodicity/hysteresis/path dependence:
All related thematically but they're just different with respect to the formal details that are manifesting these properties.
64/n
It's not just an average of all things that helps you understand the single thing.
There's some more local considerations.
65/n
With ergodicity we're talking about sort of stochastic chains/branches of behaviours, and whether you are visiting one or another (or not).
66/n
With path dependence we're often talking about the order over which things happened (order of transformations).
Does it matter if I iron my shirt and then put it through the wash, or if I wash my shirt and then iron it?
67/n
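A trivial sketch of non-commuting operations (the shirt model is invented): apply the same two transformations in different orders & you end in different states.

```python
def wash(shirt):
    """Washing wrinkles the shirt."""
    return {**shirt, "wrinkled": True}

def iron(shirt):
    """Ironing smooths whatever state the shirt is currently in."""
    return {**shirt, "wrinkled": False}

shirt = {"wrinkled": True}
print(iron(wash(shirt)))  # wash then iron -> {'wrinkled': False}
print(wash(iron(shirt)))  # iron then wash -> {'wrinkled': True}
```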
Hysteresis is a deterministic phenomenon - a weak-memory kind of property where the outcome depends on where you were in the state-space.
You can't necessarily just reverse the parameters back and get back to the same attractor.
68/n
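A minimal deterministic sketch of hysteresis (thermostat-style relay; thresholds invented): in the overlap zone the output depends on where the system has been, so sweeping the parameter back doesn't retrace the same curve.

```python
def relay(x, state, on_at=0.7, off_at=0.3):
    """Switch on above on_at, off below off_at; in between, keep the
    previous state - a weak memory of where you were in state-space."""
    if x > on_at:
        return True
    if x < off_at:
        return False
    return state

state, outputs = False, {}
for x in [0.0, 0.5, 1.0, 0.5, 0.0]:   # sweep the parameter up, then back down
    state = relay(x, state)
    outputs.setdefault(x, []).append(state)
print(outputs[0.5])  # [False, True]: same parameter, different state per path
```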
You might consider an ergodic equilibrium a kind of robust phenomenon: if you perturb the system somehow, you'll still evolve to that same equilibrium - there's one basin of attraction & you always end up back in the same place.
69/n
Something antifragile is very much nonergodic - it's more like a path-dependent issue, where you start two different systems out, the two trajectories go to different places & hence have different kinds of exposures/experiences/etc., & they end up very different from one another.
70/n
Think of evolution: as far as we understand, we all share a common ancestor, even w/ cats, but you don't spend 1/2 your time a cat & 1/2 a person. It's hugely nonergodic: species developed unique traits tied w/ antifragility as a response to certain kinds of disturbance over time.
71/n
As for how you might structure a system: one should never claim in isolation that any of these properties is desirable or undesirable - it all depends on what you're trying to do.
72/n
If you have even one absorbing state then you are by default nonergodic.
You might have more than one absorbing state.
73/thread