Proximal constraints are due to properties of the ‘algorithm’: i.e. the strength of various other evolutionary forces (drift, entropic drive, etc.), population structure, standing genetic variation, etc. [2/n]
Ultimate constraints are due exclusively to the ‘problem’: i.e. the fitness landscape itself (a combination of natural selection and the notion of ‘proximity’). [3/n]
A local peak stopping evolution from finding a global peak is a classical example of an ultimate constraint. The historicity constraint.
But it is only partial: it prevents finding some optima (like the global one) by trapping us at other local optima. [4/n]
The paper's goal is to show a full ultimate constraint: one that prevents finding any local optimum -- even the lowest-fitness one. Thus, fitter mutants are 'always' available, even though the landscape is finite. This would give us a foundational explanation for open-ended evolution [5/n]
To achieve this, I introduce the distinction between:
Easy fitness landscapes where evolution can be proved to find a local peak in polynomial time, and
Hard fitness landscapes where evolution cannot find _any_ local fitness peak in polynomial time. [6/n]
There are three main kinds (and their rotations by relabeling) of epistasis on two loci: no epistasis (or magnitude epistasis), sign epistasis, & reciprocal sign epistasis (RSE). [7/n]
Reciprocal sign epistasis is req'd for ruggedness/multi-peak. (1st proved by Poelwijk et al.: sciencedirect.com/science/articl… & diff. proof by me).
So I define landscapes as ‘semi-smooth’ if they have no RSE (but sign epistasis is possible). [8/n]
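As a minimal sketch of this classification (with hypothetical fitness values, not from the paper): the three kinds of two-locus epistasis can be told apart from just the four genotype fitnesses.

```python
# Sketch: classify epistasis on a two-locus, biallelic landscape.
# The fitness values below are hypothetical, for illustration only.

def classify_epistasis(f):
    """Classify epistasis between two loci from the four genotype fitnesses."""
    # Fitness effect of mutating locus 1, on each background of locus 2
    d1 = [f[(1, 0)] - f[(0, 0)], f[(1, 1)] - f[(0, 1)]]
    # Fitness effect of mutating locus 2, on each background of locus 1
    d2 = [f[(0, 1)] - f[(0, 0)], f[(1, 1)] - f[(1, 0)]]
    sign_flip_1 = d1[0] * d1[1] < 0  # sign epistasis at locus 1
    sign_flip_2 = d2[0] * d2[1] < 0  # sign epistasis at locus 2
    if sign_flip_1 and sign_flip_2:
        return "reciprocal sign epistasis"
    if sign_flip_1 or sign_flip_2:
        return "sign epistasis"
    return "no/magnitude epistasis"

# RSE: both single mutants are deleterious, but the double mutant is
# fittest -- giving two peaks, at 00 and 11.
rse = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 1.2}
print(classify_epistasis(rse))  # -> reciprocal sign epistasis
```

Note how the RSE example already shows why RSE is tied to ruggedness: 00 and 11 are both local peaks.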
To connect to #cstheory, I show that semi-smooth fitness landscapes are equivalent to the acyclic unique-sink orientations of polytopes (AUSOs) that are used in combinatorial optimization, especially in the analysis of simplex algorithms. [9/n]
This allowed me to prove that in semi-smooth fitness landscapes, a minimal-length adaptive path always exists to the unique fitness peak.
But this short adaptive path can be hard to spot among the long adaptive paths. [10/n]
For example, if our model of evolution is the fittest-mutant strong-selection weak-mutation (SSWM) dynamic then I can (recursively) construct explicit semi-smooth landscapes where the dynamics will take an exponential number of steps (i.e. be trapped on a long path). [11/n]
If you want to construct your own hard semi-smooth landscapes, just apply the below recursion starting with any smooth fitness landscape. [12/n]
We can also show hardness for random fitter-mutant SSWM dynamics. The construction is more complicated, but the correspondence to AUSOs lets me use existing results about simplex algorithms (from Matousek & Szabo: sciencedirect.com/science/articl…) to get an exponential lower bound on length [13/n].
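The fittest-mutant SSWM dynamic itself is simple to sketch. Here is a minimal version on a toy additive landscape (the hard recursive construction from the paper is not reproduced here):

```python
# Sketch: fittest-mutant SSWM dynamics on a landscape over binary strings.
# The fitness function below is a toy; hard instances come from the paper's
# recursive construction, which is not reproduced here.

def sswm_walk(genotype, fitness):
    """Follow the fittest fitter one-mutant neighbour until at a local peak."""
    path = [genotype]
    while True:
        neighbours = [
            genotype[:i] + (1 - genotype[i],) + genotype[i + 1:]
            for i in range(len(genotype))
        ]
        best = max(neighbours, key=fitness)
        if fitness(best) <= fitness(genotype):
            return path  # no fitter mutant: genotype is a local peak
        genotype = best
        path.append(genotype)

# Toy smooth landscape: fitness counts 1s, so the walk climbs to all-ones.
f = lambda g: sum(g)
path = sswm_walk((0, 0, 0), f)
print(len(path) - 1)  # number of adaptive steps -> 3
```

On a smooth landscape like this toy one, the walk reaches the peak in at most N steps; the hardness results say that on some semi-smooth landscapes the same dynamic takes exponentially many.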
But real fitness landscapes are believed to be rugged/multi-peaked (e.g. see landscape from pnas.org/content/106/29…): more complicated than semi-smooth!
For rugged landscapes, we can prove even more surprising results.
So I'll continue 2nd half of the paper after sleep [14/n].
With the new day, let's continue the thread with the 2nd half of the paper & be more concrete.
For rugged landscapes, I consider a popular model of fitness landscapes where the amount of epistasis can be tuned: the NK-model introduced in 1987 by Kauffman & Levin [15/n]: sciencedirect.com/science/articl…
I show that the NK-model is PLS-complete for K = 2 or higher: it is as hard to find a local fitness peak in these landscapes as for any polynomial local search problem.
This is analogous to NP-completeness: PLS-completeness is to local peaks as NP-completeness is to global peaks. [16/n]
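A minimal sketch of an NK-landscape evaluator, with random interaction neighbourhoods and contribution tables (illustrative only; the PLS-hardness result uses specifically constructed instances with K = 2):

```python
import itertools
import random

# Sketch of the NK-model: N loci, each locus's fitness contribution depends
# on its own allele plus K others; total fitness is the average contribution.
# Neighbourhoods and contribution tables are random here, for illustration.

def make_nk_landscape(n, k, rng):
    # Each locus interacts with itself and k random other loci
    neighbourhoods = [
        [i] + rng.sample([j for j in range(n) if j != i], k) for i in range(n)
    ]
    # A random fitness contribution for each local allele configuration
    tables = [
        {bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
        for _ in range(n)
    ]
    def fitness(genotype):
        return sum(
            tables[i][tuple(genotype[j] for j in neighbourhoods[i])]
            for i in range(n)
        ) / n
    return fitness

rng = random.Random(42)
f = make_nk_landscape(n=6, k=2, rng=rng)
print(f((0, 1, 0, 1, 0, 1)))  # some value in [0, 1]
```

K = 0 gives a smooth additive landscape; the result above says that already at K = 2 finding a local peak is as hard as any polynomial local search problem.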
This means (because the reduction has a certain form) that there are fitness landscapes and initial genotypes where every single adaptive path to any local peak is of exponential length: i.e. the landscape is hard for all imaginable adaptive dynamics. [17/n]
But this goes beyond adaptive dynamics!
Given standard #cstheory conjectures (PLS != FP): there exists no algorithm that can find a local fitness peak in polynomial time. Even if evolution goes through valleys, jumps around, etc. – it still won’t find any local peak. [18/n]
On these hardest fitness landscapes, evolution will be in perpetual maladaptive disequilibrium.
The population will always have nearby adaptive mutants available. [19/n]
The Hankshaw effect can maintain cooperation for-effectively-ever on hard landscapes without using just-in-time env. change. [20/n]
Similarly w/ Baldwin effect & costly learning. By being away from local fitness peak, learning can remain adaptive even if it carries small start-up fitness cost.
So if we're looking for hard landscapes among animals, ones with cooperation & learning are good candidates [21/n].
But why should we care about local peaks? Maybe we just want approximate local peaks. After all, biology is probably too messy for exact peaks anyway.
#cstheory can help us here, too. Through the study of approximation algorithms. [22/n]
Evolution can find approximate peaks for moderate approx. factors (time polynomial in 1/s) but not for small factors (impossible for time poly. in ln(1/s)).
This lets us reason about the decay in the selection coefficient over evolutionary trajectories & fitness traces. [23/n]
I show: on hard landscapes, selection coefficients decay as a power law, but can’t decay as fast as an exponential.
Is there a single simple conclusion from all of this? Probably not.
But I would like to leave with one final message.
We should not imagine hard fitness landscapes as mountain ranges. Instead, we should imagine them as winding mazes: genetics.org/content/early/… [25/25]
If you prefer audio/visual recaps: my presentation of the preliminary version of this work (focused on the CS) at the @SimonsInstitute in 2014 is available online under the old title of "Complexity of Evolutionary Equilibria in Static Fitness Landscapes":
I tried to find some 'historic' roots in recent debates in economic literature. Or as @PabloRedux has eloquently summarized the Q: an inefficient-market hypothesis?
My post on local maxima and the fallacy of jumping to fixed-points has been pretty well received on /r/philosophy. This is surprising: reddit.com/r/philosophy/c…
There is some interesting discussion in the thread. Some of it adds important points & some identifies common misreadings. Fun!
After a day at the top of /r/philosophy, the post & associated attention has now fallen off. Some enjoyable discussion was had along the way.
It seems many view cstheory results about some field X of natural science as part of the 'complexity science' challenge to X. I don't.
Maylin & I had a competition for who'd make a better cover image. Neither of ours was selected, but hers made it to top of the carousel!
If you want to learn more about computational constraints on evolution or algorithmic biology more generally & are in Oxford this evening then I am giving an hour-long introductory talk at @CompSciOxford for @oxcompsoc at 19:00 (in LTA, Wolfson Building): facebook.com/events/2599467…
Unfortunately, my lecture introducing algorithmic biology wasn't recorded.
If you're in Philadelphia this week & want to hear about computational complexity as an ultimate constraint on evolution (or measuring evolutionary games in cancer) then I'm giving an overview talk at 12:00 Monday & technical talk at 14:00 Thursday in Lynch Lab 318.
Do stop by!
Common Q after people read computational complexity as an ultimate constraint: are hard fitness landscapes typical?
Elimination of mandatory retirement at 65 is a factor leading to bad academic job market: sciencemag.org/careers/2018/1…
Solution: semi-retirement.
"[Having waited to semi-retire until] 74, I in essence removed 9 years from someone else’s career. I should have stepped aside sooner."
I'd be interested to see a graph of average age of retirement for tenured academics on the same plot as the average age of starting a tenured job over the last decades.
I should probably find Larson's paper and see if that sort of data is there.
Unfortunately, it's not very big on data, so it doesn't have the figure I want. The closest is their simulation of the number of years as faculty in the lower figure.
Late last week, Dave Cohen, Pete Jeavons & I updated our paper on what we can learn about easy vs hard fitness landscapes by studying structure of gene-interaction networks (VCSP instances) that represent them: arxiv.org/abs/1907.01218
Now with many more pictures; prettier, too!
I've already made a thread about the previous version of this work that was published in CP2019:
But if there is interest then I could provide some tweets on the new results and changes in this version.
The final version of our paper looking at the kinds of structure of gene-interaction networks that guarantee easy fitness landscapes is now out: arxiv.org/abs/1907.01218…
It should appear in JAIR soon, and I think I'll do an updated thread on some of the new results then.