1/ FASCINATING new theory on DREAMING from observing a deep neural net
but 1st––some of the most cutting-edge + interesting research on how 🧠🧠 work is inspired by observing modern 💻💻 + algorithms
in every era there's a theory of the brain running parallel to that era's tech
2/ Go back to Descartes, who thought the 🧠 worked like hydraulic pumps ⛽️––the new tech of his era
3/ Freud looked to the tech of his time to describe the mechanics of the brain––the steam engine
4/ More recent analogies have been to the brain as a computer––which notably inspired lots of AI research, specifically the early work on neural nets which lost and regained favor over the decades
5/ Then we have had the analogy of the brain as an internet––with islands of functional groups interconnected
6/ All models are wrong––some of them are useful.
Insights from trying to understand our internal human SENSES, PERCEPTION, SPEECH, VISION, HEARING, MEMORY have all led to embodied technologies
9/ Like the "memory––prediction" framework and the computational layer between them
that ingests reality, builds models + predictions of patterns it expects to see later, then updates those models against 'reality' (just as robots/machine-vision systems do)
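That ingest/predict/update cycle can be sketched in a few lines. Everything here is illustrative––the one-number "model" and the learning rate `lr` are toy assumptions, not part of any specific memory–prediction framework:

```python
def memory_prediction_loop(observations, lr=0.3):
    """Toy predict-then-update loop: the 'model' is one number.

    At each step we predict the next observation from memory,
    measure the surprise (prediction error), then nudge the model
    toward reality––the same ingest/predict/update cycle the
    thread describes. lr is an illustrative learning rate.
    """
    model = observations[0]
    errors = []
    for obs in observations[1:]:
        prediction = model          # predict what we expect to see
        error = obs - prediction    # compare prediction to reality
        model += lr * error         # update the internal model
        errors.append(abs(error))
    return model, errors

# the world suddenly changes from 1.0 to 5.0: surprise spikes,
# then shrinks as the model adapts to the new regime
model, errors = memory_prediction_loop([1.0, 1.0, 1.0, 5.0, 5.0, 5.0])
```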
what if the REASON we DREAM were similar to
the REASON programmers add noise to deep neural nets:
to prevent narrow training on experience,
help us generalize, and allow anticipation of weird new stuff––which would be evolutionarily adaptive...
11/ Hoel calls it the Overfitted Brain Hypothesis
the problem of OVERFITTING in machine learning: a model fits its training data so tightly that it fails on anything new
12/ The way researchers solve the "overfitting" problem for deep neural nets is by introducing "noise injections" in the form of corrupted inputs
Why? So the algorithms don't treat everything as narrowly SPECIFIC and precise––but instead can better GENERALIZE
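A minimal sketch of what noise injection looks like in practice. The function name, the `noise_std` hyperparameter, and the toy data are all illustrative assumptions; the only real technique is perturbing each training batch with Gaussian noise so the model can't memorize exact inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_batches(X, y, noise_std=0.1, n_epochs=5):
    """Yield training batches with Gaussian noise injected into inputs.

    noise_std is a hypothetical hyperparameter: higher values corrupt
    inputs more aggressively, pushing a model toward generalizing
    rather than memorizing exact training points.
    """
    for _ in range(n_epochs):
        X_noisy = X + rng.normal(0.0, noise_std, size=X.shape)
        yield X_noisy, y  # labels stay clean; only inputs are corrupted

# toy data: 8 samples, 3 features
X = np.arange(24, dtype=float).reshape(8, 3)
y = np.arange(8)

for Xb, yb in noisy_batches(X, y):
    assert Xb.shape == X.shape       # same shape, perturbed values
    assert not np.allclose(Xb, X)    # inputs were actually corrupted
```

Dropout (randomly zeroing activations) is the other common form of noise injection; the input-corruption version above is just the easiest to see.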
13/ Now IF our brain processes + stores information from the stimuli it receives all day long––and learns from experiences in a narrow way––THEN it too can "overfit" a model
& be less fit to encounter wider variations from it
(like the real world)...
So the PROVOCATIVE theory...
14/...Is that the evolutionary PURPOSE of DREAMING is to purposely corrupt data (memories or predictions) by injecting noise into the system
And prevent learning from becoming rote, routine memorization
China’s playbook isn’t war in the open—it’s war in the seams.
They target the gaps between state & federal, public & private, markets & security.
America must 'armor up' 🧵...
2/ The CCP’s strategy?
Hollow out U.S. strength from within.
Critical infrastructure? Infiltrated.
Universities? Compromised.
State pension funds? Financing China’s military tech.
3/ While America focused on the War on Terror, China waged a war on us—a silent siege of intellectual property theft, capital capture, and digital espionage.
If you were a fly on the wall in Qatar's strategy meetings, you'd hear their not-so-secret strategy
i) keep up economic relations w/Iran (esp w shared gas reserves)
ii) support Islamist groups (esp the Muslim Brotherhood) + use the masses as pawns for political leverage...
2/
iii) assert distinct identity in MidEast geopolitics as a ‘mediator’
iv) religious-mafia style––foment fervent believers (funding Al Jazeera propaganda + mosques/Imams to sway other countries' populations) to grow geopolitical power despite tiny size (and no real military)
3/ The ties to Iran + Turkey in particular, who leverage terrorism + religious extremism, are the greatest threat to Mideast regional peace
especially with the moderate, modernizing, peace- and prosperity-seeking UAE + Saudi (on the cusp of normalized relations with Israel) and Sunni…
2/ this technical paper from Apple, published a year ago and not widely discussed, was the breadcrumb clue and the inspiration I have shared with many friends since....
3/ The paper introduces 2 methods, called "windowing" + "row-column bundling", which allow devices to run models up to 2x the available DRAM size by storing model parameters in flash + loading them into DRAM as needed.
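The windowing idea can be sketched as a bounded cache over parameters that live in slow storage. Everything below is a hypothetical toy––the `FLASH` dict, `DramWindow` class, and layer names are made up, and simple LRU-style eviction stands in for the paper's actual sliding-window scheme:

```python
from collections import OrderedDict

import numpy as np

# Hypothetical flash store: layer name -> parameter array.
# In the paper's setting, parameters live in flash and only a
# working set fits in DRAM; this dict just stands in for flash.
FLASH = {f"layer{i}": np.full((4, 4), float(i)) for i in range(10)}

class DramWindow:
    """Keep only the most recently used layers resident in 'DRAM'.

    A toy version of the windowing idea: evict the oldest layer once
    the window (capacity) is exceeded, so resident parameters stay
    bounded no matter how large the full model in flash is.
    """
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.resident = OrderedDict()
        self.flash_loads = 0  # count of slow flash -> DRAM transfers

    def get(self, name):
        if name in self.resident:
            self.resident.move_to_end(name)  # reuse: no flash read
        else:
            self.flash_loads += 1            # slow path: read flash
            self.resident[name] = FLASH[name]
            if len(self.resident) > self.capacity:
                self.resident.popitem(last=False)  # evict oldest
        return self.resident[name]

dram = DramWindow(capacity=3)
for name in ["layer0", "layer1", "layer2", "layer1", "layer3"]:
    dram.get(name)
# "layer1" was reused from DRAM, so only 4 flash loads occurred
```

Row-column bundling (storing the rows and columns that are accessed together contiguously so each flash read is larger and sequential) is a separate optimization not modeled here.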