Are you skeptical about successor representations? Want to know how our new model can learn cognitive maps and context-specific representations, do transitive inference, and perform flexible hierarchical planning? #tweeprint...(1) @vicariousai @swaroopgj @rvrikhye biorxiv.org/content/10.110…
As @yael_niv pointed out in her recent article, learning context-specific representations from aliased observations is a challenge. Our agent can learn the layout of a room from severely aliased random-walk sequences, with only 4 unique observations in the entire room!
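To make the setting concrete, here is a minimal sketch (not from the paper) of the kind of training data involved: a random walk in a rectangular room where every cell emits one of only 4 symbols. The aliasing scheme below is a hypothetical choice for illustration; the point is that a single observation cannot identify the agent's location.

```python
import numpy as np

def aliased_random_walk(height, width, n_steps, rng):
    """Random walk in a rectangular room where each cell emits one of only
    4 observation symbols. The aliasing scheme (symbol depends only on
    row % 2 and col % 2) is hypothetical; single observations are useless
    for localization, only sequences carry spatial information."""
    r, c = rng.integers(height), rng.integers(width)
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # up, down, left, right
    obs, acts = [], []
    for _ in range(n_steps):
        obs.append(2 * (r % 2) + (c % 2))          # one of 4 symbols
        a = rng.integers(4)
        dr, dc = moves[a]
        r = min(max(r + dr, 0), height - 1)        # walls: stay in the room
        c = min(max(c + dc, 0), width - 1)
        acts.append(a)
    return np.array(obs), np.array(acts)

obs, acts = aliased_random_walk(6, 8, 10_000, np.random.default_rng(0))
```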
And it works even when the room is empty, with no unique observations in its center. The observations are now severely aliased and correlated, but the model still recovers the map of the room.
A hard problem of transitive inference: two aliased rooms with an overlapping portion, and the agent walks in only one room at a time. No problem...our model learns a coherent global map from these disjoint episodes!
Observations in the rooms are aliased, but there is more. Did you notice that a portion of the first room looks exactly like the overlapping patch, making this an even harder problem? Recovering all the relative positions required a lot of transitive stitching, and the model did it!
Note that the model doesn't make any assumptions about 2D/3D space, Euclidean geometry, or anything like that. It is purely relational, and the maps are learned purely from sequential random-walk observations.
Remember the 'splitter cells', where place cells encode the paths taken rather than locations? These emerge when rats follow paths rather than random walks, and the same happens in our model. Lap cells, which emerge when rats run laps along the same loop, appear in our model as well.
By transferring learned structural knowledge, the agent can take shortcuts in a new room, including navigating around obstacles, without having seen the whole room.
How does it work? By learning variable-order sequences! The core idea is very simple: split aliased states into copies and adapt each copy to a different context. This representation has many advantages over suffix trees and RNNs.
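A minimal sketch of the "split aliased states" idea, continuing the 4-symbol setting above (the sizes are illustrative, not taken from the paper): every observation symbol gets several hidden 'clones', and which clone is active is what encodes the context.

```python
import numpy as np

n_symbols, n_clones = 4, 20                  # illustrative sizes
n_states = n_symbols * n_clones              # total hidden states
# clones_of[s] lists the hidden states ('clones') that emit symbol s.
clones_of = [np.arange(s * n_clones, (s + 1) * n_clones)
             for s in range(n_symbols)]
# Emissions are deterministic: clone i of symbol s always emits s.
# All learned structure therefore lives in the transitions between
# clones, which is what lets identical observations be represented
# differently in different contexts.
```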
The model can be formulated as a highly structured, overcomplete HMM and trained with EM. Training results in a very sparse directed graph that approximates the latent generative process behind the observed sequences.
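Here is a rough sketch of one EM (Baum-Welch-style) step for such a clone-structured HMM, reusing `obs`, `n_states`, and `clones_of` from the sketches above. This is a simplification for illustration, not the paper's exact training procedure (which, for example, also conditions on actions).

```python
import numpy as np

def em_step(T, seq, clones_of):
    """One Baum-Welch-style update of the transition matrix T for an HMM
    with deterministic emissions: at each step only the clones of the
    observed symbol can be active. Returns the re-estimated T."""
    n, L = T.shape[0], len(seq)
    alpha = np.zeros((L, n))                 # scaled forward messages
    beta = np.zeros((L, n))                  # scaled backward messages
    alpha[0, clones_of[seq[0]]] = 1.0 / len(clones_of[seq[0]])
    for t in range(1, L):                    # forward pass
        a = np.zeros(n)
        idx = clones_of[seq[t]]
        a[idx] = (alpha[t - 1] @ T)[idx]
        alpha[t] = a / a.sum()
    beta[-1] = 1.0
    for t in range(L - 2, -1, -1):           # backward pass
        msg = np.zeros(n)
        idx = clones_of[seq[t + 1]]
        msg[idx] = beta[t + 1, idx]
        b = T @ msg
        beta[t] = b / b.sum()
    xi = np.zeros((n, n))                    # expected transition counts
    for t in range(L - 1):
        msg = np.zeros(n)
        idx = clones_of[seq[t + 1]]
        msg[idx] = beta[t + 1, idx]
        x = T * np.outer(alpha[t], msg)
        xi += x / x.sum()
    row = xi.sum(axis=1, keepdims=True)
    return xi / np.maximum(row, 1e-12)       # row-normalize -> new T

rng = np.random.default_rng(0)
T = rng.random((n_states, n_states))
T /= T.sum(axis=1, keepdims=True)            # random row-stochastic init
for _ in range(50):                          # EM iterations (illustrative)
    T = em_step(T, obs, clones_of)
```

After training, most entries of T shrink toward zero, so thresholding it yields the sparse directed graph mentioned above.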
When the model is exposed to multiple mazes, it splits them apart properly, and the responses remap when the agent switches from one maze to the next. Rate remapping can be explained by uncertainty in the observations.
When the world has a latent hierarchy, our model can recover it, and the recovered hierarchy is then used for efficient planning.
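As a toy illustration of planning on the learned representation (a generic graph search over the thresholded transition matrix T from above; not the paper's hierarchical planner):

```python
import numpy as np
from collections import deque

def shortest_path(T, start, goal, eps=1e-3):
    """Breadth-first search on the sparse directed graph obtained by
    thresholding the learned transition matrix. Returns a list of clone
    states from start to goal, or None if the goal is unreachable."""
    succ = [np.flatnonzero(row > eps) for row in T]
    parent = {start: None}
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        if s == goal:
            path = []
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        for nxt in succ[s]:
            if nxt not in parent:
                parent[nxt] = s
                frontier.append(nxt)
    return None
```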
There is more to be done. We think replay during learning and inference can be explained as well. The model has strong connections to our earlier work on schema networks vicarious.com/2017/08/07/gen…
I want to point out a few review papers that got us started and had a great influence. The Viewpoints interview is excellent. What a cognitive map is, we learned from @behrenstimb, @neuro_kim et al. and @NealWMorton. Buzsáki & Tingley taught us the importance of sequence learning.