Matthias Nau
Cognitive Neuroscientist | Assistant Prof at Vrije Universiteit Amsterdam | Vision, Gaze Behavior, Memory | Neuroimaging, ML | @matthiasnau.bsky.social

Dec 2, 2020, 15 tweets

Want to do #eyetracking in #fMRI without an eye tracker? Check out our #preprint!

@CYHSM & I developed #DeepMReye, a #DeepLearning framework to decode gaze position from the MR-signal of the eyeballs. No camera needed! w/@doellerlab
biorxiv.org/content/10.110…

Thread below! 👇

How does MR-based #eyetracking work?

The orientation of the eyeballs naturally reflects gaze direction. This & the strong MR-signal variations induced by eye movements serve as the basis for gaze decoding.
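
For intuition, here's a minimal sketch (not the DeepMReye code) of the starting point: pulling the eyeball voxels out of a 4D #fMRI volume so that each time point yields one multi-voxel pattern. The file names & the mask are hypothetical.

```python
# Minimal sketch: extract the eyeball voxel time series that
# MR-based gaze decoding builds on. Assumes a 4D fMRI NIfTI and a
# binary eyeball mask in the same space (both file names made up).
import nibabel as nib

func = nib.load("func.nii.gz").get_fdata()               # (x, y, z, time)
eye_mask = nib.load("eye_mask.nii.gz").get_fdata() > 0   # (x, y, z) binary

# One multi-voxel pattern per volume: (time, n_eyeball_voxels)
eye_patterns = func[eye_mask].T
print(eye_patterns.shape)
```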

What is #DeepMReye & what does it do?

It's a #CNN that uses the multi-voxel pattern of the eyeballs to decode gaze position on the screen. It can perform #eyetracking even in existing #fMRI data.

We tested it on 268 participants scanned with 14 scan protocols on 5 MRI scanners!
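
As a rough illustration only (the real architecture is described in the #preprint), a toy 3D CNN in PyTorch that maps an eyeball voxel block to a 2D on-screen gaze position could look like this; all layer sizes & the input shape are made up.

```python
# Toy sketch, not the DeepMReye architecture: a small 3D CNN that
# maps an eyeball voxel block to a predicted (x, y) gaze position.
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(32, 2)   # predicted (x, y) on the screen

    def forward(self, x):              # x: (batch, 1, depth, height, width)
        return self.head(self.features(x).flatten(1))

model = GazeCNN()
dummy = torch.randn(4, 1, 16, 32, 32)  # 4 eyeball blocks (made-up size)
print(model(dummy).shape)              # -> torch.Size([4, 2])
```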

Does it work for any type of viewing behavior?

#DeepMReye accurately reconstructs guided fixations & smooth-pursuit eye movements, as well as free viewing & visual search! Here are some n=1 examples! Cherry-picked? Check out the group results!

Does it work in all participants?

Almost! It worked robustly in the large majority of participants & for all viewing behaviors tested (5 key datasets, group-median correlation between real & predicted gaze path: r = 0.89, R² = 0.78, Euclidean error = 1.14°).

These are held-out participants!
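
For reference, here's one way to compute the three metrics quoted above; this is my own implementation of the standard formulas, not the authors' evaluation code.

```python
# Sketch of the three reported metrics between real & predicted gaze
# paths (assumed implementation details, e.g. how r is aggregated).
import numpy as np

def gaze_metrics(true_xy, pred_xy):
    """Pearson r, R² & median Euclidean error between gaze paths."""
    # Pearson correlation, averaged over the x & y gaze coordinates
    r = np.mean([np.corrcoef(true_xy[:, i], pred_xy[:, i])[0, 1]
                 for i in range(2)])
    # Coefficient of determination over both coordinates jointly
    ss_res = np.sum((true_xy - pred_xy) ** 2)
    ss_tot = np.sum((true_xy - true_xy.mean(axis=0)) ** 2)
    r2 = 1 - ss_res / ss_tot
    # Median Euclidean distance per time point (in degrees, if inputs are)
    eucl = np.median(np.linalg.norm(true_xy - pred_xy, axis=1))
    return r, r2, eucl
```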

How does it detect outliers?

In addition to gaze position, #DeepMReye computes a Predicted Error for each sample & participant, which allows detecting & removing outliers in an #unsupervised fashion! The threshold is yours to choose; here: 20% of participants (in orange).
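
Conceptually, the unsupervised step boils down to thresholding those predicted-error scores; a toy sketch (the 20% cutoff mirrors the example above, `pred_error` is a hypothetical stand-in array):

```python
# Toy sketch of unsupervised outlier removal: keep the 80% of
# participants with the lowest predicted error, flag the rest.
import numpy as np

pred_error = np.random.rand(100)          # stand-in: one score per person
cutoff = np.percentile(pred_error, 80)    # keep the best-scoring 80%
keep = pred_error <= cutoff
print(f"kept {keep.sum()} of {len(keep)} participants")
```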

How many participants do you need for model training?

We fixed the test set & sub-sampled the training set to find out. For free viewing, n=6-8 can work, but it depends on how simple the behavior is & on the type & amount of individual training data. Tips in the #preprint!
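
The logic of that analysis, sketched in a few lines (my rendering, not the authors' code): hold the test set fixed & repeatedly sub-sample the training pool.

```python
# Sketch of the fixed-test-set / sub-sampled-training-set analysis.
# Participant IDs & training-set sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
participants = np.arange(30)
test = participants[:10]                  # held out once, kept fixed
train_pool = participants[10:]

for n in (2, 4, 6, 8, 12, 16, 20):
    subsample = rng.choice(train_pool, size=n, replace=False)
    # ... train the model on `subsample`, evaluate on `test` ...
    print(f"n={n}: train on {sorted(subsample)}")
```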

Do you need to consider hemodynamic lags?

No! We shifted the gaze labels relative to the imaging data & found that gaze decoding from the eyeballs is instantaneous.

However, our pipeline can also be used to decode from brain activity by changing the ROI (e.g. to V1, et voilà: the HRF)!
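
The lag analysis can be sketched as shifting the labels volume by volume & re-evaluating the decoder at each lag (assumed logic: for the eyeballs the best lag sits at ~0, for cortical ROIs the profile should trace the HRF).

```python
# Sketch of the hemodynamic-lag check: shift gaze labels relative to
# the imaging data by whole volumes & see which lag decodes best.
import numpy as np

def shift_labels(labels, lag):
    """Shift labels by `lag` volumes (np.roll wraps the ends; in
    practice you would trim the overhanging volumes)."""
    return np.roll(labels, lag, axis=0)

labels = np.random.randn(200, 2)      # stand-in gaze labels (x, y)
for lag in range(-2, 5):              # volumes before/after acquisition
    shifted = shift_labels(labels, lag)
    # ... retrain & evaluate the decoder against `shifted` here ...
```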

What's the temporal resolution?

Different slices are acquired at different times & #saccades induce signal blur. By default, #DeepMReye decodes multiple gaze labels per TR. A lot to say here, but more labels explain more variance & larger within-TR movements can be inferred.
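
A sketch of what sub-TR labelling could look like (assumed setup, not the DeepMReye preprocessing): bin the raw eye-tracker samples into k median labels per TR instead of one.

```python
# Toy sketch: collapse raw gaze samples into k sub-TR labels so that
# within-TR movements can be decoded. Sampling rate is made up.
import numpy as np

def subtr_labels(gaze_xy, samples_per_tr, k=10):
    """Median (x, y) label for each of k bins per TR."""
    assert samples_per_tr % k == 0    # bins must tile the TR evenly
    n_tr = len(gaze_xy) // samples_per_tr
    gaze_xy = gaze_xy[:n_tr * samples_per_tr]
    binned = gaze_xy.reshape(n_tr, k, samples_per_tr // k, 2)
    return np.median(binned, axis=2)  # shape: (n_tr, k, 2)

gaze = np.random.randn(1000 * 60, 2)  # stand-in: 60 s at 1000 Hz
print(subtr_labels(gaze, samples_per_tr=1000).shape)
```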

Do #fMRI scan parameters matter?

Yes, but as long as they are similar between training & test set, #DeepMReye should work. We tested a wide range of voxel sizes (1.5-2.5mm) & TRs (0.8-2.5s) across 14 sequences (9 in the same participants, see figure). It worked in all cases.

Can #DeepMReye go beyond camera-based #eyetracking?

Yes! As a proof of principle, we show successful gaze decoding while the eyes were closed. It could also enable #eyetracking e.g. during REM sleep, resting state, or in visually impaired patient groups. Tons of ideas for the future!

Can the decoded gaze labels be used in #fMRI analyses?

Yes! Decoded eye movements explain activity in many regions (incl. "non-visual" ones), showing that #DeepMReye compares well to camera eye trackers & that #eyetracking in general is important for interpreting #fMRI results!
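
As an assumed example of such an analysis (not necessarily the exact one in the paper): turn the decoded gaze path into an eye-movement regressor by convolving per-volume movement amplitude with a canonical HRF.

```python
# Sketch: build a GLM regressor from decoded gaze. Movement amplitude
# per volume, convolved with a textbook double-gamma HRF (not
# necessarily the kernel used in the paper).
import numpy as np
from scipy.stats import gamma

tr = 1.0
pred_xy = np.random.randn(300, 2)     # stand-in decoded gaze path (x, y)
movement = np.r_[0, np.linalg.norm(np.diff(pred_xy, axis=0), axis=1)]

t = np.arange(0, 32, tr)              # HRF sampled at the TR
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)
hrf /= hrf.sum()

regressor = np.convolve(movement, hrf)[:len(movement)]
```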

Much more in the #preprint! There are things to consider before using it, but we believe that #DeepMReye can become a powerful community tool for MR-based #eyetracking. The code is currently undergoing code review & we will share it #opensource with the final paper.

This work is a team effort with the fantastic @CYHSM (Co-first author & deep learning expert!)🤝Please reach out to us if you have questions!

In addition, I'm excited that this is also my first senior authorship, shared with @doellerlab, who made this work possible in the first place!

Thanks to @KISNeuro @MPI_CBS @ERC_Research for supporting this work, and a special thank you to those who contributed data (@nachopolti, Josh Julian & @epstein_lab, @BartelsAndreas) & helpful advice (@Chris_I_Baker)! 🙏
