Every lens leaves a blur signature—a hidden fingerprint in every photo.
In our new #TPAMI paper, we show how to learn it fast (5 mins of capture!) with Lens Blur Fields ✨
With it, we can tell apart ‘identical’ phones by their optics, deblur images, and render realistic blurs.
Huge thanks to my amazing co-authors:
@exfilmstudent , @rebeccayelin , Daniel Miau, Florian Kainz, Jiawen Chen, @ceciliazhang77, @DaveLindell & @kyroskutulakos
📄 Paper ➡️
💻 Code: coming soon!
@ComputerSociety #ComputationalPhotography #IEEECS #ComputerVision #Optics
🌐 Project page: blur-fields.github.io
Optical blur (the point spread function, or PSF) is an umbrella term for a laundry list of degrading effects such as defocus, diffraction, and aberrations.
It’s hard to calibrate because it varies with sensor position, focus, target distance, and where you look on the image plane.
We introduce Lens Blur Fields: tiny MLPs that model this high-dimensional PSF.
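A minimal sketch of the idea, not the paper's actual architecture: a small MLP that maps a 5D coordinate (image-plane position, focus setting, and PSF offset) to a scalar PSF intensity. Layer widths and weights here are illustrative stand-ins; the real network is trained from captured data.

```python
import numpy as np

rng = np.random.default_rng(0)
widths = (5, 64, 64, 1)  # 5D input -> two hidden layers -> scalar PSF value
# Random stand-in weights; a real blur field learns these from focal stacks.
params = [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
          for m, n in zip(widths[:-1], widths[1:])]

def blur_field(coord):
    """Tiny ReLU MLP: 5D coordinate -> scalar PSF intensity."""
    h = np.asarray(coord, dtype=float)
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:       # ReLU on hidden layers only
            h = np.maximum(h, 0.0)
    return float(h)

# Query the field at one (x, y, focus, u, v) coordinate.
psf_value = blur_field([0.1, -0.2, 0.5, 0.0, 0.0])
```

Because the network is continuous in its inputs, the PSF can be queried at any sensor position or focus setting, not just the ones captured.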
Our capture setup only needs a monitor + a simple phone/camera stand. The pipeline is light ✨
1️⃣ Capture a focal stack of monitor patterns (in minutes)
2️⃣ Train an MLP via non-blind deconvolution
3️⃣ Get a continuous, device-specific PSF model
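Step 2️⃣ is "non-blind" because the displayed monitor pattern is known, so the only unknown is the blur itself. A toy 1D version of that objective (my simplification, not the paper's exact loss):

```python
import numpy as np

rng = np.random.default_rng(1)

pattern = rng.random(64)                  # known "monitor pattern"
true_psf = np.array([0.25, 0.5, 0.25])    # ground-truth blur (unknown in practice)
capture = np.convolve(pattern, true_psf, mode="valid")  # simulated measurement

def loss(psf):
    """L2 data term: || capture - pattern * psf ||^2 (convolution, not product)."""
    rendered = np.convolve(pattern, psf, mode="valid")
    return float(np.sum((capture - rendered) ** 2))

zero_at_truth = loss(true_psf)                    # 0 at the true PSF
positive_elsewhere = loss(np.array([1.0, 0.0, 0.0]))  # > 0 for a wrong PSF
```

Gradient descent on this kind of data term, with the MLP producing the PSF, is what turns a few minutes of captures into a trained blur field.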
Two smartphones of the same make can have subtly different PSFs—your phone has its own blur signature 📱🔍
We show this with the lens blur fields of two iPhone 12 Pros:
Lens Blur Fields let you render device-specific depth-of-field, blur a resolution chart, or a 3D scene:
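Rendering with a blur field boils down to spatially-varying convolution: each pixel gets its own locally evaluated PSF. A 1D toy version, with a box PSF whose width stands in for a learned, depth-dependent kernel (names and the box shape are my assumptions):

```python
import numpy as np

def render_blur(img, defocus):
    """1D spatially-varying blur: box PSF of half-width defocus[i] at pixel i."""
    out = np.zeros_like(img, dtype=float)
    n = len(img)
    for i in range(n):
        r = int(defocus[i])               # per-pixel kernel radius
        lo, hi = max(0, i - r), min(n, i + r + 1)
        out[i] = img[lo:hi].mean()        # normalized box PSF
    return out

img = np.zeros(9)
img[4] = 1.0                              # an impulse "scene"
sharp = render_blur(img, np.zeros(9, int))        # zero defocus leaves it intact
soft = render_blur(img, np.full(9, 2, int))       # radius-2 blur spreads it out
```

A device-specific render just swaps the box for the PSF the blur field predicts at that pixel, depth, and focus setting.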
And with more realistic renders, we can also do better device-specific image restoration.
We’ll be releasing the first dataset of 5D & 6D lens blur fields for smartphone & SLR lenses—stay tuned!