Autofocals: Evaluating Gaze-Contingent Eyeglasses for Presbyopes | Science Advances 2019

Nitish Padmanaban, Robert Konrad, Gordon Wetzstein

A new presbyopia correction technology that uses eye tracking, depth sensing, and focus-tunable lenses to automatically refocus the real world.

Autofocals Overview

Nitish's TED Talk

ABSTRACT

As humans age, they gradually lose the ability to accommodate, or refocus, to near distances because of the stiffening of the crystalline lens. This condition, known as presbyopia, affects nearly 20% of people worldwide. We design and build a new presbyopia correction, autofocals, to externally mimic the natural accommodation response, combining eye tracker and depth sensor data to automatically drive focus-tunable lenses. We evaluated 19 users on visual acuity, contrast sensitivity, and a refocusing task. Autofocals exhibit better visual acuity when compared to monovision and progressive lenses while maintaining similar contrast sensitivity. On the refocusing task, autofocals are faster and, compared to progressives, also significantly more accurate. In a separate study, a majority of 23 of 37 users ranked autofocals as the best correction in terms of ease of refocusing. Our work demonstrates the superiority of autofocals over current forms of presbyopia correction and could affect the lives of millions.

Link to Science Advances site

CITATION

N. Padmanaban, R. Konrad, G. Wetzstein, “Autofocals: Evaluating gaze-contingent eyeglasses for presbyopes”, Sci. Adv. 5, eaav6187 (2019).

BibTeX

@article{Padmanaban:2019:Autofocals,
title={Autofocals: Evaluating Gaze-Contingent Eyeglasses for Presbyopes},
author={Padmanaban, Nitish and Konrad, Robert and Wetzstein, Gordon},
journal={Science Advances},
year={2019},
volume={5},
number={6},
pages={eaav6187}
}

Acknowledgements

We would like to thank E. Wu, J. Griffin, and E. Peng for help with CAD, depth error characterization, and coma corrector fabrication; D. Lindell and J. Chang for helpful comments on an earlier draft of the manuscript; and A. Norcia for insightful discussions. N.P. was supported by the National Science Foundation (NSF) Graduate Research Fellowship Program. R.K. was supported by the NVIDIA Graduate Fellowship. G.W. was supported by an Okawa Research Grant and a Sloan Fellowship. Other funding for the project was provided by the NSF (award numbers 1553333 and 1839974) and Intel.



Presbyopia

Presbyopia Explainer: Presbyopia is a visual impairment that affects all people, typically starting in their mid- to late 40s. Near- and farsightedness result from an error in the overall focusing power of the eye’s optics, which shifts the range of focus closer or farther than “normal,” emmetropic vision. These errors are easily corrected by shifting the range back into place with a single fixed-focus optic. Presbyopia, on the other hand, occurs due to a stiffening of the eye’s crystalline lens and shrinks the range of focus toward the presbyope’s farthest distance of clear vision – this means it is possible to be both presbyopic and nearsighted. Since the range of refocus is smaller, a fixed-focus, single-vision lens is no longer sufficient: no amount of shifting the reduced range can make it cover the original range. Instead, either multiple pairs of glasses or some form of refocusing lens is required.
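The range-of-focus argument above can be made concrete with thin-lens diopter arithmetic (diopters = 1 / distance in meters). The following back-of-the-envelope sketch is our illustration, not code from the paper; the function name and the simplification to spherical error only are our assumptions.

```python
def clear_range_m(refractive_error_D, accommodation_D):
    """Nearest and farthest distances of clear focus, in meters.

    refractive_error_D: spherical refractive error in diopters
                        (negative = nearsighted/myopic)
    accommodation_D: remaining accommodative amplitude in diopters
                     (shrinks with age; near zero for an advanced presbyope)
    """
    far_D = -refractive_error_D          # power needed to focus the far point
    near_D = far_D + accommodation_D     # accommodation adds power for near
    to_m = lambda D: float('inf') if D <= 0 else 1.0 / D
    return to_m(near_D), to_m(far_D)

# Young emmetrope with ~10 D of accommodation: clear from 10 cm to infinity
print(clear_range_m(0.0, 10.0))   # → (0.1, inf)
# Emmetropic presbyope with 1 D left: clear only from 1 m outward
print(clear_range_m(0.0, 1.0))    # → (1.0, inf)
# Nearsighted presbyope (-2 D, 1 D left): clear only from ~33 cm to 50 cm
print(clear_range_m(-2.0, 1.0))   # ≈ (0.33, 0.5)
```

The last case shows why a presbyope can also be nearsighted, and why no single fixed lens helps: shifting the 1 D window can place it anywhere, but cannot widen it back to 10 D.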

Presbyopia Corrections: Here we simulate what a presbyope might see without correction and then with various types of correction (blur exaggerated for effect). Progressive lenses reduce the effective field of vision such that far focus is provided in the top region and near focus in the bottom region, with some blending in between; this provides “refocus” via head movements. Monovision instead keeps the entire field of vision at a single focal plane, but assigns the two eyes to different distances of clear focus. Reading or computer glasses (not pictured) provide full-field vision but only at near focus, trading away the convenience of needing just one pair of glasses. Finally, autofocals track the gaze position and automatically focus to the right distance, effectively achieving the full-field vision of reading glasses with the convenience of a single pair of glasses. (Foreground image: Nitish Padmanaban, Stanford; background image: https://pxhere.com/en/photo/1383278.)
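How much blur does a given focusing error produce? A standard geometric-optics approximation says the angular diameter of the retinal blur circle is roughly the pupil diameter (in meters) times the defocus (in diopters). The helper below is our illustration of that rule of thumb, not an artifact of the paper's simulation.

```python
import math

def blur_circle_arcmin(pupil_mm, defocus_D):
    """Approximate angular diameter of the retinal blur circle.

    Geometric approximation: blur (radians) ≈ pupil diameter (m) × |defocus| (D).
    Returned in arcminutes for comparison with acuity (20/20 ≈ 1-arcmin detail).
    """
    blur_rad = pupil_mm * 1e-3 * abs(defocus_D)
    return blur_rad * (180.0 / math.pi) * 60.0

# A 3 mm pupil with 2 D of uncorrected defocus blurs each point
# to a spot ~20 arcmin across – about 20x the size of 20/20 detail.
print(blur_circle_arcmin(3.0, 2.0))
```

This is why even a modest 1–2 D residual error, as in the near region for an uncorrected presbyope, makes reading-distance text illegible.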

System Overview

CAD Render: An exploded rendering of the autofocal prototype. Pictured are the Pupil Labs eye tracker, Intel RealSense R200 depth camera, Optotune EL-30-45 focus-tunable lenses, and custom coma-correction element. These are mounted onto a custom-built frame which also has slots to mount optional offset lenses for prescription correction.
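The core control idea is simple: estimate where the wearer is fixating, convert that distance to a lens power, and drive the focus-tunable lens there. The sketch below shows this mapping under our own assumptions – the function name, the preference for the depth camera over noisier vergence estimates, the prescription offset, and the clamping limits are all illustrative, not the prototype's actual firmware or the Optotune lens's measured specs.

```python
def lens_power_D(gaze_depth_m, depth_cam_m=None,
                 offset_D=0.0, min_D=-5.0, max_D=5.0):
    """Map an estimated fixation distance to a focus-tunable lens setting.

    gaze_depth_m: fixation distance triangulated from binocular gaze (vergence)
    depth_cam_m:  optional depth-camera reading along the gaze ray, preferred
                  when available (typically less noisy than vergence alone)
    offset_D:     wearer's prescription offset in diopters
    min_D, max_D: illustrative limits of the tunable lens's power range
    """
    d = depth_cam_m if depth_cam_m is not None else gaze_depth_m
    target = 1.0 / max(d, 1e-3) + offset_D   # diopters = 1 / distance (m)
    return min(max(target, min_D), max_D)    # clamp to the lens's range

print(lens_power_D(0.4))    # reading distance (40 cm) → 2.5 D
print(lens_power_D(10.0))   # far gaze → 0.1 D, essentially plano
```

In a real system this target would also be temporally filtered so the lens does not chase eye-tracker noise during fixation.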

Autofocal Prototype: The fully assembled prototype placed on a mannequin. The parts are labeled as in the CAD render above. (Photo credit: Nitish Padmanaban, Stanford.)

Refocusing Example: A view through the autofocal prototype as it refocuses between two objects placed at different distances. Inset in the bottom corners are the left and right eyes of a user whose gaze drives the refocusing. (Click video if it doesn’t autoplay.) (Video credit: Robert Konrad, Stanford.)
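The binocular gaze shown in the video insets can drive refocusing because the two eyes' convergence angle encodes fixation distance. The trigonometry below is the standard triangulation for a fixation point on the midline; the variable names, the average 63 mm interpupillary distance, and the midline simplification are our assumptions for illustration.

```python
import math

def vergence_depth_m(left_yaw_rad, right_yaw_rad, ipd_m=0.063):
    """Estimate fixation distance from the two eyes' horizontal gaze angles.

    Angles are yaw relative to straight ahead, positive toward the nose.
    For a midline fixation at distance d, each eye turns inward by
    atan((ipd/2) / d), so d = (ipd/2) / tan(vergence/2).
    """
    vergence = left_yaw_rad + right_yaw_rad   # total convergence angle
    if vergence <= 0:
        return float('inf')                   # parallel or diverging: far away
    return (ipd_m / 2.0) / math.tan(vergence / 2.0)

# Eyes each turned inward to fixate a point 50 cm away on the midline:
theta = math.atan((0.063 / 2.0) / 0.5)
print(vergence_depth_m(theta, theta))   # → 0.5
```

Because tan is steep near zero, vergence resolves near distances well but becomes unreliable beyond a few meters, which is one reason to fuse it with a depth camera as in the prototype.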

Wearing the Prototype: A clip of the autofocal prototype refocusing as a user looks to different distances, followed by them looking around. (Click video if it doesn’t autoplay.) (Video credit: Nitish Padmanaban, Stanford.)

User Study Results

Acuity Results: Comparing the acuity for users wearing their own daily correction (progressives or monovision) against the same users wearing autofocals, we see that autofocals are on average better across nearly all measured distances. Furthermore, autofocals maintain better than 20/20 acuity across the range of distances, unlike the traditional corrections, which show a clear reduction in acuity at nearer focus distances.

Task Performance Results: Here we compare the speed and accuracy of users performing a letter-matching task while wearing their typical daily correction or autofocals. Autofocals allow users to perform the task faster on average and, compared to progressives, significantly more accurately. This improvement over progressives reflects the fact that refocusing with autofocals is faster than the head movements required to refocus with progressives.

Preference Results: Users spent a few minutes trying autofocals while viewing a natural scene. While they understandably felt that the VR-headset form factor of our autofocal prototype was less comfortable, they reported that autofocals were easier to refocus to multiple distances than other types of correction. Autofocals were also preferred for perceived convenience, though only slightly. We also had them try autofocals controlled only by the depth camera, without eye tracker input. The negative reactions seen above indicate that eye tracking is an essential part of any proposed autofocal correction.

Related Projects

You may also be interested in related projects from our group on next-generation near-eye displays and wearable technology:

  • Y. Peng et al. “Neural Holography with Camera-in-the-loop Training”, ACM SIGGRAPH 2020 (link)
  • R. Konrad et al. “Gaze-contingent Ocular Parallax Rendering for Virtual Reality”, ACM Transactions on Graphics 2020 (link)
  • B. Krajancich et al. “Optimizing Depth Perception in Virtual and Augmented Reality through Gaze-contingent Stereo Rendering”, ACM SIGGRAPH Asia 2020 (link)
  • B. Krajancich et al. “Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display”, IEEE TVCG, 2020 (link)
  • N. Padmanaban et al. “Autofocals: Evaluating Gaze-Contingent Eyeglasses for Presbyopes”, Science Advances 2019 (link)
  • K. Rathinavel et al. “Varifocal Occlusion-Capable Optical See-through Augmented Reality Display based on Focus-tunable Optics”, IEEE TVCG 2019 (link)
  • N. Padmanaban et al. “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays”, PNAS 2017 (link)
  • R. Konrad et al. “Accommodation-invariant Computational Near-eye Displays”, ACM SIGGRAPH 2017 (link)
  • R. Konrad et al. “Novel Optical Configurations for Virtual Reality: Evaluating User Preference and Performance with Focus-tunable and Monovision Near-eye Displays”, ACM SIGCHI 2016 (link)
  • F.C. Huang et al. “The Light Field Stereoscope: Immersive Computer Graphics via Factored Near-Eye Light Field Display with Focus Cues”, ACM SIGGRAPH 2015 (link)