LIDAR sensors and tunable lenses could create autofocus glasses


Visual demonstrator: the 3D model and final prototype of a custom frame used to assess the electronically tunable lenses. (Courtesy: Biomed. Opt. Express 10.1364/BOE.471190 © The Optical Society)

People whose ageing eyes can no longer focus on close-up objects have long dreamed of glasses that automatically adjust to bring whatever they are looking at into sharp focus. Presbyopia, the gradual loss of the lens’s ability to focus light correctly on the retina, is a natural part of ageing. Because of this deterioration, many people aged 60 and above can no longer see clearly within a radius of about a metre. An estimated two billion people live with the condition.

Viable commercialization of a “tunable” lens with automated focus is years away, but researchers at the University of Tuebingen have now demonstrated the capabilities of a liquid-membrane-based tunable lens controlled with a solid-state LIDAR camera feedback system, reporting their findings in Biomedical Optics Express.

Liquid-membrane-based lenses have a membrane that changes shape as liquid is pumped into and out of a chamber; the membrane’s curvature defines the lens. The researchers evaluated this type of tunable lens for visual acuity and contrast sensitivity in presbyopia correction, as well as assessing its aberration properties. First author Rajat Agarwala and colleagues also demonstrated the feasibility of operating the lenses via a feedback mechanism, based on a portable solid-state LIDAR camera with a processing time of 40 ms.
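The core calculation behind such a feedback system is simple optics: light from an object at distance *d* metres arrives with a vergence of 1/*d* diopters, which is the power a fully presbyopic eye needs the tunable lens to supply. A minimal sketch of that conversion (the 3 D cap and the function name are illustrative assumptions, not taken from the paper):

```python
def required_add_power(distance_m: float, max_power_dpt: float = 3.0) -> float:
    """Convert a measured object distance (metres) into the add power
    (diopters) a tunable lens must supply for a presbyopic eye.

    Light from an object at d metres has a vergence of 1/d diopters;
    the result is capped at the lens's maximum tunable power.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return min(1.0 / distance_m, max_power_dpt)

# Reading at 40 cm calls for 2.5 D; beyond a metre, little correction is needed
print(required_add_power(0.4))  # 2.5
print(required_add_power(2.0))  # 0.5
```

This also explains the one-metre figure quoted above: objects beyond a metre demand less than 1 D of accommodation, which even a stiffened lens can often still manage.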


The study included 15 adults with healthy eyes, who participated after undergoing ophthalmological examinations. Prior to testing, a cycloplegic agent was administered to block the eyes’ ability to change optical power. The participants wore a custom spectacle-like frame prototype housing the tunable lenses, with a LIDAR camera strapped to their foreheads to determine distances in their visual fields.

The researchers evaluated visual acuity as participants viewed an external display at three distances, providing responses via a keyboard. They also assessed contrast sensitivity and performance in refocusing tasks, with gaze tracking and depth-sensor data driving the lens. The LIDAR camera with tunable lenses proved technically feasible as a 3D distance estimator for the development of tunable spectacles for presbyopia. The team observed low wavefront errors, fast switching between powers and a wide field-of-view, demonstrating the lenses’ significant potential for ophthalmic applications.
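The sensor-to-lens pipeline described above can be pictured as a periodic control loop: each cycle reads the distance at the gaze point and drives the lens to the matching power, with the cycle time bounded by the LIDAR’s reported 40 ms processing latency. A minimal sketch, where `read_depth` and `set_lens_power` are hypothetical stand-ins for the sensor and lens-driver interfaces (the clamping values are likewise illustrative):

```python
import time

def autofocus_loop(read_depth, set_lens_power, cycles=5, period_s=0.04):
    """One possible shape of the feedback loop: every ~40 ms (the
    LIDAR processing time reported in the paper), read the distance
    at the gaze point and set the tunable lens to the matching power.

    read_depth: callable returning the fixated distance in metres.
    set_lens_power: callable accepting a lens power in diopters.
    Returns the sequence of powers commanded, for inspection.
    """
    history = []
    for _ in range(cycles):
        d = read_depth()                        # metres, from depth sensor
        power = min(1.0 / max(d, 0.1), 3.0)     # 10 cm floor, 3 D cap
        set_lens_power(power)                   # drive the liquid lens
        history.append(power)
        time.sleep(period_s)                    # wait for next LIDAR frame
    return history
```

For example, feeding the loop a constant 0.5 m depth reading would command a steady 2.0 D. A real system would also need to smooth sensor noise and rate-limit the pump so the membrane does not oscillate between fixations.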

Eye-tracking expertise

Elsewhere, Nitish Padmanaban is also working on the problem. Padmanaban is co-founder and head of research at Zinn Labs, established in 2020 to develop autofocals as a viable product for consumers. In a TEDx talk attracting more than 2.2 million views, he explained the complexity of building intelligence around the autofocus lenses.


His prototypes borrow technology from virtual and augmented reality systems to estimate focusing distances, with the earliest prototypes incorporating a distance sensor and focus-tunable lenses into a goggles-like device. An eye tracker conveys the direction in which the eyes are focused, and a camera serves as a distance sensor. Newer prototypes remove the need for a camera, relying solely on eye tracking and reducing some of the bulk of the device.


“Our core expertise is with eye tracking-related imaging and algorithms, and we have expanded our work to develop a broader set of uses for eye tracking,” Padmanaban tells Physics World. “In addition to autofocals, these uses include eye-based health monitoring and virtual- and augmented-reality headsets. Each of these application areas benefits from lower-power, lower-latency eye tracking, though the timelines and specific challenges for bringing them to market differ.”

“Our work at Zinn Labs specifically aims to make the eye tracking good enough such that the vergence estimation alone is sufficient to set lens power, without the need for a depth sensor,” he adds. “More sensors require more energy and more computation. When you want to create a device that people will wear, size and weight need to be minimized. Eye tracking alone can in theory provide sufficient information to estimate a wearer’s gaze distance, and so we believe the best approach is to perfect it.”
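Vergence-only depth estimation, as Padmanaban describes it, rests on binocular geometry: the two eyes rotate inward to fixate a near object, and the angle between the gaze directions, together with the interpupillary distance (IPD), fixes the fixation distance. A minimal sketch of that geometry (the 63 mm default IPD is a typical adult value, assumed here for illustration; this is not Zinn Labs’ algorithm):

```python
import math

def gaze_distance_from_vergence(vergence_deg: float, ipd_m: float = 0.063) -> float:
    """Estimate fixation distance from the binocular vergence angle.

    With interpupillary distance ipd and total vergence angle theta
    (the angle between the two gaze directions), simple triangle
    geometry gives: distance = (ipd / 2) / tan(theta / 2).
    """
    theta = math.radians(vergence_deg)
    if theta <= 0:
        return float("inf")  # parallel gaze: fixation effectively at infinity
    return (ipd_m / 2) / math.tan(theta / 2)
```

For a 63 mm IPD, fixating at half a metre corresponds to a vergence angle of only about 7°, which illustrates why the eye tracker must resolve small angular differences accurately for this approach to replace a depth sensor.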


As for when autofocus lenses could become a commercial reality, Padmanaban notes that much research is under way, involving several, often orthogonal solutions. He predicts that the most viable option will be some form of low-power eye tracker coupled with a liquid crystal lens.

“It’s hard to predict when the next big leap in focus-tunable lenses will be, but it wouldn’t surprise me if it takes another five to ten years before there is a market-ready product that appeals to consumers,” he concludes.
