Apple has filed for a patent (number US 20240004190 A1) for an “Eye Imaging System” for the upcoming Vision Pro.

The US$3,499 (and higher) Spatial Computer is due, according to Apple, in “early 2024.” Predictions range from later this month to sometime in April. And it will apparently only be available in limited quantities at first. 

About the patent filing

The patent filing involves methods and apparatus for providing eye tracking in head-mounted devices (HMDs) including but not limited to HMDs used in extended reality (XR) applications. Apple says an HMD such as the Vision Pro may include lenses positioned in front of the eyes through which the wearer can view the environment. 

In XR systems, virtual content may be displayed on or projected onto these lenses, making the virtual content visible to the wearer while still allowing them to view the real environment through the lenses. In some systems, an HMD may include gaze tracking technology. In an example gaze tracking system, one or more infrared (IR) light sources emit IR light towards a user’s eye.

A portion of the IR light is reflected off the eye and captured by an eye tracking camera. Images captured by the eye tracking camera may be input to a glint and pupil detection process, for example implemented by one or more processors of a controller of the HMD. Results of the process are passed to a gaze estimation process, for example implemented by one or more processors of the controller, to estimate the user’s point of gaze. This method of gaze tracking may be referred to as PCCR (Pupil Center Corneal Reflection) tracking.
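For readers who want a concrete picture of what a PCCR pipeline does with those detections, here is a minimal, hypothetical sketch in Python. The function names, the affine calibration mapping, and the numbers are illustrative assumptions, not details from the patent filing.

```python
import numpy as np

def estimate_gaze_pccr(pupil_center, glint_center, calib_matrix):
    """Hypothetical PCCR sketch: map a pupil-glint vector to a point of gaze.

    pupil_center, glint_center: (x, y) pixel positions detected in an image
    from the eye tracking camera.
    calib_matrix: 2x3 affine mapping, assumed to come from a per-user
    calibration, relating the pupil-glint vector to display coordinates.
    """
    # PCCR uses the vector from the corneal reflection (glint) to the pupil
    # center; it changes as the eye rotates but is relatively insensitive to
    # small shifts of the device on the head.
    v = np.asarray(pupil_center, float) - np.asarray(glint_center, float)
    # Apply the affine calibration to get an estimated point of gaze.
    return calib_matrix @ np.append(v, 1.0)

# Toy example: an assumed calibration and one pair of detections.
calib = np.array([[10.0, 0.0, 960.0],
                  [0.0, 10.0, 540.0]])
print(estimate_gaze_pccr((412, 305), (400, 300), calib))  # ~[1080, 590]
```

In a real system the glint/pupil detection and the calibration would be far more involved; the sketch only shows how the pupil-glint vector feeds the gaze estimate.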

Conventionally, an eye tracking camera is either mounted somewhere on the frame of the HMD and pointed towards the eye, or mounted on the outside of the HMD so that it images the eye through the HMD’s display lens. In both cases, the camera can’t obtain a direct view of the eye: the form factor prevents the camera from being positioned directly in front of the eye, so the camera images the eye at an angle.

One solution is to use a waveguide located in the HMD’s display component to relay an image of the eye to a sensor located somewhere on the wearable device’s frame. However, Apple says that conventional display waveguides encode image points as angles, and therefore can only image objects at infinity.

While this could be used to perform retinal tracking, due to the form factor of at least some wearable devices the display is too close to the eye’s cornea for this method to capture images usable in a PCCR gaze tracking process. Apple wants to overcome this limitation with the Vision Pro, as described in the abstract of the patent filing.
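A back-of-the-envelope way to see why an angle-encoding waveguide only focuses distant objects: rays leaving a single point are effectively parallel when the point is far away, but spread across a wide range of angles when the point is as close as the cornea. The aperture size and distances below are illustrative assumptions, not figures from the patent.

```python
import math

def ray_spread_deg(aperture_mm: float, distance_mm: float) -> float:
    """Full angular spread (degrees) of rays from a point source, as seen
    across an aperture: roughly 2 * atan(aperture / (2 * distance))."""
    return math.degrees(2 * math.atan(aperture_mm / (2 * distance_mm)))

# Assumed ~20 mm coupler aperture (illustrative only).
print(ray_spread_deg(20, 1e9))  # distant object: ~0 deg, effectively collimated
print(ray_spread_deg(20, 20))   # cornea ~20 mm away: ~53 deg of spread
```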

Summary of the patent filing

Here’s Apple’s abstract of the patent filing: “A waveguide with an input coupler and an output coupler that redirects reflected light to a camera. The waveguide may be integrated in a lens of a wearable device such as a pair of glasses. Light sources emit light beams towards the eye. A portion of the light beams are reflected by the surface of the eye towards the input coupler located in front of the eye.

“The input coupler may be implemented according to diffractive or reflective technologies, and may be a straight or curved line of narrow width to focus at close distances but of a length long enough to sufficiently image the eye. The input coupler changes the angles of the light beams so that the light beams are relayed using total internal reflection and focused towards an output coupler of the waveguide. The light beams are redirected by the output coupler to the camera.”
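As a rough illustration of the total-internal-reflection relay the abstract describes, the short sketch below computes the critical angle beyond which light stays trapped inside a waveguide and is guided toward the output coupler. The refractive index is an assumed value for a high-index glass, not a figure from the patent.

```python
import math

def critical_angle_deg(n_waveguide: float, n_outside: float = 1.0) -> float:
    """Critical angle for total internal reflection at the waveguide surface.

    Light inside the waveguide hitting the surface at an angle (measured from
    the normal) greater than this is fully reflected and remains guided.
    """
    return math.degrees(math.asin(n_outside / n_waveguide))

# Assumed index of 1.8 for a high-index glass waveguide (illustrative only).
print(f"{critical_angle_deg(1.8):.1f} degrees")  # ~33.7 degrees
```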

Article provided with permission from AppleWorld.Today