G05B2219/35503

Eye-tracking with MEMS scanning and reflected light

An eye-tracking system is provided. The system includes an at least partially transparent visible light waveguide having a visible light display region configured to emit visible light to impinge upon an eye of a user. A light source is configured to emit at least infrared (IR) light that travels along an IR light path to impinge on the eye. A microelectromechanical system (MEMS) scanning mirror positioned in the IR light path is configured to direct the IR light along the IR light path. A relay positioned in the IR light path downstream of the MEMS scanning mirror includes at least one mirror configured to reflect the IR light along the IR light path. At least one sensor is configured to receive the IR light after it has been reflected by the eye.

Microsurgery system for displaying in real time magnified digital image sequences of an operated area
10895742 · 2021-01-19

A system captures and displays video of surgeries. The system may include: at least one digital image sensor optically coupled to one or more lenses and configured to capture a video sequence of a scene in a surgery; at least one interface configured to receive at least one region of interest (ROI) of the captured video sequence; an electronic display, selected so that at least one of the digital image sensors has a pixel resolution substantially greater than that of the electronic display; and a computer processor configured to receive the at least one captured video sequence and the at least one received ROI, and to display on the electronic display a portion of the captured video sequence based on the at least one selected ROI.
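The sensor-to-display resolution margin is what makes the ROI mechanism useful: a display-sized crop of the high-resolution frame can be shown at native pixel density, giving digital magnification without optical zoom. A minimal sketch of that idea (the function name, array shapes, and 4K/HD numbers are illustrative assumptions, not from the patent):

```python
import numpy as np

def crop_roi(frame: np.ndarray, roi_top_left: tuple, display_shape: tuple) -> np.ndarray:
    """Return the portion of a high-resolution frame to show on the display.

    Because the sensor resolution substantially exceeds the display
    resolution, a display-sized crop around the selected ROI can be
    shown at native pixel density, i.e. digital magnification with no
    loss of detail and no optical zoom.
    """
    top, left = roi_top_left
    h, w = display_shape
    return frame[top:top + h, left:left + w]

# Illustrative numbers: a 4K sensor feeding an HD display leaves a 2x
# linear magnification margin for ROI selection.
sensor_frame = np.zeros((2160, 3840), dtype=np.uint8)
portion = crop_roi(sensor_frame, roi_top_left=(500, 800), display_shape=(1080, 1920))
print(portion.shape)  # (1080, 1920)
```

A real system would also clamp the ROI to the frame bounds and resample when the requested region differs from the display size.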

IR illumination module for MEMS-based eye tracking

An improved eye tracking illumination system is disclosed. The system includes (i) an RGB laser device that is associated with a first collimating optic and (ii) an IR illumination device that is associated with a second collimating optic. The system also includes a DMA that has a MEMS mirror system. The DMA optically combines IR light and RGB light to generate combined light. The combined light is then directed towards a user's eye via a transport medium (e.g., a waveguide). One or more photodetector(s) are positioned to capture reflected light that is reflected off of the user's eye. The photodetectors include an IR detector configured to detect reflected IR light off of the user's eye in order to perform eye tracking.

CONTROL SYSTEM
20200310384 · 2020-10-01

A control system controls an industrial machine. Each of its controllers includes a screen generation unit that generates a controller screen displayed on a controller display unit, and that generates a glasses screen displayed on a glasses-type display device based on a variation in an internal state of the controller screen. The glasses-type display device includes: a transmissive glasses display unit arranged to correspond to the positions of the eyes of a wearer and capable of displaying the generated glasses screen; a glasses-side transmission/reception unit that acquires specific information identifying the connected controller; and a display control unit that displays the glasses screen and the specific information on the glasses display unit.

STEERABLE RETICLE FOR VISOR PROJECTED HELMET MOUNTED DISPLAYS

A helmet mounted display system is described. A visor has an inner reflective surface and is mountable to head gear. A light source is arranged to emit light. Directing optics are arranged to image light from the light source onto the inner reflective surface of the visor to provide a reticle image there. An eye tracker is configured to determine the orientation of an eye of a wearer of the head gear. A controller is configured to receive an indication of the determined eye orientation and to control at least one actuator to change the orientation and shape of the directing optics, repositioning the reticle image based on that indication so that the eye views the reticle image.

Holographic Pattern Generation for Head-Mounted Display (HMD) Eye Tracking Using an Array of Parabolic Mirrors

A system for making a holographic medium for use in generating light patterns for eye tracking includes a light source configured to provide light and a beam splitter configured to separate the light into a first portion of the light and a second portion of the light that is spatially separated from the first portion of the light. The system also includes a first set of optical elements configured to transmit the first portion of the light for providing a first wide-field beam onto an optically recordable medium, a second set of optical elements configured to transmit the second portion of the light for providing a second wide-field beam, and a plurality of parabolic reflectors optically coupled with the second set of optical elements and configured to receive the second wide-field beam and project a plurality of separate light patterns onto the optically recordable medium for forming the holographic medium.

Enhanced field of view via common region and peripheral related regions
10674127 · 2020-06-02

In certain embodiments, a user's field of view may be increased and/or modified. In some embodiments, views of a scene may be obtained via a wearable device. A first region of the scene may be determined, where the first region is common to the views and represented proximate the center of each view. For each of the views, a second region of the scene may be determined, where the second region is represented in that view and related to the peripheral vision of the user. A modified view may be generated based on (i) a portion of at least one of the views that corresponds to the common first region and (ii) portions of the views that correspond to the second regions, such that the modified view comprises a representation of the common first region and representations of the second regions.
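The composition step above can be sketched as a simple stitch: the overlapping central columns of two views are merged into one common region, and the non-overlapping columns of each view supply the peripheral regions, so the result is wider than either input. This is a minimal illustration under assumed geometry (equal-sized, horizontally overlapping views; averaging as the merge), not the patent's actual method:

```python
import numpy as np

def compose_views(left: np.ndarray, right: np.ndarray, common_w: int) -> np.ndarray:
    """Merge two overlapping camera views into one widened view.

    Assumes the rightmost `common_w` columns of the left view and the
    leftmost `common_w` columns of the right view image the same
    central region; those are averaged, while the remaining columns of
    each view supply the peripheral regions.
    """
    h, w = left.shape
    common = (left[:, w - common_w:].astype(np.uint16)
              + right[:, :common_w].astype(np.uint16)) // 2
    return np.hstack([left[:, :w - common_w],      # left peripheral region
                      common.astype(left.dtype),   # shared central region
                      right[:, common_w:]])        # right peripheral region

left = np.full((4, 10), 100, dtype=np.uint8)
right = np.full((4, 10), 200, dtype=np.uint8)
merged = compose_views(left, right, common_w=4)
print(merged.shape)  # (4, 16): wider than either 10-column input
```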

COLLABORATIVE OPERATION SUPPORT DEVICE
20200039066 · 2020-02-06

The collaborative operation support device includes a display device having a display area, and a processor configured to: detect, based on an image in which the operator or the robot is represented, the position in the display area of a section of the robot when the operator looks at the robot through the display area, the section being associated with an operation mode of the robot specified via an input device; select, in accordance with the specified operation mode, the corresponding display data among display data stored in a memory; and display the selected display data in the display area of the display device at a position that satisfies a certain positional relationship with the position of the section of the robot in the display area.

Vision defect determination via a dynamic eye-characteristic-based fixation point
10531795 · 2020-01-14

In certain embodiments, vision defect information may be generated via a dynamic eye-characteristic-based fixation point. In some embodiments, a first stimulus may be displayed at a first location on a user interface based on a fixation point for a visual test presentation. The fixation point for the visual test presentation may be adjusted during the visual test presentation based on eye characteristic information related to a user. As an example, the eye characteristic information may indicate a characteristic of an eye of the user that occurred during the visual test presentation. A second stimulus may be displayed during the visual test presentation at a second interface location on the user interface based on the adjusted fixation point for the visual test presentation. Vision defect information associated with the user may be generated based on feedback information indicating feedback related to the first stimulus and feedback related to the second stimulus.
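The test loop described above can be sketched as follows. Each stimulus is placed relative to the current fixation point, and after each trial the fixation point is re-centered on the measured gaze, so later stimuli stay aligned with the eye even as gaze drifts. The callbacks `measure_gaze` and `get_feedback` are hypothetical stand-ins for the eye tracker and the user-response interface; this is an illustrative sketch, not the patented procedure:

```python
def run_visual_test(stimulus_offsets, measure_gaze, get_feedback,
                    fixation=(0.0, 0.0)):
    """Fixation-referenced visual-field test with a dynamic fixation point.

    Stimuli are displayed at offsets from the current fixation point;
    after every trial the fixation point is updated from the measured
    gaze.  Responses are keyed by offset, forming a defect map.
    """
    results = []
    for dx, dy in stimulus_offsets:
        location = (fixation[0] + dx, fixation[1] + dy)
        seen = get_feedback(location)      # user's response to the stimulus
        results.append(((dx, dy), seen))
        fixation = measure_gaze()          # dynamic fixation-point update
    return results

# Stub callbacks: gaze drifts to (1, 1); the user reports seeing every
# stimulus except those displayed far to the right (x > 5).
results = run_visual_test(
    stimulus_offsets=[(0, 5), (6, 0)],
    measure_gaze=lambda: (1.0, 1.0),
    get_feedback=lambda loc: loc[0] <= 5,
)
print(results)  # [((0, 5), True), ((6, 0), False)]
```

Keying results by offset rather than absolute screen position is what lets the defect map remain meaningful while the fixation point moves.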