
METHOD AND DEVICE FOR EYE METRIC ACQUISITION
20190361231 · 2019-11-28

The present disclosure relates to a method and a device for acquisition of a metric of an eye (1) located in an acquisition space (29). The device comprises at least one light source (11) configured to emit light towards the acquisition space, a camera (15) configured to receive light from the acquisition space (29) to generate image data, and an analyzing unit (14) configured to extract at least one metric from the image data. The camera (15) is configured to receive light from the acquisition space via at least two light paths (17, 19) which are differently angled with respect to the optical axis of the camera, the light of at least one path being received via a first mirror (21). The camera receives light from an overlapping portion of the acquisition space via the first and second paths, so as to allow the camera to receive at least two representations of a single eye. This metric may be used for e.g. eye tracking or autorefraction/accommodation.
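
The two differently angled views of the same eye amount to a stereo pair, with the mirror path acting as a second virtual camera. A minimal sketch of how a depth metric could be recovered from such a pair, assuming a simplified pinhole stereo model (the baseline, focal length, and function name are illustrative, not taken from the disclosure):

```python
def eye_depth_from_disparity(x_direct, x_mirrored, baseline_m, focal_px):
    """Estimate eye distance from the camera using the horizontal pixel
    disparity between the direct view and the mirrored view.
    Simplification: the mirror path is treated as a second virtual
    camera offset laterally by `baseline_m` meters."""
    disparity = abs(x_direct - x_mirrored)
    if disparity == 0:
        raise ValueError("zero disparity: views identical or eye at infinity")
    # Standard stereo relation: depth = f * B / d
    return focal_px * baseline_m / disparity
```

With a 6 cm virtual baseline, an 800 px focal length, and a 40 px disparity, this yields a distance of 1.2 m; in practice the device would extract richer metrics (gaze direction, accommodation) from the paired representations.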

MICROSURGERY SYSTEM FOR DISPLAYING IN REAL TIME MAGNIFIED DIGITAL IMAGE SEQUENCES OF AN OPERATED AREA
20190293935 · 2019-09-26

A system captures and displays video of surgeries. The system may include at least one digital image sensor optically coupled to one or more lenses and configured to capture a video sequence of a scene in a surgery; at least one interface configured to receive at least one region of interest (ROI) of the captured video sequence; an electronic display, selected so that at least one of the digital image sensors has a pixel resolution which is substantially greater than the pixel resolution of the electronic display; and a computer processor configured to: receive the at least one captured video sequence and the at least one received ROI and display on the at least one electronic display a portion of the captured video sequence based on the at least one selected ROI.
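
Because the sensor resolution substantially exceeds the display resolution, the processor can show an ROI crop at or near native pixel density. A minimal sketch of that cropping step, assuming an (x, y, w, h) ROI in sensor pixels and an expand-to-display-aspect policy (the function name and clamping behaviour are assumptions; the abstract specifies neither):

```python
def crop_roi_to_display(frame_w, frame_h, roi, disp_w, disp_h):
    """Expand an ROI rectangle to the display's aspect ratio so the
    selected region fills the screen without distortion, then clamp
    the result to the sensor frame. Returns (x, y, w, h)."""
    x, y, w, h = roi
    disp_ar = disp_w / disp_h
    roi_ar = w / h
    if roi_ar < disp_ar:
        w = round(h * disp_ar)   # ROI too narrow: widen to display ratio
    else:
        h = round(w / disp_ar)   # ROI too short: heighten to display ratio
    x = min(max(0, x), frame_w - w)  # keep the crop inside the frame
    y = min(max(0, y), frame_h - h)
    return x, y, w, h
```

For a 4000x3000 sensor and a 1920x1080 display, a 400x300 ROI is widened to 533x300, which still maps more than one sensor pixel per display pixel, preserving magnification headroom.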

Eye-tracking with MEMS scanning and optical relay

An eye-tracking system is provided that includes a light source configured to emit at least infrared (IR) light and a microelectromechanical system (MEMS) scanning mirror configured to direct the IR light. The system further includes a relay including at least one prism, and the relay is configured to receive the IR light directed by the MEMS scanning mirror and redirect the IR light. The system further includes a waveguide through which the IR light redirected by the relay passes to reach an eye, and at least one sensor configured to receive the IR light after being reflected by the eye.
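
Since the MEMS mirror steers the IR beam across the eye point by point, the sensor readings can be reassembled into a 2D reflectance map in which the corneal glint appears as the brightest sample. A minimal sketch of that reconstruction, assuming a row-major scan order (the patent does not specify one):

```python
def build_scan_image(samples, width, height):
    """Reassemble per-angle IR sensor readings into a 2D reflectance
    map. `samples` is a flat list with one intensity per row-major
    mirror position; the brightest sample approximates the glint."""
    assert len(samples) == width * height
    img = [samples[r * width:(r + 1) * width] for r in range(height)]
    peak = max(range(len(samples)), key=samples.__getitem__)
    glint = (peak % width, peak // width)  # (x, y) of brightest sample
    return img, glint
```

The glint coordinate, combined with a pupil estimate from the same map, is the usual input to a gaze-estimation stage; here only the reassembly step is shown.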

Microsurgery system for displaying in real time magnified digital image sequences of an operated area
10345582 · 2019-07-09

A system captures and displays video of surgeries. The system may include at least one digital image sensor optically coupled to one or more lenses and configured to capture a video sequence of a scene in a surgery; at least one interface configured to receive at least one region of interest (ROI) of the captured video sequence; an electronic display, selected so that at least one of the digital image sensors has a pixel resolution which is substantially greater than the pixel resolution of the electronic display; and a computer processor configured to: receive the at least one captured video sequence and the at least one received ROI and display on the at least one electronic display a portion of the captured video sequence based on the at least one selected ROI.

Dynamic Multi-Sensor and Multi-Robot Interface System
20190126484 · 2019-05-02

An adaptive learning interface system for end-users for controlling one or more machines or robots to perform a given task, combining identification of gaze patterns, EEG channels' signal patterns, voice commands and/or touch commands. The output streams of these sensors are analysed by the processing unit in order to detect one or more patterns that are translated into one or more commands to the robot, to the processing unit or to other devices. A pattern learning mechanism is implemented by keeping an immediate history of outputs collected from those sensors, analysing their individual behaviour and analysing the time correlation between patterns recognized from each of the sensors. Prediction of patterns or combinations of patterns is enabled by analysing the partial history of sensors' outputs. A method for defining a common coordinate system between robots and sensors in a given environment, and thereby dynamically calibrating these sensors and devices, is used to share the characteristics and positions of each object detected in the scene.
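
The cross-sensor time-correlation step described above can be illustrated with a small sketch that pairs events from two timestamped streams when they fall within a short window of each other (the stream shapes, labels, and window size are assumptions for illustration; the abstract leaves these unspecified):

```python
def correlated_events(stream_a, stream_b, window_s=0.5):
    """Pair events from two sensor streams whose timestamps fall
    within `window_s` seconds of each other. Each stream is a list
    of (timestamp, label) tuples sorted by time; a two-pointer sweep
    keeps the matching linear in the combined stream length."""
    pairs, j = [], 0
    for t_a, lab_a in stream_a:
        # Skip b-events that ended before the window around t_a
        while j < len(stream_b) and stream_b[j][0] < t_a - window_s:
            j += 1
        k = j
        while k < len(stream_b) and stream_b[k][0] <= t_a + window_s:
            pairs.append((lab_a, stream_b[k][1]))
            k += 1
    return pairs
```

A recurring pair such as (gaze toward an object, voice "grab") is the kind of correlated pattern the system would learn to translate into a single robot command.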

NON-INVASIVE EYE-TRACKING CONTROL OF NEUROMUSCULAR STIMULATION SYSTEM
20190091472 · 2019-03-28

A non-invasive control system for neuromuscular stimulation includes an eye-tracking device, an electrical stimulation device, and software that interprets the eye movements of the user to determine an intended movement and sends electrical signal(s) to the stimulation device to achieve the intended movement. For example, the stimulation device may be a sleeve with electrodes worn on a paralyzed limb, with the intended movement being the movement of the limb.
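
The software layer that translates interpreted eye movements into stimulation commands can be sketched as a simple lookup with a fail-safe default (the gesture names, electrode groupings, and pulse widths are invented for illustration and do not come from the patent):

```python
# Hypothetical mapping from an interpreted gaze gesture to a stimulation
# pattern for an electrode sleeve worn on a paralyzed limb.
GESTURE_TO_STIM = {
    "look_up":    {"electrodes": (1, 2), "pulse_us": 200},  # e.g. wrist extension
    "look_down":  {"electrodes": (3, 4), "pulse_us": 200},  # e.g. wrist flexion
    "blink_long": {"electrodes": (),     "pulse_us": 0},    # stop stimulation
}

def command_for_gesture(gesture):
    """Translate an eye gesture into a stimulation command, or None
    if the gesture is unrecognized (fail-safe: no stimulation)."""
    return GESTURE_TO_STIM.get(gesture)
```

Returning None for unrecognized gestures reflects the safety-critical design choice such a system would need: no signal is ever sent to the sleeve unless an intended movement has been positively identified.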

EYE-TRACKING WITH MEMS SCANNING AND REFLECTED LIGHT

An eye-tracking system is provided. The system includes an at least partially transparent visible light waveguide having a visible light display region configured to emit visible light to impinge upon an eye of a user. A light source is configured to emit at least infrared (IR) light that travels along an IR light path to impinge on the eye. A microelectromechanical system (MEMS) scanning mirror positioned in the IR light path is configured to direct the IR light along the IR light path. A relay positioned in the IR light path downstream of the MEMS scanning mirror includes at least one mirror configured to reflect the IR light along the IR light path. At least one sensor is configured to receive the IR light after being reflected by the eye.

Dynamic multi-sensor and multi-robot interface system
10179407 · 2019-01-15

An adaptive learning interface system for end-users for controlling one or more machines or robots to perform a given task, combining identification of gaze patterns, EEG channels' signal patterns, voice commands and/or touch commands. The output streams of these sensors are analyzed by the processing unit in order to detect one or more patterns that are translated into one or more commands to the robot, to the processing unit or to other devices. A pattern learning mechanism is implemented by keeping an immediate history of outputs collected from those sensors, analyzing their individual behavior and analyzing the time correlation between patterns recognized from each of the sensors. Prediction of patterns or combinations of patterns is enabled by analyzing the partial history of sensors' outputs. A method for defining a common coordinate system between robots and sensors in a given environment, and thereby dynamically calibrating these sensors and devices, is used to share the characteristics and positions of each object detected in the scene.

Systems and methods for implementing a pointer-guided tracking system and a pointer-guided mechanical movable device control system
10168688 · 2019-01-01

A system and method are provided for facilitating hands-free and precise movement, translation and repositioning of a movable mechanical apparatus, such as an operating-room lighting system, mounted to a mechanically movable base component such as an articulable or articulated robotic-type arm, according to user-input pointing commands, such as laser pointing commands initiated by a user. The user provides hands-free designation of a point of focus with a pointing device. A sensor associated with the movable mechanical apparatus automatically detects the designated point of focus, and a processor determines and executes a scheme of movement for moving the movable mechanical apparatus from its current position to a position proximate to the designated point of focus. A collision-avoidance scheme is also provided for safety and to alert the user to the presence of any impediment in the determined scheme of movement.
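
The movement-planning step with collision checking can be sketched as a straight-line plan that aborts and reports the impediment when any waypoint comes too close to a known obstacle (2D points, a fixed clearance, and the step count are simplifying assumptions; the patent describes the scheme only at the system level):

```python
def plan_motion(current, target, obstacles, clearance=0.1, steps=20):
    """Generate a straight-line waypoint path from `current` to
    `target` (2D points in meters). If any waypoint falls within
    `clearance` of an obstacle, return (None, obstacle) so the
    system can alert the user; otherwise return (path, None)."""
    path = []
    for i in range(1, steps + 1):
        t = i / steps
        p = (current[0] + t * (target[0] - current[0]),
             current[1] + t * (target[1] - current[1]))
        for ob in obstacles:
            if ((p[0] - ob[0]) ** 2 + (p[1] - ob[1]) ** 2) ** 0.5 < clearance:
                return None, ob  # impediment detected: abort and report
        path.append(p)
    return path, None
```

A real articulated-arm planner would work in joint space with swept-volume checks; the sketch only shows the abort-and-alert contract the abstract describes.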