G05B2219/35503

Control device and master slave system

Provided is a control device including a control unit that calculates a first positional relationship between an eye of an observer observing an object displayed on a display unit and a first point in a master-side three-dimensional coordinate system, and controls an imaging unit that images the object so that a second positional relationship between the imaging unit and a second point corresponding to the first point in a slave-side three-dimensional coordinate system corresponds to the first positional relationship.
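The abstract describes mirroring the observer-side (master) eye-to-point relationship onto the camera-side (slave) coordinate system. A minimal sketch of that idea, assuming both coordinate frames are already aligned and a simple translational offset suffices (the function name, scale factor, and frame assumptions are illustrative, not the patented method):

```python
# Illustrative sketch: place the imaging unit so that its offset from the
# slave-side point mirrors the observer's eye offset from the master-side
# point. Frame alignment and the scale factor are assumptions.

def camera_pose(eye, first_point, second_point, scale=1.0):
    """Return a camera position whose offset from `second_point` (slave
    side) matches the eye's offset from `first_point` (master side)."""
    offset = tuple((e - p) * scale for e, p in zip(eye, first_point))
    return tuple(q + o for q, o in zip(second_point, offset))

# Example: the eye sits 0.3 m behind and 0.5 m above the viewed point,
# so the camera is placed with the same offset from the slave-side point.
pose = camera_pose((0.0, -0.3, 0.5), (0.0, 0.0, 0.0), (1.0, 2.0, 0.0))
```

A real system would use full rigid transforms (rotation as well as translation) between the two coordinate systems; the translational case is shown only to make the correspondence concrete.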

Control system
11480940 · 2022-10-25

A control system controls an industrial machine. Each controller includes a screen generation unit which generates a controller screen that is displayed on a controller display unit and which generates a glasses screen that is displayed on a glasses-type display device based on a variation in an internal state of the controller. The glasses-type display device includes: a transmissive glasses display unit which is arranged so as to correspond to the positions of the eyes of a wearer and which can display the generated glasses screen; a glasses-side transmission/reception unit which acquires specific information for identifying the connected controller; and a display control unit which displays the glasses screen and the specific information on the glasses display unit.

Method and device for eye metric acquisition
11624907 · 2023-04-11

The present disclosure relates to a method and a device for acquisition of a metric of an eye (1) located in an acquisition space (29). The device comprises at least one light source (11) configured to emit light towards the acquisition space, a camera (15) configured to receive light from the acquisition space (29) to generate image data, and an analyzing unit (14) configured to extract at least one metric from the image data. The camera (15) is configured to receive light from the acquisition space via at least two light paths (17, 19) which are differently angled with respect to the optical axis of the camera, the light of at least one path being received via a first mirror (21). The camera receives light from an overlapping portion of the acquisition space via the first and second paths, so as to allow the camera to receive at least two representations of a single eye. This metric may be used for, e.g., eye tracking or autorefraction/accommodation.

SYSTEMS AND METHODS FOR IMPLEMENTING A POINTER-GUIDED TRACKING SYSTEM AND A POINTER-GUIDED MECHANICAL MOVABLE DEVICE CONTROL SYSTEM
20170315536 · 2017-11-02

A system and method are provided for facilitating hands-free, precise movement, translation, and repositioning of a movable mechanical apparatus, including an operating room lighting system, mounted to a mechanically movable base component such as an articulable or articulated robotic-type arm, according to user pointing commands, including laser or other like pointing commands initiated by a user. The user provides hands-free designation of a point of focus with a pointing device. A sensor associated with the movable mechanical apparatus automatically detects the designated point of focus, and a processor determines and executes a scheme of movement for moving the movable mechanical apparatus from a current position to a position proximate to the designated point of focus. A collision avoidance scheme is also provided for safety and to alert the user to the presence of any impediment in the determined scheme of movement.
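The movement scheme and collision alert described above can be sketched as straight-line waypoint planning with a clearance check. This is an illustrative assumption, not the patent's implementation; the waypoint spacing, obstacle model, and clearance value are all hypothetical:

```python
# Hedged sketch: a straight-line movement scheme toward the designated
# focus point, plus a simple proximity-based collision alert.

def movement_scheme(current, target, steps=5):
    """Evenly spaced waypoints from `current` to `target` (inclusive)."""
    return [tuple(c + (t - c) * i / steps for c, t in zip(current, target))
            for i in range(1, steps + 1)]

def collides(waypoints, obstacles, clearance=0.1):
    """True if any waypoint comes within `clearance` of an obstacle."""
    return any(
        sum((w - o) ** 2 for w, o in zip(p, obs)) ** 0.5 < clearance
        for p in waypoints for obs in obstacles)

path = movement_scheme((0.0, 0.0), (1.0, 0.0), steps=2)
alert = collides(path, [(0.5, 0.05)])  # obstacle near the midpoint
```

A production planner would use the arm's kinematics and swept-volume checks rather than point-to-point distance, but the structure (plan, then validate against impediments) matches the abstract's description.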

Holographic pattern generation for head-mounted display (HMD) eye tracking using an array of parabolic mirrors

A system for making a holographic medium for use in generating light patterns for eye tracking includes a light source configured to provide light and a beam splitter configured to separate the light into a first portion of the light and a second portion of the light that is spatially separated from the first portion of the light. The system also includes a first set of optical elements configured to transmit the first portion of the light for providing a first wide-field beam onto an optically recordable medium, a second set of optical elements configured to transmit the second portion of the light for providing a second wide-field beam, and a plurality of parabolic reflectors optically coupled with the second set of optical elements and configured to receive the second wide-field beam and project a plurality of separate light patterns onto the optically recordable medium for forming the holographic medium.

Virtual environment generating system

A system and related methods for visually augmenting an appearance of a physical environment as seen by a user through a head-mounted display device are provided. In one embodiment, a virtual environment generating program receives eye-tracking information, lighting information, and depth information from the head-mounted display. The program generates a virtual environment that models the physical environment and is based on the lighting information and the distance of a real-world object from the head-mounted display. The program visually augments a virtual object representation in the virtual environment based on the eye-tracking information, and renders the virtual object representation on a transparent display of the head-mounted display device.

Steerable reticle for visor projected helmet mounted displays

A helmet-mounted display system is described. A visor has an inner reflective surface and is mountable to head gear. A light source is arranged to emit light. Directing optics are arranged to image light from the light source onto the inner reflective surface of the visor to provide a reticle image on the inner reflective surface of the visor. At least one actuator is coupled to the directing optics. An eye tracker is configured to determine the orientation of an eye of a wearer of the head gear. A controller is configured to receive an indication of the determined orientation of the eye, and to control the at least one actuator to change the orientation and shape of the directing optics so as to change the position of the reticle image based on the indication of the determined orientation of the eye, such that the eye views the reticle image.

Collaborative operation support device
11458618 · 2022-10-04

The collaborative operation support device includes a display device having a display area, and a processor configured to: detect, based on an image in which the operator or the robot is represented, a position of a section of the robot in the display area when the operator looks at the robot through the display area, the section being associated with an operation mode of the robot specified by means of an input device; select, in accordance with the specified operation mode of the robot, display data corresponding to the specified mode among display data stored in a memory; and display the selected display data in the display area of the display device in such a way that the selected display data is displayed at a position that satisfies a certain positional relationship with the position of the section of the robot in the display area.
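The placement step above (display data positioned at a certain positional relationship to the detected robot section) can be sketched as a fixed screen-space offset with clamping so the overlay stays inside the display area. The offset, screen size, and function name are illustrative assumptions, not the patent's method:

```python
# Illustrative sketch: place overlay display data at a fixed offset from
# the detected robot section, clamped to the display-area bounds.

def overlay_position(section_xy, offset=(20, -30), screen=(1920, 1080)):
    """Return a display position keeping a fixed offset from the robot
    section, clamped so the overlay remains on screen."""
    x = min(max(section_xy[0] + offset[0], 0), screen[0])
    y = min(max(section_xy[1] + offset[1], 0), screen[1])
    return (x, y)

# Section detected near the top edge: the vertical offset is clamped to 0.
pos = overlay_position((100, 20))
```

Clamping is one simple way to satisfy "a certain positional relationship" while keeping the data visible; a real device might instead flip the offset direction when the section is near an edge.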

MICROSURGERY SYSTEM FOR DISPLAYING IN REAL-TIME MAGNIFIED DIGITAL IMAGE SEQUENCES OF AN OPERATED AREA
20210109349 · 2021-04-15

A microsurgery system comprising a robotic arm, configured for movement; a head mounted display (HMD) configured to display to a user in real-time image sequences of an operated area; at least one camera coupled to said robotic arm, said at least one camera configured to acquire operated-area image sequences of said operated area, said at least one camera being suspended above said operated area and being mechanically and optically disconnected from said HMD, said robotic arm enables said at least one camera to capture said operated-area image sequences from a range of perspectives; a processing device configured to be coupled with said HMD and said at least one camera, said processing device configured to transmit said image sequences of said operated-area to said HMD; and a tracker configured to track at least one of a head of said user, a hand of said user, and a tool held by said user; wherein said robotic arm is enabled to be guided according to movements of tracked at least one of said head, said hand, and said tool.