Patent classifications
G02B27/00
Dual-polarization LiDAR systems and methods
A LiDAR system has a field of view and includes a polarization-based waveguide splitter. The splitter includes a first splitter port, a second splitter port and a common splitter port. A laser is optically coupled to the first splitter port via a single-polarization waveguide. An objective lens optically couples each optical emitter of an array of optical emitters to a respective unique portion of the field of view. An optical switching network is coupled via respective dual-polarization waveguides between the common splitter port and the array of optical emitters. An optical receiver is optically coupled to the second splitter port via a dual-polarization waveguide and is configured to receive light reflected from the field of view. A controller, coupled to the optical switching network, is configured to cause the optical switching network to route light from the laser to a sequence of the optical emitters according to a temporal pattern.
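The controller's routing behavior described above can be sketched as a simple scheduler. This is an illustrative model only: the emitter count, the flat list-of-indices representation of the temporal pattern, and the one-slot-per-entry timing are assumptions, not details from the abstract.

```python
def route_sequence(num_emitters, pattern):
    """Yield (time_slot, emitter_index) pairs for a temporal pattern.

    `pattern` is a list of emitter indices; each entry occupies one
    time slot during which the switching network routes the laser
    to that emitter (hypothetical representation).
    """
    for slot, emitter in enumerate(pattern):
        if not 0 <= emitter < num_emitters:
            raise ValueError(f"emitter {emitter} out of range")
        yield slot, emitter

# Example: a back-and-forth sweep over a 4-emitter array.
schedule = list(route_sequence(4, [0, 1, 2, 3, 2, 1]))
```

Each scheduled emitter illuminates its unique portion of the field of view through the objective lens, so the pattern directly determines the scan order across the scene.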
Measurement method and system
Methods and systems for determining an individual gaze value are disclosed herein. An exemplary method involves: (a) receiving gaze data for a first wearable computing device, wherein the gaze data is indicative of a wearer-view associated with the first wearable computing device, and wherein the first wearable computing device is associated with a first user-account; (b) analyzing the gaze data from the first wearable computing device to detect one or more occurrences of one or more advertisement spaces in the gaze data; (c) based at least in part on the one or more detected advertisement-space occurrences, determining an individual gaze value for the first user-account; and (d) sending a gaze-value indication, wherein the gaze-value indication indicates the individual gaze value for the first user-account.
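Step (c) above reduces detected advertisement-space occurrences to a single number. A minimal sketch of one way to do that, assuming each occurrence carries a dwell time and a uniform per-second weight (the patent does not specify the scoring model):

```python
def individual_gaze_value(occurrences, weight_per_second=1.0):
    """Sum a weight over detected advertisement-space occurrences.

    `occurrences` is a list of dwell times in seconds, one per
    detected advertisement space in the gaze data (assumed shape).
    """
    return sum(weight_per_second * dwell for dwell in occurrences)

# Three detected ad-space occurrences with dwell times of
# 2.5 s, 0.5 s, and 1.0 s for the first user-account.
value = individual_gaze_value([2.5, 0.5, 1.0])
```

The resulting value would then be reported in the gaze-value indication of step (d).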
INFORMATION PROCESSING DEVICE THAT DISPLAYS A VIRTUAL OBJECT RELATIVE TO REAL SPACE
An information processing device including a display unit, a detector, and a first control unit, and a method of using the same. The display unit may be a head-mounted display and is capable of providing the user with a field of view of a real space and a virtual object. The detector detects an azimuth of the display unit around at least one axis, and display of the virtual object is controlled based on the detected azimuth.
INTELLIGENT SYSTEM FOR CONTROLLING FUNCTIONS IN A COMBAT VEHICLE TURRET
A system for controlling turret functions of a land-based combat vehicle includes: a plurality of image detection sensors for recording sequences of images providing an at least partial view of a 360° environment of the land-based combat vehicle; at least one virtual, augmented or mixed reality headset to be worn by an operator, the headset presenting the at least partial view of the environment on a display and including a direction sensor for tracking the orientation imparted to the headset by movements of the operator's head, as well as eye tracking means for tracking the operator's eye movements; and a control unit including at least one computing unit that receives as input and processes the images supplied by the plurality of image detection sensors, the headset position and orientation data supplied by the direction sensor, and the eye position data supplied by the eye tracking means.
EYE TRACKING SYSTEMS AND METHODS
Methods and systems for tracking an individual's eye, by tracking one or more ocular axes, are presented. The technique comprises:
(i) illuminating the eye, over an area of the cornea extending over the pupil, with first and second incident light beams having a transverse cross-sectional area smaller than a predetermined value with respect to an area of the pupil and propagating coaxially along a first optical path defined by the central axes of the first and second incident light beams, wherein the first incident light beam is configured to be reflected from the cornea and the second incident light beam is configured to pass through the cornea and the pupil and to be reflected from a retina region of the eye;
(ii) detecting the respective first and second reflected light beams;
(iii) adjusting the first optical path such that the first reflected light beam propagates along the first optical path and the second reflected light beam propagates along a second optical path having a predetermined spatial relationship with the first optical path, the predetermined spatial relationship being indicative of the ocular axis lying along at least the first optical path; and
(iv) tracking the ocular axis of the eye under changes in gaze direction of the eye by repeating (i) to (iii).
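The adjust-and-repeat behavior of steps (i) through (iii) is, abstractly, a feedback loop that drives the retinal reflection onto its expected second optical path. The sketch below is a purely numeric stand-in for the optics: the error metric, the convergence tolerance, and the adjustment strategy are all assumptions for illustration.

```python
def track_axis(measure_offset, adjust, tol=1e-3, max_iters=100):
    """Iterate until the retinal reflection's offset from its expected
    second optical path falls below `tol` (hypothetical model)."""
    for _ in range(max_iters):
        offset = measure_offset()
        if abs(offset) < tol:
            return True  # ocular axis found along the first optical path
        adjust(offset)   # re-steer the first optical path
    return False

# Toy usage: each adjustment halves a simulated misalignment.
state = {"offset": 0.5}
converged = track_axis(lambda: state["offset"],
                       lambda off: state.update(offset=state["offset"] * 0.5))
```

Repeating the loop as the gaze direction changes, per step (iv), keeps the tracked ocular axis locked to the first optical path.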
HEAD-UP DISPLAY DEVICE AND MOBILE OBJECT
A display region has a curved-surface shape, with upper and lower end portions disposed at positions closer to the visual field than a reference plane and a central portion disposed at a position farther from the visual field than the reference plane. A first convergence angle difference, between the convergence angle from the eye position to the upper end portion and the convergence angle from the eye position to a first point on the reference plane through the upper end portion; a second convergence angle difference, between the convergence angle to the central portion and the convergence angle to a second point on the reference plane through the central portion; and a third convergence angle difference, between the convergence angle to the lower end portion and the convergence angle to a third point on the reference plane through the lower end portion, each fall within four milliradians.
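The four-milliradian criterion can be checked numerically for a point straight ahead, where the convergence angle is 2·atan(IPD / 2d). The interpupillary distance and the two distances below are invented example values, not figures from the patent.

```python
import math

def convergence_angle(ipd, distance):
    """Convergence angle (radians) to a point straight ahead at `distance`."""
    return 2.0 * math.atan(ipd / (2.0 * distance))

ipd = 0.064          # 64 mm interpupillary distance (assumed)
d_surface = 2.00     # distance to a point on the curved display region (assumed)
d_reference = 2.05   # distance to the matching reference-plane point (assumed)

diff = abs(convergence_angle(ipd, d_surface)
           - convergence_angle(ipd, d_reference))
within_spec = diff <= 4e-3  # the abstract's four-milliradian bound
```

With these example numbers the difference is well under a milliradian, comfortably inside the stated bound.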
LENS ASSEMBLY, CAMERA MODULE HAVING A LENS ASSEMBLY FOR MOTOR VEHICLES, AND A METHOD FOR MAKING LENS ASSEMBLY
The camera module has a lens assembly comprising a lens body and a heating element in the form of an optically transparent coating applied to the body; the coating heats the body as electric current flows through it, removing water-based obstructions. The module includes a power supply for supplying electric current to the optically transparent coating through conductors, and a lens barrel for receiving the body, the lens barrel comprising a passageway for the conductors extending within the lens barrel toward the lens body. The method comprises applying high- and low-refractive-index layers and an aluminium-doped zinc oxide layer to the lens body.
ULTRASOUND DEVICES FOR MAKING EYE MEASUREMENTS
The disclosed ultrasound devices may include at least one ultrasound transmitter positioned and configured to transmit ultrasound signals toward a user's face to reflect off a facial feature of the user's face and at least one ultrasound receiver positioned and configured to receive and detect the ultrasound signals reflected off the facial feature. At least one processor may be configured to receive data from the at least one ultrasound receiver and to determine, based on the received data from the at least one ultrasound receiver, at least one of the following eye measurements: an interpupillary distance of the user; an eye relief; or a position of a head-mounted display relative to the facial feature of the user. Various other devices, systems, and methods are also disclosed.
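Reducing the receiver data to a distance rests on the basic time-of-flight relation, distance = speed × round-trip time / 2. The speed of sound and the example timing below are assumptions for illustration; the patent gives no such numbers.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed)

def echo_distance(round_trip_s):
    """One-way distance to the reflecting facial feature, from the
    round-trip time of an ultrasound echo (seconds)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# Example: a 120 µs round trip between a headset-mounted
# transmitter/receiver pair and the user's face.
eye_relief_m = echo_distance(120e-6)  # ≈ 0.0206 m, about 20.6 mm
```

Differencing such distances for echoes off each eye region is one plausible way measurements like interpupillary distance could be derived from the received data.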
LASER BEAM COMBINING APPARATUS, AND COMBINED STEPPED REFLECTOR AND FILLING RATE CALCULATION METHOD THEREOF
A laser beam combining apparatus, a combined stepped reflector, and a filling-rate calculation method thereof are disclosed. The laser beam combining apparatus includes a two-dimensional light-emitting array and the combined stepped reflector, which reflects a plurality of laser beams emitted by the two-dimensional light-emitting array. The combined stepped reflector is composed of a plurality of reflective mirrors that have the same length but sequentially increasing widths and that are stacked in succession; the distance between the centers of the laser beams reflected by the combined stepped reflector is smaller than the distance between their centers prior to incidence, thus increasing the filling rate of the laser beams emitted by the two-dimensional light-emitting array. A method for calculating the filling rate of the laser beam combining apparatus is also provided.
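The effect of reducing beam-center spacing on filling rate can be shown with a simplified one-dimensional model, filling rate = beam width / beam pitch. This is a hypothetical simplification with invented values; the patent's actual calculation method accounts for the stepped-reflector geometry.

```python
def filling_rate(beam_width, beam_pitch):
    """1-D filling rate: fraction of each pitch occupied by the beam."""
    return beam_width / beam_pitch

w = 0.2  # mm, emitter beam width (assumed)

# Pitch before the reflector (emitter spacing of the 2-D array)
# versus after, once the mirrors bring the beam centers closer.
before = filling_rate(w, beam_pitch=1.0)   # sparse: 20 % filled
after = filling_rate(w, beam_pitch=0.25)   # dense: 80 % filled
```

Shrinking the center-to-center distance while the beam width stays fixed is exactly what raises the filling rate in this model.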
METHOD AND DEVICE FOR RECOGNIZING A VIEWING DIRECTION AND/OR A STATE OF AN EYE USING A LASER DEVICE AND LASER DEVICE
A method for recognizing a state of an eye using a laser device. The method includes reading in, using the laser device, an eye parameter that represents a movement of the eye. The eye parameter is compared with a first and/or a second reference parameter to obtain a comparison result. A type of movement of the eye is determined from the comparison result: the type of movement represents a saccadic eye movement when the eye parameter has a greater value than the first reference parameter, and a smooth eye movement when the eye parameter has a value that corresponds at least to the second reference parameter and at most to the first reference parameter. A viewing direction is ascertained as a function of the determined type of movement.
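The comparison logic above amounts to a two-threshold classifier. The sketch below follows that logic; the parameter's units, the reference values in the example, and the "fixation" label for values below the smooth-movement range are assumptions not stated in the abstract.

```python
def classify_eye_movement(eye_param, ref_smooth, ref_saccade):
    """Classify an eye movement against two reference parameters.

    - 'saccade'  when eye_param exceeds the first (upper) reference
    - 'smooth'   when ref_smooth <= eye_param <= ref_saccade
    - 'fixation' otherwise (below the smooth range; assumed label)
    """
    if eye_param > ref_saccade:
        return "saccade"
    if ref_smooth <= eye_param <= ref_saccade:
        return "smooth"
    return "fixation"

# Example references (arbitrary units): smooth range 1..5.
movement = classify_eye_movement(3.0, ref_smooth=1.0, ref_saccade=5.0)
```

The determined type would then feed the viewing-direction calculation in the final step of the method.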