G06F3/0304

DYNAMIC USER INTERACTIONS FOR DISPLAY CONTROL
20180011546 · 2018-01-11 ·

The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system, to controlling a display using gestures, and to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using that device. A library of analogous gestures is identified, containing gestures that are analogous to the control manipulations and cause the same on-screen actions. When a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.

GAMING DEVICE WITH ROTATABLY PLACED CAMERAS
20180011545 · 2018-01-11 ·

A method to identify positions of fingers of a hand is described. The method includes capturing images of a first hand using a plurality of cameras that are part of a wearable device, the wearable device attached to a wrist of a second hand and the cameras disposed around the wearable device. The method includes repeatedly capturing additional images of the first hand, the images and additional images producing a stream of captured image data during a session of presenting a virtual environment in a head-mounted display (HMD). The method includes sending the stream of captured image data to a computing device that is interfaced with the HMD and configured to process the captured image data to identify changes in positions of the fingers of the first hand.
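The capture-and-stream portion of the method can be sketched as a loop over the band's cameras. The camera interface below is a stand-in for illustration, not a real device API.

```python
# Hedged sketch: several wrist-mounted cameras repeatedly capture the
# opposite hand, and the tagged frames form the image-data stream that is
# sent to the computing device interfaced with the HMD.

def capture_frame(camera_id, t):
    """Stand-in for one camera capture; returns a tagged frame record."""
    return {"camera": camera_id, "time": t}

def capture_stream(num_cameras, num_steps):
    """Repeat capture across all cameras to produce the stream of
    captured image data for one stretch of the session."""
    stream = []
    for t in range(num_steps):
        for cam in range(num_cameras):
            stream.append(capture_frame(cam, t))
    return stream
```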

DISPLAY APPARATUS AND DISPLAY METHOD
20180012568 · 2018-01-11 ·

A display apparatus includes: a display unit configured to be curved into a substantially cylindrical shape with its surface oriented outwards and to display information in a region covering the entire periphery or a part of the surface; a viewing intention detector configured to detect a user's intention to view the information displayed on the display unit, the user wearing the display unit on a part of the body; a display region determiner configured to determine, according to the user's viewing intention, a region of the display unit viewable by the user as the display region for the information; and a display control unit configured to control display of the information in the display region.
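The region-determination step can be sketched by parameterizing the cylindrical band by angle around the wrist and selecting the arc facing the user when viewing intention is detected. The angle input and region width are assumptions for illustration.

```python
# Illustrative sketch: choose the display region as an arc of the
# cylindrical display centered on the angle facing the user's eyes,
# and display only when viewing intention is detected.

def display_region(facing_angle_deg, region_width_deg=90):
    """Return (start, end) angles of the region viewable by the user."""
    half = region_width_deg / 2
    start = (facing_angle_deg - half) % 360
    end = (facing_angle_deg + half) % 360
    return start, end

def should_display(viewing_intention_detected, facing_angle_deg):
    """Determine a display region only when viewing intention exists."""
    if not viewing_intention_detected:
        return None
    return display_region(facing_angle_deg)
```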

Addressable crossed line projector for depth camera assembly

A projector for illuminating a target area is presented. The projector includes an array of emitters positioned on a substrate according to a distribution. Each emitter in the array has a non-circular emission area. Operation of at least a portion of the array of emitters is controlled, based in part on emission instructions, to emit light, and the emitted light illuminates the target area. The projector can be part of a depth camera assembly for depth sensing of a local area, or part of an eye tracker for determining a gaze direction of an eye.

OPTICAL DEVICE FOR EXPOSURE OF A SENSOR DEVICE FOR A VEHICLE
20180012922 · 2018-01-11 ·

The invention relates to an optical device (100) for exposure of a sensor device (10) for a vehicle (1). The device has an optical structure (101) comprising an arrangement of optical micro-elements (101.1) that bundle incident light (2) and direct it to respective sensor elements (10.1) of the sensor device (10). The optical structure (101) is configured such that the light (3) directed to a sensor element (10.1) can be concentrated onto the light-active areas (10.2) of the sensor elements (10.1).

IRRADIATION OPTICAL SYSTEM AND PROJECTOR
20180011606 · 2018-01-11 ·

An irradiation optical system includes a uniformizing section and an irradiation lens section. The uniformizing section brings the in-plane distribution of light emitted from a light source close to a uniform in-plane distribution. The irradiation lens section diffuses the uniformized light in a predetermined direction and includes, in order from the light source, a first cylindrical lens and a second cylindrical lens, each having negative refractive power in the predetermined direction.

ELECTRONIC DISPLAY ILLUMINATION

According to an example, a system for electronic display illumination comprises a display, a sensor communicatively coupled to the display to detect a user and a user eye gaze, and a processing resource communicatively coupled to the sensor. In some examples, the processing resource may determine an active screen area and an inactive screen area of the display based on the user eye gaze; instruct a display controller to adjust a display value of the inactive screen area; and transmit active screen area data to a secondary display.
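The gaze-driven step above can be sketched by dividing the display into a grid of regions, marking the region under the gaze point as active, and dimming the rest. The grid size and dimming factor are invented parameters, not from the example.

```python
# Minimal sketch, assuming a grid of screen regions: the region containing
# the gaze point is the active screen area; the others are inactive and
# get an adjusted (dimmed) display value.

def classify_regions(gaze_x, gaze_y, width, height, cols=3, rows=3):
    """Return (active_index, inactive_indices) for a cols x rows grid."""
    col = min(int(gaze_x / (width / cols)), cols - 1)
    row = min(int(gaze_y / (height / rows)), rows - 1)
    active = row * cols + col
    inactive = [i for i in range(cols * rows) if i != active]
    return active, inactive

def dim_value(brightness, factor=0.4):
    """Display value adjustment applied to the inactive screen area."""
    return brightness * factor
```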

SYSTEMS AND METHODS FOR BIOMECHANICALLY-BASED EYE SIGNALS FOR INTERACTING WITH REAL AND VIRTUAL OBJECTS

Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.

INFORMATION PROCESSING APPARATUS, POSITION INFORMATION GENERATION METHOD, AND INFORMATION PROCESSING SYSTEM
20180011543 · 2018-01-11 ·

An information processing apparatus includes a memory storing a program and at least one processor that executes the program to detect a speed of motion of a user based on motion information detected by a detection device, and to generate position information of a position indication display information item that is displayed on a display device and indicates a position designated by the user. When the detected speed of motion does not meet a predetermined speed condition, the position information is generated by restricting the moving direction of the position indication display information item to a predetermined direction.
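The speed-conditioned restriction can be sketched as follows; the threshold value and the choice of the x-axis as the predetermined direction are assumptions for illustration.

```python
# Illustrative sketch: when the user's motion speed falls below an assumed
# threshold, movement of the position indicator is restricted to one
# predetermined axis; otherwise both components are applied.

import math

SPEED_THRESHOLD = 0.5  # stand-in for the "predetermined speed condition"

def next_position(pos, motion, speed_threshold=SPEED_THRESHOLD, axis="x"):
    """Move the position indicator; slow motion is restricted to one axis."""
    dx, dy = motion
    speed = math.hypot(dx, dy)
    if speed < speed_threshold:
        # Restrict the moving direction to the predetermined direction.
        if axis == "x":
            dy = 0.0
        else:
            dx = 0.0
    return (pos[0] + dx, pos[1] + dy)
```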

GESTURE-BASED USER INTERFACE
20180011544 · 2018-01-11 ·

A computer-implemented method for enabling gesture-based interactions between a computer program and a user is disclosed. According to certain embodiments, the method may include initiating the computer program, detecting that a condition has occurred, and activating a gesture-based operation mode of the computer program. The method may also include receiving gesture data generated by a sensor, the gesture data representing a gesture performed by the user. The method may further include performing a task based on the gesture data.
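The described flow amounts to a small state machine: gestures are ignored until the activating condition occurs, after which gesture data is mapped to tasks. The condition handling and the gesture-to-task mapping below are hypothetical.

```python
# Hedged sketch of the claimed flow: initiate program, detect a condition,
# activate the gesture-based operation mode, then perform tasks based on
# sensor gesture data.

class GestureProgram:
    def __init__(self):
        self.gesture_mode = False  # mode off until a condition occurs
        self.performed = []

    def on_condition(self):
        """Condition detected: activate gesture-based operation mode."""
        self.gesture_mode = True

    def on_gesture(self, gesture_data):
        """Perform a task based on gesture data, if the mode is active."""
        if not self.gesture_mode:
            return None
        task = {"swipe": "next_page", "circle": "open_menu"}.get(
            gesture_data, "ignored")
        self.performed.append(task)
        return task
```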