G06F3/0346

APPARATUS FOR CONTROLLING CONTENTS OF A COMPUTER-GENERATED IMAGE USING THREE DIMENSIONAL MEASUREMENTS
20180011553 · 2018-01-11

A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device.
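The core of the claimed cursor control is intersecting the pointing line (fixed relative to the pointing device) with the plane of the projected interaction region. A minimal sketch of that geometric step, assuming a planar interaction region, NumPy vectors, and illustrative names not taken from the patent:

```python
import numpy as np

def point_of_aim(pointer_pos, pointer_dir, plane_point, plane_normal):
    """Intersect the pointing line with the interaction-region plane.

    pointer_pos / pointer_dir: measured position and orientation of the
    pointing device (the pointing line passes through pointer_pos along
    pointer_dir). plane_point / plane_normal: define the region's plane,
    e.g. recovered from the calibration points.
    Returns the 3-D point of aim, or None if the line misses the plane.
    """
    denom = np.dot(pointer_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # pointing line is parallel to the region
    t = np.dot(plane_point - pointer_pos, plane_normal) / denom
    if t < 0:
        return None  # region lies behind the pointing device
    return pointer_pos + t * pointer_dir
```

In the claimed method the plane parameters would come from the calibration step (aiming the pointing line through the known calibration points); the returned point would then be mapped into screen coordinates to drive the cursor.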

INFORMATION PROCESSING APPARATUS, PROGRAM, AND CONTROL METHOD
20180011554 · 2018-01-11

An information processing apparatus includes a display, a sensor, and a controller. The display has a screen. The sensor is configured to detect an inclination. The controller is configured to display a first object on the screen and display a second object associated with the first object on the screen in accordance with the inclination detected by the sensor.
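One plausible reading of the controller's behavior: the second object appears next to the first once the sensed inclination crosses a threshold, with its position tracking the tilt. A sketch under that assumption (threshold, gain, and layout scheme are all illustrative, not from the abstract):

```python
class TiltController:
    """Places a second object relative to a first object in
    accordance with a sensed inclination."""

    def __init__(self, threshold_deg=10.0, gain=2.0):
        self.threshold = threshold_deg  # tilt below this hides the second object
        self.gain = gain                # pixels of offset per degree of tilt

    def layout(self, first_xy, inclination_deg):
        x, y = first_xy
        if abs(inclination_deg) < self.threshold:
            return {"first": (x, y)}           # second object not shown
        dx = int(self.gain * inclination_deg)  # offset follows the tilt
        return {"first": (x, y), "second": (x + dx, y)}
```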

Systems, Methods, and Computer-Readable Media for Generating Computer-Mediated Reality Display Data
20180011552 · 2018-01-11

Systems, methods, and computer-readable media are provided for generating computer-mediated reality display data based on user instantaneous motion data. A system includes at least one sensor, a mediated reality data source, and a mediated reality display generator that generates displayable mediated reality scene data based on (a) current reality data of the system from the at least one sensor; (b) mediated reality data from the mediated reality data source; and (c) instantaneous motion data of the system from the at least one sensor. In one example the mediated reality display generator generates the displayable mediated reality scene data by generating displayable mediated reality frame data based on the current reality data and the mediated reality data. The operations further include selecting a portion of the displayable mediated reality frame data as the displayable mediated reality scene data based on the instantaneous motion data. The portion that is selected is offset from a center of the displayable mediated reality frame data and is less than a frame size of the displayable mediated reality frame data and the offset is selected based on said instantaneous motion.
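The selection step described above (cropping a sub-window of the rendered frame, offset from center in proportion to instantaneous motion) can be sketched as follows; the gain constant and clamping policy are assumptions, not specifics from the abstract:

```python
import numpy as np

def select_scene(frame, crop_h, crop_w, velocity, gain=0.1):
    """Select a portion of the rendered frame as the displayed scene.

    The crop is smaller than the frame and its offset from the frame
    center is proportional to the instantaneous motion (velocity),
    clamped so the crop stays inside the frame.
    """
    H, W = frame.shape[:2]
    cy, cx = H // 2, W // 2
    dy = int(gain * velocity[0])
    dx = int(gain * velocity[1])
    top = min(max(cy - crop_h // 2 + dy, 0), H - crop_h)
    left = min(max(cx - crop_w // 2 + dx, 0), W - crop_w)
    return frame[top:top + crop_h, left:left + crop_w]
```

With zero motion the crop sits at the frame center; as motion grows the crop shifts ahead of it, which is consistent with the abstract's motion-dependent offset.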

Addressable crossed line projector for depth camera assembly

A projector for illuminating a target area is presented. The projector includes an array of emitters positioned on a substrate according to a distribution. Each emitter in the array of emitters has a non-circular emission area. Operation of at least a portion of the array of emitters is controlled based in part on emission instructions to emit light. The light from the projector is configured to illuminate the target area. The projector can be part of a depth camera assembly for depth sensing of a local area, or part of an eye tracker for determining a gaze direction for an eye.
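The addressable control of the emitter array ("operation of at least a portion of the array ... controlled based in part on emission instructions") can be modeled as building a drive mask. The crossed-line addressing scheme below, where instructions switch on whole rows or columns of line-shaped emitters, is an assumption suggested by the title rather than a detail from the abstract:

```python
import numpy as np

def drive_crossed_lines(rows, cols, line_instructions):
    """Build an on/off drive mask for an addressable emitter array.

    line_instructions: iterable of ('row', i) or ('col', j), each
    selecting a whole line of emitters to switch on.
    """
    mask = np.zeros((rows, cols), dtype=bool)
    for axis, idx in line_instructions:
        if axis == "row":
            mask[idx, :] = True
        else:
            mask[:, idx] = True
    return mask
```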

METHOD AND APPARATUS FOR SMART HOME CONTROL BASED ON SMART WATCHES
20180011456 · 2018-01-11

A method for controlling a smart home using a smart watch is disclosed. The method includes: detecting whether the smart watch has entered a sensing range of the smart home; detecting, after the smart watch has entered the sensing range of the smart home, whether the smart watch has established a wireless connection with the smart home; turning on, after the smart watch has established the wireless connection with the smart home, a smart-home-control function of the smart watch; and, while controlling the smart home using the smart-home-control function, recognizing hand gestures of the user using the smart watch and controlling the smart home through the wireless connection to switch the current working state of the smart home based on the recognized hand gestures of the user.
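The claimed sequence is a small state machine: range detection, then connection, then enabling of the control function, then gesture-driven state switching. A sketch of that flow; the gesture names and their mapping to working states are hypothetical:

```python
GESTURE_ACTIONS = {"wave_right": "on", "wave_left": "off"}  # hypothetical gestures

class SmartHomeController:
    """Gates gesture control on range entry and a wireless connection,
    mirroring the claimed order of steps."""

    def __init__(self):
        self.in_range = False
        self.connected = False
        self.control_enabled = False
        self.state = "off"  # current working state of the smart home

    def on_enter_range(self):
        self.in_range = True

    def on_connect(self):
        if self.in_range:  # connection only matters once in sensing range
            self.connected = True
            self.control_enabled = True  # turn on the smart-home-control function

    def on_gesture(self, gesture):
        if self.control_enabled and gesture in GESTURE_ACTIONS:
            self.state = GESTURE_ACTIONS[gesture]  # switch the working state
        return self.state
```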

Method for Using a Physical Object to Manipulate a Corresponding Virtual Object in a Virtual Environment, and Associated Apparatus and Computer Program Product
20180008355 · 2018-01-11

Systems and methods are provided for planning a procedure. A display device is configured to display a first virtual element. A controller device having a processor is configured to be in communication with the display device, and the controller device is further configured to direct the display device to display the first virtual element. A physical control element is in communication with the controller device, and is configured to correspond to the first virtual element such that an actual manipulation of the control element is displayed, via the processor of the controller device and on the display device, as a corresponding response of the first virtual element to the actual manipulation of the control element. Associated systems, methods, and computer program products are also provided.
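The correspondence described above (an actual manipulation of the physical control element displayed as a response of the virtual element) amounts to forwarding manipulation events from the control to its paired virtual element and re-rendering. A toy sketch; the rotation manipulation and render format are illustrative only:

```python
class VirtualElement:
    """A displayed virtual element with one manipulable property."""

    def __init__(self):
        self.angle = 0.0

    def render(self):
        return f"angle={self.angle:.1f}"  # stand-in for the display output

class PhysicalControl:
    """Physical control element paired with a virtual element; each
    actual manipulation is mirrored as a response of that element."""

    def __init__(self, virtual):
        self.virtual = virtual

    def rotate(self, degrees):
        self.virtual.angle += degrees   # controller applies the manipulation
        return self.virtual.render()    # display shows the corresponding response
```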

WEARABLE DEVICE, AND METHOD OF INPUTTING INFORMATION USING THE SAME

Disclosed is a wearable device including a sensor array having a plurality of sensors each configured to detect a physical change in the epidermis of a corresponding body area; and a body motion determination unit configured to determine movement of a body part based on sensing signals from the plurality of sensors, and to determine whether the determined movement corresponds to one of at least one next motion that can be derived from a current motion state.
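The determination unit's two steps can be sketched as: classify the sensor signals against known motion patterns, then accept the result only if it is a motion derivable from the current motion state. The transition table, template matching by L1 distance, and all motion names are assumptions for illustration:

```python
# Hypothetical transition table: motions reachable from each motion state
NEXT_MOTIONS = {
    "open_hand": {"fist", "pinch"},
    "fist": {"open_hand"},
}

def classify(signals, templates):
    """Pick the template motion whose sensor pattern is closest (L1)."""
    return min(templates,
               key=lambda m: sum(abs(a - b) for a, b in zip(signals, templates[m])))

def determine_motion(signals, current, templates):
    """Return the new motion if it is a valid next motion from the
    current state; otherwise keep the current motion."""
    candidate = classify(signals, templates)
    return candidate if candidate in NEXT_MOTIONS.get(current, set()) else current
```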
