Patent classification
G06F3/038
ELECTRONIC DEVICE AND SCREEN DISPLAY METHOD THEREOF
An electronic device includes a communicator configured to communicate with an external device, a display configured to display a UI (User Interface) element on a screen, and a processor. The processor receives, through the communicator, touch panel information of the external device and first data corresponding to a first input of a user detected on the touch panel of the external device, and changes a location of the UI element displayed on the screen based on the touch panel information and the first data.
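The abstract does not specify how the touch panel information is combined with the touch data; a natural reading is that the panel's dimensions are used to scale the touch coordinates into the local screen's coordinate space. A minimal sketch of that assumed linear mapping:

```python
def map_touch_to_screen(touch, panel_size, screen_size):
    """Scale a touch point from the external device's panel coordinates
    to this device's screen coordinates.

    This simple linear mapping is an assumption for illustration; the
    patent does not specify the actual mapping.
    """
    tx, ty = touch
    pw, ph = panel_size
    sw, sh = screen_size
    return (tx * sw / pw, ty * sh / ph)

# A touch at the center of a 100x200 panel lands at the center
# of a 1920x1080 screen.
new_location = map_touch_to_screen((50, 100), (100, 200), (1920, 1080))
```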
ELECTRONIC DEVICE, WEARABLE DEVICE, AND METHOD FOR CONTROLLING SCREEN OF ELECTRONIC DEVICE
A method for controlling a screen in an electronic device is provided. The method includes receiving, from a first external device, rotation input information entered by rotation of a rotation input device included in the first external device, and controlling an object displayed on a display based on the received rotation input information.
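One plausible way to "control an object based on the received rotation input information" is to treat each rotary detent as a step applied to a displayed value, clamped to its valid range. The handler below is a hypothetical sketch, not the patent's implementation:

```python
def apply_rotation_input(value, detents, step=1.0, lo=0.0, hi=100.0):
    """Adjust a displayed value (e.g. volume, zoom, scroll position) by
    the number of rotation detents reported by the wearable's rotary
    input device, clamped to [lo, hi].

    Names and the clamped-step model are illustrative assumptions.
    """
    return max(lo, min(hi, value + detents * step))

# Three clockwise detents nudge the value up; large inputs saturate.
nudged = apply_rotation_input(50.0, 3)
saturated = apply_rotation_input(99.0, 5)
```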
Pose estimation using electromagnetic tracking
Head-mounted augmented reality (AR) devices can track the pose of a wearer's head to provide a three-dimensional virtual representation of objects in the wearer's environment. An electromagnetic (EM) tracking system can track head or body pose. A handheld user input device can include an EM emitter that generates an EM field, and the head-mounted AR device can include an EM sensor that senses the EM field. EM information from the sensor can be analyzed to determine the location and/or orientation of the sensor, and thereby the wearer's pose. An improved or optimized pose can be provided by reverse-estimating a reverse EM measurement matrix and optimizing the pose based on a comparison between the reverse EM measurement matrix and the EM measurement matrix measured by the EM sensor.
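The core idea of comparing a predicted EM measurement against the sensed one, and picking the pose that minimizes the mismatch, can be illustrated with a toy model. The dipole-style field model and grid search below are stand-ins (assumptions) for the patent's actual EM measurement matrices and optimizer:

```python
import math

def predicted_field(pos):
    """Toy forward model: field magnitude decays with the cube of the
    distance from an emitter at the origin (a dipole-like assumption,
    not the patent's model)."""
    r = math.sqrt(sum(c * c for c in pos)) + 1e-9
    return 1.0 / r ** 3

def estimate_pose(measured, candidates):
    """Pick the candidate pose whose predicted EM measurement best
    matches the sensed one. A grid search stands in for the patent's
    optimization over the comparison residual."""
    return min(candidates, key=lambda p: (predicted_field(p) - measured) ** 2)

# Simulate a sensor 2 m from the emitter, then recover its position
# from the measurement alone.
true_pos = (0.0, 2.0, 0.0)
measured = predicted_field(true_pos)
grid = [(0.0, y, 0.0) for y in (1.0, 1.5, 2.0, 2.5, 3.0)]
best = estimate_pose(measured, grid)
```

A real system would optimize over full 6DoF poses and matrix-valued measurements, but the residual-minimization structure is the same.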
Synchronizing Augmented or Virtual Reality (AR/VR) Applications with Companion Device Interfaces
An augmented reality or virtual reality (AR/VR) device pairs with a companion device to augment input interfaces associated with an AR/VR application at the AR/VR device. In implementations, an AR/VR device determines a portion of a markup file that corresponds to an AR/VR scene of a plurality of AR/VR scenes in an AR/VR environment, and communicates the portion of the markup file to the companion device to cause the companion device to configure a companion user interface associated with initiating an action as part of the AR/VR scene. In response to receiving user input via the companion user interface, the companion device communicates the action to the AR/VR device to initiate the action. The AR/VR device receives input data from the companion device, and initiates the action for the AR/VR scene.
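The described round trip (select the markup portion for the active scene, send it to the companion, receive the chosen action back) can be sketched as follows; the class, scene names, and markup strings are illustrative assumptions, not the patent's interfaces:

```python
class CompanionLink:
    """Minimal sketch of the AR/VR-to-companion round trip: the AR/VR
    device sends the markup portion for the current scene, and the
    companion later reports the user's chosen action."""

    def __init__(self, markup_by_scene):
        # Mapping from scene name to its portion of the markup file.
        self.markup_by_scene = markup_by_scene

    def configure_companion(self, scene):
        """Return the markup portion the companion should render for
        this scene."""
        return self.markup_by_scene[scene]

    def on_companion_input(self, action, handlers):
        """Dispatch an action reported by the companion to the AR/VR
        application's handler for the current scene."""
        return handlers[action]()

link = CompanionLink({"lobby": "<panel><button action='start'/></panel>"})
markup = link.configure_companion("lobby")
result = link.on_companion_input("start", {"start": lambda: "scene started"})
```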
REDUNDANT TRACKING SYSTEM
A redundant tracking system comprising multiple redundant tracking sub-systems, and enabling seamless transitions between them, merges multiple tracking approaches into a single tracking system. The system combines six-degrees-of-freedom (6DoF) and three-degrees-of-freedom (3DoF) tracking of objects by combining and transitioning between the tracking sub-systems based on the availability of the tracking indicia each one tracks. Thus, as the indicia tracked by any one sub-system become unavailable, the redundant tracking system seamlessly switches between tracking in 6DoF and 3DoF, providing the user with an uninterrupted experience.
METHOD AND SYSTEM FOR DETERMINING A CORRECT REPRODUCTION OF A MOVEMENT
A method for determining a correct reproduction of a movement of a target, based on a plurality of orientations of the target at different time instants including at least a first and a second time instant, the second time instant being posterior to the first. The movement is defined by at least a first predetermined constraint, which is defined for first and second orientations of the plurality and specified by a start angle, an end angle, and a first plane definition. The method comprises: providing a first plane and a second plane, each defined by the first plane definition and corresponding to the first and second time instants, respectively; providing a first pair of vectors by projecting the first and second orientations corresponding to the first time instant onto the first plane; providing a second pair of vectors by projecting the first and second orientations corresponding to the second time instant onto the second plane; computing first and second angles between the vectors of the first and second pairs, respectively; and determining that the movement is correctly reproduced if the first angle is equal to or less than the start angle and the second angle is equal to or greater than the end angle.
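The projection-and-angle test described above can be sketched directly: project both orientation vectors onto the constraint plane at each time instant, measure the angle between the projections, and check the start/end thresholds. Vector orientations and a fixed plane normal are simplifying assumptions (the patent defines planes per time instant via the plane definition):

```python
import math

def project(v, normal):
    """Project vector v onto the plane through the origin with unit
    normal `normal`."""
    d = sum(a * b for a, b in zip(v, normal))
    return tuple(a - d * n for a, n in zip(v, normal))

def angle_deg(u, v):
    """Angle between two vectors in degrees, with the cosine clamped
    to [-1, 1] to absorb floating-point error."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def movement_reproduced(o1_t1, o2_t1, o1_t2, o2_t2,
                        normal, start_angle, end_angle):
    """Check the constraint: the angle between the projected first and
    second orientations must be <= start_angle at the first time instant
    and >= end_angle at the second."""
    a1 = angle_deg(project(o1_t1, normal), project(o2_t1, normal))
    a2 = angle_deg(project(o1_t2, normal), project(o2_t2, normal))
    return a1 <= start_angle and a2 >= end_angle

# Orientations nearly aligned at t1 (~5.7 deg), perpendicular at t2
# (90 deg): satisfies start_angle=10, end_angle=80.
ok = movement_reproduced((1, 0, 0), (1, 0.1, 0),
                         (1, 0, 0), (0, 1, 0),
                         (0, 0, 1), 10.0, 80.0)
```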
Sensors for Electronic Finger Devices
A system may include one or more finger-mounted devices such as finger devices with U-shaped housings configured to be mounted on a user's fingers while gathering sensor input and supplying haptic output. The sensors may include strain gauge circuitry mounted on elongated arms of the housing; when the arms move due to finger forces, the strain gauge circuitry can measure the arm movement. The sensors may also include ultrasonic sensors. An ultrasonic sensor may have an ultrasonic signal emitter and a corresponding ultrasonic signal detector configured to detect the ultrasonic signals after they pass through a user's finger. A two-dimensional ultrasonic sensor may capture ultrasonic images of a user's finger pad. Ultrasonic proximity sensors may be used to measure distances between finger devices and external surfaces. Optical sensors and other sensors may also be used in the finger devices.