Patent classifications
G06F3/0308
HANDHELD ELECTRONIC DEVICE
A handheld electronic device including a holding assembly and a tracking assembly is provided. The tracking assembly includes a main body and a plurality of trackers. The main body is connected to the holding assembly and has an inner surface and an outer surface. The trackers are disposed on the inner surface and the outer surface of the main body. The trackers arranged on the outer surface and a part of the trackers arranged on the inner surface are exposed to the outside in a top view direction and are arranged in an interleaved manner.
LIGHT CAPTURE DEVICE
In some implementations, an apparatus may include a housing enclosing circuitry that may include a processor and a memory, the housing forming a handgrip. In addition, the apparatus may include a plurality of light sensors arranged in a particular configuration, each of the plurality of light sensors coupled to an exterior of the housing via a sensor arm. Also, the apparatus may include one or more controls mounted on the exterior of the housing and electrically coupled to the circuitry. The apparatus can include one or more antennas mounted on an exterior of the housing, and a transmitter connected to the circuitry and electrically connected to the one or more antennas to send data from the apparatus via a wireless protocol. The apparatus can include a mount for mounting an electronic device to the housing, the electronic device configured to execute an application for an immersive content generation system.
Controller movement tracking with light emitters
A head-worn computer includes a camera system positioned to capture a surrounding environment in front of a user, and a processor that identifies the positions of a plurality of light emitters mounted on a hand-held controller from images captured by the camera system, tracks those positions as the hand-held controller moves in the surrounding environment, and interprets the tracked positions as positional changes of the hand-held controller. The processor uses the positions of the plurality of light emitters as markers in three-dimensional space, the markers used as an anchor for virtual content presented in a see-through display of the head-worn computer.
SENSORY STIMULI DEVICES FOR THE VISUALLY IMPAIRED
A method for improving spatial awareness of a user during drawing includes obtaining, using one or more sensors, position information of a drawing instrument relative to a drawing canvas. The method includes determining, using one or more processors, a first vibration intensity when the drawing instrument is within a first region of the drawing canvas based on the obtained position information. The method includes determining, using the one or more processors, a second vibration intensity when the drawing instrument is within a second region of the drawing canvas based on the obtained position information. The second vibration intensity can be larger than the first vibration intensity and the second region can be distinct from the first region. The method includes generating, by a sensory device, a vibration based on one or both of the first vibration intensity or the second vibration intensity.
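The region-dependent vibration selection in the abstract can be sketched as follows; the canvas dimensions, edge-margin definition of the "second region", and intensity values are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: pick a vibration intensity from the drawing
# instrument's position on the canvas. Region bounds and intensity
# values are assumptions for illustration.

def vibration_intensity(pos, canvas_width, canvas_height, edge_margin=0.1):
    """Return a vibration intensity for a pen position on the canvas.

    Positions near the canvas edge (the "second region") get a stronger
    vibration than positions in the interior (the "first region").
    """
    x, y = pos
    # Normalized distance from the position to the nearest canvas edge.
    d = min(x, canvas_width - x, y, canvas_height - y) / min(canvas_width, canvas_height)
    FIRST_INTENSITY = 0.2    # interior of the canvas
    SECOND_INTENSITY = 0.8   # near the edge; larger, per the abstract
    return SECOND_INTENSITY if d < edge_margin else FIRST_INTENSITY
```

A sensory device would then drive its actuator with whichever intensity this returns for the current pen position.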
Reference position setting method and operation detection device for displaying an operation surface as a virtual image
A reference position setting method includes: a process of displaying at least three markers on an operation surface; a process of acquiring coordinate values of a sensor coordinate system; a process of transforming the acquired coordinate values into coordinate values of a temporary coordinate system; and a process of transforming the transformed coordinate values into coordinate values of a screen coordinate system. At least one of parallel movement and rotation is performed with respect to the sensor coordinate system to transform the sensor coordinate system into the temporary coordinate system. Movement of the temporary coordinate system in a direction parallel to a plane including a second X-axis and a second Y-axis, and enlargement or reduction of the temporary coordinate system, are performed to transform the temporary coordinate system into the screen coordinate system.
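The two-stage transform chain (sensor frame → temporary frame → screen frame) can be sketched in 2D as below; the rotation angle, translation, offset, and scale values are illustrative assumptions.

```python
# Minimal 2D sketch of the abstract's coordinate pipeline. Stage 1 applies
# rotation and parallel movement (sensor -> temporary); stage 2 applies
# in-plane movement plus enlargement/reduction (temporary -> screen).
import math

def sensor_to_temporary(p, rot_deg, trans):
    """Rotate and translate a sensor-frame point into the temporary frame."""
    a = math.radians(rot_deg)
    x, y = p
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + trans[0], yr + trans[1])

def temporary_to_screen(p, offset, scale):
    """Move the temporary frame parallel to its X/Y plane and scale it."""
    return ((p[0] + offset[0]) * scale, (p[1] + offset[1]) * scale)

# Example: map one marker coordinate through both stages.
tmp = sensor_to_temporary((1.0, 0.0), rot_deg=90, trans=(0.0, 0.0))
scr = temporary_to_screen(tmp, offset=(1.0, 1.0), scale=2.0)
```

In practice the parameters of both stages would be solved for from the at-least-three displayed markers rather than hard-coded.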
Systems and methods for determining projected target location of a handheld object
A projected target location of a handheld object is determined based on applying translation factors, scaling factors, and offsets to a location of a reference element of the handheld object detected by a camera on a two-dimensional plane. The translation factors are determined based on a difference between a calibration location on the plane and an initial location of the reference element corresponding to the calibration location, and serve to shift the location of the reference element to generate the projected target location. The scaling factors are determined based on an estimated length of a user's arm holding the handheld object, and serve to scale the location of the reference element to generate the projected target location. The offsets are determined based on polynomial equations, and serve to extend the distance between the projected target location and the calibration location.
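A sketch of that three-step pipeline (shift by translation factors, scale about the calibration location, extend by polynomial offsets) follows; the function name, the single symmetric scale factor, and the polynomial coefficients are assumptions for illustration.

```python
# Hypothetical sketch of the projection pipeline in the abstract:
# 1) shift the detected reference-element location by the translation
#    factors (calibration location minus initial location),
# 2) scale about the calibration location (factor derived from an
#    estimated arm length), and
# 3) extend the distance from the calibration location by a polynomial
#    offset in the radial distance.

def project_target(ref_xy, calib_xy, init_xy, arm_scale, poly=(0.0, 0.0, 0.1)):
    """Map a detected reference-element location to a projected target."""
    # Translation factors from the calibration step.
    tx = calib_xy[0] - init_xy[0]
    ty = calib_xy[1] - init_xy[1]
    # Shift, then scale about the calibration location.
    x = calib_xy[0] + arm_scale * (ref_xy[0] + tx - calib_xy[0])
    y = calib_xy[1] + arm_scale * (ref_xy[1] + ty - calib_xy[1])
    # Polynomial offset extends the distance from the calibration location.
    dx, dy = x - calib_xy[0], y - calib_xy[1]
    r = (dx * dx + dy * dy) ** 0.5
    gain = poly[0] + poly[1] * r + poly[2] * r * r
    return (x + dx * gain, y + dy * gain)
```

With a zero polynomial the result is a pure shift-and-scale of the camera-detected location.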
Light emitting apparatus recognition system and light emitting apparatus
A light emitting apparatus recognition system includes a light emitting apparatus, an image capturing apparatus, a recognition apparatus, and an electromagnetic wave emitting element. The light emitting apparatus includes a light source that blinks on the basis of identification information unique to the light emitting apparatus and an electromagnetic wave receiving element. The image capturing apparatus captures an image of light emitted by the light source of the light emitting apparatus. The recognition apparatus recognizes the light emitting apparatus on the basis of the light appearing in the image captured by the image capturing apparatus. The electromagnetic wave emitting element emits an electromagnetic wave from the image capturing apparatus or a vicinity of the image capturing apparatus. The electromagnetic wave receiving element receives the electromagnetic wave emitted by the electromagnetic wave emitting element.
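The recognition step, matching a blink pattern observed in captured images against per-apparatus identification codes, might look like the sketch below; the code table, code length, and brightness threshold are assumptions.

```python
# Hypothetical sketch: recognize a light emitting apparatus from the on/off
# blink pattern of its light source across video frames. The ID codes and
# threshold below are illustrative assumptions.

KNOWN_IDS = {
    (1, 0, 1, 1, 0, 0): "apparatus-A",
    (1, 1, 0, 0, 1, 0): "apparatus-B",
}

def decode_blinks(brightness_per_frame, threshold=0.5):
    """Threshold per-frame brightness samples into an on/off blink code."""
    return tuple(1 if b > threshold else 0 for b in brightness_per_frame)

def recognize(brightness_per_frame):
    """Match the decoded blink code against known apparatus IDs."""
    return KNOWN_IDS.get(decode_blinks(brightness_per_frame))
```

The electromagnetic-wave link in the abstract could then synchronize when each apparatus starts emitting its code, so frames line up with code bits.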
Optimizing head mounted displays for augmented reality
While many augmented reality systems provide “see-through” transparent or translucent displays upon which to project virtual objects, many virtual reality systems instead employ opaque, enclosed screens. Indeed, eliminating the user's perception of the real world may be integral to some successful virtual reality experiences. Thus, head mounted displays designed exclusively for virtual reality experiences may not be easily repurposed to capture significant portions of the augmented reality market. Various of the disclosed embodiments facilitate the repurposing of a virtual reality device for augmented reality use. Particularly, by anticipating user head motion, embodiments may facilitate scene renderings better aligned with user expectations than naïve renderings generated within the enclosed field of view. In some embodiments, the system may use procedural mapping methods to generate a virtual model of the environment. The system may then use this model to supplement the anticipatory rendering.
BRIGHTNESS ADJUSTMENT METHOD AND HMD DEVICE
An electronic device is provided. The electronic device includes at least one external device, and a head-mounted display (HMD) device wirelessly connected to the at least one external device. The at least one external device includes a first communication module, a light-emitting device, and a first processor for controlling the brightness of the light-emitting device based on a brightness adjustment signal received from the HMD device. The HMD device includes a second communication module, a sensor for measuring external illuminance, and a second processor for acquiring the external illuminance measured by the sensor and, based on the acquired external illuminance, transmitting the brightness adjustment signal, for adjusting the brightness of the light-emitting device of the at least one external device, to the at least one external device by using the second communication module.
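The HMD side of that feedback loop reduces to mapping a measured illuminance to a brightness level to transmit; the linear mapping, lux ceiling, and 0–255 range below are assumptions for illustration.

```python
# Hypothetical sketch of deriving the brightness adjustment signal from the
# HMD's illuminance sensor reading. Mapping and ranges are assumptions.

def brightness_adjustment_signal(illuminance_lux, max_lux=1000.0):
    """Map measured external illuminance to a 0..255 brightness level.

    Brighter surroundings call for a brighter light-emitting device so it
    remains distinguishable to the HMD.
    """
    level = int(255 * min(illuminance_lux, max_lux) / max_lux)
    return max(0, min(255, level))
```

The HMD would send this level over its communication module; the external device's processor would then drive its light-emitting device accordingly.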
Systems and methods for augmented reality
Disclosed is a method of localizing a user operating a plurality of sensing components, preferably in an augmented or mixed reality environment, the method comprising: transmitting pose data from a fixed control and processing module; receiving the pose data at a first sensing component; and transforming the pose data into a first component-relative pose in a coordinate frame based on the control and processing module. A display unit in communication with the first sensing component is updated with the transformed first component-relative pose to render virtual content with improved environmental awareness.
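Re-expressing a pose broadcast in the control module's frame relative to a sensing component can be sketched in 2D as below; representing poses as (x, y, heading) triples is an assumption for illustration.

```python
# Minimal 2D sketch: transform a pose given in the fixed control-and-
# processing module's frame into a sensing component's relative frame.
# The (x, y, theta) pose representation is an illustrative assumption.
import math

def to_component_frame(pose, component_pose):
    """Express `pose` (world frame) relative to `component_pose`."""
    px, py, pt = pose
    cx, cy, ct = component_pose
    dx, dy = px - cx, py - cy
    # Rotate the positional offset into the component's frame.
    x = dx * math.cos(-ct) - dy * math.sin(-ct)
    y = dx * math.sin(-ct) + dy * math.cos(-ct)
    return (x, y, pt - ct)
```

The component-relative pose produced here is what a display unit would consume to anchor virtual content consistently as the user moves.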