Patent classifications
G06F3/0325
METHOD AND APPARATUS FOR VIRTUALIZING A COMPUTER ACCESSORY
A system that incorporates the subject disclosure may include, for example, a device that may perform operations for receiving a signal from an operational space associated with an accessory device, determining hand feature information according to the signal, accessing device location information associated with the accessory device, generating hand-device interaction information according to the hand feature information and the device location information, and transmitting the hand-device interaction information to a virtual reality system, wherein the virtual reality system generates, according to the hand-device interaction information, a virtual hand and a virtual accessory device in a virtual reality image. Additional embodiments are disclosed.
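The pipeline claimed above can be sketched roughly as follows. This is a hypothetical illustration, not the patent's implementation: the function name, the keypoint representation, and the touch threshold are all assumptions.

```python
# Hypothetical sketch: derive hand feature information (keypoints),
# combine it with the accessory device's location, and emit
# hand-device interaction information a VR system could use to
# render a virtual hand over a virtual accessory.

def hand_device_interaction(hand_keypoints, device_position):
    """Express each hand keypoint relative to the accessory device.

    hand_keypoints: dict of joint name -> (x, y, z) in sensor space.
    device_position: (x, y, z) of the accessory in the same space.
    """
    dx, dy, dz = device_position
    relative = {
        joint: (x - dx, y - dy, z - dz)
        for joint, (x, y, z) in hand_keypoints.items()
    }
    # A keypoint very close to the device is treated as touching it
    # (1 cm threshold, assuming metre units - an assumption).
    touching = [j for j, (x, y, z) in relative.items()
                if (x * x + y * y + z * z) ** 0.5 < 0.01]
    return {"relative_keypoints": relative, "touching_joints": touching}

info = hand_device_interaction(
    {"index_tip": (0.105, 0.2, 0.3), "wrist": (0.0, 0.1, 0.3)},
    (0.1, 0.2, 0.3),
)
```

The returned structure stands in for the "hand-device interaction information" that would be transmitted to the VR system.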
Display apparatus
Methods and devices for a display apparatus. In one aspect, a display apparatus includes a display device including a transparent layer, a display integrated circuit layer including one or more display control circuits, and a shielding layer between the transparent layer and the display integrated circuit layer, a near-infrared (NIR) light source and a visible light source, and a detector device including a detector integrated circuit layer including one or more detector control circuits, where a surface of the detector device contacts a surface of the display device, and a photodetector electrically coupled to at least one detector control circuit and including a detection region positioned to receive NIR light propagating from a front side of the display device to a back side of the display device along a path, where the shielding layer includes a filter region positioned in the path.
STEREOSCOPIC DISPLAY
A direct interaction stereoscopic display system that produces an augmented or virtual reality environment. The system comprises one or more displays, a beam combiner, and a mirrored surface to virtually project high-resolution flicker-free stereoscopic 3D imagery into a graphics volume in an open region. Viewpoint tracking is provided, enabling motion parallax cues. A user interaction volume co-inhabits the graphics volume, and a precise low-latency sensor allows users to directly interact with 3D virtual objects or interfaces without occluding the graphics. An adjustable support frame permits the 3D imagery to be readily positioned in situ with real environments for augmented reality applications. Individual display components may be adjusted to precisely align the 3D imagery with components of real environments for high-precision applications and also to match accommodation-vergence distances to prevent eye strain. The system's modular design and adjustability allow display panel pairs of various sizes and models to be installed.
Non-mechanical beam steering for depth sensing
A depth camera assembly (DCA) for depth sensing of a local area. The DCA includes a transmitter, a receiver, and a controller. The transmitter illuminates a local area with outgoing light in accordance with emission instructions. The transmitter includes a fine steering element and a coarse steering element. The fine steering element deflects one or more optical beams at a first deflection angle to generate one or more first order deflected scanning beams. The coarse steering element deflects the one or more first order deflected scanning beams at a second deflection angle to generate the outgoing light projected into the local area. The receiver captures one or more images of the local area including portions of the outgoing light reflected from the local area. The controller determines depth information for one or more objects in the local area based in part on the captured one or more images.
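The two-stage steering described above can be illustrated with a toy model in which the fine and coarse deflection angles simply add, and a scan is built by nesting fine steps inside coarse steps. The additive model, the angle pitches, and the function names are assumptions for illustration only.

```python
import math

# Toy sketch of fine + coarse beam steering: the fine element adds a
# small deflection, the coarse element a large one, and the outgoing
# beam direction is their sum.

def steer(fine_deg, coarse_deg):
    """Outgoing beam direction as a 2D unit vector in the steering
    plane, given the two deflection angles in degrees."""
    total = math.radians(fine_deg + coarse_deg)
    return (math.cos(total), math.sin(total))

def scan_angles(coarse_steps, fine_steps, coarse_pitch=10.0, fine_pitch=0.5):
    """Raster the local area: coarse steps select a sector, fine
    steps fill it in. Returns the list of total angles in degrees."""
    return [c * coarse_pitch + f * fine_pitch
            for c in range(coarse_steps)
            for f in range(fine_steps)]
```

With two coarse and three fine steps, `scan_angles(2, 3)` yields angles in two sectors, each densely covered by the fine element.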
Electronic system with gesture processing mechanism and method of operation thereof
An electronic system includes a control unit, configured to identify a first sensor reading for capturing a gesture directed at a display interface using a first range profile; identify a second sensor reading for capturing the gesture directed at the display interface using a second range profile; calculate a blended position indicator based on the first sensor reading, the second sensor reading, or a combination thereof; and a communication interface, coupled to the control unit, configured to communicate the blended position indicator by generating a cursor at the blended position indicator.
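A minimal sketch of the blended position indicator: two readings of the same gesture, captured with different range profiles, are combined by a confidence-weighted average. The weighting scheme is an assumption; the claim only requires the blend to be based on one or both readings.

```python
# Hedged sketch: blend two sensor readings of a gesture position
# into a single cursor position, weighting each reading by a
# confidence derived (for example) from its range profile.

def blend_position(p1, w1, p2, w2):
    """Weighted blend of two 2D position readings.

    p1, p2: (x, y) positions from the first and second sensor.
    w1, w2: non-negative confidence weights.
    Returns the blended position at which to generate the cursor.
    """
    total = w1 + w2
    if total == 0:
        raise ValueError("at least one weight must be positive")
    return ((p1[0] * w1 + p2[0] * w2) / total,
            (p1[1] * w1 + p2[1] * w2) / total)
```

Equal weights reduce to a midpoint; unequal weights pull the cursor toward the sensor whose range profile better covers the gesture.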
METHOD FOR SIMULATING AND CONTROLLING VIRTUAL SPHERE IN A MOBILE DEVICE
The present invention discloses a virtual sphere simulation and control method for a mobile device, comprising: acquiring images using an image acquisition component of the mobile device, so as to acquire a sequence of continuous images; analyzing contents of the images; in the sequence of the acquired images, according to a certain rule, carrying out the interaction between a virtual sphere and the contents of the images as well as a user; and displaying a result of interaction on a screen of the mobile device.
VIRTUAL OBJECT DISPLAY DEVICE, METHOD, PROGRAM, AND SYSTEM
A camera 14 acquires a background video image B0. A virtual object acquisition unit 22 acquires a virtual object S0. A display information acquisition unit 23 acquires, from the background video image B0, display information representing a position at which the virtual object S0 is to be displayed. A display control unit 24 displays the virtual object S0 on a display 15 on the basis of the display information. A change information acquisition unit 25 acquires, from the background video image B0, change information used to change a display state of the virtual object S0. A display state changing unit 26 changes the display state of the virtual object in accordance with the change information. A set amount display control unit 27 displays on the display 15 information representing a set amount of the display state of the virtual object S0.
SYSTEM FOR DETECTING SIX DEGREES OF FREEDOM OF MOVEMENT BY TRACKING OPTICAL FLOW OF BACKSCATTERED LASER SPECKLE PATTERNS
Augmented reality headgear includes transparent displays that allow a user to simultaneously view the real world and virtual content positioned in the real world, and further includes at least one source of coherent light and at least one sensor array for sensing, at a series of times, speckle patterns produced when the coherent light impinges on environment surfaces. Circuitry is provided for sensing shifts in the speckle patterns, determining the motion that caused each shift, and adjusting the display of virtual objects displayed by the augmented reality headgear to compensate for that motion.
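The shift-sensing step can be illustrated by estimating the integer pixel shift between two speckle frames via brute-force cross-correlation over a small search window. This is a pure-Python sketch for clarity; a real implementation would run much faster (e.g. FFT-based correlation) on the sensor array, and the frame size and search radius here are arbitrary assumptions.

```python
# Illustrative sketch: find the (dy, dx) shift that best aligns the
# current speckle frame with the previous one by maximizing the
# cross-correlation score over a small window of candidate shifts.

def speckle_shift(prev, curr, max_shift=2):
    """Return the (dy, dx) shift of curr relative to prev."""
    h, w = len(prev), len(prev[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        score += prev[y][x] * curr[yy][xx]
            if best is None or score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# A bright speckle that moves one pixel to the right between frames:
shift = speckle_shift(
    [[0, 0, 0], [0, 1, 0], [0, 0, 0]],
    [[0, 0, 0], [0, 0, 1], [0, 0, 0]],
)
```

The recovered shift, scaled by the sensor geometry, would feed the motion estimate used to stabilize the displayed virtual objects.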
INFORMATION PROCESSING APPARATUS, CONTROL METHOD FOR THE INFORMATION PROCESSING APPARATUS, AND STORAGE MEDIUM
An information processing apparatus includes an image obtaining unit configured to obtain an image obtained while imaging is performed by pointing an imaging unit towards a target surface, a distance obtaining unit configured to obtain, with regard to a plurality of areas constituting the image, information equivalent to distances from a position corresponding to a reference to surfaces to be imaged in the respective areas, and a recognition unit configured to use the information obtained by the distance obtaining unit with regard to a first area where a predetermined region of an object is imaged in one of the images obtained by the image obtaining unit and the information obtained by the distance obtaining unit with regard to a second area that is a part of the image and in contact with a surrounding of the first area to recognize an input state to the target surface by the object.
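The recognition idea can be sketched as a distance comparison: the first area (e.g. around a fingertip) is compared against the second, surrounding area, which is assumed to show the target surface itself. When the two distances agree within a tolerance, the fingertip is taken to be touching the surface. The averaging, the tolerance, and the millimetre units are illustrative assumptions.

```python
# Simple sketch: recognize a touch input by comparing the distance
# measured in the fingertip area against the distance measured in
# the surrounding area of the target surface.

def is_touching(fingertip_distances, surround_distances, tol_mm=5.0):
    """Recognize an input state (touching or not) on the target surface."""
    tip = sum(fingertip_distances) / len(fingertip_distances)
    surface = sum(surround_distances) / len(surround_distances)
    # The fingertip sits on the surface when its distance matches
    # that of the surface immediately around it.
    return abs(surface - tip) <= tol_mm

is_touching([500.0, 501.0], [503.0, 502.0])  # -> True (touching)
is_touching([450.0, 452.0], [503.0, 502.0])  # -> False (hovering)
```

Because the reference comes from the image itself rather than a fixed calibration, the check tolerates a tilted or moving target surface.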
System and method for measuring tracker system accuracy
The present invention relates to a simple and effective system and method for measuring camera based tracker system accuracy, especially for a helmet-mounted tracker system, utilizing Coordinate Measuring Machine (CMM). The method comprises the steps of; computing spatial relation between tracked object and calibration pattern using CMM; computing relation between reference camera and tracker camera; computing relation between reference camera and calibration pattern; computing ground truth relation between tracker camera and tracked object; obtaining actual tracker system results; comparing these results with the ground truth relations and finding accuracy of the tracker system; recording accuracy results; testing if the accuracy results is a new calculation required. The system comprises; a reference camera; a calibration pattern visible by reference camera; a camera spatial relation computation unit; a relative spatial relation computation unit a memory unit; a spatial relation comparison unit.