Patent classifications
G06F3/033
Systems and methods for controlling virtual scene perspective via physical touch input
Systems, methods, and non-transitory computer readable media for controlling perspective in an extended reality environment are disclosed. In one embodiment, a non-transitory computer readable medium contains instructions to cause a processor to perform the steps of: outputting for presentation via a wearable extended reality appliance (WER-appliance) first display signals reflective of a first perspective of a scene; receiving first input signals caused by a first multi-finger interaction with a touch sensor; in response, outputting for presentation via the WER-appliance second display signals to modify the first perspective of the scene, causing a second perspective of the scene to be presented via the WER-appliance; receiving second input signals caused by a second multi-finger interaction with the touch sensor; and in response, outputting for presentation via the WER-appliance third display signals to modify the second perspective of the scene, causing a third perspective of the scene to be presented via the WER-appliance.
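The claimed control loop can be sketched in a few lines: each multi-finger interaction produces a new perspective from the previous one. This is a minimal illustration only; the gesture kinds, field names, and the mapping of pinch to zoom and rotation to yaw are assumptions, not taken from the abstract.

```python
# Hypothetical sketch: successive multi-finger gestures each yield a new
# perspective of the scene, as in the abstract's first/second/third sequence.
from dataclasses import dataclass

@dataclass
class Perspective:
    zoom: float = 1.0
    yaw_deg: float = 0.0

def apply_gesture(p: Perspective, gesture: dict) -> Perspective:
    """Return a new perspective modified by one multi-finger interaction."""
    if gesture["kind"] == "pinch":      # fingers moving together/apart
        return Perspective(p.zoom * gesture["scale"], p.yaw_deg)
    if gesture["kind"] == "rotate":     # fingers rotating about a point
        return Perspective(p.zoom, p.yaw_deg + gesture["angle_deg"])
    return p

first = Perspective()                                            # first perspective
second = apply_gesture(first, {"kind": "pinch", "scale": 2.0})   # second perspective
third = apply_gesture(second, {"kind": "rotate", "angle_deg": 45.0})  # third perspective
```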
Thermopile array fusion tracking
A simultaneous localization and mapping (SLAM)-enabled video game system, a user device of the video game system, and a computer-readable storage medium of the user device are disclosed. Generally, the video game system includes a video game console, a plurality of thermal beacons, and a user device communicatively coupled with the video game console. The user device includes a thermopile array, a processor, and a memory. The user device may receive thermal data from the thermopile array, the thermal data corresponding to a thermal signal emitted from a thermal beacon of the plurality of thermal beacons and detected by the thermopile array. The user device may determine, based on the thermal data, its location in 3D space, and then transmit that location to the video game system.
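One plausible way to turn beacon detections into a location, assuming the thermal data yields range estimates to at least three beacons at known positions, is trilateration. The following 2D sketch is an assumption about how such localization could work, not the patent's method; the abstract does not specify the algorithm.

```python
# Hypothetical 2D trilateration: given three beacons at known positions and
# estimated ranges to each, solve for the device position by linearizing the
# circle equations (subtracting the first from the other two).
def locate(beacons, ranges):
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # beacons must not be collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Device at (1, 1) with beacons at the corners of the play area:
pos = locate([(0, 0), (4, 0), (0, 4)], [2**0.5, 10**0.5, 10**0.5])
```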
Image display apparatus
The present disclosure relates to an image display apparatus. The image display apparatus includes: a display including an organic light emitting diode (OLED) panel; and a controller configured to control the display. The controller calculates an Average Picture Level (APL) of an input image. In response to the calculated APL being greater than or equal to a first reference value in a high-dynamic-range (HDR) mode, the controller decreases the APL and performs luminance conversion based on the decreased APL; in response to the calculated APL being greater than or equal to the first reference value in a normal mode rather than the HDR mode, the controller performs luminance conversion based on the calculated APL. Accordingly, luminance representation may be improved while displaying an image.
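The branching described in the abstract reduces to a small decision function. The threshold value and the amount by which the APL is decreased are assumptions for illustration; the abstract specifies only the structure of the branches, not the numbers.

```python
# Sketch of the abstract's APL branching. The first reference value (0.6)
# and the decrease factor (0.8) are assumed, not from the source.
def target_apl(apl: float, hdr: bool, ref1: float = 0.6) -> float:
    """Return the APL value to use for luminance conversion."""
    if hdr and apl >= ref1:
        return apl * 0.8   # HDR mode, bright image: decrease the APL first
    return apl             # normal mode (or APL below threshold): use as-is
```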
Automated physical training system
Systems, methods, and computer-readable media comprising: a virtual exercise board represented by images on the screen of a pad device; wearable devices configured to attach to each of a user's shoes and to collect and transmit touch data to the pad device; cameras for tracking movement and for calibration against the data collected by the wearable devices; and computer programs for collecting user data, processing user data, and generating outputs. In embodiments, features include augmented reality; performance ratings; automated workouts and protocols; a real-time progress bar; multi-location database capabilities; and reports.
APPARATUS FOR CONTROLLING CONTENTS OF A COMPUTER-GENERATED IMAGE USING THREE DIMENSIONAL MEASUREMENTS
A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to the interaction region is established. A pointing line, having a predetermined relationship to the pointing device, is directed to pass substantially through the calibration point while a position and an orientation of the pointing device are measured. Movement of the cursor within the interaction region is then controlled using measurements of the position and orientation of the pointing device.
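The geometric core of such a method is intersecting the pointing line with the projection plane. A minimal sketch, assuming the interaction region lies in the plane z = 0 and that the device's position and aiming direction have already been measured (the abstract does not specify this parameterization):

```python
# Hypothetical point-of-aim computation: intersect the pointing line
# (origin `pos`, direction `direction`) with the projection plane z = 0.
def point_of_aim(pos, direction):
    px, py, pz = pos
    dx, dy, dz = direction      # dz must be nonzero to hit the plane
    t = -pz / dz                # parameter where the line crosses z = 0
    return (px + t * dx, py + t * dy)

# Device held 2 units from the plane, aimed slightly to the right:
aim = point_of_aim((0.0, 0.0, 2.0), (0.5, 0.0, -1.0))
```

The cursor would then be moved to the screen coordinates corresponding to this plane intersection.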
Finger devices with adjustable housing structures
A finger device may be worn on a user's finger and may serve as a controller for a head-mounted device or other electronic device. The finger device may have a housing having an upper housing portion that extends across a top of the finger and first and second side housing portions that extend down respective first and second sides of the finger. Sensors in the side housing portions may measure movements of the sides of the finger as the finger contacts an external surface. To ensure that the sensors are appropriately positioned relative to the sides of the finger, the housing may include one or more adjustable structures such as an elastomeric band, a drawstring, a ratchet mechanism, a scissor mechanism, and/or other adjustable structures for adjusting the position of the first and second side housing portions and associated sensors relative to the upper housing portion.
REMOTE CONTROL
A remote control having at least one multifunction button to which force is applied in one direction by magnets and counter-magnets or by mechanical springs, and to which journals with thickened heads are attached. Through interaction with bearing shells of a supporting plate, the thickened heads center and guide the multifunction button and limit its travel.