G06F3/0426

DEVICE, COMPUTER PROGRAM AND METHOD

A device for authenticating a user is described. The device comprises a sensor configured to measure the movement of the user in response to the user's interaction with a displayed image, and controller circuitry configured to authenticate the user in response to a positive comparison between the measured movement and a stored movement associated with the user.
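
The abstract does not specify how the "positive comparison" is computed; a minimal sketch, assuming both trajectories are lists of (x, y) samples that have already been resampled to equal length upstream, and using a hypothetical mean-distance threshold:

```python
import math

def authenticate_movement(measured, stored, threshold=0.05):
    """Return True on a "positive comparison": the mean point-to-point
    distance between the measured trajectory and the stored template
    falls below the threshold. The threshold value and the equal-length
    resampling are assumptions, not taken from the patent."""
    if len(measured) != len(stored) or not stored:
        return False
    total = sum(math.dist(m, s) for m, s in zip(measured, stored))
    return total / len(stored) < threshold
```

A real implementation would likely use an elastic measure such as dynamic time warping rather than a rigid point-by-point distance.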

Lighting apparatus

A lighting apparatus with an image-projecting function that is convenient for a user is provided. It includes: an illuminating unit that emits illumination light; a projection-type image display unit that emits image-projecting emission light for projecting an image; and a sensor that emits operation-detecting emission light and that is capable of detecting an operation by an operation object in a range including the image projection area of the projection-type image display unit. The apparatus is configured so that the illumination light, the image-projecting emission light, and the operation-detecting emission light have respective different wavelength distribution characteristics, and so that, within the wavelength range of light used by the sensor for operation detection, the light amount of the operation-detecting emission light is the largest among those of the illumination light, the image-projecting emission light, and the operation-detecting emission light.

Hand Presence Over Keyboard Inclusiveness
20230131667 · 2023-04-27

A method includes: accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user; determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold; modifying the image to increase the contrast; determining a pose of the physical input device; generating a three-dimensional model representing the physical hand of the user; generating an image mask by projecting the three-dimensional model onto an image plane; generating a cropped image depicting at least the physical hand of the user in the image; rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device; and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
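
The contrast test and enhancement step can be sketched in a few lines. This is an illustrative stand-in, not the patented pipeline: it uses a Michelson-style contrast between mean luminances of the two regions and a simple brightness gain on the hand pixels, all of which are assumptions.

```python
def region_contrast(img, hand_px, kbd_px):
    # Michelson-style contrast between the mean luminances of the
    # hand region and the keyboard region (img is a 2-D luminance grid,
    # pixel lists are (x, y) tuples).
    hand = sum(img[y][x] for x, y in hand_px) / len(hand_px)
    kbd = sum(img[y][x] for x, y in kbd_px) / len(kbd_px)
    return abs(hand - kbd) / max(hand + kbd, 1e-6)

def enhance_if_needed(img, hand_px, kbd_px, threshold=0.2, gain=1.5):
    """If hand/keyboard contrast is below the threshold, brighten the
    hand pixels — a crude stand-in for "modifying the image to increase
    the contrast". Threshold and gain values are hypothetical."""
    if region_contrast(img, hand_px, kbd_px) >= threshold:
        return img
    out = [row[:] for row in img]          # leave the input untouched
    for x, y in hand_px:
        out[y][x] = min(255, int(out[y][x] * gain))
    return out
```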

ELECTRONIC DEVICE FOR OBTAINING USER INPUT THROUGH VIRTUAL KEYBOARD AND METHOD OF OPERATING THE SAME
20230070539 · 2023-03-09 ·

An electronic device is provided and includes a camera, a display, and a processor. The processor may be configured to: control the display to display a second image, determined based on a posture of the electronic device from a first image obtained using the camera; detect a finger object from the second image; identify a first virtual key corresponding to the finger object from a virtual keyboard, based on a position and a degree of bending of the finger object; and identify an input of the first virtual key based on detecting a typing movement of the finger object. Other embodiments may also be possible.
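
The two recognition steps — mapping a fingertip to a key using position plus bend, then confirming a keystroke from motion — might look like the following sketch. The grid dimensions, bend threshold, and dip heuristic are all assumptions for illustration:

```python
def identify_key(finger_xy, bend, key_map, bend_threshold=0.5):
    """Map a detected fingertip to a virtual key. finger_xy is in
    normalised [0, 1) coordinates; key_map is {(col, row): label} over
    a hypothetical 10x4 grid. The key is only reported when the finger
    is bent enough to suggest intent."""
    if bend < bend_threshold:
        return None
    col = int(finger_xy[0] * 10)   # assumed 10 columns
    row = int(finger_xy[1] * 4)    # assumed 4 rows
    return key_map.get((col, row))

def detect_typing(heights, dip=0.02):
    """Treat a fingertip-height trace as a keystroke when it dips by at
    least `dip` below its starting value and then recovers at least
    halfway back (a crude stand-in for "typing movement" detection)."""
    low = min(heights)
    return heights[0] - low >= dip and heights[-1] - low >= dip * 0.5
```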

Projecting content onto water to enhance computer simulation

Content is projected onto the surface of water such that it can be viewed from either above or below the surface. Using distortion mapping, depth sensing, and de-noising, the image can remain unaffected by ripples, allowing the user to interact within the body of liquid and/or use it as input. Liquids of different densities can be layered on the surface to create different refraction planes. Water currents and jets can be used to actively reduce ripples from user interaction, and liquid can be actively added to or removed from the container to counteract displacement or create effects.

SYSTEMS AND METHODS OF FREE-SPACE GESTURAL INTERACTION

During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
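
Contact with the computationally defined plane reduces to a sign change in the control object's signed distance between frames. A minimal sketch, assuming the plane is given by a point on it and a unit normal (names and frame convention are hypothetical):

```python
def signed_distance(p, plane_point, normal):
    # dot(p - plane_point, normal); normal is assumed to be unit length
    return sum((a - b) * n for a, b, n in zip(p, plane_point, normal))

def crossed_plane(prev, curr, plane_point, normal):
    """Switch control modes when the tracked control object passes
    through the virtual touch plane, i.e. its signed distance goes
    from positive to non-positive between consecutive frames."""
    d0 = signed_distance(prev, plane_point, normal)
    d1 = signed_distance(curr, plane_point, normal)
    return d0 > 0 >= d1
```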

KEYBOARD
20230111769 · 2023-04-13

The present disclosure provides a keyboard, including: a keyboard body; at least one physical key disposed on the keyboard body; at least one virtual key projection device connected to the keyboard body and configured to present, by means of projection, an image of at least one virtual key in a projection area outside the keyboard body; and a click detection device configured to detect whether a finger clicks on the projection area and, in response to the finger clicking on the projection area, determine on which virtual key the finger clicks.
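
Resolving which virtual key was clicked is, at its simplest, a point-in-rectangle lookup over the projected layout. A sketch under assumed names and a hypothetical rectangle layout in projector coordinates:

```python
def key_at(click, keys):
    """keys maps a label to its (x0, y0, x1, y1) rectangle in the
    projection area (hypothetical layout). Returns the clicked virtual
    key, or None if the click falls outside every key."""
    x, y = click
    for label, (x0, y0, x1, y1) in keys.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return label
    return None
```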

Methods, devices, and systems for displaying a user interface on a user and detecting touch gestures

An example method of identifying a touch gesture on a user is provided. The method includes receiving, by one or more transducers of a wearable device, a set of signals that establish a signal pathway to the wearable device. The method also includes, while receiving the set of signals, determining baseline characteristics for the signal pathway, and sensing a change in the baseline characteristics caused by user interaction with an affordance of a user interface projected or perceived on the user's appendage. The method further includes, in accordance with a determination that the sensed change in the baseline characteristics satisfies a contact criterion, reporting a candidate touch event on the user's appendage to a separate electronic device that creates the user interface or is in communication with another electronic device that creates the user interface.
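
The abstract leaves the "contact criterion" open; one plausible form is a deviation test against the baseline statistics of the signal pathway. A sketch, with the k-sigma rule and all names being assumptions:

```python
def contact_event(baseline_samples, live, k=3.0):
    """Report a candidate touch when the live reading deviates from the
    baseline mean by more than k standard deviations (an assumed
    contact criterion, not the one claimed in the patent)."""
    n = len(baseline_samples)
    mean = sum(baseline_samples) / n
    var = sum((s - mean) ** 2 for s in baseline_samples) / n
    sd = var ** 0.5
    return abs(live - mean) > k * max(sd, 1e-9)
```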

Image Projection Device
20230071534 · 2023-03-09

An image projection device is provided which can correctly discern the content of a touch operation when a user performs various kinds of touch operations on an image projected on a projection screen. An imaging unit is configured to image the image projected on the projection screen and acquire image data. A reference data generating unit is configured to generate reference data specifying a position and a size of the image projected on the projection screen. An image data extracting unit is configured to extract, from the image data obtained by the imaging unit, image data in which the finger or pointer with which the user performs the operation on the projected image is present and in focus. A position data generating unit is configured to generate position data specifying a position of the finger or the pointer within the imaging range of the imaging unit.
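
Turning the camera-frame fingertip position into a position within the projected image uses exactly the reference data described above (the image's origin and size in the camera frame). A minimal sketch with assumed axis-aligned geometry:

```python
def to_image_coords(pixel, origin, size):
    """Convert a fingertip position in camera pixels to normalised
    (0..1) coordinates within the projected image, using reference
    data that records the image's origin and size in the camera frame.
    Assumes the projection is axis-aligned; a real device would apply
    a full homography to handle perspective distortion."""
    return ((pixel[0] - origin[0]) / size[0],
            (pixel[1] - origin[1]) / size[1])
```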

Obfuscated control interfaces for extended reality

Systems, methods, and non-transitory media are provided for generating obfuscated control interfaces for extended reality (XR) experiences. An example method can include determining a pose of an XR device within a mapped scene of a physical environment associated with the XR device; rendering a virtual control interface within the mapped scene according to a configuration including a first size, a first position relative to the pose of the XR device, a first ordering of input elements, and/or a first number of input elements; and adjusting the configuration of the virtual control interface based on a privacy characteristic of data associated with the virtual control interface and/or characteristics of the physical environment associated with the XR device, the adjusted configuration including a second size, a second ordering of input elements, a second number of input elements, and/or a second position relative to the pose of the XR device and/or the first position.
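
One concrete form of the "second ordering of input elements" is shuffling a PIN-pad layout when the bound data is sensitive, so an onlooker cannot infer the input from hand motion alone. A sketch; the privacy levels, threshold, and seeding are assumptions for illustration:

```python
import random

def adjust_layout(elements, privacy_level, seed=None):
    """Return a reordering of the interface's input elements. For
    non-sensitive data (assumed privacy_level < 2) the familiar layout
    is kept; for sensitive data the ordering is randomised. The seed
    parameter exists only to make the sketch testable."""
    if privacy_level < 2:
        return list(elements)
    rng = random.Random(seed)
    shuffled = list(elements)
    rng.shuffle(shuffled)
    return shuffled
```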