G06F3/014

Force-feedback gloves for a surgical robotic system
11712315 · 2023-08-01

A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID), and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
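The abstract above describes a simple control loop: detect a grasp gesture at the virtual UID's location, then apply feedback force and engage the arm. A minimal sketch of that loop, with all names, thresholds, and the force magnitude being illustrative assumptions rather than details from the patent:

```python
from dataclasses import dataclass

@dataclass
class GloveState:
    fingers_closed: bool                      # grasp gesture, from the tracking device
    position: tuple                           # tracked hand position (x, y, z), metres

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def update(glove: GloveState, virtual_uid_pos, grasp_radius=0.05):
    """One control tick: return (feedback_force_newtons, arm_engaged)."""
    grasping = (glove.fingers_closed
                and distance(glove.position, virtual_uid_pos) < grasp_radius)
    if grasping:
        # Constant force standing in for the UID's "physical representation".
        return 2.0, True
    return 0.0, False
```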

Remote control of a device via a virtual interface
11567571 · 2023-01-31

A technique for controlling a target device via a virtual interface, the method including receiving a control input from the virtual interface, determining, based on the control input, a control signal for an operation to be performed by a first device configured to be controlled wirelessly, and transmitting the control signal to the first device to cause it to perform the operation.

Self-tracked controller

The disclosed system may include a housing dimensioned to secure various components including at least one physical processor and various sensors. The system may also include a camera mounted to the housing, as well as physical memory with computer-executable instructions that, when executed by the physical processor, cause the physical processor to: acquire images of a surrounding environment using the camera mounted to the housing, identify features of the surrounding environment from the acquired images, generate a map using the features identified from the acquired images, access sensor data generated by the sensors, and determine a current pose of the system in the surrounding environment based on the features in the generated map and the accessed sensor data. Various other methods, apparatuses, and computer-readable media are also disclosed.
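The pose-determination step above fuses two sources: features matched against the generated map (camera) and the other sensor data (e.g., an IMU). A minimal sketch of one common fusion approach, a complementary filter; the function name, blend weight, and the choice of filter are assumptions for illustration, not the patent's method:

```python
def fuse_pose(camera_pose, imu_pose, alpha=0.9):
    """Blend two (x, y, z) pose estimates per axis.

    alpha weights the low-drift camera/map estimate; the remainder comes
    from the IMU-propagated estimate, which is smooth but drifts over time.
    """
    return tuple(alpha * c + (1 - alpha) * i for c, i in zip(camera_pose, imu_pose))

# Usage: camera/map matching says (1.0, 2.0, 0.0); IMU dead reckoning
# says (1.2, 2.1, 0.0); the fused pose is approximately (1.02, 2.01, 0.0).
fused = fuse_pose((1.0, 2.0, 0.0), (1.2, 2.1, 0.0))
```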

Augmented reality object manipulation

A processing system having at least one processor may detect a first object in a first video of a first user and detect a second object in a second video of a second user, where the first video and the second video are part of a visual communication session between the first user and the second user. The processing system may further detect a first action in the first video relative to the first object, detect a second action in the second video relative to the second object, detect a difference between the first action and the second action, and provide a notification indicative of the difference.

Information processing apparatus, method for processing information, and program

Provided are an information processing apparatus, a method for processing information, and a program capable of adjusting a perceptual position of a tactile stimulus even with one tactile stimulus unit. The information processing apparatus (20) includes an output control unit (120) that performs output control of a vibration on at least one tactile stimulus unit (130), in which the output control unit changes a frequency of the vibration to be output from the tactile stimulus unit depending on a position of the tactile stimulus unit and predetermined positional information.
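The output control described above changes vibration frequency as a function of the actuator's position and a target position. A hedged sketch of one possible frequency law (linear interpolation); the specific frequencies, distances, and the linear mapping are illustrative assumptions, not from the publication:

```python
def output_frequency(actuator_pos, target_pos, f_near=200.0, f_far=80.0, max_dist=0.3):
    """Pick a vibration frequency (Hz) for a single tactile stimulus unit.

    Perceived positions nearer the actuator get a higher frequency;
    farther targets get a lower one, clamped at max_dist metres.
    """
    d = min(abs(target_pos - actuator_pos), max_dist)
    return f_near + (f_far - f_near) * (d / max_dist)
```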

ERGONOMICS IMPROVEMENT SYSTEMS HAVING WEARABLE SENSORS AND RELATED METHODS
20230021704 · 2023-01-26

Ergonomics improvement systems having wearable sensors, and related methods. An example ergonomics improvement system includes an encoder system to couple to a limb of a body. The encoder system generates first outputs in response to movement of the limb relative to the body, from which a position of the limb relative to the body is determined. The system also includes a load sensor to generate a second output representative of a load carried by the body, and a position sensor to generate a third output representative of a position of a right foot of the body relative to a position of a left foot of the body.
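One way the three sensor outputs named above could be combined is a simple threshold check on limb position, carried load, and stance asymmetry. The thresholds and the flag logic below are illustrative assumptions, not disclosed in the application:

```python
def ergonomic_flag(limb_angle_deg, load_kg, foot_offset_m,
                   max_angle=90.0, max_load=23.0, max_offset=0.5):
    """Return True when any reading exceeds its (assumed) ergonomic limit.

    limb_angle_deg: limb position relative to the body, from the encoder system
    load_kg:        load carried by the body, from the load sensor
    foot_offset_m:  right-foot position relative to the left foot
    """
    return (abs(limb_angle_deg) > max_angle
            or load_kg > max_load
            or abs(foot_offset_m) > max_offset)
```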

AUGMENTED REALITY ARTIFICIAL INTELLIGENCE TO ENHANCE THE WAYS USERS PERCEIVE THEMSELVES
20230025585 · 2023-01-26

Methods and systems are provided for generating augmented reality (AR) scenes in which the AR scene can be adjusted to modify at least part of an image of a user's physical features by producing a virtual mesh of those features. The method includes generating an AR scene for rendering on a display for a user wearing AR glasses, the AR scene including a real-world space and virtual objects overlaid in the real-world space. The method includes analyzing a field of view into the AR scene from the AR glasses, the analyzing being configured to detect images of the user's physical features when the field of view is directed toward at least part of those features. The method includes adjusting the AR scene, in substantially real time, to modify at least part of the images of the user's physical features when those features are detected to be in the AR scene as viewed from the field of view of the AR glasses, wherein the modifying includes detecting depth data and original texture data from the physical features to produce a virtual mesh of the features; the virtual mesh is changed in size and shape and rendered using modified texture data that blends with the original texture data. In one embodiment, the modified physical features appear to the user, when viewed via the AR glasses, as existing in the real-world space. In this way, when the physical features of a user are detected to be in the AR scene, they are augmented in the AR scene, which can improve the user's self-perception and in turn give the user confidence to overcome challenging tasks or obstacles during gameplay.

USER INTERFACE FOR MANIPULATING USER INTERFACE OBJECTS

User interface navigation on a personal electronics device based on movements of a crown is disclosed. The device can select an appropriate level of information arranged along a z-axis for display based on crown movement. The navigation can be based on an angular velocity of the crown.
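The abstract above ties navigation to the crown's angular velocity. One illustrative mapping (an assumption for this sketch, not Apple's implementation) is to traverse more z-axis levels per update tick the faster the crown turns:

```python
def next_level(current, angular_velocity_dps, levels, step_per_90dps=1):
    """Advance through z-stacked information levels based on crown speed.

    angular_velocity_dps: crown angular velocity in degrees/second
                          (signed; negative rotates back toward level 0)
    levels:               total number of levels along the z-axis
    """
    delta = int(angular_velocity_dps / 90) * step_per_90dps
    return max(0, min(levels - 1, current + delta))
```

Slow rotation (under 90 deg/s here) changes nothing, so small jitters don't move the display; fast rotation skips levels proportionally.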

Smart Gloves
20230028639 · 2023-01-26

The invention discloses a smart glove that can be used indoors and outdoors. It includes a glove body comprising a wrist portion, a palm portion, and finger portions. On the back of the wrist portion are a switch button, a charging interface, and a control box. The control box contains a rechargeable lithium battery and a controller, and the charging interface is electrically connected to the lithium battery in the control box. An LED display is installed on the back of the palm portion and is electrically connected to its driver. The finger portions can wrap all or some of the fingers to protect the skin. On the one hand, the invention can serve as a warning when the user is walking or riding, improving travel safety; on the other hand, it can play a role in display and publicity.

SYSTEMS AND METHODS FOR DEVELOPING BRAIN COMPUTER INTERFACE

Systems, methods, and protocols for developing invasive brain computer interface (iBCI) decoders non-invasively by using emulated brain data are provided. A human operator can interact in real-time with control algorithms designed for iBCI. An operator can provide input to one or more computer models (e.g., via body gestures), and this process can generate emulated brain signals that would otherwise require invasive brain electrodes to obtain.