Patent classifications
G01P13/00
EAR CANAL DEFORMATION BASED CONTINUOUS USER IDENTIFICATION SYSTEM USING EAR WEARABLES
Disclosed herein are a system and methods for ear canal deformation-based user authentication using in-ear wearables. The system provides continuous, passive user authentication that is transparent to the user. It leverages ear canal deformation, which combines the ear canal's unique static geometry with its dynamic motion while the user is speaking, for authentication. It uses an acoustic sensing approach to capture the ear canal deformation with the built-in microphone and speaker of the in-ear wearable. Specifically, it first emits well-designed inaudible beep signals and records the signals reflected from the ear canal. It then analyzes the reflected signals and extracts fine-grained acoustic features corresponding to the ear canal deformation for user authentication.
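The sensing loop described above (emit an inaudible probe, record the reflection, extract spectral features, match against an enrolled template) can be sketched as follows. All concrete choices here are illustrative assumptions, not the patented method: the chirp band, FFT-bin pooling, and cosine-similarity matching are one simple way to realize the described pipeline.

```python
import numpy as np

FS = 48_000               # sample rate in Hz (assumed)
F0, F1 = 18_000, 22_000   # near-ultrasonic probe band (assumed inaudible range)

def probe_chirp(duration=0.02):
    """Linear chirp sweeping the probe band -- a stand-in for the
    'well-designed inaudible beep signals' in the abstract."""
    t = np.arange(int(FS * duration)) / FS
    f = F0 + (F1 - F0) * t / duration
    return np.sin(2 * np.pi * f * t)

def acoustic_features(reflection, n_bins=64):
    """Normalized magnitude spectrum within the probe band, pooled into a
    fixed-length vector so templates are comparable across recordings."""
    spec = np.abs(np.fft.rfft(reflection))
    freqs = np.fft.rfftfreq(len(reflection), 1 / FS)
    band = spec[(freqs >= F0) & (freqs <= F1)]
    pooled = np.array([chunk.mean() for chunk in np.array_split(band, n_bins)])
    return pooled / (np.linalg.norm(pooled) + 1e-12)

def authenticate(reflection, enrolled_template, threshold=0.9):
    """Cosine similarity against the enrolled template (one simple matcher)."""
    feat = acoustic_features(reflection)
    return float(feat @ enrolled_template) >= threshold
```

Enrollment would record reflections from the legitimate user's ear canal and store their feature vector; at runtime each new reflection is matched continuously against that template.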
Projection Scanning System
Imaging systems that project augmented information onto a physical object and at a minimum include a processor, a memory device operably connected to the processor, a projector operably coupled to the processor, and a distance-measuring device operably connected to the processor. The memory device stores augmented image information, and the processor is configured to project the augmented image information onto the physical object. The distance-measuring device is configured to measure the distance to the physical object. The processor uses distance measurement information from the distance-measuring device to adjust the scaling of the augmented image information and provides the scale-adjusted augmented image information to the projector. The system can also be used for fluorescence imaging during open surgery, for endoscopic fluorescence imaging, and for registration of surgical instruments.
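The distance-based scaling step can be illustrated with a simple throw-ratio model: a projected image grows linearly with distance, so shrinking the source image by the ratio of a calibrated reference distance to the measured distance keeps the overlay at a constant physical size on the object. This is a minimal sketch under that pinhole-style assumption; a real system would use projector calibration data.

```python
def scale_factor(measured_distance_m: float, reference_distance_m: float = 1.0) -> float:
    """Scale factor that compensates for projected content growing
    linearly with throw distance (simple linear model, assumed)."""
    if measured_distance_m <= 0:
        raise ValueError("distance must be positive")
    return reference_distance_m / measured_distance_m

def scale_image_dims(width_px: int, height_px: int,
                     measured_distance_m: float,
                     reference_distance_m: float = 1.0) -> tuple[int, int]:
    """Scale-adjusted pixel dimensions handed to the projector."""
    s = scale_factor(measured_distance_m, reference_distance_m)
    return round(width_px * s), round(height_px * s)
```

For example, doubling the distance from the 1 m reference halves the rendered dimensions, so the overlay stays the same physical size on the object.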
SYSTEM FOR GENERATING A THREE-DIMENSIONAL SCENE OF A PHYSICAL ENVIRONMENT
A system configured to assist a user in scanning a physical environment in order to generate a three-dimensional scan or model. In some cases, the system may include an interface to assist the user in capturing data usable to determine a scale or depth of the physical environment and to perform a scan in a manner that minimizes gaps.
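One way an interface could steer a user toward an uncovered region, sketched below, is to track coverage on a coarse grid and report the remaining gaps. The grid representation and coverage rule are illustrative assumptions, not the claimed method.

```python
class CoverageGrid:
    """Coarse occupancy-style grid over the scan area: cells are marked as
    the sensor covers them, and unmarked cells are the remaining gaps."""

    def __init__(self, nx: int, ny: int):
        self.covered = [[False] * ny for _ in range(nx)]

    def mark_scanned(self, x: int, y: int) -> None:
        """Record that the scan covered cell (x, y)."""
        self.covered[x][y] = True

    def gaps(self) -> list[tuple[int, int]]:
        """Cells not yet scanned -- candidates for the UI to highlight."""
        return [(x, y)
                for x, row in enumerate(self.covered)
                for y, seen in enumerate(row) if not seen]

    def coverage_ratio(self) -> float:
        total = len(self.covered) * len(self.covered[0])
        return 1.0 - len(self.gaps()) / total
```

An assisting interface could re-render the gap list after each frame and prompt the user until the coverage ratio reaches a target.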
Image data processing method, device and security inspection system based on VR or AR
A method, a device and a security system for image data processing based on VR or AR are disclosed. In one aspect, an example image data processing method includes reconstructing, based on three-dimensional (3D) scanned data of a containing apparatus in which objects are contained, a 3D image of the containing apparatus and the objects contained in the containing apparatus. The reconstructed 3D image is stereoscopically displayed. Manipulation information is determined for at least one of the objects in the containing apparatus based on positioning information and action information of a user. At least a 3D image of the at least one object is reconstructed based on the determined manipulation information. The at least one reconstructed object is presented on the displayed 3D image.
SYSTEM AND METHOD FOR ACTIVITY CLASSIFICATION
One or more computing devices, systems, and/or methods are provided. In an example, a method comprises receiving, by a device, incoming motion data from a motion sensor, generating, by the device, an incoming embedding vector based on the incoming motion data, generating, by the device, a predicted embedding vector based on the incoming embedding vector, assigning, by the device, an activity classification based on the predicted embedding vector, and modifying an operating parameter of the device based on the activity classification.
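The chain of steps in the abstract (motion data → incoming embedding → predicted embedding → activity label → parameter change) can be sketched with stand-in linear models. The encoder, predictor, prototype embeddings, and nearest-centroid matching below are all assumed for illustration; the actual embedding and prediction models are learned components not specified here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-ins for learned components: a linear encoder mapping a
# 30-sample motion window to a 16-d embedding, an identity "predictor" over
# embeddings, and per-class prototype embeddings.
ENCODER = rng.normal(size=(16, 30))
PREDICTOR = np.eye(16)
PROTOTYPES = {name: rng.normal(size=16)
              for name in ("still", "walking", "running")}

# Operating parameter keyed on activity, e.g. sensor sampling rate in Hz
SAMPLE_RATE_HZ = {"still": 10, "walking": 50, "running": 100}

def classify(motion_window: np.ndarray) -> str:
    """Embed the incoming motion data, predict an embedding from it, and
    assign the label of the nearest prototype (nearest-centroid matching)."""
    incoming = ENCODER @ motion_window
    predicted = PREDICTOR @ incoming
    return min(PROTOTYPES,
               key=lambda k: np.linalg.norm(predicted - PROTOTYPES[k]))

def operating_parameter(motion_window: np.ndarray) -> int:
    """Modify a device parameter based on the assigned activity."""
    return SAMPLE_RATE_HZ[classify(motion_window)]
```

On-device, the classification would run per window of sensor samples, and the device would retune the parameter (here, a sampling rate) whenever the label changes.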