Patent classifications
H04N2013/0096
Systems and methods for enhancing machine vision object recognition through accumulated classifications
The disclosed technology includes systems and methods for enhancing machine vision object recognition based on a plurality of captured images and an accumulation of corresponding classification analysis scores. A method is provided for capturing, with a camera of a mobile computing device, a plurality of images, each image of the plurality of images comprising a first object. The method includes processing, with a classification module comprising a trained neural network processing engine, at least a portion of the plurality of images. The method includes generating, with the classification module and based on the processing, one or more object classification scores associated with the first object. The method includes accumulating, with an accumulating module, the one or more object classification scores. Responsive to a timeout or to an accumulated score exceeding a predetermined threshold, the method includes outputting classification information of the first object.
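The accumulate-until-threshold-or-timeout logic in this abstract can be sketched as a short loop. This is a minimal illustration, not the patented implementation: the `capture_frame` and `classify` callables, the default threshold, and the timeout value are all assumptions for the example.

```python
import time

def classify_with_accumulation(capture_frame, classify, threshold=0.9, timeout_s=2.0):
    """Accumulate per-frame classification scores for an object until an
    accumulated score exceeds the threshold or the timeout elapses.

    `classify(frame)` is assumed to return (label, score) pairs for the object.
    """
    accumulated = {}  # label -> accumulated classification score
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        frame = capture_frame()
        for label, score in classify(frame):
            accumulated[label] = accumulated.get(label, 0.0) + score
            if accumulated[label] >= threshold:
                # Accumulated score exceeded the predetermined threshold
                return label, accumulated[label]
    # Timeout: output the best classification information seen so far, if any
    if accumulated:
        best = max(accumulated, key=accumulated.get)
        return best, accumulated[best]
    return None, 0.0
```

Accumulating over several frames lets weak per-frame evidence (e.g. 0.5 per image) still yield a confident output once enough consistent observations arrive.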
METHOD AND APPARATUS FOR CALIBRATING PARAMETER OF THREE-DIMENSIONAL (3D) DISPLAY APPARATUS
Disclosed is a method and apparatus for calibrating parameters of a three-dimensional (3D) display apparatus, the method including acquiring a first captured image of a 3D display apparatus displaying a first pattern image, adjusting a first parameter set of the 3D display apparatus based on the first captured image, acquiring a second captured image of the 3D display apparatus displaying a second pattern image based on the adjusted first parameter set, and adjusting a second parameter set of the 3D display apparatus based on the second captured image.
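The two-stage procedure in this abstract (adjust a first parameter set from one pattern image, then adjust a second set from a pattern rendered with the already-adjusted first set) can be sketched as follows. The `display`, `camera`, and `estimate_params` interfaces are hypothetical placeholders, not APIs from the patent.

```python
def calibrate_display(display, camera, estimate_params):
    """Two-stage 3D display calibration sketch.

    Stage 1: capture the display showing a first pattern, estimate and apply
    a first parameter set. Stage 2: capture a second pattern that the display
    renders using the adjusted first parameters, then estimate and apply a
    second parameter set.
    """
    display.show_pattern("pattern_1")
    image_1 = camera.capture()
    first_params = estimate_params(image_1, stage=1)
    display.apply(first_params)

    display.show_pattern("pattern_2")  # rendered based on the adjusted first set
    image_2 = camera.capture()
    second_params = estimate_params(image_2, stage=2)
    display.apply(second_params)
    return first_params, second_params
```

The key ordering constraint is that the second capture happens only after the first parameter set has been applied, so the second estimate sees a display already corrected by stage 1.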
USING DYNAMIC VISION SENSORS FOR MOTION DETECTION IN HEAD MOUNTED DISPLAYS
Systems and methods may provide for using one or more stereoscopic devices for gesture control and tracking in a head mounted display (HMD). Each of the one or more stereoscopic devices may include a pair of dynamic vision sensors (DVS) arranged to have complementary fields of view (FOV) to detect one or more of a leading and a trailing edge of an object moving in a range or area to be detected. The DVS sensors determine how the object is moving based on a change in light intensity of pixels without having to detect, transfer, or process all of the pixels. The system and method may thereby provide motion tracking and gesture control functions at very low latency and processing bandwidth.
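The low-bandwidth idea in this abstract is that a DVS emits events only where pixel intensity changes, so motion can be inferred from sparse edge events instead of full frames. A rough sketch, with an assumed event format of `(x, y, timestamp, polarity)` tuples (not specified in the patent):

```python
def motion_from_events(events):
    """Estimate a coarse motion vector from DVS change events.

    Each event is (x, y, timestamp, polarity), where polarity is +1 for an
    intensity increase (leading edge) and -1 for a decrease (trailing edge).
    Only changed pixels produce events; no full frames are processed.
    """
    leading = [(x, y, t) for x, y, t, p in events if p > 0]
    if len(leading) < 2:
        return None
    # Compare centroids of the earlier and later halves of the event stream
    leading.sort(key=lambda e: e[2])
    half = len(leading) // 2
    early, late = leading[:half], leading[half:]
    cx0 = sum(e[0] for e in early) / len(early)
    cy0 = sum(e[1] for e in early) / len(early)
    cx1 = sum(e[0] for e in late) / len(late)
    cy1 = sum(e[1] for e in late) / len(late)
    return (cx1 - cx0, cy1 - cy0)
```

Because the input is just the changed pixels, the per-update work scales with the amount of motion rather than the sensor resolution, which is where the latency and bandwidth savings come from.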
Microscopy method and apparatus for optical tracking of emitter objects
Microscopy method and apparatus for determining the positions of emitter objects in a three-dimensional space that comprises focusing scattered light or fluorescent light emitted by an emitter object, separating the focused beam into a first and a second optical beam, directing the first and the second optical beam through a varifocal lens having an optical axis such that the first optical beam impinges on the lens along the optical axis and the second beam impinges decentered with respect to the optical axis of the varifocal lens, simultaneously capturing a first image created by the first optical beam and a second image created by the second optical beam, and determining the relative displacement of the position of the object in the first and in the second image, wherein the relative displacement contains the information on the axial position of the object along the direction perpendicular to the image plane.
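The final step of this abstract reduces to a small computation: measure the object's displacement between the two simultaneously captured images and map that displacement to an axial (z) coordinate. A minimal sketch, assuming a calibration function `displacement_to_z` that the real system would obtain empirically; none of these names come from the patent.

```python
def axial_position(pos_image_1, pos_image_2, displacement_to_z):
    """Recover the axial (z) position of an emitter from the relative
    displacement of its position in the on-axis and off-axis images.

    `displacement_to_z` is an assumed calibration mapping from displacement
    magnitude (pixels) to axial position (e.g. micrometers).
    """
    dx = pos_image_2[0] - pos_image_1[0]
    dy = pos_image_2[1] - pos_image_1[1]
    displacement = (dx ** 2 + dy ** 2) ** 0.5
    return displacement_to_z(displacement)
```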
Information processing apparatus, information processing system, information processing method, and storage medium for embedding time stamped information in an image
An information processing apparatus includes an acquisition unit configured to acquire image data, a generation unit configured to generate information about time as generated additional data, and a replacement unit configured to replace data at a plurality of pixel positions of the acquired image data with the generated additional data.
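The three units in this abstract (acquire image data, generate time information, replace pixel data with it) can be illustrated in a few lines. This sketch treats the image as a flat list of 8-bit values and overwrites a run of positions with an ASCII timestamp; the positions, timestamp format, and byte-level encoding are all illustrative assumptions.

```python
import datetime

def embed_timestamp(pixels, positions=None):
    """Replace data at a plurality of pixel positions with generated
    time information.

    `pixels` is a flat sequence of 8-bit values; the timestamp bytes
    overwrite the values at the chosen positions.
    """
    # Generation unit: produce information about time as additional data
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%d%H%M%S")
    payload = stamp.encode("ascii")  # 14 ASCII digit bytes
    if positions is None:
        positions = range(len(payload))  # assumed default: the first pixels
    # Replacement unit: overwrite pixel data at the given positions
    out = list(pixels)
    for pos, byte in zip(positions, payload):
        out[pos] = byte
    return out
```

A real embodiment would likely scatter the positions or use less visible bit planes; overwriting whole leading pixels is just the simplest way to show the replace-at-positions idea.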
Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
An electrically controlled spectacle includes a spectacle frame and optoelectronic lenses housed in the frame. The lenses include a left lens and a right lens, each of the optoelectrical lenses having a plurality of states, wherein the state of the left lens is independent of the state of the right lens. The electrically controlled spectacle also includes a control unit housed in the frame, the control unit being adapted to control the state of each of the lenses independently.
SENSOR FUSION AUGMENTED REALITY EYEWEAR DEVICE
An augmented reality eyewear device that operates augmented reality applications and provides a wide-angle field of view is disclosed. The eyewear device comprises a frame associated with a processor, and a sensor assembly, a camera assembly, and a user interface control assembly coupled to the processor. The sensor assembly coupled to the processor comprises at least two inertial measurement unit (IMU) sensors to transmit raw IMU data from at least one IMU sensor and Android-connected IMU data from at least one IMU sensor. The camera assembly coupled to the processor comprises at least two wide-angle cameras synchronized with one another and is configured to transmit camera feed data from the camera assembly to the processor. The processor is configured to dually synchronize the raw IMU data and the Android-connected IMU data with the camera feed data, providing a seamless display of 3D content of the augmented reality applications.