Patent classifications
H04N23/56
APPARATUS FOR ACQUIRING DEPTH IMAGE, METHOD FOR FUSING DEPTH IMAGES, AND TERMINAL DEVICE
Provided are an apparatus for acquiring a depth image, a method for fusing depth images, and a terminal device. The apparatus for acquiring a depth image includes an emitting module, a receiving module, and a processing unit. The emitting module is configured to emit a speckle array to an object, where the speckle array includes p mutually spaced-apart speckles. The receiving module includes an image sensor that generates a pixel signal. The processing unit is configured to receive the pixel signal and generate a sparse depth image based on the pixel signal, align an RGB image at a resolution of a*b with the sparse depth image, and fuse the aligned sparse depth image with the RGB image using a pre-trained image fusion model to obtain a dense depth image at a resolution of a*b.
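The align-then-densify step described above can be sketched in Python. This is a minimal illustration, not the patented method: the function name is hypothetical, and the nearest-valid-neighbor fill stands in for the pre-trained image fusion model.

```python
import numpy as np

def fuse_sparse_depth(sparse_depth, rgb):
    """Align a sparse depth map with an a*b RGB image and densify it.
    The nearest-valid-neighbor fill is a toy stand-in for the
    pre-trained fusion model described in the abstract."""
    a, b = rgb.shape[:2]
    # Align: scatter the sparse speckle depths onto the RGB grid.
    ys, xs = np.nonzero(sparse_depth > 0)
    vals = sparse_depth[ys, xs]
    sy = a / sparse_depth.shape[0]
    sx = b / sparse_depth.shape[1]
    aligned = np.zeros((a, b), dtype=float)
    aligned[(ys * sy).astype(int), (xs * sx).astype(int)] = vals
    # Densify: assign every pixel the depth of its nearest valid sample.
    vy, vx = np.nonzero(aligned > 0)
    gy, gx = np.mgrid[0:a, 0:b]
    d2 = (gy[..., None] - vy) ** 2 + (gx[..., None] - vx) ** 2
    dense = aligned[vy, vx][np.argmin(d2, axis=-1)]
    return dense
```

A learned model would weight the fill by RGB similarity (edges, texture); the sketch only shows where the alignment and densification occur in the pipeline.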
System and method to simultaneously track multiple organisms at high resolution
A microscopy system includes multiple cameras working together to capture image data of a sample having a group of organisms distributed over a wide area, under the influence of an excitation instrument. A first processor is coupled to each camera to process the image data captured by that camera. Outputs from the multiple first processors are aggregated and streamed serially to a second processor for tracking the organisms. With 50% or more overlap between the cameras' fields of view, the multiple cameras can allow 3D tracking of the organisms through photogrammetry.
LED And/Or Laser Projection Light Device
The LED and/or laser projection light device has three major projection parts: (a) a light source, (b) an image-forming unit, and (c) a projection/refractive lens to produce a desired enlarged projected image, patterns, or light beams. The projection light has at least one inner optical lens or optical element rotating to create the lighted image, patterns, or light beam emitted to the outer cover. Further, the projection light device preferably has at least one additional built-in function selected from (i) a second light source for preferred illumination function(s), (ii) a glow or back light, (iii) a second or additional projection assembly in one light device, (iv) other light functions, (v) candle-light illumination, (vi) bulb illumination, (vii) desktop or floor light illumination, (viii) a battery, rechargeable battery, or built-in/external AC-to-DC circuit as a power source, (ix) a USB port, adaptor, connector, or AC plug wire as a power source, and (x) steady, rotating, replaceable, detachable, or movable arrangements of the three major projection parts.
PRODUCT TARGET QUALITY CONTROL SYSTEM
A process includes receiving a target quality value, receiving a measured quality value, receiving a source quality value, and sending a source control instruction. The source control instruction is based at least in part on the target quality value, the measured quality value, and the source quality value. The target quality value, the measured quality value, the source quality value, and the source control instruction are communicated via a communication port. The measured quality value is generated by an inspection device configured to inspect a sample. The source quality value is associated with a quality level of a first group of samples. The target quality value indicates a desired quality value of an output group of samples. The source control instruction causes a source selecting device to select one of a plurality of groups of samples, each group having identified quality characteristics.
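The decision the source control instruction encodes can be sketched as follows. The function name and the compensating policy (shift the source quality by the gap between target and measured output quality) are illustrative assumptions, not the patented control law.

```python
def source_control_instruction(target_q, measured_q, source_q, groups):
    """groups: dict mapping group id -> identified quality level.
    Hypothetical policy: pick the source group whose quality level
    best compensates the gap between the target quality value and
    the measured quality value of the current output."""
    desired_source_q = source_q + (target_q - measured_q)
    return min(groups, key=lambda g: abs(groups[g] - desired_source_q))
```

The selected group id would then be sent to the source selecting device via the communication port.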
SMART AND COMPACT IMAGE CAPTURE DEVICES FOR IN VIVO IMAGING
A novel in-vivo image capture device for a capsule endoscope and its method of operation are described. The device includes a wafer-level camera module design, a high-sensitivity backside-illumination pixel with high-definition image output, and LEDs that provide illumination synchronized with an image sensor strobe signal. The frame rate of the device can be adjusted based on angular motion detected by a gyroscope sensor: a high frame rate mode is maintained during fast motion, while a low frame rate is maintained during slow or no motion. The image capture device also includes a machine-learning-based SOC for image processing, enhancement, and compression. The SOC can process and store zone averages of images. The image capture device also includes high-density flash storage to store images in the device, so no RF transmitter is needed, which makes the system more convenient to use.
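The gyroscope-driven frame rate selection described above reduces to a threshold rule, sketched below. The threshold and the two frame rates are hypothetical values chosen for illustration; the abstract does not specify them.

```python
def select_frame_rate(angular_rate_dps, threshold_dps=30.0,
                      high_fps=20, low_fps=2):
    """Return a high frame rate during fast angular motion and a
    low frame rate during slow or no motion. All numeric values
    (threshold, 20 fps, 2 fps) are illustrative assumptions."""
    return high_fps if abs(angular_rate_dps) >= threshold_dps else low_fps
```

A real device would likely add hysteresis so the rate does not oscillate when the angular rate hovers near the threshold.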
Single-camera particle tracking system and method
A method for tracking moving particles in a fluid. The method includes illuminating the moving particles with an illumination sequence of patterns generated by a light projector; measuring with a single camera light intensities reflected by the moving particles; calculating, based on the measured light intensities, digital coordinates (x′, y′, z′) of the moving particles; determining a mapping function f that maps the digital coordinates (x′, y′, z′) of the moving particles to physical coordinates (x, y, z) of the moving particles; and calculating the physical coordinates (x, y, z) of the moving particles based on the mapping function f. The illumination sequence of patterns is generated with a single wavelength, and light emitted by the projector is perpendicular to light received by the single camera.
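Determining the mapping function f can be sketched as a calibration fit. The abstract does not specify the form of f; the affine least-squares fit below is an illustrative assumption, and the function names are hypothetical.

```python
import numpy as np

def fit_mapping(digital, physical):
    """Fit an affine mapping f from digital coordinates (x', y', z')
    to physical coordinates (x, y, z) from calibration point pairs.
    digital, physical: (n, 3) arrays of corresponding points."""
    n = digital.shape[0]
    A = np.hstack([digital, np.ones((n, 1))])          # (n, 4) design matrix
    M, *_ = np.linalg.lstsq(A, physical, rcond=None)   # (4, 3) coefficients
    # f applies the fitted coefficients to new digital coordinates.
    return lambda d: np.hstack([d, np.ones((d.shape[0], 1))]) @ M
```

With f in hand, each tracked particle's digital coordinates are converted to physical (x, y, z) by a single matrix product per frame.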
Devices, systems and methods for predicting gaze-related parameters using a neural network
A method for creating and updating a database is disclosed. In one example, the method includes presenting a first stimulus to a first user wearing a head-wearable device and, when the first user is expected to respond to the first stimulus or expected to have responded to the first stimulus, using a first camera of the head-wearable device to generate a first left image of at least a portion of the left eye of the first user and a second camera of the head-wearable device to generate a first right image of at least a portion of the right eye of the first user. A data connection is established between the head-wearable device and the database. A first dataset is generated comprising the first left image, the first right image, and a first representation of a gaze-related parameter, the first representation being correlated with the first stimulus, and the first dataset is added to a device database.
Optical encoder capable of identifying absolute positions
The present disclosure relates to an optical encoder configured to provide precise coding reference data through feature recognition. With the present disclosure, it is not necessary to provide particular dense patterns on a working surface; the precise coding reference data can be generated by detecting surface features of the working surface.
Portable photogrammetry studio
A portable photogrammetry studio for digitisation of human body surfaces.
IMAGE GENERATING APPARATUS AND IMAGE GENERATING METHOD
Light in a visible light region is irradiated onto a sample while irradiation of infrared light IR, having a wavelength that corresponds to the infrared absorption spectrum of an observation target material included in the sample, is switched between a first state and a second state. A first image and a second image are generated based on the phase distribution, the intensity distribution, and the polarization direction distribution of the light, including the irradiation light that has passed through the sample, in synchronization with the switching of the infrared light IR between the first state and the second state. Subsequently, an output image is generated so as to represent at least one of the position, size, and shape of the observation target material, based on the difference and/or ratio of the pixel values for each pixel between the first image and the second image.
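The per-pixel comparison between the two images can be sketched as follows. The function name, the epsilon guard, and the 10% threshold are illustrative assumptions; the abstract only specifies that the output is derived from the per-pixel difference and/or ratio.

```python
import numpy as np

def detect_target(first_img, second_img, eps=1e-6):
    """Compare the image taken in the first IR state with the image
    taken in the second IR state. Pixels where the observation target
    absorbed the infrared light differ between the two images."""
    a = first_img.astype(float)
    b = second_img.astype(float)
    diff = a - b                       # per-pixel difference
    ratio = a / (b + eps)              # per-pixel ratio (eps avoids /0)
    # Hypothetical threshold: flag pixels changing by more than 10%.
    mask = np.abs(diff) > 0.1 * np.abs(b)
    return diff, ratio, mask
```

The mask marks candidate pixels of the target material, from which position, size, and shape can be read off.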