Patent classifications
G06T7/269
Detection target positioning device, detection target positioning method, and sight tracking device
Disclosed is a detection target positioning method and device. The method comprises: acquiring an original image and pre-processing it to obtain the gradation of each pixel in a target gradation image corresponding to a target region that includes a detection target; calculating first gradation sets corresponding to the rows of pixels of the target gradation image and second gradation sets corresponding to the columns of pixels of the target gradation image; and determining the rows of the two ends of the detection target in the column direction according to the first gradation sets, determining the columns of the two ends of the detection target in the row direction according to the second gradation sets, and determining the center of the detection target according to the rows of the two ends in the column direction and the columns of the two ends in the row direction.
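The row/column gradation sums described above amount to projection-based localization. A minimal sketch in Python/NumPy, assuming a target darker than its background (e.g. a pupil for the sight-tracking use case); the `threshold` parameter and the thresholding rule are illustrative assumptions, not details from the abstract:

```python
import numpy as np

def locate_target(gray, threshold):
    """Locate a dark detection target in a grayscale image via projections.

    gray: 2-D uint8 array; the target is assumed darker than the background.
    threshold: assumed mean-gradation cutoff per pixel (not from the abstract).
    Returns (row_center, col_center), or None if no target is found.
    """
    # First gradation sets: one summed gradation value per row of pixels.
    row_sums = gray.sum(axis=1)
    # Second gradation sets: one summed gradation value per column of pixels.
    col_sums = gray.sum(axis=0)

    # Rows/columns crossing the dark target have a lower summed gradation.
    target_rows = np.flatnonzero(row_sums < threshold * gray.shape[1])
    target_cols = np.flatnonzero(col_sums < threshold * gray.shape[0])
    if target_rows.size == 0 or target_cols.size == 0:
        return None

    # Two ends in the column direction (top/bottom rows) and in the
    # row direction (left/right columns); the center is their midpoint.
    top, bottom = target_rows[0], target_rows[-1]
    left, right = target_cols[0], target_cols[-1]
    return ((top + bottom) // 2, (left + right) // 2)
```

Because each projection is a single reduction over the image, this runs in one pass per axis, which is why such methods suit real-time eye tracking.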
DENSE OPTICAL FLOW CALCULATION SYSTEM AND METHOD BASED ON FPGA
Disclosed are a dense optical flow calculation system and method based on an FPGA (Field Programmable Gate Array). The system comprises a software system deployed on a host and a dense optical flow calculation module deployed on the FPGA. Pixel information for two consecutive frames is obtained from the host, and the optical flow is computed through steps such as smoothing, polynomial expansion, intermediate-variable calculation, and optical flow calculation. An image pyramid and iterative optical flow calculation are achieved by repeatedly calling a calculation core module in the FPGA, and the final result is returned to the host. By applying techniques such as dataflow processing, pipelining, separable convolution, and block-RAM array storage, the system computes dense optical flow efficiently and reliably, meets real-time and low-power requirements, and remains practical.
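The coarse-to-fine control flow (an image pyramid plus repeated calls to the calculation core) can be sketched on the host side. Here the `core` callable stands in for the FPGA module; the average-pooling pyramid, level count, and iteration count are illustrative assumptions, not the patented design:

```python
import numpy as np

def build_pyramid(img, levels):
    """Downsample by 2x2 average pooling (stands in for the smoothing stage)."""
    pyr = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        p = pyr[-1]
        h, w = (p.shape[0] // 2) * 2, (p.shape[1] // 2) * 2
        pyr.append(p[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyr

def pyramidal_flow(img1, img2, core, levels=3, iters=2):
    """Coarse-to-fine driver: the flow 'core' (the FPGA module in the patent;
    any dense-flow routine here) is invoked repeatedly per pyramid level, and
    the estimate is upsampled between levels, as in the repeated core calls
    the abstract describes."""
    pyr1 = build_pyramid(img1, levels)
    pyr2 = build_pyramid(img2, levels)
    flow = np.zeros(pyr1[-1].shape + (2,))  # (dy, dx) per pixel, coarsest level
    for lvl in range(levels - 1, -1, -1):
        h, w = pyr1[lvl].shape
        if flow.shape[:2] != (h, w):
            # Upsample the coarser estimate; displacements double with resolution.
            flow = 2.0 * flow.repeat(2, axis=0).repeat(2, axis=1)[:h, :w]
        for _ in range(iters):
            flow = core(pyr1[lvl], pyr2[lvl], flow)
    return flow
```

On the FPGA, only `core` is hardware; the host merely orchestrates levels and iterations and transfers pixel data, matching the host/module split in the abstract.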
IMAGE SIGNAL PROCESSOR AND IMAGE PROCESSING DEVICE
An image processing device is provided. The image processing device includes an image signal processor for processing a raw image received from a camera and a memory for storing a previous frame of the raw image and an intermediate image generated by the processing, wherein the image signal processor estimates an initial global motion vector between the previous frame and a current frame, receives focus region information in the raw image, divides the raw image into a foreground (FG) region and a background (BG) region based on the focus region information, generates a final global motion vector by divisionally updating the initial global motion vector based on the FG region and the BG region, performs motion compensation by applying the final global motion vector to the previous frame, and outputs a final image by blending the motion-compensated previous frame with the current frame.
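The compensate-then-blend step can be illustrated with integer pixel shifts. This sketch assumes one motion vector for the FG region and one for the BG region, plus a hypothetical blending weight `alpha`; the actual ISP performs sub-pixel warping and iteratively updates the vectors:

```python
import numpy as np

def compensate_and_blend(prev, curr, fg_mask, fg_mv, bg_mv, alpha=0.5):
    """Motion-compensate the previous frame per region, then blend.

    prev, curr: 2-D float arrays (previous and current frames).
    fg_mask: boolean 2-D array, True inside the focus (FG) region.
    fg_mv, bg_mv: integer (dy, dx) motion vectors for FG and BG.
    alpha: assumed blending weight for the compensated previous frame.
    """
    # Apply the FG vector inside the focus region, the BG vector elsewhere.
    comp = np.where(fg_mask,
                    np.roll(prev, fg_mv, axis=(0, 1)),
                    np.roll(prev, bg_mv, axis=(0, 1)))
    # Blend the motion-compensated previous frame with the current frame.
    return alpha * comp + (1 - alpha) * curr
```

Splitting the update by region lets a moving subject in the focus region be compensated separately from camera-induced background motion, which is the point of dividing on the focus region information.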
SYSTEM AND METHOD FOR LEARNING TEMPORALLY CONSISTENT VIDEO SYNTHESIS USING FAKE OPTICAL FLOW
A system and method for learning temporally consistent video synthesis using fake optical flow. The system and method include receiving data associated with a source video and a target video, processing image-to-image translation across the domains of the source video and the target video, and producing a synthesized, temporally consistent video based on that translation. They further include training a neural network with data based on the synthesis of the source video and the target video.
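Temporal consistency in flow-based video synthesis typically relies on warping one frame toward another along the flow field. A minimal nearest-neighbour backward-warp sketch; the abstract does not specify its warping scheme, so this is only an illustrative primitive:

```python
import numpy as np

def warp_backward(img, flow):
    """Nearest-neighbour backward warp: sample img at (y + flow_y, x + flow_x).

    A temporal-consistency objective would compare the warped previous frame
    against the current one; this sketch shows only the warping primitive.
    flow: (h, w, 2) array with per-pixel (dy, dx) displacements.
    """
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Round to the nearest source pixel and clamp to the image bounds.
    sy = np.clip(np.rint(ys + flow[..., 0]).astype(int), 0, h - 1)
    sx = np.clip(np.rint(xs + flow[..., 1]).astype(int), 0, w - 1)
    return img[sy, sx]
```

With a "fake" (synthetically generated) flow, the same primitive warps translated frames so that a consistency loss can penalize flicker between consecutive outputs.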
DIGITAL VIDEO COMPUTING SYSTEM FOR VEHICLE
A digital video computing system receives two or more frames depicting an environment from a camera system of a vehicle. For a salient image feature identified in the two or more frames, a global motion vector is calculated that is indicative of movement of the feature at least partially attributable to movement of the vehicle. A local motion vector is calculated that is indicative of movement of the feature independent from the movement of the vehicle. Based on the local motion vector, the salient image feature is determined to have an apparent motion relative to the environment that is independent from the movement of the vehicle. A candidate image patch is identified including the salient image feature. The candidate image patch is analyzed to output a likelihood that the candidate image patch depicts a second vehicle.
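In the simplest case, separating global from local motion reduces to vector subtraction followed by a magnitude test. A hedged sketch; the `threshold` parameter and the Euclidean-norm test are assumptions, not details from the abstract:

```python
def local_motion(observed_mv, global_mv):
    """Subtract the global (ego-motion) component from a feature's observed
    motion vector; the remainder is motion independent of the vehicle."""
    return (observed_mv[0] - global_mv[0], observed_mv[1] - global_mv[1])

def moves_independently(observed_mv, global_mv, threshold=1.0):
    """Flag a salient feature whose local motion magnitude exceeds a
    (hypothetical) threshold, marking it as a candidate moving object."""
    dy, dx = local_motion(observed_mv, global_mv)
    return (dy * dy + dx * dx) ** 0.5 > threshold
```

Features flagged this way would seed the candidate image patches that are then analyzed for the likelihood of depicting a second vehicle.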
Object localization for mapping applications using geometric computer vision techniques
Systems and methods to localize objects for mapping applications may comprise a vehicle having an imaging device, a location sensor, and an edge processor. Using imaging data from the imaging device, location data from the location sensor, and bounding box data associated with objects, three-dimensional models of environments may be reconstructed using structure from motion algorithms and/or direct triangulation algorithms. After aligning the reconstructions to real-world environments based on the location data, objects may be accurately localized relative to real-world environments.
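Direct triangulation of an object point from two calibrated views can be sketched with the standard direct linear transformation (DLT) construction. The camera matrices and pixel coordinates below are illustrative, and the patented pipeline may use a different formulation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation of one point observed in two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the same point in each view.
    Returns the Euclidean 3-D point.
    """
    # Each observation contributes two linear constraints on the homogeneous
    # point X: u * (P[2] @ X) = P[0] @ X and v * (P[2] @ X) = P[1] @ X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of A with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Repeating this over bounding-box detections across frames, then aligning the reconstruction with the location-sensor data, yields object positions in real-world coordinates as the abstract describes.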