Patent classifications
G06T7/80
HOLISTIC CAMERA CALIBRATION SYSTEM FROM SPARSE OPTICAL FLOW
Holistic systems and methods are used for calibrating image capture devices. An image capture device includes a lens, an image sensor, an inertial measurement unit (IMU), and an image signal processor (ISP). The image sensor detects images as frames and the IMU captures motion data. The ISP detects one or more key points on the frames and matches the one or more key points between the frames. The ISP computes one or more calibration parameters. The one or more calibration parameters are based on the matched key points and a model. The model includes an optical component, an IMU component, and a sensor component. The ISP performs a calibration using the calibration parameters.
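As one hedged illustration of recovering a calibration parameter from sparse optical flow plus IMU motion (not the patented model, which jointly covers optical, IMU, and sensor components): under a small pure-yaw rotation reported by the IMU, key points near the image centre shift horizontally by approximately f · yaw, so the focal length can be estimated from matched key points between two frames. The function name and the pure-yaw assumption are illustrative only.

```python
import numpy as np

def estimate_focal_length(pts_prev, pts_next, yaw_rad):
    # pts_prev / pts_next: matched key points (N x 2, in pixels, centred
    # on the principal point) from consecutive frames; yaw_rad: the
    # IMU-reported yaw between the frames. For a small pure-yaw rotation,
    # points near the image centre shift horizontally by about f * yaw,
    # so the focal length falls out of the mean horizontal displacement.
    dx = np.mean(pts_next[:, 0] - pts_prev[:, 0])
    return dx / yaw_rad
```

A full holistic calibration would instead minimize reprojection error over all model parameters at once; this sketch only shows how one parameter becomes observable from inter-frame point correspondence.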
SYSTEM AND METHOD FOR CALIBRATING A TIME DIFFERENCE BETWEEN AN IMAGE PROCESSOR AND AN INERTIAL MEASUREMENT UNIT BASED ON INTER-FRAME POINT CORRESPONDENCE
Systems and methods are used for calibrating a time difference between an image signal processor (ISP) and an inertial measurement unit (IMU) of an image capture device. An image capture device includes a lens, an image sensor, an IMU, and an ISP. The image sensor detects images as frames and the IMU captures motion data. The ISP detects one or more key points on the frames and matches the one or more key points between the frames. The ISP computes one or more calibration parameters. The one or more calibration parameters are based on the matched key points and a time difference between the ISP and the IMU. The ISP performs a calibration using the calibration parameters.
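One common way to estimate such an ISP-to-IMU time difference (sketched here as an assumption, not the claimed method) is to compare the rotation rate implied by inter-frame key-point motion against the gyro signal and find the lag that best aligns them:

```python
import numpy as np

def estimate_time_offset(flow_rate, gyro_rate, dt, max_shift):
    # flow_rate: rotation rate derived from matched key points per frame;
    # gyro_rate: IMU angular rate resampled to the same period dt.
    # Slide one signal against the other and pick the lag with the
    # highest normalized correlation; that lag, in samples, times the
    # sample period is the estimated time offset.
    best_shift, best_corr = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        a = flow_rate[max(0, s):len(flow_rate) + min(0, s)]
        b = gyro_rate[max(0, -s):len(gyro_rate) + min(0, -s)]
        a = a - a.mean()
        b = b - b.mean()
        c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        if c > best_corr:
            best_corr, best_shift = c, s
    return best_shift * dt
```

The grid-search-over-lags approach is a deliberately simple stand-in; a production calibrator would typically refine the offset to sub-sample precision inside a joint optimization.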
CALCULATING A DISTANCE BETWEEN A VEHICLE AND OBJECTS
A method for calculating a distance between a vehicle camera and an object, the method may include: (a) obtaining an image that was acquired by the vehicle camera of a vehicle; the image captures the horizon, the object, and road lane boundaries; (b) determining an initial row-location horizon estimate and a row-location contact point estimate, the contact point being between the object and a road on which the vehicle is positioned; (c) determining a vehicle camera roll angle correction that, once applied, will cause the lane boundaries to be parallel to each other in the real world; (d) calculating a new row-location horizon estimate, wherein the calculating comprises updating the row-location horizon estimate based on the vehicle camera roll angle correction; and (e) calculating the distance between the vehicle camera and the object based on a difference between the new row-location horizon estimate and the row-location contact point estimate.
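Step (e) relies on standard flat-road pinhole geometry: the distance to the contact point is proportional to the camera height divided by the pixel offset of the contact row below the horizon row. A minimal sketch of that final computation, assuming a known focal length in pixels and camera mounting height (both hypothetical parameters, not stated in the abstract):

```python
def distance_to_object(row_horizon, row_contact, focal_px, camera_height_m):
    # Under a flat-road pinhole model, the ground distance to the
    # object's contact point is f * H / (v_contact - v_horizon),
    # where v is the image row (increasing downward).
    dv = row_contact - row_horizon
    if dv <= 0:
        raise ValueError("contact point must lie below the horizon")
    return focal_px * camera_height_m / dv
```

Because the distance is inversely proportional to the row difference, even a small horizon error translates into a large range error at distance, which is why the method refines the horizon estimate via the roll correction first.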
NON-UNIFORMITY CORRECTION CALIBRATIONS IN INFRARED IMAGING SYSTEMS AND METHODS
Techniques for facilitating non-uniformity correction calibrations are provided. In one example, an infrared imaging system includes an infrared imager and a logic device. The infrared imager is configured to capture a first set of infrared images of a reference object using a first integration time. The infrared imager is further configured to capture a second set of infrared images of the reference object using a second integration time different from the first integration time. The logic device is configured to determine a dark current correction map based on the second set of infrared images. The logic device is further configured to generate a non-uniformity correction map based on the dark current correction map. Related devices and methods are also provided.
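A plausible reading of the two-integration-time scheme (an assumption on my part; the abstract does not give the formulas) is that dark current accumulates roughly linearly with integration time, so capturing the same reference object at two integration times lets the per-pixel dark-current rate be separated from the scene signal. A sketch of that idea, with illustrative function names:

```python
import numpy as np

def estimate_dark_current_rate(frames_t1, frames_t2, t1, t2):
    # Dark current accumulates roughly linearly with integration time,
    # so the per-pixel rate is the slope between the temporal means of
    # the two capture sets (each set: stack of frames, shape K x H x W).
    m1 = np.mean(frames_t1, axis=0)
    m2 = np.mean(frames_t2, axis=0)
    return (m2 - m1) / (t2 - t1)

def nuc_offset_map(frames, integration_time, dark_rate):
    # Subtract the dark-current contribution, then take each pixel's
    # residual deviation from the frame mean as its fixed-pattern offset.
    mean_frame = np.mean(frames, axis=0) - dark_rate * integration_time
    return mean_frame - mean_frame.mean()
```

Real NUC calibrations usually also include per-pixel gain terms and temperature dependence; this sketch covers only the offset path the abstract describes.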
VEHICULAR ACCESS CONTROL BASED ON VIRTUAL INDUCTIVE LOOP
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for monitoring events using a Virtual Inductive Loop system. In some implementations, image data is obtained from cameras. A region depicted in the obtained image data is identified, the region comprising lines spaced by a distance that satisfies a distance threshold. For each line included in the region, it is determined whether an object depicted crossing the line satisfies a height criterion indicating that the line is activated. In response to determining that an object depicted crossing the line satisfies the height criterion, an event is determined to have likely occurred using data indicating (i) which of the lines were activated and (ii) the order in which the lines were activated. In response to determining that an event likely occurred, actions are performed using at least some of the data.
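The activation-order logic mirrors a physical inductive-loop pair: which lines fired, and in what sequence, indicates direction of travel. A minimal sketch of that decision step (event labels and the tie-breaking policy are illustrative assumptions, not taken from the abstract):

```python
def classify_crossing(activations):
    # activations: (line_index, timestamp) pairs, one per activated line,
    # recorded when an object crossing that line met the height criterion.
    if len(activations) < 2:
        return None  # not enough evidence to infer a direction
    order = [idx for idx, _ in sorted(activations, key=lambda a: a[1])]
    if order == sorted(order):
        return "entry"   # lines fired in increasing index order
    if order == sorted(order, reverse=True):
        return "exit"    # lines fired in decreasing index order
    return "ambiguous"
```

The height criterion on each line crossing is what filters out shadows and small animals before this direction logic runs, which is why activations, not raw detections, feed the classifier.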