H04N17/002

Image generating device

An optical device may include: an optical fiber having a fixed end and a free end; a first actuator positioned at an actuator position between the fixed end and the free end and configured to apply a first force at the actuator position of the optical fiber so as to cause a movement of the free end of the optical fiber in a first direction, wherein the first direction is orthogonal to a longitudinal axis of the optical fiber; and a deformable rod disposed adjacent to the optical fiber and having a first end and a second end, wherein the first end is connected to a first rod position of the optical fiber and the second end is connected to a second rod position of the optical fiber.

Sensor degradation monitor

Techniques for determining a degraded state associated with a sensor are discussed herein. For example, a sensor associated with a vehicle may capture data of an environment. A portion of the data may represent a portion of the vehicle. Data associated with a region of interest can be determined based on a calibration associated with the sensor. For example, in the context of image data, image coordinates may be used to determine a region of interest, while in the context of lidar data, a beam and/or azimuth can be used to determine a region of interest. A data metric can be determined for data in the region of interest, and an action can be determined based on the data metric. For example, the action can include cleaning the sensor, scheduling maintenance, reducing a confidence associated with the data, or slowing or stopping the vehicle.
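As an illustrative sketch only (the patent does not specify a particular metric), the "data metric in a region of interest, then action" flow above could be realized with a gradient-variance measure as a proxy for occlusion or blur; the thresholds and function names here are assumptions:

```python
import numpy as np

def roi_metric(image, roi):
    """Compute a simple sharpness proxy for the region of interest:
    variance of the image gradients inside the ROI (row/col bounds)."""
    r0, r1, c0, c1 = roi
    patch = image[r0:r1, c0:c1].astype(float)
    gy, gx = np.gradient(patch)  # gradients along rows and columns
    return float(np.var(gx) + np.var(gy))

def choose_action(metric, clean_threshold=5.0, stop_threshold=0.5):
    """Map the data metric to a response; a lower metric suggests a
    more degraded (occluded/blurred) sensor view."""
    if metric < stop_threshold:
        return "slow_or_stop"
    if metric < clean_threshold:
        return "clean_sensor"
    return "nominal"
```

A fully occluded (featureless) ROI yields near-zero gradient variance and triggers the most conservative action, while a textured ROI passes as nominal.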

IMAGE PICKUP APPARATUS FOR DETECTING LINE-OF-SIGHT POSITION, CONTROL METHOD THEREFOR, AND STORAGE MEDIUM
20220353426 · 2022-11-03 ·

An image pickup apparatus, a control method therefor, and a storage medium are provided that are capable of maintaining detection accuracy of a line-of-sight position at the time of calibration even in a case where a focus controllable range changes due to a difference in a detachably mounted accessory. At a camera housing 1B, correction information about an individual difference of an eyeball is acquired by calibration based on a display position of an index in a finder thereof and a line-of-sight position of a user, detected by a line-of-sight detection circuit 201 while the line of sight is directed at the index, and correction of the detected line-of-sight position is performed using the correction information. Then, a line-of-sight position of the user, detected by the line-of-sight detection circuit 201 while the line of sight is directed at a through image, is set as a focal frame. When one of the accessories is mounted, the calibration method is changed according to a focus detection region acquired from the mounted accessory.

METHOD AND APPARATUS OF CAMERA IMAGE CORRECTION USING STORED TANGENTIAL AND SAGITTAL BLUR DATA AND COMPUTER-READABLE RECORDING MEDIUM
20220353414 · 2022-11-03 ·

Disclosed is an image correction method, implemented in an image correction apparatus that corrects an image taken by a camera, the method including: obtaining a first image by taking an image of a chart including a plurality of circles, each of which includes one or more regions of interest (ROIs) in a tangential direction and a sagittal direction, through the camera; selecting and storing tangential and sagittal image blur correction data of the camera, based on image blur data in the tangential and sagittal directions of the camera measured using the obtained first image; and loading the stored tangential and sagittal image blur correction data and applying the loaded tangential and sagittal image blur correction data to correction of a second image taken by the camera. Thus, image distortion due in particular to tangential and sagittal image blurs is effectively corrected by taking the individual characteristics of the camera into account.

TWO-WAY CAMERA OPERATION
20220353463 · 2022-11-03 ·

A computing system includes a first camera, a second camera, a microphone, and a display. The computing system is configured to independently activate the first camera and/or the second camera. The computing system first receives a user input initiating a two-way camera operation using both the first camera and the second camera. In response to the user input, the computing system activates the first camera and generates a first visualization, displaying a first image generated by the first camera, and activates the second camera and generates a second visualization, displaying a second image generated by the second camera. Both the first visualization and the second visualization are simultaneously displayed on the display of the computing system, while functions of each of the first camera and the second camera can be started, stopped, paused, unpaused, or modified individually.
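A minimal sketch of the independent per-camera control described above (class and method names are illustrative assumptions, not the patented implementation) might keep one state holder per camera channel:

```python
class CameraChannel:
    """Minimal state holder so each camera's stream can be started,
    stopped, paused, unpaused, or reconfigured independently."""
    def __init__(self, name):
        self.name = name
        self.state = "stopped"

    def start(self):
        self.state = "running"

    def stop(self):
        self.state = "stopped"

    def pause(self):
        if self.state == "running":
            self.state = "paused"

    def unpause(self):
        if self.state == "paused":
            self.state = "running"

def two_way_operation():
    """Activate both cameras, then control one without affecting the other."""
    front, rear = CameraChannel("front"), CameraChannel("rear")
    front.start()
    rear.start()   # both visualizations active simultaneously
    rear.pause()   # ...while each channel is controlled on its own
    return front.state, rear.state
```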

BOLOMETER UNIT CELL PIXEL INTEGRITY CHECK SYSTEMS AND METHODS

Techniques to test infrared detectors are disclosed. In one example, a focal plane array for an imaging system includes a plurality of infrared detectors arranged in a plurality of rows and columns, where each of the infrared detectors is configured to provide an output signal in response to externally received thermal radiation associated with a scene. A plurality of offset circuits of the imaging system may be electrically coupled to the focal plane array and configured to selectively superimpose fixed-pattern noise on the output signals to provide modified output signals. A readout integrated circuit of the imaging system may be configured to provide the modified output signals for processing to test an integrity of the infrared detectors. Detectors whose modified output signals fall outside an expected output range, based on the thermal radiation and the known offset, may be determined to be defective. Related methods, devices, and systems are also provided.
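The integrity check above can be sketched as follows, under the assumption (not stated in the abstract) that a per-pixel scene estimate is available to form the expected range; function and parameter names are illustrative:

```python
import numpy as np

def flag_defective(outputs, offsets, scene_estimate, tolerance):
    """Superimpose the known fixed-pattern offset on each detector output
    and flag pixels whose modified output deviates from the expected
    response (scene estimate plus known offset) by more than tolerance."""
    modified = outputs + offsets          # modified output signals
    expected = scene_estimate + offsets   # expected range center per pixel
    return np.abs(modified - expected) > tolerance  # True = suspect pixel
```

A stuck or dead pixel fails to track the superimposed offset relative to the scene and is flagged, while healthy pixels pass.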

Crossing point detector, camera calibration system, crossing point detection method, camera calibration method, and recording medium

A crossing point detector includes memory and a crossing point detection unit that reads out a square image from a captured image in the memory, and detects a crossing point of two boundary lines in a checker pattern depicted in the square image. The crossing point detection unit decides multiple parameters of a function model treating two-dimensional image coordinates as variables, the parameters optimizing an evaluation value based on a difference between corresponding pixel values represented by the function model and the square image, respectively, and computes the position of a crossing point of two straight lines expressed by the decided multiple parameters to thereby detect the crossing point with subpixel precision. The function model uses a curved surface that is at least first-order differentiable to express pixel values at respective positions in a two-dimensional coordinate system at the boundary between black and white regions.
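The final step named above, computing the crossing point of the two straight lines recovered from the fitted parameters, can be sketched with homogeneous line coordinates (this covers only the intersection computation, not the function-model fitting; names are illustrative):

```python
import numpy as np

def line_intersection(l1, l2):
    """Intersect two lines given in homogeneous form (a, b, c) with
    a*x + b*y + c = 0, via the cross product of the coefficient vectors.
    Returns (x, y) with subpixel (floating-point) precision."""
    p = np.cross(np.asarray(l1, float), np.asarray(l2, float))
    if abs(p[2]) < 1e-12:
        raise ValueError("lines are parallel")
    return p[:2] / p[2]
```

Because the line parameters come from a continuous fit over pixel values, the returned intersection is not snapped to the pixel grid.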

Automotive sensor integration module

An automotive sensor integration module including a plurality of sensors which differ in at least one of a sensing period or an output data format, and a signal processor, which simultaneously outputs, as sensing data, pieces of detection data respectively output from the plurality of sensors on the basis of the sensing period of any one of the plurality of sensors, calculates a reliability value of each of the pieces of detection data on the basis of the pieces of detection data and external environment data, and outputs the reliability value as reliability data.
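One way to sketch the synchronization aspect above, outputting every sensor's detection data on the sensing period of a chosen reference sensor, is a latest-sample-hold resampler (an assumption for illustration; the abstract does not prescribe this policy):

```python
import bisect

def resample_to_period(samples, ref_times):
    """samples: time-sorted list of (timestamp, detection_data) from one
    sensor. For each reference timestamp (the chosen sensor's sensing
    period), output the latest sample at or before it, or None."""
    times = [t for t, _ in samples]
    out = []
    for rt in ref_times:
        i = bisect.bisect_right(times, rt) - 1  # last sample <= rt
        out.append(samples[i][1] if i >= 0 else None)
    return out
```

Applying this per sensor yields aligned detection tuples that can then be scored against external environment data for the reliability value.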

Sensor self-calibration in the wild by coupling object detection and analysis-by-synthesis
11494939 · 2022-11-08 ·

A system for self-calibrating sensors includes an electronic control unit and a first image sensor and a second image sensor communicatively coupled to the electronic control unit. The electronic control unit is configured to: obtain a first image and a second image, where the first image and the second image contain an overlapping portion; determine an identity of an object present within the overlapping portion; obtain parameters of the identified object; determine a miscalibration of the first image sensor or the second image sensor based on a comparison of the identified object in the overlapping portions and the parameters of the identified object; and, in response to determining a miscalibration of the first image sensor or the second image sensor, calibrate the first image sensor or the second image sensor based on the parameters of the identified object and the second image or the first image, respectively.

Calibration system for calibrating visual coordinate system and depth coordinate system, calibration method and calibration device

The disclosure provides a calibration system, a calibration method, and a calibration device. The calibration method for obtaining a transformation of coordinate systems between a vision sensor and a depth sensor includes the following steps. (a) A first coordinate group of four endpoints of a calibration board in a world coordinate system is created. (b) An image of the calibration board is obtained by the vision sensor, and a second coordinate group of the four endpoints of the calibration board in a two-dimensional coordinate system is created. (c) A third coordinate group of the four endpoints of the calibration board in a three-dimensional coordinate system is created according to the first and second coordinate groups. (d) The third coordinate group is transformed to a fourth coordinate group corresponding to the depth sensor to obtain the transformation of the coordinate systems according to at least three target scanning spots.
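Step (d) above, transforming one 3-D coordinate group into the depth sensor's frame, amounts to estimating a rigid transformation between corresponding point sets. A standard sketch (the Kabsch/Procrustes method, offered here as an illustration rather than the patented procedure) is:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t mapping src points onto dst
    (least-squares rigid alignment, Kabsch method): dst ~= src @ R.T + t."""
    src_c = src - src.mean(axis=0)            # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With at least three non-collinear correspondences (e.g. the target scanning spots), the rotation and translation between the vision-derived 3-D coordinates and the depth sensor's coordinates are recovered exactly in the noise-free case.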