G06T7/85

Image processing apparatus and method for extracting second RGB and ToF feature points having a correlation between first RGB and ToF feature points

An image processing apparatus and method that extract a second RGB feature point and a second ToF feature point such that a correlation between a first RGB feature point and a first ToF feature point is equal to or greater than a predetermined value; calculate an error value between the second RGB feature point and the second ToF feature point; update pre-stored calibration data when the error value is greater than a threshold value and calibrate the RGB image and the ToF image using the updated calibration data; and synthesize the calibrated RGB and ToF images.
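The calibration-update loop described in this abstract can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the correlation values, the correlation minimum, the error threshold, and the use of a simple 2D offset as the "calibration data" are all assumptions.

```python
# Hypothetical sketch of the abstract's calibration-update loop; the
# correlation measure, thresholds, and offset-only calibration model are
# assumptions, not taken from the patent.
import numpy as np

CORR_MIN = 0.8       # "predetermined value" for the correlation (assumed)
ERR_THRESHOLD = 2.0  # error threshold in pixels (assumed)

def select_correlated_pairs(rgb_pts, tof_pts, corr):
    """Keep only RGB/ToF feature-point pairs whose correlation meets CORR_MIN."""
    mask = corr >= CORR_MIN
    return rgb_pts[mask], tof_pts[mask]

def calibrate(rgb_pts, tof_pts, calib):
    """Update the stored calibration (here, a 2D offset) if the mean error is too large."""
    err = np.linalg.norm(rgb_pts - (tof_pts + calib), axis=1).mean()
    if err > ERR_THRESHOLD:
        calib = (rgb_pts - tof_pts).mean(axis=0)  # re-estimate the offset
    return calib

rgb = np.array([[10.0, 10.0], [20.0, 15.0], [30.0, 25.0]])
tof = rgb - np.array([3.0, 1.0])     # ToF points shifted by a fixed offset
corr = np.array([0.9, 0.95, 0.5])    # last pair is poorly correlated
rgb2, tof2 = select_correlated_pairs(rgb, tof, corr)
new_calib = calibrate(rgb2, tof2, np.zeros(2))
print(new_calib)  # recovered offset: [3. 1.]
```

With the recovered offset applied, the ToF image would align with the RGB image before the two are synthesized.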

SYSTEM AND METHOD FOR DETECTING A CONDITION PROMPTING AN UPDATE TO AN AUTONOMOUS VEHICLE DRIVING MODEL
20230084316 · 2023-03-16

Systems and methods for implementing one or more autonomous features for autonomous and semi-autonomous control of one or more vehicles are provided. More specifically, image data may be obtained from an image acquisition device and processed utilizing one or more machine learning models to identify, track, and extract one or more features of the image utilized in decision-making processes for providing steering angle and/or acceleration/deceleration input to one or more vehicle controllers. In some instances, techniques may be employed such that the autonomous and semi-autonomous control of a vehicle may switch between vehicle-follow and lane-follow modes. In some instances, at least a portion of the machine learning model may be updated based on one or more conditions.

Method, system and computer program product for emulating depth data of a three-dimensional camera device

A method, system and computer program product for emulating depth data of a three-dimensional camera device is disclosed. The method includes concurrently operating a radar device and the 3D camera device to generate training radar data and training depth data, respectively. Each of the radar device and the 3D camera device has a respective field of view, and the field of view of the radar device overlaps the field of view of the 3D camera device. The method also includes inputting the training radar and depth data to a neural network and employing the training radar and depth data to train the neural network. Once trained, the neural network is configured to receive real radar data as input and to output substitute depth data.
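The pipeline above, paired radar and depth samples from overlapping fields of view train a model that later maps real radar returns to substitute depth, can be illustrated with a minimal stand-in. A least-squares linear fit replaces the neural network purely for illustration; the data and the linear relation are invented.

```python
# Minimal stand-in for the described pipeline: a least-squares fit plays the
# role of the neural network, and the radar/depth samples are synthetic.
import numpy as np

rng = np.random.default_rng(0)
train_radar = rng.uniform(1.0, 50.0, size=(200, 1))  # training radar ranges (m)
train_depth = 0.98 * train_radar + 0.5               # concurrent 3D-camera depth (m)

# "Training": fit depth = a * radar + b over the paired samples
A = np.hstack([train_radar, np.ones_like(train_radar)])
(a, b), *_ = np.linalg.lstsq(A, train_depth, rcond=None)

def emulate_depth(real_radar):
    """Output substitute depth data from real radar input, as the trained model would."""
    return a * real_radar + b

print(round(float(emulate_depth(10.0)[0]), 2))  # 10.3
```

A real system would use a neural network able to capture the nonlinear, spatially varying relation between radar returns and dense depth; the shape of the workflow (train on concurrent pairs, then infer from radar alone) is the same.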

INTELLIGENT MANUFACTURING INDUSTRIAL INTERNET OF THINGS WITH FRONT SPLIT SERVICE PLATFORM, CONTROL METHOD AND MEDIUM THEREOF

An intelligent manufacturing industrial Internet of Things with a front split service platform is provided, which includes: an obtaining module configured to obtain an image taken by a first camera on a production line as first image data and an image taken by a second camera as second image data through a sensor network platform; a three-dimensional module configured to process the first image data and the second image data into three-dimensional image data; a recognition module configured to obtain a plurality of point positions of a distortion part as judgment point positions; a mapping module configured to map the judgment point positions to the second image data to form second calibration point positions; and a calibration module configured to calibrate the first camera and the second camera according to first calibration point positions and the second calibration point positions.

Method of calibration of a stereoscopic camera system for use with a radio therapy treatment apparatus

The disclosed calibration method includes positioning a calibration phantom on an adjustable table on the surface of a mechanical couch, with the phantom's centre at an estimated location of the iso-centre of a radio therapy treatment apparatus. The calibration phantom is then irradiated using the apparatus, and the relative location of the centre of the calibration phantom and the iso-centre of the apparatus is determined by analysing images of the irradiation of the calibration phantom. The calibration phantom is then repositioned by the mechanical couch, which applies an offset to the phantom corresponding to the determined relative location of the centre of the calibration phantom and the iso-centre of the apparatus. Images of the repositioned calibration phantom, to which the offset has been applied, are obtained, and the obtained images are processed to set the co-ordinate system of a stereoscopic camera system relative to the iso-centre of the apparatus.
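The repositioning step reduces to simple vector arithmetic: the couch offset is the measured displacement between the phantom centre and the iso-centre. A sketch with invented coordinates:

```python
# Sketch of the repositioning step; the coordinates are invented for
# illustration and do not come from the patent.
import numpy as np

iso_centre = np.array([0.0, 0.0, 0.0])       # machine iso-centre (mm)
phantom_centre = np.array([2.5, -1.0, 0.4])  # centre found from irradiation images

offset = iso_centre - phantom_centre         # couch shift that centres the phantom
repositioned = phantom_centre + offset
print(repositioned)  # phantom centre now coincides with the iso-centre: [0. 0. 0.]
```

Once the phantom sits at the iso-centre, images of it give the stereoscopic camera system a co-ordinate system anchored at that point.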

Systems and methods for measurement of 3D attributes using computer vision

A system including a computing device and a camera is disclosed; the system is configured for measuring three-dimensional attributes and associated performance measurements of a mechanical device. Some embodiments comprise a camera configured to capture images of the mechanical device and a computing device in communication with the camera. In some embodiments, the computing device is configured to access a first set of pixels associated with a first plurality of fiducials to calibrate a spatial resolution of the camera. A second image from the camera can be converted into a second set of pixels associated with each of the plurality of fiducials, which are attached to the mechanical device. The computing device can be further configured to compare the first and second sets of pixels to determine the location of the plurality of fiducials on the mechanical device.
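The two-image comparison can be sketched as follows: a known fiducial spacing calibrates the camera's spatial resolution (mm per pixel), and the pixel displacement of each fiducial between images then converts to a physical displacement of the mechanical device. The spacing and pixel coordinates below are assumptions for illustration.

```python
# Hypothetical version of the two-image comparison; the fiducial spacing and
# pixel coordinates are invented, not taken from the patent.
import numpy as np

FIDUCIAL_SPACING_MM = 50.0  # known physical spacing between fiducials (assumed)

first_px = np.array([[100, 100], [300, 100]])   # fiducials in the calibration image
second_px = np.array([[110, 100], [310, 100]])  # same fiducials after the device moves

# Spatial-resolution calibration from the known spacing
mm_per_px = FIDUCIAL_SPACING_MM / np.linalg.norm(first_px[1] - first_px[0])

# Pixel displacement converted to physical displacement
displacement_mm = (second_px - first_px) * mm_per_px
print(displacement_mm[0])  # each fiducial moved 2.5 mm along x: [2.5 0. ]
```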

Image capturing apparatus, image processing apparatus, image processing method, image capturing apparatus calibration method, robot apparatus, method for manufacturing article using robot apparatus, and recording medium
11637948 · 2023-04-25

An image capturing apparatus including a lens and a processing unit, wherein the lens includes a first region through which a first light ray passes and a second region through which a second light ray passes, wherein the first region and the second region are arranged in a predetermined direction, and wherein the processing unit sets a component of the predetermined direction as a degree of freedom in a first relative positional relationship between a predetermined position in the first region and a predetermined position in the second region.

System and method for collaborative sensor calibration
11635762 · 2023-04-25

The present teaching relates to a method, system, medium, and implementations for sensor calibration. An ego vehicle determines whether a sensor deployed on the ego vehicle to facilitate its autonomous driving needs to be calibrated and, if so, sends a request for assistance in collaborative calibration of the sensor, together with a first position of the ego vehicle or a first configuration of the sensor with respect to the ego vehicle. When a response to the request is received, an assisting vehicle is directed to travel near the ego vehicle to facilitate the calibration, and the ego vehicle coordinates with the assisting vehicle to enable the sensor to acquire information of a target present on the assisting vehicle for the collaborative calibration of the sensor.

System and method for simultaneous multiple-sensor calibration and transformation matrix computation
11635313 · 2023-04-25

The present teaching relates to an apparatus, method, medium, and implementations for simultaneously calibrating multiple sensors of different types. Multiple sensors of different types are first activated to initiate their simultaneous calibration based on a 3D construct including a plurality of fiducial markers. The sensors, including visual and depth-based sensors, operate in their respective coordinate systems. Each sensor is calibrated by acquiring sensor information of the 3D construct, detecting a feature point on each of the plurality of fiducial markers based on the sensor information, and estimating a set of 3D coordinates, with respect to its coordinate system, corresponding to the detected feature points, based on which calibration parameters are generated. The sets of 3D coordinates derived in the different coordinate systems are then used to compute at least one transformation matrix for at least one corresponding pair of the plurality of sensors.
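One common way to compute a transformation matrix between two sensors' 3D coordinates of the same fiducial feature points is the Kabsch/SVD rigid alignment. The abstract does not specify this method, so the sketch below is an assumption, and the point coordinates are invented.

```python
# Kabsch/SVD rigid alignment between corresponding 3D point sets; an assumed
# technique for the transformation-matrix step, not the patent's own method.
import numpy as np

def rigid_transform(src, dst):
    """Return a 4x4 matrix T with dst ≈ T @ [src; 1] for Nx3 corresponding points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)             # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

# Fiducial feature points as seen by a depth sensor vs. a visual sensor whose
# frame is translated by (1, 2, 3); values invented for illustration.
pts_a = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
pts_b = pts_a + np.array([1.0, 2.0, 3.0])
T = rigid_transform(pts_a, pts_b)
print(np.round(T[:3, 3], 6))  # recovered translation: [1. 2. 3.]
```

With one such matrix per sensor pair, measurements from any sensor can be expressed in any other sensor's coordinate system.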

CAMERA SYSTEM IN SITUATION BUILT-IN-TEST
20230122529 · 2023-04-20

An autonomous or semi-autonomous vehicle camera system including a camera having a field of view, wherein the camera is operable to receive optical information in the field of view; a display located in the camera's field of view; and a controller in electrical connection with the camera, wherein the controller is operable to conduct a Built-in-Test. The Built-in-Test is configured to present one or more images in the camera's field of view via the display to determine functionality of the camera system.