H04N17/00

DEVICE AND METHOD FOR REDUCING BACKLIGHT EFFECT ON A CAMERA OF A ROBOT

A device and a method reduce a backlight effect on a camera of a robot in consideration of the robot's situation. The device acquires a surrounding image, communicates with a system of the robot to obtain a current state value of the robot, and calculates a parameter value of the camera based on the current state value. Thus, the device precisely corrects the camera parameter based on the environment where the robot is actually located, thereby reducing the backlight effect on the camera.
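The correction loop described above can be sketched in a few lines. This is a hypothetical illustration, not the patented method: the state fields (`light_alignment`, `ambient_brightness`) and the exposure-scaling heuristic are assumptions introduced for the example.

```python
# Hypothetical backlight-compensation sketch: the camera exposure is
# corrected from the robot's current state (here, how directly the camera
# faces the light source and the ambient brightness).

def compute_exposure(base_exposure, robot_state):
    """Reduce exposure when the robot faces a strong light source."""
    # 1.0 means the camera looks straight into the light (worst backlight).
    alignment = max(0.0, robot_state["light_alignment"])
    brightness = robot_state["ambient_brightness"]  # 0.0 (dark) .. 1.0 (bright)
    # Heuristic: scale exposure down as backlight severity grows.
    severity = alignment * brightness
    return base_exposure * (1.0 - 0.8 * severity)

state = {"light_alignment": 1.0, "ambient_brightness": 1.0}
print(compute_exposure(10.0, state))  # strong backlight: exposure cut to ~2.0
```

The point of reading the state from the robot system, rather than from the image alone, is that the correction can anticipate backlight (e.g., from heading) before the image saturates.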

IMAGING RANGE ESTIMATION DEVICE, IMAGING RANGE ESTIMATION METHOD, AND PROGRAM
20230049073 · 2023-02-16 ·

An imaging range estimation device includes: an image data processor configured to acquire image data imaged by a camera device and generate image data with an object name label added; a reference data generator configured to set, by using geographic information, a region within a predetermined distance that is imageable from an estimated position at which the camera device is installed, and generate reference data with an object name label added; and an imaging range estimator configured to calculate a concordance rate by comparing a feature indicated by a region of an object name label of the image data with a feature indicated by a region of an object name label of the reference data, and to estimate the imaging range of the camera device to be a region of the reference data that corresponds to the image data.
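The concordance-rate comparison can be illustrated with a minimal sketch. The data layout (regions as `{pixel: object_name_label}` maps) and the pixelwise agreement score are assumptions for illustration, not the device's actual feature comparison.

```python
# Illustrative sketch: the imaging range is estimated as the candidate
# reference region whose object-name labels best agree with the labeled
# camera image.

def concordance_rate(image_labels, reference_labels):
    """Fraction of shared pixels whose object-name labels agree."""
    common = image_labels.keys() & reference_labels.keys()
    if not common:
        return 0.0
    matches = sum(1 for p in common if image_labels[p] == reference_labels[p])
    return matches / len(common)

def estimate_imaging_range(image_labels, candidates):
    """Return the candidate region name with the highest concordance rate."""
    return max(candidates, key=lambda name: concordance_rate(image_labels, candidates[name]))

image = {(0, 0): "building", (0, 1): "road", (1, 0): "tree"}
candidates = {
    "north": {(0, 0): "building", (0, 1): "road", (1, 0): "tree"},
    "south": {(0, 0): "road", (0, 1): "road", (1, 0): "building"},
}
print(estimate_imaging_range(image, candidates))  # → north
```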

INTERFACE BOARD FOR TESTING IMAGE SENSOR, TEST SYSTEM HAVING THE SAME, AND OPERATING METHOD THEREOF

A testing system for testing an image sensor includes a probe card, a pogo block receiving output signals of the probe card, an interface board configured to receive the output signals of the pogo block, convert the received output signals of the pogo block, and output the converted signals through a cable, and a testing apparatus connected to the interface board through the cable. The testing apparatus is configured to test the image sensor, as a device under test, through signals received through the cable. The interface board includes an active interface module configured to amplify the received output signals of the pogo block, convert the amplified signals into signals having the same frequency as the received output signals of the pogo block, and transmit the converted signals to the cable.

Solid-state image sensor, imaging device, and method of controlling solid-state image sensor

An object is to improve image quality of image data in a solid-state image sensor that detects an address event. The solid-state image sensor includes a photodiode, a pixel signal generation unit, and a detection unit. In the solid-state image sensor, the photodiode generates electrons and holes by photoelectric conversion. The pixel signal generation unit generates a pixel signal having a voltage according to an amount of one of the electrons and the holes. The detection unit detects whether or not a change amount in the other of the electrons and the holes has exceeded a predetermined threshold and outputs a detection signal.
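The thresholded change detection described above is the core of an event (address-event) pixel, and can be sketched as follows. The signal model, threshold value, and reference-reset behavior are illustrative assumptions, not details from the abstract.

```python
# Minimal sketch of address-event detection: a change in the photocurrent
# relative to the last event level is compared against a threshold, and a
# positive or negative detection signal is emitted when it is exceeded.

class EventPixel:
    def __init__(self, threshold):
        self.threshold = threshold
        self.reference = 0.0  # level at which the last event fired

    def detect(self, photocurrent):
        """Return +1/-1 when the change exceeds the threshold, else 0."""
        change = photocurrent - self.reference
        if abs(change) > self.threshold:
            self.reference = photocurrent  # reset reference after an event
            return 1 if change > 0 else -1
        return 0

pixel = EventPixel(threshold=0.5)
print([pixel.detect(v) for v in [0.2, 0.8, 0.9, 0.1]])  # → [0, 1, 0, -1]
```

Splitting the two carrier types between the pixel-signal path and the detection path, as the abstract describes, lets the same photodiode serve both conventional readout and event detection.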

Image processing device
11582402 · 2023-02-14 ·

An image processing device includes a rotation processor and an image processor. The rotation processor receives an input image and generates a temporary image according to the input image. The image processor is coupled to the rotation processor and outputs a processed image according to the temporary image, wherein the image processor has a predetermined image processing width, a width of the input image is larger than the predetermined image processing width, and a width of the temporary image is less than the predetermined image processing width.
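The width constraint described above can be made concrete with a small sketch. Treating the temporary image as a 90-degree rotation of the input (so its width becomes the input's height) is an assumption for illustration; the abstract only requires the temporary image's width to fit the processing width.

```python
# Hypothetical illustration: the input is wider than the processor's
# width limit, so a rotation step yields a temporary image narrow enough
# to process.

def rotate_90(image):
    """Rotate a row-major image (list of rows) by 90 degrees clockwise."""
    return [list(col) for col in zip(*image[::-1])]

def process(image, max_width):
    """Accept only images that fit the processor's width limit."""
    if image and len(image[0]) > max_width:
        raise ValueError("input wider than the processing width")
    return image  # placeholder for the actual per-pixel processing

wide = [[1, 2, 3, 4]]        # 4 wide, 1 tall: exceeds a width limit of 3
temporary = rotate_90(wide)  # now 1 wide, 4 tall
print(len(temporary[0]))     # → 1
process(temporary, max_width=3)
```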

Control method, control system, electronic device and readable storage medium for capsule endoscope

The present invention discloses a control method, a control system, an electronic device, and a readable storage medium for a capsule endoscope. The method includes: providing a working apparatus comprising a capsule endoscope and an external data recorder for cooperating with and controlling the capsule endoscope; monitoring, by the external data recorder, the received ambient power before wireless transmission of the capsule endoscope or during an intermission between two transmissions, and/or monitoring, by the external data recorder, the output power of the capsule endoscope as data is transmitted during wireless transmission; and adjusting the operating state of the working apparatus according to the ambient power and/or the output power. The present invention can monitor the power during the dormant period before image interaction and/or during image interaction, and thus adjust the operating state of the capsule endoscope in real time, which can improve the wireless communication performance and operating time of the capsule endoscope.
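The adjustment step can be sketched as a simple decision on the two monitored quantities. The power thresholds, units, and state names below are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch of the power-based state adjustment: ambient power is
# sampled during dormant periods, output power during transmission, and
# the operating state is chosen accordingly.

def adjust_state(ambient_power_dbm, output_power_dbm=None):
    """Pick an operating state from the monitored power readings."""
    if output_power_dbm is not None and output_power_dbm < -70:
        return "boost_tx_power"      # weak link observed during transmission
    if ambient_power_dbm > -40:
        return "defer_transmission"  # strong interference while dormant
    return "normal"

print(adjust_state(-60))                        # → normal
print(adjust_state(-30))                        # → defer_transmission
print(adjust_state(-60, output_power_dbm=-80))  # → boost_tx_power
```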

DEVICE, METHOD, AND USE OF THE DEVICE FOR ADJUSTING, ASSEMBLING AND/OR TESTING AN ELECTRO-OPTICAL SYSTEM
20230037764 · 2023-02-09 ·

A device (1) for producing a photoactive system (10), in particular a deactivated photoactive system (10), characterised by: an imaging device (2) having at least one imaging arrangement (20), wherein the at least one imaging arrangement (20) has a beam passage plane (SE) and an optical axis (O), and the at least one imaging arrangement (20) is designed to generate electromagnetic beams which extend along a beam path and pass through the imaging arrangement (20) on the beam passage plane (SE) and to reflect the electromagnetic beams along the beam path at the photoactive arrangement (11) in order to image, on a first focal plane (B1) of the imaging arrangement (20), an evaluation image of a photoactive arrangement (11) of the photoactive system (10) to be produced, and the electromagnetic beams of the beam path are captured on the first focal plane (B1) in order to capture the evaluation image of the photoactive arrangement (11); a first holding device (3a) having a first holding plane (Ha) for holding, on the first holding plane (Ha), an optical arrangement (12) of the photoactive system (10) to be produced; and a second holding device (3b) having a second holding plane (Hb) for holding the photoactive arrangement (11) on the second holding plane (Hb); wherein the first holding device (3a) having the first holding plane (Ha) and/or the second holding device (3b) having the second holding plane (Hb) is/are movably positioned relative to the imaging device (2).

Vehicular vision system that dynamically calibrates a vehicular camera

A vehicular vision system includes a camera disposed at a vehicle and operable to capture multiple frames of image data during a driving maneuver of the vehicle. A control includes an image processor that processes frames of captured image data to determine feature points in an image frame when the vehicle is operated within a first range of steering angles, and to determine motion trajectories of those feature points in subsequent image frames for the respective range of steering angles. The control determines a horizon line based on the determined motion trajectories. Responsive to determination that the determined horizon line is non-parallel to the horizontal axis of the image plane, at least one of pitch, roll or yaw of the camera is adjusted. Image data captured by the camera is processed at the control for object detection.
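The horizon-line check described above can be sketched numerically. The least-squares line fit over trajectory endpoints and the roll-only correction are illustrative assumptions; the abstract allows any of pitch, roll, or yaw to be adjusted.

```python
# Hedged sketch: fit a horizon line to feature-point trajectory endpoints,
# and compute a roll correction when the line is not parallel to the
# image's horizontal axis.
import math

def horizon_slope(points):
    """Least-squares slope of a line through (x, y) points."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

def roll_correction_deg(points, tolerance_deg=0.1):
    """Roll adjustment (degrees) needed to level the horizon, or 0.0."""
    angle = math.degrees(math.atan(horizon_slope(points)))
    return -angle if abs(angle) > tolerance_deg else 0.0

# Horizon tilted by 45 degrees: the correction rolls the camera back by ~45.
print(roll_correction_deg([(0, 0), (1, 1), (2, 2)]))
```

Restricting the fit to frames captured within a narrow steering-angle range, as the abstract specifies, keeps the trajectories consistent enough for a stable line fit.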
