H04N23/665

Self-adaptive LiDAR-camera synchronization system
11614527 · 2023-03-28 ·

A method may include determining an alignment time based on a zero-crossing point corresponding to a LiDAR sensor and a horizontal field of view corresponding to an image-capturing sensor. The method may include determining a delay timing for initiating image capturing by the image-capturing sensor in which the delay timing is based on at least one of: the alignment time, a packet capture timing corresponding to the LiDAR sensor, and an average frame exposure duration corresponding to the image-capturing sensor. The method may include initiating data capture by the LiDAR sensor, and after the initiating of data capture by the LiDAR sensor and after the delay timing has elapsed, initiating data capture by the image-capturing sensor.
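A minimal Python sketch of the capture-sequencing idea described above. The combining formula and every identifier here are assumptions for illustration, not taken from the patent text: one plausible reading is to wait until the LiDAR sweep reaches the camera's field of view, then back off by half the exposure so the frame is centred on the alignment instant.

```python
import time

def compute_delay_s(alignment_time_s, packet_capture_timing_s, avg_exposure_s):
    """Illustrative combination of the three timing inputs named in the
    abstract; the actual formula in the patent may differ."""
    return alignment_time_s + packet_capture_timing_s - avg_exposure_s / 2.0

def start_capture(lidar, camera, delay_s, sleep=None):
    """Start the LiDAR first, wait out the delay, then trigger the camera,
    matching the ordering the abstract describes."""
    lidar.start()
    (sleep or time.sleep)(delay_s)
    camera.start()
```

The `sleep` parameter is injectable only so the sequencing can be exercised without real hardware or wall-clock waits.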

LOW POWER SENSOR, PROCESSOR, AND DATA PROCESSING SYSTEM INCLUDING THE SENSOR AND THE PROCESSOR
20220353411 · 2022-11-03 ·

A sensor includes a control circuit that is set to a first operation mode, in which it prepares an operation by receiving a clock signal and an operation command from a processor. The sensor is configured to generate a first signal including a result of the operation corresponding to the operation command and a second signal indicating completion of the operation. An interface circuit is configured to transmit the first signal and the second signal to the processor. The control circuit is set to a second operation mode when the processor blocks the clock signal in response to the transmission of the second signal.
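A toy software model of the clock-gating handshake described above; the class and method names are invented for this sketch and do not come from the patent. The point is the ordering: supply clock, issue command, receive result plus completion signal, then gate the clock to drop the sensor into its low-power mode.

```python
class Sensor:
    def __init__(self):
        self.mode = "low_power"          # second operation mode: clock gated
        self.clock_on = False

    def receive_clock(self, on):
        self.clock_on = on
        self.mode = "active" if on else "low_power"

    def execute(self, command):
        assert self.clock_on, "sensor needs the clock to operate"
        result = f"result of {command}"   # first signal: operation result
        done = True                       # second signal: completion flag
        return result, done

class Processor:
    def run(self, sensor, command):
        sensor.receive_clock(True)        # wake: supply clock (first mode)
        result, done = sensor.execute(command)
        if done:
            sensor.receive_clock(False)   # gate clock -> second mode
        return result
```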

Multi-aperture cameras with at least one two state zoom camera

Multi-cameras, and in particular dual-cameras, comprising a Wide camera comprising a Wide lens and a Wide image sensor, the Wide lens having a Wide effective focal length EFL_W, and a folded Tele camera comprising a Tele lens with a first optical axis, a Tele image sensor and an OPFE, wherein the Tele lens includes, from an object side to an image side, a first lens element group G1, a second lens element group G2 and a third lens element group G3, wherein at least two of the lens element groups are movable relative to the image sensor along the first optical axis to bring the Tele lens to two zoom states, wherein an effective focal length (EFL) of the Tele lens is changed from EFL_T,min in one zoom state to EFL_T,max in the other zoom state, wherein EFL_T,min > 1.5×EFL_W and wherein EFL_T,max > 1.5×EFL_T,min.
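The two focal-length inequalities at the end of the abstract can be checked directly; the function name and the example values (in mm) are illustrative, not from the patent.

```python
def zoom_states_valid(efl_w, efl_t_min, efl_t_max):
    """Check EFL_T,min > 1.5 x EFL_W and EFL_T,max > 1.5 x EFL_T,min."""
    return efl_t_min > 1.5 * efl_w and efl_t_max > 1.5 * efl_t_min
```

For example, a 5 mm Wide lens with Tele zoom states of 8 mm and 13 mm satisfies both conditions (8 > 7.5 and 13 > 12).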

Apparatus, method and computer program for image capturing
11611726 · 2023-03-21 ·

Examples of the disclosure relate to apparatus, methods and computer programs for enabling sub-pixel information to be determined in captured images. The apparatus can comprise means for activating at least one filter wherein the at least one filter is positioned in front of at least one image sensor. The at least one filter is configured to at least partially filter light such that the at least one filter has a spatial variation of transparency on an analogue scale across an area covered by the at least one filter. The apparatus also comprises means for detecting an image captured by the at least one image sensor; and using information relating to the spatial variation of transparency of the at least one filter to determine sub-pixel information in the captured image.
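One hedged way to picture how an analogue transparency variation could yield sub-pixel information: a point source inside one pixel is attenuated by the filter's local transmission, so the ratio of the filtered to the unfiltered reading reveals where inside the pixel the light landed. The linear filter profile, its endpoints, and all names below are assumptions for this sketch, not the patent's method.

```python
def transmission(frac):
    """Assumed filter profile: transparency rises linearly from 0.2 to 0.8
    across the pixel (frac in [0, 1])."""
    return 0.2 + 0.6 * frac

def subpixel_position(filtered_value, unfiltered_value):
    """Invert the assumed linear profile to estimate the fractional
    position of a point source within the pixel."""
    ratio = filtered_value / unfiltered_value
    return (ratio - 0.2) / 0.6
```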

Fixed pattern noise reduction and high spatial frequency filtering using vari-focus lenses in low contrast scenes
11611692 · 2023-03-21 ·

A method for identifying and correcting fixed pattern noise includes capturing a focused image and an unfocused image via a variable focus lens. Fixed pattern noise represented in the unfocused image is filtered from the focused image. The unfocused image represents a low-pass filtered component of the focused image; subtracting the unfocused image from the focused image results in a high-pass and fixed pattern noise filtered focused image. Image capture and focus of the variable focus lens are synchronized to remove transitional frames from the image stream.
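The subtraction step can be sketched directly: the defocused frame carries the fixed pattern noise plus only the low spatial frequencies of the scene, so subtracting it leaves a high-pass, FPN-free image. Pure-Python lists stand in for real frames here; the function name is illustrative.

```python
def remove_fpn(focused, unfocused):
    """Per-pixel subtraction of the unfocused (low-pass + FPN) frame
    from the focused frame."""
    return [[f - u for f, u in zip(frow, urow)]
            for frow, urow in zip(focused, unfocused)]
```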

Transmitter

There is provided a transmitter including an image processor that sets region information corresponding to a region set for an image for each row in the image and that transmits the set region information and region data corresponding to the region for each row, in which the image processor sets the region by analyzing the image or on the basis of externally acquired region-designating information, and the region information includes information indicating a position of a row and information indicating a position of a column of the region included in the row.
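An illustrative per-row packing of a rectangular region of interest into (region information, region data) records, in the spirit of the abstract. The record layout and all names are assumptions; the actual wire format is not given in the abstract.

```python
def pack_region(image, top, left, height, width):
    """For each row of the region, emit region information (row index,
    starting column, length) together with that row's region data."""
    packets = []
    for r in range(top, top + height):
        region_info = {"row": r, "col": left, "len": width}
        region_data = image[r][left:left + width]
        packets.append((region_info, region_data))
    return packets
```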

DATA TRANSMISSION CABLE AND RELATED DEVICE
20230076232 · 2023-03-09 ·

A data transmission cable (100) includes: a signal bundle (110), where the signal bundle (110) includes at least three signal cables, the at least three signal cables are disposed at intervals, pairwise signal cables form a differential pair signal cable, and the differential pair signal cable is used to transmit a differential data signal; a ground cable (120), where the ground cable (120) encircles and covers the signal bundle (110), and the ground cable (120) is used to transmit a ground signal and isolate the signal bundle (110) from a signal bundle (110) of another data transmission cable (100); and a filling medium (130), where the filling medium (130) is disposed in the space on an inner side of the ground cable (120) except for the signal cables, so that problems of poor transmission quality and short transmission distance on a MIPI bus can be resolved.

IMAGE CONVERSION DEVICE
20230130130 · 2023-04-27 ·

An image conversion device includes: a lens module configured to allow image light beams of an object to pass, an optical waveguide element configured to transmit the image light beams to a light processing component, and an image sensor configured to convert the image light beams into digital image signals. By changing the image capturing and image forming methods, higher image quality may be achieved while flexibility for expansion is maintained.

IMAGING DEVICE, IMAGING METHOD, AND ELECTRONIC APPARATUS

An imaging device includes a controller, a power supply, a regulator, and a switch. The controller is configured to control an imaging unit, on the basis of a command and data that are received from a host in accordance with an I2C/I3C communication protocol. The power supply is configured to supply a voltage to a digital block of the controller. The digital block is configured to be subjected to dynamic voltage frequency scaling within one-frame operation. The regulator and the switch are provided between the digital block and the power supply, and coupled in parallel with each other.

Camera and method for fusing snapped images

The present application provides cameras and snapped image fusing methods. The camera includes: a lens, a light splitter, a first image sensor, a second image sensor, and a master processing chip. The light splitter is configured to split incident light, which enters the camera through the lens, into visible light and infrared light. The first image sensor is configured to receive the visible light, and obtain a visible light video image by performing video image capture according to a first shutter and a first gain. The second image sensor is configured to receive the infrared light, and obtain an infrared light video image by performing video image capture according to the first shutter and the first gain. The master processing chip is configured to output a fused video image by fusing the visible light video image and the infrared light video image.
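A minimal fusion sketch following the abstract: the visible and infrared frames are captured with the same shutter and gain, so they can be blended per pixel. The fixed 50/50 weighting and the list-of-lists frames are stand-ins for this illustration; a real camera would typically fuse luminance and colour channels separately.

```python
def fuse(visible, infrared, w_ir=0.5):
    """Per-pixel weighted average of the two synchronized frames."""
    return [[(1.0 - w_ir) * v + w_ir * i for v, i in zip(vrow, irow)]
            for vrow, irow in zip(visible, infrared)]
```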