H04N5/00

Performing 3D reconstruction via an unmanned aerial vehicle

In some examples, an unmanned aerial vehicle (UAV) employs one or more image sensors to capture images of a scan target and may use distance information from the images for determining respective locations in three-dimensional (3D) space of a plurality of points of a 3D model representative of a surface of the scan target. The UAV may compare a first image with a second image to determine a difference between a current frame of reference position for the UAV and an estimate of an actual frame of reference position for the UAV. Further, based at least on the difference, the UAV may determine, while the UAV is in flight, an update to the 3D model including at least one of an updated location of at least one point in the 3D model, or a location of a new point in the 3D model.
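The in-flight model update described above can be sketched minimally in Python. This is an illustrative reading, not the patent's method: `correct_point` applies the estimated frame-of-reference difference as a simple translation (the abstract does not fix a representation), and `update_model` either refines an existing model point or appends a new one. The function names and the `merge_radius` parameter are assumptions.

```python
import math

def correct_point(point, pose_delta):
    """Shift one model point by the estimated difference between the
    UAV's current frame of reference and the actual one (translation
    only in this sketch)."""
    return tuple(p + d for p, d in zip(point, pose_delta))

def update_model(model, new_points, merge_radius=0.05):
    """In-flight model update: a new point close to an existing point
    refines it (midpoint here); otherwise it is added as a new point."""
    model = list(model)
    for p in new_points:
        for i, q in enumerate(model):
            if math.dist(p, q) < merge_radius:
                # Update the location of an existing point in the model.
                model[i] = tuple((a + b) / 2 for a, b in zip(p, q))
                break
        else:
            # No nearby point: record a new surface point.
            model.append(p)
    return model
```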

VHF-UHF antenna system with feedback

An antenna system for use with one or more media devices is provided. The antenna system includes a plurality of antenna elements. Each antenna element is associated with an independent feed element. In addition, each antenna element is configured to receive a plurality of radio frequency (RF) signals. Each RF signal can be associated with a UHF band or a VHF band. The antenna system can include at least one tuner and at least one switching device. The at least one switching device can be configured to selectively couple the at least one tuner to one antenna element of the plurality of antenna elements based, at least in part, on channel selection data associated with the at least one tuner.
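The switching behavior can be sketched as a small selection routine. The channel-to-band mapping below (channels 2-13 VHF, 14 and up UHF) is the conventional US broadcast assignment and is an assumption, as are the function names; the patent only states that the switch uses channel selection data from the tuner.

```python
def band_for_channel(channel):
    """Map channel selection data to a band. Illustrative US broadcast
    mapping (assumed): channels 2-13 are VHF, 14 and above are UHF."""
    return "VHF" if 2 <= channel <= 13 else "UHF"

def select_element(channel, elements):
    """Couple the tuner to the first antenna element whose supported
    bands include the band implied by the selected channel.

    elements: list of sets of supported bands, one set per element.
    Returns the index of the element to switch to."""
    band = band_for_channel(channel)
    for idx, supported in enumerate(elements):
        if band in supported:
            return idx
    raise ValueError(f"no antenna element supports {band}")
```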

Image capturing device with display control, image communication system, and method for display control, and recording medium

An image capturing device includes an imaging device to capture a first image, and circuitry to receive a second image from another image capturing device, the second image having an angle of view wider than that of the first image, and control a display to sequentially display the first image and the second image.

Camera head
11528395 · 2022-12-13

A camera head includes: a mount detachably connected to an eyepiece of an endoscope; an imager configured to capture a subject image emitted from the eyepiece; and a housing configured to house the imager, the housing including an interior portion including the imager thereinside, and an exterior portion arranged on an outer side of the interior portion, wherein the interior portion and the exterior portion are separated from each other in at least a part of an outer surface.

Light-transmitting assembly of display device, signal indicator and display device

A light-transmitting assembly of a display device, a signal indicator, and the display device are provided in the embodiments of the present disclosure. The light-transmitting assembly includes a light-transmitting adjustment member with a plurality of first patterns. The light transmittance of each first pattern is less than that of the regions of the light-transmitting adjustment member outside the first patterns, and the light-shielding area ratio of the first patterns in any one region of the light-transmitting adjustment member is the ratio of the sum of the areas of all first patterns in that region to the area of that region.
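The light-shielding area ratio defined in the abstract reduces to a simple quotient, which can be stated directly. The function name and the validity check are illustrative additions.

```python
def shielding_area_ratio(pattern_areas, region_area):
    """Light-shielding area ratio per the abstract: the sum of the areas
    of all first patterns within a region, divided by the area of that
    region."""
    total = sum(pattern_areas)
    if not 0 <= total <= region_area:
        raise ValueError("total pattern area must lie within the region area")
    return total / region_area
```

For example, two patterns of areas 1.0 and 2.0 inside a region of area 10.0 give a ratio of 0.3.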

Tracking objects using sensor rotation
11616914 · 2023-03-28

An example apparatus for tracking objects includes a controller to receive a depth map, a focus distance, and an image frame of an object to be tracked. The controller is to detect the object to be tracked in the image frame and generate an object position for the object in the image frame. The controller is to calculate a deflection angle for the object based on the depth map, the focus distance, and the object position. The controller is to further rotate an imaging sensor based on the deflection angle.
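One plausible reading of the deflection-angle step is a pinhole-model calculation: the angle needed to rotate the sensor so the detected object position moves to the image center. The sketch below uses only the object position and a focal length in pixels; the patent also folds in the depth map and focus distance (e.g., to account for parallax between the sensor's rotation axis and the optical center), which this simplified version omits. All names are assumptions.

```python
import math

def deflection_angle(object_px, center_px, focal_px):
    """Horizontal angle (radians) to rotate the imaging sensor so the
    object at pixel column object_px lands at the image center.
    Simplified pinhole model; the patent additionally uses the depth
    map and focus distance."""
    dx = object_px - center_px
    return math.atan2(dx, focal_px)
```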

In-field monitoring of autofocus performance and instability mitigation in camera systems

A camera system may include one or more controllers to perform in-field monitoring of autofocus performance and instability mitigation. The controllers may monitor one or more parameters associated with adjustment of a relative position between a lens group and an image sensor and/or one or more images. The controllers may analyze the parameters and/or images to calculate various metrics. The controllers may evaluate the metrics with respect to corresponding thresholds. The controllers may detect, based at least in part on the evaluation of the metrics, one or more instability events associated with controller performance degradation. In response to detecting the instability events, the controllers may perform one or more remedial actions to mitigate the controller performance degradation.
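The metric-versus-threshold evaluation can be sketched as follows. The metric names and the fallback policy are purely illustrative assumptions; the patent does not name specific metrics or remedial actions.

```python
def detect_instability(metrics, thresholds):
    """Evaluate each monitored metric against its threshold; return the
    names of metrics that indicate an instability event. Unlisted
    metrics are treated as unconstrained."""
    return [name for name, value in metrics.items()
            if value > thresholds.get(name, float("inf"))]

def remedial_action(events):
    """Example remedial policy (assumed): fall back to a conservative
    autofocus mode whenever any instability event is detected."""
    return "conservative_af" if events else "normal_af"
```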

Adaptive glare removal and/or color correction

Some implementations relate to determining whether glare is present in captured image(s) of an object (e.g., a photo) and/or to determining one or more attributes of any present glare. Some of those implementations further relate to adapting one or more parameters for a glare removal process based on whether the glare is determined to be present and/or based on one or more of the determined attributes of any glare determined to be present. Some additional and/or alternative implementations disclosed herein relate to correcting color of a flash image of an object (e.g., a photo). The flash image is based on one or more images captured by a camera of a client device with a flash component of the client device activated. In various implementations, correcting the color of the flash image is based on a determined color space of an ambient image of the object.
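The color-correction step can be illustrated with a gray-world-style sketch: scale each channel of the flash image so its mean matches the corresponding channel mean of the ambient image. This is one simple way to "correct based on a determined color space of an ambient image" and is not necessarily the patented method; pixels are assumed to be RGB tuples in 0-255.

```python
def color_correct_flash(flash_px, ambient_px):
    """Per-channel gain correction: match the flash image's mean channel
    values to those of the ambient image (illustrative sketch)."""
    gains = []
    for c in range(3):
        flash_mean = sum(p[c] for p in flash_px) / len(flash_px)
        amb_mean = sum(p[c] for p in ambient_px) / len(ambient_px)
        gains.append(amb_mean / flash_mean if flash_mean else 1.0)
    # Apply the gains, clamping to the valid channel range.
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3))
            for p in flash_px]
```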
