IMAGING DEVICE FOR ACQUIRING THREE-DIMENSIONAL INFORMATION OF WORKPIECE SURFACE AND TWO-DIMENSIONAL IMAGE OF WORKPIECE
20230224450 · 2023-07-13 ·

Provided is an imaging device for acquiring three-dimensional information of a workpiece surface and a two-dimensional image of the workpiece. The imaging device comprises a visual sensor that acquires the two-dimensional image after acquiring the three-dimensional information of the workpiece surface. A position detection device is attached to the drive motor of a conveyor that conveys the workpiece. On the basis of the output of the position detection device, an image processing part calculates the amount the workpiece moves between acquisition of the three-dimensional information and capture of the two-dimensional image. The image processing part then shifts the three-dimensional information relative to the two-dimensional image so as to correspond to that amount of movement in a predetermined coordinate system.
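
The compensation step described above can be sketched in a few lines. This is a hypothetical illustration, not the patent's implementation: the encoder resolution, belt pitch, and travel axis are assumed values, and `conveyor_travel_mm` / `shift_points` are invented names.

```python
# Hypothetical sketch: shift 3D points by the conveyor travel measured
# between the 3D scan and the 2D capture. Encoder counts-per-revolution
# and mm-per-revolution are illustrative assumptions.

def conveyor_travel_mm(counts_at_3d, counts_at_2d,
                       counts_per_rev=2048, mm_per_rev=50.0):
    """Workpiece travel along the conveyor axis, from encoder counts."""
    delta_counts = counts_at_2d - counts_at_3d
    return delta_counts / counts_per_rev * mm_per_rev

def shift_points(points, travel_mm, axis=0):
    """Translate each (x, y, z) point by the travel along the given axis."""
    shifted = []
    for p in points:
        q = list(p)
        q[axis] += travel_mm
        shifted.append(tuple(q))
    return shifted

travel = conveyor_travel_mm(1000, 3048)  # 2048 counts = one revolution
points = shift_points([(0.0, 1.0, 2.0)], travel)
```

After the shift, each 3D point lines up with the pixel that now images the same spot on the moved workpiece.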

Global-shutter image sensor with time-of-flight sensing capability
11558569 · 2023-01-17 ·

Apparatus for optical sensing includes a first matrix of optical sensing elements arranged on a semiconductor substrate in rows and columns. A second matrix of storage nodes is arranged on the substrate such that respective first and second storage nodes in the second matrix are disposed in proximity to each of the sensing elements in the first matrix. Switching circuitry couples each of the sensing elements to transfer photocharge to the respective first and second storage nodes. Control circuitry controls the switching circuitry in a depth sensing mode such that, over a series of detection cycles, each of the sensing elements and a first neighboring sensing element are connected together to the respective first storage node during a first detection interval, and each of the sensing elements and a second neighboring sensing element are connected together to the respective second storage node during a second detection interval.
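
To illustrate why two per-pixel storage nodes enable depth sensing, here is a minimal sketch of one common two-tap pulsed time-of-flight model: charge collected during the pulse (first interval) versus after it (second interval) encodes the echo delay. The ratio model and function names are assumptions for illustration, not the patent's circuit.

```python
# Illustrative two-tap pulsed time-of-flight estimate: the fraction of
# total charge landing in the second storage node gives the echo delay
# as a fraction of the pulse width; half the round-trip time gives range.

C = 299_792_458.0  # speed of light, m/s

def tof_depth(q1, q2, pulse_width_s):
    """Depth from the charge ratio of the two storage-node taps."""
    total = q1 + q2
    if total == 0:
        raise ValueError("no signal collected")
    delay = pulse_width_s * (q2 / total)   # echo delay within the pulse
    return C * delay / 2.0                 # round trip -> one-way distance

# Equal charge in both taps -> echo delayed by half the pulse width.
depth = tof_depth(100, 100, 20e-9)
```

With a 20 ns pulse and equal charge in the two taps, the estimated range is roughly 1.5 m.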

Detecting positional deviations in an optical module
11555982 · 2023-01-17 ·

Optoelectronic apparatus includes a projector, which includes an emitter array, including emitters configured to emit respective beams of optical radiation, and projection optics having an entrance face and an exit face and configured to receive the beams of the optical radiation through the entrance face and to project the beams through the exit face. An optical window is positioned adjacent to the exit face and is configured to transmit the optical radiation emitted by the emitter array toward a scene. A detector array includes multiple optical detector elements, which are configured to detect a part of the optical radiation that is reflected back into the apparatus by the optical window. A controller is coupled to monitor a spatial distribution of the optical radiation reflected by the optical window and sensed by the detector array, so as to detect and adjust for a positional deviation of the projector.
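One way to monitor the spatial distribution mentioned above is to track the centroid of the window-reflected spot on the detector array and flag any drift beyond a tolerance. This is a minimal sketch of that idea; the grid layout, tolerance, and function names are assumptions, not the patent's controller logic.

```python
# Hedged sketch: a centroid shift of the reflected spot on the detector
# array beyond a pixel tolerance flags a positional deviation.

def centroid(intensity):
    """Intensity-weighted centroid (row, col) of a 2D detector readout."""
    total = sum(v for row in intensity for v in row)
    r = sum(i * v for i, row in enumerate(intensity) for v in row) / total
    c = sum(j * v for row in intensity for j, v in enumerate(row)) / total
    return r, c

def deviated(baseline, current, tol=0.5):
    """True if the centroid moved more than tol pixels from baseline."""
    br, bc = centroid(baseline)
    cr, cc = centroid(current)
    return ((cr - br) ** 2 + (cc - bc) ** 2) ** 0.5 > tol

base = [[0, 1, 0], [1, 4, 1], [0, 1, 0]]   # spot centered at (1, 1)
moved = [[0, 1, 0], [0, 1, 4], [0, 0, 1]]  # spot shifted toward (1, 2)
```

`deviated(base, moved)` reports the shift, while an unchanged readout passes the check.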

Cluster resource management in distributed computing systems

Techniques are provided for managing resources among clusters of computing devices in a computing system. Resource reassignment messages are generated to indicate that servers are reassigned when compute loads exceed or fall below certain thresholds. The techniques also include establishing communications with the reassigned servers to assign compute loads without physically relocating the servers from one cluster to another.
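
A hedged sketch of that threshold logic follows. The high/low watermarks, the message fields, and the single-server handoff policy are all assumptions for illustration, not the patented scheme.

```python
# Illustrative threshold-driven reassignment: when a cluster's load
# exceeds a high watermark, hand one of its servers (logically, not
# physically) to the most lightly loaded cluster below the low watermark.

def reassignment_messages(clusters, high=0.8, low=0.2):
    """clusters: {name: {"load": float, "servers": [str, ...]}}"""
    messages = []
    overloaded = [n for n, c in clusters.items() if c["load"] > high]
    underloaded = sorted((n for n, c in clusters.items() if c["load"] < low),
                         key=lambda n: clusters[n]["load"])
    for src in overloaded:
        if not underloaded or not clusters[src]["servers"]:
            break
        dst = underloaded[0]
        server = clusters[src]["servers"][-1]
        messages.append({"server": server, "from": src, "to": dst})
    return messages

msgs = reassignment_messages({
    "east": {"load": 0.9, "servers": ["s1", "s2"]},
    "west": {"load": 0.1, "servers": ["s3"]},
})
```

Only the logical assignment changes: the message names a server, its old cluster, and its new one.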

Multi-camera image capture system
11700362 · 2023-07-11 ·

A dual-camera image capture system may include a first light source, disposed above a target area, a first mobile unit, configured to rotate around the target area, and a second mobile unit, operatively coupled to the first mobile unit, configured to move vertically along the first mobile unit. The dual-camera image capture system may further include a second light source, operatively coupled to the second mobile unit and a dual-camera unit, operatively coupled to the second mobile unit. The dual-camera image capture system may include a first camera configured to capture structural data and a second camera configured to capture color data. The first mobile unit and the second mobile unit may be configured to move the first camera and the second camera to face the target area in a variety of positions around the target area.

Automated feature analysis of a structure

An automated structural feature analysis system is disclosed. A 3D device emits a volume-scanning 3D beam that scans a structure to generate 3D data associated with the distance between the 3D device and each end point of the 3D beam positioned on the structure. An imaging device captures an image of the structure to generate image data depicting the structure. A controller fuses the 3D data generated by the 3D device with the image data generated by the imaging device to determine the distance between the 3D device and each end point of the 3D beam on the structure, and to determine distances between points on the image. The controller then generates a sketch image of the structure that is displayed to the user.
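
One simple way to realize such a fusion is to project each scanned 3D end point into the image plane with a pinhole model, so pixels in the image carry measured distances. This is a minimal sketch under assumed intrinsics (focal length, principal point, shared origin); the names are not from the patent.

```python
# Illustrative 3D/image fusion: project each 3D end point (metres) into
# pixel coordinates and record its measured distance at that pixel.

def project_point(x, y, z, focal_px=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point to (u, v) pixel coordinates."""
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return u, v

def fuse(points):
    """Map projected pixel -> distance from the device for each end point."""
    depth_at_pixel = {}
    for x, y, z in points:
        u, v = project_point(x, y, z)
        dist = (x * x + y * y + z * z) ** 0.5
        depth_at_pixel[(round(u), round(v))] = dist
    return depth_at_pixel

fused = fuse([(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)])
```

With distances attached to pixels, the pixel-space separation between two image points can be converted into a real-world measurement for the sketch.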