Patent classifications
G06V10/145
Image capturing device captures an object in an illumination environment appropriate for individual identification and object collation
An image capturing device, configured to illuminate an object by an illumination means and to capture the light reflected from the object as a reflection image by a capturing means, includes an irradiation angle range determination means. Treating the pieces of unevenness that exist at the same position on the surfaces of a plurality of individuals of an object as an unevenness group, the irradiation angle range determination means determines an irradiation angle range for irradiating the object by the illumination means, on the basis of a statistic of the inclination angles of the unevenness group, among the plurality of unevenness groups, whose inter-individual variation in inclination angle is larger than that of the other groups.
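The selection described in this abstract can be sketched as follows: pick the unevenness group whose inclination angles vary most between individuals, then derive an irradiation angle range from a statistic of that group. The function name, the use of standard deviation as the variation measure, and the fixed margin are all illustrative assumptions, not the patent's actual method.

```python
import statistics

def irradiation_angle_range(groups, margin_deg=10.0):
    """Pick the unevenness group whose inclination angles vary most
    across individuals, and build an angle range around its mean.
    `groups` maps a group id to a list of inclination angles (degrees),
    one per individual.  (Hypothetical sketch of the abstract.)"""
    spread = {gid: statistics.pstdev(angles) for gid, angles in groups.items()}
    best = max(spread, key=spread.get)       # largest inter-individual variation
    mean = statistics.mean(groups[best])     # statistic of the chosen group
    return best, (mean - margin_deg, mean + margin_deg)

# Example: group "B" varies most between individuals, so it drives the range.
gid, rng = irradiation_angle_range({"A": [30, 31, 30], "B": [20, 35, 50]})
```

Illumination from within this range emphasizes exactly the surface features that differ between individuals, which is what makes the captured image useful for individual identification.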
Three-dimensional measurement apparatus, three-dimensional measurement method and non-transitory computer readable medium
The three-dimensional measurement apparatus includes a light projecting unit that projects, onto a target, a pattern in which data is encoded; an image capturing unit that captures an image of the target onto which the pattern is projected; and a calculation unit that calculates positions of a three-dimensional point group based on positions of feature points and the decoded data. The pattern includes unit patterns that each express at least two bits and are used to calculate the positions of the three-dimensional point group. Each unit pattern includes a first region and a second region whose area is larger than that of the first region, and the area ratio between the first region and the second region is at least 0.3 and not more than 0.9.
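The geometric constraint on each unit pattern can be checked directly: the second region must be larger, and the first-to-second area ratio must lie in [0.3, 0.9]. This is a minimal sketch of the stated constraint only; the function name is hypothetical.

```python
def valid_unit_pattern(area_first, area_second):
    """Check the abstract's claimed constraint: the second region has a
    larger area than the first, and the area ratio first/second lies in
    the closed interval [0.3, 0.9].  (Sketch based on the abstract only.)"""
    if area_second <= area_first:
        return False                      # second region must be the larger one
    ratio = area_first / area_second
    return 0.3 <= ratio <= 0.9
```

Keeping the ratio well away from 0 and 1 plausibly keeps both regions resolvable in the captured image while leaving them visually distinguishable, which is what decoding the embedded bits requires.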
LIGHT EMITTER, LIGHT EMITTING DEVICE, OPTICAL DEVICE, AND INFORMATION PROCESSING APPARATUS
A light emitter includes: a substrate; a driving section provided on the substrate; a light source that is provided on the substrate and is driven by the driving section; a cover section through which light emitted from the light source is transmitted and that is disposed in an optical axial direction of the light source; and a support section that is provided on a part of the substrate excluding a part between the driving section and the light source and supports the cover section.
Timing mechanism to derive non-contaminated video stream using RGB-IR sensor with structured light
An apparatus includes an RGB-IR image sensor, a structured light projector, and a control circuit. The control circuit may be configured to control a shutter exposure time of the RGB-IR image sensor and a turn-on time of the structured light projector so as to obtain a sequence of images captured by the RGB-IR image sensor, wherein the sequence comprises at least one image that includes a structured light pattern and at least one image from which the structured light pattern is absent.
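The timing idea in this abstract can be sketched as a frame schedule that interleaves projector-on exposures (used for depth) with projector-off exposures (the non-contaminated RGB stream). The alternating duty cycle and the field names below are assumptions for illustration; the patent does not fix a specific period here.

```python
def schedule_frames(n_frames, pattern_period=2):
    """Alternate the structured-light projector so that pattern-free
    exposures yield a clean RGB stream while patterned exposures feed
    depth extraction.  (Illustrative timing sketch, not the patent's
    exact control scheme.)"""
    plan = []
    for i in range(n_frames):
        projector_on = (i % pattern_period == 0)  # projector fires on even frames
        plan.append({"frame": i,
                     "projector_on": projector_on,
                     "use_for": "depth" if projector_on else "rgb"})
    return plan
```

Because a single RGB-IR sensor sees both visible light and the IR pattern, time-slicing the projector like this is what lets one sensor serve both the video stream and the structured-light depth path.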
Dynamic adjustment of structured light for depth sensing systems based on contrast in a local area
A depth camera assembly (DCA) determines depth information. The DCA projects a dynamic structured light pattern into a local area and captures images including a portion of the dynamic structured light pattern. The DCA determines regions of interest in which it may be beneficial to increase or decrease the amount of texture added to the region of interest by the dynamic structured light pattern. For example, the DCA may identify the regions of interest based on contrast values calculated using a contrast algorithm, or based on parameters received from a mapping server that includes a virtual model of the local area. The DCA may selectively increase or decrease the amount of texture added by the dynamic structured light pattern in portions of the local area. By selectively controlling portions of the dynamic structured light pattern, the DCA may decrease power consumption and/or increase the accuracy of depth sensing measurements.
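The per-region decision can be sketched as: compute a contrast value for each image tile and compare it against thresholds. The Michelson-style contrast measure and the threshold values below are assumptions; the abstract does not specify the contrast algorithm.

```python
def texture_adjustment(tiles, low=0.05, high=0.6):
    """For each tile (a list of pixel intensities), compute a simple
    Michelson-style contrast and decide whether projected texture should
    be increased, decreased, or kept.  (Hypothetical sketch; the patent
    does not fix the contrast algorithm or thresholds.)"""
    decisions = {}
    for name, pixels in tiles.items():
        lo, hi = min(pixels), max(pixels)
        contrast = (hi - lo) / (hi + lo) if (hi + lo) else 0.0
        if contrast < low:
            decisions[name] = "increase"   # flat region: add texture for depth cues
        elif contrast > high:
            decisions[name] = "decrease"   # already textured: save projector power
        else:
            decisions[name] = "keep"
    return decisions
```

Low-contrast (featureless) regions give stereo/structured-light matching little to lock onto, so adding texture there improves depth accuracy, while dimming the pattern over already-textured regions saves power.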
Object detection apparatus and method for vehicle
An object detection apparatus includes a first camera unit, a second camera unit, and a control unit. The first camera unit includes one or more cameras, and is configured to capture an image around a vehicle. The second camera unit includes one or more cameras, and is configured to capture an image of an area ahead of the vehicle. The control unit is configured to: determine a displacement of a feature point positioned in an area common to both camera units from the image acquired via the first camera unit; determine a pixel displacement of the same feature point in the image acquired via the second camera unit; and determine distance information for an object recognized in the image captured via the second camera unit, based on the displacement of the feature point and the pixel displacement of the feature point.
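One way the two measurements can combine, under a pinhole-camera assumption, is by similar triangles: if the first camera unit supplies a metric displacement of the feature point and the second camera unit sees the same motion as a pixel displacement, depth follows as Z = f · ΔX / Δu. This mapping is an assumption made to illustrate the abstract, not the patent's exact formula.

```python
def object_distance(metric_disp_m, pixel_disp_px, focal_px):
    """Pinhole-model sketch: a lateral displacement of metric_disp_m
    metres observed as pixel_disp_px pixels under a focal length of
    focal_px pixels implies a depth of focal_px * metric_disp_m /
    pixel_disp_px metres.  (Illustrative assumption only.)"""
    if pixel_disp_px == 0:
        raise ValueError("no observed pixel displacement")
    return focal_px * metric_disp_m / pixel_disp_px

# A feature that moved 0.5 m and shifted 25 px under a 1000 px focal
# length would be about 20 m away.
```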
Surface texture recognition sensor, surface texture recognition device and surface texture recognition method thereof, display device
A surface texture recognition sensor, a surface texture recognition device and a surface texture recognition method thereof, and a display device are disclosed. The surface texture recognition sensor is configured to recognize a ridge and a valley of a surface, and includes: a first dielectric layer and a second dielectric layer which overlap with each other; a light source which is configured to emit light into the first dielectric layer; and a photosensitive detector which is at a side of the second dielectric layer away from the first dielectric layer. The light emitted from the light source is incident onto the interface between the first dielectric layer and the second dielectric layer at an incident angle; when the recognition unit is in contact with the surface, the refractive index of at least one of the first dielectric layer and the second dielectric layer is changed, so that the critical angle of total reflection is changed.
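The physics behind this abstract is Snell's law: total internal reflection occurs only above the critical angle θ_c = arcsin(n_rare / n_dense), so changing a layer's effective refractive index shifts θ_c and toggles whether the probing light reaches the detector. The example indices below (glass against air versus skin-like contact) are illustrative values, not taken from the patent.

```python
import math

def critical_angle_deg(n_dense, n_rare):
    """Critical angle of total internal reflection at an interface,
    theta_c = arcsin(n_rare / n_dense), valid only when n_dense > n_rare.
    A ridge touching the sensor effectively raises the rarer medium's
    index, increasing theta_c so that light which was totally reflected
    at the valleys escapes (or vice versa) at the ridges."""
    if n_dense <= n_rare:
        raise ValueError("total reflection requires n_dense > n_rare")
    return math.degrees(math.asin(n_rare / n_dense))

# Illustrative: glass (n ~= 1.5) against air (1.0) gives theta_c ~= 41.8 deg;
# against skin-like contact (~1.4) theta_c jumps to ~69 deg, so a beam at
# ~45 deg stops being totally reflected exactly where a ridge touches.
```

The photosensitive detector therefore sees a bright/dark map that traces the ridge-valley pattern of the touching surface.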