Patent classifications
G06V10/147
Method for Detecting Lost Image Information, Control Apparatus for Carrying Out a Method of this Kind, Detection Device Having a Control Apparatus of this Kind and Motor Vehicle Having a Detection Device of this Kind
A method for detecting lost image information via a lighting device and an optical sensor. The lighting device and the optical sensor are controlled so as to be chronologically aligned with each other. A visible spacing region in an observation region of the optical sensor is determined from the chronological alignment of the control of the lighting device and the optical sensor. A recording of the observation region with the optical sensor is generated via the aligned control. Image information is identified in the recording in regions outside of the spacing region visible in the image, so as to make the identified image information accessible.
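The "visible spacing region" follows from pulse/exposure timing: a returning photon is recorded only if its round-trip delay places it inside the sensor's exposure window. A minimal sketch of that geometry, assuming a pulse emitted over [0, pulse_s] and an exposure window opening at t_open_s (all names here are illustrative, not from the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def visible_range(t_open_s, pulse_s, exposure_s):
    """Distance band visible to a gated sensor.

    Light emitted during [0, pulse_s] returns from distance d during
    [2d/C, pulse_s + 2d/C]; it is recorded only if that interval
    overlaps the exposure window [t_open_s, t_open_s + exposure_s].
    Solving the overlap condition for d gives the near/far bounds.
    """
    d_near = max(0.0, C * (t_open_s - pulse_s) / 2.0)
    d_far = C * (t_open_s + exposure_s) / 2.0
    return d_near, d_far
```

For example, a 0.2 µs pulse with the exposure opening 1 µs later and lasting 0.3 µs yields a visible band of roughly 120 m to 195 m; image content outside this band is the "lost" information the method looks for.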
INFRARED IMAGING DEVICE AND INFRARED IMAGING SYSTEM
A light emitting unit emits infrared rays. An imaging element converts incident infrared rays into an electric signal and outputs the electric signal. A control unit estimates a light emission timing at which infrared rays are emitted from another infrared imaging device, based on an infrared picture generated from the electric signal output from the imaging element, and performs control to cause the light emitting unit to emit infrared rays in a period in which infrared rays are not being emitted from the other infrared imaging device.
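One way to realize the described interference avoidance is to sample the foreign IR level over time and trigger emission inside the longest quiet stretch. A minimal sketch under that assumption (the function name and threshold are illustrative, not from the patent):

```python
def find_emission_window(intensity, threshold):
    """Return (start, length) of the longest run of samples whose
    foreign-IR intensity stays below threshold -- the quiet window
    in which this device's own emitter could be triggered."""
    best = (0, 0)
    start = None
    # Append an infinite sentinel so the final run is flushed too.
    for i, v in enumerate(list(intensity) + [float("inf")]):
        if v < threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start > best[1]:
                best = (start, i - start)
            start = None
    return best
```

Given per-frame mean intensities `[9, 1, 1, 1, 9, 1, 9]` and threshold 5, this returns `(1, 3)`: three consecutive quiet samples starting at index 1.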
DISPLAY WITH IMAGE LIGHT STEERING
A display device includes a directional illuminator providing a light beam, a display panel downstream of a directional illuminator, for receiving and spatially modulating the light beam, and a beam redirecting module downstream of the display panel, for variably redirecting the spatially modulated light beam. Steering the illuminating light by the beam redirecting module enables one to steer the exit pupil of the display device to match the user's eye location(s).
DUAL-PATTERN OPTICAL 3D DIMENSIONING
An optical dimensioning system includes one or more light emitting assemblies configured to project one or more predetermined patterns on an object; an imaging assembly configured to sense light scattered and/or reflected off the object, and to capture an image of the object while the patterns are projected; and a processing assembly configured to analyze the image of the object to determine one or more dimension parameters of the object. The light emitting assembly may include a single-piece optical component configured for producing a first pattern and a second pattern. The patterns may be distinguishable based on directional filtering, feature detection, feature shift detection, or the like. A method for optical dimensioning includes illuminating an object with at least two detectable patterns; and calculating dimensions of the object by analyzing the pattern separation of the elements comprising the projected patterns. One or more pattern generators may produce the patterns.
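The "feature shift detection" underlying such dimensioning is ordinarily a triangulation step: the pixel shift of a projected pattern element encodes depth, and depth in turn scales pixel extents to metric sizes. A minimal pinhole-model sketch, assuming a known projector-camera baseline and focal length in pixels (all names are illustrative, not from the patent):

```python
def depth_from_shift(focal_px, baseline_m, shift_px):
    """Triangulated distance from the observed pixel shift of a
    projected pattern element relative to its reference position:
    z = f * b / disparity (standard structured-light relation)."""
    if shift_px <= 0:
        raise ValueError("pattern shift must be positive")
    return focal_px * baseline_m / shift_px

def object_size(focal_px, depth_m, extent_px):
    """Metric extent of an object spanning extent_px pixels at depth_m,
    by inverting the pinhole projection: size = extent * z / f."""
    return extent_px * depth_m / focal_px
```

With an 800 px focal length and a 5 cm baseline, a 20 px shift places the surface at 2.0 m, and an object spanning 160 px at that depth measures 0.4 m.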
VISION-BASED NAVIGATION SYSTEM INCORPORATING HIGH-CONFIDENCE ERROR OVERBOUNDING OF MULTIPLE OPTICAL POSES
A system and method for high-confidence error overbounding of multiple optical pose solutions receives a set of candidate correspondences between 2D image features captured by an aircraft camera and 3D constellation features including at least one ambiguous correspondence. A candidate estimate of the optical pose of the camera is determined for each of a set of candidate correspondence maps (CMAP), each CMAP resolving the ambiguities differently. Each candidate pose estimate is evaluated for viability and any non-viable estimates eliminated. An individual error bound is determined for each viable candidate pose estimate and CMAP, and based on the set of individual error bounds a multiple-pose containment error bound is determined, bounding with high confidence the set of candidate CMAPs and multiple pose estimates where at least one is correct. The containment error bound may be evaluated for accuracy as required for flight operations performed by aircraft-based instruments and systems.
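The containment step can be illustrated very loosely: once non-viable candidates are eliminated, a bound that covers every remaining candidate pose, whichever is correct, is at least the worst individual bound. A toy sketch under that reading (function names and the max-based overbound are this sketch's assumptions, not the patent's method):

```python
def containment_bound(candidates, is_viable, pose_error_bound):
    """Multiple-pose containment error bound: keep only viable
    candidate pose estimates, then overbound the whole set with the
    worst-case individual bound, so the bound holds no matter which
    candidate correspondence map (CMAP) is the correct one."""
    bounds = [pose_error_bound(c) for c in candidates if is_viable(c)]
    if not bounds:
        raise ValueError("no viable candidate pose estimates")
    return max(bounds)
```

For candidates `[1, 2, 3]` with candidate 2 rejected as non-viable and a per-candidate bound of `0.5 * c`, the containment bound is 1.5.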
Event-assisted autofocus methods and apparatus implementing the same
A focus method and an image sensing apparatus are disclosed. The method includes capturing, by a plurality of event sensing pixels, event data of a targeted scene, wherein the event data indicates which pixels of the event sensing pixels have changes in light intensity, accumulating the event data for a predetermined time interval to obtain accumulated event data, determining whether a scene change occurs in the targeted scene according to the accumulated event data, obtaining one or more interest regions in the targeted scene according to the accumulated event data in response to the scene change, and providing at least one of the one or more interest regions for a focus operation. The image sensing apparatus comprises a plurality of image sensing pixels, a plurality of event sensing pixels, and a controller configured to perform said method.
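The accumulate-threshold-localize pipeline the abstract describes can be sketched as follows; the thresholds, the count-map representation, and the single bounding-box ROI are simplifying assumptions of this sketch, not claims of the patent:

```python
def interest_regions(events, shape, change_thresh, pixel_thresh):
    """Accumulate (x, y) event coordinates over one interval; if the
    total event count signals a scene change, return the bounding box
    (x0, y0, x1, y1) of pixels whose accumulated count exceeds
    pixel_thresh, as an interest region for the focus operation."""
    h, w = shape
    counts = [[0] * w for _ in range(h)]
    for x, y in events:
        counts[y][x] += 1
    if len(events) < change_thresh:
        return None  # too few events: no scene change detected
    active = [(x, y) for y in range(h) for x in range(w)
              if counts[y][x] > pixel_thresh]
    if not active:
        return None
    xs = [p[0] for p in active]
    ys = [p[1] for p in active]
    return (min(xs), min(ys), max(xs), max(ys))
```

Five events clustered on two pixels, for instance, yield a tight box spanning just those pixels, which a conventional contrast-based autofocus loop could then restrict itself to.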
Combined biometrics capture system with ambient free infrared
An electronic device is disclosed herein that includes an infrared light source to emit infrared light, a rolling shutter sensor, and at least one processor. The at least one processor is to: cause the rolling shutter sensor to output a first signal corresponding to a first frame of image data during exposure to the infrared light, reset the rows of the rolling shutter sensor at the same time, cause the rolling shutter sensor to output a second signal corresponding to a second frame of image data without exposure to the infrared light from the infrared light source, determine a difference between the first signal and the second signal to generate an ambient infrared free frame, and recognize a face based on the ambient infrared free frame.
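The core frame-differencing step reduces to a per-pixel subtraction of the unlit (ambient-only) frame from the lit frame, clamped at zero. A minimal sketch on plain nested lists (real implementations would use saturating sensor arithmetic; this is illustrative only):

```python
def ambient_free(lit, unlit):
    """Per-pixel difference of an IR-illuminated frame and an
    ambient-only frame, clamped at zero: what remains is (ideally)
    only the device's own infrared illumination, independent of
    ambient infrared such as sunlight."""
    return [[max(0, a - b) for a, b in zip(row_lit, row_unlit)]
            for row_lit, row_unlit in zip(lit, unlit)]
```

For a one-row frame, `ambient_free([[10, 5]], [[3, 7]])` gives `[[7, 0]]`: the second pixel, brighter in the ambient-only frame (e.g. sensor noise), clamps to zero rather than going negative.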