Patent classifications
G06V10/143
METHOD FOR RECOGNIZING IRIS AND ELECTRONIC DEVICE THEREFOR
Various embodiments of the present invention relate to a device and method for controlling iris recognition parameters in an electronic device. Here, the electronic device may comprise a processor for: identifying an iris recognition environment using an image sensor module for acquiring an image, a light emitting module for emitting an infrared ray, and input information of the electronic device; modifying at least one iris recognition parameter on the basis of the iris recognition environment; emitting an infrared ray through the light emitting module on the basis of the modified iris recognition parameter; and performing iris recognition using an image acquired by the image sensor module. Various other embodiments are also possible.
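The control loop this abstract describes (identify environment, modify a parameter, emit IR, recognize) could be sketched as follows. This is an illustrative sketch, not the patented implementation; the environment names and parameter values are assumptions.

```python
# Hedged sketch of the parameter-modification step: the identified iris
# recognition environment selects the infrared emission parameters used
# by the light emitting module. All keys and values below are assumed
# example values, not taken from the patent.

IR_PARAMS = {
    "indoor": {"pulse_ms": 5, "current_ma": 100},
    "outdoor_bright": {"pulse_ms": 2, "current_ma": 180},
}

def select_ir_params(environment):
    """Modify at least one iris recognition parameter on the basis of
    the identified environment, falling back to indoor defaults."""
    return IR_PARAMS.get(environment, IR_PARAMS["indoor"])
```

In a bright outdoor environment, a shorter pulse with higher drive current is one plausible trade-off against ambient IR; the recognition step would then run on frames captured under the selected parameters.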
PLANT IDENTIFICATION USING HETEROGENEOUS MULTI-SPECTRAL STEREO IMAGING
A farming machine identifies and treats a plant as the farming machine travels through a field. The farming machine includes a pair of image sensors for capturing images of a plant. The image sensors are different, and their output images are used to generate a depth map to improve the plant identification process. A control system identifies a plant using the depth map. The control system captures images, identifies a plant, and actuates a treatment mechanism in real time.
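The depth-map step can be sketched with the standard pinhole stereo relation. The focal length, baseline, and height threshold below are assumed example values, and the height gate is one illustrative way a depth map could "improve the plant identification process"; the patent does not specify this method.

```python
# Illustrative sketch: convert a stereo disparity into a depth estimate,
# then gate plant candidates by height above the ground plane.

FOCAL_PX = 800.0      # focal length in pixels (assumed)
BASELINE_M = 0.10     # distance between the two image sensors (assumed)

def disparity_to_depth(disparity_px):
    """Standard pinhole stereo relation: depth = f * B / disparity."""
    return FOCAL_PX * BASELINE_M / disparity_px

def is_plant_candidate(disparity_px, ground_depth_m, min_height_m=0.02):
    """Treat a pixel as a plant candidate if it sits measurably above
    (i.e., closer to the camera than) the ground plane."""
    depth = disparity_to_depth(disparity_px)
    return (ground_depth_m - depth) >= min_height_m
```

Because the two image sensors differ, a real system would rectify and calibrate both outputs before computing disparity; the relation above applies to the rectified pair.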
OBJECT-AWARE TEMPERATURE ANOMALIES MONITORING AND EARLY WARNING BY COMBINING VISUAL AND THERMAL SENSING
An apparatus including an interface and a processor. The interface may be configured to receive pixel data generated by a capture device and a temperature measurement generated by a thermal sensor. The processor may be configured to receive the pixel data and the temperature measurement from the interface, generate video frames in response to the pixel data, perform computer vision operations on the video frames to detect objects, perform a classification of the objects detected based on characteristics of the objects, detect a temperature anomaly in response to the temperature measurement and the classification, and generate a control signal in response to the temperature anomaly. The control signal may provide a warning based on the temperature anomaly. The classification may provide a normal temperature range for the objects detected.
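The anomaly check at the end of the abstract (classification provides a normal range, a measurement outside it triggers a warning) could look like the following. The class names and temperature ranges are illustrative assumptions, not from the patent.

```python
# Hedged sketch: each detected object class maps to an assumed normal
# temperature range; a measurement outside that range yields a warning.

NORMAL_RANGES_C = {
    "person": (35.0, 38.0),   # degrees Celsius, assumed values
    "motor": (20.0, 70.0),
}

def check_temperature(object_class, measurement_c):
    """Return a warning string if the measurement falls outside the
    normal range for the detected object class, else None."""
    low, high = NORMAL_RANGES_C[object_class]
    if measurement_c < low or measurement_c > high:
        return (f"anomaly: {object_class} at {measurement_c:.1f} C "
                f"(normal {low}-{high} C)")
    return None
```

The returned warning corresponds to the abstract's control signal; the classification step that produced `object_class` would come from the computer vision operations on the video frames.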
Triggering Actions Based on Shared Video Footage from Audio/Video Recording and Communication Devices
Systems and methods for communicating in a network using share signals in accordance with various embodiments of the present disclosure are provided. In one embodiment, a method for communicating in a network may include receiving, from a first client device, a share signal including first image data captured by a camera of a first audio/video (A/V) recording and communication device and a command to share the first image data with a network of users; processing the share signal by comparing the first image data to second image data captured by a camera of a second A/V recording and communication device; and generating and transmitting an alert to a second client device associated with the second A/V recording and communication device when comparison of the first image data with the second image data indicates a person of interest is depicted in both the first image data and the second image data.
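The comparison step (alert the second device only when a person of interest appears in both devices' image data) can be sketched with a descriptor-matching loop. The descriptor format and distance threshold are assumptions for illustration; the patent does not specify a matching method.

```python
# Hedged sketch of the share-signal processing: compare descriptors
# extracted from the first device's footage against descriptors from
# the second device's footage, and alert on any match.

def match(desc_a, desc_b, threshold=0.25):
    """Euclidean-distance match between two fixed-length descriptors
    (threshold is an assumed example value)."""
    dist = sum((a - b) ** 2 for a, b in zip(desc_a, desc_b)) ** 0.5
    return dist <= threshold

def process_share_signal(first_descs, second_descs):
    """Return True (generate and transmit an alert) if any descriptor
    from the first device matches one from the second device."""
    return any(match(a, b) for a in first_descs for b in second_descs)
```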
PHOTOSENSITIVE THIN FILM DEVICE AND BIOMETRIC INFORMATION SENSING APPARATUS INCLUDING THE PHOTOSENSITIVE THIN FILM DEVICE
A photosensitive thin film device includes a substrate that is transparent and insulative; a first electrode on the substrate; a circular semiconductor layer on the substrate and surrounding a perimeter of the first electrode; a circular second electrode on the substrate and surrounding a perimeter of the semiconductor layer; an interlayer insulating layer on the semiconductor layer and the first and second electrodes and having a first aperture exposing the first electrode; and a conductive layer including an upper surface light barrier arranged on the interlayer insulating layer and covering an upper surface of the semiconductor layer, and a contact plug extending from the upper surface light barrier and connected to the first electrode via the first aperture.
PORTABLE IMAGE DEVICE FOR SIMULATING INTERACTION WITH ELECTRONIC DEVICE
An apparatus and a method for displaying an image on a portable image device are provided. The method includes receiving a first input from an object detection device, the first input being an indication that a marker associated with an object is detected, determining a configuration of the object based on the first input, generating a first image corresponding to the object based on the configuration of the object, and displaying the first image on an image display device of the portable image device.
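The flow in this abstract (marker detected, object configuration determined, image generated from the configuration) could be sketched as a pair of lookups. The marker IDs, object names, and configuration fields are hypothetical.

```python
# Illustrative sketch: a detected marker identifies an object, and the
# object's configuration drives the image that is generated for display.

MARKER_TO_OBJECT = {"qr_17": "thermostat"}            # assumed mapping
CONFIGS = {"thermostat": {"view": "dial", "scale": 1.0}}  # assumed

def render_for_marker(marker_id):
    """Determine the object's configuration from the detected marker and
    return a description of the first image to display."""
    obj = MARKER_TO_OBJECT[marker_id]
    cfg = CONFIGS[obj]
    return f"{obj}:{cfg['view']}@{cfg['scale']}"
```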
Image processing device, imaging device, and image processing method
Visibility of a license plate and color reproducibility of a vehicle body are improved in a monitoring camera. A vehicle body area detection unit detects a vehicle body area of a vehicle from an image signal. A license plate area detection unit detects a license plate area of the vehicle from the image signal. A vehicle body area image processing unit performs processing of the image signal corresponding to the detected vehicle body area. A license plate area image processing unit performs processing different from the processing of the image signal corresponding to the vehicle body area on the image signal corresponding to the detected license plate area. A synthesis unit synthesizes the processed image signal corresponding to the vehicle body area and the processed image signal corresponding to the license plate area.
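The region-specific pipeline (one transform for the vehicle body, a different one for the license plate, then synthesis) can be sketched on a flat list of gray levels. The two transforms are illustrative stand-ins; the patent only requires that the plate-area processing differ from the body-area processing.

```python
# Hedged sketch: body pixels get a color-oriented transform, plate
# pixels get a contrast-oriented transform, and a synthesis step
# recombines the two processed signals.

def enhance_body(pixels):
    # Stand-in for color-reproducibility processing: mild gain.
    return [min(255, int(p * 1.1)) for p in pixels]

def enhance_plate(pixels):
    # Stand-in for visibility processing: full contrast stretch.
    lo, hi = min(pixels), max(pixels)
    span = max(1, hi - lo)
    return [int(255 * (p - lo) / span) for p in pixels]

def synthesize(image, plate_mask):
    """Recombine: plate-processed pixels where the mask is set,
    body-processed pixels elsewhere."""
    body = enhance_body(image)
    plate = enhance_plate(image)
    return [plate[i] if plate_mask[i] else body[i]
            for i in range(len(image))]
```

In the abstract's terms, `plate_mask` plays the role of the detected license plate area and `synthesize` is the synthesis unit.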
Coordinated illumination and image signal capture for enhanced signal detection
Signal detection and recognition employ coordinated illumination and image capture to facilitate extraction of a signal of interest. Pulsed illumination of different colors facilitates extraction of signals from color channels, as well as an improved signal-to-noise ratio obtained by combining signals of different color channels. The successive pulsing of different color illumination appears white to the user, yet facilitates signal detection, even for lower-cost monochrome sensors, as in barcode scanning and other automatic identification equipment.
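The signal-to-noise improvement from combining channels can be sketched simply: frames captured under different color pulses each carry the same embedded signal plus roughly independent noise, so averaging N channels grows the signal while shrinking the noise by about sqrt(N). The combiner below is a minimal illustration, not the patented detector.

```python
# Illustrative sketch: average per-pixel across the captures taken
# under successive color pulses to combine their signals.

def combine_channels(channels):
    """Per-pixel average of equally sized channel captures."""
    n = len(channels)
    return [sum(px) / n for px in zip(*channels)]
```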
Smart optical input/output (I/O) extension for context-dependent workflows
Systems, methods, and computer program products for smart, automated capture of textual information using optical sensors of a mobile device are disclosed. The capture and provision are context-aware: the techniques determine the context of the optical input and invoke a contextually appropriate workflow based thereon. The techniques also provide the capability to normalize, correct, and/or validate the captured optical input and provide the corrected, normalized, validated, etc. information to the contextually appropriate workflow. Other information required by the workflow and available to the mobile device's optical sensors may also be captured and provided, in a single automatic process. As a result, the overall process of capturing information from optical input using a mobile device, invoking an appropriate workflow, and providing captured information to the workflow is significantly simplified and improved in terms of accuracy of data transfer/entry, speed and efficiency of workflows, and user experience.
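The context-dependent dispatch could be sketched as classify, normalize, then route. The contexts, normalization rules, and workflow names below are assumptions for illustration; a real system would classify from the image as well as the recognized text.

```python
# Hedged sketch: determine the context of captured text, normalize it
# for that context, and route it to a contextually appropriate workflow.
import re

def detect_context(text):
    if re.fullmatch(r"\d{13,19}", text.replace(" ", "")):
        return "payment_card"
    if "@" in text:
        return "email"
    return "generic"

def normalize(text, context):
    if context == "payment_card":
        return text.replace(" ", "")   # strip grouping spaces
    return text.strip()

WORKFLOWS = {                          # assumed workflow names
    "payment_card": "card-entry",
    "email": "contact-capture",
    "generic": "note-capture",
}

def route(text):
    """Return the invoked workflow and the normalized input."""
    ctx = detect_context(text)
    return WORKFLOWS[ctx], normalize(text, ctx)
```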
Monitoring activity with depth and multi-spectral camera
A camera system is configured to automatically monitor an area. Depth image(s) of the area are acquired based on active IR light emitted by the camera system and reflected from the area to a sensor array of the camera system. The depth image(s) are computer analyzed to identify a human subject. For each spectral illuminator of the camera system, spectral light image(s) of the area are acquired based on active spectral light in the spectral light sub-band of the spectral illuminator reflected from the area to the sensor array. The spectral light image(s) for the spectral illuminators are computer analyzed to identify an interaction between the human subject and an object in the area. In response to identifying the interaction between the human subject and the object in the area, an action to be performed for the object in the area is computer issued.
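The final decision step (issue an action when the depth pass finds a human subject and the spectral pass identifies the object they interact with) can be sketched as a simple co-occurrence check. The detection labels and the action name are assumptions.

```python
# Hedged sketch: combine the depth-image result (human subject present)
# with the spectral-image result (identified object) and issue an
# action for the detected interaction.

def issue_action(depth_detections, spectral_object):
    """Issue an action only when a human subject (from depth images)
    and an identified object (from spectral images) co-occur."""
    if "human" in depth_detections and spectral_object is not None:
        return f"restock:{spectral_object}"  # assumed example action
    return None
```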