G06V10/143

METHOD FOR DETERMINING ABNORMAL ACOUSTIC SOURCE AND AI ACOUSTIC IMAGE CAMERA
20220381606 · 2022-12-01

Disclosed is an AI acoustic camera including an acoustic source localization unit that generates position-specific acoustic level data by determining the position of an acoustic source; an AI acoustic analysis unit that recognizes the type of a source estimated to be an abnormal acoustic source by extracting a regenerated time-domain acoustic signal for the source at the determined position and applying AI learning and recognition to an acoustic feature image of the extracted signal; an object recognition unit that recognizes the type of object located at the acoustic source through image analysis of the area in which the source is recognized to be positioned; and a determination unit that determines the acoustic source to be a true acoustic source when the type of acoustic source and the type of object have commonality.
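The final determination step can be sketched as a simple commonality check between the AI-recognized sound type and the visually recognized object type. The mapping and class names below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical mapping from abnormal sound types to object classes
# they can plausibly originate from (illustrative only).
SOUND_TO_OBJECTS = {
    "arc_discharge": {"transformer", "switchgear"},
    "gas_leak": {"pipe", "valve"},
    "bearing_noise": {"motor", "pump"},
}

def is_true_acoustic_source(sound_type: str, object_type: str) -> bool:
    """Return True when the recognized sound type and the object type
    found at the source position have commonality."""
    return object_type in SOUND_TO_OBJECTS.get(sound_type, set())

print(is_true_acoustic_source("gas_leak", "valve"))        # True
print(is_true_acoustic_source("gas_leak", "transformer"))  # False
```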

COUNTERFEIT IMAGE DETECTION

A computer includes a processor and a memory, the memory storing instructions executable by the processor to acquire a first image from a first camera by illuminating a first object with a first light, and to determine an object status as one of a real object or a counterfeit object by comparing a first measure of pixel values corresponding to the first object to a threshold.
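A minimal sketch of the thresholding step described above. The abstract does not specify which "measure of pixel values" is used; mean intensity and the threshold value here are assumptions for illustration:

```python
def classify_object(pixel_values, threshold):
    """Compare a measure of the object's pixel values (here: mean
    intensity, an assumption) to a threshold to decide real vs. counterfeit."""
    measure = sum(pixel_values) / len(pixel_values)
    return "real" if measure > threshold else "counterfeit"

print(classify_object([200, 210, 190, 205], threshold=128))  # real
print(classify_object([30, 25, 40, 35], threshold=128))      # counterfeit
```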

Hyperspectral optical patterns on retroreflective articles

In some examples, a retroreflective article may include a retroreflective substrate, and an optical pattern embodied on the retroreflective substrate. The optical pattern may include a first optical sub-pattern and a second optical sub-pattern, wherein the optical pattern represents a set of information that is interpretable based on a combination of the first optical sub-pattern that is visible in a first light spectrum and the second optical sub-pattern that is visible in a second light spectrum. The first and second light spectra may be different.
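The key idea, that the information is interpretable only from the *combination* of the two sub-patterns, can be modeled abstractly. The bitwise-XOR encoding below is a hypothetical stand-in for whatever combination rule an actual article would use:

```python
# Sketch: the message is recoverable only by combining the sub-pattern
# visible in the first spectrum with the one visible in the second,
# here modeled as a bitwise XOR of two bit strings (an assumption).

def decode(first_spectrum_bits: str, second_spectrum_bits: str) -> str:
    """Combine the two sub-patterns; neither one alone reveals the message."""
    return "".join(
        str(int(a) ^ int(b))
        for a, b in zip(first_spectrum_bits, second_spectrum_bits)
    )

print(decode("1100", "1010"))  # 0110
```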

Method and device for situation awareness
11514668 · 2022-11-29

A method for situation awareness is provided. The method comprises: preparing a neural network trained on a learning set, wherein the learning set includes a plurality of maritime images and maritime information including object type information, which includes a first type index for a vessel, a second type index for a water surface, and a third type index for a ground surface, and distance level information, which includes a first level index indicating that a distance is undefined, a second level index indicating a first distance range, and a third level index indicating a second distance range greater than the first distance range; obtaining a target maritime image generated by a camera; and determining a distance of a target vessel based on the distance level index of the maritime information that is output from the neural network receiving the target maritime image and that has the first type index.
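The post-processing implied by the last step can be sketched as filtering the network's outputs for the vessel type index and mapping the distance level index to its range. The output representation (pairs of indices) is an assumption; the index values follow the abstract:

```python
# Type indices and distance level indices as described in the abstract.
TYPE_VESSEL, TYPE_WATER, TYPE_GROUND = 1, 2, 3
LEVELS = {1: "undefined", 2: "first range", 3: "second range (greater)"}

def vessel_distances(detections):
    """detections: assumed list of (type_index, distance_level_index)
    pairs output by the network for one target maritime image."""
    return [LEVELS[level] for t, level in detections if t == TYPE_VESSEL]

outputs = [(TYPE_WATER, 1), (TYPE_VESSEL, 2), (TYPE_VESSEL, 3)]
print(vessel_distances(outputs))  # ['first range', 'second range (greater)']
```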

Temporal thermal sensing and related methods

Embodiments described herein generally relate to: sensing and/or authentication using luminescence imaging; diagnostic assays, systems, and related methods; temporal thermal sensing and related methods; and/or to emissive species, such as those excitable by white light, and related systems and methods.

OBJECT DETECTION APPARATUS AND METHOD
20220377281 · 2022-11-24 ·

The present disclosure provides an object detection method used in an object detection apparatus, including the steps outlined below. An image signal received from an image sensor is analyzed to generate an image detection signal when an image variation is detected. An infrared signal received from an infrared sensor is analyzed to generate an infrared detection signal when an infrared energy variation is detected. A time counting process is initiated when the image detection signal is generated. An object detection signal is generated when the infrared detection signal is generated within a predetermined time period after the time counting process is initiated. A detection distance of the image sensor is larger than a detection distance of the infrared sensor.
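The timing logic above can be sketched with timestamps: the image variation starts the countdown, and an object is reported only if the infrared variation follows within the window. The 2.0-second window is an illustrative assumption (the abstract only says "a predetermined time period"):

```python
WINDOW = 2.0  # assumed predetermined time period, in seconds

def object_detected(image_event_time, infrared_event_time, window=WINDOW):
    """True when the infrared detection arrives within `window` seconds
    after the image detection that started the time counting process."""
    delta = infrared_event_time - image_event_time
    return 0.0 <= delta <= window

print(object_detected(10.0, 11.5))  # True
print(object_detected(10.0, 13.0))  # False (outside the window)
print(object_detected(10.0, 9.0))   # False (infrared before image event)
```

Because the image sensor's detection distance is larger, the image event is expected first, which is why the check rejects infrared events that precede it.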

METHODS, SYSTEMS, APPARATUSES, AND DEVICES FOR FACILITATING MANAGING CULTIVATION OF CROPS BASED ON MONITORING THE CROPS

Disclosed herein is an apparatus for facilitating the management of crop cultivation based on monitoring the crops. The apparatus comprises an apparatus body, cameras, light sensors, a processing unit, and a communication interface. The cameras generate a measurement of a crop and a field portion. The light sensors generate an environment measurement of the environment of the apparatus. The processing unit analyzes the environment measurement, determines a factor affecting the measurement, and generates a calibrating factor for the cameras; the calibrating factor facilitates compensating for the effect of that factor on the measurement. The cameras calibrate a camera parameter based on the calibrating factor to generate the measurement. The processing unit then analyzes the measurement and generates a status of the crop. The communication interface transmits the status to a device.
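The calibration loop can be sketched for one concrete case: an illuminance reading from the light sensors is turned into a gain that compensates the cameras' exposure parameter. The reference illuminance and the gain formula are hypothetical; the abstract does not specify how the calibrating factor is computed:

```python
REFERENCE_LUX = 10000.0  # assumed reference illuminance

def calibrating_factor(measured_lux: float) -> float:
    """Darker conditions yield a gain > 1 to compensate (assumed model)."""
    return REFERENCE_LUX / measured_lux

def calibrate_exposure(base_exposure: float, measured_lux: float) -> float:
    """Adjust the camera's exposure parameter by the calibrating factor."""
    return base_exposure * calibrating_factor(measured_lux)

print(calibrate_exposure(1.0, 5000.0))   # 2.0 (half the light -> double exposure)
print(calibrate_exposure(1.0, 20000.0))  # 0.5
```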

EXTENDED FIELD-OF-VIEW CAPTURE OF AUGMENTED REALITY EXPERIENCES

Augmented reality experiences of a user wearing an electronic eyewear device are captured by at least one camera on a frame of the device, the camera having a field of view larger than that of the device's display. An augmented reality feature or object is applied to the captured scene. A photo or video of the augmented reality scene is captured, and a first portion of it is displayed on the display. The display is adjusted to show a second portion of the captured photo or video, with the augmented reality features, as the user moves the user's head to view that portion. The captured photo or video may be transferred to another device for viewing the larger field-of-view augmented reality image.
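The display-adjustment step can be sketched as a viewport sliding over the wider capture as the head rotates. The pixel widths and the degrees-to-pixels mapping below are illustrative assumptions:

```python
CAPTURE_W, DISPLAY_W = 1600, 800  # assumed capture and display widths (px)
PIXELS_PER_DEGREE = 20            # assumed head-yaw-to-pixels mapping

def viewport_offset(head_yaw_deg: float) -> int:
    """Left edge (in capture pixels) of the displayed portion, centered
    at yaw 0 and clamped to the edges of the captured frame."""
    center = CAPTURE_W // 2 + int(head_yaw_deg * PIXELS_PER_DEGREE)
    left = center - DISPLAY_W // 2
    return max(0, min(left, CAPTURE_W - DISPLAY_W))

print(viewport_offset(0))    # 400 (centered)
print(viewport_offset(-30))  # 0   (clamped at the left edge)
print(viewport_offset(30))   # 800 (clamped at the right edge)
```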