Patent classifications
H04N23/23
EMOTION DETERMINATION DEVICE AND EMOTION DETERMINATION METHOD
An emotion determination device includes a first estimator that estimates an emotion of a user based on a change in a facial expression of the user detected from a face image of the user, a second estimator that estimates the emotion of the user based on a change in a temperature of the user detected contactlessly from the user, and a determiner that determines the emotion of the user based on an estimation result obtained by the first estimator and an estimation result obtained by the second estimator.
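The two-estimator fusion described above can be sketched as a weighted combination of per-emotion scores. This is a minimal illustration, assuming each estimator outputs a probability per emotion label; the weights, function names, and score format are hypothetical, not from the patent.

```python
def determine_emotion(expression_probs, thermal_probs, w_expression=0.6, w_thermal=0.4):
    """Fuse two per-emotion probability estimates by weighted average
    and return the most likely emotion label."""
    fused = {
        emotion: w_expression * expression_probs.get(emotion, 0.0)
               + w_thermal * thermal_probs.get(emotion, 0.0)
        for emotion in set(expression_probs) | set(thermal_probs)
    }
    return max(fused, key=fused.get)

# Example: the facial-expression estimator leans "joy"; the contactless
# temperature estimator is less certain but does not contradict it.
first = {"joy": 0.7, "anger": 0.2, "neutral": 0.1}   # from facial expression change
second = {"joy": 0.5, "anger": 0.1, "neutral": 0.4}  # from temperature change
print(determine_emotion(first, second))  # → joy
```

A real determiner might instead gate one estimator by the other's confidence; the weighted average is only the simplest fusion rule.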
System and Method for Locating an Object in a Specified Area
The invention relates to a system for locating an object and optionally determining the dimension thereof in a specified area/space, comprising a plurality of cameras which capture the area from different locations, wherein a common detection region is provided for each of at least two cameras which are arranged at different locations; means for marking an object detected simultaneously by at least two cameras in the images captured by the respective cameras; and means for determining the position of the object marked in the images of the cameras by calculating the respective line of position between each camera and the object detected by each camera and by calculating the coordinates of the object at the intersection of the lines of position in a coordinate system mapping the area.
SYSTEM AND METHOD FOR DETECTING RAINFALL FOR AN AUTONOMOUS VEHICLE
A system includes an autonomous vehicle and a control device associated with the autonomous vehicle. The control device obtains a plurality of sensor data captured by sensors of the autonomous vehicle. The control device determines a plurality of rainfall levels based on the sensor data. Each rainfall level is captured by a different sensor. The control device determines an aggregated rainfall level in a particular time period by combining the plurality of rainfall levels determined during the particular time period. The control device selects a particular object detection algorithm for detecting objects by at least one sensor. The particular object detection algorithm is configured to filter at least a portion of interference caused by the aggregated rainfall level in the sensor data. The control device causes the particular object detection algorithm to be implemented for the at least one sensor.
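The aggregation-then-selection flow can be sketched as combining per-sensor rainfall estimates for a time window and mapping the aggregate to a detector variant. The mean as the combining function, the level ranges, and the algorithm names are all assumptions for illustration; the patent does not specify them.

```python
def aggregate_rainfall(per_sensor_levels):
    """Combine per-sensor rainfall levels for one time period (simple mean here)."""
    return sum(per_sensor_levels) / len(per_sensor_levels)

# Hypothetical mapping from aggregated rainfall level (0..1) to a detector
# variant whose filtering matches that level of interference.
DETECTION_ALGORITHMS = {  # (lower_bound, upper_bound) -> algorithm name
    (0.0, 0.2): "clear_weather_detector",
    (0.2, 0.6): "light_rain_filtered_detector",
    (0.6, 1.01): "heavy_rain_filtered_detector",
}

def select_detector(aggregated_level):
    for (lo, hi), name in DETECTION_ALGORITHMS.items():
        if lo <= aggregated_level < hi:
            return name
    raise ValueError("rainfall level out of range")

levels = [0.3, 0.5, 0.4]  # e.g. camera, LiDAR, and radar estimates in one window
print(select_detector(aggregate_rainfall(levels)))  # → light_rain_filtered_detector
```

A weighted or confidence-aware combination could replace the mean when some sensors estimate rainfall more reliably than others.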
METHOD OF GENERATING A DE-INTERLACING FILTER AND IMAGE PROCESSING APPARATUS
A method of generating a de-interlacing filter comprises analysing a pixel array comprising an interlacing pattern of pixels. The interlacing pattern of pixels comprises first and second pluralities of pixels configured to be read during a first measurement subframe and a second measurement subframe, respectively. An n-state representation of the interlacing pattern of pixels is generated and distinguishes between the first plurality of pixels and the second plurality of pixels. The n-state representation of the interlacing pattern is translated to a spatial frequency domain, thereby generating a spatial frequency domain representation of the n-state representation of the interlacing pattern. A DC signal component is then removed from the spatial frequency domain representation of the n-state representation of the interlacing pattern, thereby generating a DC-less spatial frequency domain representation. A kernel filter configured to blur is then selected, and the DC-less spatial frequency representation is convolved with the selected kernel filter.
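A compact way to sketch this pipeline is to work in the spatial domain: removing the DC bin in the frequency domain is equivalent to subtracting the mean, and multiplying by a kernel's transform is equivalent to convolving with the kernel. The binary (n = 2) pattern values, the box-blur kernel, and the wraparound boundary are all assumptions; the patent operates in the frequency domain, and this equivalence is a simplification for illustration.

```python
def deinterlace_filter(pattern, kernel):
    """Sketch: n=2 representation of an interlacing pattern (+1 for
    subframe-1 pixels, -1 for subframe-2 pixels), DC removed by mean
    subtraction, then blurred by circular convolution with the kernel."""
    h, w = len(pattern), len(pattern[0])
    mean = sum(sum(row) for row in pattern) / (h * w)          # DC component
    dc_less = [[v - mean for v in row] for row in pattern]
    kh, kw = len(kernel), len(kernel[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    yy = (y + ky - kh // 2) % h  # wraparound boundary
                    xx = (x + kx - kw // 2) % w
                    acc += kernel[ky][kx] * dc_less[yy][xx]
            out[y][x] = acc
    return out

# Rows read in subframe 1 marked +1, rows read in subframe 2 marked -1.
pattern = [[1, 1, 1, 1], [-1, -1, -1, -1]] * 2
blur = [[0.25, 0.25], [0.25, 0.25]]  # small box-blur kernel
filt = deinterlace_filter(pattern, blur)
print(max(abs(v) for row in filt for v in row))  # → 0.0
```

For this row-alternating pattern the 2-tap vertical average in the box blur cancels the interlace structure exactly, which is the intended effect of a de-interlacing filter.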
FURNACE MONITORING DEVICE
A furnace monitoring device includes an imaging unit to capture an image of combustion ash adhering to a monitoring position in a furnace, an evaluation unit to evaluate a deposition state of combustion ash on the basis of a monitoring image which is output from the imaging unit, and an alert unit to output an alert for the combustion ash on the basis of a result of evaluation from the evaluation unit.
METHODS AND APPARATUS TO AUTONOMOUSLY DETECT THERMAL ANOMALIES
Methods, apparatus, systems, and articles of manufacture are disclosed to autonomously detect thermal anomalies. Disclosed examples include an example apparatus to detect engine anomalies comprising: at least one memory; instructions in the apparatus; and processor circuitry to execute the instructions to: control a plurality of infrared cameras to capture a baseline image set, the baseline image set including at least two thermal images; generate emissivity data based on the baseline image set; provide the baseline image set and the emissivity data to an artificial intelligence model, the artificial intelligence model to generate a reconstructed image set; determine a difference between the baseline image set and the reconstructed image set; and in response to the difference exceeding a threshold, generate an alert indicating detection of an engine anomaly.
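The detection step, comparing the baseline image set against the model's reconstruction and alerting when the difference exceeds a threshold, can be sketched as below. The AI reconstruction itself is assumed and not shown; the mean-absolute-difference metric, the image shapes, and the threshold value are illustrative choices, not the patent's.

```python
def detect_anomaly(baseline_set, reconstructed_set, threshold):
    """Mean absolute difference between baseline thermal images and their
    reconstructions; an alert fires when it exceeds the threshold."""
    total, count = 0.0, 0
    for base, recon in zip(baseline_set, reconstructed_set):
        for row_b, row_r in zip(base, recon):
            for b, r in zip(row_b, row_r):
                total += abs(b - r)
                count += 1
    diff = total / count
    return diff > threshold, diff

# One 2x2 "thermal image": a model trained on normal engines reconstructs
# normal regions well but cannot reproduce an unexpected hot spot.
baseline = [[[40.0, 41.0], [40.5, 95.0]]]        # 95 °C anomaly in one pixel
reconstructed = [[[40.2, 40.8], [40.6, 42.0]]]   # model expects ~42 °C there
alert, score = detect_anomaly(baseline, reconstructed, threshold=5.0)
print(alert)  # → True
```

This is the usual reconstruction-error pattern: a model that only learned normal thermal behaviour reconstructs anomalies poorly, so the residual itself is the detection signal.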
IMAGING APPARATUS AND METHOD
An imaging apparatus includes a 3D thermal imager, a signal processor, a laser, an optical scanner, a controller, and a bimodal array which operates a plurality of slow response thermal detectors for generating a 2D thermal image as input to the signal processor. The controller operates the bimodal array, the laser, the optical scanner, and the thermal imager in a predetermined sequence to derive range data for communication to the signal processor, which interlaces the 2D thermal image with the range data to generate 3D thermal imaging. A method uses a slow response thermal detector to derive a detected analog signal from an emitted laser pulse returned from a target, asynchronously sampling the detected signal to derive therefrom two series of time events corresponding to the ascending and decaying portions of the detected signal. The time of flight and range are calculated using the two series of time events.
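The final range calculation can be sketched by treating the mean of the ascending-edge events and the mean of the decaying-edge events as bracketing the pulse, taking their midpoint as the arrival time, and converting the round-trip time of flight to range. Averaging the two event series and pairing them this way is a simplifying assumption, not the patent's exact procedure.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_events(rising_events, falling_events, t_emit=0.0):
    """Estimate pulse arrival as the midpoint between the mean ascending-edge
    and mean decaying-edge sample times, then convert time of flight to range."""
    t_rise = sum(rising_events) / len(rising_events)
    t_fall = sum(falling_events) / len(falling_events)
    t_arrival = 0.5 * (t_rise + t_fall)  # centre of the returned pulse
    tof = t_arrival - t_emit
    return C * tof / 2.0                 # halved: the light travels out and back

# Asynchronous samples (seconds) on the ascending and decaying portions of a
# return whose centre lies near 667 ns, i.e. a target roughly 100 m away.
rising = [6.60e-7, 6.62e-7, 6.64e-7]
falling = [6.70e-7, 6.72e-7, 6.74e-7]
print(round(range_from_events(rising, falling)))  # → 100
```

Using the pulse centre rather than a single threshold crossing is what lets a slow-response detector recover timing: the slow edges are sampled as whole event series instead of one trigger instant.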