Patent classifications
H04N23/20
METHODS AND SYSTEMS FOR THE AUTOMATIC QUALITY INSPECTION OF MATERIALS USING INFRARED RADIATION
Methods and systems for automatic quality inspection of materials using radiation emitted in the infrared spectrum are described. According to the method of the invention, a radiation pattern falls on a material to be analyzed, the reflected image is captured by a capture device, and a defect in the material is detected by the distortion it causes in the pattern within the captured image. Finally, software locates, identifies, and classifies these distortions, and consequently the defects of the inspected material, using artificial intelligence techniques.
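The core idea, detecting a defect by the distortion it causes in a reflected pattern, can be sketched as a simple image comparison. This is a minimal hypothetical illustration, not the patented implementation: it assumes a known undistorted reference pattern and flags pixels where the captured reflection deviates beyond a threshold (all names and values are invented for the sketch).

```python
def detect_defects(reference, captured, threshold=30):
    """Flag pixels where the captured reflected pattern deviates from the
    undistorted reference pattern (hypothetical simplification of the
    distortion-based defect detection the abstract describes).

    reference, captured: 2D lists of grayscale intensities (0-255).
    Returns a list of (row, col) coordinates of candidate defect pixels.
    """
    defects = []
    for r, (ref_row, cap_row) in enumerate(zip(reference, captured)):
        for c, (ref_px, cap_px) in enumerate(zip(ref_row, cap_row)):
            if abs(ref_px - cap_px) > threshold:
                defects.append((r, c))
    return defects

# A stripe pattern reflected off the material; a defect bends one stripe.
reference = [[0, 255, 0, 255]] * 3
captured  = [[0, 255, 0, 255],
             [0, 255, 120, 255],   # distortion caused by a defect
             [0, 255, 0, 255]]
print(detect_defects(reference, captured))  # [(1, 2)]
```

In the patent, the localization and classification step is done with artificial intelligence techniques rather than a fixed threshold; the threshold here only stands in for that stage.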
DISPLAY ASSEMBLY
This application is directed to a rearview assembly. The rearview assembly may have a support member, a camera, a backlight, and/or a display. The support member may be substantially transparent to infra-red light. The camera may be disposed in a first direction relative to the support member and configured to capture one or more infra-red images. The backlight may comprise a light guide and an edge light configured to emit light into an edge of the light guide. The light guide may be configured to direct the light in a second direction. The second direction may be opposite the first direction. The display may be associated with and supported by the backlight and support member. Further, the display may be disposed in the second direction relative to the backlight, configured to receive light from the backlight, and configured to present one or more images for viewing by a user.
Window obscuration sensors for mobile gas and chemical imaging cameras
An infrared (IR) imaging system for determining a concentration of a target species in an object is disclosed. The imaging system can include an optical system including a focal plane array (FPA) unit behind an optical window. The optical system can have components defining at least two optical channels thereof, said at least two optical channels being spatially and spectrally different from one another. Each of the at least two optical channels can be positioned to transfer IR radiation incident on the optical system towards the optical FPA. The system can include a processing unit containing a processor that can be configured to acquire multispectral optical data representing said target species from the IR radiation received at the optical FPA. One or more of the optical channels may be used in detecting objects on or near the optical window, to avoid false detections of said target species.
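One way to read the obscuration check is that a target gas species produces a distinctive signature across the spectrally different channels, whereas dirt or droplets on the optical window attenuate all channels roughly alike. The sketch below is a hypothetical channel-consistency heuristic, not the patented method; the function name, threshold, and labels are all assumptions made for illustration.

```python
def classify_detection(channel_signals, flatness_tol=0.1):
    """Crude channel-consistency test (hypothetical, not the patented method).

    channel_signals: per-channel intensities for one candidate detection
    across the spectrally distinct optical channels.
    A spectrally flat response suggests a broadband obscurant sitting on
    the window; a peaked response is consistent with the target species.
    """
    mean = sum(channel_signals) / len(channel_signals)
    spread = max(channel_signals) - min(channel_signals)
    if mean > 0 and spread / mean < flatness_tol:
        return "window_obscuration"
    return "possible_target_species"

print(classify_detection([0.98, 1.00, 1.02]))  # spectrally flat response
print(classify_detection([0.20, 1.00, 0.25]))  # spectrally peaked response
```

Rejecting flat-response detections is one simple way to avoid the false detections of the target species that the abstract mentions.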
Object recognition by far infrared camera
Example implementations described herein are directed to integration of far infrared cameras in a vehicle system to detect objects based on their relative temperature. Such implementations can improve accuracy when paired, for example, with classification systems that classify objects based on the shape of the object, as both the shape and relative temperature can be used to ensure that the classification is accurate. Further, example implementations can synchronize far infrared cameras with other sensor systems to determine distance, energy, and absolute temperature of an object, which can also be used to enhance classification. Such classifications can then be provided to an advanced driver assistance system (ADAS), which can control the vehicle system in accordance with the object classification.
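The pairing of a shape-based classifier with far-infrared temperature can be illustrated as a simple consistency check: a class that should be warm-bodied (a pedestrian, an animal) ought to also read warmer than the background in the FIR image. This is a hedged sketch under assumed names and thresholds, not the example implementations themselves.

```python
def fuse_classification(shape_label, shape_confidence, relative_temp_c,
                        warm_classes=("pedestrian", "animal")):
    """Hypothetical fusion of a shape-based label with far-infrared
    relative temperature: if shape says 'warm-bodied' but the FIR camera
    reads no temperature contrast, the label is likely wrong.

    relative_temp_c: object temperature minus background, in deg C.
    Returns the (possibly down-weighted) label and confidence.
    """
    is_warm = relative_temp_c > 2.0   # assumed contrast threshold
    if shape_label in warm_classes and not is_warm:
        # Shape and temperature disagree: reduce confidence.
        return shape_label, shape_confidence * 0.5
    return shape_label, shape_confidence

print(fuse_classification("pedestrian", 0.9, 8.0))   # consistent, kept
print(fuse_classification("pedestrian", 0.9, 0.5))   # inconsistent, halved
```

A real system would feed the fused result to the ADAS rather than print it; the halving factor is purely illustrative.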
Camera focus adjustment
A method for camera focus adjustment includes, at a computing system, receiving a depth image of a surrounding environment captured by a depth camera of the computing system. The depth image includes a plurality of depth pixels each encoding depth values corresponding to distances between the depth camera and objects in the surrounding environment. One or more depth pixels of the plurality of depth pixels are identified as a region of interest (ROI). Based on the depth values of the one or more depth pixels, a focus of an environmental imaging camera of the computing system is adjusted.
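The described method reduces to a small pipeline: gather the depth values of the ROI pixels, pick a representative distance, and map it to a focus setting for the environmental imaging camera. The sketch below assumes a normalized focus position and a clamped focusable range; every name and constant is hypothetical.

```python
import statistics

def adjust_focus(depth_image, roi_pixels, min_mm=200, max_mm=4000):
    """Sketch of the described method under assumed parameters: take the
    depth values of the region-of-interest pixels and map their median
    distance to a normalized focus position in [0, 1].

    depth_image: 2D list of per-pixel distances in millimetres.
    roi_pixels: list of (row, col) pixels identified as the ROI.
    """
    depths = [depth_image[r][c] for r, c in roi_pixels]
    target_mm = statistics.median(depths)
    # Clamp to the lens's assumed focusable range, then normalize.
    target_mm = max(min_mm, min(max_mm, target_mm))
    return (target_mm - min_mm) / (max_mm - min_mm)

depth_image = [[1000, 1200],
               [2100, 2100]]          # distances in millimetres
roi = [(1, 0), (1, 1)]                # ROI: the bottom row of pixels
print(adjust_focus(depth_image, roi)) # 0.5
```

Using the median rather than the mean keeps a few stray background pixels inside the ROI from pulling the focus off the subject.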
Using images of a monitored scene to identify windows
A camera system includes memory, image sensors, illuminators, and a processor. The processor operates the illuminators and the image sensors in a first mode to capture a two-dimensional image of the scene using light transmitted by the illuminators and reflected from the scene. The processor operates in a second mode to capture a plurality of images of the scene, including capturing a first image of the scene while one or more of the illuminators are activated and capturing a second image of the scene while none of the illuminators are activated. The images are transmitted to a remote cloud computing system. The remote system constructs a light intensity map for the scene using the first and second images, and identifies a first region in the light intensity map as a glass surface when the light intensity values for the first region are below a threshold value corresponding to glass.
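The cloud-side step can be sketched directly from the abstract: subtract the illuminators-off image from the illuminators-on image to isolate reflected illuminator light, then flag regions whose reflected intensity falls below the glass threshold (glass passes most of the illuminators' light through instead of reflecting it back). All names and values below are assumptions for illustration.

```python
def find_glass_regions(active_image, ambient_image, regions, glass_threshold=10):
    """Hypothetical sketch of the described light-intensity-map step.

    active_image: 2D intensities captured with illuminators on.
    ambient_image: 2D intensities captured with illuminators off.
    regions: {name: [(row, col), ...]} candidate regions of the scene.
    Returns the names of regions whose reflected intensity is below the
    threshold associated with glass.
    """
    rows, cols = len(active_image), len(active_image[0])
    # Light intensity map: reflected illuminator light only.
    intensity = [[active_image[r][c] - ambient_image[r][c]
                  for c in range(cols)] for r in range(rows)]
    glass = []
    for name, pixels in regions.items():
        avg = sum(intensity[r][c] for r, c in pixels) / len(pixels)
        if avg < glass_threshold:
            glass.append(name)
    return glass

active  = [[200, 15], [210, 12]]   # illuminators on
ambient = [[40, 10], [35, 9]]      # illuminators off
regions = {"wall": [(0, 0), (1, 0)], "window": [(0, 1), (1, 1)]}
print(find_glass_regions(active, ambient, regions))  # ['window']
```

Subtracting the ambient frame matters: a dark wall under bright room light could otherwise be mistaken for glass.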
Trigger zones for objects in projected surface model
An integrated processing and projection device suitable for use on a supporting surface includes a processor and a projector designed to provide a display on the supporting surface proximate to the device. Various sensors enable object and gesture detection in a detection area within the display area. Trigger zones are defined in the detection area such that interaction of an object or human limb with a trigger zone prompts object- and zone-specific feedback from the integrated processing and projection device. The feedback can be provided in the projection area or may be provided as audible or active feedback to a device having active feedback capabilities.
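At its core, dispatching zone-specific feedback is a hit test: which trigger zones in the detection area contain the sensed object position? A minimal sketch with rectangular zones follows; the zone shapes, names, and coordinate system are assumptions, since the patent does not fix them.

```python
def zones_hit(zones, point):
    """Hypothetical trigger-zone hit test for the projected surface model.

    zones: {name: (x0, y0, x1, y1)} axis-aligned rectangles in projection
    coordinates on the supporting surface.
    point: (x, y) sensed position of the object or human limb.
    Returns the names of all zones containing the point, so the device
    can issue object- and zone-specific feedback for each.
    """
    x, y = point
    return [name for name, (x0, y0, x1, y1) in zones.items()
            if x0 <= x <= x1 and y0 <= y <= y1]

zones = {"play_button": (0, 0, 50, 30), "volume_strip": (60, 0, 200, 20)}
print(zones_hit(zones, (25, 10)))   # ['play_button']
print(zones_hit(zones, (300, 50)))  # []
```

The returned zone names would then be mapped to projected, audible, or active feedback as the abstract describes.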