G06V10/759

Inspection system for wire electrical discharge machine
10475176 · 2019-11-12

Provided is an inspection system for a wire electrical discharge machine, capable of automatically performing inspection of a constituent element and the like. The inspection system for a wire electrical discharge machine is provided with the wire electrical discharge machine, a robot for inspecting the constituent element of the wire electrical discharge machine, an image pickup device provided on a movable part of the robot and configured to image the constituent element, an image processing unit configured to acquire an image of the constituent element by means of the image pickup device, and a maintenance necessity determination unit configured to determine the necessity of maintenance of the constituent element based on the image acquired by the image processing unit.

DETECTING FINE-GRAINED SIMILARITY IN IMAGES

Detecting fine-grained similarity in images includes determining a core area of a search image by generating an image salient map from a plurality of layers of the search image and determining a connected area based on the image salient map. Feature descriptors are generated from the core area of the search image. A plurality of capsule vectors are generated from different ones of a plurality of keypoints of the feature descriptors. Capsule vectors of the search image are compared with capsule vectors of each image of the dataset to generate a top-K matrix. Similarity scores for the top-K matrix are calculated. One or more images of the dataset having fine-grained similarity with the search image are selected based on a bundled similarity score for each image of the dataset. The bundled similarity score is a summation of the similarity scores of the image.
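
The top-K comparison and bundled-score selection described above can be sketched as follows. This is a minimal illustration only: cosine similarity as the scoring function, the value of K, and random capsule vectors are all assumptions, not details taken from the patent.

```python
import numpy as np

def bundled_similarity(search_caps, image_caps, k=3):
    """Sum the top-k similarity scores between the capsule vectors of a
    search image and those of one candidate image (cosine similarity is
    an assumed stand-in for the patent's scoring)."""
    # Normalize capsule vectors so dot products are cosine similarities.
    s = search_caps / np.linalg.norm(search_caps, axis=1, keepdims=True)
    c = image_caps / np.linalg.norm(image_caps, axis=1, keepdims=True)
    sims = s @ c.T                      # pairwise capsule similarities
    top_k = np.sort(sims.ravel())[-k:]  # entries of the "top-K matrix"
    return float(top_k.sum())           # bundled similarity score

def rank_dataset(search_caps, dataset, k=3, top_n=1):
    """Return the names of the top_n dataset images by bundled score."""
    scores = {name: bundled_similarity(search_caps, caps, k)
              for name, caps in dataset.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A dataset image identical to the search image maximizes every capsule similarity, so it receives the highest bundled score and is selected first.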

Washing machine and control method thereof

Disclosed herein is a washing machine capable of identifying whether laundry of an inner tub includes waterproof clothing. The washing machine includes a cabinet provided with an opening at an upper portion thereof, an outer tub provided in the cabinet, an inner tub provided in the outer tub, a motor configured to rotate the inner tub, a camera configured to capture an image of an inside of the inner tub, and a controller configured to control the motor to increase a rotational speed of the inner tub to a first rotational speed during spinning. The controller is configured to control the motor to set the rotational speed of the inner tub to a second rotational speed, which is less than the first rotational speed, based on the image of the inside of the inner tub captured by the camera during the spinning.
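
The speed-fallback logic of the controller can be sketched as below. The RPM values and the classifier interface are hypothetical; the patent only specifies that the second rotational speed is less than the first and that the decision is based on camera images captured during spinning.

```python
def control_spin(classify, frames, first_rpm=800, second_rpm=400):
    """Ramp the inner tub to first_rpm, then check each camera frame
    captured during spinning; if the (assumed) classifier flags
    waterproof clothing, fall back to the slower second_rpm."""
    rpm = first_rpm
    for frame in frames:
        if classify(frame):   # waterproof laundry detected in this frame
            rpm = second_rpm
            break
    return rpm
```

For example, a classifier that flags a frame labeled `"wp"` would cause the controller to return the reduced speed, while an all-clear sequence keeps the first speed.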

Method for indoor localization using deep learning
11961256 · 2024-04-16

The described technology is a technique related to an indoor localization method using deep learning. The indoor localization method comprises: opening a 3D tour comprising a plurality of panoramic images; receiving a first perspective image captured by a camera provided in the user device; calculating global features for the first perspective image and each of the plurality of panoramic images included in the 3D tour; selecting the panoramic image most similar to the first perspective image by using the calculated global features; computing an indoor location corresponding to a location of the camera on the 3D tour by using feature points included in the selected panoramic image and the first perspective image; and providing the computed indoor location to the user device.
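
The panorama-selection step can be sketched as a nearest-neighbor search over global features. Cosine similarity is an assumed metric; the feature extractor itself (e.g. a CNN embedding) is taken as given upstream and is not part of this sketch.

```python
import numpy as np

def most_similar_panorama(query_feat, pano_feats):
    """Return the index of the panoramic image whose global feature is
    closest (by cosine similarity, an assumption) to the global feature
    of the perspective image."""
    q = query_feat / np.linalg.norm(query_feat)
    best, best_sim = -1, -np.inf
    for i, f in enumerate(pano_feats):
        sim = float(q @ (f / np.linalg.norm(f)))
        if sim > best_sim:
            best, best_sim = i, sim
    return best
```

The selected panorama's index would then feed the feature-point matching step that computes the camera's indoor location.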

System and method for detecting and tracking an object

A method includes receiving a first image that is captured at a first time. The method also includes detecting a location of a first object in the first image. The method also includes determining a region of interest based at least partially upon the location of the first object in the first image. The method also includes receiving a second image that is captured at a second time. The method also includes identifying the region of interest in the second image. The method also includes detecting a location of a second object in a portion of the second image that is outside of the region of interest.
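
The region-of-interest logic can be sketched as follows; the square margin and point-based detections are simplifying assumptions for illustration.

```python
def region_of_interest(cx, cy, margin):
    """Axis-aligned ROI around the first object's detected location
    (the margin size is an assumption)."""
    return (cx - margin, cy - margin, cx + margin, cy + margin)

def inside(box, x, y):
    """Check whether point (x, y) lies within the box."""
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

def detect_outside_roi(detections, roi):
    """Keep only detections from the second image that fall outside the
    ROI, i.e. candidate new objects rather than the tracked one."""
    return [(x, y) for x, y in detections if not inside(roi, x, y)]
```

A detection near the first object's location is filtered out as belonging to the tracked region, while a detection elsewhere in the second image survives as a candidate second object.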

Object detection device and object detection computer program
11961255 · 2024-04-16

The object detection device extracts a plurality of predetermined features from an image in which a target object is represented, calculates an entire coincidence degree between the plurality of predetermined features set for an entire model pattern of the target object and the plurality of predetermined features extracted from a corresponding region on the image while changing a relative positional relationship between the image and the model pattern, and calculates, for each partial region including a part of the model pattern, a partial coincidence degree between the predetermined features included in the partial region and the predetermined features extracted from a region corresponding to the partial region on the image. Then, the object detection device determines whether or not the target object is represented in the region on the image corresponding to the model pattern based on the entire coincidence degree and the partial coincidence degree.
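
The two-level decision (entire coincidence degree plus per-partial-region coincidence degrees) can be sketched as below. Set overlap of feature labels is an assumed stand-in for the patent's feature matching, and the thresholds are illustrative.

```python
def coincidence(model_feats, image_feats):
    """Fraction of model-pattern features found in the image region
    (a simple set-overlap stand-in for the patent's feature matching)."""
    if not model_feats:
        return 0.0
    return len(set(model_feats) & set(image_feats)) / len(model_feats)

def object_present(model_feats, partial_regions, image_feats,
                   entire_thr=0.7, partial_thr=0.5):
    """Declare the target represented only when the entire coincidence
    degree AND every partial region's coincidence degree clear their
    (assumed) thresholds."""
    if coincidence(model_feats, image_feats) < entire_thr:
        return False
    return all(coincidence(p, image_feats) >= partial_thr
               for p in partial_regions)
```

Requiring both degrees guards against the case where an overall match is high but one part of the pattern is entirely missing, e.g. when the object is occluded.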

Apparatus, method and recording medium storing instructions for determining bone age of teeth
11961235 · 2024-04-16

The present disclosure proposes an apparatus for determining a bone age of teeth. The apparatus according to the present disclosure may acquire a plurality of first teeth images of a plurality of teeth corresponding to a first gender and having a first bone age, generate a plurality of pre-processed images by pre-processing the plurality of first teeth images, generate a determination filter for determining a teeth shape for the first bone age of a human body having the first gender by training a neural network model using the plurality of pre-processed images, acquire a second teeth image of teeth of a human body having a second gender and gender information indicating the second gender, and determine a second bone age of the teeth corresponding to the second teeth image based on the second teeth image and the gender information by using the determination filter.

Image-based object recognition method and system based on learning of environment variable data

Disclosed herein are an image-based object recognition method and system in which a learning server performs image-based object recognition based on the learning of environment variable data. The image-based object recognition method includes: receiving an image acquired through at least one camera, and segmenting the image on a per-frame basis; primarily learning labeling results for one or more objects in the image segmented on a per-frame basis; performing primary reasoning by performing object detection in the image through a model obtained as a result of the primary learning; performing data labeling based on the results of the primary reasoning, and performing secondary learning with weights allocated to respective boxes obtained by the primary reasoning and estimated as object regions; and finally detecting one or more objects in the image through a model generated as a result of the secondary learning.
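
The weight-allocation step feeding the secondary learning can be sketched as below. The confidence-based scheme and the specific weight values are assumptions; the patent only states that weights are allocated to boxes obtained by the primary reasoning.

```python
def box_weights(boxes, high=1.0, low=0.3, conf_thr=0.5):
    """Allocate secondary-training weights to boxes estimated as object
    regions by the primary model: boxes the primary model is confident
    about get full weight, uncertain ones a reduced weight (all values
    here are illustrative assumptions)."""
    return [high if conf >= conf_thr else low for _, conf in boxes]
```

Each `(box, confidence)` pair from the primary reasoning thus contributes to the secondary learning in proportion to how reliable its pseudo-label is judged to be.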

METHOD AND APPARATUS FOR TRAINING VISUAL LANGUAGE PRE-TRAINING MODEL, AND DEVICE AND MEDIUM
20240119725 · 2024-04-11

Provided in the present application are a method and apparatus for training a visual language pre-training model, and a device and a medium. The method includes: acquiring pairing groups respectively corresponding to N images, wherein the pairing group of a first image includes: a first pairing group which is composed of the first image and description text of the first image, and a second pairing group which is composed of a local image of the first image and description text of the local image, N is an integer greater than 1, and the first image is any one of the N images; and training a visual language pre-training model according to the pairing groups respectively corresponding to the N images.
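
The pairing-group structure described above can be sketched as a simple data-preparation step. The tuple layout and field names are hypothetical; the patent specifies only that each image contributes a first pairing group (full image plus its description text) and a second pairing group (a local image plus the local image's description text).

```python
def build_pairing_groups(samples):
    """Build the two pairing groups per image used to train the visual
    language pre-training model. Each sample is assumed to carry
    (image, caption, local_image, local_caption)."""
    groups = []
    for image, caption, local_image, local_caption in samples:
        groups.append({
            "first":  (image, caption),              # full image + its text
            "second": (local_image, local_caption),  # local crop + its text
        })
    return groups
```

Training then consumes the pairing groups of all N images together, so the model sees both global image-text alignment and local region-text alignment.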

Aircraft door camera system for landing gear monitoring

A camera with a field of view toward an external environment of an aircraft is disposed within an aircraft door such that a wheel of a main landing gear of the aircraft is within the field of view of the camera. A display device is disposed within an interior of the aircraft. A processor is operatively coupled to the camera and to the display device. The processor analyzes image data captured by the camera to monitor the landing gear.