G06V20/00

Control method, terminal, and system using environmental feature data and biological feature data to display a current movement picture

A control method includes obtaining feature data acquired by the terminal using at least one sensor, generating an action instruction based on the feature data and a decision-making mechanism of the terminal, and executing the action instruction. In this application, multiple aspects of feature data are acquired using a plurality of sensors, data analysis is performed on the feature data, and a corresponding action instruction is then generated based on a corresponding decision-making mechanism to implement interactive control.
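The sense-decide-act loop described above can be sketched minimally in Python. All names here (the sensor callables, the rule table, the action strings) are hypothetical illustrations, not part of the disclosed method; the decision-making mechanism is stood in for by a simple ordered rule table.

```python
def read_sensors(sensors):
    """Collect feature data from each available sensor."""
    return {name: sensor() for name, sensor in sensors.items()}

def decide(features, rules):
    """Map feature data to an action instruction via the first matching rule."""
    for condition, action in rules:
        if condition(features):
            return action
    return "idle"

# Hypothetical example: a light sensor and a motion sensor driving a display.
sensors = {"light": lambda: 0.2, "motion": lambda: True}
rules = [
    (lambda f: f["motion"] and f["light"] < 0.5, "turn_on_display"),
    (lambda f: not f["motion"], "sleep"),
]

features = read_sensors(sensors)
instruction = decide(features, rules)  # the generated action instruction
```

A real decision-making mechanism could replace the rule table with a learned model; the sense/decide/execute separation stays the same.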

Method for commissioning a network of optical sensors across a floor space
11563901 · 2023-01-24

A method includes: accessing a floorplan representing the floorspace; and extracting from the floorplan a set of floorplan features representing areas of interest in the floorspace. The method also includes calculating a set of target locations relative to the floorplan that, when occupied by a set of sensor blocks: locate the areas of interest in the floorspace within fields of view of the set of sensor blocks; and yield a minimum overlap in fields of view of adjacent sensor blocks in the set. The method further includes, for each sensor block in the set installed over the floorspace: receiving, from the sensor block, an image of the floorspace; based on overlaps between the image and images from other sensor blocks in the set, estimating an installed location of the sensor block; and mapping the sensor block to a target location in the set of target locations.
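The final mapping step, pairing each estimated installed location with a target location, can be sketched as a nearest-neighbor assignment. This is a simplified stand-in under assumed 2-D coordinates; the block and target identifiers are hypothetical, and the patent's overlap-based location estimation is assumed to have already produced the installed coordinates.

```python
import math

def map_blocks_to_targets(installed, targets):
    """Greedily assign each estimated installed location to the nearest
    unclaimed target location (a simplified stand-in for the mapping step)."""
    remaining = dict(targets)  # target_id -> (x, y) still unassigned
    mapping = {}
    for block_id, loc in installed.items():
        nearest = min(remaining, key=lambda t: math.dist(loc, remaining[t]))
        mapping[block_id] = nearest
        del remaining[nearest]
    return mapping

# Hypothetical commissioning data: two sensor blocks, two planned locations.
installed = {"A": (0.1, 0.2), "B": (5.1, 4.8)}
targets = {"T1": (0.0, 0.0), "T2": (5.0, 5.0)}
mapping = map_blocks_to_targets(installed, targets)
```

A production system would likely solve this as a global assignment problem rather than greedily, but the interface (estimated locations in, block-to-target mapping out) is the same.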

Apparatus for providing laundry treating information based on artificial intelligence
11562558 · 2023-01-24

A laundry data analysis apparatus based on artificial intelligence according to an embodiment of the present invention includes: a communication unit configured to receive an image including laundry data related to characteristics of laundry from an image acquisition device corresponding to a group including at least one member; and a processor configured to recognize the laundry data from the received image, acquire additional data related to the characteristics of the laundry on the basis of the recognized laundry data, store laundry information including the laundry data and the additional data into a database, and acquire member characteristic information of each of the at least one member from a plurality of laundry information corresponding to the group stored in the database.
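The aggregation step, deriving member characteristic information from stored laundry information, can be sketched with an in-memory stand-in for the database. The record fields and member names below are hypothetical; the sketch assumes "characteristic information" means a member's most frequent fabric, which is only one plausible reading.

```python
from collections import Counter

# Hypothetical in-memory stand-in for the database of laundry information.
records = [
    {"member": "alice", "fabric": "cotton", "color": "white"},
    {"member": "alice", "fabric": "cotton", "color": "blue"},
    {"member": "bob", "fabric": "wool", "color": "grey"},
]

def member_characteristics(records):
    """Derive each member's most frequent fabric from stored laundry data."""
    by_member = {}
    for rec in records:
        by_member.setdefault(rec["member"], Counter())[rec["fabric"]] += 1
    return {m: counts.most_common(1)[0][0] for m, counts in by_member.items()}

profile = member_characteristics(records)
```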

Use of on-screen content identifiers in automated tool control systems

An inventory control system comprises an object storage device, a display device, and one or more processors. The object storage device includes a plurality of compartments, in which each compartment has a plurality of storage locations for storing objects. The display device is configured to display information about the object storage device. The one or more processors are configured to establish a description database of objects configured for storage in the inventory control system. The one or more processors retrieve object keywords corresponding to objects stored in the plurality of storage locations of one of the plurality of compartments. The one or more processors also generate a text block based on the retrieved object keywords. On the display device, the one or more processors display a representation of the plurality of compartments of the object storage device with the text block applied to the one of the plurality of compartments.
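The text-block generation step can be sketched as joining the retrieved object keywords into a short compartment label. The keywords and the truncation length below are hypothetical; the patent does not specify how the text block is formatted.

```python
def compartment_text_block(keywords, max_len=40):
    """Join object keywords into a short text block for a compartment label,
    truncating with an ellipsis if the label would exceed max_len characters."""
    text = ", ".join(keywords)
    if len(text) > max_len:
        text = text[: max_len - 1].rstrip(", ") + "…"
    return text

# Hypothetical keywords retrieved for one compartment's storage locations.
label = compartment_text_block(["wrench", "socket set", "torque driver"])
```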

Object verification/recognition with limited input

Systems and methods for object recognition with limited input are disclosed herein. An example method includes updating a neural network trained to perform object recognition on a first rendition of an object, so that the neural network performs object recognition on a second rendition of the object, using a limited set of input images. The method includes receiving a limited set of model images of the second rendition of the object, accessing a corresponding image mapping, and generating a large number of training images from the limited set, where image mappings include geometric, illumination, and/or obscuration transformations. The neural network is then trained, from this initial small set, to classify the second rendition of the object.
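The expansion of a limited model-image set via geometric and illumination transformations can be sketched with NumPy. The specific transformations (right-angle rotations, horizontal flips, brightness scaling) and parameters are assumptions for illustration; the patent's image mappings may differ.

```python
import numpy as np

def augment(images, n_variants=4, seed=0):
    """Expand a limited set of model images via simple geometric
    (rotation/flip) and illumination (brightness scale) transformations."""
    rng = np.random.default_rng(seed)
    out = []
    for img in images:
        for _ in range(n_variants):
            v = np.rot90(img, k=rng.integers(0, 4))   # geometric: rotation
            if rng.random() < 0.5:
                v = np.fliplr(v)                      # geometric: mirror
            v = np.clip(v * rng.uniform(0.7, 1.3), 0.0, 1.0)  # illumination
            out.append(v)
    return out

# One hypothetical 8x8 grayscale model image expanded into 16 training images.
limited_set = [np.full((8, 8), 0.5)]
training_set = augment(limited_set, n_variants=16)
```

The expanded set would then feed the network's fine-tuning on the second rendition of the object.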

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD
20230230124 · 2023-07-20

An information processing apparatus that analyzes video data obtained from an image capture apparatus to detect an article that has been pre-registered is disclosed. When the apparatus detects an article determined to be the property of a pre-registered article holder, it searches, from among advertisement viewers that have registered the article as an article of interest, for an advertisement viewer located within a predetermined distance of the article holder. The apparatus then informs an advertisement viewer found by the search of location information of the article.
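The search step, finding registered, interested viewers within a predetermined distance of the article holder, can be sketched with planar coordinates. Viewer records, locations, and the distance value are all hypothetical illustrations.

```python
import math

# Hypothetical registry of advertisement viewers and their interests.
viewers = [
    {"id": "v1", "location": (0.0, 0.0), "interests": {"camera"}},
    {"id": "v2", "location": (3.0, 4.0), "interests": {"camera"}},
    {"id": "v3", "location": (1.0, 1.0), "interests": {"watch"}},
]

def find_interested_nearby(article, holder_loc, viewers, max_dist):
    """Return viewers within max_dist of the article holder who have
    registered the detected article as an item of interest."""
    return [
        v["id"] for v in viewers
        if article in v["interests"]
        and math.dist(holder_loc, v["location"]) <= max_dist
    ]

nearby = find_interested_nearby("camera", (0.0, 0.0), viewers, max_dist=5.0)
```

Each returned viewer would then be informed of the article's location information.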

Pre-emptive generation of autonomous unmanned aerial vehicle inspections according to monitored sensor events

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generation of autonomous unmanned aerial vehicle flight plans based on triggered sensor information. One of the methods includes accessing information correlated from sensors monitoring features of weather events, and determining an upcoming weather event, the determination including identifying one or more areas expected to be affected by the weather event. A likelihood of damage associated with the weather event is determined to be greater than a threshold in the areas. The weather event is monitored while the areas in which the likelihood is greater than the threshold are updated accordingly. Subsequent to the weather event, properties to be inspected by unmanned aerial vehicles are determined based on severity information associated with the weather event. Job information is generated, the job information being associated with inspecting the determined properties and including jobs each assignable to operators for implementation.
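The selection and job-generation steps above can be sketched as a threshold filter over affected areas followed by one inspection job per selected property. Area names, property identifiers, likelihood values, and the job schema are hypothetical stand-ins for the correlated sensor information.

```python
def properties_to_inspect(areas, properties, threshold):
    """Select properties located in areas whose damage likelihood exceeds
    the threshold, then emit one inspection job per selected property."""
    hot = {a for a, likelihood in areas.items() if likelihood > threshold}
    return [{"job": f"inspect-{p}", "area": a}
            for p, a in properties.items() if a in hot]

# Hypothetical post-event severity data and property-to-area mapping.
areas = {"north": 0.8, "south": 0.3}
properties = {"p1": "north", "p2": "south", "p3": "north"}
jobs = properties_to_inspect(areas, properties, threshold=0.5)
```

Each emitted job would then be assignable to an operator for an autonomous UAV inspection flight.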