B25J13/00

Moving robot
11701782 · 2023-07-18

Disclosed is a moving robot including: a voice input unit configured to receive a voice input of a user; a first display capable of receiving a touch input; a second display larger than the first display; and a controller configured to perform control such that a screen to be displayed in response to the voice input or the touch input is displayed on at least one of the first display or the second display based on a type and an amount of information included in the screen, and accordingly, it is possible to provide information and services more effectively using the two displays.
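The routing decision described above can be sketched as a small selection function: given the type and amount of information in the response screen, pick the small touch display, the large display, or both. The function name, the information types, and the thresholds below are illustrative assumptions, not the patented method.

```python
# Hypothetical sketch of display selection by information type and amount.
# "first" = small touch display, "second" = larger display.

def select_displays(info_type: str, item_count: int) -> set:
    """Return which display(s) should show the response screen."""
    displays = set()
    if info_type == "menu":            # touch-interactive content -> touch display
        displays.add("first")
    if info_type == "media" or item_count > 3:
        displays.add("second")         # rich or lengthy content -> large display
    if not displays:
        displays.add("first")          # default to the touch display
    return displays
```

A long media playlist would then land on both displays at once, which is the two-display benefit the abstract points to.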

Machine learning method and mobile robot
11703872 · 2023-07-18

A machine learning method includes: a first learning step which is performed in a phase before a neural network is installed in a mobile robot and in which a stationary first obstacle is placed at different positions in a set space using simulation so that the neural network repeatedly learns a path from a starting point to a destination which avoids the first obstacle; and a second learning step which is performed in a phase after the neural network is installed in the mobile robot and in which, when the mobile robot recognizes a second obstacle that operates around the mobile robot in a space where the mobile robot moves, the neural network repeatedly learns a path to the destination which avoids the second obstacle every time the mobile robot recognizes the second obstacle.
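The two-phase scheme can be outlined as an offline loop over randomized static-obstacle placements followed by an online update per recognition event. Everything here is a placeholder skeleton: the network, the path planner, and the update rule are caller-supplied stand-ins, not the patented training procedure.

```python
# Illustrative skeleton of the two learning steps: offline simulation with
# a randomly repositioned static obstacle, then on-robot updates triggered
# each time a moving obstacle is recognized.
import random

def train_first_step(net, simulate_path, episodes=100):
    """First learning step: before installation, in simulation."""
    for _ in range(episodes):
        obstacle_pos = (random.uniform(0, 10), random.uniform(0, 10))
        path = simulate_path(net, obstacle_pos)   # path avoiding the obstacle
        net.update(path)                          # repeated learning

def on_obstacle_recognized(net, observed_obstacle, plan_path):
    """Second learning step: on the robot, once per recognition event."""
    path = plan_path(net, observed_obstacle)      # avoid the moving obstacle
    net.update(path)
    return path
```

The split mirrors the abstract: bulk path learning happens cheaply in simulation, while the installed network keeps adapting to dynamic obstacles it actually encounters.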

INFORMATION PROCESSING METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM
20230222648 · 2023-07-13

The present disclosure provides an information processing method including the following steps: a step of causing a captured image data acquisition unit to acquire, every predetermined period, captured image data of an imaging target at least including a robot arm and a control object; a step of causing a control unit to change a state of the control object every predetermined period based on a user setting; an image comparison step of causing an image comparison unit to compare the captured image data with reference image data; and a step of causing a result information acquisition unit to detect a predetermined state change based on a result of the comparison in the image comparison step, acquire result information regarding a work of the robot arm, and store the result information in a result information storage unit.
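The comparison-and-detection step can be illustrated with a minimal periodic check: compare the captured frame against a reference frame and store a result record when the difference crosses a threshold. The pixel-difference metric, the threshold, and all names are assumptions for illustration only.

```python
# Minimal sketch of the image comparison step: frames are nested lists of
# pixel values; a large enough mean difference counts as a state change.

def compare_images(captured, reference):
    """Mean absolute pixel difference between two equally sized frames."""
    diffs = [abs(c - r) for row_c, row_r in zip(captured, reference)
             for c, r in zip(row_c, row_r)]
    return sum(diffs) / len(diffs)

def detect_state_change(captured, reference, result_store, threshold=10.0):
    """Append result information when a predetermined state change is seen."""
    score = compare_images(captured, reference)
    changed = score > threshold
    if changed:
        result_store.append({"score": score, "state_change": True})
    return changed
```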

INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD

An indicator related to work performed through a plurality of hierarchical processes is predicted efficiently with high accuracy. The information processing device stores sample data in association with each parameter representing work, the sample data including an indicator generated by execution of lower-level simulation based on a lower-level model set for a lower-level process, and the information processing device predicts an indicator related to predicted work by performing higher-level simulation using the sample data associated with a parameter similar to a parameter representing the predicted work, the higher-level simulation being based on a higher-level model which is set for a higher-level process. If sample data associated with a parameter similar to the parameter representing the predicted work is not stored, the information processing device complements sample data by performing lower-level simulation and performs higher-level simulation using the complemented sample data.
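The reuse-or-complement strategy described above amounts to a cache keyed by work parameters: the higher-level simulation consumes stored lower-level results when a similar parameter exists, and otherwise the store is complemented by running the lower-level simulation first. The similarity test (a simple scalar distance) and all names are assumptions for illustration.

```python
# Sketch of higher-level simulation over cached lower-level sample data.
# `store` maps a scalar work parameter to its lower-level indicator.

def predict_indicator(param, store, lower_sim, higher_sim, tol=0.5):
    """Predict an indicator for `param`, complementing the store if needed."""
    # Collect sample data whose parameter is similar to the predicted work.
    similar = [s for p, s in store.items() if abs(p - param) <= tol]
    if not similar:
        # No similar sample stored: complement via lower-level simulation.
        sample = lower_sim(param)
        store[param] = sample
        similar = [sample]
    # Higher-level simulation runs on the (possibly complemented) samples.
    return higher_sim(similar)
```

The efficiency claim in the abstract corresponds to the cache-hit path: expensive lower-level simulation runs only when no similar sample exists.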

INTERVIEW ROBOT
20230219232 · 2023-07-13

An interview robot for use in the field of human resources (HR). The robot has a camera, a microphone, an odor sensor, a speaker, and touch sensors enabling communication with the interviewed candidate; a utility function determination memory, which addresses questions to the candidate so as to determine the parameters of the utility functions for the economic, social, and environmental attributes of the candidate being interviewed and stores the utility function parameters calculated from the answers received; and a nonlinear assignment program solution memory with uncertain utility functions, which performs the best (optimal) job-personnel matching under different scenarios by simultaneously taking into account that employee satisfaction from the utility function determination memory varies in a mostly non-linear way with the economic, social, and environmental characteristics of the candidate, and the uncertainties that may occur in employee satisfaction.
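A toy version of the matching under uncertain utilities can be written as a brute-force assignment: score each candidate-job pair by a certainty-equivalent utility (mean satisfaction minus a risk penalty for uncertainty) and pick the assignment maximizing the total. The utility model, the risk-aversion weight, and the exhaustive search are all simplifying assumptions, not the patented nonlinear assignment program.

```python
# Sketch of job-personnel matching with uncertain utilities: brute-force
# search over assignments, scoring each by a risk-penalized expected utility.
from itertools import permutations

def expected_utility(mean, stddev, risk_aversion=0.5):
    """Certainty-equivalent: penalize uncertainty in employee satisfaction."""
    return mean - risk_aversion * stddev

def best_assignment(utils):
    """utils[c][j] = (mean, stddev) satisfaction of candidate c in job j.
    Returns (assignment, score) where assignment[j] is the candidate for job j."""
    n = len(utils)
    best, best_score = None, float("-inf")
    for perm in permutations(range(n)):
        score = sum(expected_utility(*utils[c][j]) for j, c in enumerate(perm))
        if score > best_score:
            best, best_score = perm, score
    return best, best_score
```

Brute force is only workable for a handful of candidates; the point of the sketch is the scoring under uncertainty, not the search.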

AUTONOMOUS MOBILE ROBOTIC SYSTEMS AND METHODS FOR PICKING AND PUT-AWAY

A method and system for autonomous picking or put-away of items, totes, or cases within a logistics facility. The system includes a remote server and at least one manipulation robot. The system may further include at least one transport robot. The remote server is configured to communicate with the various robots to send and receive picking data, and the various robots are configured to autonomously navigate and position themselves within the logistics facility.

UNMANNED ACCESS FLOOR CONSTRUCTION SYSTEM AND ACCESS FLOOR CONSTRUCTION METHOD USING SAME

According to the present invention, provided is an unmanned construction system for an access floor comprising an installation frame (10), a pad (20) attached to the installation frame (10), and a floor (30) coupled to the pad (20), the unmanned access floor construction system comprising a construction robot connected to a control server (1) by wired or wireless communication, wherein the construction robot comprises: a pad installation robot (100) for attaching the pad (20) to the installation frame (10); a floor installation robot (200) for mounting the floor (30) on the pad (20); and a bolting robot (300) for fastening the pad (20) to the floor (30) by using a fastening means (40).

ROBOT AND CONTROL METHOD THEREFOR
20230219233 · 2023-07-13

A robot is provided. The robot includes a camera, a depth sensor, a memory, and a processor configured to: perform an interaction with a first user having the highest degree of interest from among a plurality of users present in the vicinity of the robot; obtain gazing information and distance information of the plurality of users while performing the interaction with the first user; determine an engagement level of the first user for the interaction by using the gazing information and distance information of the first user; determine a degree of interest of another user by using the gazing information and distance information of the first user and the other user; end the interaction with the first user; and perform an interaction with the other user based on the degree of interest of the other user.
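The handover logic can be sketched as a scoring rule over gaze and distance: each user gets an interest score, and the robot switches away from the current user only when another user's score clearly exceeds the current user's engagement. The weights, the margin, and the linear form of the score are illustrative assumptions.

```python
# Sketch of interest scoring from gazing and distance information, and the
# decision to end one interaction and start another.

def interest_score(gaze_ratio, distance_m, max_range=5.0):
    """gaze_ratio in [0, 1]; closer and more attentive users score higher."""
    proximity = max(0.0, 1.0 - distance_m / max_range)
    return 0.6 * gaze_ratio + 0.4 * proximity

def choose_next_user(current_id, users, handover_margin=0.1):
    """users: {user_id: (gaze_ratio, distance_m)}. Return the user to serve."""
    scores = {uid: interest_score(g, d) for uid, (g, d) in users.items()}
    best = max(scores, key=scores.get)
    if best != current_id and scores[best] > scores[current_id] + handover_margin:
        return best          # end the current interaction and switch
    return current_id        # keep interacting with the current user
```

The margin term keeps the robot from oscillating between two users with near-equal interest, which is one plausible reading of comparing the first user's engagement against the other user's interest.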

Robot and method for controlling the same
11554499 · 2023-01-17

A robot according to the present disclosure comprises: a microphone; a camera disposed to face a predetermined direction; and a processor configured to: inactivate driving of the camera and activate driving of the microphone, if a driving mode of the robot is set to a user monitoring mode; acquire a sound signal through the microphone; activate the driving of the camera based on an event estimated from the acquired sound signal; confirm the event from the image acquired through the camera; and control at least one constituent included in the robot to perform an operation based on the confirmed event.
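One pass of the monitoring flow can be sketched as a short function: the camera starts inactive, a sound-based estimator runs on the microphone signal, and only an estimated event wakes the camera so the event can be confirmed from the image. The estimator, confirmer, and capture function are placeholder callables, not the robot's actual components.

```python
# Sketch of user-monitoring mode: microphone-only until a sound-estimated
# event justifies activating the camera, then confirm from the image.

def monitor_step(sound_signal, estimate_event, confirm_event, capture_image):
    """One monitoring pass; returns the confirmed event, or None."""
    # Camera driving is inactive; only the microphone is listening.
    event = estimate_event(sound_signal)   # estimate an event from the sound
    if event is None:
        return None                        # nothing estimated: camera stays off
    image = capture_image()                # activate camera driving
    return confirm_event(event, image)     # confirm the event from the image
```

Keeping the camera off until sound suggests an event is the power- and privacy-saving behavior the claim describes.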