B25J9/0003

Robot and method for recognizing wake-up word thereof
11577379 · 2023-02-14 ·

Provided is a robot including a microphone configured to acquire a sound signal corresponding to a sound generated near the robot, a camera, an output interface including at least one of a display configured to output a wake-up screen or a speaker configured to output a wake-up sound when the robot wakes up, and a processor configured to recognize whether the acquired sound includes a voice of a person, activate the camera when the sound includes a voice of a person, recognize whether a person is present in an image acquired by the activated camera, set a wake-up word recognition sensitivity based on a recognition result as to whether a person is present, and recognize whether a wake-up word is included in voice data of a user acquired through the microphone based on the set wake-up word recognition sensitivity.
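The sensitivity adjustment described in this abstract lends itself to a short sketch. The threshold values and function names below are illustrative assumptions, not taken from the patent:

```python
def set_wakeup_sensitivity(person_in_image: bool) -> float:
    """Return a wake-up word detection threshold.

    When a person is visible, a lower threshold (higher sensitivity)
    makes the robot more likely to wake; otherwise a higher threshold
    guards against false wake-ups. Values are illustrative.
    """
    return 0.4 if person_in_image else 0.7


def recognize_wakeup(voice_score: float, threshold: float) -> bool:
    """Treat the wake-up word as recognized when the keyword-spotting
    confidence meets the configured threshold."""
    return voice_score >= threshold
```

With a person detected, a keyword score of 0.5 wakes the robot; with no person detected, the same score does not.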

Automated restaurant
11579621 · 2023-02-14 ·

The present application discloses an automated restaurant comprising: a kitchen; a customer-tracking area comprising a dining area; and a plurality of vehicles. The kitchen comprises a storage apparatus to store ingredient containers, a transfer apparatus to move ingredient containers, and one or more cooking stations. Each vehicle is configured to move one or more food containers from cooking stations to dining tables. A tracking system comprises cameras, lidars, etc., which are fixedly mounted. The tracking system can dynamically map out the fixtures, humans and vehicles in the restaurant. Information from the tracking system is used to control the motion of the vehicles. The tracking system can dynamically track the positions of customers in the customer-tracking area, so that foods ordered by specific customers may be automatically sent by vehicles to the customers' locations.
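One way the tracked customer positions might drive vehicle routing can be sketched as follows; the nearest-vehicle assignment rule and all names are illustrative assumptions, not the patented control scheme:

```python
import math


def nearest_vehicle(vehicles: dict, customer_pos: tuple) -> str:
    """Pick the vehicle closest to the customer's tracked position.

    `vehicles` maps a vehicle id to its (x, y) position as reported by
    the fixed tracking system; the chosen vehicle would be dispatched
    to deliver the customer's order.
    """
    return min(vehicles, key=lambda v: math.dist(vehicles[v], customer_pos))
```

For example, with vehicles at (0, 0) and (5, 5), a customer tracked at (4, 4) is served by the second vehicle.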

Systems and methods for privacy management in an autonomous mobile robot

A method of operating a mobile cleaning robot can include receiving a privacy mode setting from a user interface, where the privacy mode setting is based on a user selection between at least two different privacy mode settings for determining whether to operate the mobile cleaning robot in an image-capture-restricted mode. An image stream of an image capture device of the mobile cleaning robot is permitted in an absence of a user-selection of a more restrictive one of the privacy settings. At least a portion of the image stream is restricted or disabled based at least in part on a user-selection of a more restrictive one of the privacy settings.
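The gating described above is essentially a conditional on the selected privacy mode; a minimal sketch, with the mode names assumed for illustration:

```python
from enum import Enum


class PrivacyMode(Enum):
    STANDARD = 0    # image stream permitted
    RESTRICTED = 1  # portions of the stream restricted or disabled


def stream_permitted(mode: PrivacyMode) -> bool:
    """The image stream is permitted in the absence of a user selection
    of the more restrictive setting."""
    return mode is PrivacyMode.STANDARD
```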

Automatic guiding method for self-propelled apparatus

An automatic guiding method for a self-propelled apparatus (10) is provided. The self-propelled apparatus (10) turns and irradiates when a signal light emitted by a charging dock (20) is sensed by a flank sensor (103), and changes its turn direction when another, different signal light from the charging dock (20) is sensed by a forward sensor (102). Each time the charging dock (20) is triggered by the signal light emitted by the self-propelled apparatus (10), it switches to emit a signal light different from the one currently emitted. These actions are executed repeatedly so that the self-propelled apparatus approaches the light-emitting unit (202) until the self-propelled apparatus (10) reaches a charging position. The method can accurately guide the self-propelled apparatus (10) to the charging position with only two sensors arranged on the apparatus.
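The alternating-light handshake can be sketched as a single iteration of the guiding loop; the state layout, sensor names, and turn logic are illustrative assumptions, not the patented control law:

```python
def guide_step(state: dict, sensor: str) -> dict:
    """One iteration of the guiding loop.

    `state` holds the dock's current light ('A' or 'B') and the
    apparatus's turn direction. A 'flank' sensing keeps the current
    turn; a 'forward' sensing reverses it. Each sensing also triggers
    the dock to switch to the other signal light.
    """
    if sensor == "forward":
        state["turn"] = "left" if state["turn"] == "right" else "right"
    state["light"] = "B" if state["light"] == "A" else "A"
    return state
```

Repeating `guide_step` alternates the dock's light on every trigger while the apparatus homes in on the light-emitting unit.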

APPARATUS FOR GUIDING AN AUTONOMOUS VEHICLE TOWARDS A DOCKING STATION

An apparatus for guiding an autonomous vehicle towards a docking station includes an autonomous vehicle with a camera-based sensing system, a drive system for driving the autonomous vehicle, and a control system for controlling the drive system. The apparatus includes a docking station including a first fiducial marker and a second fiducial marker, wherein the second fiducial marker is positioned on the docking station to define a predetermined relative spacing with the first fiducial marker, wherein the control system is operable to receive an image provided by the camera-based sensing system, the image including a representation of the first and second fiducial markers, and to control the drive system so as to guide the autonomous vehicle towards the docking station based on a difference between the representation of the first and second fiducial markers in the received image and the predetermined relative spacing between the first and second fiducial markers.
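One plausible error signal derived from the imaged markers is a proportional steering command; the symmetry assumption (the two markers sit at mirrored pixel offsets when the vehicle is aligned with the dock) and the gain are illustrative, not from the patent:

```python
def steering_correction(marker1_px: float, marker2_px: float,
                        gain: float = 0.5) -> float:
    """Proportional steering command from two imaged fiducial markers.

    `marker1_px` and `marker2_px` are the horizontal pixel offsets of
    the markers from the image centre. When the vehicle is aligned,
    the offsets are symmetric and cancel; any asymmetry between the
    imaged markers and their expected relative spacing yields a turn
    command that re-centres them.
    """
    error = marker1_px + marker2_px  # symmetric offsets cancel to zero
    return -gain * error             # turn opposite the error
```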

Sharing Learned Information Among Robots

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for sharing learned information among robots. In some implementations, a robot obtains sensor data indicating characteristics of an object. The robot determines a classification for the object and generates an embedding for the object using a machine learning model stored by the robot. The robot stores the generated embedding and data indicating the classification for the object. The robot sends the generated embedding and the data indicating the classification to a server system. The robot receives, from the server system, an embedding generated by a second robot and a corresponding classification. The robot stores the received embedding and the corresponding classification in the local cache of the robot. The robot may then use the information in the cache to identify objects.
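The local cache of own and received embeddings can be sketched with a cosine nearest-neighbour lookup; the matching rule is an assumption for illustration (the abstract specifies only storing and reusing embeddings with their classifications):

```python
import math


class EmbeddingCache:
    """Local cache of (embedding, classification) pairs. Entries may be
    generated by this robot or received from the server (i.e. produced
    by another robot); both are used identically at lookup time."""

    def __init__(self):
        self.entries = []  # list of (embedding, label) pairs

    def store(self, embedding: list, label: str) -> None:
        self.entries.append((embedding, label))

    def classify(self, embedding: list) -> str:
        """Return the label of the cached embedding most similar to the
        query, by cosine similarity."""
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            return dot / (math.hypot(*a) * math.hypot(*b))
        return max(self.entries, key=lambda e: cos(e[0], embedding))[1]
```

A robot that stored a "cup" embedding locally and received a "bowl" embedding from the server can then identify either object from the cache.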

Smart home robot assistant
11565398 · 2023-01-31 ·

Methods and systems are described for robot transportation of objects into or out of a home automation system. One example may include determining, by a mobile robotic device, that an object is available to cross a boundary of the home automation system. The method may include deactivating at least a portion of the home automation system. The method also includes retrieving, by the mobile robotic device, the object and transporting, by the mobile robotic device, the object across the boundary. The method further includes leaving, by the mobile robotic device, the object at a drop-off location. The method may also include reactivating at least the portion of the home automation system.
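The ordered steps in this abstract can be sketched directly; the stub classes are illustrative stand-ins, not the patented system:

```python
class HomeAutomationStub:
    """Illustrative stand-in for the home automation system."""
    def __init__(self):
        self.armed = True


class RobotStub:
    """Illustrative stand-in for the mobile robotic device."""
    def __init__(self):
        self.actions = []

    def do(self, action: str) -> None:
        self.actions.append(action)


def transport_object(system: HomeAutomationStub, robot: RobotStub,
                     object_id: str, drop_off: str) -> list:
    """Steps in the order the abstract gives them: deactivate part of
    the system, retrieve and transport the object across the boundary,
    leave it at the drop-off location, then reactivate."""
    system.armed = False                 # suppress alarms on the path
    robot.do(f"retrieve {object_id}")
    robot.do("cross boundary")
    robot.do(f"leave at {drop_off}")
    system.armed = True                  # restore monitoring
    return robot.actions
```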

Systems and methods for automated makeup application
11568675 · 2023-01-31 ·

Systems and methods for automated makeup application allow a user to select and apply desired makeup styles to the user's face. The systems and methods include a computer application with a graphical user interface that allows selection of a look from a plurality of preconfigured looks. A camera coupled with a robotic arm records a face map and color coding and sends that data to be stored in a virtual server database. The application calculates the formula quantity, and a pump extracts the desired formula amounts from the appropriate formula cartridges and releases them into reservoirs on the robotic arm's head. An airbrush compressor mixes the formula, and plug triggers release one of several airbrush nozzles to spray the user's face with the formula. A cleaning mechanism is provided between makeup applications and after the final application.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM
20230028871 · 2023-01-26 ·

An information processing device (10) according to the present disclosure includes an operation controller (175) that controls a moving operation of an autonomous mobile body (10) that travels while maintaining an inverted state, and controls a posture operation of the autonomous mobile body that temporally changes from a reference posture in the inverted state. Furthermore, the information processing device (10) according to the present disclosure further includes an acquisition unit (174) that acquires motion data corresponding to a posture operation of the autonomous mobile body (10). The operation controller (175) controls a posture operation of the autonomous mobile body (10) based on the motion data acquired by the acquisition unit (174).

Information processing apparatus, information processing method and program

There is provided an information processing apparatus and information processing method to implement a more natural and flexible behavior plan of an autonomous mobile object, the information processing apparatus including a behavior planner configured to plan a behavior of an autonomous mobile object based on estimation of circumstances, wherein the behavior planner is configured to, based on the circumstances that are estimated and multiple sets of needs that are opposed to each other, determine the behavior to be executed by the autonomous mobile object. The information processing method includes, by a processor, planning a behavior of an autonomous mobile object based on estimation of circumstances, wherein the planning includes, based on the circumstances that are estimated and multiple sets of needs that are opposed to each other, determining the behavior to be executed by the autonomous mobile object.
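The arbitration between opposed needs can be sketched as a simple argmax over estimated need strengths; the scoring scheme and need names are assumptions for illustration, not the planner the patent describes:

```python
def plan_behavior(needs: dict) -> str:
    """Pick the behavior backing the strongest current need.

    `needs` maps a need (e.g. 'approach_user' vs 'self_preservation',
    two opposed needs) to a strength estimated from circumstances.
    A richer planner would weigh the opposed sets against each other
    rather than taking a bare maximum.
    """
    return max(needs, key=needs.get)
```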