Patent classifications
B25J19/026
Robot and method for recognizing wake-up word thereof
Provided is a robot including a microphone configured to acquire a sound signal corresponding to a sound generated near the robot, a camera, an output interface including at least one of a display configured to output a wake-up screen or a speaker configured to output a wake-up sound when the robot wakes up, and a processor configured to recognize whether the acquired sound includes a voice of a person, activate the camera when the sound includes a voice of a person, recognize whether a person is present in an image acquired by the activated camera, set a wake-up word recognition sensitivity based on the recognition result as to whether a person is present, and recognize whether a wake-up word is included in voice data of a user acquired through the microphone based on the set wake-up word recognition sensitivity.
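The staged logic of this abstract (voice heard → check camera → set sensitivity → match wake-up word) can be sketched as below. This is a hypothetical illustration, not the patented implementation; the function names and threshold values are assumptions.

```python
# Illustrative sketch: adjust the wake-word recognition threshold based
# on whether a person is visible. A lower threshold means higher
# sensitivity. All values here are hypothetical.

def wake_word_sensitivity(voice_detected: bool, person_in_frame: bool) -> float:
    """Return a confidence threshold for the wake-word recognizer.

    The robot wakes more easily when a person is visible, and is
    stricter otherwise to avoid false wake-ups from ambient sound
    (e.g. a TV playing speech in an empty room).
    """
    if not voice_detected:
        return 1.0   # no voice at all: recognizer effectively disabled
    if person_in_frame:
        return 0.4   # high sensitivity: likely a real user speaking
    return 0.8       # low sensitivity: voice heard but nobody seen

def should_wake(wake_word_score: float, threshold: float) -> bool:
    """Wake when the recognizer's wake-word score clears the threshold."""
    return wake_word_score >= threshold
```

Under this policy, the same recognizer score of 0.5 wakes the robot when a person is in frame but not when the room appears empty.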
ROBOT CONTROL DEVICE, ROBOT, ROBOT CONTROL METHOD, AND PROGRAM RECORDING MEDIUM
Disclosed are a robot control device and the like that improve the accuracy with which a robot starts listening to speech, without requiring the user to perform an operation. This robot control device is provided with: an action executing means which, upon detection of a person, determines an action to be executed with respect to said person, and performs control in such a way that the robot executes the action; an assessing means which, upon detection of a reaction from the person in response to the action determined by the action executing means, assesses the possibility that the person will talk to the robot, on the basis of the reaction; and an operation control means which controls an operating mode of the robot main body on the basis of the result of the assessment performed by the assessing means.
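The act → observe reaction → assess → switch-mode flow in this abstract could be sketched as follows. The reaction labels, scores, and threshold are all assumptions made for illustration.

```python
# Hypothetical sketch: map an observed human reaction to a probability
# that the person will talk to the robot, then pick an operating mode.

REACTION_SCORES = {
    "approaches": 0.9,       # person moves toward the robot
    "looks_at_robot": 0.6,   # person makes eye contact
    "ignores": 0.1,
    "walks_away": 0.0,
}

def assess_talk_probability(reaction: str) -> float:
    """Score how likely the person is to speak, given their reaction."""
    return REACTION_SCORES.get(reaction, 0.3)  # 0.3 = unknown reaction

def operating_mode(reaction: str, threshold: float = 0.5) -> str:
    """Enter 'listening' only when speech is likely, so the robot does
    not waste power or trigger falsely on uninterested passers-by."""
    return "listening" if assess_talk_probability(reaction) >= threshold else "idle"
```

The point of the staged design is that the robot probes with an action first, so the mode switch is driven by an observed reaction rather than mere presence.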
Moving robot
Disclosed is a moving robot including: a voice input unit configured to receive a voice input of a user; a first display capable of receiving a touch input; a second display larger than the first display; and a controller configured to perform control such that a screen to be displayed in response to the voice input or the touch input is displayed on at least one of the first display or the second display, based on the type and amount of information included in the screen. Accordingly, information and services can be provided more effectively using the two displays.
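A routing rule of the kind described (screen placement driven by the type and amount of information) might look like the sketch below. The categories, counts, and policy are hypothetical, not taken from the patent.

```python
# Illustrative sketch: choose a display for a response screen based on
# the screen's content type and how many items it contains.

def route_screen(info_type: str, item_count: int) -> str:
    """Return which display should show the screen.

    Hypothetical policy: interactive screens go to the smaller touch
    display; large or media-rich content goes to the bigger second
    display; mid-length lists use both displays at once.
    """
    if info_type in ("menu", "confirmation"):
        return "first"   # touch input is needed, so use the touch display
    if info_type in ("map", "video") or item_count > 10:
        return "second"  # more room for dense or media content
    if info_type == "list" and item_count > 5:
        return "both"    # split a longer list across the two displays
    return "first"
```

For example, a short confirmation dialog lands on the touch display while a map result is sent to the larger display.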
PROCESSING SYSTEM, ROBOT SYSTEM, CONTROL DEVICE, PROCESSING METHOD, CONTROL METHOD, AND STORAGE MEDIUM
According to one embodiment, a processing system sets a detector to a prescribed position. The detector includes a plurality of detection elements arranged along a first direction and a second direction. The second direction crosses the first direction. The processing system causes the detector to perform a probe of a weld portion of a joined body. The probe includes a transmission of an ultrasonic wave and a detection of a reflected wave. The processing system calculates a center position of the weld portion in a first plane along the first and second directions based on intensity data. The intensity data is of an intensity of the reflected wave obtained by the probe. The processing system performs a position adjustment of moving the detector along the first plane to reduce a distance between the center position and a position of the detector in the first plane.
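The center-position calculation and the position adjustment described here can be sketched with an intensity-weighted centroid, a simple stand-in for whatever calculation the patent actually claims; the grid indexing and step logic below are assumptions.

```python
# Illustrative sketch: estimate the weld-portion center from the grid of
# reflected-wave intensities, then step the detector toward it.

def weld_center(intensity):
    """Intensity-weighted centroid in the detection plane.

    `intensity[i][j]` is the reflected-wave intensity at detection
    element (i, j), with i along the first direction and j along the
    second (crossing) direction.
    """
    total = sx = sy = 0.0
    for i, row in enumerate(intensity):
        for j, v in enumerate(row):
            total += v
            sx += i * v
            sy += j * v
    return (sx / total, sy / total)

def adjust(detector_pos, center, step=1.0):
    """Move the detector up to `step` along the plane toward `center`,
    reducing the distance between detector and weld center."""
    dx = center[0] - detector_pos[0]
    dy = center[1] - detector_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= step:
        return center  # close enough: land directly on the center
    return (detector_pos[0] + step * dx / dist,
            detector_pos[1] + step * dy / dist)
```

Iterating probe → centroid → adjust converges the detector onto the weld portion without any user-supplied coordinates.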
Robot System with Casing Elements
A robot system comprising movable parts, a casing element, a force limiting sensor, a joint position sensor, and one or more processors, wherein the casing element comprises a vibration actuator. Multiple embodiments of the casing element are introduced, including haptic warning and proximity sensing. Furthermore, means to use the casing element to guide the robot, and to generate a haptic effect with the vibration actuator to assist the user in a human-robot collaboration and/or guiding function, are also disclosed.
Radar based position measurement for robot systems
An apparatus including at least one emitter configured to emit energy; at least one receiver configured to receive the emitted energy, where the at least one emitter is mounted on at least one of: a robot arm, an end effector of the robot arm, a substrate on the robot arm, or a substrate process module, where the at least one receiver is mounted on at least one of: the robot arm, the end effector of the robot arm, the substrate on the robot arm, or the substrate process module.
Determining vehicle integrity based on observed behavior during predetermined manipulations
A vehicle or another object is grasped by a robotic arm of a handling system and caused to undergo one or more movements or manipulations resulting in a change of position, orientation, velocity or acceleration of the vehicle. Sensors provided in the robotic arm capture data representative of forces or torques imparted upon the robotic arm by the vehicle during or after the movement, or power or energy levels of vibration resulting from the movement. A signature representative of an inertial or vibratory response of the vehicle to the movement is derived based on the data. The signature may be compared to a baseline signature similarly derived for a vehicle that is known to be structurally and aerodynamically sound. If the signature is sufficiently similar to the baseline signature, the vehicle may also be determined to be structurally and aerodynamically sound.
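The signature comparison described here could be sketched as a similarity test between feature vectors, e.g. vibration band energies. The cosine-similarity measure, feature representation, and threshold below are illustrative assumptions, not the patent's method.

```python
# Illustrative sketch: compare a measured inertial/vibratory signature
# against the baseline derived from a known-sound vehicle.

def signature_similarity(sig, baseline):
    """Cosine similarity between two signature vectors (hypothetical
    features, e.g. vibration energy per frequency band)."""
    dot = sum(a * b for a, b in zip(sig, baseline))
    na = sum(a * a for a in sig) ** 0.5
    nb = sum(b * b for b in baseline) ** 0.5
    return dot / (na * nb)

def is_sound(sig, baseline, threshold=0.95):
    """Deem the vehicle structurally sound when its signature is
    sufficiently similar to the known-good baseline."""
    return signature_similarity(sig, baseline) >= threshold
```

A loose part or structural damage would shift energy between bands, dropping the similarity below the threshold and flagging the vehicle for inspection.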
SERVICE ROBOT AND DISPLAY CONTROL METHOD THEREOF, CONTROLLER AND STORAGE MEDIUM
Provided are a service robot, a display control method thereof, a controller, and a storage medium. The service robot display control method comprises receiving a start signal sent by a human body recognition sensor, wherein the human body recognition sensor outputs the start signal to the controller when a user appears within a predetermined range around the service robot, and controlling a mounted device to start operating when the start signal is received. A first display screen of the robot remains in a standby state when no user is present, and lights up when a user approaches the robot.
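The presence-driven standby/wake behavior amounts to a small state machine, sketched below; the class and state names are hypothetical.

```python
# Illustrative sketch: the first display wakes from standby when the
# human body recognition sensor signals a user in range.

class DisplayController:
    def __init__(self):
        self.first_display = "standby"  # no user present initially

    def on_sensor(self, user_in_range: bool) -> None:
        """Handle the sensor's start signal: light up the display when
        a user is within the predetermined range, else go to standby."""
        self.first_display = "on" if user_in_range else "standby"
```

Keeping the display in standby until presence is detected saves power and avoids showing content to an empty room.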
Vacuum-based end effector for engaging parcels
A vacuum-based end effector for engaging parcels includes a base plate, one or more vacuum cups of a first type, and one or more vacuum cups of a second type. Each vacuum cup of the vacuum-based end effector is configured to be placed in fluid communication with a vacuum source to provide the vacuum cup with a suction force which can be used to engage and grasp parcels. Each vacuum cup includes a bellows defining a pathway for a flow of air and a lip connected to the bellows. Each lip of the one or more vacuum cups of the first type comprises a foam lip, and each lip of the one or more vacuum cups of the second type comprises an elastomeric lip. The vacuum-based end effector can be combined with a robot to provide an improved system for engaging parcels.
Systems and methods for robotic sensing, repair and inspection
Various embodiments of a bio-inspired robot operable for detecting crack and corrosion defects in tubular structures are disclosed herein.