B25J19/026

Automatic ultrasonic imaging inspection method and system based on six-axis manipulator

An automatic ultrasonic imaging inspection method and system based on a six-axis manipulator. The method includes: controlling, by a motion control card, a six-axis manipulator to perform scanning and an external axis motor to move; feeding back, by a probe, an echo signal during scanning; when the probe moves to another scanning line, sending, by a controller, displacement information of the probe along a stepping direction to an ultrasonic imaging device; when the external axis motor works, feeding back, by an encoder, displacement information of the probe along a scanning direction to the ultrasonic imaging device; allowing the external axis motor to work when the probe moves along a scanning line; and generating, by the ultrasonic imaging device, a two-dimensional scanning image according to the echo signal and the displacement information of the probe along the stepping direction and the scanning direction.
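The two-dimensional image assembly described above (echo amplitude indexed by stepping-direction and scanning-direction displacement) could be sketched roughly as follows; the function name, sample format, and grid indexing are illustrative assumptions, not taken from the abstract:

```python
import numpy as np

def assemble_cscan(samples, n_steps, n_scan_positions):
    """Fill a 2D scan image from (step_index, scan_index, echo_amplitude) samples.

    step_index -- probe displacement along the stepping direction (image row)
    scan_index -- probe displacement along the scanning direction (image column)
    """
    image = np.zeros((n_steps, n_scan_positions))
    for step_idx, scan_idx, amplitude in samples:
        image[step_idx, scan_idx] = amplitude
    return image

# One echo amplitude per (stepping, scanning) position, as fed back by
# the controller and the encoder respectively.
samples = [(0, 0, 0.2), (0, 1, 0.8), (1, 1, 0.1)]
img = assemble_cscan(samples, n_steps=2, n_scan_positions=2)
```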

Fully automatic intelligent spraying robot

Disclosed is a fully automatic intelligent spraying robot. The robot includes a chassis, a driving device, a sliding device, a clamping device, a detection device and a control device; the driving device is fixedly connected to the chassis, and the driving device drives the chassis to move freely on the ground; the sliding device is fixed on the chassis; one end of the clamping device is used to fix a spray gun, and the other end of the clamping device is connected to the sliding device, and the clamping device can freely slide along the height direction of the sliding device; the detection device is fixed on the chassis, and the control device is also fixed on the chassis; and the control device is respectively in signal connection with the driving device, the sliding device, the clamping device and the detection device.

Robotic systems

A robotic system is controlled. Audiovisual data representing an environment in which at least part of the robotic system is located is received via at least one camera and at least one microphone. The audiovisual data comprises a visual data component representing a visible part of the environment and an audio data component representing an audible part of the environment. A location of a sound source that emits sound that is represented in the audio data component of the audiovisual data is identified based on the audio data component of the audiovisual data. The sound source is outside the visible part of the environment and is not represented in the visual data component of the audiovisual data. Operation of a controllable element located in the environment is controlled based on the identified location of the sound source.
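Locating a sound source from the audio component alone is commonly done with time-difference-of-arrival (TDOA) estimation; the abstract does not name a technique, so the cross-correlation sketch below is an assumed illustration for a two-microphone array:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature

def estimate_tdoa(sig_left, sig_right, sample_rate):
    """Estimate the time difference of arrival between two microphone
    signals from the peak of their full cross-correlation."""
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = np.argmax(corr) - (len(sig_right) - 1)  # samples; >0 means left arrives later
    return lag / sample_rate

def bearing_from_tdoa(tdoa, mic_spacing):
    """Convert a TDOA into a bearing angle (radians) for a two-mic array
    under the far-field plane-wave assumption."""
    ratio = np.clip(tdoa * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return np.arcsin(ratio)
```

With more than two microphones, the same pairwise lags can be intersected to give a full location rather than only a bearing.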

NOISE REDUCTION IN ROBOT HUMAN COMMUNICATION

Noise reduction in a robot system includes the use of a gesture library that pairs noise profiles with gestures that can be performed by the robot. A gesture to be performed by the robot is obtained, and the robot performs the gesture. The robot's performance of the gesture creates noise, and when a user speaks to the robot while the robot performs a gesture, incoming audio includes both user audio and robot noise. A noise profile associated with the gesture is retrieved from the gesture library and is applied to remove the robot noise from the incoming audio.
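A common way to apply a stored noise profile is spectral subtraction; the abstract does not specify the removal method, so the sketch below assumes per-frequency-bin magnitude profiles keyed by gesture name:

```python
import numpy as np

def remove_gesture_noise(incoming, gesture, gesture_library, frame_len=256):
    """Spectral-subtraction sketch: subtract the noise profile paired with
    a gesture (per-bin magnitudes for an rfft of frame_len samples) from
    the incoming audio, frame by frame."""
    noise_profile = gesture_library[gesture]
    cleaned = np.array(incoming, dtype=float)  # a tail shorter than a frame passes through
    for start in range(0, len(incoming) - frame_len + 1, frame_len):
        spectrum = np.fft.rfft(cleaned[start:start + frame_len])
        magnitude = np.maximum(np.abs(spectrum) - noise_profile, 0.0)
        phase = np.angle(spectrum)  # keep the noisy phase, as in basic spectral subtraction
        cleaned[start:start + frame_len] = np.fft.irfft(
            magnitude * np.exp(1j * phase), n=frame_len)
    return cleaned
```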

SYSTEMS AND METHODS FOR ROBOTIC ARM ALIGNMENT AND DOCKING

Certain aspects relate to systems and techniques for preparing a robotic system for surgery. In one aspect, the system includes a robotic arm, a sensor configured to generate information indicative of a location of the robotic arm, a processor, and at least one computer-readable memory in communication with the processor and having stored thereon computer-executable instructions. The instructions are configured to cause the processor to receive the information from the sensor, determine that the robotic arm is located at a first position in which a first axis associated with the robotic arm is not in alignment with a second axis associated with a port installed in a patient, and provide a command to move the robotic arm to a second position in which the first axis associated with the robotic arm is in alignment with the second axis.
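The alignment step — rotating the arm's first axis onto the port's second axis — can be illustrated with Rodrigues' rotation formula; the function below is a geometric sketch of that computation, not the patented control method:

```python
import numpy as np

def alignment_rotation(arm_axis, port_axis):
    """Rotation matrix that rotates arm_axis onto port_axis
    (Rodrigues' formula for the axis-angle between two vectors)."""
    a = np.asarray(arm_axis, dtype=float)
    b = np.asarray(port_axis, dtype=float)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)          # rotation axis scaled by sin(angle)
    c = np.dot(a, b)            # cos(angle)
    if np.isclose(c, 1.0):      # already aligned
        return np.eye(3)
    if np.isclose(c, -1.0):     # opposite: rotate 180 deg about any perpendicular
        perp = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(perp) < 1e-8:
            perp = np.cross(a, [0.0, 1.0, 0.0])
        perp = perp / np.linalg.norm(perp)
        return 2.0 * np.outer(perp, perp) - np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))
```

In practice the command sent to the arm would be the joint-space motion realizing this rotation, which depends on the arm's kinematics.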

Sensor module for a robot
11396105 · 2022-07-26

A sensor module is provided for adding functionality to a robot. The sensor module is attached between the robot tool flange and the end effector. Additional interchangeable modules may also be provided between the tool flange and the end effector. The sensor module includes a sensor for monitoring a condition near the end effector. An output port of the module transmits sensor data to a processor outside of the sensor module for further processing.

Serving robot and method for receiving customer using the same

A serving robot includes a camera to obtain image data including at least one of a facial expression or a gesture of a customer that is associated with food; a microphone to obtain voice data including the voice of the customer that is associated with the food; and a processor to obtain customer reaction data including at least one of the image data or the voice data through at least one of the camera or the microphone, estimate a reaction of the customer to the food from the obtained customer reaction data, and generate or update customer management information corresponding to the customer based on the estimated reaction. The robot estimates the reaction of the customer from the customer reaction data through a learning model based on artificial intelligence.
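The reaction-estimation and record-update flow might be wired together as below; the record layout and the stub `reaction_model` are assumptions standing in for the artificial-intelligence learning model:

```python
def update_customer_record(records, customer_id, reaction_model,
                           image_data=None, voice_data=None):
    """Estimate the customer's reaction to the food from whichever
    modalities are available, then record it against the customer."""
    observations = [d for d in (image_data, voice_data) if d is not None]
    reaction = reaction_model(observations)  # e.g. "positive" / "negative"
    record = records.setdefault(customer_id, {"reactions": []})
    record["reactions"].append(reaction)
    return records
```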

Communications module for a robot
11383395 · 2022-07-12

A communications module is provided for simplifying routing of communications pathways in a robot. The communications module is attached between the robot tool flange and the end effector. Additional interchangeable modules may also be provided between the tool flange and the end effector. The communications module includes multiple input ports and at least one output port. A data switch within the module combines sensor data from multiple input ports into a single output stream that is transmitted from one or more of the output ports.
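The data switch's behavior — merging frames from several input ports into one ordered output stream — could look roughly like this sketch, assuming timestamped readings and a JSON-lines output format (both assumptions):

```python
import json

def combine_streams(readings):
    """Merge timestamped readings from multiple input ports into one
    time-ordered output stream, tagging each frame with its source port.

    readings: {port_number: [(timestamp, value), ...], ...}
    """
    merged = sorted(
        ({"port": port, "t": t, "value": value}
         for port, samples in readings.items()
         for t, value in samples),
        key=lambda frame: frame["t"],
    )
    return "\n".join(json.dumps(frame, sort_keys=True) for frame in merged)
```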

Device for recognizing voice content, server connected thereto, and method for recognizing voice content
11450326 · 2022-09-20

An artificial intelligence (AI) device, such as a robot, comprises: an output interface to output content in response to a request of a user; a camera to acquire an image of the user; a microphone to acquire a voice signal including a voice content uttered by the user; a processor to determine a characteristic of the user based on the content, the image, and/or the voice signal, and recognize the voice content through a voice recognition mode corresponding to the determined characteristic. The AI device may include a communication interface to forward the voice signal to a remote computer that identifies the characteristic and recognizes the voice content based on the characteristic. According to an embodiment, when an irregular voice is recognized from the acquired voice signal, the processor may recognize a regular voice corresponding to the irregular voice using an artificial intelligence-based learning model.
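Selecting a recognition mode from a determined user characteristic might reduce to a keyed lookup with a fallback; the mode table and parameter names below are invented for illustration only:

```python
def select_recognition_mode(characteristic, modes, default="adult"):
    """Pick the voice-recognition mode keyed by the determined user
    characteristic, falling back to a default mode when none matches."""
    return modes.get(characteristic, modes[default])

# Hypothetical mode table: each characteristic maps to recognizer settings.
MODES = {
    "child": {"language_model": "child_lm", "pitch_shift": -2},
    "elderly": {"language_model": "elderly_lm", "speech_rate": 0.8},
    "adult": {"language_model": "general_lm"},
}
```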

TACTILE SENSOR, AND TACTILE STIMULATION SENSING METHOD USING THE SAME, AND ROBOT SKIN AND ROBOT COMPRISING THE SAME
20220297309 · 2022-09-22

The present invention relates to a tactile sensor, a tactile stimulation sensing method using the same, and a robot skin and a robot comprising the same. Particularly, the present invention relates to a tactile sensor comprising an input layer for receiving an external tactile stimulus; a microphone member; and a medium layer disposed between the input layer and the microphone member, the medium layer containing gas to transmit vibrations caused by the stimulus, as well as a tactile stimulation sensing method using the same, and a robot skin and a robot comprising the same.
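Sensing a tactile stimulus from the microphone member's signal could be as simple as thresholding frame energy; the sketch below assumes an RMS threshold and frame length, neither of which the abstract specifies:

```python
import numpy as np

def detect_tap(mic_signal, sample_rate, frame_ms=10, threshold=0.1):
    """Return the start times (seconds) of frames whose RMS exceeds a
    threshold -- a minimal sketch of sensing a tactile stimulus from the
    vibrations transmitted through the gas medium to the microphone."""
    frame_len = int(sample_rate * frame_ms / 1000)
    events = []
    for start in range(0, len(mic_signal) - frame_len + 1, frame_len):
        frame = mic_signal[start:start + frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        if rms > threshold:
            events.append(start / sample_rate)
    return events
```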