Patent classifications
B25J11/009
Excrement care robot with integrated analysis function for motion and condition of bedridden patient
Disclosed is an excrement care robot with an integrated analysis function for the motion and condition of a bedridden patient, including: a body-fitted treating apparatus including a body which has a shape corresponding to the contours of the genitals and buttocks of a human body and is formed with a treatment space that opens toward the genitals and buttocks of the human body to receive excrement discharged from the human body, a discharge unit which is provided in the body and communicates with the treatment space to discharge the excrement from the treatment space to the outside, and a sensor unit that is provided in the body to measure a direction of movement of the body; and a motion analysis unit that receives sensor data from the sensor unit to analyze the motion of the human body through the movement of the body.
Telepresence robot system that can be accessed by a cellular phone
A robot system with a robot that has a camera, a monitor, a microphone and a speaker. A communication link can be established with the robot through a cellular phone. The link may include audio-only communication. Alternatively, the link may include audio and video communication between the cellular phone and the robot. The phone can transmit its resolution to the robot and cause the robot to transmit captured images at that resolution. The user can cause the robot to move through input on the cellular phone. For example, the phone may include an accelerometer that senses movement, and movement commands are then sent to the robot to cause a corresponding robot movement. The phone may have a touch screen that can be manipulated by the user to cause robot movement and/or camera zoom.
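The accelerometer-to-movement mapping described above can be sketched as a simple function that turns phone tilt into drive commands. This is an illustrative assumption, not the patent's implementation; the dead zone, scaling, and axis conventions are all hypothetical.

```python
# Hypothetical sketch: map phone accelerometer readings (in g) to
# (forward, turn) drive commands for the robot. Thresholds and axis
# conventions are illustrative, not taken from the patent.

def tilt_to_drive_command(accel_x, accel_y, dead_zone=0.1, max_speed=1.0):
    """Return (forward, turn) commands from sideways/forward phone tilt."""
    def scale(v):
        if abs(v) < dead_zone:   # ignore small hand tremors
            return 0.0
        return max(-max_speed, min(max_speed, v))
    forward = scale(-accel_y)    # tilting the phone forward drives forward
    turn = scale(accel_x)        # tilting sideways turns the robot
    return forward, turn
```

In practice the resulting commands would be sent over the cellular link at a fixed rate, with the robot stopping when no fresh command arrives.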
Energy storage device management for a patient support apparatus
A system is provided that comprises a patient support apparatus and a unit being independent from the patient support apparatus. The patient support apparatus comprises a support structure having a base and a patient support surface for a patient. The patient support apparatus also comprises an electrical distribution system, one or more electrical devices, and an energy storage device (ESD) configured to store energy to power the one or more electrical devices through the electrical distribution system. The unit is configured to autonomously interact with the patient support apparatus to remove the ESD from the patient support apparatus and/or to place a replacement ESD on to the patient support apparatus.
GRIPPER FOR A PICKING DEVICE AND METHOD FOR OPERATING THE PICKING DEVICE
Grippers for a picking device for drug packages and methods for operating the picking device are provided. The gripper has a tray having an end portion with an arcuate front, the end portion forming a loading and unloading front. The gripper further includes a transport device arranged above the tray and configured for moving drug packages from a storage surface onto the tray, and at least one sensor arranged in the end portion of the tray and configured to determine a presence of a drug package in a detection area of the at least one sensor.
Method and Apparatus for Localizing Mobile Robot in Environment
Disclosed herein are a method and system for: capturing, by a camera moving in an environment, a sequence of consecutive frames at respective locations within a portion of the environment; constructing a topological semantic graph corresponding to the portion of the environment based on the sequence of consecutive frames; in accordance with a determination that the topological semantic graph includes at least a predefined number of edges, searching, in a topological semantic map of the environment, for one or more candidate topological semantic graphs corresponding to the constructed topological semantic graph; for a respective candidate topological semantic graph of the one or more candidate topological semantic graphs, searching, in a joint semantic and feature localization map, for a keyframe corresponding to the respective candidate topological semantic graph; and computing a current pose of the camera based on the keyframe.
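The control flow of that localization pipeline can be sketched as a skeleton: a semantic graph is grown from consecutive frames, and candidate search is attempted only once the graph has the predefined number of edges. Graph matching, keyframe search, and pose computation are passed in as stubs here; every name and the edge threshold are assumptions for illustration, not the patented algorithms.

```python
# Illustrative skeleton of the disclosed flow: build a topological semantic
# graph from frame observations, gate on a minimum edge count, then search
# candidates -> keyframe -> pose. The matching steps are caller-supplied stubs.

MIN_EDGES = 3  # the "predefined number of edges" gate (value assumed)

class TopoGraph:
    def __init__(self):
        self.nodes, self.edges = [], []

    def add_observation(self, label):
        # Each new semantic observation links to the previous one.
        if self.nodes:
            self.edges.append((self.nodes[-1], label))
        self.nodes.append(label)

def localize(frame_labels, candidate_search, keyframe_search, compute_pose):
    """Return a camera pose once the growing graph can be matched, else None."""
    graph = TopoGraph()
    for label in frame_labels:
        graph.add_observation(label)
        if len(graph.edges) >= MIN_EDGES:
            for cand in candidate_search(graph):
                kf = keyframe_search(cand)
                if kf is not None:
                    return compute_pose(kf)
    return None
```

The edge-count gate matters: matching a too-small graph against the map would produce many ambiguous candidates, so localization is deferred until the graph is distinctive enough.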
Control system and method for movement of neck mechanism for robot
A control system for a neck mechanism includes a perception system configured to track movement of an object, and a perception control system that controls a rotary motor to yaw a platform and controls a first linear actuator and a second linear actuator that is in parallel with the first linear actuator to pitch and roll the platform according to a target position of the platform. The perception system tracks movement of the object by estimating its position and pose in 3D space and the platform is moved according to a vision-based position and pose estimation result.
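The pitch-and-roll actuation described above can be illustrated with a minimal geometric sketch, assuming the two parallel linear actuators are mounted a fixed distance apart behind the platform: common-mode extension tilts the platform in pitch, and differential extension tilts it in roll. The mounting geometry, dimensions, and small-angle model are assumptions, not taken from the patent.

```python
# Minimal sketch of two parallel actuators driving platform pitch and roll.
# Geometry (lever arm, actuator half-span) is assumed for illustration.

import math

def actuator_lengths(pitch_rad, roll_rad,
                     base_len=0.10, lever=0.05, half_span=0.04):
    """Return (left, right) actuator lengths for a target pitch and roll."""
    common = lever * math.tan(pitch_rad)    # both extend together for pitch
    diff = half_span * math.tan(roll_rad)   # opposite extension for roll
    return base_len + common - diff, base_len + common + diff
```

The rotary motor handles yaw independently, so the two actuators only ever need to realize the two remaining rotational degrees of freedom.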
Interfacing with a mobile telepresence robot
A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
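The tag-identification step above (find tags within a predetermined range of the current position, then act on those whose tag information carries an action modifier) can be sketched as follows. The tag fields and distance model are illustrative assumptions, not the patent's data format.

```python
# Hedged sketch of range-based tag identification. Each tag is assumed to be
# a dict with planar coordinates and an "info" dict that may carry an action
# modifier; field names are hypothetical.

import math

def tags_in_range(tags, position, max_dist):
    """Return tags within max_dist of the robot's current (x, y) position."""
    px, py = position
    return [t for t in tags
            if math.hypot(t["x"] - px, t["y"] - py) <= max_dist]

def actions_for(tags, position, max_dist):
    """Collect action modifiers from all tags currently in range."""
    return [t["info"]["action"]
            for t in tags_in_range(tags, position, max_dist)
            if "action" in t.get("info", {})]
```

In this reading, a tag annotating a doorway with a "slow down" modifier would alter the robot's behavior only while the robot is within the tag's range.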
CARE ROBOT CONTROLLER
The present invention discloses a care robot controller, which includes: a controller body that includes slide rails, finger slot sliders and a joystick, wherein the finger slot sliders are movably arranged on the slide rails and configured to receive pressing, and the joystick is configured to control the care robot; a gesture parsing unit configured to parse three-dimensional gestures of the controller body, and control the care robot to perform corresponding actions when the three-dimensional gestures of the controller body are in line with preset gestures; and a tactile sensing unit configured to sense the pressing received by the finger slot sliders and initiate a user mode corresponding to the sensed pressing, so that the controller body provides corresponding vibration feedback. Thus the user can control the care robot efficiently and conveniently, the control accuracy is improved, and effective man-machine interaction is realized.
ASSISTANCE SYSTEM
An assistance system includes an assistance device configured to assist movement of a care receiver; a microphone provided on the assistance device; a speech analyzer configured to analyze speech of at least one of a caregiver or the care receiver included in sound data acquired by the microphone and acquire information related to an action; an echo analyzer configured to analyze an echo included in the sound data and acquire information related to a location; and an action history information generator configured to generate action history information of at least one of the caregiver or the care receiver based on the information related to the action and the information related to the location.