Patent classifications
G05B2219/40298
AUTOMATIC APPLICATION DEVICE AND AUTOMATIC APPLICATION METHOD
An automatic application device includes: a robot arm; an application hand configured to apply, to a workpiece, a paint that is a liquid; a force sensor configured to detect a force and a moment acting on the application hand; and a control section configured to control the robot arm in accordance with a parameter calculated from an output signal from the force sensor.
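The control scheme described above, where a parameter computed from the force sensor's output signal drives the robot arm, can be sketched as a simple proportional law. This is a minimal illustration, not the patented method: the `Wrench` class, target force, and gain value are all assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Wrench:
    """Force (N) and moment (N*m) acting on the application hand."""
    force: float
    moment: float

def control_parameter(w: Wrench, target_force: float = 5.0, gain: float = 0.02) -> float:
    """Compute an arm velocity correction from the force-sensor signal.

    Simple proportional law: pressing harder than the target backs the
    arm off; pressing softer advances it. Gain and target are
    illustrative values only.
    """
    return gain * (target_force - w.force)

# Example: hand pressing 2 N harder than the 5 N target -> negative correction
correction = control_parameter(Wrench(force=7.0, moment=0.1))
print(round(correction, 4))
```

In practice such a loop would run at the controller's servo rate, with separate gains for each force and moment axis.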
PRODUCTION SYSTEM FOR PROCESSING WORKPIECES
Production system for processing workpieces, having a robot module, a workpiece carrier module and a machining module, all of which are working modules, wherein each of said working modules includes an interface surface, the interface surface having a supply interface and a communication interface, wherein the robot module includes a robot and a robot controller for handling workpieces, wherein the workpiece carrier module includes a plurality of workpiece locations for receiving unmachined and finished workpieces, wherein the machining module includes a processing system for carrying out at least one processing operation on at least one workpiece; wherein a data carrier is assigned to each of the working modules, the data carrier storing processing data, the processing data including a transfer position for workpieces and being coded for processing in the robot controller of the robot module.
AREA SETTING DEVICE, RACK, CONTROL SYSTEM, AREA SETTING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING PROGRAM
A technique shortens the time taken to adjust a protection area. An area setting device includes a setting unit that sets a protection area in at least a part of a surrounding environment of a robot to detect an entry of an object, an obtainer that obtains surrounding information about the robot, and a storage prestoring a set value for the protection area and the surrounding information associated with each other. The setting unit sets the protection area based on the set value read from the storage.
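The adjustment-time saving described above comes from prestoring set values keyed by surrounding information, so the setting unit can read a value instead of re-measuring. A minimal sketch of that lookup follows; the storage keys and the shape of the set values are assumptions for illustration, not from the patent.

```python
# Prestored associations: surrounding-information key -> protection-area set value.
# Keys and dimensions are illustrative only.
STORAGE = {
    "rack_A": {"shape": "rectangle", "width_m": 1.2, "depth_m": 0.8},
    "open_floor": {"shape": "circle", "radius_m": 1.5},
}

def set_protection_area(surrounding_info: str) -> dict:
    """Set the protection area from the prestored value for the
    observed surroundings, avoiding a fresh manual adjustment.
    """
    try:
        return STORAGE[surrounding_info]
    except KeyError:
        raise ValueError(f"no prestored set value for {surrounding_info!r}")

area = set_protection_area("rack_A")
print(area["shape"], area["width_m"])
```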
Automated drywall painting system and method
An automated painting system that includes a robotic arm and a painting end effector coupled at a distal end of the robotic arm, with the painting end effector configured to apply paint to a target surface. The painting system can also include a computing device executing a computational planner that: generates instructions for driving the painting end effector and robotic arm to perform at least one painting task that includes applying paint, via the painting end effector, to a plurality of drywall pieces, the generating based at least in part on obtained target surface data; and drives the end effector and robotic arm to perform the at least one painting task.
SOLAR PANEL HANDLING SYSTEM
A system for installing a solar panel may include a first end-of-arm assembly tool coupled to a first robotic arm and part of a first assembly robot, and a second end-of-arm assembly tool coupled to a second robotic arm and part of a second assembly robot. The first and the second end-of-arm assembly tools have different tooling and perform different functions to assemble solar panels to a support structure. The first assembly robot and the second assembly robot may be located on autonomous and non-autonomous vehicles, and the various components can be operated by a control system based on operation instructions received from a neural network.
AUTONOMOUS TASK MANAGEMENT INDUSTRIAL ROBOT
Example implementations described herein involve systems and methods for operation of a robot configured to work on a first process and a second process, which can involve receiving sensor data indicative of a status of one or more of the first process and the second process; for the status indicative of the first process waiting on the robot, controlling the robot to work on the first process; and for the status indicative of the first process not waiting on the robot, controlling the robot to conduct one or more of work on the second process or return to standby.
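The dispatch rule in the abstract above is a small decision procedure: serve the first process when it is waiting on the robot, otherwise work the second process or stand by. A sketch of that rule, with hypothetical names (the `Action` enum and boolean status flags are assumptions, not from the patent):

```python
from enum import Enum

class Action(Enum):
    WORK_FIRST = "work on first process"
    WORK_SECOND = "work on second process"
    STANDBY = "return to standby"

def decide(first_waiting: bool, second_waiting: bool) -> Action:
    """Dispatch rule from the abstract: the first process has priority
    when it is waiting on the robot; otherwise the robot works the
    second process or returns to standby.
    """
    if first_waiting:
        return Action.WORK_FIRST
    return Action.WORK_SECOND if second_waiting else Action.STANDBY

print(decide(True, True).value)    # first process wins when waiting
print(decide(False, True).value)   # otherwise second process
print(decide(False, False).value)  # otherwise standby
```

In the described system the two boolean inputs would be derived from sensor data indicating each process's status rather than passed directly.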
Automated Non-Contact Thickness Inspection and Projection System
In one embodiment, systems and methods include using an inspection and projection system to measure the thickness of a coating and provide visual guidance for secondary operations. The inspection and projection system comprises a robotic arm operable to rotate about a plurality of axes, wherein an end effector is disposed at a distal end of the robotic arm. The inspection and projection system further comprises a linear rail system, wherein the robotic arm is coupled to the linear rail system, and wherein the robotic arm is operable to translate along the linear rail system. The inspection and projection system further comprises a frame, wherein the linear rail system is disposed on top of the frame, and an information handling system coupled to the frame, wherein the information handling system is operable to actuate the robotic arm and the linear rail system.
Mobile Robot Environment Sensing
A method includes receiving data collected by at least one sensor on a robotic device, wherein the data is to be used for an ambient environment state representation, and wherein the data represents ambient environment measurements collected at locations of the at least one sensor when the robotic device is passively monitoring an environment such that robotic device navigation is not based on the ambient environment state representation. The method further includes determining the ambient environment state representation using the data collected by the at least one sensor on the robotic device. The method also includes identifying, based on the ambient environment state representation, one or more anomalous ambient environment measurements. The method additionally includes causing, based on the one or more identified anomalous ambient environment measurements, the robotic device to actively monitor the environment such that robotic device navigation is based on the ambient environment state representation.
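The passive-to-active transition described above hinges on detecting anomalous ambient measurements against a baseline collected during passive monitoring. A minimal sketch, assuming a simple z-score test as the anomaly criterion (the threshold and baseline data are illustrative; the patent does not specify a detection method):

```python
from statistics import mean, stdev

def anomalous(baseline: list, new_value: float, z_threshold: float = 3.0) -> bool:
    """Flag a measurement that deviates far from the passive baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(new_value - mu) > z_threshold * sigma

def monitoring_mode(baseline: list, new_value: float) -> str:
    """Passive by default; switch to active monitoring (navigation based
    on the ambient environment state representation) on an anomaly.
    """
    return "active" if anomalous(baseline, new_value) else "passive"

baseline = [20.1, 20.3, 19.9, 20.0, 20.2]  # e.g. ambient temperature readings
print(monitoring_mode(baseline, 20.1))
print(monitoring_mode(baseline, 35.0))
```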
Robot configuration with three-dimensional lidar
A mobile robotic device includes a mobile base and a mast fixed relative to the mobile base. The mast includes a carved-out portion. The mobile robotic device further includes a three-dimensional (3D) lidar sensor mounted in the carved-out portion of the mast and fixed relative to the mast such that a vertical field of view of the 3D lidar sensor is angled downward toward an area in front of the mobile robotic device.