Patent classifications
G05B2219/40323
Method Of Building A Geometric Representation Over A Working Space Of A Robot
A method of building a geometric representation over a working space of a robot is provided. The method is performed in a control device and includes: representing the working space by a three-dimensional structure; obtaining information on a trajectory in the working space travelled by the robot as being collision free; determining, based on the obtained information on the collision free trajectory and on information on the geometry of the robot, a volume in the working space to be free space, the volume corresponding to the geometry of at least a part of the robot having travelled along the trajectory; and updating the three-dimensional structure by indicating the determined volume in the three-dimensional structure as free space.
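The carving step above can be sketched as follows. This is an illustrative simplification, not the patented method: the working space is a voxel grid, the robot geometry is reduced to a single sphere, and the grid resolution, trajectory coordinates, and radius are all assumed values.

```python
import numpy as np

GRID = 20          # voxels per axis (hypothetical workspace resolution)
UNKNOWN, FREE = 0, 1

def carve_free_space(grid, trajectory, radius):
    """Mark every voxel within `radius` of a trajectory point as FREE."""
    # Coordinates (x, y, z) of every voxel, shape (GRID, GRID, GRID, 3).
    idx = np.indices(grid.shape).transpose(1, 2, 3, 0)
    for p in trajectory:
        dist = np.linalg.norm(idx - np.asarray(p, dtype=float), axis=-1)
        grid[dist <= radius] = FREE   # volume swept by the robot is free space
    return grid

grid = np.full((GRID, GRID, GRID), UNKNOWN, dtype=np.uint8)
trajectory = [(5, 5, 5), (10, 10, 10), (15, 15, 15)]  # assumed collision-free path
carve_free_space(grid, trajectory, radius=3)
```

A real implementation would sweep the full link geometry rather than a sphere, but the update rule — trajectory plus robot geometry implies free volume — is the same.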
Validation of a pose of a robot and of sensor data of a sensor moved along with the robot
A method of validating a pose of a robot and/or of sensor data of a sensor moved along with the robot is provided, wherein a robot controller determines the real pose of the robot and the sensor measures real sensor data. In this respect, a robot simulation determines a simulated pose of the robot by a simulated movement of the robot, and a sensor simulation determines simulated sensor data of the sensor by a simulated sensor measurement. The validation takes place by at least one comparison of the real pose and the simulated pose of the robot, of real sensor data and simulated sensor data, and/or of simulated sensor data among one another.
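One such comparison can be sketched as below. The tolerances, data shapes, and pass/fail rule are assumptions for illustration, not values from the patent.

```python
import math

POSE_TOL = 0.01     # metres, hypothetical pose tolerance
SENSOR_TOL = 0.05   # sensor units, hypothetical data tolerance

def validate(real_pose, sim_pose, real_data, sim_data):
    """Pass only if pose and sensor data each agree within tolerance."""
    pose_err = math.dist(real_pose, sim_pose)             # Euclidean pose error
    sensor_err = max(abs(r - s) for r, s in zip(real_data, sim_data))
    return pose_err <= POSE_TOL and sensor_err <= SENSOR_TOL

ok = validate((0.5, 0.2, 0.9), (0.502, 0.2, 0.9),
              [1.0, 2.0, 3.0], [1.01, 2.0, 3.02])
```

A mismatch in either comparison flags the pose or the sensor as invalid, which is the point of running the simulation alongside the real system.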
MOBILE MANIPULATOR, METHOD FOR CONTROLLING MOBILE MANIPULATOR, AND PROGRAM THEREFOR
A mobile manipulator includes a moving apparatus, a manipulator that is connected to the moving apparatus, a controller configured to control the moving apparatus and the manipulator, and an environment acquisition sensor configured to acquire predetermined environmental data originating from an environment at the movement destination to which the mobile manipulator is moved by the moving apparatus, in association with a position at the movement destination. The controller controls at least one of the moving apparatus and the manipulator based on the environmental data.
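The position-keyed association described above can be sketched minimally. The class name, reading format, and positions are hypothetical; the point is only that environmental data is stored against the destination position and looked up by the controller.

```python
class EnvironmentMap:
    """Store environmental readings keyed by the acquisition position."""

    def __init__(self):
        self._data = {}

    def record(self, position, reading):
        # Associate an environmental reading with a destination position.
        self._data[position] = reading

    def lookup(self, position):
        # Return the reading for a destination, or None if never visited.
        return self._data.get(position)

env = EnvironmentMap()
env.record((2, 3), {"floor_tilt_deg": 1.5})   # hypothetical sensor reading
tilt = env.lookup((2, 3))["floor_tilt_deg"]
# The controller could then compensate the manipulator pose using `tilt`.
```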
INTEGRATED ROBOTIC SYSTEM AND METHOD FOR AUTONOMOUS VEHICLE MAINTENANCE
A robotic system includes a controller configured to obtain image data from one or more optical sensors and to determine one or more of a location and/or pose of a vehicle component based on the image data. The controller also is configured to determine a model of an external environment of the robotic system based on the image data and to determine tasks to be performed by components of the robotic system to perform maintenance on the vehicle component. The controller also is configured to assign the tasks to the components of the robotic system and to communicate control signals to the components of the robotic system to autonomously control the robotic system to perform the maintenance on the vehicle component.
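The task-assignment step can be sketched as a capability match. The task names, component names, and capabilities below are invented for illustration; the abstract does not specify the assignment policy.

```python
def assign_tasks(tasks, components):
    """Map each task to the first idle component with the needed capability."""
    assignments, busy = {}, set()
    for task, capability in tasks:
        for name, caps in components.items():
            if capability in caps and name not in busy:
                assignments[task] = name
                busy.add(name)
                break
    return assignments

# Hypothetical maintenance scenario.
components = {"arm_1": {"grip", "unscrew"}, "arm_2": {"inspect"}}
tasks = [("remove_panel", "unscrew"), ("check_weld", "inspect")]
plan = assign_tasks(tasks, components)
# One control signal per assignment, as in the abstract's final step.
signals = [f"{comp}:{task}" for task, comp in plan.items()]
```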
MODIFYING ROBOT DYNAMICS IN RESPONSE TO HUMAN PRESENCE
A robot system models the behavior of a user when the user occupies an operating zone associated with a robot. The robot system predicts future behaviors of the user, and then determines whether those predicted behaviors interfere with anticipated behaviors of the robot. When such interference may occur, the robot system generates dynamics adjustments that can be implemented by the robot to avoid such interference. The robot system may also generate dynamics adjustments that can be implemented by the user to avoid such interference.
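The predict-then-adjust loop can be sketched in a simplified form: constant-velocity 2D predictions for both agents, and a speed scale as the dynamics adjustment. The horizon, time step, and safe distance are assumed values.

```python
def predict(start, velocity, steps, dt=0.1):
    """Constant-velocity prediction of future 2D positions."""
    x, y = start
    vx, vy = velocity
    return [(x + vx * t * dt, y + vy * t * dt) for t in range(steps)]

def adjust_speed(robot_pos, robot_vel, user_pos, user_vel,
                 horizon=20, safe_dist=0.5):
    """Return a speed scale in [0, 1]; 1.0 means no adjustment needed."""
    robot_future = predict(robot_pos, robot_vel, horizon)
    user_future = predict(user_pos, user_vel, horizon)
    # Closest approach over the prediction horizon, step by step.
    min_d = min(((rx - ux) ** 2 + (ry - uy) ** 2) ** 0.5
                for (rx, ry), (ux, uy) in zip(robot_future, user_future))
    return 1.0 if min_d > safe_dist else min_d / safe_dist

# Robot and user heading straight at each other: scale drops to zero.
scale = adjust_speed((0, 0), (1, 0), (2, 0), (-1, 0))
```

A real system would use learned behavior models rather than constant velocity, but the structure — predict both trajectories, test for interference, emit an adjustment — matches the abstract.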
Method and system for predicting a collision free posture of a kinematic system
A system and a method predict a collision free posture of a kinematic system. The method includes: receiving a 3D virtual environment, receiving a 3D representation of the kinematic system and a set of 3D postures defined for the 3D virtual kinematic system, receiving a target task to be performed by the kinematic system with respect to the surrounding environment, and receiving a prescribed location within the 3D virtual environment. The prescribed location defines a position at which the 3D virtual kinematic system has to be placed within the 3D virtual environment. A collision free detection function (CFD) is applied to a set of input data containing the 3D virtual environment, the target task, the prescribed location and the set of postures. The CFD function outputs a set of collision free postures enabling the kinematic system to perform the target task when located at the prescribed location.
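The CFD function's filtering role can be sketched under strong simplifications: postures reduced to 3D points, obstacles to axis-aligned boxes, and task feasibility to a reach threshold around the target. All of these are assumptions, not the patent's representation.

```python
def in_box(point, box):
    """Axis-aligned containment test; box is ((x_lo, x_hi), (y_lo, y_hi), (z_lo, z_hi))."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, box))

def cfd(environment, postures, target, reach=1.0):
    """Keep postures that hit no obstacle and lie within `reach` of the target."""
    free = []
    for p in postures:
        if any(in_box(p, box) for box in environment):
            continue  # posture collides with an obstacle
        dist = sum((a - b) ** 2 for a, b in zip(p, target)) ** 0.5
        if dist <= reach:
            free.append(p)
    return free

obstacles = [((0, 1), (0, 1), (0, 1))]                       # one unit box
candidates = [(0.5, 0.5, 0.5), (2.0, 2.0, 2.0), (5.0, 5.0, 5.0)]
result = cfd(obstacles, candidates, target=(2.0, 2.0, 2.5))
```

The input mirrors the abstract: an environment, a posture set, a target, and an (implicit) placement; the output is the collision free subset that can still perform the task.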
SIMULATION DEVICE
Provided is a simulation device capable of efficiently adjusting the position of a visual sensor model. The simulation device includes: a virtual space generation unit that generates a virtual space three-dimensionally representing a work space; a measurement target model arrangement unit that arranges in the virtual space a measurement target model three-dimensionally representing a measurement target; a measurement portion specification unit that specifies a measurement portion in the measurement target model; a visual sensor model arrangement unit that arranges the visual sensor model, three-dimensionally representing a visual sensor for imaging the measurement target, at an arbitrary visual sensor model position; and a position determination unit that determines, on the basis of the position of the measurement portion, a visual sensor model position at which the measurement portion falls within the imaging range of the visual sensor model, as the arrangement position of the visual sensor model.
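The position-determination test can be sketched with a pinhole-style view cone. The viewing direction, half-angle, and range are assumed parameters, not values from the patent.

```python
import math

def in_fov(sensor_pos, target, half_angle_deg=30.0, max_range=2.0):
    """Sensor looks along +z; target must lie inside the view cone and in range."""
    dx, dy, dz = (t - s for t, s in zip(target, sensor_pos))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dz <= 0 or dist > max_range:
        return False            # behind the sensor, or too far to image
    angle = math.degrees(math.acos(dz / dist))
    return angle <= half_angle_deg

# Candidate sensor-model positions; the measurement portion sits at the origin.
candidates = [(0, 0, -5), (0, 0, -1), (3, 0, -1)]
target = (0, 0, 0)
placement = next(p for p in candidates if in_fov(p, target))
```

The first candidate in range whose cone contains the measurement portion becomes the arrangement position, mirroring the position determination unit above.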
Methods and systems for testing robotic systems in an integrated physical and simulated environment
Methods and systems for testing robotic systems in an environment blending both physical and virtual test environments are presented herein. A realistic, three dimensional physical environment for testing and evaluating a robotic system is augmented with simulated, virtual elements. In this manner, robotic systems, humans, and other machines dynamically interact with both real and virtual elements. In one aspect, a model of a physical test environment and a model of a virtual test environment are combined, and signals indicative of a state of the combined model are employed to control a robotic system. In a further aspect, a mobile robot present in a physical test environment is commanded to emulate movements of a virtual robot under control. In another further aspect, images of the virtual robot under control are projected onto the physical test environment to provide a visual representation of the presence and action taken by the virtual robot.
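The combined-model idea — a virtual element influencing a real controller exactly as a physical one would — can be sketched as below. The state dictionaries and the stop-distance rule are assumptions for illustration.

```python
def combine(physical, virtual):
    """Merge obstacle lists from both models into one blended world state."""
    return {"obstacles": physical["obstacles"] + virtual["obstacles"]}

def control_signal(robot_pos, world, stop_dist=1.0):
    """Command 'stop' if any obstacle, real or virtual, is too close."""
    def d(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    near = any(d(robot_pos, o) < stop_dist for o in world["obstacles"])
    return "stop" if near else "go"

world = combine({"obstacles": [(5.0, 5.0)]},     # physical test environment
                {"obstacles": [(0.5, 0.0)]})     # virtual element near robot
cmd = control_signal((0.0, 0.0), world)
```

Here the virtual obstacle alone triggers the stop, which is the essence of blending simulated elements into a physical test.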
ROBOTIC ARM PROCESSING METHOD AND SYSTEM BASED ON 3D IMAGE
A robotic arm processing method and system based on a 3D image are provided. The processing method includes: providing robotic arm 3D model data and processing environment 3D model data; obtaining workpiece 3D model data, and generating a processing path consisting of contact points according to the workpiece 3D model data, wherein a free end of a robotic arm moves along the processing path to complete a processing procedure; generating a posture candidate group for each one of the contact points according to a relationship between the contact point and the free end of the robotic arm; selecting an actual moving posture from the posture candidate group; moving the free end of the robotic arm to each corresponding one of the contact points according to the selected actual moving posture; and moving the free end of the robotic arm along the processing path according to the actual moving postures to perform the processing procedure.
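The candidate-selection step can be sketched with a greedy rule — pick, at each contact point, the candidate posture closest to the previous one so the arm moves smoothly. The 2-joint postures and the minimum-change criterion are assumptions; the abstract does not fix the selection rule.

```python
def select_postures(candidate_groups, start_posture):
    """Greedy selection: minimise joint-space change between contact points."""
    chosen, prev = [], start_posture
    for group in candidate_groups:
        best = min(group,
                   key=lambda p: sum(abs(a - b) for a, b in zip(p, prev)))
        chosen.append(best)
        prev = best               # next choice is measured from this posture
    return chosen

groups = [
    [(10, 20), (100, 120)],   # hypothetical candidates for contact point 1
    [(15, 25), (110, 130)],   # hypothetical candidates for contact point 2
]
path_postures = select_postures(groups, start_posture=(0, 0))
```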
ROBOT SYSTEM, CONTROL APPARATUS OF ROBOT SYSTEM, CONTROL METHOD OF ROBOT SYSTEM, IMAGING APPARATUS, AND STORAGE MEDIUM
A robot system including a robot apparatus and an imaging apparatus further includes a control apparatus configured to control the robot apparatus and the imaging apparatus, and the control apparatus controls, based on a path in which a predetermined part of the robot apparatus is moved, a movement of the imaging apparatus so as to keep imaging the predetermined part even while the robot apparatus is moving.
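The path-based imaging control can be sketched in a reduced, pan-only form: given the planned path of the robot part, compute the pan angle that keeps a fixed camera aimed at the part at each step. The planar geometry and positions are assumptions for illustration.

```python
import math

def pan_angles(camera_pos, part_path):
    """Pan angle (degrees) toward each planned part position, from the camera."""
    cx, cy = camera_pos
    return [math.degrees(math.atan2(py - cy, px - cx)) for px, py in part_path]

# Hypothetical planned positions of the predetermined part.
path = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
angles = pan_angles((0.0, 0.0), path)
```

Because the angles are derived from the path itself, the imaging apparatus tracks the part throughout the motion, as the abstract describes.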