Patent classifications
G05B2219/40577
Environment-based-threat alerting to user via mobile phone
Embodiments of the present disclosure relate to a method and an apparatus for alerting users to threats. The apparatus may capture a plurality of signals including at least one of Electro-Magnetic (E-M) signals and sound signals. The E-M signals and sound signals are used to detect objects around the user. A threat to the user is predicted based on the detected objects, and one or more alerts are generated so that the user can avoid the threat. Because the threat is predicted rather than merely detected, the user can take action before the threat occurs, and the alerts give the user warning well in advance of its occurrence.
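The predict-then-alert flow described in this abstract can be sketched in a few lines. This is purely an illustration, not the patent's implementation: the object model, the linear closing-distance projection, and the horizon and distance thresholds are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str
    distance_m: float
    approach_speed_mps: float  # positive means the object is closing in

def predict_threat(objects, horizon_s=3.0, min_safe_distance_m=1.0):
    """Flag objects projected to come within the safe distance
    inside the prediction horizon (a simple linear extrapolation)."""
    threats = []
    for obj in objects:
        projected = obj.distance_m - obj.approach_speed_mps * horizon_s
        if projected < min_safe_distance_m:
            threats.append(obj)
    return threats

def generate_alerts(threats):
    # In the disclosure the alert would go to the user's mobile phone;
    # here it is just a formatted string.
    return [f"ALERT: {t.label} approaching, {t.distance_m:.1f} m away"
            for t in threats]
```

The key property the abstract emphasizes is that the alert fires on the *projected* position, so the user is warned before the threat materializes.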
HRC System And Method For Controlling An HRC System
A method for controlling a human-robot collaboration (HRC) system, wherein the HRC system includes at least one manipulator having an end effector. The method includes operating the end effector in a first operating mode at reduced power; monitoring whether a desired object is being manipulated while the end effector is used in the first operating mode; and increasing the power used to operate the end effector, placing it in a second operating mode, when the monitoring indicates that the desired object is being manipulated.
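The two-mode power logic in this abstract amounts to a small state machine. The sketch below is a minimal interpretation; the class, the power levels, and the boolean "desired object detected" input are assumptions for illustration only.

```python
from enum import Enum

class Mode(Enum):
    REDUCED_POWER = 1   # first operating mode: safe around humans
    FULL_POWER = 2      # second operating mode: full-force manipulation

class EndEffectorController:
    def __init__(self, reduced_power=0.2, full_power=1.0):
        self.mode = Mode.REDUCED_POWER
        self.reduced_power = reduced_power
        self.full_power = full_power

    @property
    def power(self):
        return (self.reduced_power if self.mode is Mode.REDUCED_POWER
                else self.full_power)

    def update(self, desired_object_detected: bool):
        # Switch to full power only once monitoring confirms the
        # desired object is actually being manipulated.
        if self.mode is Mode.REDUCED_POWER and desired_object_detected:
            self.mode = Mode.FULL_POWER
        return self.power
```

The design point is that full power is gated on confirmation from the monitoring step, so unintended contact in the first mode happens only at reduced power.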
Safe motion planning for machinery operation
Systems and methods monitor a workspace for safety purposes using sensors distributed about the workspace. The sensors are registered with respect to each other, and this registration is monitored over time. Occluded space as well as occupied space is identified, and this mapping is frequently updated. Based on the mapping, a constrained motion plan of machinery can be generated to ensure safety.
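The occupied/occluded mapping and the safety constraint it imposes on motion planning can be illustrated with an occupancy grid. This is a sketch under assumptions (a 2-D grid, a visibility mask, and a swept-cell safety check), not the patented method.

```python
import numpy as np

FREE, OCCUPIED, OCCLUDED = 0, 1, 2

def classify_workspace(grid_shape, occupied_cells, sensor_visible_mask):
    """Label each cell FREE, OCCUPIED, or OCCLUDED from fused sensor data.
    Cells no sensor can see are marked OCCLUDED."""
    grid = np.full(grid_shape, FREE, dtype=np.uint8)
    grid[~sensor_visible_mask] = OCCLUDED
    for cell in occupied_cells:
        grid[cell] = OCCUPIED
    return grid

def is_motion_safe(grid, swept_cells):
    """A planned motion is safe only if every cell it sweeps is known
    FREE; occluded space is treated conservatively, like occupied space."""
    return all(grid[c] == FREE for c in swept_cells)
```

Treating occluded space as unsafe is the conservative choice the abstract implies: the planner constrains machinery to volumes the sensors can positively verify as empty.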
DETECTING AND CLASSIFYING WORKSPACE REGIONS FOR SAFETY MONITORING
Systems and methods monitor a workspace for safety purposes using sensors distributed about the workspace. The sensors are registered with respect to each other, and this registration is monitored over time. Occluded space as well as occupied space is identified, and this mapping is frequently updated.
COMPUTER VISION ROBOT CONTROL
Aspects of the present disclosure relate to computer vision robot control. As an example, a user device may use one or more cameras to process received visual data and generate control instructions for a robot. In some examples, a camera may be part of the user device, remote from the user device, part of the robot, or any combination thereof. Control instructions may be based on facial recognition and/or object recognition, among other computer vision techniques. As a result, the user may be able to more directly interact with the robot and/or control the robot in ways that were not previously available using simple user input methods.
ROBOT LOCALIZATION IN A WORKSPACE VIA DETECTION OF A DATUM
An apparatus and method are disclosed for determining the position of a robot relative to objects in a workspace, using a camera, scanner, or other suitable device in conjunction with object recognition. The camera or scanner receives information from which a point cloud of the viewed scene can be developed; this point cloud is in a camera-centric frame of reference. Information about a known datum is compared to the point cloud through object recognition. For example, a link of the robot could serve as the identified datum: once it is recognized, the coordinates of the point cloud can be converted to a robot-centric frame of reference, since the position of the datum is known relative to the robot.
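The frame conversion described here is a standard rigid-body transform chain: if the datum's pose is observed in the camera frame and known in the robot frame, the camera-to-robot transform follows. A minimal sketch with homogeneous 4x4 transforms (the function names and pose representation are assumptions):

```python
import numpy as np

def pose_to_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_robot(points_cam, T_cam_datum, T_robot_datum):
    """Convert a camera-centric point cloud to the robot frame, given the
    datum's pose as seen by the camera (T_cam_datum) and as known a priori
    to the robot (T_robot_datum)."""
    T_robot_cam = T_robot_datum @ np.linalg.inv(T_cam_datum)
    pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_robot_cam @ pts_h.T).T[:, :3]
```

The datum acts as the shared anchor: `T_robot_cam = T_robot_datum @ inv(T_cam_datum)` exists precisely because the same physical feature is localized in both frames.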
Robot end-effector sensing and identification
Systems and methods for identifying a robot end effector in a processing environment may utilize one or more sensors that digitally record visual information and provide it to an industrial workflow. The sensor(s) may be positioned to record at least one image of the robot, including the end effector. A processor may determine the identity of the end effector from the recorded image(s) and a library or database of stored digital models.
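The match-against-a-model-library step can be sketched as a nearest-neighbor lookup over feature vectors. This is an illustrative assumption: the abstract does not specify the matching method, and the feature representation here is hypothetical.

```python
import numpy as np

def identify_end_effector(observed_features, model_library):
    """Return the library model whose stored feature vector is closest
    (Euclidean distance) to the features extracted from the image."""
    best_id, best_dist = None, float("inf")
    for model_id, model_features in model_library.items():
        d = np.linalg.norm(np.asarray(observed_features)
                           - np.asarray(model_features))
        if d < best_dist:
            best_id, best_dist = model_id, d
    return best_id
```

In practice the "features" would come from an image-processing pipeline over the recorded image(s); the lookup structure is what lets the workflow confirm which tool is actually mounted.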
DYNAMICALLY DETERMINING AND MONITORING WORKSPACE SAFE ZONES USING SEMANTIC REPRESENTATIONS OF WORKPIECES
Embodiments of the present invention determine the configuration of a workpiece and whether it is actually being handled by a monitored piece of machinery, such as a robot. The problem solved by the invention is especially challenging in real-world factory environments because many objects, most of which are not workpieces, may be in proximity to the machinery. Accordingly, embodiments of the invention utilize semantic understanding to distinguish between workpieces that may become associated with the robot and other objects (and humans) in the workspace that will not, and detect when the robot is carrying a workpiece.
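The semantic filtering this abstract describes, distinguishing a carried workpiece from other nearby objects and humans, can be sketched as a two-condition check. The class labels, attachment threshold, and data model below are assumptions for illustration, not details from the patent.

```python
from dataclasses import dataclass

# Hypothetical semantic labels a perception system might output.
WORKPIECE_CLASSES = {"sheet_metal_blank", "engine_block"}

@dataclass
class TrackedObject:
    label: str
    distance_to_gripper_m: float

def detect_carried_workpiece(objects, attach_threshold_m=0.05):
    """Return the object the robot is carrying, if any. It must be of a
    known workpiece class AND effectively attached to the gripper;
    humans and non-workpiece objects near the robot are ignored."""
    for obj in objects:
        if (obj.label in WORKPIECE_CLASSES
                and obj.distance_to_gripper_m <= attach_threshold_m):
            return obj
    return None
```

The safe zone around the robot would then be extended only while this function returns an object, which is the core idea: semantic understanding prevents every nearby object from inflating the monitored envelope.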