Patent classifications
G05B2219/40625
Tactile sensor system and method for inspecting the condition of a structure
In a computer-implemented method and system for capturing the condition of a structure, the structure is scanned with a three-dimensional (3D) contact scanner. The scanner includes a tactile sensor system having at least one tactile sensor for generating 3D data points based on tactile feedback resulting from physical contact with at least part of the structure. A 3D model is constructed from the 3D data points and is then analyzed to determine the condition of the structure.
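The pipeline in the abstract above (tactile contacts → 3D model → condition analysis) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the flat reference plane, and the tolerance threshold are all assumptions.

```python
import numpy as np

def build_model(contact_points):
    """Assemble a 3D point-cloud model from tactile contact points."""
    return np.asarray(contact_points, dtype=float)

def assess_condition(model, reference_z=0.0, tolerance=0.5):
    """Flag the structure as 'damaged' when any scanned point deviates
    from an expected flat reference surface z = reference_z by more
    than `tolerance` (in the length units of the scan)."""
    deviations = np.abs(model[:, 2] - reference_z)
    return "damaged" if deviations.max() > tolerance else "intact"
```

A real inspection system would compare against a full reference model rather than a plane, but the structure of the analysis step is the same.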
TACTILE, INTERACTIVE NEUROMORPHIC ROBOTS
In one embodiment, a neuromorphic robot includes a curved outer housing that forms a continuous curved outer surface, a plurality of trackball touch sensors provided on and extending across the continuous curved outer surface in an array, each trackball sensor being configured to detect a direction and velocity of a sweeping stroke of a user, and a plurality of lights, one light being collocated with each trackball touch sensor and being configured to illuminate when its collocated trackball touch sensor is stroked by the user, wherein the robot is configured to interpret the sweeping stroke of the user sensed with the plurality of trackball touch sensors and to provide immediate visual feedback to the user at the locations of the touched trackball touch sensors.
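A minimal sketch of how such a trackball array might be read out: each touched sensor reports a (dx, dy) roll, the readings are averaged into one stroke direction, and the touched indices are returned so their collocated lights can be lit. The names and the averaging scheme are assumptions, not taken from the patent.

```python
def interpret_stroke(events):
    """Interpret a sweeping stroke from trackball readings.

    `events` is a list of (sensor_index, (dx, dy)) pairs, one entry per
    trackball the stroke touched.  Returns the averaged stroke direction
    and the indices whose collocated lights should illuminate."""
    if not events:
        return (0.0, 0.0), []
    lit = [index for index, _ in events]
    dx = sum(roll[0] for _, roll in events) / len(events)
    dy = sum(roll[1] for _, roll in events) / len(events)
    return (dx, dy), lit
```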
Systems and methods for visuo-tactile object pose estimation
Systems and methods for visuo-tactile object pose estimation are provided. In one embodiment, a computer-implemented method includes receiving image data, depth data, and tactile data about the object in the environment. The computer-implemented method also includes generating a visual estimate of the object that includes an object point cloud. The computer-implemented method further includes generating a tactile estimate of the object that includes a surface point cloud based on the tactile data. The computer-implemented method yet further includes estimating a pose of the object based on the visual estimate and the tactile estimate by fusing the object point cloud and the surface point cloud in a 3D space. The pose is a six-dimensional pose.
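The fusion and pose steps can be illustrated with a toy example: merge the two point clouds in one 3D frame, then recover a rigid pose (rotation R plus translation t, i.e., six degrees of freedom) with the standard Kabsch algorithm. This sketch assumes known point-to-point correspondences to a model cloud, which a real system would have to establish; the function names are hypothetical.

```python
import numpy as np

def fuse_point_clouds(visual_pts, tactile_pts):
    """Merge the vision-derived object cloud and the tactile surface
    cloud into one set of points in a shared 3D frame."""
    return np.vstack([np.asarray(visual_pts), np.asarray(tactile_pts)])

def estimate_pose(model_pts, observed_pts):
    """Kabsch algorithm: find rotation R and translation t such that
    observed ≈ R @ model + t, given point correspondences."""
    model_pts = np.asarray(model_pts, dtype=float)
    observed_pts = np.asarray(observed_pts, dtype=float)
    mc, oc = model_pts.mean(0), observed_pts.mean(0)
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```

In practice the correspondence problem is solved iteratively (e.g., ICP-style), with the tactile surface points constraining faces the camera cannot see.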
Haptic photogrammetry in robots and methods for operating the same
Robots, robot systems, and methods for operating the same based on environment models including haptic data are described. An environment model which includes representations of objects in an environment is accessed, and a robot system is controlled based on the environment model. The environment model includes haptic data, which provides more effective control of the robot. The environment model is populated based on visual profiles, haptic profiles, and/or other data profiles for objects or features retrieved from respective databases. Identification of objects or features can be based on cross-referencing between visual and haptic profiles, to populate the environment model with data not directly collected by a robot which is populating the model, or data not directly collected from the actual objects or features in the environment.
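A sketch of how cross-referencing between profile databases might populate an environment model (the database contents and keys are invented for illustration): a visually identified object ID pulls in the matching haptic profile, giving the model haptic data the robot never collected itself.

```python
# Invented example databases keyed by object ID.
VISUAL_PROFILES = {"mug_01": {"shape": "cylinder", "color": "white"}}
HAPTIC_PROFILES = {"mug_01": {"stiffness": 0.8, "friction": 0.4}}

def populate_environment_model(identified_ids):
    """For each visually identified object, cross-reference its haptic
    profile so the environment model carries haptic data that was not
    directly collected by the robot."""
    model = {}
    for oid in identified_ids:
        entry = dict(VISUAL_PROFILES.get(oid, {}))
        entry.update(HAPTIC_PROFILES.get(oid, {}))
        model[oid] = entry
    return model
```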
Tactile, interactive neuromorphic robots
In one embodiment, a neuromorphic robot includes a curved outer housing, and multiple touch sensors provided on the outer housing, wherein the robot is configured to interpret a touch of a user sensed with the touch sensors.
METHOD FOR UPDATING A SCENE REPRESENTATION MODEL
A computer-implemented method for updating a scene representation model is disclosed. The method comprises obtaining a scene representation model representing a scene having one or more objects, the scene representation model being configured to predict a value of a physical property of one or more of the objects; obtaining a value of the physical property of at least one of the objects, the obtained value being derived from a physical contact of a robot with the at least one object; and updating the scene representation model based on the obtained value. An apparatus is also disclosed.
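The update step can be sketched as a fixed-gain scalar correction, in the spirit of a Kalman update: blend the contact-derived measurement into the model's current prediction. The dictionary-based model and the gain value are assumptions for illustration, not the disclosed method.

```python
def update_scene_model(model, key, measured_value, gain=0.5):
    """Correct the model's predicted physical property (e.g. stiffness)
    toward a value measured through physical contact, using a
    fixed-gain scalar update: p <- p + gain * (measurement - p)."""
    predicted = model[key]
    model[key] = predicted + gain * (measured_value - predicted)
    return model
```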
Control device, control method, and program
The present disclosure relates to a control device, a control method, and a program capable of supporting an object with a more appropriate supporting force. The control device includes a supporting force control unit that controls a supporting force for supporting an object on the basis of information regarding a shape of a contact portion in contact with the object and information regarding a shear force of the contact portion. The information regarding the shear force includes, for example, information regarding a shear displacement of the contact portion. The present disclosure can be applied to, for example, a control device, a control method, an electronic device, a robot, a support system, a gripping system, a program, and the like.
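A sketch of shear-based grip control consistent with the idea above: when the measured shear displacement of the contact portion exceeds a slip threshold, the supporting force is increased; otherwise it is relaxed slowly toward a minimum so the object is not over-gripped. All thresholds and gains are invented for illustration.

```python
def supporting_force(current_force, shear_displacement,
                     slip_threshold=0.1, increment=0.5, min_force=1.0):
    """Raise the supporting force when shear displacement indicates
    incipient slip; otherwise relax it slowly toward `min_force`."""
    if abs(shear_displacement) > slip_threshold:
        return current_force + increment
    return max(min_force, current_force - 0.1 * increment)
```

Run in a loop against fresh sensor readings, this converges toward the smallest force that keeps shear displacement below the threshold.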
Robotic control device and method for manipulating a hand-held tool
Described is a robotic control device for manipulating a gripper-held tool. The device includes a robotic gripper having a plurality of tactile sensors. Each sensor generates tactile sensory data upon grasping a tool based on the interface between the tool and the corresponding tactile sensor. In operation, the device causes the gripper to grasp a tool and move the tool into contact with a surface. A control command is used to cause the gripper to perform a pseudo-random movement with the tool against the surface to generate tactile sensory data. A dimensionality reduction is performed on the tactile sensory data to generate a low-dimensional representation of the tactile sensory data, which is then associated with the control command to generate a sensory-motor mapping. A series of control commands can then be generated in a closed-loop based on the sensory-motor mapping to manipulate the tool against the surface.
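The mapping stage can be sketched with principal component analysis (via SVD) for the dimensionality reduction and a nearest-neighbour lookup for the sensory-motor mapping. The abstract does not specify these particular techniques, so treat this as one plausible instantiation with hypothetical names.

```python
import numpy as np

def pca_reduce(tactile_data, k=2):
    """Project high-dimensional tactile samples (rows) onto their
    top-k principal components via SVD."""
    centered = tactile_data - tactile_data.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:k].T

class SensoryMotorMap:
    """Associate low-dimensional tactile signatures with the control
    commands that produced them; query by nearest neighbour."""
    def __init__(self):
        self.pairs = []

    def record(self, signature, command):
        self.pairs.append((np.asarray(signature, dtype=float), command))

    def command_for(self, signature):
        s = np.asarray(signature, dtype=float)
        return min(self.pairs, key=lambda p: np.linalg.norm(p[0] - s))[1]
```

During the pseudo-random exploration phase each (reduced signature, command) pair is recorded; in closed-loop operation, `command_for` picks the command whose recorded tactile signature best matches the current one.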