Patent classifications: G05B2219/40114
TELEPRESENCE ROBOTS HAVING COGNITIVE NAVIGATION CAPABILITY
The embodiments of the present disclosure address the unresolved problem of cognitive navigation strategies for a telepresence robotic system. This includes instructing the robot remotely over a network to go to a point in an indoor space, to go to an area, or to go to an object. In addition, human-robot interaction for giving and understanding instructions is typically not integrated into a common telepresence framework. The embodiments herein provide a telepresence robotic system empowered with smart navigation based on in situ intelligent visual semantic mapping of the live scene captured by a robot. The disclosure further presents an edge-centric software architecture of a teledrive comprising a speech-recognition-based HRI, a navigation module, and a real-time WebRTC-based communication framework that holds the entire telepresence robotic system together. Additionally, the disclosure provides robot-independent API calls via a ROS device driver, making the offering hardware independent and capable of running on any robot.
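The three remote instruction types (go to a point, an area, or an object) can be illustrated with a minimal, hypothetical dispatcher that maps a recognized utterance to a goal in the semantic map. The `NavGoal` type and `parse_instruction` function are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: map a spoken instruction to a navigation goal
# resolved against the robot's visual semantic map.
from dataclasses import dataclass


@dataclass
class NavGoal:
    kind: str    # "point", "area", or "object"
    target: str  # label in the semantic map (e.g., "kitchen", "chair")


def parse_instruction(text: str) -> NavGoal:
    """Classify a remote instruction into one of the three goal kinds."""
    text = text.lower()
    if text.startswith("go to point "):
        return NavGoal("point", text.removeprefix("go to point "))
    if text.startswith("go to area "):
        return NavGoal("area", text.removeprefix("go to area "))
    if text.startswith("go to "):
        return NavGoal("object", text.removeprefix("go to "))
    raise ValueError(f"unrecognized instruction: {text!r}")
```

In a full system, the resulting goal would be translated into robot-independent ROS navigation calls; here only the instruction classification step is sketched.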
Automated drywall painting system and method
An automated painting system that includes a robotic arm and a painting end effector coupled at a distal end of the robotic arm, with the painting end effector configured to apply paint to a target surface. The painting system can also include a computing device executing a computational planner that: generates instructions for driving the painting end effector and robotic arm to perform at least one painting task that includes applying paint, via the painting end effector, to a plurality of drywall pieces, the generating based at least in part on obtained target surface data; and drives the end effector and robotic arm to perform the at least one painting task.
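One way a planner can turn target surface data into drive instructions is a back-and-forth coverage path over the surface. The following is a minimal sketch under that assumption; the function name and parameters are illustrative, not taken from the patent:

```python
# Hypothetical sketch: generate boustrophedon (back-and-forth) waypoints
# covering a rectangular target surface, row by row.
def spray_path(width: float, height: float, pass_spacing: float):
    """Yield (x, y) waypoints for the painting end effector."""
    y = 0.0
    left_to_right = True
    while y <= height:
        xs = (0.0, width) if left_to_right else (width, 0.0)
        yield (xs[0], y)  # start of the pass
        yield (xs[1], y)  # end of the pass
        y += pass_spacing
        left_to_right = not left_to_right  # alternate direction each row
```

A real planner would additionally account for arm reach, overspray, and obtained surface geometry; this shows only the coverage-ordering idea.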
INFORMATION PROCESSING DEVICE, CONTROL METHOD, AND STORAGE MEDIUM
The information processing device 1A mainly includes a logical formula conversion unit 322A, a constraint condition information acquisition unit 323A, and a constraint condition addition unit 324A. The logical formula conversion unit 322A is configured to convert an objective task, which is a task to be performed by a robot, into a logical formula that is based on a temporal logic. The constraint condition information acquisition unit 323A is configured to acquire constraint condition information I2 indicative of a constraint condition to be satisfied in performing the objective task. The constraint condition addition unit 324A is configured to generate a target logical formula Ltag that is a logical formula obtained by adding a proposition indicative of the constraint condition to the logical formula generated by the logical formula conversion unit 322A.
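The constraint addition step can be pictured as conjoining the objective task's temporal-logic formula with propositions for each constraint. This is a minimal string-level sketch, assuming an LTL-style syntax; the function name and operators are illustrative:

```python
# Hypothetical sketch: build a target formula Ltag by conjoining the
# objective's temporal-logic formula with constraint propositions.
def to_target_formula(objective: str, constraints: list[str]) -> str:
    """Return the conjunction of the objective formula and all constraints."""
    parts = [f"({objective})"] + [f"({c})" for c in constraints]
    return " & ".join(parts)
```

For example, an objective "eventually reach the goal" with a safety constraint "always avoid collision" might yield `(F goal) & (G !collision)`, which a downstream solver could then compile into robot motions.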
AUTOMATED WALL FINISHING SYSTEM AND METHOD
A method of generating a building assembly that includes spraying a coating material onto a plurality of pieces of substrate disposed on a first assembly face. The spraying includes spraying the coating material onto the plurality of pieces of substrate via a sprayer configured to apply the coating material to a target surface via a nozzle coupled with a mobile storage container storing the coating material, the coating material impregnating voids of the substrate. The method also includes allowing the coating material impregnating the voids to dry, harden, and become rigid to generate the building assembly.
Robot, control device, and robot system
A robot causes an image capturing device to capture an image of a container in which a plurality of targets are placed so as to partially overlap one another, the plurality of targets including components of mutually different types and a component kit in which two or more of the components are assembled together. The robot detects the types, positions, and poses of the plurality of targets based on the captured image, determines a priority order of one or more component kits to be assembled from the targets placed in the container according to the detected types, positions, and poses, and assembles the component kit selected based on the determined priority order.
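The priority-ordering step can be sketched as sorting detections by a pickability score derived from the detected poses. The `Detection` fields and the scoring weights below are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch: rank detected targets so that visible targets near
# the top of the pile are assembled first.
from dataclasses import dataclass


@dataclass
class Detection:
    kind: str         # detected type, e.g. "screw" or "kit"
    occlusion: float  # 0.0 (fully visible) .. 1.0 (fully covered)
    depth: float      # stacking depth within the container


def priority(d: Detection) -> float:
    """Lower score = higher priority; penalize occlusion and depth."""
    return d.occlusion + 0.5 * d.depth


def pick_order(detections: list[Detection]) -> list[Detection]:
    """Return detections sorted into assembly priority order."""
    return sorted(detections, key=priority)
```

A buried, mostly occluded component then sorts after an exposed one, matching the intuition that the robot should work from the top of the overlapping pile downward.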
AUTOMATED DRYWALL PLANNING SYSTEM AND METHOD
An automated drywalling system network that includes one or more automated drywalling systems that each has a robotic arm. The automated drywalling system network can also include a computational planner that generates instructions for the one or more automated drywalling systems to perform two or more drywalling tasks associated with a target wall assembly. The two or more drywalling tasks can include a hanging task that includes hanging pieces of drywall on studs of the target wall assembly; a mudding task that includes applying joint compound to pieces of drywall hung on studs of the target wall assembly; a sanding task that includes sanding joint compound applied to the pieces of drywall hung on studs of the target wall assembly; and a painting task that includes painting the sanded joint compound applied to the pieces of drywall hung on studs of the target wall assembly.
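The four drywalling tasks form a natural pipeline, since each depends on the previous (compound is applied to hung drywall, sanded once applied, painted once sanded). A minimal sketch of ordering a requested subset of tasks by that fixed dependency chain, with illustrative names:

```python
# Hypothetical sketch: order requested drywalling tasks by the fixed
# hang -> mud -> sand -> paint dependency pipeline.
PIPELINE = ["hanging", "mudding", "sanding", "painting"]


def plan(requested: set[str]) -> list[str]:
    """Return the requested tasks in valid execution order."""
    return [task for task in PIPELINE if task in requested]
```

A fuller planner would also assign tasks across the one or more drywalling systems in the network; this shows only the sequencing constraint among the tasks themselves.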
INFORMATION PROCESSING DEVICE, CONTROL METHOD, AND STORAGE MEDIUM
The information processing device 1B mainly includes an abstract model information acquisition unit 34X, a measurement information acquisition unit 34Y, and an abstract model generation unit 34Z. The abstract model information acquisition unit 34X is configured to acquire abstract model information I5 regarding an abstract model in which dynamics in a workspace 6 where a robot 5 performs an objective task is abstracted. The measurement information acquisition unit 34Y is configured to acquire measurement information Im indicating a measurement result in the workspace 6. The abstract model generation unit 34Z is configured to generate an abstract model Σ based on the abstract model information I5 and the measurement information Im.
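The generation of the abstract model Σ from the abstract model information I5 and the measurement information Im can be pictured as instantiating a dynamics template with measured state. The following sketch assumes simple discretized point dynamics; all names and fields are illustrative stand-ins:

```python
# Hypothetical sketch: combine abstract-model information (a dynamics
# template) with measurement information (measured positions) into an
# abstract model: an initial state plus a one-step transition function.
from dataclasses import dataclass


@dataclass
class AbstractModelInfo:  # stands in for I5
    dt: float             # discretization step of the abstracted dynamics


@dataclass
class Measurement:        # stands in for Im
    positions: dict[str, tuple[float, float]]  # measured object positions


def generate_abstract_model(info: AbstractModelInfo, meas: Measurement):
    """Return (initial state, transition function) as the abstract model."""
    def step(state: tuple[float, float], velocity: tuple[float, float]):
        x, y = state
        vx, vy = velocity
        return (x + info.dt * vx, y + info.dt * vy)
    return meas.positions, step
```

The measured positions seed the model's initial state, while the abstracted dynamics (here, a single Euler step) come from the template, mirroring how unit 34Z combines I5 with Im.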