Patent classifications
G05D2201/0209
ROBOT CONTROL METHOD, ROBOT, AND RECORDING MEDIUM
A robot control system acquires request information requesting a robot to keep within a prescribed distance from a user and to collect surrounding information, acquires, from the robot, location information indicating a location of the robot and confirmation information indicating that the robot is near the user, and transmits, to the robot, a command for changing a setting of the robot from a first specification to a second specification in response to the request in a case where it is determined, on the basis of map information, the location information, and the confirmation information, that the user of the robot is accompanying the robot outside a home area of the user, the first specification enabling the robot to collect the surrounding information inside the home area, and the second specification enabling the robot to collect the surrounding information outside the home area.
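The spec-switching condition in this abstract can be sketched as a simple rule: switch to the second specification only when the robot is both outside the home area and confirmed to be near the user. The sketch below is illustrative only; the names (`Spec`, `choose_spec`, the grid-cell model of the home area) are assumptions, not terms from the patent.

```python
from dataclasses import dataclass

@dataclass
class Spec:
    name: str
    collect_outside_home: bool

FIRST_SPEC = Spec("first", collect_outside_home=False)    # collect inside home area
SECOND_SPEC = Spec("second", collect_outside_home=True)   # collect outside home area

def choose_spec(robot_location, home_area, confirmed_near_user: bool) -> Spec:
    """Return the second specification only when the robot is outside the
    home area AND confirmed to be accompanying the user; otherwise keep
    the first specification."""
    outside_home = robot_location not in home_area
    if outside_home and confirmed_near_user:
        return SECOND_SPEC
    return FIRST_SPEC

# Example: home area modeled as a set of map cells.
home = {(0, 0), (0, 1), (1, 0), (1, 1)}
print(choose_spec((5, 5), home, True).name)   # outside home, user confirmed near
print(choose_spec((0, 1), home, True).name)   # inside home
```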
Autonomous robot
A robot in a location interacts with a user. The robot includes a camera, an image recognition processor, a microphone and a loudspeaker, a voice assistant, and a wireless transceiver. The robot moves around and creates a model of the location, and recognizes changes. It recognizes objects of interest, beings, and situations. The robot monitors the user and recognizes body language and gesture commands, as well as voice commands. The robot communicates with the user, the TV, and other devices. It may include environment sensors and health status sensors. It acts as a user companion by answering queries, executing commands, and issuing reminders. It may monitor to determine if the user is well. The robot may monitor objects of interest, their placement and their status. When necessary, it communicates with the user.
MOVING ROBOT AND METHOD OF CONTROLLING THE SAME
According to a moving robot and a method of controlling the same of the present disclosure, the moving robot detects a sound generated in the area, moves to the sound generation point according to the type of the sound and the operation mode, analyzes an image of the sound generation point, and determines the indoor situation to perform a corresponding operation. By detecting the sound, the moving robot can determine that an accident has occurred at the location at which the sound was generated and can automatically perform a specified operation corresponding to the accident even when there is no control command from a user, making it possible to respond rapidly to the accident. The moving robot can classify the object generating the sound as a person, a companion animal, or an object, and can perform different operations according to the classification.
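The classify-then-dispatch flow described here can be sketched as a small decision function. This is not the patent's implementation; the source classes and the operation names (`move_and_analyze`, `observe`, etc.) are assumptions for illustration.

```python
SOUND_SOURCES = ("person", "companion_animal", "object")

def dispatch(source: str, mode: str) -> str:
    """Pick an operation for a classified sound source under a given
    operation mode, mirroring the abstract's classify-then-act flow."""
    if source not in SOUND_SOURCES:
        raise ValueError(f"unknown sound source: {source}")
    if source == "person":
        # e.g. move to the sound generation point and analyze the scene
        return "move_and_analyze" if mode == "monitoring" else "notify_user"
    if source == "companion_animal":
        return "observe"
    return "inspect_object"

print(dispatch("person", "monitoring"))  # move_and_analyze
```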
TACTICAL ADVANCED ROBOTIC ENGAGEMENT SYSTEM
This invention describes a tactical advanced robotic engagement system (ARES) (100) for combat or rescue missions employing advanced electronics, AI, and AR capabilities. In ARES, a user carries a weapon or tool (102) equipped with a hand-operable controller (150) for controlling an associated UGV (170), UAV (180), or UUV. The UGV (170) provides a ground/home station for the UAV (180). The UGV or UAV is equipped with a camera (290) to obtain real-time photographs or videos and to relay them to a heads-up display (HUD) (110) mounted on the user's helmet (104). The HUD (110) system provides intuitive UIs (132) for communication with and navigation of the UGV and UAV; AR information reduces the visual, cognitive, and mental load on the user, thereby enhancing situational awareness and allowing the user to maintain heads-up, eyes-out, and hands-on-trigger readiness. The HUD (110) also provides intuitive UIs to connect with peers and/or a Command Centre (190).
SYSTEMS AND METHODS FOR USE OF AUTONOMOUS ROBOTS FOR PERIMETER PROTECTION
Systems and methods for use of autonomous robots for perimeter protection may include a security system configured to receive a security signal from a security device and, in response to the security signal being received, detect that a security event has occurred based on the security signal. The security system may determine a security location at which an autonomous mobile machine is to perform a security task in response to the security event being detected, and transmit, to the autonomous mobile machine, first instructions for the autonomous mobile machine to move to the security location and perform the security task.
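The described flow (receive signal, detect event, determine location, transmit instructions) can be sketched as a single handler. All names here (`handle_signal`, the `triggered`/`zone` fields, the task string) are hypothetical placeholders, not terms from the patent.

```python
from typing import Callable

def handle_signal(signal: dict,
                  locate: Callable[[dict], tuple],
                  send: Callable[[tuple, str], None]) -> bool:
    """Return True if a security event was detected and instructions were
    transmitted to the autonomous mobile machine; False otherwise."""
    if not signal.get("triggered"):
        return False                      # no security event in this signal
    location = locate(signal)             # determine the security location
    send(location, "perform_security_task")
    return True

# Example: a door sensor fires; instructions are "transmitted" into a list.
sent = []
handled = handle_signal(
    {"triggered": True, "device": "door_sensor", "zone": (3, 7)},
    locate=lambda s: s["zone"],
    send=lambda loc, task: sent.append((loc, task)),
)
print(handled, sent)
```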
SYSTEMS AND METHODS FOR USE OF AUTONOMOUS ROBOTS FOR BLIND SPOT COVERAGE
Systems and methods for use of autonomous mobile machine for blind spot coverage may include a security controller that determines a security coverage area based on data from a facility map indicating physical and functional representations of a facility and objects within the facility. The security controller may also determine a surveillance area based on the security coverage area. The security controller may also transmit, to an autonomous mobile machine, instructions for the autonomous mobile machine to deploy to the surveillance area and perform a surveillance task at the surveillance area.
AUTOMATED ROUTE SELECTION BY A MOBILE ROBOT
A mobile robot is configured for operation in a commercial or industrial setting, such as an office building or retail store. The robot can patrol one or more routes within a building, and can detect violations of security policies by objects, building infrastructure and security systems, or individuals. In response to the detected violations, the robot can perform one or more security operations. The robot can include a removable fabric panel, enabling sensors within the robot body to capture signals that propagate through the fabric. In addition, the robot can scan RFID tags of objects within an area, for instance coupled to store inventory. Likewise, the robot can generate or update one or more semantic maps for use by the robot in navigating an area and for measuring compliance with security policies.
SELF-AWARE MOBILE SYSTEM
Embodiments may provide techniques for operating autonomous systems with improved autonomy so that they operate largely, or even completely, autonomously. For example, in an embodiment, a self-aware mobile system may comprise a vehicle, vessel, or aircraft comprising at least one communication device configured to transmit and receive data so as to communicate with at least one autonomous sensor platform, and at least one computer system configured to receive data from the at least one autonomous sensor platform and, using the received data, to generate data to implement autonomous movement corresponding to SAE automation level 4 or level 5 using processing in accordance with a Hierarchical Intelligence Model. The system may further comprise at least one autonomous sensor platform comprising at least one communication device configured to transmit and receive data so as to communicate with the vehicle, vessel, or aircraft.
TWO WHEEL ROBOT WITH CONVERTIBILITY AND ACCESSORIES
A two-wheeled throwable robot with a pair of motorized wheels mounted on each end of an elongate body and a rearwardly extending tail. The body comprises a chassis with an accessory mounting interface on its top side and rearward side. A backpack accessory providing active sensing or environmental effects has an inverted L-shape when viewed from an end; it attaches to the rearward side of the elongate body and extends over the top side of the body. The stabilizing tail for the two-wheeled robot attaches to a rearward surface of the backpack accessory. The robot and backpack accessory have impact protection means for when the robot is thrown. The backpack accessory may sit within a zone of protection defined by the maximum deflection of the wheels, or may have elastomeric bumpers when it projects out of the zone of protection.
EXEMPLAR ROBOT LOCALIZATION
Methods, systems, and apparatus, including computer programs encoded on computer-storage media, for exemplar generation and localization. In some implementations, a method includes obtaining sensor data from a robot traversing a route at a property; determining sampling rates along the route using the sensor data obtained from the robot; selecting images from the sensor data as exemplars for robot localization using the sampling rates along the route; determining that a second robot is in a localization phase at the property; and providing representations of the exemplars for robot localization to the second robot.
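The exemplar-selection step (choosing images along the route at per-segment sampling rates) can be sketched as follows. The selection rule (keep every k-th frame within each fixed-length route segment) is an assumption for illustration; the abstract does not specify how rates map to selections.

```python
def select_exemplars(frames, segment_len: int, rates: list) -> list:
    """frames: ordered images captured along the route.
    rates[i]: keep every rates[i]-th frame within route segment i,
    where each segment spans segment_len consecutive frames."""
    exemplars = []
    for i, frame in enumerate(frames):
        segment = min(i // segment_len, len(rates) - 1)
        if (i % segment_len) % rates[segment] == 0:
            exemplars.append(frame)
    return exemplars

frames = list(range(12))          # stand-in for images along the route
# Denser sampling (every 2nd frame) in segment 0, sparser (every 3rd) in segment 1.
print(select_exemplars(frames, segment_len=6, rates=[2, 3]))
```

A second robot in its localization phase would then match its live camera frames against these exemplars rather than against every frame of the original traversal.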