G05D1/0238

NAVIGATION OF AUTONOMOUS MOBILE ROBOTS

An autonomous cleaning robot includes a controller configured to execute instructions to perform one or more operations. The one or more operations include operating a drive system to move the cleaning robot in a forward drive direction along a first obstacle surface with a side surface of the cleaning robot facing the first obstacle surface; then operating the drive system to turn the cleaning robot such that the side surface of the cleaning robot faces a second obstacle surface; then operating the drive system to move the cleaning robot in a rearward drive direction along the second obstacle surface; and then operating the drive system to move the cleaning robot in the forward drive direction along the second obstacle surface.
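The claimed maneuver is an ordered sequence of drive operations. A minimal sketch of that sequence follows; the `Drive` enum and the function name are illustrative, not from the patent:

```python
from enum import Enum

class Drive(Enum):
    FORWARD = "forward"
    REARWARD = "rearward"
    TURN = "turn"

def corner_maneuver():
    """Return the claimed sequence of drive operations for transitioning
    from following a first obstacle surface to following a second one."""
    return [
        (Drive.FORWARD, "first obstacle surface"),       # side surface faces first wall
        (Drive.TURN, "toward second obstacle surface"),  # reorient the side surface
        (Drive.REARWARD, "second obstacle surface"),     # back along the second wall
        (Drive.FORWARD, "second obstacle surface"),      # resume forward travel
    ]
```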

EXTERNAL ENVIRONMENT SENSOR DATA PRIORITIZATION FOR AUTONOMOUS VEHICLE
20230046691 · 2023-02-16 ·

An autonomous vehicle includes an array of sensors, a processor, and a switch. The array of sensors generates sensor data related to one or more objects in an external environment of the autonomous vehicle, and the processor determines an environmental context. The switch transfers the sensor data from the array of sensors to the processor, where the switch is configured to: (a) receive first sensor data from a first sensor group of the array of sensors; (b) receive second sensor data from a second sensor group of the array of sensors; (c) determine an order of transmission that prioritizes the first sensor data over the second sensor data in response to the environmental context; and (d) transmit the first sensor data to the processor prior to transmitting the second sensor data based on the order of transmission.
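The switch's context-dependent ordering can be sketched as below, assuming a hypothetical context-to-group priority table (the contexts "night"/"day" and the group names are illustrative, not from the patent):

```python
def order_transmissions(context, groups):
    """Order sensor-data batches so the group prioritized for the current
    environmental context is transmitted to the processor first.

    `groups` maps a group name to its sensor-data batch; `priority` is a
    hypothetical context->group table (e.g. lidar first at night, camera
    first in daylight)."""
    priority = {"night": "lidar", "day": "camera"}
    first = priority.get(context, "camera")
    ordered = sorted(groups, key=lambda g: g != first)  # False sorts first
    return [(g, groups[g]) for g in ordered]
```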

Hyper planning based on object and/or region

A vehicle computing system may implement techniques to predict behavior of objects detected by a vehicle operating in the environment. The techniques may include determining a feature with respect to a detected object (e.g., a likelihood that the detected object will impact operation of the vehicle) and/or a location of the vehicle, and determining, based on the feature, a model to use to predict behavior (e.g., estimated states) of proximate objects (e.g., the detected object). The model may be configured to use one or more algorithms, classifiers, and/or computational resources to predict the behavior. Different models may be used to predict behavior of different objects and/or regions in the environment. Each model may receive sensor data as an input and output predicted behavior for the detected object. Based on the predicted behavior of the object, a vehicle computing system may control operation of the vehicle.
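The feature-driven model selection described above can be sketched as follows; the model names, the threshold, and the stub prediction interface are all assumptions for illustration:

```python
def select_model(impact_likelihood, threshold=0.5):
    """Pick a prediction model for a detected object based on a feature
    (here, the likelihood the object will impact vehicle operation).
    Names and threshold are illustrative."""
    if impact_likelihood >= threshold:
        return "high_fidelity"  # more classifiers / compute for relevant objects
    return "low_cost"           # cheaper model for unlikely-to-impact objects

def predict(sensor_data, impact_likelihood):
    """Each model receives sensor data as input and outputs predicted
    behavior; both models are stubbed here to show the interface only."""
    model = select_model(impact_likelihood)
    return {"model": model, "predicted_states": len(sensor_data)}
```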

System and method for traversing vertical obstacles

Disclosed is a mobile robot adapted to traverse vertical obstacles. The robot comprises a frame, at least one front wheel positioned in a front section of the robot, at least one middle wheel positioned in a middle section of the robot, at least one back wheel positioned in a back section of the robot, and at least one further wheel in the front, middle, or back section of the robot. The robot also comprises at least one motor-driven device for exerting a downward and/or upward force on the middle wheel, and at least two motors for driving the wheels and the motor-driven device. Also disclosed is a method of climbing using a mobile robot as disclosed.

Infrared Transceiver Unit, Detection Apparatus, Multi-Infrared Detection Apparatus and Obstacle Avoidance Robot
20230042631 · 2023-02-09 ·

An infrared transceiver unit (107, 108), a detection apparatus, a multi-infrared detection apparatus and an obstacle avoidance robot. The infrared transceiver unit (107, 108) includes a mounting skewed slot, an infrared emitting source (1085), and two groups of infrared receiving sources (1083, 1084). A sensing direction of one group of infrared receiving sources (1084) and an emitting direction of the infrared emitting source (1085) both face one side of a sensing center line (L) of the mounting skewed slot, while the sensing direction of the other group of infrared receiving sources (1083) faces the other side of the sensing center line (L), so that one of the infrared receiving sources (1083, 1084) receives infrared modulation light emitted by the infrared emitting source and reflected by an obstacle. Two infrared transceiver units (107, 108) are arranged on the left and right ends of the obstacle avoidance robot, respectively. The unit on one end receives either the infrared modulation light emitted by the unit on the other end, or the infrared modulation light emitted by the unit on either end and reflected by the obstacle.

SITUATIONAL AWARENESS ROBOT

A system and methods for assessing an environment are disclosed. A method includes causing a robot to transmit data to first and second user devices, causing the robot to execute a first action, and, responsive to a second instruction, causing the robot to execute a second action. At least one user device is outside the environment of the robot. At least one action includes recording a video of at least a portion of the environment, displaying the video in real time on both user devices, and storing the video on a cloud-based network. The other action includes determining a first physical location of the robot, determining a desired second physical location of the robot, and propelling the robot from the first location to the second location. Determining the desired second location is responsive to detecting a touch on a touchscreen video feed displaying the video in real time.

External environment sensor data prioritization for autonomous vehicle

Sensor data is received from an array of sensors configured to capture one or more objects in an external environment of an autonomous vehicle. A first sensor group is selected from the array of sensors based on proximity data or environmental contexts. First sensor data from the first sensor group is prioritized for transmission based on the proximity data or environmental contexts.

Inventory system with high-speed corridors for autonomous surface vehicles

Aspects described herein include an autonomous surface vehicle (ASV) for operation within an inventory system of an environment. The ASV includes a drive system, a docking system, a plurality of sensors, and a memory storing a map of the environment. The ASV further includes one or more computer processors configured to (i) detect, using a location sensor, a location of the ASV within the environment; (ii) control the drive system to actuate the ASV toward a corridor defined in the map at a first speed setting; and (iii) control the drive system to actuate the ASV through the corridor along at least one barrier defined in the map. A second, greater speed setting is applied when (i) the location sensor indicates that the ASV is within the corridor and (ii) one or more fiducials along the at least one barrier are visually detected by one or more proximity sensors.
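The two-condition speed gate above can be sketched as a single predicate; the numeric speed settings and parameter names are illustrative assumptions:

```python
def speed_setting(in_corridor, fiducials_detected,
                  first_speed=1.0, second_speed=2.5):
    """Apply the greater corridor speed only when the location sensor
    places the ASV inside the corridor AND at least one barrier fiducial
    is visually confirmed; otherwise keep the first (lower) speed.
    Speed values are illustrative, not from the patent."""
    if in_corridor and fiducials_detected:
        return second_speed
    return first_speed
```

Requiring both conditions means a localization glitch alone (or a stray fiducial sighting alone) never unlocks the higher speed.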

Guide-Type Virtual Wall System
20180004212 · 2018-01-04 ·

A guide-type virtual wall system is provided. The system comprises a beacon (11, 44) and a robot (12), wherein a transmission module of the beacon (11, 44) directionally transmits a first signal, and the area covered by the first signal defines a beacon signal area (13). The robot (12) comprises a beacon signal receiving module corresponding to the beacon signal transmission module. When the robot (12) enters the beacon signal area (13) and the beacon signal receiving module detects the first signal, the robot (12) advances in the direction of the beacon (11, 44) until it detects a second signal, and then the robot (12) crosses over or exits from the beacon signal area (13). The system can restrict the robot (12) from entering a certain area without missing the area where the virtual wall is located, and also enables the robot (12) to cross over the virtual wall into the restricted area when required.
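The robot's behavior can be read as a small state machine driven by the two beacon signals; the state names and transition logic below are a hypothetical reading of the abstract, not the patented implementation:

```python
def step(state, sees_first, sees_second):
    """One control step of the guide-type virtual wall behavior.

    roam            -> approach_beacon  when the first signal is detected
                       (robot has entered the beacon signal area)
    approach_beacon -> cross_or_exit    when the second signal is detected
                       (robot may now cross over or exit the area)
    """
    if state == "roam" and sees_first:
        return "approach_beacon"
    if state == "approach_beacon" and sees_second:
        return "cross_or_exit"
    return state
```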

ENVIRONMENTAL SENSING DEVICE AND INFORMATION ACQUIRING METHOD APPLIED TO ENVIRONMENTAL SENSING DEVICE
20180003822 · 2018-01-04 ·

Disclosed are embodiments of environmental sensing devices and information acquiring methods applied to environmental sensing devices. In some embodiments, an environmental sensing device includes an integrated camera sensor and laser radar sensor, and a control unit. The control unit is connected simultaneously to the camera sensor and the laser radar sensor, and is used for simultaneously sending a trigger signal to both sensors. Integrating the camera sensor and the laser radar sensor avoids problems such as poor contact and noise generation that easily occur in a high-vibration, high-interference vehicle environment, and gives the two sensors a consistent overlapping field of view. Precisely triggering the camera sensor and the laser radar sensor at the same time yields high-quality fused data, thereby improving the accuracy of environmental sensing.
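The control unit's simultaneous trigger can be sketched as issuing one shared timestamp to both sensors so their frames fuse on a common time base; the callback interface is a hypothetical stand-in for the real trigger hardware:

```python
import time

def trigger_simultaneously(camera_cb, lidar_cb):
    """Issue a single shared trigger timestamp to both sensors.

    `camera_cb` and `lidar_cb` are hypothetical capture callbacks; in
    hardware this would be one electrical trigger line, which is what
    keeps the two captures aligned for fusion."""
    t = time.monotonic()
    camera_cb(t)
    lidar_cb(t)
    return t
```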