Patent classifications
G05D1/0272
AUTONOMOUS NAVIGATION METHOD AND SYSTEM FOR INTELLIGENT INDOOR LOGISTIC TRANSPORTATION BASED ON ONE-SHOT IMITATION
A method and system for controlling and directing a means of transportation to autonomously navigate to a target location are provided. The method includes receiving measurements from one or more sensors of the means of transportation; building a route, based on the measurements received, for the means of transportation to navigate to a target location; generating a localization estimate associated with the built route; generating a global path based on the route and the localization estimate; and performing local planning to direct the means of transportation to the target location while avoiding surrounding static or dynamic obstacles. The one or more sensors include a LiDAR sensor and an odometry sensor.
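The sense → route → localize → plan pipeline described in this abstract can be sketched as a minimal Python stub. All names and the specific strategies (straight-line waypoints, odometry-only localization, clearance-based waypoint filtering) are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class Measurements:
    lidar_obstacles: List[Point]  # obstacle positions from the LiDAR scan
    odom_pose: Point              # pose estimate from wheel odometry

def build_route(start: Point, target: Point, steps: int = 5) -> List[Point]:
    """Build a coarse route as evenly spaced waypoints toward the target."""
    return [(start[0] + (target[0] - start[0]) * i / steps,
             start[1] + (target[1] - start[1]) * i / steps)
            for i in range(1, steps + 1)]

def localize(m: Measurements) -> Point:
    """Localization estimate; here we simply trust odometry, whereas a
    real system would refine this against the LiDAR scan."""
    return m.odom_pose

def local_plan(route: List[Point], m: Measurements,
               clearance: float = 0.5) -> List[Point]:
    """Drop waypoints within `clearance` of a detected obstacle
    (a stand-in for a real reactive local planner)."""
    def blocked(p: Point) -> bool:
        return any((p[0] - o[0]) ** 2 + (p[1] - o[1]) ** 2 < clearance ** 2
                   for o in m.lidar_obstacles)
    return [p for p in route if not blocked(p)]
```

A global planner would then feed the filtered waypoints to the drive controller one at a time.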
Sweeping robot obstacle avoidance treatment method based on free move technology
The present disclosure provides a sweeping robot obstacle avoidance treatment method based on free move technology, comprising the following steps. Step 1: providing a sweeping robot equipped with a six-axis gyroscope, a grating signal sensor, and a left-and-right-wheel electric quantity sensing unit. Step 2: performing real-time sensing and data acquisition on the operation state of the sweeping robot by utilizing the six-axis gyroscope, the grating signal sensor, and the left-and-right-wheel electric quantity sensing unit to obtain real-time data.
AUTONOMOUS TRANSPORT VEHICLE WITH STEERING
An autonomous transport vehicle, for transporting items in a storage and retrieval system, includes a frame, a controller, at least two independently driven drive wheels mounted to the frame, and at least one caster wheel mounted to the frame and having a castering assistance motor that engages the caster wheel to impart a castering assistance torque that assists its castering. The controller is communicably connected to the castering assistance motor and is configured to effect castering of the at least one caster wheel, with the autonomous transport vehicle in motion in a predetermined kinematic state, via a combination of vehicle yaw, generated by differential torque from the at least two independently driven drive wheels, and castering assistance torque from the castering assistance motor.
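The idea of combining differential drive torque (which yaws the vehicle) with a direct assist torque at the caster pivot can be illustrated with a toy proportional controller. The function, its gains, and the torque split are hypothetical, not from the patent:

```python
def caster_command(target_angle: float, caster_angle: float,
                   yaw_gain: float = 2.0, assist_gain: float = 1.5):
    """Split the caster-alignment effort between vehicle yaw (differential
    drive torque) and the castering assistance motor."""
    error = target_angle - caster_angle
    yaw_torque = yaw_gain * error     # net yaw moment from the drive wheels
    left = -yaw_torque / 2.0          # differential split: equal and opposite
    right = yaw_torque / 2.0
    assist = assist_gain * error      # torque applied directly at the caster pivot
    return left, right, assist
```

With zero caster error all three torques vanish; a positive error commands a positive yaw (right wheel forward, left wheel back) plus a direct assist in the same sense.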
Motor vehicle self-driving method and terminal device
Embodiments of this application describe a motor vehicle self-driving method and a terminal device. The method may include obtaining, by a terminal device, vehicle external-environment data of a position of a motor vehicle and initial positioning precision of the motor vehicle. The method may also include determining, by the terminal device, a target driving parameter of the motor vehicle based on the vehicle external-environment data and the initial positioning precision. Furthermore, the method may include controlling, by the terminal device, the motor vehicle to drive based on the target driving parameter. In the embodiments of this application, the terminal device determines the target driving parameter based on the vehicle external-environment data and the initial positioning precision. In this way, the target driving parameter varies with the vehicle external-environment data and better matches the external environment, thereby improving self-driving safety of the motor vehicle.
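One plausible reading of "target driving parameter based on external-environment data and positioning precision" is a speed limit that shrinks as positioning error grows or visibility drops. The function below is a hypothetical sketch under that assumption; the thresholds are arbitrary:

```python
def target_speed(base_speed_mps: float,
                 positioning_error_m: float,
                 visibility_m: float) -> float:
    """Scale the target speed down as positioning precision degrades or
    as the sensed environment worsens; thresholds are illustrative."""
    precision_factor = max(0.2, 1.0 - positioning_error_m)  # >= 0.8 m error -> crawl
    environment_factor = min(1.0, visibility_m / 100.0)     # full speed at >= 100 m
    return base_speed_mps * min(precision_factor, environment_factor)
```

Taking the minimum of the two factors makes the most pessimistic input dominate, which is the conservative choice for safety.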
STRUCTURED LIGHT MODULE AND AUTONOMOUS MOBILE DEVICE
Provided are a structured light module and an autonomous mobile device. The structured light module comprises a camera module and line laser emitters distributed on two sides of the camera module; the line laser emitters emit line laser outwards, and the camera module collects an environmental image detected by the line laser. By virtue of the high detection accuracy of the line laser, front environmental information can be detected more accurately. In addition, because the line laser emitters are located on two sides of the camera module, the arrangement occupies little space, saves room, and broadens the application scenarios of a line laser sensor.
Control device and control method, program, and mobile object
A control device and a control method can quickly estimate a self-location even when the self-location is unknown. In a case of storing information detected by LiDAR or a wheel encoder and supplied in a time series, and estimating a self-location using the stored time-series information, when an unpredictable position change, such as a kidnap state, is detected, the stored time-series information is reset and the self-location is estimated again. Example host platforms include a multi-legged robot, a flying object, and an in-vehicle system that moves autonomously under control of a mounted computing machine.
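The reset-on-kidnap behavior described above can be sketched as a small class that discards its stored time series whenever consecutive poses jump farther than a threshold. The class, its threshold, and the 2-D pose format are illustrative assumptions:

```python
import math

class TimeSeriesLocalizer:
    """Keep a time series of (x, y) poses and reset it when a kidnap-like
    jump is detected, so the self-location is re-estimated from scratch."""

    def __init__(self, kidnap_threshold_m: float = 1.0):
        self.history = []                       # stored time-series poses
        self.kidnap_threshold_m = kidnap_threshold_m

    def update(self, pose):
        if self.history and self._jump(pose) > self.kidnap_threshold_m:
            self.history.clear()                # kidnap detected: reset
        self.history.append(pose)
        return self.history[-1]                 # current best estimate

    def _jump(self, pose):
        last = self.history[-1]
        return math.hypot(pose[0] - last[0], pose[1] - last[1])
```

After a reset the localizer starts a fresh time series from the first post-kidnap observation rather than averaging in stale, now-inconsistent data.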
Autonomous utility cart and robotic cart platform
A robotic cart platform with a navigation and movement system that integrates into a conventional utility cart to provide both manual and autonomous modes of operation. The platform includes a drive unit with drive wheels replacing the front wheels of the cart. The drive unit has motors, encoders, a processor and a microcontroller. The system has a work environment mapping sensor and a cabled array of proximity and weight sensors, lights, control panel, battery and on/off, “GO” and emergency stop buttons secured throughout the cart. The encoders obtain drive shaft rotation data that the microcontroller periodically sends to the processor. When in autonomous mode, the system provides navigation, movement and location tracking with or without wireless connection to a server. Stored destinations are set using its location tracking to autonomously navigate the cart. When in manual mode, battery power is off, and back-up power is supplied to the encoders and microcontroller, which continue to obtain shaft rotation data. When in autonomous mode, the shaft rotation data obtained during manual mode is used to determine the present cart location.
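The shaft-rotation bookkeeping above amounts to differential-drive dead reckoning: the same encoder-tick arithmetic works whether the ticks were collected in autonomous mode or on back-up power while the cart was pushed manually. A hedged sketch, with hypothetical calibration constants:

```python
import math

def dead_reckon(pose, left_ticks, right_ticks,
                ticks_per_m=1000.0, wheel_base_m=0.5):
    """Update (x, y, heading) from encoder shaft-rotation counts.
    Calibration values are illustrative, not from the platform."""
    dl = left_ticks / ticks_per_m        # left wheel travel (m)
    dr = right_ticks / ticks_per_m       # right wheel travel (m)
    d = (dl + dr) / 2.0                  # travel of the cart centre
    dtheta = (dr - dl) / wheel_base_m    # heading change (rad)
    x, y, theta = pose
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)
```

Replaying the ticks accumulated during a manual push through this update recovers the present cart location when autonomous mode resumes.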
Particle filters and WiFi robot localization and mapping
Robot localization or mapping can be provided without requiring the expense or complexity of an “at-a-distance” sensor, such as a camera, a LIDAR sensor, or the like. Landmark features can be created or matched using motion sensor data, such as odometry or gyro data or the like, and adjacency sensor data. Despite the relative ambiguity of adjacency-sensor derived landmark features, a particle filter approach can be configured to use such information, instead of requiring “at-a-distance” information from a constant stream of visual images from a camera, such as for robot localization or mapping. Landmark sequence constraints or a Wi-Fi signal strength map can be used together with the particle filter approach.
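The core move here, weighting particles by a signal-strength map rather than by camera or LiDAR data, can be shown in a one-dimensional toy particle filter. The function, the 1-D corridor, and the Gaussian RSSI likelihood are illustrative assumptions, not the patent's algorithm:

```python
import math
import random

def particle_filter_step(particles, control, rssi_measured, rssi_map,
                         motion_noise=0.1, rssi_sigma=2.0):
    """One predict/weight/resample step along a 1-D corridor, weighting
    particles by a Wi-Fi signal-strength map (no at-a-distance sensor)."""
    # Predict: apply the odometry control with motion noise.
    moved = [p + control + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: Gaussian likelihood of the measured RSSI at each particle.
    weights = [math.exp(-(rssi_map(p) - rssi_measured) ** 2
                        / (2 * rssi_sigma ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportionally to weight.
    return random.choices(moved, weights=weights, k=len(moved))
```

Because RSSI is ambiguous (many positions share a signal strength), the filter relies on repeated measurements plus motion to collapse the particle cloud, which mirrors the abstract's point about tolerating ambiguous landmark features.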
Inspection robot and methods thereof for responding to inspection data in real time
An inspection robot, and methods and a controller thereof are disclosed. An inspection robot may include an inspection chassis including a plurality of inspection sensors and coupled to at least one drive module to drive the robot over an inspection surface. The inspection robot may also include a controller including an inspection data circuit to interpret inspection base data, an inspection processing circuit to determine refined inspection data, and an inspection configuration circuit to determine an inspection response value in response to the refined inspection data. The controller may further include an inspection response circuit to, in response to the inspection response value, provide an inspection command value while the inspection robot is interrogating the inspection surface.
Method, system and apparatus for localization-based historical obstacle handling
A method of obstacle handling for a mobile automation apparatus includes: obtaining an initial localization of the mobile automation apparatus in a frame of reference; detecting an obstacle by one or more sensors disposed on the mobile automation apparatus; generating and storing an initial position of the obstacle in the frame of reference, based on (i) the initial localization and (ii) a detected position of the obstacle relative to the mobile automation apparatus; obtaining a correction to the initial localization of the mobile automation apparatus; and applying a positional adjustment, based on the correction, to the initial position of the obstacle to generate and store an updated position of the obstacle.
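The final step, carrying a localization correction over to a stored obstacle position, is a rigid-body transform in the shared frame. A minimal sketch, assuming the correction is an (dx, dy, dtheta) tuple (a representation chosen here for illustration):

```python
import math

def adjust_obstacle(obstacle_xy, correction):
    """Re-express a stored obstacle position after a localization
    correction (dx, dy, dtheta) in the shared frame of reference."""
    dx, dy, dtheta = correction
    x, y = obstacle_xy
    # Rotate by the heading correction, then translate.
    xr = x * math.cos(dtheta) - y * math.sin(dtheta)
    yr = x * math.sin(dtheta) + y * math.cos(dtheta)
    return (xr + dx, yr + dy)
```

Applying the same correction to every historical obstacle keeps the obstacle map consistent with the apparatus's revised trajectory without re-detecting anything.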