G05D1/027

Method and system for controlling movement of a device
11507096 · 2022-11-22

Methods and systems are disclosed for controlling device movement based on a movement command. Issues in accurately controlling the movement of a device, such as a programmable robot or rover, are addressed by first determining a command type of the movement command and then determining specific controller types for a high-level controller and a low-level controller based on the determined command type. When the command type is a linear movement, a Proportional-Integral-Derivative (PID) controller is used as the high-level controller and a Proportional/Proportional-Integral (PPI) controller is used as the low-level controller to accurately control a target distance. When the command type is an angular movement, the PPI controller is used as the high-level controller and the PID controller is used as the low-level controller to accurately control the end heading of the device.
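The controller-selection logic above can be sketched as follows. The discrete PID update and the mapping from command type to controller roles follow the abstract; the gain names and the `select_controllers` helper are illustrative assumptions, not the patent's implementation.

```python
class PIDController:
    """Minimal discrete PID controller (gain names kp/ki/kd are assumed)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        # Accumulate the integral term and approximate the derivative.
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def select_controllers(command_type):
    """Map a command type to (high-level, low-level) controller kinds,
    per the assignment described in the abstract."""
    if command_type == "linear":
        return ("PID", "PPI")   # PID governs target distance at the high level
    if command_type == "angular":
        return ("PPI", "PID")   # PID governs end heading at the low level
    raise ValueError(f"unknown command type: {command_type!r}")
```

With unity proportional gain and zero integral/derivative gains, the PID output simply echoes the error, which makes the update easy to sanity-check.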

ROBOTIC LAWN MOWER
20230059610 · 2023-02-23

A robotic lawn mower includes a mowing element, a body, a drive assembly, a first detection module, a second detection module, a failure determination module, an execution module, and a control module. The first detection module detects a first journey of the robotic lawn mower in a period. The second detection module detects a motion parameter of the drive assembly in the period and calculates a second journey of the robotic lawn mower in the period. The failure determination module determines whether a difference between the second journey and the first journey is greater than or equal to a first preset value. The execution module drives the robotic lawn mower to execute a response program. When the difference is greater than or equal to the first preset value in each of n1 consecutive periods, the control module controls the execution module to execute the response program.
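The failure-determination logic can be sketched as a small state machine: compare the two journey measurements each period and trigger the response program only after n1 consecutive out-of-tolerance periods. The use of an absolute difference and the class shape are assumptions for illustration.

```python
class JourneyFailureDetector:
    """Flags a drive failure when the detected journey and the journey
    computed from the drive-assembly motion parameter disagree by at
    least `first_preset` in each of `n1` consecutive periods."""
    def __init__(self, first_preset, n1):
        self.first_preset = first_preset
        self.n1 = n1
        self.consecutive = 0

    def check(self, first_journey, second_journey):
        # Reset the streak whenever the two journeys agree within tolerance.
        if abs(second_journey - first_journey) >= self.first_preset:
            self.consecutive += 1
        else:
            self.consecutive = 0
        return self.consecutive >= self.n1  # True -> execute the response program
```

A single outlier period does not trigger the response; only a sustained disagreement does, which filters transient slip from genuine failures.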

Low quality pose lane associator
11585935 · 2023-02-21

An autonomous vehicle (AV) includes a vehicle computing system configured to receive map data of a geographic location, obtain position estimate data of the autonomous vehicle, and determine a route of the autonomous vehicle including a plurality of roadways in a plurality of submaps. The autonomous vehicle determines a first roadway, in the plurality of roadways, closest to the position estimate and a second roadway, outside the plurality of roadways, closest to the position estimate, and determines a pose relative to the first roadway or the second roadway based on a distance between the position estimate of the autonomous vehicle and a roadway associated with a prior pose of the autonomous vehicle, in order to control travel of the autonomous vehicle based on the vehicle pose.
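A simplified sketch of the association step, assuming roadways are reduced to 2D reference points (the patent's roadway geometry is richer): pick the nearest on-route and off-route candidates, then break the tie using proximity to the roadway of the prior pose.

```python
import math

def associate_pose(position, roadways_on_route, roadways_off_route, prior_roadway):
    """Choose which roadway to associate a low-quality pose with.
    All roadways here are (x, y) reference points, an illustrative
    simplification of the patent's map data."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    # Nearest candidate on the planned route, and nearest candidate off it.
    first = min(roadways_on_route, key=lambda r: dist(position, r))
    second = min(roadways_off_route, key=lambda r: dist(position, r))
    # Prefer the candidate closer to the prior pose's roadway, which damps
    # association jumps when the position estimate is noisy.
    return min((first, second), key=lambda r: dist(r, prior_roadway))
```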

Method and apparatus for recognizing a stuck status as well as computer storage medium

The present disclosure proposes a method and an apparatus for recognizing a stuck status, as well as a computer storage medium. The method comprises: building an environmental map within a preset extent centered on the current position of the mobile robot; monitoring the travel information of the mobile robot in real time and predicting whether the mobile robot is stuck; acquiring data from multiple sensors of the mobile robot if the mobile robot is predicted to be stuck; and recognizing the current stuck status of the mobile robot based on the data from the multiple sensors.
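The predict-then-classify flow can be sketched as two functions. The speed-ratio prediction rule, the sensor keys, and the status taxonomy are hypothetical placeholders, not taken from the patent.

```python
def predict_stuck(commanded_speed, measured_speed, min_ratio=0.2):
    """Predict a stuck condition when actual motion falls well below the
    commanded motion (the ratio threshold is an assumption)."""
    if commanded_speed <= 0:
        return False
    return measured_speed / commanded_speed < min_ratio


def classify_stuck(sensor_data):
    """Toy classification of the current stuck status from multi-sensor
    readings; keys and rules are illustrative only."""
    if sensor_data.get("cliff"):
        return "suspended"            # wheels lifted off the ground
    if sensor_data.get("bumper"):
        return "blocked"              # pressed against an obstacle
    if sensor_data.get("wheel_current_high"):
        return "entangled"            # e.g. cables wrapped around a wheel
    return "unknown"
```

Only when the cheap prediction fires does the robot pull the full multi-sensor snapshot for classification, mirroring the two-stage structure in the abstract.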

CAMERA-BASED COMMISSIONING

Lighting control systems may be commissioned for programming and/or control with the aid of a mobile device. Design software may be used to create a floor plan of how the lighting control system may be designed. The design software may generate floor plan identifiers for each lighting fixture, or group of lighting fixtures. During commissioning of the lighting control system, the mobile device may be used to help identify the lighting devices that have been installed in the physical space. The mobile device may receive a communication from each lighting control device that indicates a unique identifier of the lighting control device. The unique identifier may be communicated by visible light communication (VLC) or RF communication. The unique identifier may be associated with the floor plan identifier for communication of digital messages to lighting fixtures installed in the locations indicated in the floor plan identifier.
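The association step at the core of commissioning can be sketched as building a lookup from each device's communicated unique identifier to the floor plan identifier of the location where it was observed. The data shapes here are assumptions for illustration.

```python
def build_commissioning_map(floor_plan_ids, observations):
    """Associate unique device identifiers (received via VLC or RF) with
    floor-plan identifiers. `observations` is a list of
    (floor_plan_id, unique_id) pairs captured by the mobile device."""
    mapping = {}
    for floor_plan_id, unique_id in observations:
        if floor_plan_id not in floor_plan_ids:
            raise KeyError(f"unknown floor plan location: {floor_plan_id}")
        # Later digital messages addressed to a unique_id can be routed to
        # the fixture installed at mapping[unique_id].
        mapping[unique_id] = floor_plan_id
    return mapping
```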

Behavior prediction device

A behavior prediction device comprising: a moving object behavior detection unit configured to detect moving object behavior, a behavior prediction model database that stores a behavior prediction model, a behavior prediction calculation unit configured to calculate a behavior prediction of the moving object using the behavior prediction model, a prediction deviation determination unit configured to determine whether a prediction deviation occurs based on the behavior prediction and a detection result of the moving object behavior corresponding to the behavior prediction, a deviation occurrence reason estimation unit configured to estimate a deviation occurrence reason when determination is made that the prediction deviation occurs, and an update necessity determination unit configured to determine a necessity of an update of the behavior prediction model database based on the deviation occurrence reason when the determination is made that the prediction deviation occurs.
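The deviation check and the update-necessity decision can be sketched as follows. The Euclidean-gap test, the threshold, and the reason taxonomy are illustrative assumptions; the patent does not specify them.

```python
def prediction_deviates(predicted, observed, threshold=1.0):
    """Determine a prediction deviation when the gap between the predicted
    and observed positions exceeds a threshold (threshold is assumed)."""
    gap = ((predicted[0] - observed[0]) ** 2 + (predicted[1] - observed[1]) ** 2) ** 0.5
    return gap > threshold


def update_needed(deviation_reason):
    """Decide whether the behavior prediction model database needs an
    update, based on the estimated deviation occurrence reason. The
    reason categories here are hypothetical."""
    # Model shortcomings warrant an update; transient sensing noise does not.
    return deviation_reason in {"model_mismatch", "unmodeled_behavior"}
```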

AUTOMATIC GUIDANCE ASSIST SYSTEM USING GROUND PATTERN SENSORS
20220354044 · 2022-11-10

An automatic guidance system is adapted to be mounted on a work vehicle, such as a farm tractor, to assist an operator in steering the vehicle on a desired track relative to a furrow. The system includes sensors for transmitting and receiving ultrasonic ranging signals. The sensors are ultrasound transducers mountable on the ends of a planter drawn by the vehicle, directing ranging signals downward toward the field adjacent to a furrow so that the ranging signals strike the field or furrow and are reflected back to the respective sensor. Guidance logic stored in a memory of a controller is executed by a processor to determine desired tractor headway and headland turning directions, and a human interface device generates guidance images viewable by an operator for steering the tractor relative to furrows in the field and in the headland.
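The ranging step uses the standard ultrasonic time-of-flight relation (round trip halved); the steering-cue helper comparing the two planter-end transducers is an assumption for illustration, not the patent's guidance logic.

```python
def echo_range(time_of_flight_s, speed_of_sound_ms=343.0):
    """One-way range from an ultrasonic echo: the pulse travels out and
    back, so the distance is half the round-trip path."""
    return speed_of_sound_ms * time_of_flight_s / 2.0


def steering_hint(left_range_m, right_range_m, deadband_m=0.02):
    """Toy guidance cue comparing ranges from transducers on either end
    of the planter (comparison logic is hypothetical)."""
    offset = left_range_m - right_range_m
    if abs(offset) <= deadband_m:
        return "on-track"
    return "steer-left" if offset > 0 else "steer-right"
```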

TACTICAL ADVANCED ROBOTIC ENGAGEMENT SYSTEM

This invention describes a tactical advanced robotic engagement system (ARES) (100) for combat or rescue missions that employs advanced electronics, AI, and AR capabilities. In ARES, a user carries a weapon or tool (102) equipped with a hand-operable controller (150) for controlling an associated UGV (170), UAV (180), or UUV. The UGV (170) provides a ground/home station for the UAV (180). The UGV and UAV are equipped with a camera (290) to obtain real-time photographs or videos and to relay them to a heads-up display (HUD) (110) mounted on the user's helmet (104). The HUD (110) system provides intuitive UIs (132) for communication with and navigation of the UGV and UAV; AR information reduces the visual, cognitive, and mental load on the user, thereby enhancing situational awareness and allowing the user to maintain heads-up, eyes-out, and hands-on-trigger readiness. The HUD (110) also provides intuitive UIs to connect with peers and/or a Command Centre (190).

SYSTEMS AND METHODS FOR MULTI-MODALITY AUTONOMOUS VEHICLE TRANSPORT
20230032496 · 2023-02-02

Technologies disclosed herein facilitate identification of a trip route from an origin of a user to a desired destination of the user, wherein the trip route includes multiple transportation modalities, at least one of which is an autonomous vehicle (AV), and dispatching of the AV to a pickup location for the user in connection with providing transportation to the user along a portion of the identified trip route.

AUGMENTATION OF GLOBAL NAVIGATION SATELLITE SYSTEM BASED DATA

A vehicle computing system validates location data received from a Global Navigation Satellite System receiver with other sensor data. In one embodiment, the system calculates velocities with the location data and the other sensor data. The system generates a probabilistic model for velocity with a velocity calculated with location data and variance associated with the location data. The system determines a confidence score by applying the probabilistic model to one or more of the velocities calculated with other sensor data. In another embodiment, the system implements a machine learning model that considers features extracted from the sensor data. The system generates a feature vector for the location data and determines a confidence score for the location data by applying the machine learning model to the feature vector. Based on the confidence score, the system can validate the location data. The validated location data is useful for navigation and map updates.
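The first embodiment can be sketched as follows: build a Gaussian model from the GNSS-derived velocity and its variance, score the other sensors' velocities against it, and validate when the score clears a threshold. The normalization and the threshold are assumptions; the patent leaves the exact scoring rule unspecified.

```python
import math

def velocity_confidence(gnss_velocity, gnss_variance, sensor_velocities):
    """Confidence score for GNSS-derived velocity: average Gaussian
    likelihood of the other sensors' velocities under a model centered
    on the GNSS velocity, normalized so perfect agreement scores 1.0."""
    def gaussian_pdf(x, mean, var):
        return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    peak = gaussian_pdf(gnss_velocity, gnss_velocity, gnss_variance)
    scores = [gaussian_pdf(v, gnss_velocity, gnss_variance) / peak
              for v in sensor_velocities]
    return sum(scores) / len(scores)


def validate(gnss_velocity, gnss_variance, sensor_velocities, min_confidence=0.5):
    """Accept the GNSS location data only when the other sensors broadly
    agree with the velocity it implies (threshold is illustrative)."""
    return velocity_confidence(gnss_velocity, gnss_variance, sensor_velocities) >= min_confidence
```

A wheel-odometry velocity near the GNSS-derived one yields a score near 1, while a large disagreement drives the score toward 0 and the location data is rejected.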