Patent classifications
G05D1/2245
Backend automation system for simulation of drone deliveries through virtual fleets
A method includes receiving configuration data for an unmanned aerial vehicle (UAV) simulation system, the configuration data indicating at least one base location specification, at least one aircraft specification, and at least one virtual vehicle specification, and determining an aircraft record comprising, for each of the at least one aircraft to be simulated, aircraft mission data associated with an aircraft identifier of that aircraft. The method further includes configuring the UAV simulation system so that each of the at least one aircraft has a corresponding base location as specified by the at least one base location specification and a corresponding vehicle software version as specified by the at least one virtual vehicle specification, executing a simulation of the at least one aircraft carrying out flying missions using the configured UAV simulation system, and updating the aircraft mission data in the aircraft record.
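The configuration flow described above can be sketched roughly as follows. All names here are illustrative assumptions, not taken from the patent; this is a minimal sketch of building an aircraft record keyed by aircraft identifier, configuring base location and software version per aircraft, and updating mission data as the simulation runs.

```python
from dataclasses import dataclass, field

@dataclass
class AircraftConfig:
    aircraft_id: str
    base_location: str        # from the base location specification
    software_version: str     # from the virtual vehicle specification
    mission_data: list = field(default_factory=list)

def configure_simulation(base_spec, vehicle_spec, aircraft_ids):
    """Build the aircraft record: one entry per simulated aircraft."""
    return {
        aid: AircraftConfig(aid, base_spec[aid], vehicle_spec[aid])
        for aid in aircraft_ids
    }

def run_simulation(record):
    """Execute flying missions and update mission data in the record."""
    for aircraft in record.values():
        aircraft.mission_data.append(
            {"from": aircraft.base_location, "status": "completed"}
        )
```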
User feedback on potential obstacles and error conditions detected by autonomous mobile robots
A mobile computing device includes a user input device and a controller. The user input device includes a display, and the controller is operably connected to the user input device and configured to execute instructions to perform operations. The operations include presenting, on the display, information about one or more areas that were not cleaned by an autonomous cleaning robot during a first mission. The operations further include transmitting data corresponding to a user-selected subset of the one or more areas to cause the robot to clean the user-selected subset during a second mission.
Air and sea based fishing data collection and analysis systems and methods
Flight-based marine object search, detection, and identification systems and related techniques include an unmanned aerial system (UAS) having a flight platform configured to execute a search path to search for an underwater object, an imaging system comprising image capture components configured to generate a stream of images corresponding to a field of view of the UAS, and a logic device associated with the UAS and configured to analyze the stream of images using a marine video analysis (MVA) system to detect a region of interest comprising an underwater object, identify the underwater object in the detected region of interest, and notify a mobile structure of the identified object.
Method for path planning, automatic gardening device, and computer program product
A method for path planning, an automatic gardening device, and a computer program product are provided. The method includes: receiving a preset travel direction in a work region; dividing the work region into a plurality of subregions; determining, for each subregion, an actual planned direction from the preset travel direction and a recommended planned direction for that subregion, and determining a local planned path for the subregion based on the actual planned direction, wherein a path traversing the subregion along the recommended planned direction has the shortest length; acquiring a moving sequence between the subregions; and determining a global planned path of the work region based on the local planned path of each subregion and the moving sequence between the subregions.
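The "shortest traverse" criterion for the recommended planned direction can be illustrated for the simple case of an axis-aligned rectangular subregion swept in a boustrophedon pattern. This is a sketch under that simplifying assumption, with illustrative names; the patent itself is not limited to rectangles.

```python
import math

def sweep_length(along, across, spacing):
    """Approximate boustrophedon path length when sweeping along one axis:
    `passes` parallel strips of length `along`, joined by short turns."""
    passes = max(1, math.ceil(across / spacing))
    return passes * along + (passes - 1) * spacing

def recommended_direction(width, height, spacing):
    """Pick the sweep axis whose total traverse path is shortest."""
    len_x = sweep_length(width, height, spacing)   # strips run along x
    len_y = sweep_length(height, width, spacing)   # strips run along y
    return "x" if len_x <= len_y else "y"
```

For a long, thin subregion the shorter total path runs strips along the long side, since that minimizes the number of turns.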
WORKING SYSTEM
A working system capable of carrying out work with remotely operated working equipment. The working system includes an image generating unit that generates a virtual space image corresponding to the actual space surrounding a working movable body in which the working equipment is disposed, and a display control unit that causes a display unit to display the virtual space image generated by the image generating unit. The image generating unit generates the virtual space image to include a virtual image corresponding to the working equipment.
Performance testing for robotic systems
Herein, a perception statistical performance model (PSPM) for modeling a perception slice of a runtime stack for an autonomous vehicle or other robotic system may be used, e.g., for safety/performance testing. A PSPM is configured to: receive a computed perception ground truth t; and determine from the perception ground truth t, based on a set of learned parameters, a probabilistic perception uncertainty distribution of the form p(e|t) or p(e|t,c), in which p(e|t,c) denotes the probability of the perception slice computing a particular perception output e given the computed perception ground truth t and one or more confounders c. The probabilistic perception uncertainty distribution is defined over a range of possible perception outputs, and the parameters are learned from a set of actual perception outputs generated using the perception slice to be modeled, wherein each confounder is a variable of the PSPM whose value characterizes a physical condition on which p(e|t,c) depends.
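A toy version of the idea can be sketched as follows: learn a Gaussian perception-error distribution per confounder value from actual perception outputs, then sample an output e given ground truth t and confounder c. This is an illustrative stand-in (one scalar output, Gaussian errors), not the patented implementation, and all names are assumptions.

```python
import random
from collections import defaultdict

class SimplePSPM:
    def __init__(self):
        self.errors = defaultdict(list)   # confounder c -> observed errors e - t

    def fit(self, samples):
        """samples: iterable of (ground truth t, actual output e, confounder c)."""
        for t, e, c in samples:
            self.errors[c].append(e - t)

    def sample(self, t, c):
        """Draw e ~ p(e|t, c) under the learned Gaussian error model."""
        errs = self.errors[c]
        mean = sum(errs) / len(errs)
        var = sum((x - mean) ** 2 for x in errs) / len(errs)
        return t + random.gauss(mean, var ** 0.5)
```

Sampling from the fitted model in place of the real perception slice lets a simulator inject realistic, condition-dependent perception error without running the perception stack itself.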
Multi-vehicle coordination systems and methods for agricultural field operations
Control systems and methods for coordinating multiple agricultural machines for operation on an agricultural field are provided. Each machine may receive field data for an agricultural field that optionally includes a plurality of pre-defined swaths. Each machine may receive state data from other machine(s). The state data for the other machine(s) may include a next swath and a current swath. Each machine may, after an indication of a start of its current swath, determine a next swath based on open swaths and the state data for the other machine(s). A next swath may also be determined using vehicle kinematic data, and conflicts between machines may be resolved.
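The swath-selection step can be sketched as follows: each machine filters the open swaths against the current and next swaths claimed in the other machines' state data, then resolves the remaining choice by a cost criterion. All names are illustrative assumptions, and plain distance stands in here for the vehicle kinematic data mentioned above.

```python
def choose_next_swath(open_swaths, my_position, other_states):
    """Pick a next swath not claimed by any other machine's state data."""
    claimed = set()
    for state in other_states:               # broadcast state of other machines
        claimed.add(state["current_swath"])
        claimed.add(state["next_swath"])
    candidates = [s for s in open_swaths if s["id"] not in claimed]
    if not candidates:
        return None
    # Resolve remaining choices by distance (a stand-in for kinematic cost).
    return min(candidates, key=lambda s: abs(s["x"] - my_position))
```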
Mapping an environment around an autonomous vacuum
An autonomous cleaning robot (e.g., an autonomous vacuum) may use a sensor system to map an environment that may be used to determine where to clean. The autonomous vacuum receives visual data about the environment and determines a ground plane of the environment based on the visual data. The autonomous vacuum detects objects within the environment based on the ground plane. For each object, the autonomous vacuum segments a three-dimensional (3D) representation of the object out of the visual data and determines whether the object is static or dynamic. The autonomous vacuum adds static objects to a long-term level of a map of the environment and dynamic objects to an intermediate level of the map. The autonomous vacuum may further add virtual borders, flags, walls, and messes to the map.
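The static/dynamic routing into map levels reduces to a small dispatch step, sketched here with illustrative names (this is not the patented implementation): static objects are added to the long-term level and dynamic objects to the intermediate level.

```python
def update_map(env_map, detections):
    """detections: iterable of (object id, is_static) pairs from segmentation."""
    for obj_id, is_static in detections:
        level = "long_term" if is_static else "intermediate"
        env_map.setdefault(level, set()).add(obj_id)
    return env_map
```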
REMOTE OPERATOR TERMINAL AND REMOTE ASSISTANCE SYSTEM
A remote operator terminal for a remote operator to perform remote assistance for a vehicle is disclosed. The remote operator terminal includes a display device and processing circuitry configured to generate a remote assistance screen in response to remote assistance requests assigned to the remote operator. The remote assistance screen includes monitoring images showing an assistance situation of a target vehicle. The processing circuitry is further configured to set a display priority for each of the remote assistance requests, to select, as a preferential monitoring image, the monitoring image corresponding to the remote assistance request having the highest display priority, and to display the monitoring images on the remote assistance screen so that the preferential monitoring image has higher noticeability than the other monitoring images.
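The selection logic amounts to ordering the assigned requests by display priority and promoting the top one, sketched here with illustrative names (not the patented implementation); how "higher noticeability" is rendered — size, placement, highlighting — is left to the display layer.

```python
def arrange_monitoring_images(requests):
    """requests: list of dicts with 'image' and 'priority' keys.

    Returns the preferential monitoring image (highest display priority)
    and the remaining images in descending priority order."""
    ordered = sorted(requests, key=lambda r: r["priority"], reverse=True)
    preferential = ordered[0]["image"]
    others = [r["image"] for r in ordered[1:]]
    return preferential, others
```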
SHIFTING ONE OR MORE GUIDANCE LINES FOR NAVIGATING AN AGRICULTURAL MACHINE TO FOLLOW CROP ROWS
An agricultural harvester includes a navigation system that controls navigation of the agricultural harvester to follow a guidance line. The agricultural harvester is positioned at a desired location to engage the crop. An operator interface including a shift actuator is generated. Operator actuation of the shift actuator is detected, and a line/path shift processor shifts one or more guidance lines based on the location and orientation of the agricultural harvester and the location of the crop rows.
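The geometric core of such a shift can be sketched as a lateral offset of the guidance line, perpendicular to the machine's heading, applied per actuation of the shift actuator. This is a minimal sketch with illustrative names and a fixed step size; the patented processor also accounts for the crop row locations.

```python
import math

def shift_guidance_line(line_point, heading_rad, step, direction=1):
    """Return the guidance line's anchor point shifted `step` metres
    sideways. `direction` +1 shifts left of travel, -1 shifts right."""
    x, y = line_point
    # Unit vector perpendicular to the heading (left of travel for +1).
    nx, ny = -math.sin(heading_rad), math.cos(heading_rad)
    return (x + direction * step * nx, y + direction * step * ny)
```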