Patent classifications
G05D1/225
ELEVATION BASED MACHINE LOCALIZATION SYSTEM AND METHOD
A machine localization system includes a work machine including an extendable implement, a first pressure sensor coupled to the work machine, a second pressure sensor located at a known elevation, and a computing system operably coupled to the work machine, the first pressure sensor, and the second pressure sensor. The computing system is configured to receive a first pressure measurement from the first pressure sensor and a second pressure measurement from the second pressure sensor, determine a maximum operating height of the extendable implement based on a difference between the first pressure measurement and the second pressure measurement, and configure the extendable implement to not exceed the maximum operating height.
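The elevation offset between the two pressure sensors can be recovered with the standard hypsometric relation. The sketch below is illustrative only, not the patented implementation; the constant-temperature assumption, the ceiling/reference elevations, and the function names are all assumptions for this example.

```python
import math

def altitude_difference(p_machine, p_reference, temp_c=15.0):
    """Estimate the height (m) of the machine-mounted sensor above the
    reference sensor from two barometric readings (hypsometric equation,
    assuming a uniform air column at temp_c)."""
    R = 287.05   # specific gas constant of dry air, J/(kg*K)
    g = 9.80665  # standard gravity, m/s^2
    t_kelvin = temp_c + 273.15
    # Lower pressure at the machine means it sits above the reference.
    return (R * t_kelvin / g) * math.log(p_reference / p_machine)

def max_operating_height(p_machine, p_reference, ceiling_elevation, ref_elevation):
    """Height limit for the extendable implement: the machine's absolute
    elevation (reference elevation + pressure-derived offset) subtracted
    from a known overhead clearance."""
    machine_elev = ref_elevation + altitude_difference(p_machine, p_reference)
    return ceiling_elevation - machine_elev
```

With the reference sensor at standard sea-level pressure (101325 Pa) and the machine reading 100000 Pa, the offset works out to roughly 111 m, illustrating how a small pressure difference maps to a usable elevation estimate.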
SYSTEMS AND METHODS FOR CONFIGURABLE OPERATION OF A ROBOT BASED ON AREA CLASSIFICATION
A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
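The sequencing idea in the abstract can be sketched as a simple planner: classify sub-regions by an obstacle-density metric, then order open areas ahead of cluttered ones. This is a minimal illustration, not the patented coverage algorithm; the density threshold, the largest-first ordering, and the `SubRegion` fields are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class SubRegion:
    name: str
    obstacle_density: float  # obstacles per square metre (hypothetical metric)
    area: float              # square metres

def plan_coverage(sub_regions, clutter_threshold=0.5):
    """Partition sub-regions into non-clutter and clutter sets by obstacle
    density, then return a navigation sequence that covers open areas
    first (largest first, so long ranked passes happen early) and
    cluttered areas last."""
    non_clutter = [r for r in sub_regions if r.obstacle_density < clutter_threshold]
    clutter = [r for r in sub_regions if r.obstacle_density >= clutter_threshold]
    non_clutter.sort(key=lambda r: r.area, reverse=True)
    clutter.sort(key=lambda r: r.area, reverse=True)
    return non_clutter + clutter
```

Covering open floor before cluttered pockets lets the robot run efficient straight passes first and reserve slower, obstacle-dense maneuvering for the end of the mission.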
Taking corrective action based upon telematics data broadcast from another vehicle
A computer-implemented method of using telematics data associated with an originating vehicle at a destination vehicle is provided. The method may include receiving telematics data associated with the originating vehicle by (1) a mobile device or (2) a smart vehicle controller associated with a driver or vehicle. The mobile device or smart vehicle controller may analyze the telematics data received to determine that (i) a travel event exists, or (ii) that a travel event message or warning is embedded within the telematics broadcast received. If the travel event exists, the method may include automatically taking a preventive or corrective action, at or via the mobile device or smart vehicle controller, which alleviates a negative impact of the travel event on the driver or vehicle to facilitate safer or more efficient vehicle travel. Insurance discounts may be provided to insureds based upon their usage of the risk mitigation or prevention functionality.
Flight path determination
A method of determining a flight path for an aerial vehicle includes controlling the aerial vehicle to fly along a first route, identifying, during the flight along the first route and with aid of one or more processors, a change in a state of signal transmission occurring at a first location, in response to identifying the change, determining, by the one or more processors, a second location different from the first location, determining a second route to the second location, and controlling, by the one or more processors, the aerial vehicle to fly to and land at the second location. The change in the state of signal transmission indicates an abnormal state in a signal transmission between the aerial vehicle and a control device.
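One way to picture "determine a second location" after a signal-transmission anomaly is a reachability check against known landing sites. The policy below is a hedged sketch under assumed conventions (2-D coordinates, a simple range limit, return-home preferred), not the patented determination logic.

```python
import math

def choose_landing_site(loss_location, home, candidate_sites, max_range):
    """After a signal-loss event at loss_location, pick where to land:
    return to home if it is within max_range of the event location,
    otherwise divert to the nearest candidate landing site.
    (Illustrative policy; all parameters are assumptions.)"""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    if dist(loss_location, home) <= max_range:
        return home
    return min(candidate_sites, key=lambda s: dist(loss_location, s))
```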
Wearable device determining emotional state of rider in vehicle and optimizing operating parameter of vehicle to improve emotional state of rider
A transportation system includes an artificial intelligence system for processing a sensory input from a wearable device in a self-driving vehicle to determine an emotional state of a rider and optimizing a vehicle operating parameter to improve the rider emotional state. The artificial intelligence system detects the rider emotional state in the self-driving vehicle by recognizing patterns of emotional-state-indicative data from a set of wearable sensors worn by the rider. The patterns are indicative of at least one of a favorable emotional state and an unfavorable emotional state of the rider. The artificial intelligence system optimizes the operating parameter of the vehicle in response to the detected emotional state of the rider, in order to maintain a detected favorable emotional state or to achieve a favorable emotional state after an unfavorable emotional state is detected.
Parameters of augmented reality responsive to location or orientation based on rider or vehicle
A vehicle includes a display disposed to facilitate presenting an augmentation of content in an environment of a rider of the vehicle; a circuit for registering at least one of location and orientation of the vehicle; a machine learning circuit that determines at least one augmentation parameter by processing at least one input relating to at least one of the rider and the vehicle; and a reality augmentation circuit that, responsive to the at least one of the location or the orientation of the vehicle, generates an augmentation element for presenting in the display, the generating based at least in part on the at least one augmentation parameter.
System and method for implementing pedestrian avoidance strategies for a mobile robot
A system and method for implementing pedestrian avoidance strategies for a mobile robot that include receiving position data of a pedestrian and the mobile robot from systems of the mobile robot and estimating positions of the pedestrian and the mobile robot based on the position data. The system and method also include determining an expected intersection point of paths of the pedestrian and the mobile robot and an estimated time for the pedestrian to reach and cross the expected intersection point of the paths. The system and method further include implementing a pedestrian avoidance strategy based on the positions of the pedestrian and the mobile robot and the expected point in time when the pedestrian will reach and cross the expected intersection point of the paths.
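The expected intersection point and arrival times in this abstract amount to solving two parametric lines. The sketch below is a generic geometric illustration under assumed straight-line, constant-velocity paths, not the patented estimator; the yield margin is likewise an assumption.

```python
def path_intersection(p_robot, v_robot, p_ped, v_ped):
    """Solve p_robot + t_r * v_robot == p_ped + t_p * v_ped for two 2-D
    straight-line paths. Returns (crossing_point, t_robot, t_ped), or
    None when the paths are parallel and never cross."""
    (rx, ry), (rvx, rvy) = p_robot, v_robot
    (px, py), (pvx, pvy) = p_ped, v_ped
    det = -rvx * pvy + pvx * rvy  # determinant of [v_robot, -v_ped]
    if abs(det) < 1e-9:
        return None  # parallel paths: no expected intersection point
    dx, dy = px - rx, py - ry
    t_robot = (-dx * pvy + pvx * dy) / det  # Cramer's rule
    t_ped = (rvx * dy - rvy * dx) / det
    point = (rx + t_robot * rvx, ry + t_robot * rvy)
    return point, t_robot, t_ped

def should_yield(t_robot, t_ped, margin=2.0):
    """Trigger an avoidance strategy when robot and pedestrian would
    arrive at the crossing point within `margin` seconds of each other."""
    return abs(t_robot - t_ped) < margin
```

For a robot heading east at 1 m/s from the origin and a pedestrian heading north at 1 m/s from (5, -5), both reach the crossing point (5, 0) at t = 5 s, so an avoidance strategy would be warranted.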
User interface for mission generation of area-based operation by autonomous robots in a facility context
A system and a method are disclosed that generate for display to a remote operator a user interface comprising a map, the map comprising visual representations of a source area, a plurality of candidate robots, and a plurality of candidate destination areas. The system receives, via the user interface, a selection of a visual representation of a candidate robot of the plurality of candidate robots, and detects a drag-and-drop gesture within the user interface of the visual representation of the candidate robot being dragged-and-dropped to a visual representation of a candidate destination area of the plurality of candidate destination areas. Responsive to detecting the drag-and-drop gesture, the system generates a mission, where the mission causes the candidate robot to autonomously transport an object from the source area to the candidate destination area.
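The drop-to-mission flow can be pictured as a small event handler: validate the drop target against the known destination areas, then emit a transport mission. This is a back-end sketch with hypothetical names (`Mission`, `on_drop`), not the disclosed user-interface implementation.

```python
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Mission:
    robot_id: str
    source_area: str
    destination_area: str

def on_drop(robot_id: str, source_area: str, drop_target: str,
            valid_destinations: Set[str]) -> Optional[Mission]:
    """Handle a drag-and-drop of a robot icon onto a map area: if the
    target is a valid candidate destination, generate a mission that
    sends the robot to transport an object from source to destination."""
    if drop_target not in valid_destinations:
        return None  # dropped outside any candidate destination area
    return Mission(robot_id, source_area, drop_target)
```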