G05D1/0044

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD, AND PROGRAM
20230052360 · 2023-02-16

A user terminal generates a virtual drone camera image: an estimated image of a drone's planned landing position as it would appear if captured by a virtual camera mounted on the drone. The estimate is derived from an image of the planned landing position captured by the user terminal itself, and the resulting virtual drone camera image is transmitted to the drone. The drone collates the virtual drone camera image with the image captured by its own camera and lands at the planned landing position identified in that image. To generate the virtual image, the user terminal derives a corresponding pixel positional relationship formula that maps each pixel position in the user terminal's captured image to a pixel position in the virtual drone camera image.
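Such a corresponding pixel positional relationship formula could, under a planar-ground assumption, take the form of a homography between the two camera views. The sketch below is illustrative only; the abstract does not specify this method, and the point correspondences between the two images are assumed to be known. It estimates the homography from point pairs via the direct linear transform (DLT) and maps a user-terminal pixel into the virtual drone camera image:

```python
import numpy as np

def homography_dlt(src_pts, dst_pts):
    """Estimate a 3x3 homography H (dst ~ H @ src) from >= 4 point pairs via DLT."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: last row of Vt from the SVD.
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_pixel(H, x, y):
    """Map a pixel from the user-terminal image into the virtual camera image."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With four or more correspondences (e.g. corners of the marked landing area), `map_pixel` plays the role of the relationship formula applied per pixel.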

OBTAINING AND AUGMENTING AGRICULTURAL DATA AND GENERATING AN AUGMENTED DISPLAY SHOWING ANOMALIES

A geographic position of an agricultural machine is captured. Agricultural data corresponding to the geographic position is received. Georeferenced visual indicia indicative of the received agricultural data are displayed.

SYSTEM AND METHOD FOR PROVIDING SITUATIONAL AWARENESS INTERFACES FOR AUTONOMOUS VEHICLE OPERATORS

A supervisory control system is disclosed that provides an operator situational awareness interface for use in monitoring a plurality of automated vehicles (AVs). The system is configured to: generate a map of a geographical area of interest; obtain location data and perceived risk data for a plurality of AVs in the geographical area; generate a vehicle icon corresponding to each AV; position the vehicle icon for each AV on the map based on the location data for the corresponding AV; apply a color coding to each vehicle icon based on a perceived risk level for the corresponding AV; and signal a display device to display an AV fleet map graphic that includes the color-coded vehicle icons positioned on the map. The system may be further configured to generate an AV servicing queue graphic that displays vehicle icons in an order based on a determined servicing priority.
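As a purely illustrative sketch (the thresholds, field names, and use of risk as the servicing-priority key are assumptions, not taken from the disclosure), the color coding and queue ordering could look like this:

```python
def risk_color(risk: float) -> str:
    """Map a perceived risk level in [0, 1] to an icon color.
    Thresholds are hypothetical; the abstract does not specify values."""
    if risk < 0.3:
        return "green"
    if risk < 0.7:
        return "yellow"
    return "red"

def servicing_queue(fleet):
    """Order vehicles by descending perceived risk, used here as a
    simple proxy for the determined servicing priority."""
    return sorted(fleet, key=lambda av: av["risk"], reverse=True)
```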

Robotic Source Detection Device And Method
20230051111 · 2023-02-16

An autonomous robotic vehicle is capable of detecting, identifying, and locating the source of gas leaks such as methane. Because of the number of operating components within the vehicle, it may also be considered a robotic system. The robotic vehicle can be remotely operated or can move autonomously within a jobsite. The vehicle selectively deploys a source detection device that precisely locates the source of a leak. The vehicle relays data to stakeholders and remains powered in a manner that enables operation over an extended period. Monitoring and control of the vehicle are enabled through a software interface viewable to a user on a mobile communications device or personal computer.

ROBOTIC CLEANER
20230046417 · 2023-02-16

A robotic cleaning system may include a robotic cleaner configured to generate a map of an environment and a mobile device configured to communicatively couple to the robotic cleaner, the robotic cleaner configured to communicate the map to the mobile device. The mobile device may include a camera configured to generate an image of the environment, the image comprising a plurality of pixels, a display configured to display the image and to receive a user input while displaying the image, the user input being associated with one or more of the plurality of pixels, a depth sensor configured to generate depth data that is associated with each pixel of the image, an orientation sensor configured to generate orientation data that is associated with each pixel of the image, and a mobile controller configured to localize the mobile device within the map using the depth data and the orientation data.
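A minimal, purely illustrative sketch of how depth and orientation data could place a user-selected pixel into map coordinates, reduced to 2-D for clarity (the pose, yaw, and depth inputs are assumptions; the abstract does not prescribe this projection):

```python
import math

def pixel_to_map(device_pose, yaw, depth):
    """Project a selected pixel into map coordinates by stepping out from
    the device's (x, y) pose along its viewing direction (yaw, radians)
    by the depth measured for that pixel."""
    x, y = device_pose
    return x + depth * math.cos(yaw), y + depth * math.sin(yaw)
```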

Transferring data from autonomous vehicles
11580687 · 2023-02-14

A system includes at least one imaging sensor and a processor. The processor is configured to acquire, using the imaging sensor, detected data describing an environment of an autonomous vehicle. The processor is further configured to derive reference data, which describe the environment, from a predefined map, to compute difference data representing a difference between the detected data and the reference data, and to transfer the difference data. Other embodiments are also described.
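If the detected and reference descriptions are represented as sets of occupied map cells, the difference data reduces to two set differences. This is a hedged sketch; the patent does not prescribe a data structure, and the set-of-cells representation is an assumption:

```python
def difference_data(detected: set, reference: set):
    """Split sensed occupancy cells into what is newly observed vs. what
    the predefined map predicted but was not observed; only these
    differences need to be transferred, not the full sensor data."""
    added = detected - reference      # present in the environment, absent from the map
    removed = reference - detected    # in the map, but no longer observed
    return added, removed
```

Transferring only `added` and `removed` rather than the raw detected data is what keeps the bandwidth requirement low when the environment mostly matches the map.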

Mapping, controlling, and displaying networked devices with a mobile cleaning robot

A mobile cleaning robot includes a drive system configured to navigate around an operational environment, a ranging device configured to communicate with other ranging devices of respective electronic devices in the operational environment, and processors in communication with the ranging device. The processors are configured to receive a distance measurement from each of the electronic devices present in the operational environment, each distance measurement representing a distance between the mobile cleaning robot and a respective electronic device; tag each distance measurement with location data indicative of the spatial location of the mobile cleaning robot in the operational environment; determine the spatial locations of each of the electronic devices; and populate a visual representation of the operational environment with visual indications of the electronic devices.
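Determining a device's spatial location from distance measurements tagged with known robot positions is a multilateration problem. A minimal 2-D least-squares sketch follows; the linearization (subtracting the first range equation from the others) and the coordinate representation are illustrative assumptions, not the disclosed method:

```python
import numpy as np

def locate_device(robot_positions, distances):
    """Least-squares 2-D multilateration: solve for a device's (x, y) from
    distances measured at several known robot positions."""
    (x0, y0), d0 = robot_positions[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(robot_positions[1:], distances[1:]):
        # Subtracting range equation 0 from equation i cancels the
        # quadratic terms, leaving a linear system in (x, y).
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y) of the device
```

Three or more non-collinear measurement positions suffice in 2-D; extra measurements are averaged out by the least-squares solve.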

Method for controlling an autonomous mobile robot
11709497 · 2023-07-25

A method for controlling an autonomous mobile robot for carrying out a task in a local region of the robot's area of application. According to one embodiment, the method comprises the following steps: positioning the robot in a starting position within the area of application; detecting information relating to the surroundings of the robot by means of at least one sensor; selecting a region with a determined basic geometric shape; and automatically determining, based on the detected information about the surroundings, at least one of the following two parameters of the selected region: its size and its position (including orientation/alignment).

MOBILE ROBOT, MOVEMENT CONTROL SYSTEM, AND MOVEMENT CONTROL METHOD

A mobile robot includes a receiving unit that receives designation of a destination region including a destination, a moving unit that moves toward the destination region, and a seeking unit that seeks a client after movement toward the destination region starts. The moving unit moves toward the sought client.

Moving apparatus for cleaning, collaborative cleaning system, and method of controlling the same

The disclosure relates to a moving apparatus for cleaning, a collaborative cleaning system, and a method of controlling the same. The moving apparatus includes: a cleaner configured to perform cleaning; a traveler configured to move the apparatus; a communicator configured to communicate with an external apparatus; and a processor configured to identify, based on information received through the communicator, the individual cleaning region corresponding to the apparatus among a plurality of individual cleaning regions assigned, according to current locations throughout the whole cleaning region, to the apparatus and at least one other moving apparatus, and to control the traveler and the cleaner to travel to and clean the identified region. Individual cleaning regions are thus assigned based on the location information of the plurality of cleaning robots, and collaborative cleaning is carried out efficiently with a shortened total cleaning time.
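One hypothetical way to assign individual cleaning regions based on current robot locations is a nearest-robot rule over region centroids; the sketch below is illustrative only, as the disclosure does not specify an assignment strategy:

```python
def assign_regions(robot_positions, region_centroids):
    """Assign each cleaning region to the nearest robot (by squared
    distance to the region centroid), so travel before cleaning starts
    is short. Returns {robot_index: [region_index, ...]}."""
    assignments = {i: [] for i in range(len(robot_positions))}
    for r, (cx, cy) in enumerate(region_centroids):
        nearest = min(
            range(len(robot_positions)),
            key=lambda i: (robot_positions[i][0] - cx) ** 2
                          + (robot_positions[i][1] - cy) ** 2,
        )
        assignments[nearest].append(r)
    return assignments
```

A greedy nearest-robot rule does not balance workload across robots; a production assignment would likely also weigh region area against each robot's remaining battery, which the abstract leaves open.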