G05D1/0038

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD, AND PROGRAM
20230052360 · 2023-02-16

A user terminal generates a virtual drone camera image, an estimated image of the drone's planned landing position as it would appear if captured by a virtual camera mounted on the drone, on the basis of an image of the planned landing position captured by the user terminal, and transmits the generated virtual drone camera image to the drone. The drone collates the virtual drone camera image with the image captured by its own drone camera and lands at the planned landing position identified in that image. To generate the virtual drone camera image, the user terminal derives a corresponding-pixel positional relationship formula indicating the correspondence between each pixel position in the user terminal's captured image and the matching pixel position in the virtual drone camera image, and applies the derived formula.
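For a planar landing surface, a pixel correspondence of the kind the abstract describes is commonly expressed as a 3×3 planar homography. A minimal sketch under that assumption (the homography `H` and all names here are illustrative, not taken from the patent):

```python
import numpy as np

def map_pixel(H, u, v):
    """Map a pixel (u, v) in the source (terminal) image to the target
    (virtual camera) image via a 3x3 planar homography H."""
    p = H @ np.array([u, v, 1.0])   # homogeneous coordinates
    return p[0] / p[2], p[1] / p[2]

def warp_points(H, points):
    """Apply the same correspondence to an (N, 2) array of pixels."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# The identity homography leaves pixel positions unchanged.
H = np.eye(3)
print(map_pixel(H, 100.0, 50.0))
```

In practice `H` would be estimated from the terminal's pose relative to the landing plane and the assumed pose of the virtual drone camera; the division by the third homogeneous coordinate is what makes the mapping perspective-correct.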

WORK MACHINE AND REMOTE CONTROL SYSTEM FOR WORK MACHINE

A work machine includes: a vehicle body; a first imaging device that is disposed in the vehicle body and images a first imaging range; a second imaging device that is disposed in the vehicle body and images a second imaging range; and a communication device that transmits a first image in the first imaging range and a second image in the second imaging range to a remote place. At least a part of the second imaging range is set below the first imaging range.

Robotic Source Detection Device And Method
20230051111 · 2023-02-16

An autonomous robotic vehicle is capable of detecting, identifying, and locating the source of gas leaks such as methane. Because of the number of operating components within the vehicle, it may also be considered a robotic system. The robotic vehicle can be remotely operated or can move autonomously within a jobsite. The vehicle selectively deploys a source detection device that precisely locates the source of a leak. The vehicle relays data to stakeholders and remains powered in a manner that enables operation over an extended period. Monitoring and control of the vehicle are enabled through a software interface viewable by a user on a mobile communications device or personal computer.

AUTONOMOUS TRANSPORT VEHICLE WITH VISION SYSTEM
20230050980 · 2023-02-16

An autonomous guided vehicle includes a frame, a drive section, a payload handler, a sensor system, and a supplemental sensor system. The sensor system has electro-magnetic sensors, each responsive to interaction of a sensor-emitted or sensor-generated electro-magnetic beam or field with a physical characteristic; the beam or field is disturbed by that interaction, and detection of the disturbance effects sensing of the physical characteristic. The sensor system generates sensor data embodying at least one of vehicle navigation pose or location information and payload pose or location information. The supplemental sensor system supplements the sensor system and is, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and a payload pose or location, supplementing the information of the sensor system.

ROBOTIC CLEANER
20230046417 · 2023-02-16

A robotic cleaning system may include a robotic cleaner configured to generate a map of an environment and a mobile device configured to communicatively couple to the robotic cleaner, the robotic cleaner being configured to communicate the map to the mobile device. The mobile device may include a camera configured to generate an image of the environment, the image comprising a plurality of pixels, and a display configured to display the image and to receive a user input while displaying the image, the user input being associated with one or more of the plurality of pixels. The mobile device may further include a depth sensor configured to generate depth data associated with each pixel of the image, an orientation sensor configured to generate orientation data associated with each pixel of the image, and a mobile controller configured to localize the mobile device within the map using the depth data and the orientation data.
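One way depth and orientation data can anchor a pixel to a map, sketched under invented assumptions (the intrinsics matrix `K` and the identity pose are illustrative stand-ins, not the patent's method): a user-selected pixel with a measured depth is back-projected through the camera model, then rotated by the device orientation into map coordinates.

```python
import numpy as np

def backproject(u, v, depth, K, R, t):
    """Back-project pixel (u, v) with measured depth into map
    coordinates, given camera intrinsics K, device orientation R,
    and device position t."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # camera-frame ray
    p_cam = ray * depth                             # scale by measured depth
    return R @ p_cam + t                            # rotate into map frame

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
# Principal-point pixel at 2 m depth with identity pose: straight ahead.
print(backproject(320.0, 240.0, 2.0, K, np.eye(3), np.zeros(3)))
```

Repeating this for several pixels with known map correspondences is one route to solving for the device pose `R`, `t`, i.e. localizing the mobile device within the cleaner's map.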

AERIAL MARINE DRONE SYSTEM AND METHOD
20230046127 · 2023-02-16

A marine drone system utilizing an unmanned aerial vehicle to provide visual feedback on conditions including temperature, depth, and conditions that may suggest favorable fishing, such as weed lines, flotsam, breaks, and objects such as birds or fish. The system utilizes a plurality of sensors including, but not limited to, cameras, lasers, GPS, radar, and LIDAR. The visual feedback may be shown as a video feed or a map, wherein the feedback is shown against a visual background and an overlay of interactive functions provides information regarding the conditions. The system also includes method steps for implementing, obtaining, and displaying the information. The system hardware includes the unmanned aerial vehicle, a base station, and a hardwired tether between the unmanned aerial vehicle and the base station providing power and bi-directional data transfer.

Information processing apparatus
11580468 · 2023-02-14

An information processing apparatus includes an accumulation unit that accumulates history information regarding a flight history of an aircraft. A specification unit, based on the history information accumulated by the accumulation unit, specifies a candidate to be an operation assistant who assists an operation planned by an assisted operator. An output unit outputs information regarding the candidate to be the operation assistant specified by the specification unit.

Systems and methods for utilizing images to determine the position and orientation of a vehicle

Described are systems and methods to utilize images to determine the position and/or orientation of a vehicle (e.g., an autonomous ground vehicle) operating in an unstructured environment (e.g., environments such as sidewalks which are typically absent lane markings, road markings, etc.). The described systems and methods can determine the vehicle's position and orientation based on an alignment of annotated images captured during operation of the vehicle with a known annotated reference map. The translation and rotation applied to obtain alignment of the annotated images with the known annotated reference map can provide the position and the orientation of the vehicle.
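A simplified 2-D illustration of the recovery step, not the patent's implementation: if aligning the vehicle's annotated image to the reference map required a rotation `theta` and a translation `(tx, ty)`, those same parameters directly yield the vehicle's heading and map-frame position (assuming, for this sketch, that the vehicle sits at the image origin facing along the image x-axis).

```python
import math

def vehicle_pose(theta, tx, ty):
    """Recover vehicle pose (x, y, heading) from the rotation theta and
    translation (tx, ty) that aligned the vehicle image to the map."""
    c, s = math.cos(theta), math.sin(theta)
    # The vehicle's image origin (0, 0) lands at (tx, ty) in the map,
    # and the image-frame forward axis (1, 0) rotates to (c, s).
    return tx, ty, math.atan2(s, c)

# Alignment required a 90-degree rotation plus a (3, 4) shift:
print(vehicle_pose(math.pi / 2, 3.0, 4.0))
```

In the full 3-D, real-world setting the alignment would typically be found by registering annotated features (curb lines, building edges, and the like) against the annotated reference map, with the optimal rigid transform then read off as the pose estimate.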

Transferring data from autonomous vehicles
11580687 · 2023-02-14

A system includes at least one imaging sensor and a processor. The processor is configured to acquire, using the imaging sensor, detected data describing an environment of an autonomous vehicle. The processor is further configured to derive reference data, which describe the environment, from a predefined map, to compute difference data representing a difference between the detected data and the reference data, and to transfer the difference data. Other embodiments are also described.
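The bandwidth-saving idea can be sketched as transmitting only what differs between the detections and the predefined map. The dictionary-of-landmarks representation below is an invented stand-in for whatever environment description the system actually uses:

```python
def difference_data(detected, reference):
    """Return only detections that differ from the predefined map:
    objects that are new or changed, plus map objects no longer seen."""
    added = {k: v for k, v in detected.items() if reference.get(k) != v}
    removed = [k for k in reference if k not in detected]
    return {"added_or_changed": added, "removed": removed}

reference = {"sign_12": (10.0, 4.2), "lane_3": (0.0, 1.5)}
detected  = {"sign_12": (10.0, 4.2), "cone_7": (8.5, 2.0)}
print(difference_data(detected, reference))
```

Transferring only this difference, rather than the full sensor description of the environment, is what keeps the transmitted payload small when detections mostly agree with the map.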

System and method for interception and countering unmanned aerial vehicles (UAVS)

Systems, devices, and methods for identifying a target aerial vehicle, deploying an interceptor aerial vehicle comprising at least one effector, maneuvering the interceptor aerial vehicle to a position to engage a target aerial vehicle, deploying the at least one effector to intercept the target aerial vehicle, and confirming that the target aerial vehicle has been intercepted.