B64U101/30

Autonomous robotic navigation in storage site

A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures, using various object detection and image segmentation techniques, to navigate through the storage site. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images as it moves along the path and analyzes them to determine its current location on the path by counting the regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
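The counting scheme described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the detection labels, frame format, and one-upright-per-column assumption are all hypothetical, and the per-frame detections are assumed to come from an upstream object detector.

```python
# Hypothetical sketch: localize the robot along an aisle by counting the
# rack uprights ("vertical bars") it has passed. A new upright is counted
# only when it appears after the previous one has left the frame.

def update_position(bars_passed, detections, bar_in_view):
    """Advance the passed-bar count on a rising edge of a vertical-bar detection."""
    has_bar = any(d["label"] == "vertical_bar" for d in detections)
    if has_bar and not bar_in_view:
        bars_passed += 1  # a new upright entered view
    return bars_passed, has_bar

def at_target(bars_passed, target_column):
    # Assumes one upright per rack column, so the count is the column index.
    return bars_passed >= target_column
```

A usage loop would feed each captured frame's detections through `update_position` and stop the robot once `at_target` becomes true.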

Method and apparatus for controlling a crane, an excavator, a crawler-type vehicle or a similar construction machine

The present invention generally relates to the camera-assisted control of material transfer machines and/or construction machines. The invention in particular relates to a method and an apparatus for controlling a material transfer machine and/or a construction machine, in particular a crane, an excavator, or a crawler-type vehicle, wherein an image of the piece of working equipment is provided to a machine operator and/or to a machine controller by an imaging sensor. The invention furthermore relates to the material transfer machine and/or construction machine itself, in particular a crane, having a display apparatus for displaying an image of the piece of working equipment and/or of the environment of the piece of working equipment. It is proposed to use a remote-controlled aerial drone equipped with at least one imaging sensor, by means of which the desired image of the piece of working equipment and/or of the equipment environment can be provided from different directions of view.

Noise cancellation for aerial vehicle
11670274 · 2023-06-06

A noise cancellation system for an unmanned aerial vehicle (UAV) may have an audio capture module, a metadata module, and a filter. The audio capture module may be configured to receive an audio signal captured from a microphone, e.g., on a camera. The metadata module may be configured to retrieve noise information associated with noise-generating components operating on the UAV. The filter may be configured to receive the audio signal and the noise information from the audio capture module. The filter also may be configured to retrieve a baseline profile from a database based on the noise information. The baseline profile includes noise parameters used to filter out the audio frequencies in the audio signal that correspond to the noise-generating components. The filter may generate a filtered audio signal for output.
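The profile-lookup-and-filter flow might look like the following sketch. The profile table, component names, and the simple spectral-zeroing filter are assumptions for illustration; a real system would use a more sophisticated adaptive filter.

```python
# Illustrative sketch: look up a baseline noise profile keyed by the active
# noise-generating components, then suppress the matching frequency bands
# of the captured audio via a forward/inverse FFT.
import numpy as np

BASELINE_PROFILES = {                       # hypothetical profile database
    ("rotor",): {"bands_hz": [120, 240]},   # rotor hum and first harmonic
}

def filter_audio(signal, sample_rate, components):
    profile = BASELINE_PROFILES[tuple(sorted(components))]
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for f in profile["bands_hz"]:
        spectrum[np.abs(freqs - f) < 5.0] = 0  # zero a 10 Hz band around f
    return np.fft.irfft(spectrum, n=len(signal))
```

A 120 Hz hum is removed while content at other frequencies (e.g., 300 Hz) passes through essentially unchanged.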

Systems and methods of alarm triggered equipment verification using drone deployment of sensors

An equipment monitoring system includes an unmanned vehicle, an alarm circuit remote from the unmanned vehicle, and a vehicle control circuit remote from the unmanned vehicle. The unmanned vehicle can include a communications circuit and a flight controller. The alarm circuit detects a failure condition of a building component and outputs an indication of the failure condition. The vehicle control circuit receives the indication of the failure condition from the alarm circuit; generates, based on the indication of the failure condition, an equipment verification signal that includes an identifier of the building component, a position of the building component, and a test of the building component to be executed; and transmits the equipment verification signal to the flight controller of the unmanned vehicle via the communications circuit of the unmanned vehicle to cause the unmanned vehicle to execute the test of the building component.
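The signal flow above can be sketched in a few lines. The dict-based message format, the test catalog, and the `FlightController` queue are hypothetical stand-ins for the circuits the abstract describes.

```python
# Minimal sketch: on a failure indication from the alarm circuit, the
# vehicle control circuit builds an equipment verification signal (component
# identifier, position, and test to execute) and sends it to the drone's
# flight controller.

def make_verification_signal(failure, test_catalog):
    """failure: {'component_id', 'position'}; test_catalog maps a
    component id to the test the drone should execute on it."""
    return {
        "component_id": failure["component_id"],
        "position": failure["position"],
        "test": test_catalog[failure["component_id"]],
    }

class FlightController:
    def __init__(self):
        self.queued = []

    def receive(self, signal):
        # The drone would fly to signal["position"] and run signal["test"].
        self.queued.append(signal)
```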

Data acquisition, processing, and output generation for use in analysis of one or a collection of physical assets of interest

Various examples are provided for data acquisition, processing, and output generation for use in analysis of a physical asset or a collection of physical assets of interest. In one example, a method includes providing a user information goal that specifies user information for acquisition, processing, or output of data associated with a physical asset or collection of physical assets; and evaluating existing database information to determine whether all or part of the user information goal can be substantially completed by retrieval and processing of an information set obtainable from the existing database information. If the user information goal cannot be substantially completed using the information set, a data acquisition plan configured to acquire the data needed to substantially complete the user information goal can be generated. If the user information goal can be substantially completed using the information set, the information set can be processed to provide an output.
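The branch described above reduces to a simple coverage check. This sketch treats a goal as a list of required fields and the database as a set of available fields; both representations are assumptions for illustration.

```python
# Hedged sketch: if every field the user information goal needs is already
# in the database, process it; otherwise emit a data acquisition plan
# covering only the missing fields.

def plan_or_process(goal_fields, database_fields):
    missing = [f for f in goal_fields if f not in database_fields]
    if missing:
        return {"action": "acquire", "fields": missing}
    return {"action": "process", "fields": list(goal_fields)}
```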

Control method, control device, and carrier system

A control method includes acquiring motion information of a tracked object and, based on the motion information, controlling movement of a carrying device that carries a tracking device. The movement enables the tracking device to track the tracked object while keeping approximately unchanged the angle between the moving direction of the tracked object and the line connecting the tracked object and the tracking device as the tracked object moves.
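The fixed-angle constraint can be illustrated with a simplified 2-D geometry. The function below is an assumed sketch, not the patented controller: given the target's position and heading, it places the tracker on a ray rotated from the heading by the desired angle.

```python
# Geometric sketch (2-D simplification): keep the tracker at a fixed
# distance from the target such that the angle between the target's heading
# and the target-to-tracker line stays at `angle_rad`.
import math

def tracker_position(target_pos, heading_rad, angle_rad, distance):
    direction = heading_rad + angle_rad  # rotate heading by the held angle
    return (target_pos[0] + distance * math.cos(direction),
            target_pos[1] + distance * math.sin(direction))
```

Re-evaluating this each time new motion information arrives keeps the viewing angle approximately constant as the target moves.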

Automatic terrain evaluation of landing surfaces, and associated systems and methods

Automatic terrain evaluation of landing surfaces, and associated systems and methods are disclosed herein. A representative method includes receiving a request to land a movable object and, in response to the request, identifying a target landing area on a landing surface based on at least one image of the landing surface obtained by the movable object. The method can further include directing the movable object to land at the target landing area.
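One plausible way to identify such a target landing area is to rank candidate regions of the imaged surface by flatness. The height-map representation and the roughness metric below are assumptions; the abstract does not specify the evaluation criterion.

```python
# Illustrative sketch: among candidate landing cells derived from imagery
# of the landing surface, pick the one with the smallest height spread
# (i.e., the flattest cell).

def pick_landing_cell(height_map):
    """height_map: dict mapping (row, col) -> list of height samples."""
    def roughness(cell):
        samples = height_map[cell]
        return max(samples) - min(samples)
    return min(height_map, key=roughness)
```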

Systems and methods for autonomous marking maintenance
11726478 · 2023-08-15

A marking maintenance system comprises a marking database, a drone, and a data network communicatively coupled to the marking database and the drone. The marking database is arranged to store marking data associated with one or more markings. The marking data can include one or more marking locations within a geographic area and a type of infrastructure associated with each of the one or more markings. The drone is arranged to determine its location via one or more location sensors, receive data from the marking database, and deploy to each marking location within a portion of the geographic area. The drone is also arranged to determine, using one or more marker sensors, whether each marking within the portion of the geographic area is sufficiently present, and to repair each marking within that portion that is determined not to be sufficiently present.
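The inspect-and-repair loop can be sketched as below. The callables standing in for the marker sensor and the repair actuator, and the dict-shaped marking records, are hypothetical.

```python
# Sketch of the drone's maintenance pass: visit each marking location,
# check presence with the marker sensor, and reapply any marking that is
# not sufficiently present.

def maintain_markings(markings, is_present, repair):
    """markings: [{'location', 'type'}]; is_present and repair wrap the
    marker sensor and repair mechanism respectively."""
    repaired = []
    for m in markings:
        if not is_present(m["location"]):       # marker sensor check
            repair(m["location"], m["type"])    # reapply the marking
            repaired.append(m["location"])
    return repaired
```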

Drone-assisted sensor mapping

Methods, systems, and apparatus for drone-assisted sensor mapping are disclosed. A method includes detecting a sensor in a detection area of a drone; based on detecting the sensor in the detection area of the drone, detecting the drone in sensor data captured by the sensor; determining a detection area of the sensor based on movement of the drone after the drone is detected; and determining a destination for the drone based on the detection area of the sensor. The method may include mapping boundaries of the detection area of the sensor to a map of an area where the sensor is located. The sensor can be a passive infrared sensor, an active infrared sensor, a radar sensor, a sonar sensor, a time of flight sensor, a structured light sensor, or a lidar sensor.
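The mapping step can be sketched as a sweep over candidate cells. The grid representation and the `sensor_detects` predicate (standing in for "the drone appears in the sensor's captured data at this position") are assumptions.

```python
# Hedged sketch: fly the drone through a set of candidate cells and keep
# the cells where the sensor's data shows the drone. The kept cells
# approximate the sensor's detection area on the site map.

def map_detection_area(candidate_cells, sensor_detects):
    return {cell for cell in candidate_cells if sensor_detects(cell)}
```

A planner would then choose the drone's next destination at the edge of the mapped region to refine the boundary.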

Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle

An imaging control method for an unmanned aerial vehicle ("UAV") includes receiving a video capturing instruction from a control terminal, determining a combined action mode based on an imaging scene, and capturing a video based on the combined action mode. The combined action mode includes a plurality of action modes that are different from each other, and the combined action mode specifies at least one of: a type of each of the plurality of action modes, an arrangement order of the plurality of action modes, a flight trajectory of each of the plurality of action modes, a composition rule for each of the plurality of action modes, or a flight duration of each of the plurality of action modes.
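A combined action mode can be modeled as an ordered list of distinct per-mode specifications. The scene presets, mode names, and field names below are invented for illustration only.

```python
# Sketch: a combined action mode selected per imaging scene, where each
# action mode carries its own composition rule and flight duration, and the
# modes within one combined mode must differ from each other.

SCENE_PRESETS = {  # hypothetical scene-to-combined-mode table
    "landmark": [
        {"mode": "circle",    "duration_s": 8, "composition": "centered"},
        {"mode": "pull_away", "duration_s": 5, "composition": "rule_of_thirds"},
    ],
}

def combined_action_mode(scene):
    modes = SCENE_PRESETS[scene]
    names = [m["mode"] for m in modes]
    assert len(names) == len(set(names)), "action modes must differ from each other"
    return modes

def capture_video(scene):
    # Execute each action mode in its arrangement order.
    return [f"fly {m['mode']} for {m['duration_s']}s" for m in combined_action_mode(scene)]
```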