Patent classifications
G05D1/0038
METHOD AND APPARATUS FOR UAV AND UAV CONTROLLER GROUP MEMBERSHIP UPDATE
In the method, an unmanned aerial system application enabler (UAE) server can determine that a first UAV (UAV-1) is to be replaced with a second UAV (UAV-2) based on a received request. The UAV-2 is recognized by the UAE server based on a Civil Aviation Authority (CAA) level identity (ID) of the UAV-2. A request to perform a group membership update is sent by the UAE server to a SEAL group management (GM) server. The group membership update replaces the UAV-1 with the UAV-2. A response message is received by the UAE server from the SEAL GM server. The request to perform the group membership update includes (i) an ID of a UAE client that corresponds to the group of the UAV-1 and the UAV controller (UAV-C), (ii) a user equipment (UE) ID of the UAV-1, (iii) a UE ID of the UAV-2, and (iv) the CAA-level ID of the UAV-2.
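The four elements (i)-(iv) of the group membership update request can be sketched as a simple data structure; the field and function names below are illustrative assumptions, not taken from the 3GPP UAE/SEAL specifications.

```python
from dataclasses import dataclass

@dataclass
class GroupMembershipUpdateRequest:
    # Hypothetical field names; the spec defines the actual message format.
    uae_client_id: str   # (i) ID of the UAE client for the UAV-1/UAV-C group
    uav1_ue_id: str      # (ii) UE ID of the UAV being replaced
    uav2_ue_id: str      # (iii) UE ID of the replacement UAV
    uav2_caa_id: str     # (iv) CAA-level ID of the replacement UAV

def build_replacement_request(uae_client_id, uav1_ue_id, uav2_ue_id, uav2_caa_id):
    """Assemble the request the UAE server sends to the SEAL GM server."""
    return GroupMembershipUpdateRequest(uae_client_id, uav1_ue_id, uav2_ue_id, uav2_caa_id)
```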
Work system, work machine, and control method
A work system includes an operation device that transmits an operation signal, a work machine that operates on the basis of the operation signal, and a transport vehicle that outputs a traveling control signal when a fault occurs in communication between the operation device and the work machine.
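The fault-handover behavior described above can be read as a simple signal-selection rule; the safe-stop fallback shown here is an illustrative assumption about what the transport vehicle's traveling control signal might do.

```python
def traveling_control(comm_ok: bool, operator_signal: str, fallback_signal: str) -> str:
    """While operation-device-to-work-machine communication is healthy, the
    operator signal drives the machine; on a communication fault, the
    transport vehicle substitutes its own traveling control signal."""
    return operator_signal if comm_ok else fallback_signal
```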
Method and controller for the situational transmission of surroundings information of a vehicle
The disclosure relates to a method for the situational transmission of surroundings data of a vehicle, comprising the steps: determining a driving situation of the vehicle; determining relevant surroundings data from recorded surroundings data based on the driving situation, wherein the relevant surroundings data are a subset of the recorded surroundings data that are relevant to the driving situation; and providing the relevant surroundings data to an external data processing device.
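The filtering step, selecting the situation-relevant subset of recorded surroundings data, can be sketched as below; the mapping from driving situation to relevant sensor channels is an assumption for demonstration only.

```python
# Hypothetical situation-to-sensor relevance table (illustrative).
RELEVANCE = {
    "parking": {"ultrasonic", "rear_camera"},
    "highway": {"radar", "front_camera"},
}

def relevant_surroundings(recorded: dict, situation: str) -> dict:
    """Return the subset of recorded surroundings data relevant to the
    driving situation, ready to be sent to the external processing device."""
    keep = RELEVANCE.get(situation, set())
    return {channel: data for channel, data in recorded.items() if channel in keep}
```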
Following control method, control terminal, and unmanned aerial vehicle
The present disclosure provides a following control method. The method includes receiving and displaying an acquired image acquired by an imaging device of an unmanned aerial vehicle (UAV); detecting a user's selection operation on two or more objects in the image; determining a following instruction based on the detected selection operation; and controlling the UAV to follow the two or more followed objects indicated by the following instruction so that the two or more followed objects are in an imaging frame of the imaging device.
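The two key checks in this method, requiring a selection of two or more objects and keeping all followed objects inside the imaging frame, can be sketched as follows; the dictionary layout and frame representation are illustrative assumptions.

```python
def build_following_instruction(selected_object_ids):
    """Turn the user's multi-object selection into a following instruction;
    the method requires two or more selected objects."""
    ids = list(selected_object_ids)
    if len(ids) < 2:
        raise ValueError("a following instruction needs two or more selected objects")
    return {"action": "follow", "targets": ids}

def all_targets_in_frame(target_positions, frame):
    """True when every followed object lies inside the imaging frame
    (x0, y0, x1, y1) -- the condition the UAV control loop maintains."""
    x0, y0, x1, y1 = frame
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in target_positions)
```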
Systems and methods for utilizing machine-assisted vehicle inspection to identify insurance buildup or fraud
A remotely-controlled (RC) and/or autonomously operated inspection device, such as a ground vehicle or drone, may capture one or more sets of imaging data indicative of at least a portion of an automotive vehicle, such as all or a portion of the undercarriage. The one or more sets of imaging data may be analyzed to detect data indicative of at least one of vehicle damage or a vehicle defect shown in the one or more sets of imaging data. Based upon the analyzing of the one or more sets of imaging data, damage to the vehicle or a defect of the vehicle may be identified. The identified damage or defect may be compared to a claimed damage or defect to determine whether the claimed damage or defect occurred.
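The final comparison step, checking claimed damage against damage actually identified in the imaging data, amounts to a set difference; representing damage items as string labels is an assumption for illustration.

```python
def unsupported_claims(identified_damage: set, claimed_damage: set) -> set:
    """Claimed damage items with no corroborating finding in the imaging
    data; a non-empty result may flag buildup or fraud for review."""
    return claimed_damage - identified_damage
```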
Enhanced visibility system for work machines
An enhanced visibility system for a work machine includes an image capture device, a sensor, one or more control circuits, and a display. The image capture device is configured to obtain image data of an area surrounding the work machine. The sensor is configured to obtain data regarding physical properties of the area surrounding the work machine. The control circuits are configured to receive the image data and the data regarding the physical properties, and augment the image data with the data regarding the physical properties to generate augmented image data. The display is configured to display the augmented image data to provide an enhanced view of the area surrounding the work machine.
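The augmentation step, merging the sensor's physical-property data into the image data, can be sketched as attaching annotations to the captured frame; the dictionary representation of image data is an assumption for demonstration.

```python
def augment_image_data(image_data: dict, physical_properties: dict) -> dict:
    """Combine the camera frame with sensor-derived physical properties
    (e.g. distances to obstacles) to produce the augmented image data
    shown on the display."""
    augmented = dict(image_data)  # leave the original frame untouched
    augmented["annotations"] = [
        {"property": name, "value": value}
        for name, value in physical_properties.items()
    ]
    return augmented
```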
Drone inspection analytics for asset defect detection
A set of images of a three-dimensional (3D) inspection object collected by a drone during execution of a first flight path may be received, along with telemetry data from the drone. A tagged set of images may be stored, with each tagged image being stored together with a corresponding drone position at a corresponding time that the tagged image was captured, as obtained from the telemetry data. A mapping of the set of tagged images to corresponding portions of a 3D model of the 3D inspection object may be executed, based on the telemetry data. Based on the mapping, at least one portion of the 3D inspection object omitted from the set of tagged images may be identified. A second flight path may be generated for the drone that specifies a position of the drone to capture an image of the at least one omitted portion of the 3D inspection object.
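The coverage-gap logic, finding model portions missing from the tagged image set and planning a second flight path to capture them, can be sketched as follows; representing model portions as IDs and viewpoints as coordinates is an illustrative assumption.

```python
def omitted_portions(model_portion_ids, tagged_images):
    """Portions of the 3D model not mapped to any tagged image
    (each tagged image carries the portion it covers, per the mapping)."""
    covered = {img["portion_id"] for img in tagged_images}
    return [p for p in model_portion_ids if p not in covered]

def second_flight_path(omitted, viewpoint_for):
    """One waypoint per omitted portion; viewpoint_for maps a portion to a
    drone position from which it can be imaged (assumed precomputed)."""
    return [viewpoint_for[p] for p in omitted]
```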
Inspection robot and methods thereof for responding to inspection data in real time
An inspection robot, and methods and a controller thereof are disclosed. An inspection robot may include an inspection chassis including a plurality of inspection sensors and coupled to at least one drive module to drive the robot over an inspection surface. The inspection robot may also include a controller including an inspection data circuit to interpret inspection base data, an inspection processing circuit to determine refined inspection data, and an inspection configuration circuit to determine an inspection response value in response to the refined inspection data. The controller may further include an inspection response circuit to, in response to the inspection response value, provide an inspection command value while the inspection robot is interrogating the inspection surface.
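The four controller circuits can be read as a small processing pipeline: base data in, refined data, response value, command out. The averaging step, threshold, and command names below are illustrative assumptions, not the patent's actual processing.

```python
def refine(base_readings):
    """Inspection processing circuit (sketch): smooth raw sensor base data."""
    return sum(base_readings) / len(base_readings)

def response_value(refined, threshold=0.7):
    """Inspection configuration circuit (sketch): classify the refined data."""
    return "defect" if refined > threshold else "ok"

def inspection_command(response):
    """Inspection response circuit (sketch): issue a command while the
    robot is still interrogating the surface."""
    return "slow_down_and_rescan" if response == "defect" else "continue"
```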
Systems and methods for vehicle motion control with interactive object annotation
Systems and methods for vehicle motion control with interactive object annotation are provided. A method can include obtaining data indicative of a plurality of objects within a surrounding environment of the autonomous vehicle. For example, the plurality of objects can include at least one problem object encountered by the autonomous vehicle while navigating a planned route. The method can include determining a group of objects of the plurality of objects. For example, the group of objects can include the problem object and one or more other objects in proximity to the problem object. The method can include determining a classification update to be applied to the group of objects. The method can include applying the classification update to the group of objects. The method can include providing data indicative of the classification update for the group of objects to the autonomous vehicle for use in motion planning.
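The grouping and update steps can be sketched with a simple proximity test around the problem object; the 2D positions, radius parameter, and in-place class update are illustrative assumptions.

```python
def group_near(problem, objects, radius):
    """Form the group: the problem object plus every object within
    Euclidean distance `radius` of it."""
    px, py = problem["pos"]
    group = [problem]
    for obj in objects:
        if obj is problem:
            continue
        ox, oy = obj["pos"]
        if ((ox - px) ** 2 + (oy - py) ** 2) ** 0.5 <= radius:
            group.append(obj)
    return group

def apply_classification(group, new_class):
    """Apply the classification update to every object in the group."""
    for obj in group:
        obj["class"] = new_class
```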
Work vehicle display systems and methods for generating visually-manipulated context views
A work vehicle display system utilized in piloting a work vehicle includes a display device having a display screen, a context camera mounted to the work vehicle and positioned to capture a context camera feed of the work vehicle's exterior environment, and a controller architecture. The controller architecture is configured to: (i) receive the context camera feed from the context camera; (ii) generate a visually-manipulated context view utilizing the context camera feed; and (iii) output the visually-manipulated context view to the display device for presentation on the display screen. In the process of generating the visually-manipulated context view, the controller architecture applies a dynamic distortion-perspective (D/P) modification effect to the context camera feed, while gradually adjusting a parameter of the dynamic D/P modification effect in response to changes in operator viewing preferences or in response to changes in a current operating condition of the work vehicle.
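The gradual parameter adjustment described in step (ii) can be approximated with exponential smoothing toward a target value, so the view never jumps abruptly when preferences or operating conditions change; the rate constant is an illustrative assumption.

```python
def adjust_effect_parameter(current: float, target: float, rate: float = 0.2) -> float:
    """Move the D/P modification parameter a fraction of the way toward its
    target each display frame, yielding a gradual visual transition."""
    return current + rate * (target - current)
```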