Patent classifications
G05D1/689
System and method for autonomously landing a vertical take-off and landing (VTOL) aircraft
A system for autonomously landing a Vertical Take-Off and Landing (VTOL) aircraft, comprising: a first sensor; a second sensor; and a processing resource configured to: (a) obtain, from the first sensor, first readings; (b) generate, at a first rate, based on at least part of the first readings, a 3D model of at least part of a scene visible by the first sensor; (c) obtain, from the second sensor, a plurality of second readings, enabling identifying changes within the at least part of the scene; (d) analyze at least part of the second readings, at a second rate, to obtain changes information indicative of the changes; (e) identify, using the 3D model and the changes information, potential landing areas for the aircraft; (f) generate commands to maneuver the aircraft towards a selected landing area of the potential landing areas; and (g) repeat steps (a) to (f) until landing the aircraft.
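The loop of steps (a) through (g) can be sketched as below. This is a minimal illustration only: the sensor readers, the grid-based scene model, the flatness threshold, and the change-detection output are all assumptions made for the example, not details from the patent.

```python
import random

def scan_3d(sensor):
    """Hypothetical first-sensor read: a coarse height map of the scene."""
    return [[random.uniform(0.0, 0.3) for _ in range(5)] for _ in range(5)]

def detect_changes(sensor):
    """Hypothetical second-sensor read: grid cells where change was detected."""
    return {(0, 0)}  # e.g. a moving obstacle in one corner

def find_landing_areas(height_map, changed_cells, max_height=0.2):
    """Step (e): flat cells that are not currently changing are candidates."""
    return [
        (r, c)
        for r, row in enumerate(height_map)
        for c, h in enumerate(row)
        if h <= max_height and (r, c) not in changed_cells
    ]

def landing_loop(first_sensor=None, second_sensor=None, max_iters=10):
    """Repeat steps (a)-(f) until a landing area is selected (step (g))."""
    for _ in range(max_iters):
        model = scan_3d(first_sensor)                     # (a)-(b)
        changes = detect_changes(second_sensor)           # (c)-(d)
        candidates = find_landing_areas(model, changes)   # (e)
        if candidates:
            return candidates[0]                          # (f) maneuver target
    return None
```

In a real system the two sensors would run at different rates, as the claim states; here both are polled once per loop iteration for simplicity.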
METHODS AND AUTONOMOUS ROBOTS FOR TAKING INVENTORY IN A STRUCTURE
System and method for taking inventory of a plurality of objects within a structure. The method is executed by a controller of an autonomous mobile robot and comprises causing the robot to navigate through at least a portion of the structure, causing at least one camera of the robot to acquire a plurality of positioning images at a first resolution, and determining that at least one positioning image contains an image of a predetermined landmark. In response to determining that the at least one positioning image contains the image of the predetermined landmark, the autonomous mobile robot navigates to a predetermined data collection position, the at least one camera of the autonomous mobile robot acquires at least one inventory image at a second resolution, the second resolution being greater than the first, and a plurality of inventory labels are extracted from the at least one inventory image.
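The landmark-triggered two-resolution capture can be sketched as follows. The landmark detector, the label extractor, and the image dictionaries are hypothetical stand-ins for the robot's actual vision pipeline.

```python
def contains_landmark(image, landmark):
    """Hypothetical detector: does a low-res positioning image show the landmark?"""
    return landmark in image["tags"]

def extract_labels(inventory_image):
    """Hypothetical OCR step: pull inventory labels out of a high-res image."""
    return inventory_image["labels"]

def take_inventory(positioning_images, landmark, capture_high_res):
    """Watch low-resolution frames for the landmark; on a hit, navigate to the
    data-collection position and capture at the greater second resolution."""
    labels = []
    for image in positioning_images:
        if contains_landmark(image, landmark):
            labels.extend(extract_labels(capture_high_res()))
    return labels
```

The point of the two resolutions is economy: cheap low-resolution frames drive navigation, and the expensive high-resolution capture happens only at the data-collection position.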
Moving robot, system of moving robot and method for moving to charging station of moving robot
The present disclosure relates to a moving robot, a moving robot system, and a method for moving to a charging station of the moving robot, wherein the moving robot moves to the charging station based on a reception result obtained by receiving a plurality of transmission signals transmitted from the charging station and a sensing result obtained by sensing a magnetic field state.
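One simple way to combine the two inputs named in the abstract is below: beacon signal strengths steer the robot, and the magnetic sensing result confirms docking. The two-beacon layout, the steering rule, and the thresholds are assumptions for illustration, not the patented method.

```python
def docking_step(signal_strengths, magnetic_ok, balance_tol=0.05):
    """Choose a heading from (left, right) beacon signal strengths; stop when
    the sensed magnetic field indicates the robot is seated on the charger."""
    if magnetic_ok:
        return "dock"
    left, right = signal_strengths
    if abs(left - right) < balance_tol:
        return "forward"          # signals balanced: station is ahead
    return "turn_left" if left > right else "turn_right"
```

The magnetic check is deliberately given priority over the RF signals, since near-field magnetic sensing is a more reliable indicator of physical contact than signal strength.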
System and method for providing easy-to-use release and auto-positioning for drone applications
The present disclosure provides an aerial system, comprising: a body; a lift mechanism coupled to the body; an optical system coupled to the body; and a computer system having at least one processor and at least one memory comprising first program instructions. When the first program instructions are executed by the at least one processor, the at least one processor may be configured to: receive a target operation, the target operation associated with a flight trajectory and a predefined action performed by the optical system, and execute the target operation.
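A target operation that pairs a trajectory with an optical-system action could be modeled as below. The field names, the waypoint representation, and the action string are hypothetical; the patent does not specify them.

```python
from dataclasses import dataclass

@dataclass
class TargetOperation:
    """A target operation: a flight trajectory plus a predefined camera action."""
    trajectory: list      # waypoints as (x, y, z) tuples
    optical_action: str   # e.g. "capture_photo" (hypothetical action name)

def execute(op, fly_to, do_action):
    """Fly the trajectory, then trigger the optical system's predefined action."""
    for waypoint in op.trajectory:
        fly_to(waypoint)
    return do_action(op.optical_action)
```

Injecting `fly_to` and `do_action` as callables keeps the sketch testable without flight hardware.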
Navigation system with camera assist
One embodiment is a navigation system for an aircraft including a positioning system to generate information related to a position of the aircraft, a group of cameras mounted to a body of the aircraft, each camera of the group of cameras to simultaneously capture images of a portion of an environment that surrounds the aircraft, and a processing component coupled to the positioning system and the group of cameras, the processing component to determine a current position of the aircraft based on the information related to the position of the aircraft and the images.
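A minimal fusion of the positioning-system fix with camera-derived corrections might look like the following. The weighted-average scheme and the per-camera offset representation are assumptions for illustration; the claim does not state how the two sources are combined.

```python
def fuse_position(reported_fix, camera_offsets, camera_weight=0.3):
    """Blend the positioning-system fix with camera-derived corrections.

    camera_offsets: per-camera (dx, dy) estimates of where the aircraft
    actually is relative to the reported fix, averaged and applied with
    a tunable weight."""
    n = len(camera_offsets)
    dx = sum(o[0] for o in camera_offsets) / n
    dy = sum(o[1] for o in camera_offsets) / n
    x, y = reported_fix
    return (x + camera_weight * dx, y + camera_weight * dy)
```

Averaging across the camera group is what makes the simultaneous capture in the claim useful: each camera sees a different portion of the surroundings, so their independent offset estimates partially cancel noise.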
Airborne drones with non-physical distractors
Embodiments of the invention include a drone system or method for distracting a threat with a light source. In some embodiments, an environment can be scanned with a sensor that produces sensor data. A threat within the environment can be identified from the sensor data, and the location of the threat can be determined. The light source can be aimed toward the threat, and light may be directed from the light source toward the threat.
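The scan-identify-aim sequence can be sketched as below. The detection-score threshold and the sensor-data format are invented for the example; only the overall flow follows the abstract.

```python
import math

def aim_light(drone_xy, threat_xy):
    """Bearing (radians) from the drone to the threat, i.e. where to point
    the light source."""
    dx = threat_xy[0] - drone_xy[0]
    dy = threat_xy[1] - drone_xy[1]
    return math.atan2(dy, dx)

def handle_scan(sensor_data, drone_xy, threshold=0.8):
    """Identify the most likely threat in the sensor data and aim the light
    at it; return None when nothing crosses the detection threshold."""
    threats = [d for d in sensor_data if d["score"] >= threshold]
    if not threats:
        return None
    target = max(threats, key=lambda d: d["score"])
    return aim_light(drone_xy, target["position"])
```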
Optical fiber sensing system, optical fiber sensing equipment, and unmanned aerial vehicle allocation method
An optical fiber sensing system according to the present disclosure includes an optical fiber (10) that detects vibration; a detection unit (21) that detects the occurrence of a predetermined event based on an optical signal on which the vibration detected by the optical fiber (10) is superimposed; an identification unit (22) that identifies the occurrence location of the predetermined event based on the optical signal and, based on that location, identifies a movement destination area serving as the moving destination of an unmanned aerial vehicle that monitors the event; and a control unit (23) that controls the unmanned aerial vehicle to move to the movement destination area.
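The detect-locate-dispatch chain might be sketched as follows. The per-sample vibration representation, the detection threshold, the fiber spacing, and the fixed-size zones are all assumptions made for the example.

```python
def locate_event(samples, threshold=0.5, meters_per_sample=10.0):
    """Hypothetical localization: find the strongest vibration along the
    fiber and convert its index to a distance, if it exceeds the
    detection threshold; otherwise report no event."""
    peak = max(range(len(samples)), key=lambda i: samples[i])
    if samples[peak] < threshold:
        return None  # no predetermined event detected
    return peak * meters_per_sample

def dispatch_uav(event_location, zone_size=100.0):
    """Map the event location along the fiber to a movement-destination
    area (zone index) for the monitoring UAV."""
    return int(event_location // zone_size)
```

Mapping locations to coarse zones rather than exact points reflects the abstract's notion of a movement destination *area*: the UAV flies to a region and then monitors the occurrence location within it.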
Autonomous vehicle for handling goods in cooperation with unmanned aerial vehicle and method thereof
Provided is a method for an autonomous vehicle to handle goods in collaboration with an unmanned aerial vehicle. The method comprises recognizing, by the autonomous vehicle, an unmanned aerial vehicle carrying goods; capturing an image of the recognized unmanned aerial vehicle; analyzing the captured image to recognize a marker; adjusting the relative position of the autonomous vehicle and the unmanned aerial vehicle by moving the autonomous vehicle based on the marker recognition result; and taking over the goods from the unmanned aerial vehicle after the position adjustment is completed.
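The marker-based position adjustment can be sketched as a simple visual-servoing loop step. The pixel tolerance, the move commands, and the image-coordinate convention are illustrative assumptions, not details from the patent.

```python
def marker_offset(marker_center, image_center):
    """Pixel offset of the detected marker from the image center."""
    return (marker_center[0] - image_center[0],
            marker_center[1] - image_center[1])

def adjust_position(marker_center, image_center, tolerance=5):
    """Translate the marker offset into a move command for the autonomous
    vehicle; hand over the goods once the marker is centered, i.e. once
    the vehicle sits directly under the hovering UAV."""
    dx, dy = marker_offset(marker_center, image_center)
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return "take_over_goods"
    if abs(dx) > abs(dy):
        return "move_right" if dx > 0 else "move_left"
    return "move_forward" if dy > 0 else "move_back"
```

Calling this once per captured frame yields the closed loop the abstract describes: capture, recognize the marker, move, and repeat until the handover condition holds.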