
Skydiving Robots which precisely land and deliver Payloads
20240158110 · 2024-05-16 ·

Device, system, and method for Skydiving Robots, which can skydive using customized or off-the-shelf parachutes and deliver civilian or military payloads. The Skydiving Robots can freefall, open the parachute, and steer toward the target, operating in daytime or pitch-black night using GPS guidance to land precisely. If they exit the plane at up to or over 30,000 feet above ground level (AGL), the final target can be miles away. With a wide array of sensors such as cameras, they are ideal reconnaissance scouts, able to carry payloads and land precisely within a few feet of a target.
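As a rough illustration of the range claim, horizontal reach under canopy scales with altitude times glide ratio. The 3:1 glide ratio below is an assumption for a steerable ram-air parachute, not a figure from the abstract:

```python
def glide_range_miles(altitude_ft, glide_ratio=3.0):
    """Horizontal distance reachable under canopy from a given altitude,
    ignoring wind: horizontal = glide_ratio * vertical (5280 ft per mile)."""
    return glide_ratio * altitude_ft / 5280.0

# From 30,000 ft AGL with an assumed 3:1 glide ratio, the reachable
# radius is on the order of 17 miles, consistent with "miles away".
reach = glide_range_miles(30_000)
```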

SYSTEM AND METHOD FOR QUEUEING ROBOT OPERATIONS IN A WAREHOUSE ENVIRONMENT BASED ON WORKFLOW OPTIMIZATION INSTRUCTIONS
20240157556 · 2024-05-16 ·

A system and method are described that provide for queueing robot operations in a warehouse environment based on workflow optimization instructions. In one example of the system/method of the present invention, a control system causes certain robots to queue proximate to one another to permit resources to be obtained, transported, deposited, etc. without the robots crashing into one another (or into other objects), or forming traffic jams. A robot may remain at an assigned queue position at least until another position assigned to the robot becomes available.
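The queue-holding rule (a robot keeps its assigned position until the next position becomes available) can be sketched as a slot queue; class and method names are illustrative, not from the patent:

```python
class QueueController:
    """Sketch of collision-free queueing: each robot keeps its assigned slot
    and advances only when the slot ahead of it becomes free. Slot 0 is the
    head of the queue, e.g. a pick or deposit point."""

    def __init__(self, n_slots):
        self.slots = [None] * n_slots

    def enqueue(self, robot_id):
        """A robot may only join at the rear slot; otherwise it waits outside."""
        if self.slots[-1] is None:
            self.slots[-1] = robot_id
            return len(self.slots) - 1
        return None

    def step(self):
        """Advance each robot one slot only if the slot ahead is free,
        so no two robots ever occupy (or swap through) the same slot."""
        for i in range(1, len(self.slots)):
            if self.slots[i] is not None and self.slots[i - 1] is None:
                self.slots[i - 1], self.slots[i] = self.slots[i], None

    def dequeue(self):
        """Release the robot at the head of the queue."""
        head, self.slots[0] = self.slots[0], None
        return head
```

Because `step` only moves a robot into a vacated slot, a blocked robot simply waits in place, which is the stated behavior.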

SYSTEM AND METHOD FOR MAPPING FEATURES OF A WAREHOUSE ENVIRONMENT HAVING IMPROVED WORKFLOW
20240160223 · 2024-05-16 ·

A system and method are described that provide for mapping features of a warehouse environment having improved workflow. In one example of the system/method of the present invention, a mapping robot is navigated through a warehouse environment, and sensors of the mapping robot collect geospatial data as part of a mapping mode. A Frontend block of a map framework may be responsible for reading and processing the geospatial data from the sensors of the mapping robot, as well as various other functions. The data may be stored in a keyframe object at a keyframe database. A Backend block of the map framework may be useful for detecting loop constraints, building submaps, optimizing a pose graph using keyframe data from one or more trajectory blocks, and/or various other functions.
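A minimal sketch of the keyframe object and keyframe database the Frontend would write and the Backend would read; field and method names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Keyframe:
    """One keyframe: the robot's pose estimate at capture time plus the
    geospatial sensor data recorded there (e.g. a lidar scan)."""
    kf_id: int
    pose: tuple   # (x, y, heading)
    scan: list    # raw sensor points

class KeyframeDatabase:
    """Keyframes indexed by id; the trajectory is the pose sequence a
    Backend could feed into pose-graph optimization."""

    def __init__(self):
        self._frames = {}

    def add(self, kf):
        self._frames[kf.kf_id] = kf

    def get(self, kf_id):
        return self._frames[kf_id]

    def trajectory(self):
        """Poses in keyframe-id order."""
        return [self._frames[k].pose for k in sorted(self._frames)]
```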

BINOCULAR VISION-BASED ENVIRONMENT SENSING METHOD AND APPARATUS, AND UNMANNED AERIAL VEHICLE
20240153122 · 2024-05-09 ·

A binocular vision-based environment sensing method and apparatus are applied to an unmanned aerial vehicle. The unmanned aerial vehicle is provided with five binocular cameras. The first binocular camera is disposed at the front portion of the fuselage of the unmanned aerial vehicle. The second binocular camera is inclined upward and disposed between the left side and the upper portion of the fuselage. The third binocular camera is inclined upward and disposed between the right side and the upper portion of the fuselage. The fourth binocular camera is disposed at the lower portion of the fuselage. The fifth binocular camera is disposed at the rear portion of the fuselage. The method can simplify an omni-directional sensing system while reducing the sensing blind area.

DEVICES, SYSTEMS AND METHODS FOR NAVIGATING A MOBILE PLATFORM
20240152159 · 2024-05-09 ·

Aspects of embodiments relate to systems and methods for navigating a mobile platform using an imaging device on the platform, from a point of origin towards a target located in a scene, without requiring a Global Navigation Satellite System (GNSS), by employing the following steps: acquiring, by the imaging device, an image of the scene comprising the target; determining, based on analysis of the image, a direction vector pointing from the mobile platform to the target; advancing the mobile platform in accordance with the direction vector to a new position; and generating, by a distance sensing device, an output as a result of attempting to determine a distance between the mobile platform and the target. The mobile platform is advanced towards the target until the output produced by the distance sensing device describes a distance that meets a low-distance criterion.
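The acquire / analyze / advance / range-check loop can be sketched with toy one-dimensional stand-ins; all class names and the simplistic "image analysis" are illustrative assumptions, not the patent's implementation:

```python
class ToyPlatform:
    def __init__(self, pos, target):
        self.pos, self.target = pos, target
    def advance(self, direction, step=1.0):
        self.pos += direction * step

class ToyCamera:
    """Toy camera: the 'image' is just the signed offset to the target."""
    def __init__(self, platform):
        self.platform = platform
    def acquire(self):
        return self.platform.target - self.platform.pos

class ToySensor:
    """Toy distance sensor; a real sensor might fail to return a reading."""
    def __init__(self, platform):
        self.platform = platform
    def read(self):
        return abs(self.platform.target - self.platform.pos)

def analyze_direction(image):
    """Hypothetical image analysis: reduce the image to a direction vector."""
    return 1.0 if image >= 0 else -1.0

def navigate(platform, camera, sensor, low_distance=0.5, max_steps=100):
    """Advance until the sensed distance meets the low-distance criterion."""
    for _ in range(max_steps):
        direction = analyze_direction(camera.acquire())
        platform.advance(direction)
        d = sensor.read()
        if d is not None and d <= low_distance:
            return True
    return False

platform = ToyPlatform(pos=0.0, target=7.0)
reached = navigate(platform, ToyCamera(platform), ToySensor(platform))
```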

Method for straight edge detection by robot and method for reference wall edge selection by cleaning robot
11977390 · 2024-05-07 ·

The present disclosure relates to a method for straight edge detection by a robot and a method for reference wall edge selection by a cleaning robot. The method for straight edge detection by the robot includes: determining position coordinates of detection points according to distance values detected by a distance sensor of the robot and angle values detected by an angle sensor of the robot, and then determining a final straight edge according to the slope of a straight line formed by two adjacent detection points.
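A minimal sketch of the described idea: convert (distance, angle) readings to coordinates, then look for a run of detection points whose adjacent-point slopes agree. The slope tolerance and minimum run length are assumed tuning parameters, not values from the patent:

```python
import math

def to_xy(readings):
    """Convert (distance, angle_rad) sensor readings to (x, y) coordinates."""
    return [(d * math.cos(a), d * math.sin(a)) for d, a in readings]

def detect_straight_edge(readings, slope_tol=0.05, min_points=4):
    """Return indices of the longest run of points whose adjacent slopes
    agree: a near-constant slope indicates a straight edge such as a wall."""
    pts = to_xy(readings)
    best, run = [], [0]
    prev_slope = None
    for i in range(1, len(pts)):
        dx = pts[i][0] - pts[i - 1][0]
        dy = pts[i][1] - pts[i - 1][1]
        slope = dy / dx if abs(dx) > 1e-9 else float("inf")
        if prev_slope is None or abs(slope - prev_slope) <= slope_tol:
            run.append(i)
        else:
            if len(run) > len(best):
                best = run
            run = [i - 1, i]   # start a new candidate edge
        prev_slope = slope
    if len(run) > len(best):
        best = run
    return best if len(best) >= min_points else []

# Five readings sampled along the wall y = x (constant slope 1):
readings = [(math.hypot(x, x), math.atan2(x, x)) for x in (1.0, 2.0, 3.0, 4.0, 5.0)]
edge = detect_straight_edge(readings)
```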

MOBILITY AND MOBILITY SYSTEM
20190192361 · 2019-06-27 ·

Provided are a mobility and a mobility system that enable a person who requires nursing care due to deteriorated bodily or cognitive function to lead a more independent life. The drive and travel of the mobility by the travel drive unit are controlled by adjusting the current travel state of the mobility to an optimal travel state that ensures safety while reflecting the user's intention, based on the current operation instruction detected by the operation unit, the command signal generated by the voluntary control unit, the travel environment sensed by the environment sensing unit, and the respective types of environmental information acquired by the information acquisition unit.

METHOD AND APPARATUS FOR CONTROLLING AIRCRAFT FOR FOLLOWING AND PHOTOGRAPHING, ELECTRONIC DEVICE, AND STORAGE MEDIUM
20240210961 · 2024-06-27 ·

Embodiments of the present disclosure disclose a method and apparatus for controlling an aircraft for following and photographing, an electronic device, and a storage medium. The method includes: receiving a following instruction from a user, and controlling an aircraft to enter a following mode according to the following instruction; obtaining a picture currently captured by the aircraft, and displaying it to the user so that the user can select a following target from the picture; controlling the aircraft to lock onto the following target for following and photographing, and displaying a following orientation control to the user; and controlling, according to the user's operation of the following orientation control, the aircraft to switch its following orientation.
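The described flow (enter following mode, select a target, then switch orientation via the control) can be sketched as a small state machine; state names and methods are illustrative assumptions:

```python
class FollowController:
    """Sketch of the follow-and-photograph control flow: target selection is
    only valid once following mode is entered, and orientation switching is
    only valid once a target is locked."""

    def __init__(self):
        self.state = "IDLE"
        self.target = None
        self.orientation = "BEHIND"   # assumed default orientation

    def receive_follow_instruction(self):
        """Enter following mode and show the live picture for selection."""
        self.state = "SELECTING"

    def select_target(self, target):
        """Lock onto the chosen target; the orientation control is then shown."""
        if self.state == "SELECTING":
            self.target = target
            self.state = "FOLLOWING"

    def switch_orientation(self, orientation):
        """Switch the following orientation per the user's control input."""
        if self.state == "FOLLOWING":
            self.orientation = orientation
```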

HANDS FREE ADVANCE CONTROL FOR MATERIAL HANDLING VEHICLE

A system has been developed to facilitate hands-free autonomous control of material handling equipment or vehicles, such as forklifts and pallet trucks. The system is designed to support both visual tracking by the automated vehicle and voice commands. In some use cases only visual tracking is used, in others only voice commands, and in still others both are used to control the material handling vehicle. For visual tracking, one or more cameras are configured to capture one or more images of a fiducial; in one case, the vehicle moves, turns, and/or stops based on the movement of the fiducial. For voice control, the operator provides voice commands via a voice controller, which converts them to vehicle control commands that are sent to a remote receiver unit (RRU) in the vehicle.
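The voice-controller conversion step can be sketched as a lookup from a recognized phrase to a vehicle control command destined for the RRU. The phrase table and command tuples are illustrative; real recognition and the actual command protocol are not described in the abstract:

```python
# Hypothetical phrase -> (command, argument) table; a real system would sit
# behind a speech recognizer rather than match exact words.
VOICE_COMMANDS = {
    "forward": ("MOVE", +1),
    "reverse": ("MOVE", -1),
    "stop":    ("STOP", 0),
    "left":    ("TURN", -1),
    "right":   ("TURN", +1),
}

def to_vehicle_command(utterance):
    """Convert a recognized voice phrase into the vehicle control command
    that would be transmitted to the remote receiver unit (RRU)."""
    word = utterance.strip().lower()
    return VOICE_COMMANDS.get(word, ("IGNORE", 0))  # unknown phrases ignored
```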

STRUCTURED LIGHT MODULE AND SELF-MOVING DEVICE
20240197130 · 2024-06-20 ·

The application provides a structured light module and an autonomous mobile device. The structured light module includes a first camera and line laser emitters for collecting a first environmental image containing the laser stripes generated when the line laser encounters an object. The structured light module can also capture, as a visible-light image, a second environmental image that does not contain laser stripes. Together, the first and second environmental images help to detect more accurate and richer environmental information, expanding the application range of laser sensors.
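One common way to exploit a stripe-bearing frame alongside a stripe-free frame is intensity differencing; this sketch assumes grayscale images as lists of pixel rows and an arbitrary threshold, neither of which comes from the application:

```python
def extract_stripes(laser_on, laser_off, threshold=30):
    """Isolate laser stripes by differencing a laser-on frame against a
    laser-off (visible-light) frame of the same scene: pixels much brighter
    with the laser active are marked 1, all others 0."""
    return [
        [1 if on - off > threshold else 0 for on, off in zip(row_on, row_off)]
        for row_on, row_off in zip(laser_on, laser_off)
    ]
```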