Motion and image-based control system
Systems, devices, media, and methods are presented for detecting and interpreting motion of a device and a remote object to control operations of the device. The systems and methods identify a sensor input within a drone; the sensor input indicates movement of the drone within a three-dimensional space. The systems and methods determine one or more movement attributes from the sensor input and, in response to the one or more movement attributes, select one or more maneuvers corresponding to at least one movement attribute. The systems and methods then execute the one or more maneuvers by controlling one or more drone control components to move the drone within the three-dimensional space.
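The abstract describes a pipeline from raw sensor input to movement attributes to selected maneuvers. A minimal sketch of that flow, in which all thresholds, attribute names, and the maneuver table are hypothetical illustrations (the patent specifies no API), could look like:

```python
def movement_attributes(sensor_input):
    """Derive coarse movement attributes from raw sensor deltas.

    Thresholds and attribute names are illustrative placeholders."""
    attrs = []
    if sensor_input.get("dz", 0.0) > 0.5:
        attrs.append("rapid_ascent")
    if abs(sensor_input.get("roll", 0.0)) > 30:
        attrs.append("sharp_roll")
    if not attrs:
        attrs.append("steady")
    return attrs

# Hypothetical mapping from movement attribute to corrective maneuver.
MANEUVER_TABLE = {
    "rapid_ascent": "stabilize_hover",
    "sharp_roll": "level_out",
    "steady": "hold_course",
}

def select_maneuvers(attrs):
    """Select the maneuver corresponding to each detected attribute."""
    return [MANEUVER_TABLE[a] for a in attrs if a in MANEUVER_TABLE]

def execute(maneuvers, controls):
    """Dispatch each selected maneuver to a drone control component,
    here modeled as a dict of callables."""
    return [controls[m]() for m in maneuvers]
```

In a real flight controller the control components would command motors rather than return strings; the dict of callables only stands in for that dispatch step.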
METHOD PERFORMED IN AN AUTONOMOUS UNMANNED AERIAL VEHICLE FOR ENABLING AUTONOMOUS EMERGENCY ASSISTANCE FOR A COMMUNICATION DEVICE REGISTERED IN A REGULAR CELLULAR NETWORK, VEHICLE AND DEVICE THEREFOR
A method for enabling autonomous emergency assistance for one or more communication devices, CDs, registered in a regular cellular network. The method is performed in an autonomous unmanned aerial vehicle, UAV, and comprises: emulating a cellular network in a geographical region in which the UAV and the one or more CDs are without connectivity to the regular cellular network; sending an information message in the geographical region, the message comprising an emergency response trigger; receiving, in response to the sent message, an automatic emergency data response from the one or more CDs in the geographical region; and determining an action based on the received automatic emergency data response. A CD, a UAV, a computer program, and a computer program product are also presented.
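The message exchange in this abstract (broadcast a trigger, collect automatic responses, decide an action) can be sketched in-memory. The class, field names, and the low-battery policy below are assumptions for illustration, not taken from the patent:

```python
class CommunicationDevice:
    """Stand-in for a CD that answers the trigger automatically,
    without user interaction."""

    def __init__(self, device_id, battery, location):
        self.device_id = device_id
        self.battery = battery
        self.location = location

    def auto_respond(self, message):
        if message.get("emergency_response_trigger"):
            return {"id": self.device_id,
                    "battery": self.battery,
                    "location": self.location}

def broadcast_and_collect(devices):
    """UAV side: send the information message with the emergency
    response trigger and gather each CD's automatic data response."""
    message = {"type": "info", "emergency_response_trigger": True}
    return [d.auto_respond(message) for d in devices]

def determine_action(responses):
    """Toy decision policy: escalate when any device reports low battery."""
    low = [r for r in responses if r["battery"] < 0.2]
    return "dispatch_rescue" if low else "continue_search"
```

In practice the broadcast would travel over the emulated cell rather than a Python call, and the action policy would weigh location, vital data, and more.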
SMART DRONE ROOFTOP AND GROUND AIRPORT SYSTEM
An unmanned vehicle control system is disclosed, comprising a ground control station in operable communication with a plurality of unmanned vehicles via a communications network. The ground control station receives unmanned vehicle mission information and provides a plurality of instructions to the unmanned vehicle to execute a mission including a take-off procedure and a landing procedure. A plurality of microservices process requests from a controller and at least one charging station provides a docking point for the plurality of unmanned vehicles. The charging station provides a power source to the plurality of unmanned vehicles and receives mission information from the ground control station, wherein the unmanned vehicles are operable to deliver a good to a remote location.
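One step this abstract implies is the ground control station choosing a docked, sufficiently charged vehicle and issuing its mission instructions. A minimal sketch, where the data layout, the 80% charge threshold, and the instruction strings are all assumptions for illustration:

```python
def dispatch_mission(mission, charging_stations, vehicles):
    """Pick the first docked vehicle with enough charge and hand it
    a mission covering take-off, delivery, and landing."""
    for station in charging_stations:
        for vid in station["docked"]:
            if vehicles[vid]["charge"] >= 0.8:
                return {
                    "vehicle": vid,
                    "instructions": [
                        "take_off",
                        f"goto:{mission['destination']}",
                        f"deliver:{mission['good']}",
                        "land",
                    ],
                }
    return None  # no vehicle ready; mission stays queued
```

The real system would route this through the communications network and the microservices layer; the function only captures the selection-and-instruction shape.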
Vehicle marking in vehicle management systems
The present disclosure generally discloses a vehicle marking capability configured to support marking of vehicles in a vehicle management system. The vehicle marking capability is configured to support marking of vehicles in a graphical user interface (GUI) that is supported by a vehicle management application that is supported by the vehicle management system. The vehicle marking capability may be configured to support marking of various vehicle measures of vehicles for various vehicle types which may be managed by the vehicle management system. The vehicle marking capability may be configured to support marking of vehicle measures for vehicles using measure objects having lengths that are based on the vehicle measures (e.g., using linear scaling, logarithmic scaling, or the like). The vehicle marking capability may be configured to support marking of vehicle altitudes for aerial vehicles using altitude columns having column heights that are based on the vehicle altitudes.
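The abstract's measure objects have lengths derived from vehicle measures via linear or logarithmic scaling. A small sketch of that mapping, with parameter names and the pixel budget chosen for illustration:

```python
import math

def measure_length(value, max_value, max_pixels=200, scaling="linear"):
    """Length of a measure object (e.g. an altitude column) for a
    vehicle measure, scaled linearly or logarithmically against the
    largest measure currently displayed."""
    if scaling == "linear":
        fraction = value / max_value
    elif scaling == "logarithmic":
        # log1p keeps a zero measure at zero length.
        fraction = math.log1p(value) / math.log1p(max_value)
    else:
        raise ValueError(f"unknown scaling: {scaling}")
    return round(fraction * max_pixels)
```

Logarithmic scaling keeps a GUI readable when measures span orders of magnitude (say, altitudes from tens to tens of thousands of feet), at the cost of compressing differences between large values.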
AUTONOMOUS AIRCRAFT AND METHOD FOR PROVIDING VENTILATION TO PEOPLE
An autonomous aerial vehicle for ventilating persons. Ventilation becomes necessary as a result of fires, accidents, or medical emergencies. In these and comparable cases, the aerial vehicle helps ventilate a person quickly, independently of a location's transport links or the traffic situation at the time, so that the person's condition is stabilized until the arrival of an emergency doctor or other rescue workers and the chances of survival improve. The aerial vehicle provides positional determination inside and/or outside buildings, recording of the surrounding area, ventilation of at least one person, and a communication method or mechanism.
Automated individual security
A controller monitors for an activation condition through a monitoring interface of a wearable aerial device. In response to detecting the activation condition through the monitoring interface, the controller triggers the wearable aerial device to release from an aesthetic attachment proximate to a user and hover above the user at a height exceeding a selected height threshold. The controller analyzes a recording of content by the wearable aerial device to assess a particular threat level associated with the content from among multiple threat levels. The controller, in response to the particular threat level exceeding a threat threshold, automatically sends a communication to one or more emergency contacts.
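The decision chain here, assessing a threat level from recorded content and notifying contacts only above a threshold, can be sketched as follows. The level ordering, tag names, and tag-to-level table are hypothetical; the abstract only says there are multiple levels and a threshold:

```python
# Ordered threat levels, lowest to highest (illustrative names).
THREAT_LEVELS = {"none": 0, "low": 1, "elevated": 2, "severe": 3}

def assess_threat(content_tags):
    """Map tags detected in the recording to the highest matching level."""
    tag_levels = {"crowd": "low", "pursuit": "elevated", "weapon": "severe"}
    level = "none"
    for tag in content_tags:
        candidate = tag_levels.get(tag, "none")
        if THREAT_LEVELS[candidate] > THREAT_LEVELS[level]:
            level = candidate
    return level

def should_notify(level, threshold="low"):
    """Notify emergency contacts only when the assessed level
    strictly exceeds the configured threat threshold."""
    return THREAT_LEVELS[level] > THREAT_LEVELS[threshold]
```

The strict inequality mirrors the abstract's "exceeding a threat threshold"; a level exactly at the threshold does not trigger a notification.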
Unmanned aerial vehicle (UAV)-assisted worksite operations
In one example, a position of landscape modifiers within a worksite is determined and a position output indicative of that position is generated. Based on the position output, different types of worksite areas within the worksite are identified, and an area identifier output indicative of the types of worksite areas and their locations within the worksite is generated. The worksite areas are prioritized based on type. A route is generated for an unmanned aerial vehicle (UAV) based on the prioritized worksite areas, and control signals are provided to the UAV based on the route. In another example, a user input mechanism on a user interface is configured to receive a user input indicative of field data for a worksite and at least one vehicle control variable for controlling a UAV to carry out a worksite mission within the worksite. Dependent variables related to the field data and the at least one vehicle control variable are calculated based on the received user input. A display of the calculated dependent variables, along with the field data and the at least one vehicle control variable, is generated on a user interface device. Control signals are provided to the UAV based on the field data, the at least one vehicle control variable, and the calculated dependent variables.
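The first example's route step, ordering prioritized worksite areas into a visit sequence, can be sketched briefly. The area types, their priority order, and the squared-distance tiebreak from the origin are all assumptions for illustration:

```python
# Lower number = visit first (hypothetical type priorities).
AREA_PRIORITY = {"hazard": 0, "obstruction": 1, "growth": 2}

def generate_route(areas):
    """Order worksite areas by type priority, breaking ties by squared
    distance from the UAV's start point at (0, 0), and return the
    visit sequence of area ids."""
    def key(area):
        x, y = area["location"]
        return (AREA_PRIORITY.get(area["type"], 99), x * x + y * y)
    return [a["id"] for a in sorted(areas, key=key)]
```

A production planner would solve something closer to a traveling-salesman ordering within each priority tier; the sort only shows where prioritization enters route generation.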
Connected automation controls using robotic devices
Methods, systems, and apparatus, including computer programs encoded on a storage device, for using a robotic device to manipulate a manual control of a device. In one aspect, the system includes a robotic device, a first device that is located at a property and that has a manual control, and a monitoring unit. The monitoring unit may include a network interface, a processor, and a storage device that includes instructions to cause the processor to perform operations. The operations may include determining an operating state of the first device, determining the state of the monitoring system, determining whether one or more of the manual controls associated with the first device should be manipulated to alter the operating state of the first device, and transmitting one or more instructions to the robotic device that instruct the robotic device to manipulate one or more manual controls that are associated with the first device.
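The monitoring unit's core decision, combining the device's operating state with the monitoring system's state to produce manipulation instructions, can be sketched as a small rule table. The state names, control names, and rules below are invented for illustration:

```python
def manipulation_instructions(device_state, monitoring_state):
    """Decide which manual controls the robotic device should
    manipulate, given the first device's operating state and the
    monitoring system's state. Returns a list of instruction dicts."""
    instructions = []
    if monitoring_state == "armed_away" and device_state.get("stove") == "on":
        # Nobody home but the stove is on: have the robot turn the knob off.
        instructions.append({"control": "stove_knob", "action": "turn_off"})
    if monitoring_state == "alarm" and device_state.get("gas_valve") == "open":
        instructions.append({"control": "gas_valve", "action": "close"})
    return instructions
```

The returned instructions would then be transmitted to the robotic device over the network interface; an empty list means no manipulation is warranted.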