Patent classifications
G05D1/223
Navigating an autonomous vehicle based upon an image from a mobile computing device
An autonomous vehicle receives geographic location data defined via a mobile computing device operated by a user. The geographic location data is indicative of a device position of the mobile computing device. The autonomous vehicle also receives image data generated by the mobile computing device. The image data is indicative of a surrounding position near the device position; the surrounding position is selected from an image captured by a camera of the mobile computing device. A requested vehicle position (e.g., a pick-up or drop-off location) is set for a trip of the user in the autonomous vehicle based on the geographic location data and the image data. A route is generated from a current vehicle position of the autonomous vehicle to the requested vehicle position for the trip. Moreover, the autonomous vehicle can follow the route to the requested vehicle position.
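The flow in this abstract can be illustrated with a minimal sketch: a requested position is derived from the device's GPS fix plus an offset obtained from the tapped image point, then a route is produced. All function names, the degree-offset model for the image selection, and the straight-line waypoint routing are assumptions for illustration, not the patented method.

```python
def set_requested_position(device_pos, image_offset):
    """Combine the device's GPS fix (lat, lon) with an offset derived
    from the position the user selected in the camera image.
    The offset-in-degrees model is a simplifying assumption."""
    lat, lon = device_pos
    dlat, dlon = image_offset
    return (lat + dlat, lon + dlon)

def generate_route(current_pos, requested_pos, n_waypoints=5):
    """Naive straight-line route as evenly spaced waypoints; a real
    planner would route along the road network."""
    (lat0, lon0), (lat1, lon1) = current_pos, requested_pos
    return [(lat0 + (lat1 - lat0) * i / n_waypoints,
             lon0 + (lon1 - lon0) * i / n_waypoints)
            for i in range(n_waypoints + 1)]

# Set the pick-up location and route the vehicle to it.
pickup = set_requested_position((37.7749, -122.4194), (0.0005, -0.0003))
route = generate_route((37.7700, -122.4150), pickup)
```

The vehicle would then follow `route` waypoint by waypoint to reach the requested position.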
Systems and methods of detecting intent of spatial control
Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor connected to a robot may not have high precision (e.g., when the sensor is an inexpensive commercial unit) or may be subject to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed techniques can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed techniques can be used for teleoperation of a vehicle located in the field of view of a camera.
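One plausible reading of a "sliding velocity limit boundary" is a speed cap that tightens as the controlled point approaches a workspace boundary. The sketch below is purely illustrative of that idea; the taper shape, the `slow_zone` parameter, and the function names are assumptions, not the patent's actual formulation.

```python
def sliding_velocity_limit(distance_to_boundary, v_max=1.0, slow_zone=0.5):
    """Velocity limit that tapers linearly to zero as the controlled
    point nears the boundary (illustrative taper, in m/s and m)."""
    if distance_to_boundary >= slow_zone:
        return v_max
    return max(0.0, v_max * distance_to_boundary / slow_zone)

def clamp_command(v_cmd, distance_to_boundary):
    """Clamp the operator's commanded velocity to the sliding limit,
    compensating for imprecise sensing near the boundary."""
    limit = sliding_velocity_limit(distance_to_boundary)
    return max(-limit, min(limit, v_cmd))
```

Far from the boundary the operator's command passes through unchanged; inside the slow zone it is clamped symmetrically in both directions.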
Parachute landing methods and systems for an unmanned aerial vehicle
The present application provides a system for unmanned aerial vehicle (UAV) parachute landing. An exemplary system includes a detector configured to detect at least one of a flight speed, a wind speed, a wind direction, a position, a height, and a voltage of a UAV. The system also includes a memory storing instructions and a processor configured to execute the instructions to cause the system to: determine whether to open a parachute of the UAV in accordance with a criterion; responsive to the determination to open the parachute, stop a motor of the UAV that spins a propeller; and open the parachute after the motor has been stopped for a first period.
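The claimed sequence (criterion check, then motor stop, then a delay, then deployment) can be sketched as below. The threshold values, the specific criterion, and the `FakeUAV` interface are all hypothetical stand-ins; the abstract does not disclose concrete numbers.

```python
class FakeUAV:
    """Minimal stand-in that records the order of actuation calls."""
    def __init__(self):
        self.events = []
    def stop_motor(self):
        self.events.append("stop_motor")
    def wait(self, seconds):
        self.events.append(("wait", seconds))
    def open_parachute(self):
        self.events.append("open_parachute")

def should_open_parachute(height_m, descent_speed_mps, voltage_v,
                          min_height=20.0, max_descent=8.0, min_voltage=10.5):
    """Example criterion: deploy when falling too fast above a safe
    height, or when battery voltage collapses (thresholds illustrative)."""
    return height_m > min_height and (descent_speed_mps > max_descent
                                      or voltage_v < min_voltage)

def parachute_landing(uav, first_period_s=0.5):
    """Stop the motor first so the propeller spins down, wait the
    first period, and only then open the parachute."""
    uav.stop_motor()
    uav.wait(first_period_s)
    uav.open_parachute()

uav = FakeUAV()
if should_open_parachute(height_m=60.0, descent_speed_mps=12.0, voltage_v=11.1):
    parachute_landing(uav)
```

Stopping the motor before deployment prevents the spinning propeller from fouling or cutting the parachute lines.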
Vehicle remote control method and vehicle remote control device
The relative position between a vehicle and an extension unit located outside the vehicle is detected, and a command touch operation for operating the vehicle with a remote operation device is set in accordance with the detected relative position. The touch operation of an operator is detected by a touch panel of the remote operation device, and it is determined whether the detected touch operation is the command touch operation. When the touch operation is the command touch operation, the vehicle, which has an autonomous travel control function, is controlled to execute autonomous travel control.
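The gating logic described here can be sketched as follows: the required gesture depends on where the remote device sits relative to the vehicle, and autonomous travel runs only when the operator's actual gesture matches it. The bearing-to-gesture mapping and all names are invented for illustration.

```python
def command_for_relative_position(bearing_deg):
    """Map the detected device-to-vehicle bearing to the touch gesture
    the operator must perform (hypothetical mapping)."""
    if -45 <= bearing_deg <= 45:
        return "swipe_up"
    if 45 < bearing_deg <= 135:
        return "swipe_right"
    if -135 <= bearing_deg < -45:
        return "swipe_left"
    return "swipe_down"

def handle_touch(bearing_deg, detected_gesture):
    """Execute autonomous travel control only when the detected touch
    operation matches the command touch operation."""
    if detected_gesture == command_for_relative_position(bearing_deg):
        return "execute_autonomous_travel"
    return "ignore"
```

Tying the required gesture to the relative position makes accidental or spoofed inputs less likely to trigger vehicle motion.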
Moving robot, moving robot control method and program therefor
A mobile robot includes, for example, a position distance calculation command transmission unit 1, a position distance calculation command transfer unit 2, a reply position distance calculation command transmission unit 3, a direction storage unit 4, a reply position distance calculation command transfer unit 5, a first head robot unit determination command transmission unit 6, a robot unit determination unit 7, a first movement unit 8, a second movement unit 9, a next head robot unit selection command transmission unit 10, and a second head robot unit determination command transmission unit 11.
Mobile body, information processor, mobile body system, information processing method, and information processing program
An information processing method of an information processor includes: obtaining information received from a mobile body through a wireless communication, the mobile body including a movement mechanism and an imaging unit configured to capture image data, the information received from the mobile body including captured image data obtained by the imaging unit, with the captured image data being updated periodically; and generating route guidance information for use in moving the mobile body by the movement mechanism. The captured image data is stored together with data update time information. The route guidance information includes at least two selectable routes. The route guidance information is generated based on the captured image data, position information of the mobile body, and the data update time information.
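The route-guidance generation described above can be illustrated with a small sketch: routes remain selectable only where the supporting imagery is fresh according to its stored update time. The record field names, the freshness threshold, and the area-based route model are assumptions for illustration.

```python
def generate_route_guidance(routes, image_records, position, now, max_age_s=5.0):
    """Build route guidance from captured image data, the mobile body's
    position, and each image's data update time. A route is selectable
    only if every area it crosses has fresh enough imagery."""
    fresh_areas = {rec["area"] for rec in image_records
                   if now - rec["updated_at"] <= max_age_s}
    selectable = [r for r in routes if set(r["areas"]) <= fresh_areas]
    return {"position": position, "routes": selectable}

# Imagery for areas A and B is recent; area C is stale.
images = [{"area": "A", "updated_at": 98.0},
          {"area": "B", "updated_at": 99.0},
          {"area": "C", "updated_at": 50.0}]
routes = [{"name": "via_A", "areas": ["A"]},
          {"name": "via_B", "areas": ["B"]},
          {"name": "via_C", "areas": ["C"]}]
guidance = generate_route_guidance(routes, images, position=(3, 4), now=100.0)
```

Here the guidance retains two selectable routes, matching the abstract's requirement of at least two, while the route through stale imagery is dropped.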
Hybrid Modular Storage Fetching System
A hybrid modular storage fetching system is described. In an example implementation, the system may include a warehouse execution system adapted to generate a picking schedule for picking pick-to-cart and high-density storage items, and an AGV dispatching system adapted to dispatch a cart automated guided vehicle and a modular storage fetching automated guided vehicle based on the picking schedule. The cart automated guided vehicle may be adapted to autonomously transport a carton through a pick-to-cart area to a pick-cell station. The modular storage fetching automated guided vehicle may be adapted to synchronously and autonomously transport a modular storage unit, containing items to be placed in the cartons, from a high-density storage area to the pick-cell station.
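The dispatching split described above can be sketched as routing each schedule line to the appropriate AGV type by its storage class. The field names (`sku`, `storage`) and AGV labels are illustrative assumptions, not the system's actual data model.

```python
def dispatch(picking_schedule):
    """Split a picking schedule into jobs for the cart AGV
    (pick-to-cart items) and the modular storage fetching AGV
    (high-density storage items)."""
    cart_jobs, fetch_jobs = [], []
    for line in picking_schedule:
        if line["storage"] == "pick_to_cart":
            cart_jobs.append({"agv": "cart", "sku": line["sku"]})
        else:  # high-density storage item: fetch the storage unit instead
            fetch_jobs.append({"agv": "modular_fetch", "sku": line["sku"]})
    return cart_jobs, fetch_jobs

schedule = [{"sku": "SKU-1", "storage": "pick_to_cart"},
            {"sku": "SKU-2", "storage": "high_density"}]
cart_jobs, fetch_jobs = dispatch(schedule)
```

Both job lists derive from the same schedule, which is what lets the two AGV types arrive at the pick-cell station in synchronization.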
REMOTELESS CONTROL OF DRONE BEHAVIOR
A drone system is configured to capture an audio stream that includes voice commands from an operator, to process the audio stream for identification of the voice commands, and to perform operations based on the identified voice commands. The drone system can identify a particular voice stream in the audio stream as an operator voice, and perform the command recognition with respect to the operator voice to the exclusion of other voice streams present in the audio stream. The drone can include a directional camera that is automatically and continuously focused on the operator to capture a video stream usable in disambiguation of different voice streams captured by the drone.
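The operator-voice disambiguation can be sketched as selecting, among the voice streams present in the audio, the one closest to an enrolled operator voiceprint, and recognizing commands only from that stream. The toy one-dimensional "embedding" stands in for real speaker identification; all names are assumptions.

```python
def identify_operator_stream(voice_streams, operator_embedding):
    """Pick the stream whose voiceprint embedding is nearest the
    enrolled operator's (toy 1-D distance as a stand-in)."""
    return min(voice_streams,
               key=lambda s: abs(s["embedding"] - operator_embedding))

def recognize_commands(voice_streams, operator_embedding):
    """Perform command recognition on the operator's stream only,
    to the exclusion of other voices in the audio."""
    operator = identify_operator_stream(voice_streams, operator_embedding)
    return operator["words"]

streams = [{"embedding": 0.9, "words": ["land"]},
           {"embedding": 0.2, "words": ["take", "off"]}]
commands = recognize_commands(streams, operator_embedding=0.25)
```

In the abstract's fuller system, the video stream from the directional camera tracking the operator would supply an extra cue for this disambiguation.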
Navigating semi-autonomous mobile robots
Techniques for navigating semi-autonomous mobile robots are described. A semi-autonomous mobile robot moves within an environment to complete a task. A navigation server communicates with the robot and provides the robot information. The robot includes a navigation map of the environment, interaction information, and a security level. To complete the task, the robot transmits a route reservation request to the navigation server, the route reservation request including a priority for the task, a timeslot, and a route. The navigation server grants the route reservation if the task priority is higher than the task priorities of conflicting route reservation requests from other robots. As the robot moves within the environment, the robot detects an object and attempts to classify the detected object as belonging to an object category. The robot retrieves an interaction profile for the object, and interacts with the object according to the retrieved interaction profile.
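The priority-based grant decision can be sketched directly: a reservation conflicts with another when their timeslots overlap and their routes share a cell, and it is granted only if its task priority beats every conflicting reservation. The request fields and the cell-set route model are illustrative assumptions.

```python
def conflicts(a, b):
    """Two reservations conflict when their timeslots overlap and
    their routes share at least one cell."""
    overlap = not (a["end"] <= b["start"] or b["end"] <= a["start"])
    return overlap and bool(set(a["route"]) & set(b["route"]))

def grant_reservation(request, existing):
    """Grant the route reservation if the task priority is higher than
    that of every conflicting existing reservation."""
    return all(request["priority"] > r["priority"]
               for r in existing if conflicts(request, r))

existing = [{"priority": 2, "start": 0, "end": 10, "route": ["c1", "c2"]}]
req_hi = {"priority": 5, "start": 5, "end": 15, "route": ["c2", "c3"]}
req_lo = {"priority": 1, "start": 5, "end": 15, "route": ["c2", "c3"]}
```

Here `req_hi` is granted over the conflicting lower-priority reservation, while `req_lo` is refused; non-conflicting requests are always granted since no comparison applies.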