Patent classifications
G05D1/0038
Virtual reality remote valet parking
Techniques and examples pertaining to virtual reality remote valet parking are described. A processor of a control system of a vehicle may establish a wireless communication with a remote controller. The processor may provide a stream of video images captured by a camera of the vehicle to the remote controller. The processor may receive a signal from the remote controller. The processor may maneuver the vehicle from one location to another according to the signal.
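The control flow in this abstract (stream video out, receive a signal, maneuver accordingly) can be sketched as a toy loop step. Everything below is illustrative: the class and function names, the toy 1-D kinematics, and the scale factors are assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class ManeuverSignal:
    """Hypothetical maneuver signal received from the remote controller."""
    steering_deg: float   # commanded steering angle, degrees
    throttle: float       # 0.0 to 1.0
    brake: float          # 0.0 to 1.0

def apply_signal(position_m: float, heading_deg: float,
                 signal: ManeuverSignal, dt: float = 0.1):
    """Advance a toy vehicle state one step according to the remote signal."""
    # Brake dominates throttle; speed never goes negative.
    speed = max(0.0, signal.throttle - signal.brake) * 5.0  # m/s, toy scale
    new_position = position_m + speed * dt
    new_heading = heading_deg + signal.steering_deg * dt
    return new_position, new_heading

# One step of remote-commanded motion:
pos, hdg = apply_signal(0.0, 0.0, ManeuverSignal(10.0, 0.5, 0.0))
# → (0.25, 1.0)
```

In a real system this step would sit inside a loop that also pushes each camera frame over the wireless link before reading the next signal.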
ENHANCED REMOTE CONTROL OF AUTONOMOUS VEHICLES
Devices, systems, and methods for remote control of autonomous vehicles are disclosed herein. A method may include receiving, by a device, first data indicative of an autonomous vehicle in a parking area, and determining, based on the first data, a location of the autonomous vehicle. The method may include determining, based on the location, first image data including a representation of an object. The method may include generating second image data based on the first data and the first image data, and presenting the second image data. The method may include receiving an input associated with controlling operation of the autonomous vehicle, and controlling, based on the input, the operation of the autonomous vehicle.
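The "second image data" generation step (combining the parking-area image with the vehicle's determined location) can be sketched as a simple overlay. The data layout (2-D list as a toy grayscale image) and the marker value are assumptions for illustration only.

```python
def generate_second_image(first_image, vehicle_location, marker=9):
    """Return a copy of `first_image` with the vehicle location marked.

    first_image      : 2-D list of pixel values (already contains the
                       representations of objects in the parking area)
    vehicle_location : (row, col) determined from the first data
    """
    row, col = vehicle_location
    second_image = [line[:] for line in first_image]  # don't mutate input
    second_image[row][col] = marker                   # overlay the vehicle
    return second_image
```

The composed image would then be presented to the operator, whose input is translated back into control of the vehicle.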
PROJECTILE DELIVERY SYSTEMS AND WEAPONIZED AERIAL VEHICLES AND METHODS INCLUDING SAME
A projectile delivery module to be mounted on an aerial vehicle includes a projectile delivery system including a kinetic projectile and a base system. The kinetic projectile includes a projectile body, an RF receiver, and an onboard steering system including: a steering mechanism operable to change an attitude, orientation, and/or direction of flight of the kinetic projectile; and a steering actuator. The base system includes: an RF transmitter to communicate with the RF receiver; a projectile holder; a target tracking system; and a projectile guidance system including a projectile tracking system and a projectile control system. The base system is configured to: release the kinetic projectile from the projectile holder such that the kinetic projectile is driven toward a target by gravity; track the target using the target tracking system; track the released kinetic projectile using the projectile tracking system; and automatically control the onboard steering system using the projectile control system to adjust a trajectory of the falling kinetic projectile to steer the kinetic projectile to the target.
Robot with Embedded Systems for Flight for Cell Sites and Towers
In various embodiments, the present disclosure relates to robot systems configured to operate on a cell tower to inspect, install, reconfigure, and repair cellular equipment. The present disclosure provides a robot for performing audit tasks of cell towers. The robot includes a body portion configured to hold various electronic components of the robot including monitoring equipment disposed thereon, one or more arms extending from the body portion adapted to manipulate components of a cell tower and to facilitate movement of the robot on the cell tower, embedded systems for flight, and wireless interfaces adapted to allow wireless control of the robot. The robot is configured to be controlled by a user in a remote location, by a user at the cell tower site, or autonomously via direct programming.
Teleoperations for collaborative vehicle guidance
Techniques to provide guidance to a vehicle operating in an environment are discussed herein. For example, such techniques may include sending a request for assistance, receiving a reference trajectory, and causing the vehicle to determine a trajectory based on the reference trajectory. Data such as sensor data and vehicle state data may be sent from the vehicle to a remote computing device. The computing device outputs a user interface using the data and determines the reference trajectory based on receiving an input in the user interface. The techniques can send an indication of the reference trajectory to the vehicle for use in planning a trajectory for the vehicle. A vehicle, such as an autonomous vehicle, can be controlled to traverse an environment based on the trajectory.
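The division of labor described here (operator supplies a coarse reference trajectory, vehicle plans its own trajectory from it) can be illustrated with a minimal planner that densifies the reference by linear interpolation. The function name, tuple waypoint format, and interpolation scheme are assumptions; a real planner would also account for vehicle dynamics and sensor data.

```python
def plan_from_reference(reference, points_per_segment=4):
    """Expand coarse (x, y) reference waypoints into a denser trajectory.

    The remote operator's reference is treated as guidance, and the
    vehicle fills in intermediate points it will actually track.
    """
    trajectory = []
    for (x0, y0), (x1, y1) in zip(reference, reference[1:]):
        for i in range(points_per_segment):
            t = i / points_per_segment
            trajectory.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    trajectory.append(reference[-1])  # keep the final reference waypoint
    return trajectory

# Three coarse waypoints from the teleoperator become a five-point path:
path = plan_from_reference([(0, 0), (4, 0), (4, 4)], points_per_segment=2)
```

This keeps the collaborative character of the technique: the reference constrains the plan without dictating every control output.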
Controller, control method, and program
The present technology relates to a controller, a control method, and a program that enable self-localization with lower power consumption. Provided are a selection unit that selects, from a horizontal camera mounted in a horizontal direction and a downward camera mounted in a downward direction, a camera used for self-localization depending on speed, and a self-localization unit that performs self-localization using an image obtained by imaging with the horizontal camera or the downward camera selected by the selection unit. The selection unit selects the horizontal camera in a case where the speed is equal to or higher than a predetermined speed, and selects the downward camera in a case where the speed is not equal to or higher than the predetermined speed. The present technology can be applied to a self-localization system.
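The selection rule is stated explicitly in the abstract and reduces to a single comparison. A minimal sketch, with the threshold value and the names chosen here as assumptions:

```python
SPEED_THRESHOLD_MPS = 1.0  # illustrative; the patent only says "a predetermined speed"

def select_camera(speed_mps: float) -> str:
    """Select the camera used for self-localization based on speed."""
    if speed_mps >= SPEED_THRESHOLD_MPS:
        return "horizontal"   # at speed, forward-facing features are informative
    return "downward"         # at low speed, ground texture tracks motion better

select_camera(2.0)  # → "horizontal"
select_camera(0.5)  # → "downward"
```

Running only the selected camera (and its processing pipeline) is what yields the lower power consumption the abstract claims.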
HYBRID SKY AND GROUND NAVIGATION FOR MACHINE EMPLOYING SATELLITE POSITIONING
Disclosed are techniques for navigating a mobile machine, such as an autonomous robot, in an environment that includes objects that may block, reflect, or distort satellite signals to be used for positioning. Satellite data may be captured from one or more satellites. An image may be captured using an imaging device that is at least partially oriented toward the one or more satellites. A set of sky scores may be calculated for a set of ground positions surrounding the mobile machine based on the satellite data and the image. Each of the set of sky scores may be indicative of an accuracy of a satellite-based position at one of the set of ground positions. The mobile machine's navigation may be modified using the set of sky scores.
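The sky-score computation can be sketched as scoring each candidate ground position by the fraction of satellites with an unobstructed sky path. The data structures, the visibility predicate, and the fraction-based score below are illustrative assumptions, not the patented method; in practice the predicate would be backed by segmentation of the captured sky image and the satellite ephemeris.

```python
def sky_score(position, satellites, sky_is_clear):
    """Fraction of satellites with a clear line of sight from `position`.

    `sky_is_clear(position, satellite)` is a caller-supplied predicate,
    e.g. derived from an upward-facing camera image and satellite data.
    """
    if not satellites:
        return 0.0
    clear = sum(1 for sat in satellites if sky_is_clear(position, sat))
    return clear / len(satellites)

def best_position(candidates, satellites, sky_is_clear):
    """Pick the candidate ground position with the highest sky score."""
    return max(candidates, key=lambda p: sky_score(p, satellites, sky_is_clear))
```

Navigation could then bias its route toward positions whose scores predict an accurate satellite fix.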
AUTOMATIC GUIDANCE ASSIST SYSTEM USING GROUND PATTERN SENSORS
An automatic guidance system is adapted to be mounted on a work vehicle such as a farm tractor for assisting an operator in steering the vehicle on a desired track relative to a furrow. The system includes sensors for transmitting and receiving ultrasonic ranging signals. The sensors are ultrasound transducers mountable on ends of a planter drawn by the vehicle for directing ranging signals downwardly toward the field adjacent to a furrow such that the ranging signals strike the field or furrow and are reflected back into the respective sensor. Guidance logic stored in a memory of a controller is executed by a processor to determine tractor headway and headland turning directions representative of the desired tractor headway and headland turning directions, and a human interface device generates guidance images viewable by an operator for steering the tractor relative to furrows in the field and in the headland.
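One plausible reading of the guidance logic, sketched as code: each planter-end transducer reports a range to the ground or furrow, and the left/right difference is mapped to a steering hint for the operator display. This interpretation, the deadband value, and all names are assumptions, not the patented logic.

```python
def steering_hint(left_range_m: float, right_range_m: float,
                  deadband_m: float = 0.02) -> str:
    """Map the left/right ultrasonic ranging difference to a guidance cue.

    A small deadband suppresses jitter when the tractor is on track.
    """
    delta = left_range_m - right_range_m
    if abs(delta) <= deadband_m:
        return "on track"
    return "steer left" if delta > 0 else "steer right"
```

The human interface device would render such cues as the guidance images the abstract describes, rather than steering the tractor directly.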
REMOTE AUTONOMOUS DRIVING CONTROL MANAGEMENT APPARATUS, SYSTEM INCLUDING THE SAME, AND METHOD THEREOF
A remote autonomous driving control management apparatus, a system including the same, and a method thereof are provided. The remote autonomous driving control management apparatus is configured to authenticate a remote control terminal, obtain a surrounding image from an autonomous vehicle, receive a parking position from the remote control terminal, and perform autonomous parking or autonomous stop by remote control based on the parking position and the surrounding image. The remote autonomous driving control management apparatus shares an accurate position through communication to facilitate precise remote control of the autonomous vehicle.
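The gatekeeping in this flow (authenticate first, then combine the terminal's parking position with the vehicle's surrounding image before commanding parking) can be sketched as a single guard function. Function name, command shape, and error types are illustrative assumptions.

```python
def remote_parking_command(terminal_authenticated: bool,
                           parking_position, surrounding_image):
    """Build a parking command only for an authenticated terminal."""
    if not terminal_authenticated:
        # Unauthenticated terminals must not control the vehicle.
        raise PermissionError("remote control terminal not authenticated")
    if parking_position is None or surrounding_image is None:
        # Both inputs are required by the flow in the abstract.
        raise ValueError("need both parking position and surrounding image")
    return {"action": "autonomous_parking", "target": parking_position}
```

The real apparatus would additionally exchange the "accurate position" over the communication link while the maneuver executes.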
REMOTE ASSISTANCE SYSTEM AND REMOTE ASSISTANCE METHOD
A processor of a remote facility executes image generation processing to generate assistance image data to be displayed on a display based on front image data of a vehicle. In the image generation processing, when an image of a traffic light is included in the front image data, it is determined whether the recognition likelihood of a luminescent state of a light emitting section of the traffic light is equal to or smaller than a threshold. If it is determined that the recognition likelihood is equal to or smaller than the threshold, super-resolution processing of a preset region including the traffic light in the front image data is executed. Then, super-resolution image data generated by the super-resolution processing is superimposed on a region corresponding to the preset region in the front image data. As such, the assistance image data is generated.
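The conditional enhance-and-superimpose step can be sketched as below. The image layout (2-D list as a toy grayscale frame), the threshold value, and the pluggable `enhance` callable are assumptions; the actual super-resolution processing is not specified here and is passed in as a stand-in.

```python
def superimpose_enhanced(frame, region, likelihood, enhance, threshold=0.6):
    """Return assistance image data, enhancing `region` only when needed.

    frame      : 2-D list of pixel values (toy grayscale front image)
    region     : (row, col, height, width) preset traffic-light region
    likelihood : recognition likelihood of the light's luminescent state
    enhance    : callable standing in for the super-resolution processing
    """
    if likelihood > threshold:
        return frame                      # recognition is reliable as-is
    r, c, h, w = region
    out = [row[:] for row in frame]       # copy before superimposing
    patch = enhance([row[c:c + w] for row in frame[r:r + h]])
    for i in range(h):
        out[r + i][c:c + w] = patch[i]    # superimpose onto the preset region
    return out
```

Skipping the enhancement when the likelihood is already above the threshold keeps the remote facility's per-frame processing cost down.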