Patent classifications
G05D1/2232
UNMANNED ASSET CONTROL SYSTEM
A system for controlling operation of unmanned assets. The system includes a graphical user interface displaying a map; the graphical user interface accepts graphical inputs drawn on the map. The system also includes a calculation unit having an input interpretation module that recognizes the graphical inputs and translates them into tasks and/or commands, an operation planning module that generates operation instructions based on the tasks and/or commands, and a journey planning module that generates a journey plan for the unmanned assets based on the operation instructions. The system also includes a communications module that communicates with the unmanned assets to instruct them to operate according to the generated journey plan.
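As a rough illustration of how an input interpretation module might translate a drawn graphical input into a task (a hypothetical rule, not the patented method: a stroke whose endpoints nearly coincide is treated as an area-survey task, an open stroke as a patrol route):

```python
import math

def interpret_stroke(points, close_tol=5.0):
    """Translate a stroke drawn on the map into a task (hypothetical rule:
    a closed shape means 'survey area', an open line means 'patrol route')."""
    if len(points) < 2:
        raise ValueError("a stroke needs at least two points")
    x0, y0 = points[0]
    x1, y1 = points[-1]
    # The stroke is "closed" if its endpoints are within close_tol map units.
    closed = math.hypot(x1 - x0, y1 - y0) <= close_tol
    return {"task": "survey_area" if closed else "patrol_route",
            "waypoints": list(points)}
```

The resulting task dictionary would then be handed to the operation planning module; the task names and tolerance here are illustrative assumptions.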
PORTABLE TERMINAL AND VEHICLE CONTROL SYSTEM
The portable terminal is operated by an operator to operate a vehicle and performs wireless communication with the vehicle. The portable terminal includes a sensor, a touch panel, and a processor. The sensor detects an inclination direction and an inclination angle of the portable terminal. The touch panel includes a display screen and detects the operator's touch on the display screen. Based on orientation information, the processor selects one of first and second operation methods corresponding to first and second ways the operator holds the portable terminal. While the first operation method is selected, the processor generates travel control information for vehicle travel control based on a tilting operation that tilts the portable terminal; while the second operation method is selected, it generates the travel control information based on both the tilting operation and a touch operation at a predetermined position on the display screen.
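One plausible sketch of generating travel control information from a tilting operation is mapping pitch to speed and roll to steering, with a dead zone so small unintended tilts are ignored (the angle limits, dead zone, and axis assignments are illustrative assumptions, not taken from the abstract):

```python
def travel_control(pitch_deg, roll_deg, dead_zone=3.0, max_angle=30.0):
    """Map the terminal's tilt to normalized travel commands (assumed scheme:
    pitch drives speed, roll drives steering; small tilts are ignored)."""
    def scale(angle):
        if abs(angle) < dead_zone:
            return 0.0
        # Clamp to the usable tilt range, then normalize to [-1, 1].
        a = max(-max_angle, min(max_angle, angle))
        return a / max_angle
    return {"speed": scale(pitch_deg), "steering": scale(roll_deg)}
```

In the second operation method described above, such a command would only be emitted while the touch operation at the predetermined screen position is also detected.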
GRAPHICAL USER INTERFACES INCLUDING TOUCHPAD DRIVING INTERFACES FOR TELEMEDICINE DEVICES
The present disclosure describes various aspects of remote presence interfaces (RPIs) for use on portable electronic devices (PEDs) to interface with remote presence devices. An RPI may allow a user to interact with a telepresence device, view a live video feed, provide navigational instructions, and/or otherwise interact with the telepresence device. The RPI may allow a user to manually, semi-autonomously, or autonomously control the movement of the telepresence device. One or more panels associated with a video feed, patient data, calendars, date, time, telemetry data, PED data, telepresence device data, healthcare facility information, healthcare practitioner information, menu tabs, settings controls, and/or other features may be utilized via the RPI.
Autonomous driving system
A remote controller including an emergency stop button that receives an emergency stop operation, a temporary stop button that receives a temporary stop operation, and a control unit that restarts the running of a tractor when a running restart operation is performed after the tractor has been temporarily stopped. The running restart operation includes a pressing operation that presses the temporary stop button continuously for a first predetermined time and a release operation that releases the temporary stop button within a second predetermined time following the first predetermined time.
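The two-stage restart gesture can be sketched as a simple check on the press and release timestamps: the button must be held for at least the first predetermined time and released within the second predetermined time after that (the durations below are illustrative values, not from the patent):

```python
def is_restart_gesture(press_t, release_t, hold_min=2.0, release_window=1.0):
    """Check press/release timestamps (seconds) against the two-stage restart
    gesture: hold for at least hold_min, then release within release_window."""
    held = release_t - press_t
    # Too short a press (accidental tap) or too long a press (stuck/leaning
    # on the button) both fail the gesture.
    return hold_min <= held <= hold_min + release_window
```

Requiring both a sustained press and a timely release makes an accidental restart of the tractor unlikely, which appears to be the point of the two-step operation.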
AUTO-CLEANING SYSTEM, CLEANING ROBOT AND METHOD OF CONTROLLING THE CLEANING ROBOT
A cleaning robot that performs cleaning while travelling about a space to be cleaned, the cleaning robot including: a travelling unit that moves the cleaning robot; a cleaning unit that cleans the space; an image capturing unit that captures an image viewed from the cleaning robot; a voice input unit to which a user's voice instructions are input; and a controller that obtains the user's motion instructions through the image capturing unit and, based on the user's motion instructions or the voice instructions input through the voice input unit, determines a restricted area that the cleaning robot is prohibited from entering and/or a focused cleaning area to be intensively cleaned. The restricted area and the focused cleaning area may thus be input to the cleaning robot through the user's voice and motion.
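A minimal sketch of how such a controller might combine a recognized voice phrase with the region the user's motion points at (the keywords and the region representation are hypothetical; voice and gesture recognition themselves are out of scope here):

```python
def classify_area(voice_command, pointed_region):
    """Combine a recognized voice keyword with the region the user pointed at
    (hypothetical keywords; returns None if the command is not understood)."""
    text = voice_command.lower()
    if "do not enter" in text:
        return {"type": "restricted", "region": pointed_region}
    if "clean here" in text:
        return {"type": "focused", "region": pointed_region}
    return None
```

The returned area record would then be written into the robot's map so the travelling unit avoids restricted regions and revisits focused ones.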
Information processing device, mobile device, information processing system, and method
Provided is a data processing unit of a user terminal that sets a real object included in a camera-captured image as a marker, generates a marker reference coordinate system with a configuration point of the set marker as its origin, and transmits position data in the marker reference coordinates to a mobile device. The data processing unit transforms the destination position of the drone, or the position of a tracking target, from a coordinate position in the user terminal's camera coordinates to a coordinate position in the marker reference coordinates, and transmits the transformed coordinate position to the drone. The data processing unit also receives the movement path from the drone as coordinate position data in the marker reference coordinates, transforms that data into coordinate positions in the user terminal's camera coordinates, and displays the path information on the display unit.
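The camera-to-marker transformation can be sketched in 2D as a translation to the marker's configuration point followed by a rotation into the marker's frame (a planar simplification; the real system would use a full 3D pose estimated from the marker):

```python
import math

def camera_to_marker(point_cam, marker_origin_cam, marker_yaw_rad):
    """Transform a 2D point from camera coordinates into marker-reference
    coordinates, with the marker's configuration point as the origin."""
    # Translate so the marker's configuration point becomes the origin.
    dx = point_cam[0] - marker_origin_cam[0]
    dy = point_cam[1] - marker_origin_cam[1]
    # Rotate by the inverse of the marker's orientation in camera coordinates.
    c, s = math.cos(-marker_yaw_rad), math.sin(-marker_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

The reverse mapping (marker coordinates back to camera coordinates, as used when displaying the drone's returned path) would apply the rotation and translation in the opposite order.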
User interaction paradigms for a flying digital assistant
Methods and systems are described for new paradigms for user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant or FDA) using a portable multifunction device (PMD) such as a smartphone. In some embodiments, a method for synchronizing video and audio is described. The method includes capturing video of a physical environment, receiving first audio of the physical environment captured by a first microphone of a first distributed electronic device, and synchronizing the video of the physical environment with the first audio of the physical environment.
METHODS AND SYSTEMS FOR REMOTELY ASSISTING PARKING OR UNPARKING A VEHICLE
A method for a vehicle system to assist in remotely parking or unparking a vehicle, the method comprising: transmitting a challenge request to a remote device; determining, by the vehicle system, whether a challenge response to the challenge request has been received from the remote device, wherein the challenge response comprises information about a touch position on a touch screen of the remote device at which a touch input has been received; if the challenge response is received, determining, by the vehicle system, whether the challenge response is valid; if the challenge response is valid, controlling the vehicle to move; and if the challenge response is not valid, controlling the vehicle to brake. The challenge response is valid if the touch position indicates that the touch input is an expected rub movement performed on a valid area of the touch screen.
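A hedged sketch of the validity check on the vehicle side: every reported touch sample must lie inside the valid screen area, and the samples must trace enough motion to count as a rub rather than a stationary press (the rectangle format and travel threshold are assumptions for illustration):

```python
import math

def validate_rub(samples, area, min_travel=50.0):
    """Validate a challenge response: every touch sample (x, y) must lie
    inside the valid area (x0, y0, x1, y1), and the finger must actually
    move enough to qualify as a rub."""
    x0, y0, x1, y1 = area
    if not all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in samples):
        return False
    # Sum the path length over consecutive samples.
    travel = sum(math.hypot(b[0] - a[0], b[1] - a[1])
                 for a, b in zip(samples, samples[1:]))
    return travel >= min_travel
```

A continuous rub serves as a dead-man's-switch-style proof that an attentive operator is present, which is why an invalid or missing response causes the vehicle to brake.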
Robotic floor-cleaning system manager
Some aspects provide a method for instructing operation of a robotic floor-cleaning device based on the position of the robotic floor-cleaning device within a two-dimensional map of the workspace. A two-dimensional map of a workspace is generated using inputs from sensors positioned on a robotic floor-cleaning device to represent the multi-dimensional workspace of the robotic floor-cleaning device. The two-dimensional map is provided to a user on a user interface. A user may adjust the boundaries of the two-dimensional map through the user interface and select settings for map areas to control device operation in various areas of the workspace.
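One way the user-selected per-area settings might be applied at runtime is a lookup of the robot's current position against the map areas the user defined (the rectangular area format and setting names are hypothetical; the first matching area wins):

```python
def setting_at(zones, position, default="normal"):
    """Look up the operating setting for the robot's current position from
    user-defined rectangular map areas (hypothetical setting names)."""
    x, y = position
    for (x0, y0, x1, y1), setting in zones:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return setting
    return default
```

For example, a user could mark one rectangle as `"no_entry"` and another as `"deep_clean"`, and the device would consult this lookup as it localizes itself within the two-dimensional map.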