Patent classifications
G05D1/2446
Submersible remote operated vehicle vision assistance and control
This disclosure describes monitoring and operating subsea well systems, such as performing operations in the construction and control of targets in a subsea environment. A submersible ROV performs operations in the construction and control of targets (e.g., well completion components) in a subsea environment. The ROV has one or more imaging devices that capture data that is processed to provide information assisting in the control and operation of the ROV and/or the well completion system while the ROV is subsea.
Information processing device, mobile device, information processing system, and method
Provided is a data processing unit of a user terminal that sets a real object included in a camera-captured image as a marker, generates a marker reference coordinate system with a configuration point of the set marker as the origin, and transmits position data in the marker reference coordinate system to a mobile device (e.g., a drone). The data processing unit transforms the destination position of the drone, or the position of a tracking target, from a coordinate position in the user terminal's camera coordinate system to a coordinate position in the marker reference coordinate system, and transmits the transformed coordinate position to the drone. The data processing unit also receives the movement path from the drone as coordinate position data in the marker reference coordinate system, transforms it into coordinate positions in the user terminal's camera coordinate system, and displays the path information on the display unit.
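The camera-to-marker and marker-to-camera transforms described above can be sketched with homogeneous coordinates. This is a minimal 2D illustration, not the patented method: the planar (x, y, yaw) parameterization and the function names `make_marker_frame`, `camera_to_marker`, and `marker_to_camera` are assumptions for the sketch.

```python
import numpy as np

def make_marker_frame(origin, yaw):
    """Homogeneous transform taking marker-reference coordinates to camera
    coordinates, built from the marker origin and orientation seen in the
    camera image (hypothetical 2D parameterization)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, origin[0]],
                     [s,  c, origin[1]],
                     [0.0, 0.0, 1.0]])

def camera_to_marker(T_cam_from_marker, p_cam):
    """Transform a point from camera coordinates to marker-reference
    coordinates (e.g., a destination tapped on the terminal screen)."""
    p = np.array([p_cam[0], p_cam[1], 1.0])
    return (np.linalg.inv(T_cam_from_marker) @ p)[:2]

def marker_to_camera(T_cam_from_marker, p_marker):
    """Inverse direction, used to display a path received from the drone
    in the terminal's camera view."""
    p = np.array([p_marker[0], p_marker[1], 1.0])
    return (T_cam_from_marker @ p)[:2]
```

The round trip camera → marker → camera recovers the original point, which is the property the terminal relies on when it sends positions to the drone and renders the returned path.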
TRAVELING SYSTEM, TRAVELING METHOD, AND RECORDING MEDIUM STORING TRAVELING PROGRAM
A first acquisition processing unit acquires tag status information including information indicating whether a transport target is present at a position corresponding to a tag and information indicating whether an automatic traveling device is present at the position corresponding to the tag. A second acquisition processing unit acquires AGV status information indicating whether the automatic traveling device is transporting the transport target. A determination processing unit determines whether the automatic traveling device is capable of traveling through the position corresponding to the tag on the basis of the tag status information and the AGV status information. A generation processing unit generates a travel route of the automatic traveling device on the basis of a determination result of the determination processing unit.
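The traversal determination can be sketched as a small predicate over the two status inputs. The occupancy rule below — a position occupied by another AGV is always blocked, while a position holding a transport target blocks only an AGV that is already carrying one — is a hypothetical reading of the abstract, not the claimed logic.

```python
from dataclasses import dataclass

@dataclass
class TagStatus:
    target_present: bool   # is a transport target at the tag position?
    agv_present: bool      # is an automatic traveling device at the tag position?

def can_traverse(tag: TagStatus, agv_is_transporting: bool) -> bool:
    """Determine whether the AGV may travel through the tag position,
    based on the tag status information and the AGV status information."""
    if tag.agv_present:
        # Another device occupies the position: always blocked.
        return False
    if tag.target_present and agv_is_transporting:
        # A loaded AGV cannot pass a position holding a stored target;
        # an empty AGV can (e.g., to drive under and pick it up).
        return False
    return True
```

The route generation step would then search only over positions for which `can_traverse` returns true.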
STORE SHELF IMAGING SYSTEM AND METHOD
A store profile generation system includes a mobile base and an image capture assembly mounted on the base. The assembly includes at least one image capture device for acquiring images of product display units in a retail environment. A control unit acquires the images captured by the at least one image capture device at a sequence of locations of the mobile base in the retail environment. The control unit extracts product-related data from the acquired images and generates a store profile indicating locations of products and their associated tags throughout the retail environment, based on the extracted product-related data. The store profile can be used for generating new product labels for a sale in an appropriate order for a person to match to the appropriate locations in a single pass through the store.
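The single-pass label ordering at the end of the abstract can be sketched as a serpentine sort over the extracted store profile. The `(aisle, position)` layout encoding and the `order_labels_for_single_pass` helper are assumptions for illustration, not the patented procedure.

```python
def order_labels_for_single_pass(store_profile, sale_skus):
    """store_profile: dict mapping SKU -> (aisle index, position along aisle),
    as derived from the captured shelf images. Returns the sale labels sorted
    in a serpentine (boustrophedon) aisle order, so a person can match labels
    to locations in a single pass through the store."""
    def walk_key(sku):
        aisle, pos = store_profile[sku]
        # Walk even aisles in increasing position, odd aisles in decreasing
        # position, mimicking a back-and-forth route through the store.
        return (aisle, pos if aisle % 2 == 0 else -pos)
    return sorted(sale_skus, key=walk_key)
```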
DETERMINING A KINEMATIC STATE OF A LOAD HANDLING DEVICE IN A STORAGE SYSTEM
A method of determining a kinematic state of a load handling device in a storage system. Wheel state data, representative of a state of a wheel of the load handling device, is obtained from one or more sensors communicatively coupled to the wheel. A creep value for the load handling device is determined from the wheel state data using a trained model. The kinematic state of the load handling device is determined based on the creep value, and kinematic data representative of that state is outputted. A positioning system employing the method for the load handling device is also provided.
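The creep-based correction can be sketched as follows. The linear stand-in for the trained model and the simple odometric velocity formula are assumptions; the patent leaves the model architecture and the exact kinematic computation open.

```python
import numpy as np

def predict_creep(wheel_state, weights, bias):
    """Stand-in for the trained model: a linear regressor mapping wheel-state
    features (e.g., wheel speed, motor torque, vertical load) to a creep value
    clipped to [0, 0.99)."""
    return float(np.clip(weights @ wheel_state + bias, 0.0, 0.99))

def kinematic_velocity(wheel_radius, wheel_angular_speed, creep):
    """Ground velocity corrected for creep: when the wheel creeps, its surface
    moves faster than the device, so the raw odometric speed is scaled down."""
    return wheel_radius * wheel_angular_speed * (1.0 - creep)
```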
Aircraft control system, aircraft, aircraft control method, and program
An aircraft control system includes a target instruction value calculation unit configured to acquire a target instruction value to set an aircraft in a target state, a reference velocity calculation unit configured to input, to a reference model in which a reference velocity corresponding to a reference value of an aircraft velocity is set uniquely as an output value according to an input value, a value based on the target instruction value as the input value. A relative velocity calculation unit is configured to calculate a relative velocity of the aircraft to a target position. An estimated disturbance quantity calculation unit is configured to calculate an estimated disturbance quantity acting on the aircraft, based on a difference between the relative and reference velocities, and a correction target instruction value calculation unit is configured to correct the target instruction value, based on the estimated disturbance quantity calculated at a previous time.
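The estimate-and-correct loop described above can be sketched as three small functions. The static reference model, the observer gain, and the sign convention (adding the previous disturbance estimate back into the instruction) are assumptions, not the claimed design.

```python
def reference_velocity(target_cmd: float, k_ref: float = 0.8) -> float:
    """Reference model: maps the target instruction value uniquely to a
    reference velocity (here a hypothetical static gain)."""
    return k_ref * target_cmd

def estimate_disturbance(v_relative: float, v_reference: float,
                         k_obs: float = 1.0) -> float:
    """Estimated disturbance quantity from the difference between the
    reference velocity and the measured relative velocity."""
    return k_obs * (v_reference - v_relative)

def correct_instruction(target_cmd: float, d_hat_previous: float) -> float:
    """Correct the current target instruction using the disturbance estimate
    calculated at the previous time step."""
    return target_cmd + d_hat_previous
```

In a control loop, each cycle would compute `estimate_disturbance` from the current velocities and feed it into `correct_instruction` on the next cycle.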
SHUTTLE VEHICLE TRAVELING AND POSITIONING CONTROL METHOD BASED ON ENCODER SELF-CORRECTION
Disclosed in the present invention is a shuttle vehicle traveling and positioning control method based on encoder self-correction. Provided is a self-correction solution based on track positioning identifiers and external encoders. When the shuttle vehicle travels past each identifier, information is fed back and the servo target position is updated instantly, eliminating at that moment the cumulative error caused by wheel slip; the shuttle vehicle thus realizes a full-closed-loop traveling and positioning control process under the guidance of continuously corrected position information. The shuttle vehicle traveling and positioning control method based on encoder self-correction comprises the following implementation stages: 1) performing customization and initialization; 2) performing self-learning; 3) performing self-correction; 4) updating a target position; and 5) handling a position offset.
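The self-correction stage can be sketched as an odometer that snaps to each surveyed track identifier, so slip error can never accumulate beyond one identifier interval. The `SelfCorrectingOdometer` class and its interface are assumptions for illustration.

```python
class SelfCorrectingOdometer:
    """Position = last identifier's surveyed position + encoder distance
    travelled since that identifier was detected."""

    def __init__(self, counts_per_meter: float):
        self.counts_per_meter = counts_per_meter
        self.marker_position_m = 0.0
        self.marker_count = 0

    def on_marker(self, marker_position_m: float, encoder_count: int) -> None:
        # Track identifier detected: snap to its known position, discarding
        # any slip-induced error accumulated since the previous identifier.
        self.marker_position_m = marker_position_m
        self.marker_count = encoder_count

    def position(self, encoder_count: int) -> float:
        # Dead-reckon only over the short span since the last identifier.
        delta = encoder_count - self.marker_count
        return self.marker_position_m + delta / self.counts_per_meter
```

In the usage below, slip has inflated the encoder reading by 100 counts when the 2 m identifier is passed; the `on_marker` call absorbs that error instantly.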
CONTROL DEVICE AND CONTROL SYSTEM
The control device includes: an actual data acquisition unit that acquires actual data including at least one of a mark displayed on an external monitor mounted on a moving object that can be moved by remote control, and acquired information that is acquired using the mark and includes identification information for identifying the moving object; a reference data acquisition unit that acquires reference data corresponding to the actual data; and a remote control unit that remotely controls the moving object. A process related to the movement of the moving object differs between a case where the actual data and the reference data match and a case where they do not match.
PASSIVE RFID TAG PLACEMENT FOR CYBERNETIC COMMAND AND CONTROL VIA LOCALIZATION
An autonomous appliance configured to operate in a facility. The autonomous appliance includes a chassis including a motor configured to move the autonomous appliance within the facility and a radio frequency identification (RFID) tag reader configured to communicate with infrastructure RFID tags attached to fixed infrastructure and with product RFID tags attached to product containers. The RFID tag reader emits an RF signal that provides ambient RF power to the infrastructure RFID tags and the product RFID tags. The autonomous appliance includes at least one actuator configured to manipulate the product containers and a controller coupled to the RFID tag reader and configured to read information from the infrastructure RFID tags and the product RFID tags. The controller adapts the behavior of the autonomous appliance relative to a first product container based on the read information.
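Coarse localization from the ambient-powered tag reads can be sketched as a strongest-read lookup over the infrastructure tags. The `INFRA-` EPC prefix convention and the RSSI-based selection are hypothetical; the patent does not specify either.

```python
def nearest_infrastructure_tag(reads):
    """reads: dict mapping tag EPC -> received signal strength (dBm).
    Among infrastructure tags (hypothetical 'INFRA-' prefix), return the
    strongest read as a coarse estimate of the appliance's location;
    product-tag reads are ignored for localization."""
    infra = {epc: rssi for epc, rssi in reads.items()
             if epc.startswith("INFRA-")}
    if not infra:
        return None
    return max(infra, key=infra.get)
```

The controller could then adapt behavior toward a product container by combining this location estimate with the product-tag reads.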
ARCHITECTURE AND METHOD FOR AR TAG DETECTION AND LOCALIZATION FOR MOBILE ROBOTS
A robot includes a controller programmed to: when a single tag is detected in an image captured by an imaging sensor, apply a first tag detection algorithm to the image to obtain pose data of the single tag; when two or more tags are detected in the image, apply a second tag detection algorithm to the image to obtain pose data of the two or more tags; obtain pose data of the single tag in a map frame or pose data of the two or more tags in the map frame; determine pose data of the robot in the map frame based on a comparison of the pose data of the tag in the image and the pose data of the tag in the map frame; and operate one or more motors to autonomously navigate the robot based on the pose data of the robot in the map frame.
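The last two steps — recovering the robot's map-frame pose from a tag's observed pose and its known map-frame pose — reduce to a transform composition, sketched here in 2D homogeneous coordinates. The planar `se2` parameterization is an assumption; the patent's AR tags would yield full 6-DoF poses.

```python
import numpy as np

def se2(x, y, theta):
    """Planar pose as a 3x3 homogeneous transform."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def robot_pose_in_map(T_map_tag, T_robot_tag):
    """Compose the tag's map-frame pose with the inverse of the tag's pose
    observed from the robot: T_map_robot = T_map_tag @ inv(T_robot_tag)."""
    return T_map_tag @ np.linalg.inv(T_robot_tag)
```

With two or more detected tags, the same composition could be applied per tag and the resulting robot poses averaged or fused before being handed to the navigation stack.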