G05D111/10

Nighttime cooperative positioning method based on unmanned aerial vehicle group

Disclosed is a nighttime cooperative positioning method based on an unmanned aerial vehicle (UAV) group, falling within the technical field of aircraft navigation and positioning. In the present disclosure, cooperative visual positioning and collision warning for UAVs are realized by means of the light colors of the UAVs, their respective two-dimensional turntable cameras, and a communication topology network, without adding extra equipment and without relying on an external signal source, thereby avoiding external interference. Compared with conventional positioning methods, the present disclosure effectively simplifies the system, and cooperative positioning within a UAV cluster can be realized relatively simply and at low cost to maintain the formation of the UAV group.
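
The abstract does not disclose the estimator behind the cooperative visual positioning, so the following is only a minimal sketch: it assumes each UAV measures the bearing (azimuth) of a neighbour's coloured light with its turntable camera, that two such bearings plus the observers' own positions are shared over the communication topology, and that the geometry is planar. The function name `triangulate_from_bearings` and all numeric values are illustrative assumptions, not the patented method.

```python
import numpy as np

def triangulate_from_bearings(p1, az1, p2, az2):
    """Estimate the position of an observed UAV from two bearing rays.

    p1, p2   -- known observer positions (x, y)
    az1, az2 -- azimuths of the observed light, in radians from the +x axis
    Returns the least-squares intersection of the two rays."""
    d1 = np.array([np.cos(az1), np.sin(az1)])   # unit direction of ray 1
    d2 = np.array([np.cos(az2), np.sin(az2)])   # unit direction of ray 2
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters (t1, t2).
    A = np.column_stack((d1, -d2))
    b = np.asarray(p2, float) - np.asarray(p1, float)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.asarray(p1, float) + t[0] * d1

# Two UAVs observe the same coloured light and exchange their bearings.
print(triangulate_from_bearings((0.0, 0.0), np.deg2rad(45.0),
                                (10.0, 0.0), np.deg2rad(135.0)))  # ~ [5. 5.]
```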

Automatic berthing system and method

An automatic berthing system and method for a vessel are provided. The automatic berthing system comprises: a controller controlling the vessel; an input part receiving an input to perform an automatic berthing mode; and at least one peripheral sensor detecting position information and speed information of another vessel to determine whether a wake caused by the other vessel influences the vessel. The controller performs automatic berthing control from when the input to perform the automatic berthing mode is received until the vessel reaches a berthing position, and stops the automatic berthing control when, based on the position information, the other vessel is located within a predetermined distance of the vessel.
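
The abstract describes a supervisory rule rather than a full control law; the sketch below captures only that rule under hypothetical names (`berthing_step`, `safety_radius`, `arrival_tol`) and a planar geometry, and leaves the actual vessel steering unspecified.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class VesselState:
    x: float
    y: float

def distance(a: VesselState, b: VesselState) -> float:
    return hypot(a.x - b.x, a.y - b.y)

def berthing_step(own, berth, others, safety_radius=50.0, arrival_tol=2.0):
    """One decision cycle of the berthing supervisor.

    Returns 'done' once the berthing position is reached, 'hold' while
    another vessel is within the predetermined distance, and 'berth' while
    automatic berthing control should continue."""
    if distance(own, berth) <= arrival_tol:
        return "done"
    if any(distance(own, other) < safety_radius for other in others):
        return "hold"   # stop automatic berthing control
    return "berth"      # continue automatic berthing control

own = VesselState(0.0, 0.0)
berth = VesselState(100.0, 0.0)
print(berthing_step(own, berth, others=[VesselState(30.0, 10.0)]))  # -> 'hold'
```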

Vehicle body transport system
12202560 · 2025-01-21

A vehicle body transport system includes an unmanned carrier that carries and transports a vehicle body between work stations, and an imaging device including an imaging part that images a traveling route of the unmanned carrier and its surroundings from above, an analysis part that analyzes an image captured by the imaging part, and a transmission part that transmits a signal to the unmanned carrier. When a moving object other than the unmanned carrier carrying the vehicle body is present in the image, the analysis part predicts whether a movement trajectory along which the vehicle body will pass after a predetermined time intersects a movement position where the moving object will be located after the predetermined time. When it is predicted that the movement trajectory and the movement position intersect after the predetermined time, the transmission part transmits an emergency operation signal to the unmanned carrier before the predetermined time elapses.
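
The abstract leaves the motion model open; the sketch below assumes a simple constant-velocity prediction for both the carrier and the moving object and checks whether the object's predicted position at the horizon lies near any point the carrier passes up to that time. All names and thresholds are illustrative.

```python
import numpy as np

def predict(pos, vel, t):
    """Constant-velocity prediction of a 2D position after t seconds."""
    return np.asarray(pos, float) + t * np.asarray(vel, float)

def trajectory_conflict(carrier_pos, carrier_vel, object_pos, object_vel,
                        horizon=5.0, steps=50, clearance=1.5):
    """True if the object's predicted position at the horizon comes within
    `clearance` metres of the carrier's predicted trajectory."""
    object_at_horizon = predict(object_pos, object_vel, horizon)
    for t in np.linspace(0.0, horizon, steps):
        if np.linalg.norm(predict(carrier_pos, carrier_vel, t) - object_at_horizon) < clearance:
            return True
    return False

if trajectory_conflict((0.0, 0.0), (1.0, 0.0), (5.0, 3.0), (0.0, -0.6)):
    print("transmit emergency operation signal to the unmanned carrier")
```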

Robot and control method thereof

A robot includes: a light emitter configured to output light; a camera; and at least one processor configured to: obtain first information about an object using the camera while the light emitter is outputting the light, obtain second information about the object using the camera while the light emitter is not outputting the light, obtain third information about the object based on the first information and the second information, obtain information about an external light area based on at least one from among the first information, the second information, and the third information, and generate a driving path of the robot based on the information about the external light area.
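
How the first, second and third information are combined is not stated in the abstract; a minimal sketch, assuming greyscale frames and fixed thresholds, is to take the per-pixel difference between the lit and unlit frames as the third information and to treat pixels that stay bright with the emitter off as an external light area. The threshold values and names below are assumptions.

```python
import numpy as np

def analyze_frames(lit_frame, unlit_frame, external_thresh=200, reflect_thresh=30):
    """Compare a frame taken with the light emitter on against one taken with it off.

    third         -- per-pixel contribution of the robot's own light
    external_mask -- pixels bright even with the emitter off (external light area)
    reflect_mask  -- pixels that strongly reflect the robot's own light"""
    third = lit_frame.astype(int) - unlit_frame.astype(int)
    external_mask = unlit_frame >= external_thresh
    reflect_mask = third >= reflect_thresh
    return third, external_mask, reflect_mask

lit = np.full((4, 4), 120, dtype=np.uint8)
unlit = np.full((4, 4), 90, dtype=np.uint8)
unlit[0, 0] = 230    # e.g. a sunlit patch stays bright with the light off
third, external, reflect = analyze_frames(lit, unlit)
# The external mask could then be treated as a region to account for when
# generating the robot's driving path.
```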

Simultaneous localization and mapping (SLAM) method

A method for determining an ego pose of a mobile system and creating a surfel map of a surrounding area of the mobile system via an optimization problem represented by a factor graph includes the steps of: receiving environment sensor data generated by an environment sensor attached to the mobile system, wherein the environment sensor surveys the surrounding area of the mobile system and the environment sensor data represent the surrounding area as a point cloud; generating surfels by converting the point cloud of the received environment sensor data into surfel data; identifying new surfels and known surfels among the generated surfels by comparing the surfel data with the surfel map; and adding a surfel factor for the known surfels to the factor graph and/or adding a surfel node and a surfel factor for the new surfels to the factor graph.
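
The abstract does not define how surfels are generated or matched; the sketch below assumes a simple voxel-based plane fit (centroid plus normal from an SVD) for surfel generation and a nearest-centroid distance test to separate known from new surfels, and it omits the factor-graph bookkeeping entirely. All parameter values are illustrative.

```python
import numpy as np

def points_to_surfels(points, cell=0.5):
    """Convert an Nx3 point cloud into surfels (centroid, normal) by fitting
    a plane to the points falling into each voxel of edge length `cell`."""
    surfels = []
    keys = np.floor(points / cell).astype(int)
    for key in {tuple(k) for k in keys}:
        pts = points[(keys == key).all(axis=1)]
        if len(pts) < 3:
            continue
        centroid = pts.mean(axis=0)
        # Normal = direction of smallest variance of the voxel's points.
        _, _, vt = np.linalg.svd(pts - centroid)
        surfels.append((centroid, vt[-1]))
    return surfels

def match_surfels(surfels, surfel_map, radius=0.5):
    """Split generated surfels into known (near a map surfel) and new ones."""
    known, new = [], []
    for centroid, normal in surfels:
        dists = [np.linalg.norm(centroid - c) for c, _ in surfel_map]
        (known if dists and min(dists) < radius else new).append((centroid, normal))
    return known, new

cloud = np.random.rand(500, 3) * 2.0
known, new = match_surfels(points_to_surfels(cloud), surfel_map=[])
# known surfels would receive a surfel factor; new surfels a surfel node and factor
```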

Docking orientation device

Autonomous alignment of docking bodies is provided. A first-axis surface is identified as a target axis surface corresponding to a desired docking approach vector of a first body to a second body. The first body is maneuvered to align relative to a second-axis surface until a third-axis surface is less than a defined surface detection threshold level. The first body then continues to be maneuvered to align relative to the third-axis surface until the second-axis surface is less than the defined surface detection threshold level. Docking between the first body and the second body is completed in accordance with the desired docking approach vector based on alignment of the first body with the target axis surface.
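
The abstract describes a two-phase alignment without naming concrete sensors or actuators; the sketch below mirrors that ordering with abstract callables (`read_surface`, `maneuver`) and a toy simulation, all of which are assumptions made for illustration.

```python
def align_for_docking(read_surface, maneuver, threshold=0.05, max_steps=1000):
    """Two-phase alignment loop following the ordering in the abstract.

    read_surface(axis) -> sensed level for the 'second' or 'third' axis surface
    maneuver(axis)     -> one maneuver step aligning relative to that surface"""
    # Phase 1: align to the second-axis surface until the third-axis
    # surface level drops below the detection threshold.
    for _ in range(max_steps):
        if read_surface("third") < threshold:
            break
        maneuver("second")
    # Phase 2: align to the third-axis surface until the second-axis
    # surface level drops below the detection threshold.
    for _ in range(max_steps):
        if read_surface("second") < threshold:
            break
        maneuver("third")
    return "docked"   # approach along the first (target) axis and dock

# Toy model: aligning to one axis surface reduces the other surface's level.
levels = {"second": 1.0, "third": 1.0}
def read_surface(axis):
    return levels[axis]
def maneuver(axis):
    other = "third" if axis == "second" else "second"
    levels[other] *= 0.5

print(align_for_docking(read_surface, maneuver))  # -> 'docked'
```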

METHOD AND SYSTEM FOR AUTONOMOUS EXPLORATION AND SCANNING

A computer-implemented method is provided for autonomously exploring, by a mobile robot, one or more objects of interest, the mobile robot comprising a computing unit and a laser scanner module, having a field of view, for scanning surfaces of the one or more objects of interest. The method comprises defining a 3D exploration map in which the one or more objects of interest are situated, partitioning the exploration map into a multitude of 3D exploration blocks, and autonomously exploring the exploration map by means of the mobile robot, wherein the exploration comprises generating, by the laser scanner module, scan data related to a point cloud while the mobile robot is travelling along an exploration path.
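
The abstract does not say how the exploration map is partitioned; a minimal sketch, assuming an axis-aligned bounding box split into cubic blocks and a simple per-block explored/unexplored state, is shown below. The function names and block size are illustrative.

```python
import numpy as np
from itertools import product

def partition_exploration_map(bounds_min, bounds_max, block_size):
    """Partition an axis-aligned 3D exploration map into cubic exploration
    blocks, each initially marked 'unexplored'."""
    mins = np.asarray(bounds_min, float)
    maxs = np.asarray(bounds_max, float)
    counts = np.ceil((maxs - mins) / block_size).astype(int)
    return {idx: "unexplored" for idx in product(*(range(c) for c in counts))}

def mark_scanned(blocks, scan_points, bounds_min, block_size):
    """Mark every block that contains at least one scan point as 'explored'."""
    idx = np.floor((np.asarray(scan_points, float) - bounds_min) / block_size).astype(int)
    for i in map(tuple, idx):
        if i in blocks:
            blocks[i] = "explored"
    return blocks

blocks = partition_exploration_map((0, 0, 0), (4, 4, 2), block_size=1.0)
blocks = mark_scanned(blocks, [(0.3, 0.2, 0.5), (2.7, 1.1, 0.4)], (0, 0, 0), 1.0)
print(sum(state == "explored" for state in blocks.values()))  # -> 2
```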

INFORMATION PROCESSING SYSTEM, AUTONOMOUS TRAVELING BODY, INFORMATION PROCESSING APPARATUS, METHOD FOR CONTROLLING AUTONOMOUS TRAVELING BODY AND RECORDING MEDIUM

An information processing system controls an autonomous traveling body capable of autonomously traveling on a learned route. The information processing system includes a route information storage unit that stores suspension point information indicating a suspension point at which the autonomous traveling body suspended autonomous traveling on a particular learned route, and an acquisition unit that acquires current position information indicating a current position of the autonomous traveling body in response to an instruction to resume the autonomous traveling, and the system controls the autonomous traveling body to return to the particular learned route based on at least the current position information and the suspension point information.
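
The abstract specifies what information the return decision uses but not the return policy itself; the sketch below assumes the learned route is a list of waypoints and rejoins it at the earlier of the suspension point and the waypoint nearest to the current position, which is only one plausible policy.

```python
import math

def nearest_route_index(route, position):
    """Index of the learned-route waypoint closest to the given position."""
    return min(range(len(route)), key=lambda i: math.dist(route[i], position))

def plan_resume(route, suspension_index, current_position):
    """Rejoin the learned route at the earlier of the suspension point and
    the nearest waypoint, so no segment between them is skipped."""
    rejoin = min(suspension_index, nearest_route_index(route, current_position))
    return [current_position] + route[rejoin:]

route = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(plan_resume(route, suspension_index=2, current_position=(2.4, 1.0)))
# -> [(2.4, 1.0), (2, 0), (3, 0)]
```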

Method for determining a motion path on a surface
12339675 · 2025-06-24

A method for determining a motion path on a surface in an environment, along which motion path a mobile appliance, in particular a robot, preferably a domestic robot or a robot vacuum cleaner, is intended to move. The method includes obtaining environment information and determining a region of the surface intended to be covered by the motion of the mobile appliance; determining, while taking into account the environment information, whether within the region there is at least one uneven area in the surface that can be negotiated by the mobile appliance; and determining the motion path while taking into account the at least one uneven area, if there is one. A mobile appliance is also described.
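
The abstract leaves the path pattern and the traversability test open; the sketch below assumes the region is rasterized into a grid of surface-unevenness values and uses a boustrophedon (lane-by-lane) coverage pattern that simply leaves out cells whose unevenness exceeds what the appliance can negotiate. Grid resolution and threshold are assumptions.

```python
import numpy as np

def coverage_path(unevenness, cell=0.3, negotiable=0.02):
    """Boustrophedon coverage path over a grid of surface-unevenness values;
    cells the appliance cannot negotiate are left out of the path."""
    rows, cols = unevenness.shape
    path = []
    for r in range(rows):
        cols_in_order = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in cols_in_order:
            if abs(unevenness[r, c]) <= negotiable:   # uneven area is negotiable
                path.append((r * cell, c * cell))
    return path

grid = np.zeros((3, 4))
grid[1, 2] = 0.05           # a step too high for the appliance
print(coverage_path(grid))  # path skips the cell at row 1, column 2
```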

Remote operations using immersive telepresence

A method, computer system, and computer program product are provided for performing remote operations by enhanced telepresence. A set of physical robots operates in a three-dimensional (3D) space in which a physical object is contained. The set of physical robots includes cameras separated from one another by one or more distances. Using at least two of the cameras, images are captured and a parallax measurement is generated. A 3D virtual reality environment is generated that includes a space representation of the 3D space, an object virtual representation of the physical object, and a robot virtual representation of the set of physical robots, the robot virtual representation including a point of view located approximately on the set of physical robots. The 3D virtual reality environment is projected using a virtual reality projector.
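
The abstract does not give the parallax formula; as a stand-in, the classical stereo relation depth = focal_length x baseline / disparity is sketched below, with the baseline taken as the distance between two of the robots' cameras. Numeric values are illustrative.

```python
def depth_from_parallax(baseline_m, focal_px, disparity_px):
    """Classical stereo relation: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Two cameras 0.4 m apart, 800 px focal length, 16 px parallax between views.
print(depth_from_parallax(0.4, 800.0, 16.0))  # -> 20.0 metres
```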