Patent classifications
G05D1/0263
Gaze detection method and apparatus
A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
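The bitmap-lookup idea described above can be sketched as follows; the bitmap resolution, the rectangular zone, and the integer property value are illustrative assumptions for the sketch (NumPy is used for the pixel array), not details from the patent.

```python
import numpy as np

W, H = 64, 48  # resolution of the projected view (assumed)

def build_zone_bitmap(zone_rect, prop, w=W, h=H):
    """Store `prop` at every pixel the zone of interest projects to."""
    bitmap = np.zeros((h, w), dtype=np.int32)  # 0 = no zone at this pixel
    x0, y0, x1, y1 = zone_rect
    bitmap[y0:y1, x0:x1] = prop
    return bitmap

def gaze_hits_zone(bitmap, gaze_xy):
    """Return the stored property if the gaze pixel lies in a zone, else 0."""
    x, y = gaze_xy
    return int(bitmap[y, x])

bitmap = build_zone_bitmap((10, 10, 20, 20), prop=7)
print(gaze_hits_zone(bitmap, (15, 12)))  # gaze inside the zone -> 7
print(gaze_hits_zone(bitmap, (40, 5)))   # gaze outside the zone -> 0
```

Determining whether the gaze hits the zone is then a single array access, regardless of the 3D scene's complexity.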
System and method for independently routing vehicles and delivering containers and closures to unit operation systems
A system and method for independently routing vehicles and delivering containers and closures to unit operation stations are disclosed. The containers and closures can, in some cases, be transported on the same vehicle. In other cases, the containers and closures can be transported on different vehicles.
NAVIGATING A ROBOTIC MOWER ALONG A GUIDE WIRE
A method navigates a robotic mower (2) by means of a wire (8). The robotic mower (2) comprises at least two sensors (12, 14). The method comprises detecting (S101), by means of the at least two sensors (12, 14), at least one signal from the wire (4a; 8; 10), measuring (S102) a polarity of the at least one signal of the wire (4a; 8; 10) by means of each one of the at least two sensors (12, 14), determining (S103) a direction based on the polarities measured by means of the at least two sensors (12, 14), and turning (S104) the robotic mower (2) towards the determined direction.
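The polarity-based direction decision (steps S102–S104) might reduce to something like this sketch; the +1/-1 polarity encoding and the specific turn rule are assumptions for illustration, not the patent's claimed implementation.

```python
def choose_turn(left_polarity, right_polarity):
    """Decide a turn direction from the wire-signal polarity at two sensors.

    Polarity is +1 on one side of the wire and -1 on the other (assumed
    encoding); the mower turns toward the sensor reading the 'inside' sign.
    """
    if left_polarity == right_polarity:
        return "straight"  # both sensors on the same side of the wire
    return "left" if left_polarity > right_polarity else "right"

print(choose_turn(+1, -1))  # left sensor inside the loop -> left
print(choose_turn(-1, +1))  # right sensor inside the loop -> right
print(choose_turn(+1, +1))  # both inside -> straight
```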
METHOD FOR GENERATING INTERSECTION POINT PATTERN RECOGNITION MODEL USING SENSOR DATA OF MOBILE ROBOT AND INTERSECTION POINT PATTERN RECOGNITION SYSTEM
One embodiment of the present invention provides an intersection point pattern recognition system using sensor data of a mobile robot, comprising: a mobile robot that drives autonomously by using sensor data received from a sensor unit and an intersection point pattern recognition model provided by a management server; and the management server, which receives usage environment information of the mobile robot, generates the intersection point pattern recognition model, and provides the intersection point pattern recognition model to the mobile robot. The management server comprises: a map generation unit for receiving the usage environment information of the mobile robot and generating a route map of the mobile robot on the basis of the usage environment information; a normalization unit for generating a virtual map by normalizing the route map according to a preset rule; and a learning unit for generating the intersection point pattern recognition model by using the virtual map and the sensor data of the mobile robot as learning data.
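One way to picture how a route map could yield intersection-pattern learning data is sketched below; the grid encoding (1 = drivable cell) and the neighbour-count labelling rule are assumptions made for the sketch, since the abstract does not specify the map format or learning algorithm.

```python
def intersection_label(grid, r, c):
    """Label a drivable cell by its number of drivable neighbours.

    Cells with three or more drivable neighbours are treated as
    intersection points (assumed rule for this sketch).
    """
    if grid[r][c] != 1:
        return None  # not a drivable cell
    free = sum(grid[r + dr][c + dc]
               for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
               if 0 <= r + dr < len(grid) and 0 <= c + dc < len(grid[0]))
    return "intersection" if free >= 3 else "corridor"

# A tiny route map: a four-way crossing.
route_map = [
    [0, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
]
print(intersection_label(route_map, 1, 1))  # four drivable neighbours
print(intersection_label(route_map, 1, 0))  # dead-end arm of the crossing
```

Labelled cells of this kind, paired with the robot's sensor readings at those cells, would form the (virtual map, sensor data) learning pairs the abstract describes.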
Road-based vehicle guidance system
A vehicle may include a frame structure, a body mounted to the frame structure, and a vehicle navigation system. The vehicle navigation system may include a navigation sensor mounted to the frame structure, and a processor in communication with the navigation sensor. The navigation sensor may be configured to detect reference elements disposed in or on a road on which the vehicle travels. The processor may be configured to receive, from the navigation sensor, signals indicative of a sequence or pattern of detected reference elements. The processor may also be configured to determine, using the received signals, at least one of a position, velocity, or orientation of the vehicle on the road.
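The position and velocity determination from a sequence of detected reference elements can be illustrated as follows; the fixed 1.0 m element spacing and the state structure are assumptions for the sketch, not values from the patent.

```python
from dataclasses import dataclass

SPACING_M = 1.0  # assumed distance between consecutive reference elements

@dataclass
class RoadState:
    position_m: float
    velocity_mps: float

def update_state(count, t_prev, t_now):
    """Estimate position and velocity from the running detection count
    and the time between the last two detections."""
    dt = t_now - t_prev
    return RoadState(position_m=count * SPACING_M,
                     velocity_mps=SPACING_M / dt if dt > 0 else 0.0)

s = update_state(count=42, t_prev=10.0, t_now=10.5)
print(s.position_m, s.velocity_mps)  # 42.0 2.0
```

A non-uniform sequence or pattern of elements (rather than uniform spacing) would additionally let the processor recover an absolute position after a cold start.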
Method for Detecting Physical Forbidden Zone and Global Relocating of Service Robot
The present disclosure provides a method for detecting a physical forbidden zone and globally relocating a service robot, the method comprising: presetting an artificial identification on an edge of a physical forbidden zone that the service robot cannot enter in a working scenario; constantly detecting whether an artificial identification is present in the working scenario during operation of the service robot; when an artificial identification is detected, identifying it and determining position and heading angle information of the service robot relative to the artificial identification; and controlling a motion trajectory of the service robot according to the position and heading angle information relative to the artificial identification, so as to forbid the service robot from entering the respective physical forbidden zone.
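The trajectory-control decision might reduce to a predicate like the following sketch; the trigger distance and heading cone are assumed thresholds chosen for illustration, not values from the disclosure.

```python
def entering_forbidden_zone(rel_distance_m, rel_heading_deg,
                            trigger_distance_m=0.5, half_cone_deg=60.0):
    """True if the robot is close to the marker and heading toward it.

    rel_distance_m / rel_heading_deg are the robot's pose relative to the
    artificial identification on the forbidden-zone edge (assumed inputs).
    """
    return (rel_distance_m < trigger_distance_m
            and abs(rel_heading_deg) < half_cone_deg)

print(entering_forbidden_zone(0.3, 10.0))  # close and heading in -> True
print(entering_forbidden_zone(0.3, 90.0))  # heading past the marker -> False
print(entering_forbidden_zone(2.0, 0.0))   # still far away -> False
```

When the predicate fires, the motion controller would replan the trajectory away from the marker.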
VEHICLE BODY TRANSPORT SYSTEM
A vehicle body transport system includes an unmanned carrier that carries and transports a vehicle body between work stations, and an imaging device including an imaging part that images a traveling route of the unmanned carrier and the surroundings of the traveling route from above, an analysis part that analyzes an image captured by the imaging part, and a transmission part that transmits a signal to the unmanned carrier. When a moving object other than the unmanned carrier carrying the vehicle body is present in the image, the analysis part predicts whether the movement trajectory that the vehicle body will follow after a predetermined time intersects the position where the moving object will be located after that time. When predicting that the movement trajectory and the movement position intersect, the transmission part transmits an emergency operation signal to the unmanned carrier before the predetermined time elapses.
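The analysis part's intersection prediction can be sketched with a constant-velocity extrapolation; the prediction horizon, the safety radius, and the constant-velocity assumption itself are illustrative simplifications, not the patent's method.

```python
def predict(pos, vel, t):
    """Linearly extrapolate a 2D position over t seconds."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def emergency_needed(carrier_pos, carrier_vel, obj_pos, obj_vel,
                     horizon_s=3.0, safety_radius_m=1.5):
    """True if the predicted positions at the horizon fall within the
    safety radius, i.e. the trajectories are expected to intersect."""
    cx, cy = predict(carrier_pos, carrier_vel, horizon_s)
    ox, oy = predict(obj_pos, obj_vel, horizon_s)
    return (cx - ox) ** 2 + (cy - oy) ** 2 <= safety_radius_m ** 2

# Carrier moving +x, object moving -x toward the same point.
print(emergency_needed((0, 0), (1, 0), (6, 0), (-1, 0)))  # True
# Object stationary and well clear of the carrier's path.
print(emergency_needed((0, 0), (1, 0), (6, 5), (0, 0)))   # False
```

In the abstract's terms, a True result before the horizon elapses is what triggers the emergency operation signal.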
TRAVELING APPARATUS
In a traveling apparatus, multiple first detectors and multiple second detectors are arrayed in an intersectant direction intersecting a travel direction where a vehicle body travels. The second detectors are separated in the travel direction from the first detectors. The first detectors and the second detectors detect a guide extending on a road surface. A first interval between two first detectors adjacent to each other in the intersectant direction is smaller than a second interval between two second detectors adjacent to each other in the intersectant direction.
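The role of the two detector intervals can be illustrated as follows: the finer first row gives a more precise lateral fix on the guide, and the coarser second row, offset along the travel direction, supplies a second point from which the guide's angle can be estimated. The detector counts, intervals, and row gap below are assumed values chosen for the sketch.

```python
import math

FIRST_INTERVAL_M = 0.05   # fine spacing in the first detector row (assumed)
SECOND_INTERVAL_M = 0.10  # coarser spacing in the second row (assumed)
ROW_GAP_M = 0.30          # row separation along the travel direction (assumed)

def lateral_offset(active_index, n_detectors, interval):
    """Guide offset from the row centre, given which detector sees it."""
    centre = (n_detectors - 1) / 2
    return (active_index - centre) * interval

def guide_angle_rad(first_idx, second_idx):
    """Guide angle relative to the travel direction, from one fix per row
    (9 fine detectors in the first row, 5 coarse in the second)."""
    dy = (lateral_offset(second_idx, 5, SECOND_INTERVAL_M)
          - lateral_offset(first_idx, 9, FIRST_INTERVAL_M))
    return math.atan2(dy, ROW_GAP_M)

print(round(lateral_offset(5, 9, FIRST_INTERVAL_M), 3))  # 0.05 m right of centre
print(round(guide_angle_rad(5, 2), 3))                   # guide angled back left
```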