System and method for validating availability of machine at worksite

A worksite management system may include a worksite controller including one or more worksite controller processors configured to receive a signal indicative of a task to be performed by a machine at a worksite, identify a machine for performing the task, and generate a signal indicative of the machine. The worksite management system may also include a mobile device including one or more mobile device processors configured to receive the signal indicative of the machine, display an image representative of the machine, and display a prompt for a person at the worksite to validate availability of the machine to perform the task.
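The assignment-and-validation flow described above can be sketched as follows. This is a minimal illustration, not the patented implementation; every class, field, and function name here is a hypothetical choice.

```python
# Hypothetical sketch of the worksite controller's task-assignment flow
# and the mobile-device validation prompt. All names are assumptions.
from dataclasses import dataclass


@dataclass
class Machine:
    machine_id: str
    capabilities: set


class WorksiteController:
    def __init__(self, machines):
        self.machines = machines

    def identify_machine(self, task):
        # Identify a machine for the task: here, the first machine
        # whose capabilities cover it.
        for m in self.machines:
            if task in m.capabilities:
                return m
        return None


def validate_on_mobile(machine, operator_confirms):
    # The mobile device displays the identified machine and prompts a
    # person at the worksite to confirm its availability.
    return machine is not None and operator_confirms


controller = WorksiteController([Machine("EX-01", {"dig", "grade"})])
m = controller.identify_machine("dig")
print(validate_on_mobile(m, operator_confirms=True))  # prints True
```

The key design point in the abstract is that machine selection is automated but availability is confirmed by a human on site, which the `operator_confirms` flag models.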

Automatically moving floor treatment appliance comprising at least one fall sensor

An automatically moving floor treatment appliance has an appliance housing, a drive, a computing element and a plurality of fall sensors. The computing element compares a detection result of a fall sensor with a known reference result and, when the detection result does not correspond with the reference result, determines a malfunctioning of the fall sensor. The computing element compares distances detected chronologically successively by the same fall sensor during a movement of the appliance with one another and, when the distances are identical, determines a malfunctioning of that fall sensor; and/or it compares a detection result of a leading fall sensor with a detection result of a trailing fall sensor and, when the trailing fall sensor detects a slope without the leading fall sensor having detected the slope beforehand, determines a malfunctioning of the leading fall sensor, whereupon the trailing fall sensor takes over from the leading fall sensor.
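The two malfunction checks described in this abstract can be sketched directly. The function names, the window size, and the boolean slope inputs are illustrative assumptions; the patent does not specify them.

```python
def sensor_stuck(readings, window=3):
    # Check 1 (assumed window of 3): if the last `window` chronologically
    # successive distance readings from the same fall sensor are identical
    # while the appliance is moving, the sensor is flagged as malfunctioning.
    recent = readings[-window:]
    return len(recent) == window and len(set(recent)) == 1


def leading_sensor_failed(leading_saw_slope, trailing_saw_slope):
    # Check 2: a trailing fall sensor detecting a slope that the leading
    # fall sensor never reported implies the leading sensor malfunctioned;
    # the trailing sensor then takes over its role.
    return trailing_saw_slope and not leading_saw_slope
```

The first check exploits the fact that a healthy downward-facing distance sensor should see varying floor distances as the appliance moves; a frozen value suggests a stuck sensor.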

System to determine non-stationary objects in a physical space

A physical space contains stationary objects that do not move over time (e.g., a couch) and may have non-stationary objects that do move over time (e.g., people and pets). An autonomous mobile device (AMD) determines and uses an occupancy map of stationary objects to find a route from one point to another in a physical space. Non-stationary objects are detected and prevented from being incorrectly added to the occupancy map. Point cloud data is processed to determine first candidate objects. Image data is processed to determine second candidate objects. These candidate objects are associated with each other and their characteristics assessed to determine if the candidate objects are stationary or non-stationary. The occupancy map is updated with stationary obstacles. During navigation, the occupancy map may be used for route planning while the non-stationary objects are used for local avoidance.
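The association-and-assessment step described above can be sketched as follows. This is a simplified model under stated assumptions: candidates are dicts with 2D centroids, association uses a centroid-distance gate, and an object is "stationary" if its centroid barely moves across observations. Thresholds and data shapes are hypothetical.

```python
import math


def associate(cloud_objs, image_objs, max_dist=0.5):
    # Pair point-cloud candidates with image candidates whose centroids
    # lie within max_dist metres of each other (assumed gating rule).
    pairs = []
    for c in cloud_objs:
        for i in image_objs:
            if math.dist(c["centroid"], i["centroid"]) <= max_dist:
                pairs.append((c["id"], i["id"]))
    return pairs


def is_stationary(positions, tol=0.2):
    # An object whose centroid stays within `tol` metres of its first
    # observed position is treated as stationary and may be added to
    # the occupancy map; otherwise it is handled by local avoidance.
    first = positions[0]
    return all(math.dist(first, p) <= tol for p in positions[1:])
```

The split the abstract describes then follows naturally: only objects passing `is_stationary` update the occupancy map used for route planning, while the rest feed local avoidance.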

Method for eliminating misjudgment of reflective lights and optical sensing system
11921205 · 2024-03-05

An optical sensing system and a method for eliminating misjudgment of a reflective light are provided. The optical sensing system includes a first light source, a second light source, a light sensor, and a processor. The processor is configured to: control the first light source to scan a horizontal detection area; control the light sensor to capture a first frame by receiving first reflective lights from the horizontal detection area; process the first frame to obtain a first reflection pattern, and analyze the first reflection pattern to determine whether an object is within a first portion of the horizontal detection area; if so, control the second light source to scan a first vertical detection area; control the light sensor to capture a second frame from the first vertical detection area; and process the second frame to obtain a second reflection pattern, and analyze the second reflection pattern to determine whether the detected object is a misjudgment.
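The two-stage confirm-or-reject logic in this abstract can be sketched as below. The reflection patterns are modeled as simple lists of return intensities and the thresholds are invented for illustration; the patent's actual pattern analysis is not specified here.

```python
def object_in_first_portion(horizontal_pattern, threshold=0.5):
    # Stage 1: any sufficiently strong return in the horizontal
    # reflection pattern counts as a candidate object.
    return any(v > threshold for v in horizontal_pattern)


def is_misjudgment(vertical_pattern, threshold=0.5):
    # Stage 2: if the vertical scan shows no solid return at the
    # candidate location, the horizontal hit was likely a stray
    # reflection rather than a real object.
    return not any(v > threshold for v in vertical_pattern)


def detect(horizontal_pattern, vertical_pattern):
    if not object_in_first_portion(horizontal_pattern):
        return "no object"
    if is_misjudgment(vertical_pattern):
        return "misjudged reflection"
    return "real object"
```

The design point is that the second (vertical) scan is only triggered when the first (horizontal) scan reports a candidate, so the extra scan cost is paid only to veto false positives.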

Multiple stage image based object detection and recognition

Systems, methods, tangible non-transitory computer-readable media, and devices for autonomous vehicle operation are provided. For example, a computing system can receive object data that includes portions of sensor data. The computing system can determine, in a first stage of a multiple stage classification using hardware components, one or more first stage characteristics of the portions of sensor data based on a first machine-learned model. In a second stage of the multiple stage classification, the computing system can determine second stage characteristics of the portions of sensor data based on a second machine-learned model. The computing system can generate an object output based on the first stage characteristics and the second stage characteristics. The object output can include indications associated with detection of objects in the portions of sensor data.
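The cascade structure of the multiple stage classification can be sketched as below. The two stand-in scoring functions are placeholders for the patent's machine-learned models (a cheap hardware-friendly first stage and a heavier second stage); the thresholds and patch representation are assumptions.

```python
def first_stage(patch):
    # Stand-in for the first machine-learned model: a cheap score
    # suitable for a fast hardware filter.
    return sum(patch) / len(patch)


def second_stage(patch):
    # Stand-in for the second, heavier machine-learned model, applied
    # only to patches the first stage did not reject.
    return max(patch)


def classify(patches, t1=0.3, t2=0.8):
    # Multi-stage cascade: cheap rejection first, expensive
    # confirmation second. Returns indices of detected objects.
    detections = []
    for i, p in enumerate(patches):
        if first_stage(p) < t1:
            continue  # rejected cheaply in stage one
        if second_stage(p) >= t2:
            detections.append(i)
    return detections
```

The benefit the abstract implies is throughput: most sensor-data portions are discarded by the inexpensive first stage, so the expensive second model only runs on a small remainder.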

Automated inspection of autonomous vehicle equipment

An equipment inspection system receives data captured by a sensor of an autonomous vehicle (AV). The captured data describes a current state of equipment for servicing the AV. The equipment inspection system compares the captured data to a model describing an expected state of the equipment. The equipment inspection system determines, based on the comparison, that the equipment differs from the expected state. The equipment inspection system may transmit data describing the current state of the equipment to an equipment manager. The equipment manager may schedule maintenance for the equipment based on the current state of the equipment.
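The compare-against-model step can be sketched as a per-field tolerance check. The field names, tolerance scheme, and dict representation are illustrative assumptions, not the patent's data model.

```python
def inspect(captured, expected, tolerances):
    # Compare each captured measurement with the model's expected value;
    # collect fields that are missing or outside tolerance.
    deviations = {}
    for key, exp in expected.items():
        got = captured.get(key)
        if got is None or abs(got - exp) > tolerances.get(key, 0.0):
            deviations[key] = got
    return deviations


def needs_maintenance(deviations):
    # Any deviation triggers a report to the equipment manager, which
    # may then schedule maintenance.
    return bool(deviations)
```

Usage: `inspect(sensor_readings, model_state, tolerances)` returns only the out-of-spec fields, which is the data the abstract says gets transmitted to the equipment manager.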

Method for operating a picking device for medicaments and a picking device for carrying out said method

Picking devices for medicaments are provided. A picking device includes multiple storage spaces for medicament packaging, an operating device movable horizontally in an X-direction and vertically in a Z-direction in front of the storage spaces in a movement space, and an identification device configured for identifying medicament packaging. An optical detection device is configured to create an overall image of the movement space and a control device is coupled to the operating device, the identification device and the optical detection device, wherein the control device is configured to determine the presence of an obstacle in a portion of the movement space and to control the operating device based on the determined presence of the obstacle. Methods of operating picking devices for medicaments are also provided.
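The obstacle check the control device performs can be sketched by treating the overall image as an occupancy grid over the movement space. The grid representation, region format, and halt/proceed decision are simplifying assumptions for illustration.

```python
def obstacle_in_region(overall_image, region):
    # `overall_image` is modeled as a 2D occupancy grid of the movement
    # space; `region` is ((row0, row1), (col0, col1)), half-open ranges.
    # Any occupied cell in the region counts as an obstacle.
    (r0, r1), (c0, c1) = region
    return any(
        overall_image[r][c]
        for r in range(r0, r1)
        for c in range(c0, c1)
    )


def plan_move(overall_image, target_region):
    # The control device halts (or re-routes) the operating device when
    # the optical detection device reports an obstacle in the portion of
    # the movement space the device is about to traverse.
    if obstacle_in_region(overall_image, target_region):
        return "halt"
    return "proceed"
```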

System and method for autonomous operation of a machine

A system for autonomous or semi-autonomous operation of a vehicle is disclosed. The system includes a machine automation portal (MAP) application configured to enable a computing device to (a) display a map of a work site and (b) provide a graphical user interface that enables a user to (i) define a boundary of an autonomous operating zone on the map and (ii) define a boundary of one or more exclusion zones. The system also includes a robotics processing unit configured to (a) receive the boundary of the autonomous operating zone and the boundary of each exclusion zone from the computing device, (b) generate a planned command path that the vehicle will travel to perform a task within the autonomous operating zone while avoiding each exclusion zone, and (c) control operation of the vehicle so that the vehicle travels the planned command path to perform the task.
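The geometric core of the zone logic above is checking that each point on the planned command path lies inside the operating-zone boundary and outside every exclusion-zone boundary. A standard ray-casting point-in-polygon test suffices for a sketch; the polygon representation and function names are assumptions, not the patent's API.

```python
def point_in_polygon(pt, poly):
    # Ray-casting test: count crossings of a rightward horizontal ray
    # with the polygon's edges; an odd count means the point is inside.
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def waypoint_allowed(pt, operating_zone, exclusion_zones):
    # A planned waypoint must lie inside the autonomous operating zone
    # and outside every exclusion zone defined on the map.
    if not point_in_polygon(pt, operating_zone):
        return False
    return not any(point_in_polygon(pt, z) for z in exclusion_zones)
```

Usage: with a square operating zone `[(0,0),(10,0),(10,10),(0,10)]` and an exclusion zone `[(4,4),(6,4),(6,6),(4,6)]`, the point `(2,2)` is allowed while `(5,5)` and `(12,5)` are not.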

Distinguishing between direct sounds and reflected sounds in an environment

Techniques for determining information associated with sounds detected in an environment based on audio data and map data or perception data are discussed herein. A vehicle can use map data and/or perception data to distinguish between multiple audio signals or sounds. A direct source of sound can be distinguished from a reflected source of sound by determining a direction of arrival of sounds and which objects the directions of arrival are associated with in the environment. A reflected sound can be received without receiving a direct sound. Based on the reflected sound and map data or perception data, characteristics of sound in an occluded region of the environment may be determined and used to control the vehicle.
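The direct-versus-reflected discrimination described above can be sketched as matching a direction of arrival (DOA) against map/perception data. Here sound sources and reflecting surfaces are reduced to 2D reference points and the angular tolerance is invented; real DOA estimation and map geometry are far richer.

```python
import math


def angle_to(origin, target):
    # Bearing (radians) from origin to target.
    return math.atan2(target[1] - origin[1], target[0] - origin[0])


def classify_arrival(vehicle_pos, doa, source_pos, wall_points, tol=0.1):
    # If the direction of arrival points at the perceived sound source,
    # treat the sound as direct; if it points at a mapped reflecting
    # surface instead, treat it as a reflection off that surface.
    if abs(doa - angle_to(vehicle_pos, source_pos)) <= tol:
        return "direct"
    for wall in wall_points:
        if abs(doa - angle_to(vehicle_pos, wall)) <= tol:
            return "reflected"
    return "unknown"
```

The "reflected-only" case the abstract highlights falls out of this: when a DOA matches a wall but no direct-path DOA is present, the sound plausibly originates in a region occluded behind that surface, and the vehicle can be controlled accordingly.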