G01S5/16

System and method of providing a multi-modal localization for an object
11561553 · 2023-01-24

An example method includes gathering, via a first module of a first type, first simultaneous localization and mapping data and gathering, via a second module of a second type, second simultaneous localization and mapping data. The method includes generating, via a simultaneous localization and mapping module, a first map based on the first simultaneous localization and mapping data and the second simultaneous localization and mapping data, the first map being of a first map type, and generating, via the simultaneous localization and mapping module, a second map based on the first simultaneous localization and mapping data and the second simultaneous localization and mapping data, the second map being of a second map type. The map of the first type is used by vehicles with module(s) of the first and/or second type, while the map of the second type is used exclusively by vehicles with a module of the second type.
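To make the two-map idea concrete, here is a minimal Python sketch under assumed concrete forms that are not in the abstract: the two modules yield 2-D landmark coordinates, the first map type is a sparse feature list and the second an occupancy grid; all function names are illustrative.

```python
# Hypothetical sketch: fuse two SLAM data streams into two map types.
# "camera"/"lidar" stand in for the first- and second-type modules.

def build_feature_map(camera_landmarks, lidar_points):
    """First map type: sparse feature map listing landmarks from both modules."""
    return sorted(set(camera_landmarks) | set(lidar_points))

def build_occupancy_grid(camera_landmarks, lidar_points, cell=1.0):
    """Second map type: occupancy grid marking cells observed by either module."""
    occupied = set()
    for x, y in list(camera_landmarks) + list(lidar_points):
        occupied.add((int(x // cell), int(y // cell)))
    return occupied

camera = [(0.2, 0.3), (2.1, 0.9)]
lidar = [(0.25, 0.31), (3.7, 2.2)]
feature_map = build_feature_map(camera, lidar)
grid = build_occupancy_grid(camera, lidar)
```

Both maps are derived from the same pooled data, matching the abstract's point that each map type fuses both modules' observations.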

Methods and system for multi-target tracking

A computer-implemented method for tracking multiple targets includes identifying a plurality of targets based on a plurality of images obtained from an imaging device carried by an unmanned aerial vehicle (UAV) via a carrier, determining a target group comprising one or more targets from the plurality of targets, and controlling at least one of the UAV or the carrier to track the target group.
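As a rough illustration of tracking a group rather than individual targets, the sketch below computes the group's centroid in normalized image coordinates and derives carrier (gimbal) corrections to re-center it. The names, the deadband, and the proportional control are illustrative assumptions, not from the patent.

```python
def group_centroid(targets):
    """Centroid of the target group in normalized image coordinates [0, 1]."""
    xs = [t[0] for t in targets]
    ys = [t[1] for t in targets]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def control_commands(centroid, deadband=0.05):
    """Yaw/pitch corrections that re-center the group; zero inside the deadband."""
    cx, cy = centroid
    dx, dy = cx - 0.5, cy - 0.5
    yaw = dx if abs(dx) > deadband else 0.0
    pitch = dy if abs(dy) > deadband else 0.0
    return yaw, pitch

centroid = group_centroid([(0.4, 0.5), (0.8, 0.7)])
yaw, pitch = control_commands(centroid)
```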

METHOD, MOBILE DEVICE AND CLEANING ROBOT FOR SPECIFYING CLEANING AREAS
20230229158 · 2023-07-20

A method for specifying a cleaning area to a cleaning robot without a built-in map uses a hand-held mobile device to capture a two-dimensional code label arranged on top of a cleaning robot parked on a charging base, and obtains the positional relationship between the mobile device and the cleaning robot from the captured image. The cleaning robot is controlled to enter a cleaning mode under the guidance of the mobile device. With captured images, a user can specify an area within the environment for cleaning and, through a touch display screen, can control the cleaning robot to go to the specified area for cleaning. The mobile device and the cleaning robot employing the method are also disclosed.
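Once the positional relationship between the device and the robot is known from the code label, a user-specified point can be expressed in the robot's own frame so the robot can drive to it. A toy 2-D sketch, assuming the label yields a planar pose (x, y, heading) of the robot in the device's map frame; the function name is illustrative:

```python
import math

def device_point_to_robot_frame(point, robot_pose):
    """Express a user-specified point (device/map frame) in the robot's frame,
    using the pose recovered from the code label: (x, y, heading)."""
    px, py = point
    rx, ry, theta = robot_pose
    dx, dy = px - rx, py - ry
    c, s = math.cos(-theta), math.sin(-theta)
    return (c * dx - s * dy, s * dx + c * dy)

# Robot at (1, 0) facing +y; a point 2 m ahead of it lands on the
# robot's forward (x) axis in its own frame.
target = device_point_to_robot_frame((1.0, 2.0), (1.0, 0.0, math.pi / 2))
```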

Machine Vision Determination of Location Based on Recognized Surface Features and Use Thereof to Support Augmented Reality

A system and method can support image-based determination of mobile device location through recognition of surface features in a previously scanned physical environment. The system and method can also support authoring and positioning of augmented reality features in an authoring interface using the same images and positions of surface features that are to be used for subsequent mobile device localization. As a result, mobile devices leveraging those same images and positions of surface features for localization will be more likely to obtain a localization that is consistent with the positioning displayed in the authoring interface. Augmented reality features authored using the same scan of the environment can be reliably displayed to an end user of an augmented reality application in a position consistent with their authoring in a common coordinate system, even though the authoring may have been performed remotely, away from the actual situs of the physical environment.
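A highly simplified sketch of localizing against a previously scanned environment: query descriptors are matched to stored scan features and the matched features' anchor positions yield a coarse device location. Toy one-dimensional descriptors and the mean-position estimate are illustrative assumptions, not the patented method.

```python
def localize(query_features, scan_database):
    """Match query descriptors to the stored scan and return the mean
    anchor position of the matched features as a coarse device location.
    scan_database entries are (descriptor, (x, y, z)) pairs."""
    matched = []
    for desc in query_features:
        # Nearest stored descriptor by absolute difference (toy 1-D descriptors).
        best = min(scan_database, key=lambda entry: abs(entry[0] - desc))
        matched.append(best[1])
    n = len(matched)
    return tuple(sum(p[i] for p in matched) / n for i in range(3))

scan_db = [(0.1, (0.0, 0.0, 0.0)), (0.9, (2.0, 0.0, 0.0))]
pos = localize([0.12, 0.88], scan_db)
```

Because AR content authored in the interface is positioned against the same `scan_db`, a device localizing against it lands in the same coordinate system as the authored content.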

APPARATUSES AND METHODS FOR COLLISION AVOIDANCE
20230222928 · 2023-07-13

The present disclosure relates to a collision avoidance concept comprising emitting a modulated light beacon from an object, wherein a luminance of the light beacon is modulated based on a useful signal carrying information on a position of the object, detecting, by an event-based vision sensor of a vehicle, the modulated light beacon of the object and outputting an event signal in response to a detected change in luminance of the modulated light beacon, and estimating a distance between the object and the vehicle based on the event signal.
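One way to read the abstract: the event signal is demodulated into the beacon's payload (the object's position), from which the object-to-vehicle distance follows. A toy sketch assuming a hypothetical payload format of two 8-bit unsigned coordinates in metres; the real modulation, payload layout, and demodulation are not specified here.

```python
def decode_beacon(bits):
    """Decode the demodulated payload (toy format: two 8-bit ints = x, y in metres)."""
    x = int("".join(map(str, bits[:8])), 2)
    y = int("".join(map(str, bits[8:16])), 2)
    return x, y

def estimate_distance(vehicle_pos, bits):
    """Distance between the vehicle and the position carried by the beacon."""
    bx, by = decode_beacon(bits)
    vx, vy = vehicle_pos
    return ((bx - vx) ** 2 + (by - vy) ** 2) ** 0.5

# Beacon encodes position (3, 4); vehicle at the origin.
bits = [0, 0, 0, 0, 0, 0, 1, 1] + [0, 0, 0, 0, 0, 1, 0, 0]
dist = estimate_distance((0.0, 0.0), bits)
```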

ESTIMATING TRACKING SENSOR PARAMETRIZATION USING KNOWN SURFACE CONSTRAINTS

A sensor system and a method of operating a sensor system including a plurality of sensors tracking a moving object in an area having known bounding surfaces. The apparatus and method calculate a time-specific position of the object based on data and sensor parameters from at least two of the plurality of sensors and determine the errors between the calculated time-specific positions. The method and apparatus calculate a minimum system error attributable to the at least two sensors by constraining at least one dimension in each sensor's data used in its calculated time-specific position of the object, the constraint being based on an object/surface interaction, and the minimum system error is calculated by solving for modified sensor parameters for each sensor.
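A toy version of the surface-constraint idea: at floor-contact events the object's true height is known to be zero, so each sensor's vertical bias can be solved for directly and the inter-sensor disagreement minimized. The 1-D constant-bias model below is an illustrative simplification, not the patented parametrization.

```python
def estimate_biases(reports_a, reports_b):
    """At floor-contact events the true height is zero, so each sensor's
    mean reported height at those events estimates its vertical bias."""
    bias_a = sum(reports_a) / len(reports_a)
    bias_b = sum(reports_b) / len(reports_b)
    return bias_a, bias_b

def system_error(reports_a, reports_b, bias_a=0.0, bias_b=0.0):
    """Sum of squared disagreement between the two sensors after bias removal."""
    return sum(((a - bias_a) - (b - bias_b)) ** 2
               for a, b in zip(reports_a, reports_b))

# Heights reported by two sensors at three floor-contact events.
a = [0.10, 0.12, 0.08]
b = [-0.05, -0.03, -0.07]
ba, bb = estimate_biases(a, b)
raw = system_error(a, b)
corrected = system_error(a, b, ba, bb)
```

Solving for the modified parameters (here, the two biases) drives the system error between the sensors' position estimates to its minimum.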

Locating method for localizing at least one object using wave-based signals and locating system

The invention relates to a locating method for localizing at least one object using wave-based signals. A wave field emanates from the object to be localized and is received by a number N of receivers. In every receiver at least one measurement signal is formed, said measurement signal being dependent on the spatial and temporal distribution of the wave field, and the phase progression of said measurement signal being characteristically influenced by the signal propagation time from the object to the receiver. For position locating, phase values for each of the at least two measurement signals are taken as measured phase values, and the current position (P(k)) of the object to be located at time k is determined by comparing at least one linear combination of the measured phase values with at least one linear combination of the associated hypothetical phase values, which result from the transmitter-receiver distance(s), using a recursive filter/estimator.
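A 1-D toy sketch of the phase-comparison recursion: two receivers, the measured linear combination is their phase difference (which cancels the unknown transmit phase), and a fixed-gain recursion stands in for the recursive filter/estimator. Wavelength, geometry, gain, and iteration count are all assumptions for illustration.

```python
import math

LAMBDA = 1.0   # wavelength (assumed)
D = 10.0       # receiver separation (assumed); receivers at x = 0 and x = D

def phase_diff(x):
    """Hypothetical phase difference for an object at x between the receivers,
    derived from the two transmitter-receiver distances."""
    d1, d2 = abs(x), abs(D - x)
    return 2 * math.pi * (d1 - d2) / LAMBDA

def wrap(a):
    """Wrap a phase to (-pi, pi]."""
    return (a + math.pi) % (2 * math.pi) - math.pi

def recursive_estimate(measured, x0=5.0, gain=0.5, iters=40):
    """Recursive estimator: correct the position by the wrapped phase
    innovation (measured minus hypothetical linear combination)."""
    x = x0
    for _ in range(iters):
        innovation = wrap(measured - phase_diff(x))
        x += gain * LAMBDA / (4 * math.pi) * innovation
    return x

# Recover an object at x = 5.2 from its measured phase difference.
x_hat = recursive_estimate(phase_diff(5.2))
```

The step size `LAMBDA / (4 * pi)` is the inverse sensitivity of the phase difference to position in this 1-D geometry, so the innovation converts directly into a position correction.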
