G01S11/16

Mapping and tracking system for robots
11048268 · 2021-06-29 ·

A robotic mapping and tracking system including a robot and boundary posts is disclosed. The robot includes an ultrasonic transmitter, a processor, and a camera component. The boundary posts are configured to be placed adjacent to a boundary of a working region, and each boundary post includes an ultrasonic receiver. The time of flight of each ultrasonic wave is measured to determine the distance between the robot and the corresponding boundary post. The camera component of the robot captures an image of the robot's environment. The processor analyzes the image and identifies from it at least a portion of the working region in front of the robot. The processor then determines a moving route based on the identified portion of the working region and the distances between the robot and the boundary posts.
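The ranging step in this abstract reduces to a single formula: distance is the ultrasonic time of flight multiplied by the speed of sound. A minimal sketch, assuming the robot and each post share a common clock (e.g. via an RF sync pulse, which the abstract does not specify); all timings are illustrative:

```python
# Sketch of time-of-flight ultrasonic ranging as described in the abstract.
# Assumption: transmitter and receivers share a synchronized clock.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def tof_distance(t_transmit: float, t_receive: float) -> float:
    """Distance from one ultrasonic time of flight (seconds in, meters out)."""
    return (t_receive - t_transmit) * SPEED_OF_SOUND

# Example: a pulse sent at t = 0.000 s arrives at a boundary post at t = 0.010 s
print(round(tof_distance(0.000, 0.010), 2))  # 3.43 (meters)
```

With three or more such distances to posts at known positions, the robot's planar position could be trilaterated.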

METHOD AND APPARATUS FOR ENHANCED POSITION AND ORIENTATION BASED INFORMATION DISPLAY

Apparatus and methods for enhanced wireless determination of the position and direction of a smart device are described which support the display of a virtual tag upon a user interface of the smart device. Wireless transceivers controlled by the smart device communicate with reference point transceivers to generate data sufficient to determine the relative positions of the wireless transceivers and a direction of interest. LIDAR may be used to verify the position and direction of the smart device as well as the topography of the environment.
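One way the relative positions of two on-device transceivers can yield a "direction of interest" is the bearing of the vector between them. A hedged sketch, with hypothetical positions already resolved in a shared reference frame (the abstract does not give coordinates or an API):

```python
# Sketch: deriving a direction of interest from the positions of two
# transceivers on the smart device. Positions are illustrative values.
import math

def heading_deg(rear: tuple, front: tuple) -> float:
    """Heading in degrees of the vector rear -> front in the XY plane."""
    dx = front[0] - rear[0]
    dy = front[1] - rear[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Two transceivers located by ranging against reference points:
print(heading_deg((0.0, 0.0), (1.0, 1.0)))  # 45.0
```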

SENSORS FOR DETERMINING OBJECT LOCATION

A method includes recording a first timestamp of a detection of an event with a first array, which is either an array of light sensors or an array of acoustic sensors. The array of light sensors and the array of acoustic sensors are mounted on the same chassis. The method further includes determining a direction of the event based on the detection with the first array; recording a second timestamp of a detection of the event by a second array, the other of the light-sensor and acoustic-sensor arrays; determining a distance of the event from the arrays of light and acoustic sensors based on the first timestamp and the second timestamp; and controlling a vehicle, the vehicle including the chassis, based on the event.
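The two-timestamp ranging here is the classic "flash-to-bang" calculation: light from the event arrives effectively instantly, so the lag of the acoustic detection behind the optical one gives range. A minimal sketch with illustrative timings (the patent does not specify values):

```python
# Sketch of distance-from-two-timestamps as described in the abstract.
# Light travel time over these ranges is negligible, so the optical
# timestamp marks the moment the event occurred.

SPEED_OF_SOUND = 343.0  # m/s in air

def event_distance(t_light: float, t_sound: float) -> float:
    """Range to the event from the optical and acoustic timestamps (seconds)."""
    return (t_sound - t_light) * SPEED_OF_SOUND

# Example: sound arrives 0.5 s after the light sensors detect the event
print(event_distance(0.000, 0.500))  # 171.5 (meters)
```

Combined with the bearing determined from the first array, this distance fixes the event's location relative to the chassis.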

SENSORS FOR DETERMINING OBJECT LOCATION

A method includes recording a first timestamp of a detection with a first sensor array of a first physical property of an event; determining a direction of the event, based on the detection with the first sensor array; recording a second timestamp of a detection with a second sensor array of a second physical property of the event; determining a distance of the event from the first and second sensor arrays, based on the first timestamp and the second timestamp; determining the event, based on the first physical property and the second physical property; and taking an action based on the event.

POSITIONING SYSTEM
20210149015 · 2021-05-20 ·

An ultra-wideband, two-way-ranging-based positioning system includes a number of active tags, each having a position, and a number of beacons configured to locate the positions of the active tags. The active tags and the beacons are continuously synchronized to a common time base.
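In single-sided two-way ranging, the responder's turnaround delay is subtracted from the measured round trip and the remainder halved to get the one-way flight time. A hedged sketch of that arithmetic only, with illustrative timings (the patent's synchronization scheme is not modeled here):

```python
# Sketch of UWB single-sided two-way ranging arithmetic.
# t_round: initiator's measured round-trip time; t_reply: responder's
# known turnaround delay. Values are illustrative.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def twr_distance(t_round: float, t_reply: float) -> float:
    """One-way distance from round-trip time minus turnaround delay."""
    return (t_round - t_reply) / 2.0 * SPEED_OF_LIGHT

# 100 ns round trip with an 80 ns turnaround -> 10 ns one-way flight
print(twr_distance(100e-9, 80e-9))  # ~3.0 m
```

In practice, the continuous synchronization to a common time base that the abstract describes also allows one-way (time-difference) measurements, which this sketch does not cover.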

Three dimensional object-localization and tracking using ultrasonic pulses

A tracking method includes displaying visual content on a screen of a head-mounted display (HMD). One or more base stations may be stationary with respect to the screen while the visual content is being displayed. In contrast, one or more objects may move with respect to the screen while the visual content is being displayed. Time difference of arrival (TDoA) and/or time of flight (ToF) may be measured for one or more ultrasonic pulses transmitted from the base stations, the objects, or the HMD. The position and orientation of the objects and the HMD may be calculated from the TDoA and ToF. Different pulse frequencies may be used to locate the HMD and the objects. An electromagnetic synchronization signal from the HMD and/or a base station may be used to measure the TDoA. Position and orientation measurements may be fused with outputs from IMUs (inertial measurement units) to reduce jitter.
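The fusion step mentioned at the end can be illustrated with a simple complementary filter: the IMU's smooth short-term motion is trusted between ultrasonic fixes, while the fixes correct long-term drift. A 1-D sketch under that assumption; the gain and sample data are hypothetical, not taken from the patent:

```python
# Sketch: blending jittery ultrasonic position fixes with IMU-integrated
# motion via a complementary filter. Gain and data are illustrative.

def fuse(prev_est: float, imu_delta: float, ultrasonic_fix: float,
         gain: float = 0.1) -> float:
    """One 1-D update: trust the IMU short-term, the ranging long-term."""
    predicted = prev_est + imu_delta          # dead-reckoned from the IMU
    return predicted + gain * (ultrasonic_fix - predicted)

est = 0.0
for imu_d, fix in [(0.10, 0.12), (0.10, 0.19), (0.10, 0.33)]:
    est = fuse(est, imu_d, fix)
print(round(est, 3))  # 0.304
```

A full implementation would run such a correction per axis (or use a Kalman filter over position and orientation jointly), but the structure is the same: predict from the IMU, correct from the acoustic measurement.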