VEHICLE TRACKING SYSTEM
20220396261 · 2022-12-15
Inventors
CPC classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0953
PERFORMING OPERATIONS; TRANSPORTING
G01C21/005
PHYSICS
B60W2554/4049
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W30/095
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A system including a proximity monitor associated with an entity. The proximity monitor includes a communication interface, a memory, and an electronic processor. The communication interface receives first position state data and identification data associated with a neighboring entity. The electronic processor includes instructions to associate a first shape model with the neighboring entity based on the identification data, determine a first space occupied by the neighboring entity based on the first position state data and the first shape model, identify a proximity event based on the first space occupied by the neighboring entity relative to a position of the entity, and generate an alert based on the proximity event.
Claims
1. A system comprising: a proximity monitor associated with an entity and including: a communication interface configured to receive first position state data and identification data associated with a neighboring entity; a memory; and an electronic processor including instructions to: associate a first shape model with the neighboring entity based on the identification data; determine a first space occupied by the neighboring entity based on the first position state data and the first shape model; identify a proximity event based on the first space occupied by the neighboring entity relative to a position of the entity; and generate an alert based on the proximity event.
2. The system of claim 1, wherein the electronic processor is configured to: associate a second shape model with the entity; determine a second space occupied by the entity based on second position state data associated with the entity and the second shape model; and identify the proximity event based on the first and second spaces.
3. The system of claim 2, wherein the electronic processor is configured to identify the proximity event responsive to the first space associated with the neighboring entity being within a proximity threshold of the second space associated with the entity.
4. The system of claim 3, wherein the entity is a vehicle and the proximity threshold is defined for the entirety of the vehicle based on the second shape model.
5. The system of claim 2, wherein the electronic processor is configured to: generate the first space using a predicted position state of the neighboring entity; generate the second space using a predicted position state of the entity; and generate the proximity event responsive to the first space overlapping the second space.
6. The system of claim 1, further comprising a vehicle tracking unit configured to receive position data from the proximity monitor to log the proximity event.
7. The system of claim 1, wherein the proximity monitor is associated with a stationary reference point and the neighboring entity is a movable object.
8. The system of claim 1, wherein the proximity monitor interfaces with an external sensor coupled to a movable portion of the entity.
9. The system of claim 1, wherein the proximity monitor interfaces with an external sensor coupled to a movable portion of the neighboring entity.
10. The system of claim 1, wherein the proximity monitor interfaces with an external sensor coupled to a movable portion of the entity and with another external sensor coupled to a movable portion of the neighboring entity.
11. The system of claim 1, wherein the proximity monitor is configured to broadcast identity and position state information associated with the entity via the communication interface, and to receive identity and position state information associated with the neighboring entity via the communication interface.
12. The system of claim 1, wherein the proximity event includes a predictive position state associated with one or both of the entity and the neighboring entity.
13. A method of proximity tracking, the method comprising: receiving, at a proximity monitor coupled to an entity, first position state data and identification data associated with a neighboring entity via a communication interface of the proximity monitor; associating, via an electronic processor of the proximity monitor, a first shape model with the neighboring entity based on the identification data; determining a first space occupied by the neighboring entity based on the first position state data and the first shape model; identifying a proximity event based on the first space occupied by the neighboring entity relative to a position of the entity; and generating an alert based on the proximity event.
14. The method of claim 13, further comprising: associating a second shape model with the entity; determining a second space occupied by the entity based on second position state data associated with the entity and the second shape model; and identifying the proximity event based on the first and second spaces.
15. The method of claim 14, wherein identifying the proximity event includes determining the first space occupied by the neighboring entity being within a proximity threshold of the second space occupied by the entity.
16. The method of claim 15, wherein the entity is a vehicle, the method further comprising defining the proximity threshold for the entirety of the vehicle based on the second shape model.
17. The method of claim 14, further comprising: generating the first space using a predicted position state of the neighboring entity; generating the second space using a predicted position state of the entity; and generating the proximity event responsive to the first space overlapping the second space.
18. The method of claim 17, further comprising: adjusting the predicted position state of the entity in response to a change in speed of the entity, and adjusting the predicted position state of the neighboring entity in response to a change in speed of the neighboring entity.
19. The method of claim 13, further comprising interfacing the proximity monitor with an external sensor coupled to a movable portion of the entity and with another external sensor coupled to a movable portion of the neighboring entity.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
[0010] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
[0011] The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0012] Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the accompanying drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
[0013] One or more embodiments are described and illustrated in the following description and accompanying drawings. These embodiments are not limited to the specific details provided herein and may be modified in various ways. Furthermore, other embodiments may exist that are not described herein. Also, the functionality described herein as being performed by one component may be performed by multiple components in a distributed manner. Likewise, functionality performed by multiple components may be consolidated and performed by a single component. Similarly, a component described as performing particular functionality may also perform additional functionality not described herein. For example, a device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed. Furthermore, some embodiments described herein may include one or more electronic processors configured to perform the described functionality by executing instructions stored in non-transitory, computer-readable medium. Similarly, embodiments described herein may be implemented as non-transitory, computer-readable medium storing instructions executable by one or more electronic processors to perform the described functionality. As used herein, “non-transitory computer-readable medium” includes all computer-readable media but does not consist of a transitory, propagating signal. Accordingly, non-transitory computer-readable medium may include, for example, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a RAM (Random Access Memory), register memory, a processor cache, or any combination thereof.
[0014] Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. For example, the use of “including,” “containing,” “comprising,” “having,” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “connected” and “coupled” are used broadly and encompass both direct and indirect connecting and coupling. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings and can include electrical connections or couplings, whether direct or indirect. In addition, electronic communications and notifications may be performed using wired connections, wireless connections, or a combination thereof and may be transmitted directly or through one or more intermediary devices over various types of networks, communication channels, and connections. Moreover, relational terms such as first and second, top and bottom, and the like may be used herein solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
[0017] The proximity monitors 110 interface with a vehicle tracking unit 120 through a communication network 125 (e.g., WiFi, BLUETOOTH®, cellular, Internet, RF, Ultra Wideband (UWB), or the like). In some embodiments, the vehicle tracking unit 120 may be implemented as a cloud-based resource using a virtual computing environment. In some embodiments, the vehicle tracking unit 120 is a server. The proximity monitors 110 cooperate with the vehicle tracking unit 120 to identify and log proximity events between vehicles 105A-105H and to alert operators of the vehicles 105A-105H of potential collisions or proximity threshold violations. In some embodiments, each proximity monitor 110 identifies proximity events with respect to neighboring or adjacent vehicles or structures 105A-105H. In other embodiments, the vehicle tracking unit 120 receives position data from the proximity monitors 110 and generates alerts or logs events. In some embodiments, the proximity monitor 110 is mounted to the vehicle 105A-105H.
[0018] In some embodiments, an individual 135 may carry a proximity monitor 110 to facilitate alerts should the individual come within a proximity threshold of a vehicle 105A-105H. In some embodiments where a proximity monitor 110 is carried by a driver of the vehicle 105A-105H, proximity tracking may be disabled for the vehicle 105A-105H being operated. In some embodiments, a shape model is employed for an individual.
[0019] In some embodiments, a proximity monitor 110 is associated with a stationary reference point 140. Example stationary reference points 140 include buildings or fixed structures. Shape models may also be associated with the stationary reference points 140.
[0022] The memory 210 includes read only memory (ROM), random access memory (RAM), other non-transitory computer-readable media, or a combination thereof. The electronic processor 205 is configured to communicate with the memory 210 to store data and retrieve stored data. The electronic processor 205 is configured to receive instructions and data from the memory 210 and execute, among other things, the instructions. In particular, the electronic processor 205 executes instructions stored in the memory 210 to perform the methods described herein. The power source 215 provides power to the various components of the proximity monitor 110. In some embodiments, the memory 210 stores a proximity event log 250 that records events from the proximity detector 220. In some embodiments, the power source 215 includes a rechargeable device, such as a battery, a capacitor, a super capacitor, or the like. The power source 215 may charge the rechargeable device using inductive charging or energy harvesting. In some embodiments, the power source 215 includes a replaceable battery. In some embodiments, the orientation sensor 240 includes an accelerometer, magnetometer, mercury switch, gyroscope, compass, or some combination thereof. In some embodiments, the orientation sensor 240 is an inertial measurement unit (IMU). In some embodiments, the orientation sensor 240 includes a magnetic compass or a pressure sensor.
[0023] The communication interface 225 provides communication between the electronic processor 205 and an external device, such as the vehicle tracking unit 120, over the communication network 125. In some embodiments, the communication interface 225 may include separate transmitting and receiving components. In some embodiments, the communication interface 225 is a wireless transceiver that encodes information received from the electronic processor 205, such as the proximity event log 250, into a carrier wireless signal and transmits the encoded wireless signal to the vehicle tracking unit 120 over the communication network 125. The communication interface 225 also decodes information from a wireless signal received from the vehicle tracking unit 120 over the communication network 125 and provides the decoded information to the electronic processor 205.
[0024] In some embodiments, the user interface 235 includes one or more of input buttons, a display, a visual indicator (e.g., LED light), a vibration indicator, an audible indicator, or the like. In some embodiments, the user interface 235 is employed to set parameters for the proximity monitoring, such as a proximity threshold.
[0025] In general, the proximity monitor 110 uses position and shape model information for neighboring entities, such as the vehicles 105A-105H, individuals 135, and stationary elements 140, to identify proximity events. In some embodiments, the proximity monitor 110 stores a neighbor list 255 of nearby vehicles 105A-105H and a shape model library 260 indicating shape models for various vehicles 105A-105H in the memory 210. Upon identifying a current or ensuing proximity event, the proximity monitor 110 generates an alert and logs a proximity event in the proximity event log 250.
[0026] In general, the proximity monitor 110 uses position tracking to identify proximity events. In some embodiments, proximity events are identified for vehicles 105A-105H, individuals 135, or stationary elements 140. In some embodiments, the proximity monitor 110 uses a current position state in conjunction with a proximity threshold to generate a proximity event if the proximity threshold is violated. For example, a proximity threshold is defined for the entire vehicle 105A-105H based on the shape model. Based on the position state and the shape models, the system determines the presence of a neighboring or adjacent vehicle or structure 105A-105H within the proximity threshold. Overlap or collision (potential or actual) is identified by comparing the coordinates of the neighboring vehicle 105A-105H to the proximity threshold. If a neighboring vehicle 105A-105H is determined to violate the proximity threshold, a proximity event is generated. In some embodiments, the proximity threshold may vary for different portions of the vehicle. For example, the proximity threshold for the body 300 of the belt loader 105G may differ from the proximity threshold for other portions of the belt loader 105G.
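For illustration only, the threshold check described above can be sketched as follows. This is a minimal sketch, not part of the specification: it assumes the shape models are axis-aligned bounding boxes given as half-extents, and all of the names (`Box`, `occupied_space`, `gap_between`, `violates_threshold`) are hypothetical.

```python
import math
from dataclasses import dataclass


@dataclass
class Box:
    # Axis-aligned bounding box: minimum and maximum corners, in meters.
    mn: tuple
    mx: tuple


def occupied_space(position, half_extents):
    """Space occupied by an entity: its shape model centered on its position state."""
    mn = tuple(p - h for p, h in zip(position, half_extents))
    mx = tuple(p + h for p, h in zip(position, half_extents))
    return Box(mn, mx)


def gap_between(a, b):
    """Minimum separation between two boxes (0.0 if they touch or overlap)."""
    per_axis = [max(b.mn[i] - a.mx[i], a.mn[i] - b.mx[i], 0.0) for i in range(3)]
    return math.sqrt(sum(g * g for g in per_axis))


def violates_threshold(entity, neighbor, threshold):
    """A proximity event occurs when a neighbor comes within the threshold."""
    return gap_between(entity, neighbor) < threshold
```

Under these assumptions, two vehicles whose occupied spaces are separated by 2 m would violate a 3 m proximity threshold but not a 1.5 m threshold.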
[0027] In some embodiments, the proximity monitor 110 uses one or more predicted future position states to generate a proximity event responsive to the future position states indicating overlap or collision of the associated vehicles 105A-105H. Based on the position states and the shape models, overlap or collision is identified by comparing the coordinates of the vehicles 105A-105H relative to one another. The position states of the vehicles 105A-105H may be provided to a predictive filter, such as a Kalman filter, to predict the position states at a future time (e.g., one second, two seconds, five seconds, etc.). In some embodiments, the future time interval may change based on the speed of the vehicle 105A-105H. For example, the predicted future time interval for potential overlap or collision may be decreased as vehicle speed increases.
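The specification names a Kalman filter for this prediction. For illustration only, a constant-velocity extrapolation with a speed-dependent look-ahead interval conveys the same idea; the function names and the scaling constants below are hypothetical, not taken from the specification.

```python
def predict_position(position, velocity, horizon_s):
    """Extrapolate a position state `horizon_s` seconds ahead at constant velocity.
    A simplified stand-in for the predictive (e.g., Kalman) filter described above."""
    return tuple(p + v * horizon_s for p, v in zip(position, velocity))


def look_ahead_horizon(speed_mps, base_horizon_s=5.0, min_horizon_s=1.0):
    """Decrease the prediction interval as vehicle speed increases.
    The scaling (halving the horizon for every 5 m/s of speed) is an arbitrary
    example of the speed-dependent interval described in this paragraph."""
    return max(base_horizon_s / (1.0 + speed_mps / 5.0), min_horizon_s)
```

A stationary vehicle would be checked over the full base horizon, while a faster vehicle is checked over a shorter future interval, down to the minimum.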
[0028] In general, the proximity thresholds or future position states are selected to provide adequate notice to an operator of the vehicle 105A-105H of the proximity event to avoid collision. In some embodiments, the proximity monitor 110 for each vehicle 105A-105H identifies proximity events for each identified neighboring vehicle 105A-105H. In operation, each vehicle 105A-105H broadcasts its position state, and the proximity monitor 110 stores the identity and position state of the neighboring vehicles in the neighbor list 255. The proximity monitor 110 identifies a shape model from the shape model library 260 for itself and the neighboring vehicles 105A-105H based on their identities. The proximity monitor 110 generates a set of 3D coordinates (e.g., present or future, or both) based on the position state and the shape model that define the space occupied by the particular vehicle 105A-105H (i.e., the vehicle itself or a neighboring vehicle). The proximity monitor 110 identifies the proximity events based on the 3D coordinates of its vehicle and those of the neighboring vehicles 105A-105H, using a proximity threshold and the current position state or using an overlap prediction based on future position states.
[0029] In some embodiments, the identification of the proximity events is determined by the vehicle tracking unit 120. Each vehicle 105A-105H broadcasts its identity and position state to the vehicle tracking unit 120. The vehicle tracking unit 120 identifies the shape model for each vehicle 105A-105H and identifies proximity events using the threshold or future state techniques described above. In some embodiments, proximity events are identified for vehicles 105A-105H, individuals 135, or stationary elements 140, or any combination thereof.
[0030]
[0031] In block 410, shape models are associated with the entities. In some embodiments, the proximity monitors 110 select shape models from the shape model library 260 based on the identity of the neighbors in the neighbor list 255. In some embodiments, the vehicle tracking unit 120 maintains a shape model library and associates the entities with the shape models.
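For illustration only, the association in block 410 can be sketched as a lookup keyed on identification data. The entries and dimensions below are hypothetical, not taken from the specification.

```python
# Hypothetical shape model library 260: identification data (entity type) maps
# to a shape model, here given as half-extents (length/2, width/2, height/2)
# in meters.
SHAPE_MODEL_LIBRARY = {
    "belt_loader": (4.0, 1.2, 1.5),
    "baggage_cart": (1.8, 0.9, 1.0),
    "pedestrian": (0.4, 0.4, 0.9),
}


def shape_model_for(identification, default=(1.0, 1.0, 1.0)):
    """Associate a shape model with an entity based on its identification data
    (block 410), falling back to a default envelope for unknown entities."""
    return SHAPE_MODEL_LIBRARY.get(identification, default)
```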
[0032] In block 415, the space occupied by each entity is determined based on the position state and the shape models. The space occupied may represent the current time or a future time, as described above.
[0033] In block 420, a proximity event is identified based on the occupied space of the entities. For example, if the space between two entities is less than the proximity threshold or the future occupied spaces of the entities overlap, a proximity event is identified.
[0034] Responsive to the identification of the proximity event in block 420, the proximity event is logged and an alert is generated in block 430. If no proximity event is identified, the method 400 returns to block 405 for continued monitoring. In some embodiments, the proximity monitor 110 stores an entry in the proximity event log 250 including an identifier of the associated entity and a time stamp. In some embodiments, the proximity monitor 110 also logs the proximity measure. In some embodiments, the proximity monitor 110 may provide the alert as feedback using the user interface 235 (e.g., an audible tone, a vibration, a flashing visual indicator, a message on the display, etc.). In some embodiments, the operator may acknowledge the proximity event by pressing one of the input buttons on the user interface 235.
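For illustration only, a log entry of the kind described in this paragraph (entity identifier, time stamp, and proximity measure) might be structured as follows; the field and function names are hypothetical.

```python
import time
from dataclasses import dataclass


@dataclass
class ProximityEventEntry:
    # One entry in the proximity event log 250.
    neighbor_id: str      # identifier of the associated entity
    timestamp: float      # time stamp of the event
    proximity_m: float    # logged proximity measure


def log_proximity_event(log, neighbor_id, proximity_m):
    """Append a time-stamped proximity event entry to the log and return it."""
    entry = ProximityEventEntry(neighbor_id, time.time(), proximity_m)
    log.append(entry)
    return entry
```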
[0035] In some embodiments, all of the proximity monitors 110 detect proximity events simultaneously, so the same proximity event may be detected multiple times (i.e., the proximity monitors 110 detect each other). Entries in the proximity event log 250 may be aggregated for the duplicate events. In some embodiments, the aggregated data may be filtered so that the multiple detections are recorded as a single proximity event.
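For illustration only, one way to aggregate the duplicate detections is to treat reports of the same unordered entity pair within a short window as a single event; the window length and the event representation below are hypothetical.

```python
def aggregate_duplicates(entries, window_s=2.0):
    """Collapse duplicate detections of the same proximity event.
    Entries are (entity_a, entity_b, timestamp) tuples; two monitors reporting
    the same unordered pair within `window_s` seconds count as one event."""
    last_seen = {}  # unordered pair -> timestamp of the most recent report
    kept = []
    for a, b, t in sorted(entries, key=lambda e: e[2]):
        pair = frozenset((a, b))
        if pair not in last_seen or t - last_seen[pair] > window_s:
            kept.append((a, b, t))
        last_seen[pair] = t
    return kept
```

Under these assumptions, a report of ("B", "A") arriving half a second after ("A", "B") is filtered out, while the same pair reported again well outside the window is kept as a new event.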
[0036] In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes may be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
[0037] Various features and advantages of some embodiments are set forth in the following claims.