APPARATUS FOR IMPROVING DETECTION AND IDENTIFICATION BY NON-VISUAL SCANNING SYSTEM
20230057325 · 2023-02-23
CPC classification
G01S7/4802
PHYSICS
Abstract
Apparatus and method for providing improved detection and identification of objects (e.g., people, pets, bicycles, or vehicles) by devices, such as autonomous vehicles, that rely on non-visible detection systems, such as LIDAR, to understand their surrounding environment. Such objects have an integrated or embedded material of a predetermined shape or pattern that is readily detected and identified by devices using such detection systems. The predetermined shape or pattern is formed of a material, such as aluminum, that is more easily detected by a non-visible detection system and allows the detection system to recognize and identify the type of object, even in challenging visibility conditions.
Claims
1. An apparatus detectable by a detection system, comprising: an article of clothing wearable by a person; embedded within said article of clothing, a three-dimensional pattern comprised of a metallic material; wherein said pattern is not visible; and wherein said pattern has a predefined association, recognizable by said detection system, indicating that said apparatus is wearable by a person.
2. The apparatus claimed in claim 1, wherein said pattern is comprised of aluminum.
3. The apparatus claimed in claim 1, further comprising: wherein said pattern has a further predefined association that said pattern is located in the front of said article of clothing.
4. The apparatus claimed in claim 3, further comprising: embedded within said article of clothing, a second pattern comprised of a metallic material; wherein said second pattern is not visible; wherein said second pattern has a predefined association that said article of clothing is wearable by a person; and wherein said second pattern further has a predefined association that said second pattern is located in the back of said article of clothing.
5. A method of detecting a first object in an environment, said first object having an embedded pattern comprised of metallic material, comprising the steps of: scanning said environment using a LIDAR scanner; detecting, in the environment, said pattern; identifying a second object based on a predefined association between said first object and said second object and a predefined association between said pattern and said second object; and outputting said identification to generate a virtual image of said environment.
6. The method of claim 5, further comprising the step of: identifying the orientation of said second object based on said detection of said pattern.
7. An apparatus comprising: a pattern comprised of a metallic material; said pattern embedded within said apparatus so as to be invisible; wherein said pattern has a predefined association identifying said apparatus.
8. An apparatus as claimed in claim 7, wherein said pattern further has a predefined association identifying the orientation of said apparatus.
9. An apparatus as claimed in claim 7, wherein said pattern is comprised of aluminum.
10. An apparatus as claimed in claim 7, wherein said apparatus is wearable by a person.
11. An apparatus as claimed in claim 7, wherein said apparatus is road paint.
12. An apparatus as claimed in claim 7, wherein said apparatus is attachable to a vehicle.
13. An apparatus as claimed in claim 7, wherein said apparatus is attachable to a bicycle.
Description
BRIEF DESCRIPTION OF THE FIGURES
DETAILED DESCRIPTION
[0020] The present specification is directed towards multiple embodiments. The following disclosure is provided in order to enable a person having ordinary skill in the art to practice the invention. Language used in this specification should not be interpreted as a general disavowal of any one specific embodiment or used to limit the claims beyond the meaning of the terms used therein. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purposes of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[0022] A control and processing module 160 interacts with the FDV 110 to provide control and targeting functions for the scanning sensor. In addition, the control and processing module 160 can utilize a neural network 162, comprised of software, to analyze groups of points in the point cloud 150 to identify the category of the object of interest 105 and generate a model of the object of interest 105 that is stored in a database 164. The control and processing module 160 can have computer code in resident memory, on a local hard drive, or in a removable drive or other memory device, which can be loaded into the processing module 160 or obtained from a computer program product such as a CD-ROM or a download signal.
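By way of illustration only (this sketch is not part of the specification), the classification performed by neural network 162 on groups of points in point cloud 150 might operate as follows. The feature vector, category names, and template values here are assumptions chosen for the example; the specification does not prescribe any particular classifier.

```python
import numpy as np

# Hypothetical category templates: mean reflectance followed by
# bounding-box extents (x, y, z in meters) for each object category.
# All values are illustrative only.
TEMPLATES = {
    "person":  np.array([0.3, 0.5, 1.7, 0.3]),
    "bicycle": np.array([0.4, 1.7, 1.1, 0.4]),
    "vehicle": np.array([0.6, 4.5, 1.5, 1.8]),
}

def classify_cluster(points, reflectance):
    """Classify a group of point-cloud points by comparing a simple
    feature vector (mean reflectance + bounding-box extents) against
    per-category templates; a stand-in for neural network 162."""
    extents = points.max(axis=0) - points.min(axis=0)
    feature = np.concatenate(([reflectance.mean()], extents))
    # Nearest-template match by Euclidean distance.
    return min(TEMPLATES, key=lambda k: np.linalg.norm(feature - TEMPLATES[k]))
```

A production system would replace the nearest-template match with a trained network, but the input/output contract (point cluster in, category label out) is the same.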
[0023] The FDV 110 can include an optical transceiver, shown in
[0024] Conventional LIDAR scanning systems generate distance information based upon time-related measurements of the output from a single wavelength laser. If any color information on the scanned object or scene is required, it is typically obtained using a second conventional, non-time resolved camera, as discussed above with respect to the
[0025] In embodiments, object 105 includes a symbol 107 that is embedded in object 105. In embodiments, symbol 107 is comprised of a material that is more readily detected by LIDAR 120, such as aluminum or another metallic material known to be reflective of laser sources. In embodiments, symbol 107 has a shape or pattern that is unique to the category of object 105 in which it is embedded. For example, one unique symbol or pattern may be ascribed to a person, whereas a separate unique symbol or pattern may be ascribed to a bicycle. In embodiments, symbol 107 is embedded in a way that is not visible to people but is detectable by LIDAR 120. For example, the symbol 107 may be a pattern embedded into a person's clothing in a discreet way, such as by forming the symbol 107 from thin threads or by placing the symbol 107 in a non-visible location of the clothing, such as the interior of a pocket.
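The predefined association between a detected symbol 107 and an object category can be illustrated as a simple lookup; the symbol identifiers below are hypothetical placeholders and do not appear in the specification.

```python
# Hypothetical predefined associations between embedded symbol
# patterns and object categories. The symbol names are illustrative.
SYMBOL_CATEGORY = {
    "triple-bar": "person",
    "double-chevron": "bicycle",
    "crosshatch": "vehicle",
}

def identify_object(detected_symbol):
    """Return the object category associated with a detected symbol
    107, or None if the symbol has no predefined association."""
    return SYMBOL_CATEGORY.get(detected_symbol)
```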
[0026] In embodiments, object 105 may include more than one symbol 107. In embodiments, a first symbol 107 may be of a shape or pattern that designates both the category of object 105 and the orientation of object 105. For example, in embodiments where object 105 is a human wearing clothing embedded with a symbol 107, symbol 107 may have a shape or pattern that identifies object 105 as a human. A symbol 107 located on the front side of the person's clothing may have an additional shape or pattern identifying that it is located on the front of the object 105. A second symbol may also be embedded in the person's clothing on the back side, with a separate shape or pattern identifying that it is located on the back side of object 105. In this way, the orientation and direction of object 105 may be more readily detected, for example, in conditions where it may be difficult to distinguish which way an object 105 is facing. This may be useful in predicting whether the object 105 may move in a particular direction. It is understood that the embodiment of the system described in
[0028] In embodiments, a second symbol is embedded on the backside of human object 205. In embodiments, the pattern 207 located on the front of human object 205 is different from the symbol located on the back of human object 205, allowing detection of the orientation of the human object 205. For example, as shown in
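The front/back distinction described above can be sketched as follows; the symbol-identifier format (a category name plus a placement suffix) is an assumption made for this example only.

```python
# Hypothetical placement suffixes encoded in front- and back-side
# symbol variants, standing in for the distinct front and back
# patterns embedded in human object 205's clothing.
ORIENTATION_SUFFIX = {"F": "facing scanner", "B": "facing away"}

def infer_orientation(symbol_id):
    """Infer object category and facing direction from which embedded
    pattern variant (front or back) the LIDAR detected. symbol_id is
    assumed to look like 'person:F' or 'person:B'."""
    category, _, placement = symbol_id.partition(":")
    return category, ORIENTATION_SUFFIX.get(placement, "unknown")
```

This makes explicit how detecting the back-side pattern, rather than the front-side pattern, resolves which way the object is facing and hence its likely direction of travel.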
[0032] A number of embodiments have been described. Nevertheless, it will be understood that various modifications can be made without departing from the spirit and scope of the processes and techniques described herein. In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps can be provided, or steps can be eliminated, from the described flows, and other components can be added to, or removed from, the described systems. Accordingly, other embodiments are within the scope of the following claims.