METHOD AND SYSTEM FOR MAPPING AND IDENTIFICATION OF OBJECTS
20220392196 · 2022-12-08
Inventors
CPC classification
International classification
Abstract
The invention is concerned with a method and a system for the identification and mapping of objects on the basis of a picture taken of the object. The objects to be identified and mapped are classified in the database(s) on the basis of their position and/or characteristic(s). A picture that has been taken of an object to be identified is saved in the user device. The position of the object is informed to the service. The method is mainly characterized by the service checking whether there is earlier information of an object from the same position. The service identifies the object on the basis of the picture, the position of the object, and its characteristic(s). In the absence of earlier identification information of an object in the same position, the service marks the location of the object on a map on the basis of the position information.
Claims
1.-23. (canceled)
24. Method for identification and mapping of objects on the basis of a picture taken of the object by means of a user device, a service program product that is connected to a database (6) and a user interface, the service program product providing a service that is made use of by the user device via the user interface, with which service objects can be identified and mapped, the objects to be identified and mapped being classified in the database on the basis of their position and/or characteristic(s), the method comprising: a) presenting the service for the user via the user interface on the user device in the form of a menu of objects to be identified, b) taking a picture of an object to be identified and saving it in the user device, c) informing the position of the object to the service, and wherein d) the service checking whether there is earlier information of the object in the same position, e) the service identifying the object on the basis of the position of the object, its characteristic(s) and/or the picture, f) in the absence of earlier identification information of the object in the same position, the service marking the location of the object on a map on the user interface on the basis of the position information, g) the service presenting the identification information of the object on the user interface together with associated information and a map, wherein the location of the object has been marked.
25. The method of claim 24, wherein, in the presence of earlier identification information of the object from the same position, the service presents one or more alternative identifications of the object for the user device corresponding to the earlier identifications.
26. The method of claim 24, wherein the service program product of the service uses an Artificial Neural Network, ANN, for analyzing the picture in order to identify the object.
27. The method of claim 26, wherein the Artificial Neural Network, ANN, has been trained with pattern recognition or image recognition.
28. The method of claim 27, wherein the Artificial Neural Network, ANN, is a Convolutional Neural Network, CNN, that has been trained with image segmentation for the image recognition in order to teach the CNN to make use of different parts of the object in the image and classify the object on the basis of characteristic(s) and thereby facilitate the final identification.
29. The method of claim 27, wherein the Artificial Neural Network, ANN, is a Convolutional Neural Network, CNN, that has been trained with position information to be taken into consideration.
30. The method of claim 24, wherein said user device is a mobile station, whereby the localization of an object to be identified is performed by positioning the mobile station, the position information corresponding to the information given by a positioning system in the mobile station.
31. The method of claim 24, wherein the localization of an object to be identified is performed by manually informing the position of the object found to the service product.
32. The method of claim 24, wherein the picture sent to the service product is taken by a camera in the mobile station.
33. The method of claim 24, wherein the picture is taken by photography, macrophotography, photomacrography, photomicrography, and/or microphotography.
34. The method of claim 24, wherein the service product is in the user device.
35. The method of claim 24, wherein the user device communicates via a public network with the service provider holding the service program with the database, and the service product is requested from the service provider via the public network, such as the Internet.
36. System for the identification and mapping of objects on the basis of their position and a picture of the object, which comprises a user device, a service program product with a database and a user interface, via which the service program product provides a service that is made use of by the user device, with which service objects can be identified and mapped, the objects to be identified being classified in the database on the basis of their position and/or characteristic(s), the user device having means for taking a picture of an object to be identified and for sending it to the service product, and wherein the service product has means to fetch information of the identified object from the database and present the identified object with associated information on the user interface, and the service product presents a map on the user interface of the service, on which map the object to be identified is placed or to be placed at a place corresponding to the position of the object localized and identified.
37. The system of claim 36, wherein the service product includes at least one Artificial Neural Network, ANN, for analyzing the picture for the identification, the neural network preferably being a Convolutional Neural network, CNN.
38. The system of claim 36, wherein the user device is a mobile station and further comprises a positioning system, such as the Global Positioning Service (GPS).
39. The system of claim 36, wherein the service product is in the user device.
40. The system of claim 36, wherein the system further comprises a service provider providing the service product.
41. The system of claim 40, wherein the user device is in connection with the service provider through the Internet.
42. The system of claim 36, wherein the database consists of local databases for different countries and/or for different regions of countries, an object menu for each local database and submenus on different levels for the object menu, whereby the objects to be identified are classified in the database on the basis of different characteristics in a hierarchic system by means of the submenus.
43. The system of claim 42, wherein the submenus contain images, text and/or photographs of the objects.
44. The system of claim 42, wherein each object menu comprises different objects, such as plants, mushrooms, stones, minerals, butterflies, insects, and animals.
45. The system of claim 36, wherein the associated information, in connection with plants or mushrooms, comprises, in addition to an identified name of the object, occurrence information, nutrition information, substances effective for promoting health, common allergy-causing substances, substances with medical effects, and combined effects of the substances with other ingredients.
46. The system of claim 36, wherein the means for taking a picture of an object to be identified is a camera in the mobile station enabling photography, macrophotography and/or microphotography.
Description
FIGURES
DETAILED DESCRIPTION
[0092] The local databases 7a-7n consist of an object menu for each local database and submenus on different levels for the object menu. It is practical to have a local database for each country as well as local databases for different regions in a country. Objects to be identified are classified in the database on the basis of different characteristics in a hierarchic system by means of submenus. The submenus contain images, text or photographs of the objects and the images, text or photographs in the submenus can describe the objects by pointing out certain characteristics.
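The hierarchy described above (local databases per country or region, an object menu per database, and characteristic-based submenus) can be sketched as a nested data structure. This is a hypothetical illustration only, not the patent's actual schema; the country, menu, characteristic, and species names are all invented:

```python
# Hypothetical sketch of the hierarchic layout: local database -> object
# menu -> submenu (a characteristic) -> characteristic value -> objects.
local_databases = {
    "finland": {
        "object_menu": {
            "Plants": {
                "flower_colour": {
                    "yellow": ["dandelion", "buttercup"],
                    "white": ["wood anemone"],
                },
            },
            "Mushrooms": {
                "cap_shape": {
                    "convex": ["porcini"],
                },
            },
        },
    },
}

def list_objects(country, menu, characteristic, value):
    """Walk the hierarchy down to the objects matching one characteristic."""
    return local_databases[country]["object_menu"][menu][characteristic][value]
```

For example, `list_objects("finland", "Plants", "flower_colour", "yellow")` walks the hierarchy and returns `["dandelion", "buttercup"]`.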
[0094] In signals 3 and 4, the user requests, and respectively gets, a service from the service provider by means of which the plant found can be identified. The GPS information might be forwarded to the service provider at this stage.
[0095] First, the service can be presented for the user in the form of a menu of alternatives to be identified on a user interface, after which the user selects whether he wants to identify e.g. a plant, an animal, an insect, or a stone. The database can e.g. have a main menu comprising objects to be identified, such as Plants, Animals, Insects, Stones, etc., according to which the service provider has designed the product.
[0096] When the user for instance selects a plant like in this example of
[0097] In an embodiment wherein the user has a terminal using the GPS system, the service product gets the position of the object directly, and therefore sometimes also the type of terrain where the plant was found or the area in which it appears, and can make the identification faster by selecting the right local database to match the plant to be identified.
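The selection of the right local database from the GPS position can be sketched as a simple lookup over region bounding boxes. This is an invented illustration (the patent does not specify how regions are delimited); the region names and coordinate boxes are hypothetical:

```python
# Hypothetical region bounding boxes: (min_lat, max_lat, min_lon, max_lon).
REGIONS = {
    "finland_south": (59.5, 62.0, 20.0, 31.5),
    "finland_north": (62.0, 70.1, 20.0, 31.5),
}

def select_local_database(lat, lon):
    """Pick the local database whose region contains the GPS position,
    so the identification can be restricted to species occurring there."""
    for name, (lat0, lat1, lon0, lon1) in REGIONS.items():
        if lat0 <= lat < lat1 and lon0 <= lon < lon1:
            return name
    return None  # no matching local database; fall back to a global search
```

A position in Helsinki, e.g. `select_local_database(60.17, 24.94)`, would resolve to `"finland_south"` in this sketch.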
[0098] Next, the user takes a picture of the object to be identified with a camera in his mobile station and stores it in step 5.
[0099] When the type “plant” has been selected for identification, the information of what is requested goes with signal 6 together with the picture taken to the service provider.
[0100] The mobile station sends the picture taken of the object to the service provider together with message 6 or possibly as a separate message.
[0101] The message, which is sent to the service provider in step 6, can contain the position of the object, taken as that of the mobile station from which signal 6 is sent, which position information has been received from a GPS receiver in the telephone. Alternatively, said position information is sent to the service in separate messages, or even with signal 3. If the telephone does not have a GPS receiver, the user can enter the position manually, either by clicking on alternatives presented by the product or by giving coordinates, in which case the manually entered position information is forwarded in signal 6.
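The assembly of message 6 (picture, selected object type, and a position, with GPS preferred and manual entry as fallback) can be sketched as follows. This is a minimal hypothetical sketch; the field names and message shape are invented, not taken from the patent:

```python
def build_identification_message(picture_bytes, object_type,
                                 gps_position=None, manual_position=None):
    """Assemble the message sent to the service (signal 6): the picture,
    the selected object type, and a position. A GPS fix is preferred; if
    the device has no GPS receiver, a manually entered position is used."""
    if gps_position is not None:
        position, source = gps_position, "gps"
    elif manual_position is not None:
        position, source = manual_position, "manual"
    else:
        raise ValueError("a position is required, from GPS or manual entry")
    return {
        "object_type": object_type,
        "picture": picture_bytes,
        "position": position,          # (latitude, longitude)
        "position_source": source,
    }
```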
[0102] The service makes use of Artificial Intelligence (AI) in that the picture is interpreted in step 7 by an Artificial Neural Network (ANN), preferably a Convolutional Neural Network (CNN), which is kept by the service product provided by the service provider. The ANN or CNN performs an analysis of the picture with respect to the identification of the plant presented in the picture. The ANN or CNN has earlier been trained with the pictures in the databases so as to be able to identify the plant.
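The inference step can be sketched as below. The patent specifies no concrete network or framework, so the trained CNN is stood in for by any callable mapping a picture to raw class scores; the species list and scores are invented for illustration:

```python
import math

SPECIES = ["dandelion", "buttercup", "wood anemone"]  # hypothetical classes

def softmax(scores):
    """Turn raw class scores into probabilities summing to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def identify(picture, cnn):
    """Run the picture through a pre-trained classifier and return the
    most probable species with its confidence. `cnn` stands in for the
    real network: any callable mapping a picture to raw scores."""
    probs = softmax(cnn(picture))
    best = max(range(len(probs)), key=probs.__getitem__)
    return SPECIES[best], probs[best]
```

With a dummy network, e.g. `identify(picture, lambda p: [2.0, 0.5, 0.1])`, the highest-scoring class ("dandelion" here) is returned together with its softmax confidence.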
[0103] In a certain embodiment, the ANN or CNN might be taught to use position information in identifying the plant presented (or, generally, an object). Thus, it might be taught either to take that information into consideration or not. The service can also make use of the position information.
[0104] In some embodiments, the ANN or CNN might be taught to use information of characteristics of the object in identifying the plant presented (or generally an object). Such characteristics can be read from the picture on the basis of the form, the colour and/or the size of the object or be given as data information for the ANN or CNN.
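The characteristics read from the picture (form, colour, size) can be sketched as coarse features derived from the pixel data. This is an invented illustration of the idea, not the patent's method; the picture is represented as a plain list of RGB rows:

```python
def picture_characteristics(pixels):
    """Derive coarse characteristics (size and dominant colour) from a
    picture, represented as a list of rows of (r, g, b) pixels. Such
    characteristics can be given as data input for the ANN or CNN, or
    used to narrow the database search, alongside the raw image."""
    height = len(pixels)
    width = len(pixels[0])
    n = height * width
    # Average each colour channel over all pixels.
    avg = tuple(sum(px[c] for row in pixels for px in row) // n
                for c in range(3))
    channel = max(range(3), key=avg.__getitem__)
    dominant = ("red", "green", "blue")[channel]
    return {"size": (width, height), "dominant_colour": dominant}
```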
[0105] In further embodiments, the identification made by the ANN or CNN can be combined with other methods for the identification for further accuracy, such as those methods described in U.S. Pat. No. 7,400,295 B2 and WO publication 2006/120286 A2 of the applicants.
[0106] In U.S. Pat. No. 7,400,295 B2, the object was identified on the basis of characteristics presented by the service program and selected by the user in a way that the service found and fetched an object from the database with objects classified by their characteristics and based on the selected characteristics and optionally also position information.
[0107] In WO publication 2006/120286 A2, the object was identified by sending a picture of the object to be identified to the service product, whereupon the service product reads the characteristics of the object, on the basis of which the characteristic-based search from the database takes place.
[0108] The service program of the service product can fetch content with respect to the identified plant from the correct local database with signals 8 and 9. The content fetched may comprise the name of the plant, a photograph of the plant, a description of characteristics of the plant and associated information such as ingredients, constituents, components, information of use as food or nutrition, medical use, health information, information of possible allergic reactions, and toxicity, etc.
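The content fetch of signals 8 and 9 can be sketched as a keyed lookup into the local database. The record fields and sample data below are hypothetical, chosen to mirror the associated information listed above:

```python
PLANT_CONTENT = {
    # Hypothetical local-database records keyed by identified species.
    "dandelion": {
        "name": "Dandelion (Taraxacum officinale)",
        "description": "Yellow composite flower with toothed leaves.",
        "use_as_food": "Young leaves are edible in salads.",
        "allergy_info": "Latex in the stems may irritate sensitive skin.",
    },
}

def fetch_content(species, database=PLANT_CONTENT):
    """Fetch the content presented to the user (signals 8 and 9): the
    name, a description, and associated information for the species."""
    record = database.get(species)
    if record is None:
        return {"name": species, "description": "No local content available."}
    return record
```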
[0109] The service then presents the content fetched at a user interface of the service and to be viewed at the mobile station of the user by means of signal 10.
[0110] In step 11, the service program checks the relevance of the position information and possibly inserts the information on a map.
[0111] The service program of the service places the plant on a map on the basis of its position information. The service provides a map service function that shows the occurrence of different plants (and/or other objects) found by the users. In this way, users participate in updating the occurrence map kept by the service. Certain users can also directly add plant information on the map.
[0112] In practice, the position information is also checked (step not shown) with respect to authenticity in a second check before final acceptance to be placed permanently on the map. The second check might be performed by scientists, researchers, experts or the like.
[0113] Either the second check is performed before the species information is placed on the map at all, or the information is first placed there temporarily (possibly with a message that the information is under check) and becomes permanent only after having been accepted, for as long as the occurrence remains true.
[0114] If there is no earlier information on the species in question on the map, and if the second check results in that the position information of the species is relevant for being notified with respect to its occurrence, it is inserted in the map.
[0115] If, however, there is earlier information on the species in question on the map, and/or the second check results in that the position information of the species is not correct or not worth being notified with respect to its occurrence, it is not inserted in the map.
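The two-stage decision described in the preceding paragraphs can be condensed into a small piece of decision logic. This is an illustrative sketch of the rules as stated above; the function and status strings are invented:

```python
def map_insertion_status(already_on_map, position_relevant, second_check_passed):
    """Decide the fate of a reported occurrence, following the checks
    described above: a new, relevant, verified finding is inserted
    permanently; an unverified one may appear only temporarily."""
    if already_on_map:
        return "not inserted (occurrence already known at this position)"
    if not position_relevant:
        return "not inserted (position not worth notifying)"
    if not second_check_passed:
        return "pending (shown temporarily, marked as under check)"
    return "inserted permanently"
```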
[0116] A confirmation message telling the results of both checks is optionally sent to the mobile station of the user in step 12.
[0117] The steps of