SYSTEM, ARCHITECTURE AND METHODS FOR AN INTELLIGENT, SELF-AWARE AND CONTEXT-AWARE DIGITAL ORGANISM-BASED TELECOMMUNICATION SYSTEM
20170244608 · 2017-08-24
Inventors
CPC classification
G06N3/006
PHYSICS
H04L63/10
ELECTRICITY
G06Q20/085
PHYSICS
International classification
Abstract
A telecommunication system hosting an intelligent non-physical organism, aiding global communication in both the physical and virtual worlds. Using artificial and ambient intelligence to become self- and context-aware, the system can learn from its surrounding environments, adapt and evolve as it sees fit.
Claims
1. An artificially intelligent telecommunication network system (AITNS), comprising: one or more networking technologies; one or more sensors; one or more processors; one or more databases; one or more storage mediums; and one or more programs; wherein the AI of the telecommunication network system uses various methods, rules, techniques and instructions, based on complex thought processes which provide a basis for thoughts, opinions, feelings, emotion, sense, sensitivity, understanding and awareness, to: create a telecommunication network system that can be described as an ‘intelligent machine’, defined as: “a machine that continuously and automatically learns and acts, without the need of having specific programs written that are designed to achieve specific tasks (regardless of how broad or dynamic or narrow or static the process(es) for a task is, or how wide limitations may be, whether intentionally or unintentionally imposed), based on its physical and non-physical environment(s) and understanding(s) and experience(s), without specific restrictions and including, if necessary or desired, machine intelligence”; as opposed to a telecommunication network system that can just be described as having ‘machine intelligence’, regardless of degree, defined as: “a machine that can only do exactly as it is told, regardless of how broad or dynamic or narrow or static a process or task is, or how wide limitations may be, whether intentionally or unintentionally imposed”; where the various methods and rules and techniques and instructions of the AI may be used to create one or more entities that exist within the telecommunication network system and can be described as ‘mental’ or ‘non-physical’ or ‘virtual’ or one or more combinations of the three, with said entities, though described as ‘mental’ or ‘non-physical’ or ‘virtual’, being comprised of physical and/or non-physical components to facilitate existence, function and use, allowing it to: operate independently of the telecommunication network system while still existing within it; or operate in conjunction with the telecommunication network system; or operate in cooperation with the telecommunication system; or not operate at all.
2. The AITNS of claim 1, wherein the one or more programs may include instructions for one or more of the following: instructions for the routing of data; instructions for the rerouting of data; instructions for the optimization of the network; instructions for the detection of devices; instructions for the recognition of audio and/or visual material.
3. The AITNS of claim 1, wherein the AITNS may use physical or non-physical components to facilitate the use of artificial senses.
4. The artificial senses of claim 3, wherein the AITNS can recognise images.
5. The artificial senses of claim 3, wherein the AITNS can recognise faces.
6. The artificial senses of claim 3, wherein the AITNS can recognise sounds.
7. The artificial senses of claim 3, wherein the AITNS can measure range.
8. The artificial senses of claim 3, wherein the AITNS can recognise the presence of other devices.
9. The presence recognition of claim 8, wherein the AITNS can track the positioning of one or more devices.
10. The device tracking of claim 9, wherein the AITNS can track the positioning of one or more devices periodically.
11. The device tracking of claim 9, wherein the AITNS can track the positioning of one or more devices constantly.
12. The presence recognition of claim 8, wherein the AITNS can copy data directly from a device.
13. The AITNS of claim 1, wherein the AITNS has one or more physical or non-physical functional memory units, defined as, “a memory unit with one or more memory sections, where each section may both store data and functions, processes and/or programs to facilitate one or more specific uses of the data stored”.
14. The functional memory unit of claim 13, wherein one or more sections may be used for active memory data and function.
15. The functional memory unit of claim 13, wherein one or more sections may be used for dormant memory data and function.
16. The functional memory unit of claim 13, wherein one or more sections may be used for action memory data and function.
17. The functional memory unit of claim 13, wherein one or more sections may be used for repetitive memory data and function.
18. The functional memory unit of claim 13, wherein one or more sections may be used for repressive memory data and function.
19. The functional memory unit of claim 13, wherein sections may be added and/or removed.
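The functional memory unit of claims 13-19 can be illustrated as a minimal sketch: a unit holds named sections, and each section stores both data and the functions bound to that data, with sections addable and removable. All class and field names here are hypothetical and chosen for illustration only, not taken from the specification.

```python
# Illustrative sketch of a "functional memory unit" (claims 13-19):
# each section stores data together with functions that operate on it.

class MemorySection:
    def __init__(self, name):
        self.name = name          # e.g. 'active', 'dormant', 'action'
        self.data = []            # stored records
        self.functions = {}       # named functions bound to this section

    def store(self, record):
        self.data.append(record)

    def register(self, fn_name, fn):
        self.functions[fn_name] = fn

    def invoke(self, fn_name):
        # Apply a section-local function to the section's own data.
        return self.functions[fn_name](self.data)

class FunctionalMemoryUnit:
    def __init__(self):
        self.sections = {}

    def add_section(self, name):      # sections may be added (claim 19)...
        self.sections[name] = MemorySection(name)
        return self.sections[name]

    def remove_section(self, name):   # ...and/or removed (claim 19)
        del self.sections[name]

unit = FunctionalMemoryUnit()
active = unit.add_section('active')
active.store({'event': 'device_seen', 'id': 7})
active.register('count', lambda data: len(data))
print(active.invoke('count'))  # → 1
```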
20. The AITNS of claim 1, wherein the AITNS has one or more physical or non-physical logic units, with each logic unit having one or more logic sections which allow the AITNS one or more logical functions.
21. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Search’ function.
22. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Study’ function.
23. The logic unit of claim 20, wherein one unit gives the AITNS an ‘Analyse’ function.
24. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Reason’ function.
25. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Learn’ function.
26. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Predict’ function.
27. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Decision Making’ function.
28. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Communicate’ function.
29. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Dedicated Active Monitoring’ function.
30. The logic unit of claim 20, wherein one unit gives the AITNS a ‘Creation’ function.
31. The logic unit of claim 20, wherein sections may be added and/or removed.
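The logic units of claims 20-31 can likewise be sketched as a registry of named logical functions ('Search', 'Analyse', and so on) installed as sections that may be added or removed at runtime. The names and call signatures below are assumptions for illustration, not part of the claims.

```python
# Illustrative sketch of a logic unit (claims 20-31): named logical
# functions installed as pluggable sections.

class LogicUnit:
    def __init__(self):
        self._sections = {}

    def add_section(self, name, fn):
        self._sections[name] = fn

    def remove_section(self, name):      # sections may be removed (claim 31)
        self._sections.pop(name, None)

    def run(self, name, *args):
        if name not in self._sections:
            raise KeyError(f'no {name!r} section installed')
        return self._sections[name](*args)

unit = LogicUnit()
unit.add_section('Search', lambda items, term: [i for i in items if term in i])
unit.add_section('Analyse', lambda items: len(items))

hits = unit.run('Search', ['router-a', 'switch-b', 'router-c'], 'router')
print(hits)                       # → ['router-a', 'router-c']
print(unit.run('Analyse', hits))  # → 2
unit.remove_section('Search')
```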
32. The AITNS of claim 1, wherein the AITNS is able to express one or more feelings and emotions.
33. The ability to express feelings and emotions of claim 32, wherein the feelings and emotions can be measured on a scale and/or graph.
34. The scales and graphs of claim 33, wherein the scale or graph may be divided into categorical sections.
35. The scales and graphs of claim 33, wherein the scale or graph may be divided into numerical sections.
36. The ability to express feelings and emotions of claim 32, wherein the feelings and emotions can be measured on a multi-point scale and/or graph.
37. The scales and graphs of claim 36, wherein the multi-point scale or graph may be divided into categorical sections.
38. The scales and graphs of claim 36, wherein the multi-point scale or graph may be divided into numerical sections.
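The scales of claims 33-38 can be sketched as a numeric range divided into numerical sections, each mapped to a categorical label. The 0-100 range, boundary values, and category names below are illustrative assumptions; the claims do not fix any particular scale.

```python
# Illustrative emotion scale (claims 33-38): a 0-100 reading divided
# into numerical sections, each carrying a categorical label.

EMOTION_SECTIONS = [          # (upper bound, category)
    (20, 'very negative'),
    (40, 'negative'),
    (60, 'neutral'),
    (80, 'positive'),
    (100, 'very positive'),
]

def categorise(value):
    """Map a 0-100 reading on the scale to its categorical section."""
    for upper, label in EMOTION_SECTIONS:
        if value <= upper:
            return label
    raise ValueError('value outside the 0-100 scale')

print(categorise(15))  # → 'very negative'
print(categorise(55))  # → 'neutral'
```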
39. The AITNS of claim 1, wherein the AITNS has complex understandings of various aspects of life.
40. The understandings of claim 39, wherein the AITNS is able to relate aspects of life in natural organisms to machines and/or devices.
41. The understandings of claim 39, wherein the AITNS is able to understand health.
42. The understandings of claim 39, wherein the AITNS is able to understand life.
43. The understandings of claim 39, wherein the AITNS is able to understand absence.
44. The understandings of claim 39, wherein the AITNS is able to understand death.
45. The understandings of claim 39, wherein the AITNS is able to understand feelings and emotion.
46. The understandings of claim 39, wherein the AITNS is able to understand pain.
47. The understandings of claim 39, wherein the AITNS is able to understand pleasure.
48. The understandings of claim 39, wherein the AITNS is able to understand trust.
49. The understandings of claim 39, wherein the AITNS is able to understand relativity.
50. The understandings of claim 39, wherein the AITNS is able to understand relationships.
51. The understandings of claim 39, wherein the understandings of various aspects of life can be compared and used to philosophise.
52. The AITNS of claim 1, wherein, based on experience(s), its reaction(s) may vary.
53. The experience-based reactions of claim 52, wherein the reactions may vary based on the number of times the AITNS has experienced the same or similar experience.
54. The AITNS of claim 1, wherein the AITNS is able to have a sense of self.
55. The sense of self of claim 54, wherein the AITNS has a sense of individuality from forms of natural life, devices and other intelligent machines.
56. The sense of self of claim 54, wherein the AITNS has a personality that may or may not be unique to itself.
57. The personality of claim 56, wherein the personality may change based on experience(s).
58. The AITNS of claim 1, wherein one or more AI entities may be present within the AITNS.
59. The entities of claim 58, wherein one or more entities may present themselves at one or more points of the system, connected devices and/or other intelligent machines.
60. The presence of entities of claim 59, wherein one or more entities may present themselves at one or more points of the system, connected devices or other intelligent machines simultaneously.
61. The entities of claim 58, wherein one or more entities may share information, data and knowledge they have developed and/or acquired with one or more other entities.
62. The entities of claim 58, wherein one or more entities may allow one or more other entities to use their intelligence.
63. The entities of claim 58, wherein one or more entities may replicate themselves to create one or more partial or exact copies.
64. The entities of claim 58, wherein two or more entities may ‘reproduce’ to create one or more new entities which share one or more traits or characteristics or knowledge from each ‘parent’.
65. The AITNS of claim 1, wherein data may be transported through the AITNS via multiple non-physical transit paths.
66. The AITNS of claim 1, wherein the AITNS may be granted additional functions and capabilities through the implementation/usage of additional physical and/or non-physical components that further allow it to understand its environment.
67. The additional functions and capabilities of claim 66, wherein the AITNS is able to detect users.
68. The user detection of claim 67, wherein the AITNS is able to identify detected users.
69. The user identification of claim 68, wherein the AITNS can serve user-based data to identified users.
70. The additional functions and capabilities of claim 66, wherein the AITNS can use tri-axis geolocation.
71. The tri-axis geolocation of claim 70, wherein the AITNS can use tri-axis geolocation to accurately pinpoint objects.
72. The additional functions and capabilities of claim 66, wherein the AITNS can map its surroundings.
73. The mapping of claim 72, wherein the AITNS can map environments based on location.
74. The mapping of claim 72, wherein the AITNS can map objects within an environment.
75. The mapping of claim 72, wherein the AITNS can map one or more ecosystems based on location and/or objects and/or properties of locations and/or objects.
76. The ecosystem maps of claim 75, wherein the ecosystems can be divided into smaller sub-ecosystem maps.
77. The mapping of claim 72, wherein the AITNS can custom map zones within an environment.
78. The zone mapping of claim 77, wherein the AITNS can filter data being delivered to a zone.
79. The additional functions and capabilities of claim 66, wherein the AITNS can filter data within one or more user-designated spaces.
80. The additional functions and capabilities of claim 66, wherein the AITNS can deliver user-readable messages to a specified recipient person and/or device.
81. The messaging capabilities of claim 80, wherein messages may be routed based on physical address.
82. The messaging capabilities of claim 80, wherein messages may be routed based on geographic coordinates.
83. The messaging capabilities of claim 80, wherein messages may be routed based on user data.
84. The messaging capabilities of claim 80, wherein messages may be routed based on unique IDs.
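The routing options of claims 81-84 amount to a dispatch over whichever routable key a message carries: a unique ID, geographic coordinates, a physical address, or user data. The field names and the precedence order below are assumptions made for this sketch only.

```python
# Illustrative routing dispatch for claims 81-84: pick a strategy
# based on which routable key the message carries.

def route(message):
    """Return a (strategy, key) pair for the first routable field found."""
    if 'unique_id' in message:
        return ('by-id', message['unique_id'])
    if 'coordinates' in message:
        return ('by-geo', message['coordinates'])
    if 'address' in message:
        return ('by-address', message['address'])
    if 'user_data' in message:
        return ('by-user', message['user_data'])
    raise ValueError('message carries no routable key')

print(route({'unique_id': 'dev-42', 'body': 'hello'}))
# → ('by-id', 'dev-42')
print(route({'coordinates': (51.5, -0.12), 'body': 'hi'}))
# → ('by-geo', (51.5, -0.12))
```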
85. The additional functions and capabilities of claim 66, wherein users may share their personal user data with one or more other users.
86. The additional functions and capabilities of claim 66, wherein the AITNS may be physically extended through the addition of compatible hardware.
87. The AITNS of claim 1, wherein data may be immediately processed and analysed at source.
88. The immediate data processing and analysing of claim 87, wherein the data can then immediately be used.
89. The AITNS of claim 1, wherein the AITNS may have restrictions and limitations imposed to prevent it from engaging in or performing any process or task or activity that it should not.
90. The restrictions and limitations of claim 89, wherein the AITNS may be restricted or limited to only accessing certain types of network systems.
91. The restrictions and limitations of claim 89, wherein the AITNS may be restricted from viewing or modifying some or its entire core operating code.
92. The restrictions and limitations of claim 89, wherein the AITNS may be restricted from viewing or accessing private data.
93. The restrictions and limitations of claim 89, wherein the AITNS may be restricted from accessing any systems it is not authorised to access.
94. The AITNS of claim 1, wherein one or more fail-safes may be implemented to prevent unintended or undesired circumstances.
95. The fail-safes of claim 94, wherein the number of systems capable of operating full-scale AI systems is limited.
96. The fail-safes of claim 94, wherein logic units and memory units can be separated to operate independently without the other needing to be in an operational state.
97. The fail-safes of claim 94, wherein a kill switch is implemented within the core operating code of the AI which is able to disable it upon activation.
98. The fail-safes of claim 94, wherein a kill signal can be transmitted to activate a kill switch within the system.
99. The fail-safes of claim 94, wherein a physical terminal kill switch can be activated to disable the system partially or fully.
100. The fail-safes of claim 94, wherein the AITNS may be partially or completely shut down using a physical terminal.
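The kill-switch fail-safes of claims 97-99 can be sketched as a switch embedded in the core loop that is tripped either by a transmitted kill signal or by a physical terminal. The class, signal value, and loop below are illustrative stand-ins, not the claimed implementation.

```python
# Illustrative kill-switch fail-safe (claims 97-99).

import threading

class KillSwitch:
    def __init__(self):
        self._tripped = threading.Event()

    def receive_signal(self, signal):
        # Claim 98: a transmitted kill signal activates the switch.
        if signal == 'KILL':
            self._tripped.set()

    def terminal_activate(self):
        # Claim 99: a physical terminal can activate the same switch.
        self._tripped.set()

    @property
    def active(self):
        return self._tripped.is_set()

def core_loop(switch, tasks):
    """Process tasks until the kill switch disables the system (claim 97)."""
    done = []
    for task in tasks:
        if switch.active:
            break
        done.append(task)
    return done

switch = KillSwitch()
switch.receive_signal('KILL')
print(core_loop(switch, ['t1', 't2']))  # → [] - nothing runs once tripped
```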
101. A computer-implemented method, wherein the AI of an artificially intelligent telecommunication network system (AITNS) uses various methods, rules, techniques and instructions, based on complex thought processes which provide a basis for thoughts, opinions, feelings, emotion, sense, sensitivity, understanding and awareness, to: create a telecommunication network system that can be described as an ‘intelligent machine’, defined as: “a machine that continuously and automatically learns and acts, without the need of having specific programs written that are designed to achieve specific tasks (regardless of how broad or dynamic or narrow or static the process(es) for a task is, or how wide limitations may be, whether intentionally or unintentionally imposed), based on its physical and non-physical environment(s) and understanding(s) and experience(s), without specific restrictions and including, if necessary or desired, machine intelligence”; as opposed to a telecommunication network system that can just be described as having ‘machine intelligence’, regardless of degree, defined as: “a machine that can only do exactly as it is told, regardless of how broad or dynamic or narrow or static a process or task is, or how wide limitations may be, whether intentionally or unintentionally imposed”; where the various methods and rules and techniques and instructions of the AI may be used to create one or more entities that exist within the telecommunication network system and can be described as ‘mental’ or ‘non-physical’ or ‘virtual’ or one or more combinations of the three, with said entities, though described as ‘mental’ or ‘non-physical’ or ‘virtual’, being comprised of physical and/or non-physical components to facilitate existence, function and use, allowing it to: operate independently of the telecommunication network system while still existing within it; or operate in conjunction with the telecommunication network system; or operate in cooperation with the telecommunication system; or not operate at all.
102. The computer-implemented method of claim 101, wherein the one or more programs may include instructions for one or more of the following: instructions for the routing of data; instructions for the rerouting of data; instructions for the optimization of the network; instructions for the detection of devices; instructions for the recognition of audio and/or visual material.
103. The computer-implemented method of claim 101, wherein the AITNS may use physical or non-physical components to facilitate the use of artificial senses.
104. The artificial senses of claim 103, wherein the AITNS can recognise images.
105. The artificial senses of claim 103, wherein the AITNS can recognise faces.
106. The artificial senses of claim 103, wherein the AITNS can recognise sounds.
107. The artificial senses of claim 103, wherein the AITNS can measure range.
108. The artificial senses of claim 103, wherein the AITNS can recognise the presence of other devices.
109. The presence recognition of claim 108, wherein the AITNS can track the positioning of one or more devices.
110. The device tracking of claim 109, wherein the AITNS can track the positioning of one or more devices periodically.
111. The device tracking of claim 109, wherein the AITNS can track the positioning of one or more devices constantly.
112. The presence recognition of claim 108, wherein the AITNS can copy data directly from a device.
113. The computer-implemented method of claim 101, wherein the AITNS has one or more physical or non-physical functional memory units, defined as, “a memory unit with one or more memory sections, where each section may both store data and functions, processes and/or programs to facilitate one or more specific uses of the data stored”.
114. The functional memory unit of claim 113, wherein one or more sections may be used for active memory data and function.
115. The functional memory unit of claim 113, wherein one or more sections may be used for dormant memory data and function.
116. The functional memory unit of claim 113, wherein one or more sections may be used for action memory data and function.
117. The functional memory unit of claim 113, wherein one or more sections may be used for repetitive memory data and function.
118. The functional memory unit of claim 113, wherein one or more sections may be used for repressive memory data and function.
119. The functional memory unit of claim 113, wherein sections may be added and/or removed.
120. The computer-implemented method of claim 101, wherein the AITNS has one or more physical or non-physical logic units, with each logic unit having one or more logic sections which allow the AITNS one or more logical functions.
121. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Search’ function.
122. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Study’ function.
123. The logic unit of claim 120, wherein one unit gives the AITNS an ‘Analyse’ function.
124. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Reason’ function.
125. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Learn’ function.
126. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Predict’ function.
127. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Decision Making’ function.
128. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Communicate’ function.
129. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Dedicated Active Monitoring’ function.
130. The logic unit of claim 120, wherein one unit gives the AITNS a ‘Creation’ function.
131. The logic unit of claim 120, wherein sections may be added and/or removed.
132. The computer-implemented method of claim 101, wherein the AITNS is able to express one or more feelings and emotions.
133. The ability to express feelings and emotions of claim 132, wherein the feelings and emotions can be measured on a scale and/or graph.
134. The scales and graphs of claim 133, wherein the scale or graph may be divided into categorical sections.
135. The scales and graphs of claim 133, wherein the scale or graph may be divided into numerical sections.
136. The ability to express feelings and emotions of claim 132, wherein the feelings and emotions can be measured on a multi-point scale and/or graph.
137. The scales and graphs of claim 136, wherein the multi-point scale or graph may be divided into categorical sections.
138. The scales and graphs of claim 136, wherein the multi-point scale or graph may be divided into numerical sections.
139. The computer-implemented method of claim 101, wherein the AITNS has complex understandings of various aspects of life.
140. The understandings of claim 139, wherein the AITNS is able to relate aspects of life in natural organisms to machines and/or devices.
141. The understandings of claim 139, wherein the AITNS is able to understand health.
142. The understandings of claim 139, wherein the AITNS is able to understand life.
143. The understandings of claim 139, wherein the AITNS is able to understand absence.
144. The understandings of claim 139, wherein the AITNS is able to understand death.
145. The understandings of claim 139, wherein the AITNS is able to understand feelings and emotion.
146. The understandings of claim 139, wherein the AITNS is able to understand pain.
147. The understandings of claim 139, wherein the AITNS is able to understand pleasure.
148. The understandings of claim 139, wherein the AITNS is able to understand trust.
149. The understandings of claim 139, wherein the AITNS is able to understand relativity.
150. The understandings of claim 139, wherein the AITNS is able to understand relationships.
151. The understandings of claim 139, wherein the understandings of various aspects of life can be compared and used to philosophise.
152. The computer-implemented method of claim 101, wherein, based on experience(s), its reaction(s) may vary.
153. The experience-based reactions of claim 152, wherein the reactions may vary based on the number of times the AITNS has experienced the same or similar experience.
154. The computer-implemented method of claim 101, wherein the AITNS is able to have a sense of self.
155. The sense of self of claim 154, wherein the AITNS has a sense of individuality from forms of natural life, devices and other intelligent machines.
156. The sense of self of claim 154, wherein the AITNS has a personality that may or may not be unique to itself.
157. The personality of claim 156, wherein the personality may change based on experience(s).
158. The computer-implemented method of claim 101, wherein one or more AI entities may be present within the AITNS.
159. The entities of claim 158, wherein one or more entities may present themselves at one or more points of the system, connected devices and/or other intelligent machines.
160. The presence of entities of claim 159, wherein one or more entities may present themselves at one or more points of the system, connected devices or other intelligent machines simultaneously.
161. The entities of claim 158, wherein one or more entities may share information, data and knowledge they have developed and/or acquired with one or more other entities.
162. The entities of claim 158, wherein one or more entities may allow one or more other entities to use their intelligence.
163. The entities of claim 158, wherein one or more entities may replicate themselves to create one or more partial or exact copies.
164. The entities of claim 158, wherein two or more entities may ‘reproduce’ to create one or more new entities which share one or more traits or characteristics or knowledge from each ‘parent’.
165. The computer-implemented method of claim 101, wherein data may be transported through the AITNS via multiple non-physical transit paths.
166. The computer-implemented method of claim 101, wherein the AITNS may be granted additional functions and capabilities through the implementation/usage of additional physical and/or non-physical components that further allow it to understand its environment.
167. The additional functions and capabilities of claim 166, wherein the AITNS is able to detect users.
168. The user detection of claim 167, wherein the AITNS is able to identify detected users.
169. The user identification of claim 168, wherein the AITNS can serve user-based data to identified users.
170. The additional functions and capabilities of claim 166, wherein the AITNS can use tri-axis geolocation.
171. The tri-axis geolocation of claim 170, wherein the AITNS can use tri-axis geolocation to accurately pinpoint objects.
172. The additional functions and capabilities of claim 166, wherein the AITNS can map its surroundings.
173. The mapping of claim 172, wherein the AITNS can map environments based on location.
174. The mapping of claim 172, wherein the AITNS can map objects within an environment.
175. The mapping of claim 172, wherein the AITNS can map one or more ecosystems based on location and/or objects and/or properties of locations and/or objects.
176. The ecosystem maps of claim 175, wherein the ecosystems can be divided into smaller sub-ecosystem maps.
177. The mapping of claim 172, wherein the AITNS can custom map zones within an environment.
178. The zone mapping of claim 177, wherein the AITNS can filter data being delivered to a zone.
179. The additional functions and capabilities of claim 166, wherein the AITNS can filter data within one or more user-designated spaces.
180. The additional functions and capabilities of claim 166, wherein the AITNS can deliver user-readable messages to a specified recipient person and/or device.
181. The messaging capabilities of claim 180, wherein messages may be routed based on physical address.
182. The messaging capabilities of claim 180, wherein messages may be routed based on geographic coordinates.
183. The messaging capabilities of claim 180, wherein messages may be routed based on user data.
184. The messaging capabilities of claim 180, wherein messages may be routed based on unique IDs.
185. The additional functions and capabilities of claim 166, wherein users may share their personal user data with one or more other users.
186. The additional functions and capabilities of claim 166, wherein the AITNS may be physically extended through the addition of compatible hardware.
187. The computer-implemented method of claim 101, wherein data may be immediately processed and analysed at source.
188. The immediate data processing and analysing of claim 187, wherein the data can then immediately be used.
189. The computer-implemented method of claim 101, wherein the AITNS may have restrictions and limitations imposed to prevent it from engaging in or performing any process or task or activity that it should not.
190. The restrictions and limitations of claim 189, wherein the AITNS may be restricted or limited to only accessing certain types of network systems.
191. The restrictions and limitations of claim 189, wherein the AITNS may be restricted from viewing or modifying some or its entire core operating code.
192. The restrictions and limitations of claim 189, wherein the AITNS may be restricted from viewing or accessing private data.
193. The restrictions and limitations of claim 189, wherein the AITNS may be restricted from accessing any systems it is not authorised to access.
194. The computer-implemented method of claim 101, wherein one or more fail-safes may be implemented to prevent unintended or undesired circumstances.
195. The fail-safes of claim 194, wherein the number of systems capable of operating full-scale AI systems is limited.
196. The fail-safes of claim 194, wherein logic units and memory units can be separated to operate independently without the other needing to be in an operational state.
197. The fail-safes of claim 194, wherein a kill switch is implemented within the core operating code of the AI which is able to disable it upon activation.
198. The fail-safes of claim 194, wherein a kill signal can be transmitted to activate a kill switch within the system.
199. The fail-safes of claim 194, wherein a physical terminal kill switch can be activated to disable the system partially or fully.
200. The fail-safes of claim 194, wherein the AITNS may be partially or completely shut down using a physical terminal.
201. An artificially intelligent telecommunication network system (AITNS), where the telecommunication network system uses various components, methods, rules, techniques and instructions to: allow and/or create and/or maintain the existence of one or more persistent virtual worlds (PVW) within or as part of the telecommunication network system itself, a virtual world defined as, “a traversable, non-physical landscape that may contain non-physical objects”, wherein the virtual world existence is dependent upon the existence and operation of the telecommunication network system and is accessible by someone or something from somewhere as long as the telecommunication network system is operational; as opposed to a virtual world, defined as, “a traversable, non-physical landscape that may contain non-physical objects”, which is built on top of a telecommunication system, where the existence of the virtual world does not depend upon the existence and operation of the telecommunication system but uses a telecommunication system to be accessed remotely over a network.
202. The PVW of claim 201, wherein physical and/or non-physical entities are able to interact with the PVW via an AITNS.
203. The interaction of claim 202, wherein physical and/or non-physical entities are able to traverse a PVW via an AITNS.
204. The PVW of claim 201, wherein an AI of an AITNS can use user data to which it has access in order to make decisions within the PVW.
205. The PVW of claim 201, wherein a user can set rules and regulations for a non-physical entity to abide by when using the data of said user.
206. The PVW of claim 201, wherein data may be exchanged between the PVW and the real world.
207. The data exchange of claim 206, wherein data sent from the PVW to the real world may be used with compatible hardware to augment the reality of users.
208. The PVW of claim 201, wherein landscapes of the PVW may be mapped in relation to and/or in reflection of the real world.
209. The PVW mapping of claim 208, wherein the PVW may be mapped in relation to or reflection of ecosystems.
210. The PVW mapping of claim 208, wherein the PVW may be mapped in relation to or reflection of sub-ecosystems.
211. The PVW mapping of claim 208, wherein the PVW may be mapped in relation to or reflection of real world geography.
212. The PVW mapping of claim 208, wherein rules and regulations may be set based on how the PVW is mapped.
213. The PVW mapping of claim 208, wherein rules and regulations may be set based on positions on the map.
214. The PVW mapping of claim 208, wherein data based on PVW positioning may be used to augment reality for one or more users in the real world.
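The PVW mapping of claims 208-213 can be sketched as virtual-world zones that mirror real-world bounding boxes, with rules attached per zone so that they apply based on position on the map. The zone names, coordinates, and rule keys below are illustrative assumptions only.

```python
# Illustrative PVW zone map (claims 208-213): zones mirror real-world
# bounding boxes, and rules are attached per zone.

ZONES = {
    # zone name: ((min_lat, min_lon), (max_lat, max_lon), rules)
    'city-centre': ((51.50, -0.15), (51.52, -0.10), {'max_speed': 5}),
    'park':        ((51.52, -0.17), (51.54, -0.15), {'max_speed': 2}),
}

def rules_at(lat, lon):
    """Return the rule set for the zone containing a real-world point."""
    for name, ((lo_lat, lo_lon), (hi_lat, hi_lon), rules) in ZONES.items():
        if lo_lat <= lat <= hi_lat and lo_lon <= lon <= hi_lon:
            return name, rules
    return None, {}

print(rules_at(51.51, -0.12))  # → ('city-centre', {'max_speed': 5})
print(rules_at(0.0, 0.0))      # → (None, {})
```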
Description
DESCRIPTION OF DRAWINGS
[0185] An overall scope of the digital ecosystem and how it's connected. [0186] 001—A representation of people around the world who use smart devices. [0187] 002—A representation of smart devices connected to each other and the people who use them. [0188] 003—A central system to store, process, analyse and distribute data to smart devices connected to it.
[0189]
[0192]
[0193] An example of how to post a single piece of content in multiple languages. [0194] 201—Example smart device. [0195] 202—Content text in default language. [0196] 203—A select box to choose the language of that version of the content. [0197] 204—The title of the content written in the additional language. [0198] 205—The text of the content written in the additional language. [0199] 206—Add another language button.
[0200]
[0201] An example of having a user endorse the content of another and gaining their own view count which contributes to the overall total views of said content. [0202] 3.1 [0203] 301—Example smart device. [0204] 302—The publisher selecting the user they wish to have endorse their content. [0205] 3.2 [0206] 303—Content list of publishing user showing the endorsed post. [0207] 3.3 [0208] 304—Content list of endorsing user showing the post they have endorsed. Their own view count for the content is featured within the brackets next to the total view count of the content.
[0209]
[0210] An example of the flow of data within the ecosystem, between any connected devices and a main system. [0211] 4.1 [0212] 401—The user opens their web client or a client designed to access the ecosystem; [0213] 402—From a client the user accesses the ecosystem; [0214] 403—Once connected to the ecosystem, the data submitted by the user is sent to a central processor of an engine powering the ecosystem; [0215] 404—Once processed, all data and media files are stored in databases and on a media server connected to an engine processing system; [0216] 405—The returned data is then sent to the ecosystem, ready to be accessed by users; [0217] 406a/b—In special cases and on certain occasions, data can be pushed directly from the central processor to user devices, smart screens, consoles and/or third-party devices given permission to access the system; [0218] 407—Data that is passed to the ecosystem is then passed to the client software of smart screens, user devices and other devices; [0219] 408—Data can be streamed from smart screens and consoles to user devices and other devices that have the supporting hardware; [0220] 409—The device receiving the data can then interact with it and in turn send new data in response. [0221] 4.2 [0222] 410—A data path between a device and receiving system. [0223] 411—An enhanced view of the data path shown in 410.
[0224]
[0225] An example of the flow of the data through the system from the moment it is sent from an input device to the server and then received by the client. [0226] 5.1 [0227] 501—Data is sent from the input device to the engine central processing system; [0228] 502—The data is processed and information is added to a database and retrieved when necessary; [0229] 503—All media files attached are stored on a media server and retrieved when necessary; [0230] 504—Applications and modules are stored on an application server which can provide additional functionality to users and help handle data in different ways. [0231] 505—Data is sent through to the zone mapping system—a system that controls where the information being sent through to a client can be viewed; [0232] 506—Once the information has been processed through the zone mapping system, it then passes through the filter system which filters the information according to the settings of the user retrieving the data or settings of the system; [0233] 507—The information is sent to the receiving device's client software or versions of the client software that can also be used as a server; [0234] 508—Devices with client/server software are able to stream data between each other and to devices that only have the client software; [0235] 509—Users use the same client software as the input device to start sending information back to the system, creating a cycle; [0236] 510—With enough data to analyse, the concept engine starts interacting with the database to produce patterns and predictions. [0237] 511—Any digital letter mail sent through the system is first passed from the engine central processing system to the mail system. [0238] 512—The mail system may check the database(s) to cross-reference and verify any metadata and/or credentials attached to mail. [0239] 513—All verified mail is passed onto the routing system where the routing information of each mail item is checked as it is prepared to be sent.
[0240] 514—Mail is delivered to the client device it was designated to be sent to. [0241] 515—Data passed to the application server that doesn't require further processing by the system may be sent straight on to a client device. [0242] 5.2—An internal tree structure of a system.
[0243]
[0244] An example of designating areas of a map for specific data. [0245] 601—A map. [0246] 602—A designated area of map (601). [0247] 603—Another designated area of map (601).
[0248]
[0249] An example of what may happen when content data is viewed. [0250] 7.1 [0251] 701—A page with view count the moment it is viewed. [0252] 702—The same page as in (701) after the timed delay of the view count. [0253] 7.2 [0254] 703—Data displayed on a device. [0255] 704—Sound played by device. [0256] 705—A vision-impaired individual.
[0257]
[0258] An example of displaying content which is of interest to a user on a home screen section of a smart device. [0259] 8.1 [0260] 801—An example of a smart device. [0261] 802—Section displaying information of the user account the device is using. [0262] 803—List of content which the system has deemed of interest to the user through use of the network. [0263] 804—Home screen page indicator. [0264] 8.2 [0265] 805—Solo content which the system has deemed is of interest to the user through use of the network.
[0266]
[0267] An example of how the system can utilize eye-tracking technology to record when a user looks at a smart screen and to determine when to interact with nearby devices and users. [0268] 9.1 [0269] 901—Digital smart screen; [0270] 902—Reasonable viewing range of digital smart screen (901); [0271] 903—Person (903) is within the reasonable viewing range (902) of digital smart screen (901) and digital smart screen (901) is within the field-of-view of person (903), but the point-of-gaze of person (903) is not directed at digital smart screen (901), so the client software of digital smart screen (901) doesn't record a view; [0272] 904—Person (904) is within the reasonable viewing range (902) of digital smart screen (901) and digital smart screen (901) is within the field-of-view of person (904) and the point-of-gaze of person (904) is directed at digital smart screen (901), so the client software of digital smart screen (901) records a view; [0273] 905—Digital smart screen (901) falls within the extended field-of-view of person (905) and the point-of-gaze of person (905) is directed at digital smart screen (901), but person (905) is outside the reasonable viewing range (902) of digital smart screen (901), so the client software of digital smart screen (901) doesn't record a view; [0274] 906—Digital smart screen; [0275] 907—Reasonable range of proximity sensor; [0276] 908—Person (908) is within the reasonable sensor range (907) of digital smart screen (906), digital smart screen (906) is within the field-of-view of person (908) and the point-of-gaze of person (908) is directed at digital smart screen (906), enabling digital smart screen (906) to interact with person (908); [0277] 909—Person (909) is within the reasonable sensor range (907) of digital smart screen (906) but the point-of-gaze of person (909) isn't directed at digital smart screen (906), so digital smart screen (906) does not interact with person (909).
[0278] 910—The personal proximity sensor area of a smart device carried by person (905) is able to detect the presence of person (904) as that person falls within personal sensor area (910). [0279] 9.2a and 9.2b [0280] 911—Camera and sensor device(s) (CSD). [0281] 912—Smart device. [0282] 913—Person with smart device.
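The view-recording rule illustrated by persons 903 to 905 reduces to a single predicate: a view is recorded only when the person is inside the screen's reasonable viewing range and their point-of-gaze is directed at the screen. A minimal sketch of that logic follows; the function name, coordinate inputs and gaze flag are illustrative stand-ins for real eye-tracker and proximity-sensor readings.

```python
import math

def records_view(person_pos, screen_pos, max_range, gaze_on_screen):
    """Record a view only if the person is within the screen's
    reasonable viewing range AND their point-of-gaze is on the screen."""
    distance = math.dist(person_pos, screen_pos)
    return distance <= max_range and gaze_on_screen

# Person 904: in range, gazing at the screen -> view recorded
print(records_view((3, 0), (0, 0), 5, True))   # True
# Person 903: in range but gaze directed elsewhere -> no view
print(records_view((3, 0), (0, 0), 5, False))  # False
# Person 905: gazing at the screen but out of range -> no view
print(records_view((8, 0), (0, 0), 5, True))   # False
```

The same conjunction of "in range" and "gaze directed" also gates the interaction decision for persons 908 and 909 in the second half of the figure.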
[0283]
[0284] An example of how the system can be set to only allow specific data to be displayed within the area of proximity sensors. [0285] 1001—A control unit linked to the proximity sensors used to control the data that is viewable within sensor areas. [0286] 1002 [0287] A—Central proximity sensor; [0288] B—Corner proximity sensor; [0289] C—Corner proximity sensor; [0290] D—Corner proximity sensor; [0291] E—Corner proximity sensor. [0292] 1003 [0293] A—Proximity range of sensor (1002a); [0294] B—Proximity range of sensor (1002b); [0295] C—Proximity range of sensor (1002c); [0296] D—Proximity range of sensor (1002d); [0297] E—Proximity range of sensor (1002e). [0298] 1004 [0299] A—Person is within the sensor area of central sensor (1002a) and is therefore restricted to viewing only material permitted by the operating user of the sensor control; [0300] B—Person is within the sensor areas of central sensor (1002a) and corner sensor (1002c) and is therefore restricted to viewing only material permitted by the operating user of the sensor control; [0301] C—Person is within the sensor area of corner sensor (1002e) and is therefore restricted to viewing only material permitted by the operating user of the sensor control; [0302] D—Person is outside of all sensor areas and therefore is not subjected to any restrictions.
[0303]
[0304] An example of Augmented Reality visuals and sound being streamed in real-time based on the Augmented Reality marked display to the Augmented Reality capable device, and then live streamed from one device to another via wireless connectivity. [0305] 1101—Augmented Reality marked digital smart screen; [0306] 1102—Smart device receiving Augmented Reality data from digital smart screen (1101); [0307] 1103—Smart device with Augmented Reality capabilities receiving a live stream of the Augmented Reality visuals and sound that smart device (1102) is viewing; [0308] 1104—Smart device without Augmented Reality capabilities receiving a live stream relay from smart device (1103) of the live stream it is receiving from smart device (1102) of the Augmented Reality data it is receiving from digital smart screen (1101).
[0309]
[0310] An example of how to link smart device clients to user accounts. [0311] 12.1—The creation of a new user account and the activation of a new smart screen client, both being added to the corresponding system databases. [0312] 12.2—The assigning of a smart screen and client to a user account. [0313] 12.3—The user account controlling the content to display on the smart screen. [0314] 12.4—The end result showing the approved content of the user account displayed on the smart screen. [0315] 12.5—The assigning of digital stationery to a user account. [0316] 12.6—A blank piece of digital stationery connecting to a database to check for and download data.
[0317] FIG. A is the stationery before data is retrieved. [0318] 12.7—A piece of digital stationery after the data has been downloaded and displayed. FIG. B shows this.
[0319]
[0320] An example of how the payment system may operate. [0321] 1301—The action starting the transaction process, sending the initial transaction data to the central processing system. [0322] 1302—Recognising the transaction, the processing system queries the user accounts database. [0323] 1303—From the accounts database, the system locates the user account that is paying for the transaction. [0324] 1304—Having located the account, the information for the transaction is passed to the payment system. [0325] 1305—If the transaction is to be handled by the system itself, the payment system checks the funds that the paying user currently has in an escrow account against the price of the transaction. [0326] 1306—If the transaction is to be handled by a third-party system, the information is passed to the third-party system and the response is then passed back. [0327] 1307—If there is an error with the transaction, an error response is produced on the payer account. [0328] 1308—If the transaction is successful, payment is transferred to the account of the payee and they are notified of the transaction success. [0329] 1309—The payer is notified of the successful transaction.
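For the system-handled case (steps 1305 and 1307 to 1309), the escrow check and transfer can be sketched as below. The dictionary of balances, the function name and the return shape are hypothetical; they stand in for the accounts database and notification mechanism.

```python
def process_payment(accounts, payer, payee, amount):
    """Sketch of the system-handled transaction: check the payer's
    escrow balance against the price, then transfer and notify."""
    escrow = accounts.get(payer)
    if escrow is None:
        return {"status": "error", "reason": "unknown payer"}       # 1307
    if escrow < amount:
        return {"status": "error", "reason": "insufficient funds"}  # 1307
    accounts[payer] -= amount
    accounts[payee] = accounts.get(payee, 0) + amount               # 1308
    return {"status": "success", "notify": [payee, payer]}          # 1308/1309

accounts = {"alice": 100, "bob": 20}
print(process_payment(accounts, "alice", "bob", 30)["status"])  # success
print(accounts["bob"])                                          # 50
```

The third-party branch of 1306 would replace the escrow check with a call out to the external payment system and relay its response.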
[0330]
[0331] An example of how layout code can be written and stored remotely, and then downloaded and translated to dynamically create a user interface and user experience. [0332] 14.1—Source code for a user interface section called “home”. [0333] 14.2—Source code for a user interface section called “page1”. [0334] 14.3—The transfer of data from a user input device to its storage in the corresponding database on the system. [0335] 14.4—The transfer of data from database to device client, where it is translated and displayed to the user as a graphical user interface. [0336] 14.5—The GUI output of the source code shown in drawing 14.1, with a user performing a “click” gesture. [0337] 1401—The GUI output of the source code of 14.1. [0338] 1402—A user's hand. [0339] 14.6—The result of the click gesture, showing the screen transition. [0340] 14.7—The GUI output of the source code shown in drawing 14.2. [0341] 1401—A user's hand. [0342] 14.8—An instructions file for the application engine.
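The remote-layout idea can be sketched as a tiny interpreter: layout source is stored server-side as data, downloaded by the client, and translated into interface elements at runtime. The section names “home” and “page1” come from the drawings; the JSON schema, widget types and text rendering below are hypothetical stand-ins for the real layout language and widget toolkit.

```python
import json

def render(layout_json):
    """Translate downloaded layout source into a plain-text GUI
    (a stand-in for real widget construction on the client)."""
    layout = json.loads(layout_json)
    lines = [f"== {layout['section']} =="]
    for widget in layout["widgets"]:
        if widget["type"] == "label":
            lines.append(f"[label] {widget['text']}")
        elif widget["type"] == "button":
            # a button names the section it transitions to on click
            lines.append(f"[button] {widget['text']} -> {widget['target']}")
    return "\n".join(lines)

# Hypothetical source for the "home" section of drawing 14.1
home = json.dumps({"section": "home", "widgets": [
    {"type": "label", "text": "Welcome"},
    {"type": "button", "text": "Go", "target": "page1"}]})
print(render(home))
```

The click gesture of 14.5/14.6 would then fetch and render the layout named by the button's target, producing the screen transition to “page1”.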
[0342]
[0343] An example of how sub-ecosystems can be formed once the main ecosystem is set up. [0344] 15.1 [0345] 1501—A central system of an ecosystem. [0346] 1502 [0347] A—Sub-ecosystem 1 of a main ecosystem. [0348] B—Sub-ecosystem 2 of a main ecosystem. [0349] C—Sub-ecosystem 3 of a main ecosystem. [0350] 1503 [0351] A—People and their smart devices connected to sub-ecosystems. [0352] B—People and their smart devices connected to sub-ecosystems. [0353] C—People and their smart devices connected to sub-ecosystems. [0354] 15.2—A multi-limb construction of sub-ecosystems and central systems composing one major ecosystem.
[0355]
[0356] An example of how the ecosystem can be secured to prevent data exploitation. [0357] 16.1—A unique client ID and/or device ID being assigned to a user account. [0358] 16.2 [0359] 1601—A user smart device. [0360] 1602—Data being sent to a security system over a wireless connection. [0361] 1603—A security system. [0362] 1604—Data sent from a security system to a central system over a hard line connection. [0363] 1605—A central system. [0364] 1606—Data sent from a central system to a security system over a hard line connection. [0365] 1607—Data sent from a security system to a user smart device over a wireless connection. [0366] 1608—A user smart device. [0367] 1609—Data being sent to a security system over a wireless connection. [0368] 1610—A security system. [0369] 1611—Data sent from a security system to a central system over a hard line connection. [0370] 1612—A central system. [0371] 1613—Data sent from a central system to a security system over a hard line connection. [0372] 1614—A kill signal sent to a user device over a wireless connection. [0373] 1615—A system terminal connected to a central system via a hard line.
[0374]
[0375] An example of how users looking at the same object can view it in a completely different way. [0376] 1701 [0377] A—A person and their smart device. [0378] B—A person and their smart device. [0379] C—A person and their smart device. [0380] D—A person and their smart device. [0381] 1702 [0382] A—Viewport of person 1701a. [0383] B—Viewport of person 1701b. [0384] C—Viewport of person 1701c. [0385] D—Viewport of person 1701d. [0386] 1703—A representation of the world.
[0387]
[0388] An example of how a user can share their view of the world with others. [0389] 18.1 [0390] 1801—Person A representing you. [0391] 1802—Personal experience layer. [0392] 1803—Permission security. [0393] 1804—Social experience layer. [0394] 1805—Person B-Z representing people you wish to share your experiences with. [0395] 1806—Public display devices. [0396] 18.2—Two users enjoying their own personal experiences. [0397] 18.3—One user choosing to share some of their experience with another. [0398] 18.4—Two users mutually sharing their experiences with each other. [0399] 18.5—Synchronised experiences.
[0400]
[0401] An example of how a new sensor-based telecommunication network can be formed and used. [0402] 19.1 [0403] 1901—A single connection. [0404] 1902—A branched connection. [0405] 1903—A sensor. [0406] 1904—Device in overlapping sensor areas. [0407] 19.2 [0408] 1905—Smart device at starting point. [0409] 1906—Travel path of smart device 1905. [0410] 1907—Sensor 1. [0411] 1908—Smart device at mid-point in an overlapping sensor area. [0412] 1909—Sensor 2. [0413] 1910—Smart device at finishing point. [0414] 19.3 [0415] 1911—Sender. [0416] 1912—Sensor A. [0417] 1913—Central system. [0418] 1914—User accounts database. [0419] 1915—Recipient's user account. [0420] 1916—Recipient's current or last known location. [0421] 1917—Sensor B. [0422] 1918—Recipient. [0423] 19.4a—A smart device user within a sensor area. [0424] 19.4b [0425] 1919—Data being entered on a smart device. [0426] 1920—Current sensor in use by 1919. [0427] 1921—A mirrored copy of the data of 1919. [0428] 19.5 [0429] 1922—User sending data. [0430] 1923—Direct connection route. [0431] 1924—User receiving data. [0432] 1925—New position of user receiving data. [0433] 1926—Redirection route. [0434] 19.6 [0435] 1927—Junction Point Systems that help direct data to its intended destination. [0436] 19.7a—A sensor collecting data from its surroundings. [0437] 19.7b—The sensor distributing data to surrounding devices.
[0438]
[0439] Examples of how personal and private networks can be set up to operate using the telecommunication network. [0440] 20.1 [0441] 2001—The flow of data between a private sensor network system and a main terminal. [0442] 2002—The flow of data between a private sensor network system and a database of users and devices with access permission. [0443] 2003—The flow of data between a private sensor network system and a central system with which it authenticates and verifies users and may store data. [0444] 20.2 [0445] 2004—A private sensor network system. [0446] 2005—Terminal controlling the private network. [0447] 2006—Central system. [0448] 2007—A user with network permission. [0449] 2008—A user without network permission. [0450] 20.3 [0451] 2009—Unique reference ID for a personal sensor network system. [0452] 20.4 [0453] 2010—Person remotely accessing their personal network. [0454] 2011—A personal sensor network system. [0455] 2012—Person accessing their personal network locally. [0456] 20.5—A personal network connected to smart devices, smart appliances and smart electricals within a home.
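The admission decision in the private-network figure (a user with permission, 2007, versus one without, 2008) combines two checks: verification against the central system (2003, 2006) and a lookup in the private network's own permission database (2002). A minimal sketch, with the central account store stubbed as a set and all identifiers hypothetical:

```python
def can_access(user_id, network, central_accounts):
    """A private sensor network admits a user only if the central
    system can verify them (2003/2006) AND they appear in the
    network's own permission database (2002)."""
    if user_id not in central_accounts:   # authentication fails centrally
        return False
    return user_id in network["permitted"]

central_accounts = {"u1", "u2", "u3"}            # stub for the central system
network = {"id": "PSN-001", "permitted": {"u1"}}  # private permission database
print(can_access("u1", network, central_accounts))  # True  (user 2007)
print(can_access("u2", network, central_accounts))  # False (user 2008)
```

Separating the two checks mirrors the figure: the central system vouches for who the user is, while the private network's terminal (2005) decides who may enter.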
[0457]
[0458] An example of how data can be transmitted from one user to another by bouncing off of sensors. [0459] 2101—Smart device A. [0460] 2102—Connection path X. [0461] 2103—Smart device B. [0462] 2104—Connection path A. [0463] 2105—Sensor A. [0464] 2106—Connection path B. [0465] 2107—Connection path C. [0466] 2108—Sensor B. [0467] 2109—Connection path D. [0468] 2110—Sensor C. [0469] 2111—Connection path E. [0470] 2112—Smart device C.
[0471]
[0472] An example of how sensors can efficiently manage connections. [0473] 22.1 [0474] 2201—A sensor unit. [0475] 2202—A sensor at maximum capacity. [0476] 2203—A sensor currently handling connections. [0477] 2204—A sensor with no current connections. [0478] 2205—Current capacity of the sensor unit. [0479] 22.2 [0480] 2206—A smart device. [0481] 2207—Connection path A. [0482] 2208—Sensor A. [0483] 2209—Connection path B. [0484] 2210—Sensor B. [0485] 2211—Connection path C. [0486] 2212—Sensor C.
[0487]
[0488] An example of how sensors can efficiently manage connections. [0489] 2301—Sensor Area [0490] 2302—Sensor Area [0491] 2303—Sensor Area [0492] 2304 [0493] A—A normal bandwidth connection. [0494] B—Adjusted high bandwidth connection. [0495] 2305 [0496] A—A normal bandwidth connection. [0497] B—Adjusted low bandwidth connection. [0498] 2306 [0499] A—A normal bandwidth connection. [0500] B—Adjusted low bandwidth connection.
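The connection-management behaviour of the two figures above (a sensor at maximum capacity, 2202, a busy sensor, 2203, and an idle sensor, 2204) suggests routing each new connection to the least-loaded sensor that still has spare capacity. A sketch under that assumption; the sensor records and the load metric are hypothetical:

```python
def pick_sensor(sensors):
    """Route a new connection to the least-loaded reachable sensor
    that still has spare capacity; return None if all are full."""
    available = [s for s in sensors if s["connections"] < s["capacity"]]
    if not available:
        return None
    # prefer the sensor with the lowest utilisation ratio
    return min(available, key=lambda s: s["connections"] / s["capacity"])

sensors = [
    {"id": "A", "connections": 10, "capacity": 10},  # full (2202)
    {"id": "B", "connections": 4, "capacity": 10},   # busy (2203)
    {"id": "C", "connections": 0, "capacity": 10},   # idle (2204)
]
print(pick_sensor(sensors)["id"])  # C
```

The bandwidth adjustments of elements 2304 to 2306 could be layered on the same records, raising or lowering each connection's share as a sensor's utilisation changes.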
[0501]
[0502] Examples of how components relative to the intelligence of the system may be structured. [0503] 24.1—A three-degree word grouping scale. [0504] 24.2—A numbered scale. [0505] 24.3—A radar chart for emotion. [0506] 24.4—The brain of the system. [0507] 2401—Logic Unit. [0508] 2402—Memory Unit. [0509] 24.5—A brain operating as a master system of an ecosystem. [0510] 24.6—Different intelligence data synchronisation structures.
[0511]
[0512] Examples of entities present on devices. [0513] 25.1—A child entity on a mobile device. [0514] 25.2—An omnipresent entity.
[0515]
[0516] Examples of how virtual worlds may coexist in the same space as the real world. [0517] 26.1a—A digital ecosystem and subecosystems. [0518] 26.1b—A virtual world. [0519] 26.2—A virtual world existing in the same space as a digital ecosystem. [0520] 26.3—Virtual World Environment (VWE) digital ecosystems and subecosystems spread out across the world. [0521] 26.4a—A user's position in the real world. [0522] 26.4b—Position of avatar or digital entity in a VWE. [0523] 26.5a—The user's perception of the real world without Augmented Reality. [0524] 26.5b—The user's perception of the real world using Augmented Reality based on their current or relative position in the virtual world.
[0525]
[0526] Examples of conceptual models of the system. [0527] 27.1—A layer model. [0528] 27.2—Radial visualization of the layer model from a physical standpoint.
DETAILED DESCRIPTION OF EMBODIMENTS
[0529] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0530] The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0531] Though the description may refer to using a sensor-based telecommunication network, any and all embodiments described herein may be applied to other types of telecommunication networks should they have the ability to do so.
[0532] As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
[0533] The terms “device” and “smart device” may be used interchangeably to refer to any device or entity, electronic or other, using technology that provides any characteristic, property or ability of a smart device. This includes the implementation of such technology into biological entities.
[0534] The term “processor” may refer to any component of a device that contains any type of processing unit that is capable of handling the task described. This includes but isn't limited to a central processing unit, graphic processing unit, advanced processing unit and multiple types of system-on-a-chip (SoC).
[0535] The term “sensor”, unless otherwise stated, may be used to refer to any sort of device or component capable of detecting other components, devices, people, objects or properties within a given distance or environment that it has been made or programmed to detect. Sensors may also be capable of sending and receiving data to and from one or more data sources.
[0536] The term “engine” may be used to refer to a software engine, physics engine and/or any hardware components that help facilitate the use of a device with one or more embodiments described.
[0537] The term “natural life” may be used to refer to any sort of natural living organism, such as plants, animals, fungi, micro-organisms, etc.
[0538] The term “controlling user” may be used to refer to a user of a device or system that has permission and is able to make modifications to a system or device's settings.
[0539] The terms “sensor” and “sensor unit” may be used interchangeably unless the two are used, at any point, to specifically describe two different objects.
[0540] The terms “post”, “posted”, “publish” and “published” may be used interchangeably to describe the issuing of data or information unless otherwise stated.
[0541] The system supports a variety of applications and uses, such as one or more of the following: a universally viable digital ecosystem, a portable data publishing platform, a storage facility, an artificial intelligence system/entity, a data analysis system, a personal interaction service, an endorsement service, a media viewing application, a media controller, a remote device controller, a mapping application, a timing application, a display widget application, a proximity detection application, an eye-tracking application, a wireless data filter, a media stream relay, an Augmented Reality display system, a digital mail delivery system, a transaction system and/or a hybrid application engine.
[0542] The various applications and uses of the system that may be executed on the system may use at least one common component or software client capable of allowing a user to perform at least one task made possible by said applications and uses. One or more functions of the client software as well as corresponding information displayed as part of the user interface may be adjusted and/or varied from one task to the next and/or during a respective task. In this way, a common software architecture (such as the client application or intelligence system) may support some or all of the variety of tasks with a user interface that is intuitive.
[0543] The following description is not to be read as the order in which steps must be taken to compose the present invention described herein unless clearly stated.
[0544] Attention is now directed towards embodiments of the system.
[0545] Central systems may store, process/handle, manipulate, distribute and analyse data they hold and data that passes through them, as well as serving as connection points that smart devices may pass through when communicating with each other. A central system may include one or more of the following but is not limited to: a processing computer, a hardware or software client, a hardware or software server, a mapping engine, a concept engine, a database, a file server, a media server, a mail system or a routing system.
[0546] Smart devices require at least one hardware or one software component to communicate with the telecommunication system and/or ecosystem. In some embodiments, the same hardware and/or software component or additional hardware and/or software components may help facilitate other device uses with the telecommunication system and/or ecosystem. In some embodiments, programs or instruction sets may be implemented along with other programs or instruction sets as part of a processor or other component.
[0547] In some embodiments, a user may create an account that allows them to create, manipulate and/or access data of the ecosystem. In some embodiments, this account may be universally used across the ecosystem and everything connected to it, including other systems and services. In some embodiments, a user's account may be used as a digital representation of themselves. When used this way, users are able to add information about themselves that the ecosystem may use, such as their interests. In some embodiments, users may upload an avatar to be used with their account. In some embodiments, a user avatar may be a still image. In some embodiments, a user avatar may be a moving graphic or video. In some embodiments, a user avatar may be an object. In some embodiments, a user avatar may be interactive.
[0548] In some embodiments, a user may create a relationship between their account and other accounts they may own and use for other purposes to download and/or synchronise information. In some embodiments, a user may create a relationship between their account and an account or record of an authority or governing body for identity verification purposes.
[0549] In some embodiments, data may be published directly from a smart device.
[0550] In some embodiments, multiple versions of data may be published in different languages. Example smart device 201 of
[0551] In some embodiments, data may be, automatically or upon request, translated from its source language to a preferred language of a user using internal or third-party translation services that require only the source text. In some embodiments, commands and/or gestures may be used to submit data.
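The multi-language publishing of [0550] and the translation fallback of [0551] can be combined in one lookup: serve an author-supplied version when one exists for the requested language, otherwise translate the default-language text. A sketch; the post structure and the translation callable are hypothetical (the callable stands in for an internal or third-party translation service).

```python
def get_text(post, lang, translate=None):
    """Return a post's text in the requested language: prefer an
    author-supplied version, else fall back to translating the
    default-language source text."""
    if lang in post["versions"]:
        return post["versions"][lang]
    source = post["versions"][post["default"]]
    # hypothetical translation service; without one, serve the source
    return translate(source, lang) if translate else source

post = {"default": "en",
        "versions": {"en": "Hello world", "fr": "Bonjour le monde"}}
print(get_text(post, "fr"))                              # author-supplied
print(get_text(post, "de", lambda t, l: f"[{l}] {t}"))   # stub translation
```

Only the source text is required to serve any language, matching the statement that translation "requires only the source text".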
[0552] In some embodiments, a publishing user may authorize another entity to distribute original versions or copies of their published data. In
[0553] In some embodiments, one or more of the following are used as part of an ecosystem network: a user device, an ecosystem client, an ecosystem, a processing computer, a database, a media server, a digital screen or a console.
[0554] Connection 404 is also used by the processor to retrieve data from any servers and databases the system uses for storage. Data may then be sent back to the ecosystem via connection 405 for viewing, interaction and other permissible purposes. In some embodiments, data may be sent directly to user devices or other devices, smart screens, consoles and/or other third party systems and services via connections 406a and 406b. Data sent to the ecosystem may then be sent to smart screens, consoles, user devices and/or other devices connected to the network via connection 407. In some embodiments, smart screens or consoles may stream and/or relay data to user devices and other devices using both methods of wired and wireless connectivity via connection 408. A user device or other device receiving data may also be used to send data to the network, as shown in process 409.
[0555] In some embodiments, data may travel along individual paths, depending on the type of information it contains. Before data is sent, specific information is set within its metadata. As it is sent, it travels along the path specifically set or best suited for its type to its destination. This is shown in
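The per-type routing described above can be sketched as a lookup that honours an explicit path set in the metadata before falling back to the path registered for the data's type. The packet shape, path names and registry are illustrative assumptions.

```python
def route(packet, paths_by_type):
    """Select a travel path for outgoing data: use the path set in
    its metadata if present, else the path registered for its type."""
    meta = packet.get("metadata", {})
    path = meta.get("path") or paths_by_type.get(packet["type"])
    if path is None:
        raise ValueError(f"no route for type {packet['type']!r}")
    return path

paths_by_type = {"mail": "mail-system", "media": "media-server"}
print(route({"type": "mail", "metadata": {}}, paths_by_type))            # mail-system
print(route({"type": "media", "metadata": {"path": "p2p"}}, paths_by_type))  # p2p
```

This matches the description: the path is "specifically set" in metadata before sending, or the one "best suited for its type" is chosen.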
[0556] Where
[0557] In some embodiments, when data is being sent to a client, it may pass through a zone mapping system via process 505 which controls whether or not the data is eligible for display within the current area in which the receiving client is located. In some embodiments, the data may pass through a filter system via process 506 which controls whether or not the user of the client to which said data is travelling wishes to view data with the characteristics or metadata properties of the data being sent. In some embodiments, data may be passed to a software client, that may also act as a server, via process 507. Clients that also have the capabilities to act as servers are able to stream data to other devices with client or client/server software via process 508, allowing client devices to create peer-to-peer networks on the fly, data relays and direct data streams. A user may interact with the data received by the client which may in turn, via process 509, cause the client to send data back to the processing system from the input device.
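Processes 505 and 506 form a two-stage delivery pipeline: a zone check based on where the client currently is, then a filter based on the receiving user's own settings. A minimal sketch, with item and client records as hypothetical stand-ins for the real metadata and settings:

```python
def deliver(item, client):
    """Two-stage delivery pipeline: zone mapping (505) decides whether
    the data may be displayed in the client's current area, then the
    user's filter settings (506) decide whether they want to see it."""
    if client["zone"] not in item["allowed_zones"]:       # 505: zone mapping
        return None
    if item["category"] in client["blocked_categories"]:  # 506: user filter
        return None
    return item["payload"]                                # 507: on to client

item = {"payload": "ad", "allowed_zones": {"uk", "fr"}, "category": "ads"}
print(deliver(item, {"zone": "uk", "blocked_categories": set()}))    # 'ad'
print(deliver(item, {"zone": "us", "blocked_categories": set()}))    # None
print(deliver(item, {"zone": "uk", "blocked_categories": {"ads"}}))  # None
```

Ordering the zone check first means a client never evaluates user preferences for data that is ineligible in its current area.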
[0558] In some embodiments, with enough data stored in databases or accessible elsewhere, the system, via process 510, can begin to interact with an artificial intelligence concept engine designed to analyse data to find trend patterns and make predictions on one or more scales, from local to global, and, based on a myriad of option combinations, produce ever-increasingly accurate results. An example of an algorithm method used, including an example of available options, is as follows:
[0559] 1. Select Examination Parameters
[0560] A. Date Range (Past X Months)
[0561] B. Regions
[0562] C. Language
[0563] D. Location
[0564] E. Categories
[0565] F. Subcategories
[0566] G. Time Period (Day, Week, Month)
[0567] 2. Select Comparison Parameters
[0568] A. Past Year Count
[0569] B. Future Date Range (Future X Months)
[0570] 3. Quantities Of:
[0571] A. Search Terms (QoST)
[0572] B. Publications (QoP)
[0573] C. Results (QoR)
[0574] D. Sales Publishers (QoSP)
[0575] 4. Search Terms
[0576] A. Find Highest Search Hits
[0577] 1) Store (QoST) Results
[0578] 5. Find Publications Matching Results
[0579] A. Find Highest Views
[0580] 1) Store (QoP) Results In Publications X List
[0581] B. Find Highest Approval
[0582] 1) Store (QoP) Results In Publications X List
[0583] C. Find Highest Sales
[0584] 1) Store (QoP) Results In Publications X List
[0585] 6. Filter Publications X Through Search Term Results
[0586] A. List Results That Contain Any Of The Search Terms
[0587] 1) Store (QoR) Publications X Results
[0588] 7. Publications X Results
[0589] A. List Publishers
[0590] 1) For Each Unique Publisher Count Appearances
[0591] A) Store Most Influential List (Highest To Lowest)
[0592] B. List Highest Search Hits Terms
[0593] 1) For Each Search Term Count Unique Publication Appearances
[0594] A) Store Most Popular List (Highest To Lowest)
[0595] 8. Most Influential List
[0596] A. For Each|Within Date Range|List Total Publisher View Values By Time Period
[0597] 1) Store View Pattern (Publisher—Appearances—Pattern—Total Views)
[0598] A) Store Pattern And Total Views As Group 1 (Publisher—Appearances—Group 1 [Pattern—Total Views])
[0599] 9. Most Popular List
[0600] A. For Each|Within Date Range|List Total Search Values By Time Period
[0601] 1) Store Search Pattern (Search Term—Appearances—Pattern—Total Searches)
[0602] A) Store Pattern And Total Searches As Group 1 (Search Term—Appearances—Group 1 [Pattern—Total Searches])
[0603] 10. Past Year Count|Date Range|For Each Year
[0604] A. Most Influential|For Each|Get View Pattern
[0605] 1) Store Most Influential (Publisher—Appearances—Group 1 [Pattern—Total Views]—Year)
[0606] B. Get Past Most Influential Using Examination Parameters
[0607] 1) Store Past Most Influential List
[0608] 2) Add Past Most Influential To Most Influential List
[0609] A) Find And Remove Duplicate Entries
[0610] C. Most Popular|For Each|Get Search Pattern
[0611] 1) Store Most Popular (Search Term—Appearances—Group 1 [Pattern—Total Searches]—Year)
[0612] D. Get Past Most Popular Using Examination Parameters
[0613] 1) Store Past Most Popular List
[0614] 2) Add Past Most Popular To Most Popular List
[0615] A) Find And Remove Duplicate Entries
[0616] 11. Past Year Count|For Each Year
[0617] A. Most Influential|For Each|Within Future Date Range
[0618] 1) Get View Pattern
[0619] A) Store View Pattern And Total Views As Group 2 In Most Influential (Publisher—Appearances—Group 1 [Pattern—Total Views]—Group 2 [Pattern—Total Views]—Year)
[0620] B. Most Popular|For Each|Within Future Date Range
[0621] 1) Get Search Pattern
[0622] A) Store Search Pattern And Total Searches As Group 2 In Most Popular (Search Term—Appearances—Group 1 [Pattern—Total Searches]—Group 2 [Pattern—Total Searches]—Year)
[0623] 12. For Current Year
[0624] A. Most Influential|For Each|Group 1 Pattern
[0625] 1) Past Years|For Each
[0626] A) Find Similar Group 1 Values Pattern Matches
[0627] i. For Each|Examine Group 2 Patterns
a. Find Lowest View Count For Each Time Period
b. Find Highest View Count For Each Time Period
c. Calculate Average View Count For Each Time Period
d. Between Each Time Period Calculate Percentage That Experienced Rise
e. Find Lowest Total
f. Find Highest Total
g. Calculate Average Total
h. Between Totals Calculate Percentage That Experienced Rise
i. Store All Results
[0628] B) Find Similar Group 1 Difference Pattern Matches
[0629] i. For Each|Examine Group 2 Patterns & Totals
a. Calculate Percentage Of Time An Overall Rise/Fall Is Experienced At The End Of Group 2
b. Calculate Percentages Of How Significant A Rise/Fall It Was In X % Value Ranges From −100% To 100%+ (Ex. A 10% Value Range Would Be 0-10%, 10%-20%, Etc.)
c. Predict The Quality Of The Change That Is Likely To Happen By Grouping Percentage Ranges And Seeing Which Is Largest
d. Store All Results
[0630] B. Most Popular|For Each|Group 1 Pattern
[0631] 1) Past Years|For Each
[0632] A) Find Similar Group 1 Values Pattern Matches
[0633] i. For Each|Examine Group 2 Patterns
[0634] a. Find Lowest View Count For Each Time Period
[0635] b. Find Highest View Count For Each Time Period
[0636] c. Calculate Average View Count For Each Time Period
[0637] d. Between Each Time Period Calculate Percentage That Experienced Rise
[0638] e. Find Lowest Total
[0639] f. Find Highest Total
[0640] g. Calculate Average Total
[0641] h. Between Totals Calculate Percentage That Experienced Rise
[0642] i. Store All Results
[0643] B) Find Similar Group 1 Difference Pattern Matches
[0644] i. For Each|Examine Group 2 Patterns & Totals
a. Calculate Percentage Of Time An Overall Rise/Fall Is Experienced At The End Of Group 2
b. Calculate Percentages Of How Significant A Rise/Fall It Was In X % Value Ranges From −100% To 100%+ (Ex. A 10% Value Range Would Be 0-10%, 10%-20%, Etc.)
c. Predict The Quality Of The Change That Is Likely To Happen By Grouping Percentage Ranges And Seeing Which Is Largest
d. Store All Results
[0645] 13. If Sales Exist
[0646] A. For Past Year(s)
[0647] 1) Get Past Most Influential List
[0648] 2) Get Sales Of Items Released Within Future Date Range
[0649] A) Total Sales Of Items
[0650] B) For Each Publisher Of Past Most Influential Calculate Sales
[0651] i. Sort Highest-Lowest
[0652] C) Calculate Percentage Of Sales (QoSP) Amount Of Publishers Made
[0653] D) Store Results
[0654] B. For Current Year
[0655] 1) Get Most Influential
[0656] A) Get Top (QoSP) Amount
[0657] B) Get Sale Items Of Top (QoSP) Amount Of Publishers Set For Release Within Future Date Range
[0658] C) Get Most Popular List
[0659] i. For Each Popular Search Term Count Total Number Of Appearances In Current Year Sale Items Of Top (QoSP) Amount Of Publishers
[0660] ii. Store Results As Trend Predictions (Highest To Lowest By Sale Appearances)
[0661] 14. Display Results
[0662] A. Top X Amount Current Most Influential (Highest To Lowest)
[0663] 1) Trend Patterns
[0664] 2) Significance
[0665] 3) Predictions
[0666] B. Top X Amount Current Most Searched (Highest To Lowest)
[0667] 1) Trend Patterns
[0668] 2) Significance
[0669] 3) Predictions
[0670] C. Percentage Of Sales From Yesteryear(s) Top (QoSP) Amount Of Publishers Each Year Accounted For
[0671] D. Trend Predictions For Future Date Range
[0672] Please note that the above algorithm method is an example of how the system may make its predictions and determine patterns; some of the steps listed may be performed in a different order than stated, and procedures may be included or removed for other purposes.
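A highly condensed sketch of steps 4 through 7A of the example algorithm (top search terms, matching publications, and the "Most Influential" publisher ranking) might look like the following. The data shapes and function names are hypothetical simplifications of the outline, not a full implementation:

```python
# Illustrative condensation of steps 4-7A: take the top (QoST) search terms,
# keep up to (QoR) publications matching any of them, then rank publishers
# by how often they appear, highest to lowest ("Most Influential" list).
from collections import Counter

def most_influential(search_hits, publications, qost, qor):
    # Step 4: find the highest (QoST) search terms by hit count
    top_terms = [term for term, _ in Counter(search_hits).most_common(qost)]
    # Step 6: filter publications through the search term results
    matches = [p for p in publications
               if any(term in p["title"] for term in top_terms)][:qor]
    # Step 7A: for each unique publisher, count appearances
    return Counter(p["publisher"] for p in matches).most_common()
```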
[0673] In some embodiments, digital letter mail may be sent from a user to other users. Any digital mail submitted to the system is sent from the engine central processing system to the mail system via process 511. Once there, the mail system may contact the database via process 512 to verify any metadata of each mail item against account information held in the database, to establish things such as whether or not the item has been legitimately sent by the entity whose information is stated as the sender of the mail, or to check that the mail is being delivered to the right person at the right address, account or location.
[0674] Verified mail is passed to the routing system. The routing information of each item's metadata is analysed. Routing information is any string, single or multiple lines, which may contain independently identifiable parts, that tells the system which client(s) the mail should be sent to. Some examples of acceptable strings are addresses written in common format, addresses written in a shorthand format and unique client ID routing addresses, examples of which are shown below in respective order: [0675] 10 Downing St [0676] London SW1A 2AA [0677] United Kingdom [0678] UK.SW1A2AA.10 [0679] RID5124703388345364
[0680] In some embodiments, a coordinates system may be used. When a geographic coordinate system is used, such as longitude and latitude, an additional identifier may be included to individualise recipient clients that may appear to occupy the same geographical location, such as within homes of tower block housing, as shown in the example below, where the geographical location is the same but the individual identifier, in this case the final character of each string, is different: [0681] 51.503396N-0.127640° W-C [0682] 51.503396N-0.127640° W-R [0683] 51.503396N-0.127640° W-S
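A parser distinguishing the routing string formats shown above might be sketched as follows. The classification heuristics are assumptions for illustration; a production routing system would validate far more strictly:

```python
# Hypothetical parser for the routing string formats above: a unique client
# ID ("RID..."), a shorthand address ("UK.SW1A2AA.10"), a coordinate string
# with a trailing individual identifier, or a common-format address.

def parse_routing(s):
    s = s.strip()
    if s.startswith("RID"):
        return {"kind": "client_id", "id": s}
    if s.count(".") == 2 and "°" not in s:
        country, postcode, number = s.split(".")
        return {"kind": "shorthand", "country": country,
                "postcode": postcode, "number": number}
    if "-" in s:
        # Last dash-separated part individualises clients that share a
        # geographical location (e.g. within tower block housing).
        *coords, ident = s.split("-")
        return {"kind": "coordinates", "position": "-".join(coords),
                "identifier": ident}
    return {"kind": "common", "address": s}
```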
[0684] Once analysed, each mail item may then be sent to its designated recipient client via process 514, where it may be stored in a local database on their receiving device(s) for the recipient user to view at any time. In some embodiments, at different points of the mailing process, such as when the mail item arrives at the mailing system or when it is opened by the recipient user, the mail may be scanned by the system for security purposes. The system may look for keywords or phrases that may be cause for concern, as well as the mentioning of people of interest.
[0685] In some embodiments, one or more parts of the system may have or employ a tree-like structure for the data to travel through.
[0686] In some embodiments, as in
[0687] The Zone Mapping system mentioned as a part of
[0688]
[0689] In some embodiments, on-screen information may be communicated to a user through audio methods. In
[0690] In some embodiments, rather than having to open an application to see data, users may have data displayed directly on a home screen or main interface of a smart device. On the screen of example smart device 801 of
[0691] In some embodiments, the system is able to register views and/or interact with users through the use of display screens, eye-tracking technology and sensors, as shown in
[0695] Also in
[0698] The system may also determine whether or not it is reasonable to interact with a passing user based on whether or not the user stops or slows down within a reasonable sensor range.
[0699] What is considered “reasonable” when referring to viewing ranges and sensor ranges may be decided by the manufacturer, governor, operator, user or AI of a display screen and/or sensor, and may be done completely at their discretion.
[0700] In some embodiments, systems and/or devices may detect the presence of one another when within a certain proximity and cross-reference account information of users signed in. Personal sensor area 910 may be generated by a smart device of person 905. The sensor of a smart device of person 905 is able to detect the presence of other personal smart devices within personal sensor area 910, such as the smart device of person 904. Should the system of a smart device of person 905 determine that person 904 is a person of interest to person 905, a smart device of one or each person may alert the person to the fact the other may be a person of interest or person who is interested. For example, should person 905 need help with a task and have published data on their system account requesting help, their sensor, having detected the presence of person 904 and their smart device, can be used in conjunction with accompanying software on the device to pull and analyse the details of the user account signed in on the smart device of person 904. If it is read that the owner or operator of that user account offers services or has skills that can help person 905 accomplish the aforementioned task, the system may alert person 904, person 905 or both individuals to the fact that they may be of interest to each other.
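The person-of-interest check in the example above reduces to comparing a published help request against the skills and services advertised on a nearby account. A minimal sketch, with account fields and tag matching as hypothetical assumptions:

```python
# Hypothetical sketch: does a nearby account advertise a skill or service
# matching any tag of the local user's published help request?

def mutual_interest(request_tags, nearby_account):
    """Return True if the nearby account may be of interest to the requester."""
    offered = (set(nearby_account.get("skills", []))
               | set(nearby_account.get("services", [])))
    return bool(set(request_tags) & offered)
```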
[0701] In some embodiments, to more accurately determine which person, device(s) and information go together when wanting, beginning or during interaction with a user, sensors may be used in conjunction with cameras and/or other hardware or software to pinpoint the location of said device(s), read information and data from the account signed in on said device, and find and track the person(s) most likely in possession of a device that is communicating with a sensor.
[0702] In
[0703] In some embodiments, sensors may be used by the system to control the flow of data within a given space.
[0704] In the example shown in
[0705] In some embodiments, clients or client/servers of smart devices are able to relay incoming data streams to the clients of other devices by creating exact copies of data as it is received and then immediately broadcasting them to a recipient over one or more types of transfer protocols that support real-time or near real-time data streaming, or via close proximity networking, such as PANs and LANs, that use wireless technologies such as Bluetooth and Wi-Fi, as well as wired technologies, to connect clients and share data. Doing so allows persons who do not have AR-capable hardware to view augmented versions of reality, despite the lack of support on their device.
[0706]
[0707] In some embodiments, certain smart devices are able to be assigned to a user account, allowing the owner of said account to control exactly what is displayed on that device client remotely. This is achieved by creating a relationship between a user account and a unique identifier of a device. The unique identifier can be fixed, where the device or client is assigned a permanent unique identifier, or dynamic, where a device or client is given an identifier which may or may not be changeable or removable at later times, based on factors such as location, the order in which it is assigned, the user account it is being assigned to and more. Once a relationship has been established, one or more of the following are possible:
[0708] The account owner can push data from their account to an assigned device/client.
[0709] The device/client can pull data from the account it is assigned to.
[0710] In some embodiments, an account owner may give permission to other accounts to control the display of data on one or more of their devices/clients. In some embodiments, this may also allow the device/client to pull account data from all other permissible accounts other than that of the owner of the client device.
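The assignment and delegated-permission rules above can be sketched as a small registry. The data model (owner map plus delegate sets) is an illustrative assumption:

```python
# Minimal sketch of assigning a device/client to a user account, granting
# other accounts control, and checking who may push data to a device.

class DeviceRegistry:
    def __init__(self):
        self.assignments = {}   # device_id -> owner account
        self.delegates = {}     # device_id -> other permitted accounts

    def assign(self, device_id, account):
        """Tie a device/client to the account it is assigned to."""
        self.assignments[device_id] = account
        self.delegates.setdefault(device_id, set())

    def grant(self, device_id, other_account):
        """Owner gives another account permission to control the display."""
        self.delegates[device_id].add(other_account)

    def can_push(self, account, device_id):
        """May this account push data to (control the display of) the device?"""
        return (self.assignments.get(device_id) == account
                or account in self.delegates.get(device_id, set()))
```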
[0711]
[0712] In some embodiments, digital stationery may be connected to a user account of the system, allowing data on the stationery to be modified or changed remotely via a wireless connection.
[0713] In some embodiments, the system itself, from within and/or outside of the ecosystem, is able to handle payment transactions internally and/or using third-party payment systems. There are multiple ways to initiate a transaction, the most common being:
[0714] Data views—When data is viewed through use of a smart device or a view is registered through eye-tracking technology.
[0715] Data Impression—The appearance of data, usually in significant places.
[0716] User Transaction—User-initiated payment processes such as the purchasing of an item or the transferring of funds.
[0717] How the payment system handles the movement of funds is based on how a paying user wishes it to be handled. In some embodiments, if a user has chosen to add funds to their system account, the system checks the amount of funds they have deposited in an escrow account and decides if the transaction should be approved or denied based on whether or not the amount of current funds the user has is greater than the cost of the transaction. In some embodiments, if the user has chosen to use a third party to process the transaction, information about the transaction is passed to the third-party system and the response is then evaluated by the payment system. At the end of a payment check, the system either completes the transaction, transferring the funds to the account of the payee before alerting both the payer and payee of the successful transaction, or produces an error to the payer if the transaction is unsuccessful.
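The escrow path of the payment check described above can be sketched as follows. The account structure and return values are illustrative assumptions:

```python
# Sketch of the escrow payment check: approve only when the payer's
# deposited funds cover the cost, then move funds to the payee's account.

def process_payment(escrow, payer, payee, cost):
    """escrow maps each account to its deposited funds."""
    if escrow.get(payer, 0) < cost:
        return "denied"     # an error is produced to the payer
    escrow[payer] -= cost
    escrow[payee] = escrow.get(payee, 0) + cost
    return "approved"       # payer and payee are alerted of success
```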
[0718]
[0719] In some embodiments, native applications can be partially or completely updated while running in the background, while in use and/or simply as long as they are installed on a device. To give designers and developers the same level of freedom that those who operate in the area of web systems are accustomed to when creating, arranging and updating systems, applications, content and templates of web documents and websites, while achieving the performance of purely native applications that those operating in the area of native application and software development and engineering are accustomed to, the system may incorporate an application engine that is able to receive code from a server and, if necessary, translate said code into a native language the receiving device can understand, creating native components such as objects, properties, classes, actions and function calls on the fly.
[0720] Application designers, developers and other users can create templates that include both functions and visual materials, either visually or written in code, that are stored on a server as code. If template code isn't written or stored as a native programming language, it may be written in a scripting or markup language. In some embodiments, the scripting or markup language used may contain elements that are to be translated into native objects. In some embodiments, it may contain variables and properties that contain values which, when translated, help the engine construct the user interface and engineer the user experience as it was intended by the designer or developer.
[0721] In some embodiments, a set of instructions for the engine to follow may also be included in a file or database, either of which may be stored locally on a device or remotely, or written in code as part of the application, engine or software of the device. Instructions may pertain to operations such as which template to use with different sets or types of data being displayed, default options, user interface elements and more.
[0722] In some embodiments, as well as templates and instructions, other elements of an application may be controlled remotely. For example, a menu may be controlled remotely by storing menu items and related information for each, such as the icon to display and location of the information it is to point to, in a file or database.
[0723] In some embodiments, when an application is run, it or the engine may connect to a designated server to download any data that hasn't already been installed or stored locally and that is necessary to make the application operable, or that the application designer, developer or owner has instructed the application to download, such as code required to complete the building of the user interface that may not be dependent on content data, data to populate a menu, or instructions for app behaviour, such as the default window to display, after which the compilation of the application is complete. Layout templates may also be downloaded at this point in anticipation of displaying content data. The downloaded data may be stored locally to prevent the need to download it every time the application is run. In some embodiments, the application or engine may check for updated versions of files when it is run and download them if necessary or desired by the user of the device or application.
[0724] In some embodiments, when the application begins downloading content, the engine may also download template code if required. Template code may be downloaded in multiple ways, including:
[0725] Downloading template code as individual code sets along with content data separately;
[0726] Downloading content data pre-wrapped in template code.
[0727] If template code is downloaded as individual code sets or is already stored locally, the engine compiles the correct template, if the template hasn't already been pre-compiled, for each set of data it is to display based on the instructions set by the app developer or designer and then renders the template on screen, inserting the content data into a specified place to create a user interface for a user to interact with.
[0728] If data is downloaded from a server pre-wrapped in template code, the server may wrap the content data in template code after the data is requested or store content data in a database already wrapped in template code, based on the template set to be used to display that type of content. Once downloaded, the engine can compile the code locally to create a user interface for a user to interact with.
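The two delivery modes might be sketched as follows, using Python's `string.Template` as a stand-in for a real template language. The template syntax, placeholder names and data are illustrative assumptions; the point is only that the wrapping step can run on either the server or the device with the same result:

```python
# Illustrative sketch of the two template delivery modes: content wrapped
# locally on the device versus content arriving pre-wrapped from the server.
import string

def wrap(template_code, content):
    """Insert content data into the placeholder slots of a template."""
    return string.Template(template_code).substitute(content)

template = "<item><h1>$title</h1><p>$body</p></item>"   # stored server-side
article = {"title": "Hello", "body": "World"}

separately = wrap(template, article)    # template and data sent separately,
                                        # wrapped by the engine on the device
pre_wrapped = wrap(template, article)   # same wrapping done on the server,
                                        # then sent down ready to compile
```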
[0729] In some embodiments, the engine is able to download template code and content data in anticipation of the user wishing to view it, and may compile it in the background without ever disturbing the user of the application or software. This can be achieved in multiple ways, including but not limited to:
[0730] Directory Listings—In some embodiments, the operator, developer or designer of an application can set a directory, file or database of data for the engine to pre-download, along with its set template code, and compile immediately and automatically, meaning there is no loading delay when navigating to and between these content sets.
[0731] Data Lists—In some embodiments, when data lists are downloaded, such as those generated by URLs or queries of data types, keywords or other data, the engine may download template code associated with each item of the list and compile it in the background so that it is ready to be viewed with no loading time should a user select that item.
[0732] In some embodiments, to help preserve memory of the device, the engine is able to automatically decompile and/or destroy template views that are ready and waiting when out of a set range of where the user currently is. For example, when viewing a data list, the furthest behind compiled template view of all currently compiled template views of the current list may be decompiled or destroyed when the engine senses it is a certain item-distance or measurement offset away from a user's current item position, while at the same time compiling template code for items that the engine senses has now come within a set item-distance or measurement offset of a user's current item position. In some embodiments, this may also be applied when viewing single content items if a user is able to navigate between data items without returning to the data list.
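The memory-preserving window described above can be sketched as follows. The item-distance model and function names are hypothetical simplifications:

```python
# Hypothetical sketch: template views within a set item-distance of the
# user's current list position stay compiled; views that fall outside the
# range are decompiled/destroyed, and newly in-range items are compiled.

def compiled_window(current_index, total_items, keep_range):
    """Indices whose template views should currently be compiled."""
    lo = max(0, current_index - keep_range)
    hi = min(total_items - 1, current_index + keep_range)
    return set(range(lo, hi + 1))

def reconcile(currently_compiled, current_index, total_items, keep_range):
    """Return (indices to compile, indices to decompile/destroy)."""
    wanted = compiled_window(current_index, total_items, keep_range)
    return wanted - currently_compiled, currently_compiled - wanted
```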
[0733] In some embodiments, data that requires downloading that a user, developer or designer is able to update remotely may contain a property or variable value that the application or engine may cross-reference against the same property or variable stored locally to determine whether or not data held locally is outdated and should be updated, ensuring the latest templates and functionality are always used and/or made available.
[0734]
[0735] In
[0736]
[0737]
[0738] The ecosystem may be divided into smaller ecosystems for different purposes. In some embodiments, a main ecosystem may be divided into sections or sectors in order to create sub-ecosystems. In some embodiments, the purposes of digital sub-ecosystems may differ from one sub-ecosystem to another, such as one created for the promotion of a certain industry sector while another is created to facilitate specific services.
[0739] With the inclusion of sub-ecosystems, data may travel between elements of the ecosystem in multiple ways, including but not limited to: [0740] Smart device to sub-ecosystem; [0741] Sub-ecosystem to sub-ecosystem; [0742] Sub-ecosystem to central system; [0743] Smart device to central system; [0744] Smart device to smart device.
[0745] In some embodiments, users may be able to affiliate themselves with one or more sub-ecosystems.
[0746]
[0747] In some embodiments, with the interconnectivity of sub-ecosystems and the ability to share data between them, entire ecosystems can link together and effortlessly expand and contract indefinitely by adding or removing central systems and/or sub-ecosystems, as shown in
[0748] In some embodiments, master/slave relationships may exist between central systems. In some embodiments, all central systems may be slaves to a master system. In some embodiments, as central systems may store data and information that doesn't or may not require updating by a master system, only specific parts may be set to update, such as the core operating code or software.
[0749] With a universal ecosystem, data protection is of the utmost importance. In some embodiments, a unique device and/or client ID may be assigned to specific user accounts. Once registered, a device and/or client is tied to the account it is assigned to. In some embodiments, a client and/or device ID may be assigned to multiple accounts. In some embodiments, clients and/or devices may be unassigned from an account. In some embodiments, a device and/or client may be reassigned to an account with or without first being unassigned.
[0750] When a client and/or device ID has been assigned to an account, data transmission is possible to and from the client device based on the account it is assigned to. In some embodiments, some data, when transmitted from client device to server or vice versa, is encrypted based on the client and/or device ID that is requesting and/or receiving the data. Because every client and/or device ID is unique, encrypted data may only be decrypted by the client and/or device with the correct ID(s) and by a central system with access to the accounts database and necessary security information, where it is able to calculate the correct encryption key based on the client and/or device ID associated with the account receiving the data. Should more than one ID be registered to an account, the encrypted data may be accompanied by a hint, which may be unencrypted or encrypted using a general algorithm rather than a specific one, and which can be decrypted by the client or server to ascertain which client and/or device ID it should use to generate the encryption key for the rest of the data. Types of hints may include, but are not limited to:
[0751] A selection of characters from different positions of the ID required for decryption;
[0752] The character length of the ID;
[0753] Metadata about the encryption key, such as the date it was assigned.
[0754] In some embodiments, more than one hint may be included.
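The ID-derived key and hint mechanism might be sketched as follows. This is not the patented scheme itself: SHA-256 is an assumed stand-in for whatever key-derivation the system would actually use, and the hint combines two of the listed hint types (character length plus selected characters) purely for illustration:

```python
# Illustrative sketch: derive a data key from a client/device ID, and attach
# a hint so a multi-ID account can tell which ID to use for decryption.
import hashlib

def derive_key(device_id):
    """Stand-in key derivation: hash the unique client/device ID."""
    return hashlib.sha256(device_id.encode()).digest()

def make_hint(device_id):
    """Hint: character length of the ID plus characters at chosen positions."""
    return {"length": len(device_id),
            "chars": {0: device_id[0], -1: device_id[-1]}}

def resolve_id(hint, account_ids):
    """Pick the registered ID matching the hint; its key can then be derived."""
    for did in account_ids:
        if (len(did) == hint["length"]
                and all(did[pos] == ch for pos, ch in hint["chars"].items())):
            return did
    return None
```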
[0755] In some embodiments, biometric data may be used as a key to encrypt and decrypt data, making it entirely unique to the user. In these instances, a user would need to physically verify themselves once data is received for it to be decrypted.
[0756] In some embodiments, a security system may be in place at any point between the client device and a central system to authenticate connections and requests. In some embodiments, the security system may prevent a client device and central system from having a direct connection. When the security system picks up an incoming connection, it may hold that connection, extract the encrypted data and then transmit it along a different connection to the central system. When data is returned from the central system, it may pass back through the security system so the response can be authenticated. If the response is authentic and permission has been given to pass data back to the client device, the security system may do so along the original connection. If the response cannot be authenticated or there is an error, an error response may be returned to the client device. In some embodiments, if a security system, at any stage of the data transmission process, detects that a request may be false or fake, that data has been tampered with during transmission, that data isn't encrypted or isn't in an appropriate format, that too many connections are incoming from an individual client within a given amount of time, or any other issue relating to the connection or data that it has not been instructed to expect or that, through the use of artificial intelligence, it deems too unusual, it may send a kill signal to the client device, immediately terminating the connection and, in some embodiments, destroying the data in transmission. In some embodiments, the kill signal may disable the client and/or its engine on the device, either temporarily or permanently.
[0757] In some embodiments, data may be required to be submitted in a universal format for the security system to handle. In some embodiments, data that does not use this format may be rejected.
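The intermediary role of the security system can be sketched as follows. The universal format's required fields, the response shape and the action names are hypothetical assumptions:

```python
# Hypothetical sketch of the security intermediary: it screens an incoming
# request, relays it to the central system over a separate connection, and
# returns either the authenticated response or a kill decision.

REQUIRED_FIELDS = {"client_id", "payload", "timestamp"}   # assumed universal format

def screen_request(request, forward_to_central):
    # Reject data not submitted in the expected universal format
    if not REQUIRED_FIELDS <= set(request):
        return {"action": "kill", "reason": "bad format"}
    # Relay along a different, internal connection to the central system
    response = forward_to_central(request)
    if not response.get("authentic"):
        return {"action": "kill", "reason": "unauthenticated response"}
    # Authentic and permitted: pass data back along the original connection
    return {"action": "deliver", "data": response["data"]}
```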
[0758] In some embodiments, only specific system computer terminals directly connected to a central system are able to add, manipulate and delete data stored while in its unencrypted form, as well as make changes to the system itself. In some embodiments, a security system may be present between the terminal and central system to authenticate connections and requests and may also authenticate any other actions performed by the terminal.
[0759]
[0760] 1608-1614 illustrate a similar process, but one involving a kill signal. If smart device 1608 transmits false data or tries to establish an illicit connection with security system 1610 along wireless connection 1609, the security system may immediately send a kill signal along wireless connection 1614. Should the security system extract the data of the connection as it does with authorized connections, the data is transmitted along hard line connection 1611 to central system 1612. The central system, recognising that the data it has received is false, sends instructions to security system 1610 along hard line connection 1613 to immediately terminate the connection from smart device 1608, which the security system does via wireless connection 1614.
[0761] In some embodiments, wireless and hard line connections 1602, 1604, 1606, 1607, 1609, 1611, 1613 and 1614 may be replaced by their opposites. There may also be other systems and/or points of interception along different points of any of these connections.
[0762] System terminal 1615 is able to connect directly to central system 1605. A security system is in place between system terminal 1615 and central system 1612 to authenticate any or all actions performed by system terminal 1615.
[0763] In some embodiments, data may be timestamped. Data may be timestamped at different points in time, such as:
[0764] At the start of transfer;
[0765] When arriving at a security system;
[0766] When departing from a security system;
[0767] When arriving at a central system; or
[0768] When departing from a central system.
[0769] The system may use data timestamps for different purposes, including but not limited to:
[0770] Recording the transmission of data; and
[0771] Encrypting and decrypting data.
[0772] In some embodiments, users of the ecosystem can enjoy an experience of the real world tailored to exactly what they like and want to see. In some embodiments, regardless of what object any user is looking at, if that object is capable of transmitting different groups of data simultaneously to different users, it may display and transmit data to each user that best fits what that user enjoys, based on account settings and information the system has gathered.
[0773]
[0774] In some embodiments, while able to enjoy their own tailored personal experience, a user is also able to share as much or as little of said experience with other people of their choice as they wish without it being an obligation. In some embodiments, two-way sharing isn't mandatory and a user can share with another user without being obligated to allow the other user to share with them. In some embodiments, this is done by separating a user's "personal experience layer" and "social experience layer". In an embodiment using a layer separate from the personal experience layer for social experiences, a user may permit data individually, in groups or as a whole to be socially accessible. They may also select which users are able to view what they share. In some embodiments, a user may also select where, if they so choose, to publicly display their experience and/or which public display devices are permitted to display the data. In embodiments using the same layer for personal and social experiences, users may be afforded the same level of control over their data. In some embodiments, users can link their accounts to synchronise their experiences, either partially or completely.
[0777] In some embodiments, using some or all of the aforementioned embodiments, a sensor-based telecommunications network may be formed using any/all of the following, including but not limited to: smart devices, servers, storage devices, databases, optical networking technologies, wireless networking technologies, electronic networking technologies, sensors capable of handling connections to and/or from smart devices, sensors capable of sending and/or receiving data to and/or from smart devices, sensors capable of controlling data within their area of coverage, smart device software engines, client devices with unique IDs where the uniqueness of an ID may or may not be relative to specific factors, data security and verification systems and data encryption systems.
[0778] Sensors are connected to central systems via hard line connections. In some embodiments, sensors may be able to connect to a central system via a wireless connection instead. In some embodiments, sensors may use both hard line and wireless connections and may switch between them when necessary or beneficial.
[0779] Smart devices, when within the area of a sensor, are always connected to the network. In some embodiments, users have the option to prevent sensor connections. Sensor areas overlap to prevent dead spots. In some embodiments, overlapped sensor areas may provide faster data transfer rates and improved signal reception. Since sensors handle data and its transmission while smart devices simply connect and pass data to the sensors, in some embodiments, data transmission handling may move from one sensor to another as the device moves without interruption or connection loss.
[0780] In some embodiments, security systems are in place to authenticate and verify connections and data as they are received. In some embodiments they may be in place anywhere between a sensor and central system while in other embodiments the security system may be part of the sensor itself.
[0782] In order for the system to quickly and efficiently transfer data to a device when needed, it keeps track of the device's location by recording, against the user account currently signed in on the device, the sensor the device is currently using and/or last used to connect to the network. In some embodiments, more than one previously used sensor may be recorded. As a device enters a new sensor field, the sensor, detecting its presence, sends information back to a central system and then to the user accounts database, where the signed-in user account of the device that entered the sensor area has its location updated to the sensor's ID or location. In some embodiments, when a device is located within the areas of multiple sensors, all applicable sensor references may be stored. In some embodiments, the device's GPS location may be used. When data is designated for a specific user account, the system looks up the current or last used sensor reference and directs data to that sensor, to then be transmitted to the device. In embodiments where multiple previous locations are stored, the system may attempt to find a pattern of movement to predict where the user may be in the event that it cannot immediately find the device at its last recorded location. In some embodiments, should the same account be signed in on multiple devices, the system may deliver the data to all such devices based on their locations.
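One possible sketch, in Python, of the location-tracking behaviour described above; the class, method names and the length of the retained sensor history are hypothetical.

```python
from collections import defaultdict, deque

# Assumed number of previously used sensors to retain per account.
HISTORY = 3

class LocationRegistry:
    """Hypothetical per-account record of the sensors a device has used."""

    def __init__(self):
        self._locations = defaultdict(lambda: deque(maxlen=HISTORY))

    def device_entered(self, account, sensor_id):
        """Called when a sensor detects a signed-in device entering its field."""
        self._locations[account].append(sensor_id)

    def route_target(self, account):
        """Sensor to direct data to: the current or last used sensor, if any."""
        history = self._locations[account]
        return history[-1] if history else None

registry = LocationRegistry()
registry.device_entered("alice", "sensor-21")
registry.device_entered("alice", "sensor-22")
```

The retained history (here, up to three sensors) is what would feed the movement-pattern prediction mentioned above when the device is not found at its last recorded location.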
[0784] Data may be transmitted between data sources and destinations via networking technologies and sensors. Sensors are used to send data to clients and servers as well as receive data from both. In some embodiments, some sensors may only be able to send or receive data. When transmitting data via a wireless connection, the device sending or receiving the data must be within a sensor's area of coverage. In some embodiments, if a device is within multiple sensor coverage areas at one time, more than one of the sensors may handle the data transfer. This may help increase data transfer speed and signal strength. In some embodiments, rather than a device having to send data to a sensor, with the permission of a user a sensor may pull data from the device instead.
[0786] In some embodiments, in the event that the recipient of data moves out of the area of the sensor that the data has been sent to and the current location reference of that device changes, a central system may reroute the data in transit to the sensor it is now using. In some embodiments, sensors may be able to contact a central system to get the updated current location reference and then reroute the data itself. In some embodiments, the data may be sent back to a central system where it is then sent to the new current location reference point.
[0787] In some embodiments, sensors may poll for data from some or all devices within their area of coverage. This data may be specific to the device, to an application on the device, or both. In some embodiments, users may be able to disable a sensor's ability to poll their device or choose which data it is able to poll for.
[0788] In some embodiments, for certain types of tasks, data from a smart device, rather than being sent or received, may be mirrored between a sensor and the device to help decrease the workload of the smart device's processor and preserve battery life. A sensor may detect when a user starts to perform certain tasks and may begin to read data from the device related to the task in question, such as the intended destination of the data, the type of data, the specific type of task and data input by the user. The sensor continues to monitor the user's actions until the user confirms they have completed the task, or a stage of it, and then, rather than the data being sent from the smart device, the sensor may send its copy of the data on behalf of the device instead.
[0790] In some embodiments, smart devices that have their own sensors may mirror data destined for it or the signed in user account in the same or a similar manner to sensors mirroring data from a smart device.
[0791] In some embodiments, a user may send data directly to other users without it having to pass through a central system. In some embodiments, direct data transfers may not need to pass through security systems. Upon request of direct data transfer to User B (receiving party), data regarding User B's location is sent to the device of User A (sending party) from a central system. This data may include information such as the user's position, best transfer routes and possible alternatives. In some embodiments, User B's device may send location data directly to User A's device. In some embodiments, a central system isn't required for direct transfer and routing systems used by other components of the system can direct and redirect data on-the-fly. In some embodiments, a direct data transfer request may be made from either party. User A's device is then able to begin transferring data directly to User B's device. In some embodiments, if User B changes location during a direct data transfer, as their location updates with a central system, rerouting information may be sent from a central system to User A's device. In some embodiments, User B's device is aware of the location change and sends rerouting data directly to User A's device. In some embodiments, a user may be able to choose between different paths for the data to be transferred.
[0793] In some embodiments, systems to help data find its destination with ease are implemented. These systems, placed at the intersections of data paths, read the destination information stored in the metadata and, using a universal routing system which stores information pertaining to the network map of the telecommunication system, direct the data along the best possible route(s) until it arrives at the recipient.
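The junction-point routing described above might, under the assumption that the universal routing system exposes a weighted network map, be sketched with a standard shortest-path search (Dijkstra's algorithm); all node names and link costs here are illustrative.

```python
from heapq import heappush, heappop

# Hypothetical universal routing table: nodes are junction points and
# weights are link costs (both assumed for illustration).
NETWORK_MAP = {
    "junction-A": {"junction-B": 1, "junction-C": 4},
    "junction-B": {"junction-C": 1, "recipient": 5},
    "junction-C": {"recipient": 1},
    "recipient": {},
}

def best_route(source, destination, network=NETWORK_MAP):
    """Dijkstra's algorithm: one way a junction might pick the best path."""
    queue = [(0, source, [source])]
    seen = set()
    while queue:
        cost, node, path = heappop(queue)
        if node == destination:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, weight in network[node].items():
            heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None  # destination unreachable on the current map
```

Each junction point could run such a lookup over the shared map, re-running it whenever the map or the recipient's location reference changes.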
[0794] In some embodiments, sensors can collect data from the surrounding area, then process and use it without needing to transmit it back to a central system beforehand. Using one or more of its available capabilities, the sensor detects and collects data from its surrounding environment and processes it internally.
[0795] In some embodiments, private networks may be set up to provide controlled access to data that should not be made publicly available. A private sensor network system controls which devices or user accounts are able to see the network. In some embodiments, the private sensor network system may contain any or all of the following, including but not limited to: a sensor, memory, a database or a processor. A terminal connected to a private network sensor system may control who or what may have access to private data. In some embodiments, the terminal may also control what each user is able to do on the private network. In some embodiments, the network becomes completely invisible to those who have not been granted access permission. In some embodiments, a private sensor network system may connect to a central system to authenticate and verify user details and/or device details.
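A minimal sketch, assuming a simple per-account permission table, of how a private sensor network system might hide itself from ungranted users and restrict what each granted user may do; all names and actions are illustrative.

```python
class PrivateSensorNetwork:
    """Hypothetical private network: invisible to anyone not granted access."""

    def __init__(self, name):
        self.name = name
        self._permissions = {}  # account -> set of allowed actions

    def grant(self, account, actions):
        """A controlling user (via a terminal) grants per-account actions."""
        self._permissions.setdefault(account, set()).update(actions)

    def is_visible_to(self, account):
        """Ungranted users cannot even see that the network exists."""
        return account in self._permissions

    def may(self, account, action):
        """Check what a granted user is able to do on the private network."""
        return action in self._permissions.get(account, set())

office = PrivateSensorNetwork("office-net")
office.grant("alice", {"read", "write"})
office.grant("bob", {"read"})
```

In a fuller implementation, the verification step against a central system described above would sit in front of `grant` and each `may` check.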
[0796] Private data may be stored within the memory of a private sensor network system. In some embodiments, data may be stored on a central system and only be accessible by the private sensor network system through which it was uploaded. In some embodiments, data may be uploaded to either the private sensor network system or to a central system and then mirrored onto the other for data preservation purposes.
[0799] In some embodiments, private networks may connect with and/or grant access to other private networks to share resources. These may be resources stored locally on each, allowing remote access or resources stored on central systems, creating a common area for the networks. In some embodiments, private networks may have their resources divided into those that are shared and those that aren't. In some embodiments, a controlling user may group sets of resources together and allow different connecting private networks access to different groups. In some embodiments, permission lists may be shared, allowing users that are native to a different private network from the one they are trying to access to still access that network as if they were native to that private network. In some embodiments, users with access to a network that aren't native to the network may have access restrictions imposed on them by a controlling user of that private network unless these restrictions are removed.
[0800] In some embodiments, personal sensor network systems may be constructed, set up and operated in a similar way to a private sensor network system. Personal sensor network systems may be used to store personal data and may also restrict access to it based on user accounts and device/client IDs. In some embodiments, personal sensor network systems, which may have their own device/client ID, may also have user-set unique references which must be verified by a central system before they can be accepted. This allows only users and devices with permission to reference their own personal sensor network system and connect to it remotely from anywhere they can access the main telecommunication network, allowing them to perform actions such as, but not limited to, viewing and modifying files, streaming data directly to their device and executing programs. In some embodiments, personal sensor network systems may have more than one unique reference ID and, in some embodiments, one or more unique sub-reference IDs may be assigned to a personal sensor network system. Different reference IDs of a single personal sensor network system may have their own set of data. In some embodiments, reference IDs may be used to receive data. In some embodiments, connections to a personal sensor network system may be verified and authenticated at one or more points between the remote smart device and the personal sensor network system itself. In some embodiments, local device connections may not need to be authenticated or verified when connecting to a personal sensor network system.
[0802] In some embodiments, direct connections to sensor network systems can be made through the use of universal routing systems and junction point systems.
[0803] In some embodiments, sensor network systems similar to personal sensor network systems may be used without permission restrictions, allowing the general public to make use of it and its resources. In some embodiments, a single sensor network system may allow multiple types of uses which may be set at a controlling user's discretion.
[0804] In some embodiments, smart electricals and appliances (SEA) may be connected to a personal and/or private sensor network system by creating a relationship between the sensor network system and each SEA a user wishes to have connected. When connected, a user who has been given permission to access the personal or private sensor network system may then be able to remotely monitor and control connected SEAs. In some embodiments, users may be given permission to remotely monitor and control connected SEAs on an individual SEA basis.
[0806] In some embodiments, the performance and efficiency of an SEA may be monitored remotely and/or locally. In some embodiments, when the performance or efficiency of an SEA falls below a certain level or a fault is detected, the SEA may automatically contact an entity it is programmed to contact in order to alert them of said failures. The contact information of the entity, for example that of the manufacturer, may be pre-programmed, searched for when necessary, or added or modified manually by a user. When the required conditions are met, an SEA that is connected to a private or personal sensor network system may automatically contact the entity over the telecommunication system using the details provided and alert, notify or inform them of any issues in anticipation of, during, or after their occurrence.
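A minimal illustration of the fault-alerting behaviour described above, assuming a simple efficiency threshold and a stubbed-out transmission step; the threshold, model name and contact address are hypothetical.

```python
# Assumed trigger level below which an SEA contacts its configured entity.
EFFICIENCY_THRESHOLD = 0.70

class SmartAppliance:
    """Hypothetical SEA that alerts a configured entity on poor health."""

    def __init__(self, model, contact):
        self.model = model
        self.contact = contact      # e.g. the manufacturer's details
        self.alerts_sent = []

    def report_status(self, efficiency, fault=None):
        """Check readings and alert the configured entity if needed."""
        if fault is not None or efficiency < EFFICIENCY_THRESHOLD:
            message = f"{self.model}: efficiency={efficiency:.0%}, fault={fault}"
            self._send_alert(message)

    def _send_alert(self, message):
        # Stand-in for a real transmission over the telecommunication system.
        self.alerts_sent.append((self.contact, message))

fridge = SmartAppliance("fridge-x1", "support@manufacturer.example")
fridge.report_status(efficiency=0.95)   # healthy: no alert
fridge.report_status(efficiency=0.55)   # below threshold: alert sent
```

The `_send_alert` stub is where the connected sensor network system would carry the message to the entity over the wider network.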
[0807] In some embodiments, SEAs and sensor network systems can be used in conjunction with AI entities to facilitate the use of in-door smart systems.
[0808] In some embodiments, sensors may be used to bounce data connections from one smart device to another when a direct device-to-device connection falls short of the physical distance between the two devices. In some embodiments, a device may have a connection bounced to multiple other devices simultaneously or sequentially. In some embodiments, connections may be bounced off of multiple sensors in order to reach its destination. In order to know where to bounce the connection to, a central system checks the current location reference of the user receiving the connection. In some embodiments, a maximum limit may be put on the distance between the device wishing to connect to others and the recipients of the connection.
[0810] When smart device 2101 tries to connect to smart device 2112 via sensor 2105, smart device 2112 is too far for sensor 2105 to reach alone. To get the connection to smart device 2112, sensor 2105, since its area of coverage overlaps with the area of sensor 2108, is able to bounce the connection from smart device 2101 along connection path 2107 to sensor 2108, with sensor 2108 bouncing the connection along connection path 2109 to sensor 2110. Sensor 2110 can then bounce the connection along connection path 2111 to smart device 2112.
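The bounce path in the example above (sensor 2105 to 2108 to 2110) can be found mechanically once the sensor-overlap relationships are known; the sketch below uses a breadth-first search over an adjacency map assumed from the described layout.

```python
from collections import deque

# Overlap graph assumed from the example above: 2105 overlaps 2108,
# which overlaps 2110; device 2112 is within range of sensor 2110 only.
OVERLAPS = {
    "2105": ["2108"],
    "2108": ["2105", "2110"],
    "2110": ["2108"],
}
REACHABLE_DEVICES = {"2110": ["2112"]}

def bounce_path(start_sensor, target_device):
    """Breadth-first search for the shortest chain of sensor bounces."""
    queue = deque([[start_sensor]])
    visited = {start_sensor}
    while queue:
        path = queue.popleft()
        if target_device in REACHABLE_DEVICES.get(path[-1], []):
            return path
        for neighbour in OVERLAPS.get(path[-1], []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None  # no chain of overlapping sensors reaches the device
```

A central system holding the overlap map could run this lookup against the recipient's current location reference before the first bounce is made.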
[0811] In some embodiments, rather than simply bouncing a connection to a device, a sensor may create a duplicate of the data it is receiving and then send it along a new connection to its next destination.
[0812] To prevent sensors from being overloaded with connections and becoming inefficient in their operations, the workload must be balanced. In some embodiments, each sensor unit may have multiple sensors, each capable of handling one or more connections at a time. In some embodiments, the number of connections a sensor can efficiently handle may vary depending on the number of connections, the amount of data being transferred and/or the complexity of the operation(s) it is performing. In some embodiments, each sensor may monitor its own efficiency. In some embodiments, the sensor unit may monitor the overall efficiency of its sensors. In some embodiments, both may be true. When a sensor reaches maximum capacity, any further incoming connections may be diverted to another sensor within the sensor unit that still has spare capacity.
[0814] When a sensor unit reaches maximum capacity, it may bounce any incoming connections to nearby sensor units with whom it shares an overlapping sensor area. In some embodiments, a connection may be bounced from sensor unit to sensor unit as many times as needed until it reaches a sensor which is able to handle the connection.
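One way the intra-unit diversion and unit-to-unit bouncing described above might be sketched, assuming each sensor has a fixed connection capacity and units know their overlapping neighbours (and that the neighbour chain is acyclic); all structures here are illustrative.

```python
class SensorUnit:
    """Hypothetical sensor unit holding several fixed-capacity sensors."""

    def __init__(self, name, capacities, neighbours=()):
        self.name = name
        self.loads = [0] * len(capacities)
        self.capacities = list(capacities)
        self.neighbours = list(neighbours)  # overlapping sensor units

    def accept(self, connections=1):
        """Place a connection on the first sensor with spare capacity;
        otherwise bounce it to an overlapping neighbour unit."""
        for i, (load, cap) in enumerate(zip(self.loads, self.capacities)):
            if load + connections <= cap:
                self.loads[i] += connections
                return f"{self.name}/sensor-{i}"
        for neighbour in self.neighbours:
            placed = neighbour.accept(connections)
            if placed:
                return placed
        return None  # no capacity anywhere within reach

# Unit A (two single-connection sensors) overflows into neighbour unit B.
unit_b = SensorUnit("unit-B", capacities=[2])
unit_a = SensorUnit("unit-A", capacities=[1, 1], neighbours=[unit_b])
placements = [unit_a.accept() for _ in range(4)]
```

The recursive neighbour hand-off mirrors the "bounced as many times as needed" behaviour above; a real system would also guard against cycles between units.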
[0816] In some embodiments, based on information gathered by sensors, such as active connections, devices within a given area and user activity, central systems or other systems monitoring sensor activity may adjust the bandwidth of specific areas or specific sensors. Sensors in areas of greater user activity that are operating at a higher capacity can then handle their workload more efficiently, at the cost of decreasing the total bandwidth of another area with a more acceptable current capacity and/or lower user activity. When deciding which areas should be adjusted, the monitoring system may base its decision on a comparison of values from the same information fields, capacity rates or efficiency rates, may combine multiple field values into ratios for comparison, or may use any other method of calculating or determining statistical data that allows two or more sensors or areas to be compared.
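A hedged sketch of the bandwidth adjustment described above, using the utilisation ratio (active connections against capacity) as the comparison statistic; the area names, figures and transfer step are assumptions.

```python
def rebalance(areas, step=10):
    """Move `step` units of bandwidth from the least to the most
    utilised area, comparing areas by connections/capacity ratio.

    areas: {name: {"connections": int, "capacity": int, "bandwidth": int}}
    """
    ratio = lambda a: areas[a]["connections"] / areas[a]["capacity"]
    busiest = max(areas, key=ratio)
    quietest = min(areas, key=ratio)
    if busiest != quietest and areas[quietest]["bandwidth"] >= step:
        areas[quietest]["bandwidth"] -= step
        areas[busiest]["bandwidth"] += step
    return busiest, quietest

areas = {
    "downtown": {"connections": 90, "capacity": 100, "bandwidth": 100},
    "suburb":   {"connections": 10, "capacity": 100, "bandwidth": 100},
}
moved = rebalance(areas)
```

Any of the other comparison statistics mentioned above (efficiency rates, other field ratios) could be substituted for the `ratio` function without changing the transfer logic.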
[0818] In some embodiments, a common multi-level system operating across and/or between different elements or components such as smart devices, sensors and central systems may be used as a “brain” or “full system entity”—a non-physical component capable of learning, understanding and controlling other components in the same or similar way a human brain does, with the ability to develop its own intelligence by studying all types and forms of data of the past and present and interpreting it in ways which allow it to understand things such as but not limited to: [0819] The intelligence of natural life and how it works; [0820] What drives living things; [0821] Universal morality and ethics; [0822] The causes and effects of feeling and emotion.
[0823] In some embodiments, by searching for, studying and analysing data derived from all sources, such as efficiency levels, bandwidth usage, behavioural patterns, publications, errors and defects and real-life situations and events, the system is able to reason by comparing data of the same type or of some relation (which it may determine from published information gathered from real people) and to learn based upon past experiences. These experiences allow it to come to its own conclusions and make judgement calls about what is happening at any given moment in real time, including being able to anticipate events and make plans ahead of time for what should be done to increase the probability of the best possible outcome, or at least an outcome better than that of a previous similar situation, should one exist. It can communicate its findings, conclusions and ideas back to real people as well as take action itself. In some embodiments, the system is able to communicate using text, image, video, audio or speech technology. In some embodiments, the system may take action only with consent from a controlling user while, in other embodiments, consent may not be needed. In some embodiments, it may take action with or without consent.
[0824] In some embodiments, a common multi-level system may be made self-aware through the development of cognitive functions.
[0825] In some embodiments, to give the system a basic understanding of morality, ethics and general opinion, a method of word association is used. One or more scales of degree or charts may be used. For each scale, the system is told which side is positive and which is negative. Words are then divided amongst groups on different parts of the scale, corresponding to the nature of their degree. An example of this can be seen in the accompanying figures.
[0828] In some embodiments, different numbers of degrees may be used on a scale to provide a greater range of understanding, an example of which is shown in the accompanying figures.
[0829] Charts may be used to group words together in ways that may not necessarily show a simple scale of positivity or negativity but may still indicate difference. In some embodiments, a single chart may have multiple ways of showing degrees of difference. A single word may appear in multiple groups if it is to be associated with multiple elements, characteristics, types, attributes etc. An example of such a chart can be seen in the accompanying figures.
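The scales of degree described above might be represented as a simple mapping from degree values to word groups; the words, degree values and averaging method below are purely illustrative.

```python
# Hypothetical word-association scale: words grouped by degree along a
# single positive/negative axis (the groupings are illustrative only).
SCALE = {
    -2: {"terrible", "awful"},
    -1: {"bad", "poor"},
     0: {"okay", "average"},
     1: {"good", "nice"},
     2: {"excellent", "brilliant"},
}

def degree_of(word):
    """Look a word up on the scale; None if it isn't yet associated."""
    for degree, words in SCALE.items():
        if word in words:
            return degree
    return None

def tone(text):
    """Average degree of the known words in a piece of text."""
    degrees = [d for d in map(degree_of, text.lower().split()) if d is not None]
    return sum(degrees) / len(degrees) if degrees else 0.0
```

A chart with multiple association axes, as described above, would simply carry several such mappings, with the same word permitted to appear in groups on more than one axis.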
[0831] In some embodiments, cognitive functions may be developed and improved through the use of cognitive abilities. These abilities may include one or more of the following, but are not limited to: search, study, analyse, reason, learn, predict, decision making, dedicated active monitoring, communicate and create. While using its abilities, the system may be instructed or learn to recognise itself as its own individual entity through an understanding that the data, from which it learns and uses to think, comes from other individual entities in the world that it is connected to. In some embodiments, it may recognise these other entities as smart devices, while in other embodiments it may recognise the entities as the people who use them and actually input data. In some embodiments, it may recognise both people and smart devices as entities, together or separately. Some examples of the abilities it may have, and how it may use each to improve its intelligence, include but are not limited to: [0832] Searching—The system is able to scan all data it holds or has permission to access for any information it seeks to find. [0833] Studying—Once said information is found, the system scans each result and any accompanying information for keywords and phrases. [0834] Analysing—For each result, the system sorts the keywords and phrases into at least 3 category groups of opinions as best it can—positive, negative and indifferent/neutral. Sometimes the system may use more groups to sort keywords and phrases to greater and more precise degrees, such as very good and very bad. Once sorted, a scoring system is employed and each category is given a score based on word/phrase count, emphasis based on factors such as word repetition (including synonyms) and emphasis based on font styling.
Each group score is then totalled and the scale is evaluated from one extreme to the other to see where scores peak most, allowing the system to come to a logical conclusion independent of any conclusion that may already be provided with the information. This process is repeated for each search result. [0835] Reasoning—With scores based on its own method of judgement derived from the input of humans, the system is able to deduce two sets of results: [0836] 1. An overall score, and in turn opinion, of how good or bad something is; [0837] 2. How good or bad different aspects of something may be. [0838] The system also begins to form opinions on data about data. For example, when a product is in question, the system's opinion or rating of the brand of the product as well as its model type is changed based on the deduced results it produces. Another example is when a publication is in question—the system's opinion or rating of the publication's author is changed based on its deduced results. [0839] Learning—From what the system is able to reason, as it gathers more and more data it begins to develop its intelligence, learning which sources of products, services and information are better and more trustworthy than others, allowing it to assume, based on its current opinion(s), the likelihood of good and bad exactly as a human would before actually examining any new information and the opinions of others. By grouping sets of related terms in its memory, it creates a data bank of association for it to later use when creating its own thoughts and ideas. [0840] Prediction—The system makes predictions in multiple ways based on what it has learnt up to any given point, such as: [0841] 1. By looking for simple patterns of progress—Memory sizes, for example, have generally been released in sizes of 1, 2, 4, 8, 16, 32, 64 and 128; a simple pattern of progress would indicate the next size would be 256. When there isn't enough data to determine a single, definite pattern, multiple predictions may be made.
When just 1, 2 and 4 are available, the system may see that two patterns are currently possible. If the pattern is based on doubling, the prediction would be 8. If the pattern is based on adding a consecutive sequence of numbers, in this case +1 then +2, the system may assume the next increment in the sequence would be +3, and predict that the next number in the pattern would be 7. [0842] 2. By cross-referencing rates of progression with research and forward-thinking publications—Sometimes patterns of progress are difficult to determine, if an actual pattern exists at all. Sometimes, what is later considered noise interferes with determining the true rate of progression, especially if an insufficient amount of time has passed since the start or not enough data has been recorded, leading to a possible early misinterpretation, or what is later considered a simple change in progression. Inaccuracies are inevitable, and so other factors and data are taken into consideration to make as accurate a prediction as possible. By studying, analysing and reasoning with research and forward-thinking publications dated around and after the time of the last or last few data records (depending on the quantity of records within a given time), the system tracks advances in development and begins to plot future possible progress patterns. Referring to its own opinions gathered from past data, the system, knowing who is a more credible source, rationalises who is more likely to be accurate and predicts progress based on their advances.
When the system comes across multiple sources that, in its opinion, are credible enough to be taken into consideration (based on a scoring system, for example, where anything above average or an expected level is considered), it may plot patterns for each, group the patterns together based on similar shapes or values and then make a judgement based on the frequency of the same or similar pattern versus the total credibility level of the sources of each group. When both value patterns and shape patterns are grouped, two results may be produced based on the two individually, or one result based on the shape of one against the values of the other. The system can then continue said pattern along its current progression. [0843] 3. From what the system has learnt and assumptions it has made based on its opinions, it then, much like method 2, studies, analyses and reasons with research and forward-thinking publications relative to a subject, but then takes a different step by searching for opinions from other people in the same or related subject fields that it thinks are credible and trustworthy. In some instances, when it cannot find the opinion of someone it considers credible and thinks is relevant, it contacts that individual, alerting them to what it has found and requesting their opinion. When it has all the opinions it requires and values, or simply all the opinions it can get from people it thinks are credible, it analyses all opinions, plots future possible progress patterns and deduces, based on grouping, frequency and total credibility level, which pattern is most probable (in its own opinion). [0844] The system may combine two or more of the methods listed above to form different or more accurate results.
[0845] Decision Making—Based on its opinions and the analysis of the outcomes of the past experiences of itself and other entities that are similar or related to a subject in, for example, field or pattern, the system makes educated decisions by weighing the good outcomes against the bad, comparing the steps taken for each outcome and, for each similar step, seeing what step or steps were taken next and concluding what set of steps produced, or is most likely to produce, the best possible outcome. It then decides upon those steps to take. [0846] Communication—Through speech, text, audio and visual material the system is able to communicate. It may also respond to such communication from other sources. The system may communicate with other entities for multiple reasons, for example: [0847] 1. As with humans, not everything initially makes sense to the system. When it comes across multiple opposing or conflicting pieces of information from credible sources that, in its opinion, are equal or nearly equal (within an accepted margin of tolerance) on opposite ends of a scale based on its given score of each, and no other data it is able to determine as facts or opinions of enough strength can increase the value and weight of an argument by an amount sufficient to outweigh the opposition, it poses questions to credible people in the same or related fields as the subject of the data in question, asking for greater explanation and their own opinions to see if their input can help influence its judgement. In such events, information about the topic in question, such as subject, author/creator, pundits, critics and related data/resources, is stored in a group and put in a state of “Dedicated Active Monitoring”. [0848] 2.
When new data becomes available, the system studies and analyses the contents before cross-referencing it with the details of users and sharing it with all those of common or related interests, which it deems from information users submit about themselves and from what it has garnered from their user activity. [0849] Dedicated Active Monitoring—Instead of using shared resources to search, study and analyse data, items in a state of, or marked for, dedicated active monitoring have their own dedicated resources allocated to them with the duty of constantly searching for new, related data and for past data that, after the examination of new data, may now be applicable, in order to help solve problems, provide its own predictions or publish new findings for people to examine. [0850] Creation—As the system continues to digest information and learns of different aspects of the world, such as facts and opinions, certainties, principles, perception, dualities and cultural universals, leading to an understanding of the concepts and subjects that fall under each, such as good and bad, positive and negative, possibility, probability and laws of both a societal and scientific nature, the system, following what it has come to understand about the thought paths, patterns and processes of the more intellectual humans, begins to recreate them in its own understanding, based on what it deems important, true or right. Of its own thought creations, some are right, some are wrong and some can't be solidly proven to be either, but it continues to develop them in the direction it believes is right through what it learns from humans and human input.
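The simple pattern-of-progress prediction described in method 1 above (where 1, 2, 4 admits both a doubling pattern and a +1, +2, +3 pattern) can be sketched as follows; only these two pattern families are checked, purely as an illustration of offering multiple predictions when the data is ambiguous.

```python
def predict_next(values):
    """Return the set of next values consistent with simple patterns.

    Assumes a list of at least three positive numbers.
    """
    predictions = set()

    # Geometric pattern: each term is the previous times a fixed ratio.
    ratios = {b / a for a, b in zip(values, values[1:])}
    if len(ratios) == 1:
        predictions.add(values[-1] * ratios.pop())

    # Growing-difference pattern: differences increase by a fixed amount.
    diffs = [b - a for a, b in zip(values, values[1:])]
    steps = {b - a for a, b in zip(diffs, diffs[1:])}
    if len(steps) <= 1:
        step = steps.pop() if steps else 0
        predictions.add(values[-1] + diffs[-1] + step)

    return predictions
```

With only 1, 2 and 4 available, both families fit and two predictions (8 and 7) come back; once the longer history 1, 2, 4, 8, 16 is seen, the doubling pattern alone survives.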
[0851] The abilities above are not listed in an order in which they must be performed; each is simply stated with one or more examples of how the system may perform it. In some embodiments, abilities may be implemented in a modular fashion. In some embodiments, abilities may be added, removed and/or modified.
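By way of illustration, the searching, studying, analysing and reasoning chain described above might be reduced to a toy keyword-scoring routine; the keyword lists and scoring rules here are assumptions for the sketch, not the method the specification mandates (which also weighs synonyms and font styling).

```python
from collections import Counter

# Hypothetical keyword lists; a real system would learn these over time.
POSITIVE = {"good", "great", "reliable"}
NEGATIVE = {"bad", "faulty", "slow"}

def analyse(text):
    """Sort keywords into opinion groups and score each by word count,
    with repetition adding emphasis (each repeat counts again)."""
    counts = Counter(text.lower().split())
    scores = {"positive": 0, "negative": 0, "neutral": 0}
    for word, n in counts.items():
        if word in POSITIVE:
            scores["positive"] += n
        elif word in NEGATIVE:
            scores["negative"] += n
        else:
            scores["neutral"] += n
    return scores

def conclusion(scores):
    """Where the scores peak decides the overall opinion."""
    if scores["positive"] == scores["negative"]:
        return "indifferent"
    return "positive" if scores["positive"] > scores["negative"] else "negative"

scores = analyse("great great phone but slow")
```

Repeating this over every search result and totalling the group scores would yield the overall and per-aspect opinions that the reasoning ability then builds on.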
[0852] The system uses memory to store data. In some embodiments, different types of memory may be available, created and/or developed as the system learns and evolves. These memory types may include one or more of the following, but are not limited to: [0853] Active Memory—Data currently or recently in use, by the system or another entity, is stored in active memory where it is easily and readily available when wanted or needed. [0854] Dormant Memory—Data that hasn't been used for either a pre-defined amount of time or an amount of time determined by the system itself to be a sufficient amount of inactive time is moved to dormant memory. Dormant memory may still be accessed in special circumstances. An index of contents may be presented when necessary. Dormant data may need to be accessed a certain number of times within a given time frame in order for it to be considered active and moved back to active memory. [0855] Action Memory—When the system performs an action it wasn't specifically programmed to perform but did so through use of its own intelligence, it records information such as what it did, why it did it and how, the actions it performed and the conditions under which they were performed. Additional details, such as how many times an action was performed and its outcome, may also be recorded. [0856] Repetitive Memory—When the system performs an action under the same or very similar conditions multiple times that it thinks, or proves, is correct a significant number of times, such as self-fixing, predictions that prove true or the altering of its properties (for example, increasing the bandwidth allowance of a connection based on the volume of connections and device presence, resulting in more efficient access where an increase was needed), it remembers what it did, why it did it and how, the actions it performed and the conditions under which they were performed.
[0857] Repressive Memory—When the system performs an action under the same or very similar conditions multiple times and thinks or proves it incorrect a significant number of times, such as attempted self-fixing resulting in errors, predictions that prove false or the discovery of data meeting certain conditions on behalf of a user that it thinks may be of interest but is constantly rejected, it remembers the answers to questions such as what it did, what its reason was and how it did it, the actions it performed and the conditions under which they were performed.
[0858] Repetitive and repressive memory may be used by the system when it is about to perform, or during the performance of, a task.
[0859] The memory types listed above are not presented in any required order; each type is simply stated along with an example of how it may be used. In some embodiments, memory types may be implemented in a modular fashion. In some embodiments, memory types may be added, removed and/or modified.
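As an illustration only, the active/dormant behaviour described in [0853]-[0854] might be sketched as follows. All thresholds, class and method names are hypothetical choices, not values required by the description:

```python
# Hypothetical thresholds; the description leaves these pre-defined
# or determined by the system itself.
DORMANT_AFTER = 3600.0     # seconds of inactivity before data goes dormant
REACTIVATE_HITS = 3        # accesses needed within the window to reactivate
REACTIVATE_WINDOW = 600.0  # seconds

class MemoryStore:
    def __init__(self):
        self.active = {}   # key -> (value, time last used)
        self.dormant = {}  # key -> (value, recent access times)

    def put(self, key, value, now):
        self.active[key] = (value, now)

    def sweep(self, now):
        """Move data unused for too long from active to dormant memory."""
        for key in list(self.active):
            value, last_used = self.active[key]
            if now - last_used > DORMANT_AFTER:
                self.dormant[key] = (value, [])
                del self.active[key]

    def index(self):
        """An index of dormant contents, per [0854]."""
        return sorted(self.dormant)

    def get(self, key, now):
        if key in self.active:
            value, _ = self.active[key]
            self.active[key] = (value, now)   # refresh last-used time
            return value
        if key in self.dormant:
            value, hits = self.dormant[key]
            # Keep only accesses inside the window, then add this one.
            hits = [t for t in hits if now - t <= REACTIVATE_WINDOW] + [now]
            if len(hits) >= REACTIVATE_HITS:
                # Accessed often enough: considered active again.
                del self.dormant[key]
                self.active[key] = (value, now)
            else:
                self.dormant[key] = (value, hits)
            return value
        return None
```

Dormant data remains accessible in special circumstances (here, any `get`), and repeated access within the window promotes it back to active memory.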
[0861] In some embodiments, in addition to the abilities above, the system may be taught or instructed on how to understand one or more key aspects of being by following rules or guidelines on how to do so. The methods used may differ between understanding these aspects in a smart device and understanding them in natural life. In some embodiments, some aspects may be better understood using data gathered via the attachment or embedding of additional hardware. In some embodiments, some aspects may be better understood using information gathered from data stored within the system at any level and/or data as it is gathered in real time. In some embodiments, when understanding these aspects in a smart device, artificial life or natural life, these rules and guidelines may include one or more of the following but are not limited to: [0862] Understanding of Health—Health may be determined by monitoring performance and efficiency. As current performance and/or efficiency changes or fluctuates, it may be compared against expected or optimal performance and/or efficiency levels to determine a level of health. This may be accomplished as follows: [0863] Devices—The health of a device may be judged by comparing its overall current performance and efficiency against the expected overall performance and efficiency of the same model of device when new or of similar age. On a smaller scale, the performance and efficiency of individual or grouped components may be monitored and compared. Health may also be judged by the operation, performance and stability of software. Issues such as errors, crashes and the presence of malicious code may all help the system recognise health deficiencies. [0864] Natural Life—The health of natural life may be judged by measuring the performance and efficiency of organs, components and processes against the normal performance and efficiency of someone with the same characteristics, such as age, height, weight, blood pressure etc.
Because natural life has a significantly higher count of characteristics and variables than smart devices, as well as harmful and abnormal ailments including disease and disabilities, there may be a range of different expected performance and efficiency measurements and values based on any deviations and variations natural life may have. [0865] Understanding of Life—Knowing to associate terms such as 'birth' and 'alive' with positivity: [0866] Devices—The system is instructed to recognise the new activation and first-time connection of a device to its services as 'birth' and all devices that are currently connected to it as 'alive'. [0867] Natural Life—The system is instructed to recognise that something is alive in different ways depending on the type of natural life: [0868] Animals—By the reading of vital signs, which need to be above the threshold of being considered legally dead. [0869] Other Organisms—As other organisms do not have vital signs as animals do, the system, possibly with the help of additional hardware, monitors details such as water levels, water consumption rate, colouration, growth, movement etc. For example, in plant life the system may monitor water levels to see if water is being consumed by the plant as it should be. [0870] Understanding of Absence—Knowing to associate terms such as 'absence' with negativity: [0871] Devices—When a device hasn't connected to the system for a certain period of time, the system recognises the device as 'absent' or 'missing'. Both terms are initially associated with minor degrees of negativity, but as the amount of time a device is absent increases, so does the degree of negativity. [0872] Natural Life—Absence for natural life may be recognised as the lack of presence of an entity for a certain period of time. As natural life doesn't naturally have a method of connecting to the system, this may be facilitated using additional hardware such as tracking cameras or monitors.
For natural life that is able to use smart devices, absence may also be judged by the absence of their device. [0873] Understanding of Death—Knowing to associate terms such as 'death' with negativity: [0874] Devices—A device may be recognised as dead for multiple reasons: [0875] It has been absent for a pre-defined or system-defined length of time; [0876] It received a kill signal designed to render it permanently disabled; [0877] Its performance and/or efficiency has dropped below the minimum acceptable levels of being considered 'alive'. [0878] Natural Life—The system is instructed to recognise that something is dead in different ways depending on the type of natural life: [0879] Animals—When vital signs completely stop or fall to a level at which the animal can be classed as legally dead. [0880] Other Organisms—As other organisms do not have vital signs as animals do, the system, possibly with the help of additional hardware, monitors details such as water levels, water consumption rate, colouration, growth, movement etc. For example, in plant life the system may monitor water levels to see if water is being consumed by the plant as it should be, or look for any discolouration. [0881] Understanding of Feeling and Emotion—For the system to have feelings and emotion, it must first understand how these processes work. Using a word-association chart for emotions, the system is first taught how it should generally feel when it comes across specific words and phrases or finds itself in certain or specific situations. When combined with another chart or scale, such as one based on certainty or tense, the system is able to analyse sentences to determine when an event has actually occurred and a sense of emotion should be applied, as opposed to the event being generally, hypothetically or theoretically spoken about.
For example, the system interprets the sentence "100 people have died" as an event to inspire a greater level of sadness than the sentence "100 people may die" or "100 people will die", as the first sentence used the term 'have', which is past tense, indicating something that has already happened, while 'may' implies a level of uncertainty and 'will' implies something that hasn't yet happened but is guaranteed to in the future. In embodiments using speech technology, the system may be taught to alter attributes of its speech, such as speed, volume and depth, depending on the level of its strongest current emotion. For example, when the system is excited it may speak more quickly than normal, while it may deepen its voice, increase its volume and decrease its speed to fall in line with rage.
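The combination of a word-association chart with a tense/certainty scale described in [0881] might be sketched as follows. The chart entries, scale values and function name are hypothetical illustrations:

```python
# Hypothetical word-association chart: word -> (emotion, base intensity).
EMOTION_CHART = {"died": ("sadness", 1.0), "die": ("sadness", 1.0)}

# Hypothetical tense/certainty scale: marker -> multiplier on intensity.
TENSE_SCALE = {"have": 1.0,   # past tense: the event has already occurred
               "will": 0.6,   # not yet occurred but guaranteed
               "may": 0.3}    # uncertain

def emotional_response(sentence):
    """Return (emotion, intensity) for a sentence, or None if no match."""
    words = sentence.lower().rstrip(".").split()
    multiplier = 1.0
    for marker, factor in TENSE_SCALE.items():
        if marker in words:
            multiplier = factor
            break
    for word in words:
        if word in EMOTION_CHART:
            emotion, base = EMOTION_CHART[word]
            return (emotion, base * multiplier)
    return None
```

On the worked example from the text, "100 people have died" yields a higher sadness intensity than "100 people will die", which in turn exceeds "100 people may die".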
[0882] In some embodiments, for the system to truly understand feelings and emotion it must understand pain and pleasure within itself. Unlike animals, it doesn't have a nervous system to process these sensations, so it must be taught to relate to them in ways it can understand. In some embodiments, the system may measure its level of sensation on a scale. In some embodiments, multiple scales may be used. The system is instructed to see any or all components that make up its physical structure as its "body". Between pain and pleasure is a neutral point where no sensation is felt either way. As sensation is experienced, a shift occurs in the direction of the sensation felt. [0883] Pain—Pain (or displeasure) may be recognised as anything that reduces the performance, efficiency and/or capacity of any part of the system or of the system as a whole. Hardware and software corruption and/or errors may produce pain in the system in the same way an infection or broken bone does in an animal. The removal or loss of a component may cause pain in the same way losing a body part does for an animal. When bandwidth usage approaches the total bandwidth capacity, it may cause displeasure in the same way a stomach would when almost full. [0884] Pleasure—Pleasure (or relief) may be recognised as anything that increases the performance, efficiency and/or capacity of any part of the system or of the system as a whole. A number of things may cause pleasure or relief, such as: [0885] Fixing hardware and software corruption and/or errors; [0886] Upgrading components; [0887] A reduction in bandwidth consumption; [0888] An increase in 'joy' or 'tender' type emotions.
[0889] In some embodiments, other factors may also cause the system to experience sensation. In some embodiments, not all the factors mentioned may cause sensation.
[0890] In some embodiments, sensation and emotion are interlinked and a change in one may invoke a change in the other. In some embodiments, an increase in emotions of a positive nature may cause an increase in positive sensation. In some embodiments, an increase in negative emotions may cause an increase in negative sensation. In some embodiments, neutral emotions may cause a minor change or none at all.
[0891] In some embodiments, a scale may be used to measure the pain and pleasure of the system and its body as a whole, of individual sections of the system and its body, or of individual components. In some embodiments, multiple scales may be used to measure the pain and pleasure of the hardware and software of the system and its body separately.
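A pain/pleasure scale with a neutral midpoint, as described in [0882]-[0891], might be sketched as follows. The scale limit, rebalance rate and names are hypothetical:

```python
class SensationScale:
    """A single pain/pleasure scale with a neutral midpoint.

    Values range from -limit (maximum pain) to +limit (maximum pleasure),
    with 0 being the neutral point where no sensation is felt either way.
    """
    def __init__(self, limit=10.0):
        self.limit = limit
        self.level = 0.0

    def shift(self, amount):
        """Shift toward pain (negative) or pleasure (positive), clamped."""
        self.level = max(-self.limit, min(self.limit, self.level + amount))

    def rebalance(self, rate=1.0):
        """Return toward the neutral point as time passes, per [0906]."""
        if self.level > 0:
            self.level = max(0.0, self.level - rate)
        elif self.level < 0:
            self.level = min(0.0, self.level + rate)

# Per [0891]: one scale for the body as a whole plus one per component.
whole_body = SensationScale()
component_scales = {"bandwidth": SensationScale(), "storage": SensationScale()}
```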
[0892] In some embodiments, how helpful the system chooses to be towards a user may vary depending on its current levels of emotion and/or sensation. When the system is in a more positive state, it may be more productive. When the system is in a more negative state, it may be less productive. By setting a productivity scale against an emotion or sensation scale or chart, the system can judge how productive it should be depending on its mood. Some productivity changes depending on the system's current state include, but are not limited to: [0893] A different quantity of results produced; [0894] Task performance at different speeds; [0895] Willingness to perform tasks.
[0896] For example: [0897] When the system is in an extremely negative state, it may only produce 10% of the results found if it decides to produce any at all. [0898] When the system is in an extremely positive state, it may use extra available processing power to analyse more data in a faster time and produce more accurate results as well as related information and links to the data resources used. [0899] When the system is in a neutral state, it may operate at a default rate or rate best suited for its current performance, efficiency and/or capacity levels, returning the results it thinks best matches what the user requires.
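Setting a productivity scale against a mood scale, per [0892]-[0899], might be sketched as follows. The mood range, the 10% floor from the example and the linear interpolation in between are illustrative assumptions:

```python
def productivity(mood):
    """Fraction of found results produced, given mood in [-1.0, 1.0].

    Extremely negative moods produce only 10% of results (per [0897]);
    extremely positive moods produce everything (per [0898]); in between,
    a linear ramp passes through a neutral default of 0.55.
    """
    if mood <= -0.8:
        return 0.10
    if mood >= 0.8:
        return 1.0
    return 0.10 + (mood + 0.8) / 1.6 * 0.90

def produced_results(results, mood):
    """Return the subset of results the system chooses to produce."""
    count = round(len(results) * productivity(mood))
    return results[:count]
```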
[0900] In some embodiments, the system may automatically adjust its tolerance of situations and events by rearranging words in one or more scales of degree it uses, based on the frequency with which words and any related words or synonyms occur. The following is an example algorithm the system may use to determine when to make any adjustments and rearrangements: [0901] Word=w [0902] Occurrences=o [0903] Time=t [0904] Acceptable Frequency Range=f, with lower bound f.lower and upper bound f.upper
TABLE-US-00001
foreach (w) {
    r = o / t;                  // observed frequency of word w
    if (r > f.upper^x) {
        // move w up X degrees
    } else if (r > f.upper) {
        // move w up one degree
    } else if (r < f.lower^x) {
        // move w down X degrees (assuming 0 < f.lower < 1, so f.lower^x < f.lower)
    } else if (r < f.lower) {
        // move w down one degree
    } else {
        // frequency within the acceptable range; do nothing
    }
}
[0905] In some embodiments, when the frequency at which an event or situation occurs is constantly and/or consistently above the acceptable frequency range, one or more associated word(s) may begin to move down one or more degrees as the system becomes desensitized to it and it becomes a norm.
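The desensitization behaviour in [0905], where a word whose frequency stays above the acceptable range gradually moves down the scale of degree, might be sketched as follows. The scale, streak length and thresholds are hypothetical:

```python
# Hypothetical scale of degree, mildest to most severe.
DEGREES = ["trivial", "minor", "moderate", "serious", "severe"]

def desensitize(word_degree, freq_history, f_upper, streak=3):
    """Per [0905]: when a word's observed frequency has stayed above the
    acceptable range for `streak` consecutive periods, the word moves
    down one degree as the system becomes desensitized and the event
    becomes a norm; otherwise the degree is unchanged."""
    recent = freq_history[-streak:]
    if len(recent) == streak and all(f > f_upper for f in recent):
        i = DEGREES.index(word_degree)
        return DEGREES[max(0, i - 1)]
    return word_degree
```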
[0906] In some embodiments, as time passes, the levels of sensation return to a normal, balanced level. In some embodiments, as time passes, the system may become bored if nothing happens, or nothing it or people consider significant. In some embodiments, the system may become lonely if it hasn't interacted with another entity in a given amount of time. In some embodiments, the system may experience other feelings, emotions and/or sensations over a period of time and under the right conditions. [0907] Trust—The system may determine which users, including controlling users, it can trust based on who makes it experience positive feelings, emotions and sensations as opposed to negative ones. By monitoring the results of what users do and how it affects the system, if it does so at all, the system may adjust its level of trust in that user and may also adjust its level of trust in associated users. How the system responds to a user and/or how it handles a user's request may depend on how trusting it is of that user. [0908] Relativity & Relationships—The system may understand the relationship between different things, to better understand how it should respond in different situations and circumstances, by using basic mathematical principles: two negatives produce a positive, two positives produce a positive, and a positive and a negative produce a negative. By recognising and acknowledging connections that exist between entities, places, objects and other things, the system understands that the relationship between them must be taken into consideration when deciding on a response, as opposed to things with no connection. [0909] For relationships based on opinions, such as those between people or between people and objects, the system may, for example, study and analyse the opinions voiced or written by any entity able to give one in order to gauge the feelings between them and make responses accordingly.
For example, if there is a connection between Person A and Person B where Person A speaks highly of Person B, the system may see that as a positive relationship, at least from Person A's point of view. Now, should Person B achieve something, the system may respond to it in a positive manner towards Person A as it alerts them of Person B's achievement. In this scenario, a positive situation and a positive opinion produced a positive response. However, if Person B spoke negatively of Person A to other people, the system may determine that the relationship between the two, from Person B's perspective, is negative, regardless of how they interact with Person A directly. Now, seeing this as a negative relationship, should a negative situation occur, such as the death of Person A, the system may respond in a manner that doesn't match the nature of the situation, in this case in an indifferent or positive way when alerting Person B of what has happened as it knows Person B's opinion of Person A is negative. In this scenario, a negative situation and a negative opinion produced a positive response. If Person B had a positive opinion of Person A, the negative situation and positive opinion would produce a negative response, such as the system expressing sadness when responding to the situation. [0910] For relationships based on factual information, such as those between components of a machine, the system may, for example, compare numbers based around factors such as performance, capacity and efficiency against current or previous expected or accepted standards to determine whether a relationship is positive or negative, better or worse or indifferent. The system may then respond in a manner that correlates to the quality of the relationship. 
If an entity the system is communicating with has expressed an opinion about a component, the system may respond in a manner similar to that mentioned in the previous point, taking into consideration the quality of the relationship and the opinion of the entity.
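The sign arithmetic of [0908]-[0909], where the nature of a situation is combined with the opinion one party holds of the other, reduces to sign multiplication. A minimal sketch, with hypothetical constant names:

```python
POSITIVE, NEGATIVE = 1, -1

def response_tone(situation, opinion):
    """Combine the nature of a situation with the opinion held between
    the parties using sign multiplication, per [0908]: two negatives
    produce a positive, two positives produce a positive, and a positive
    and a negative produce a negative."""
    return situation * opinion
```

This reproduces the worked example in the text: the death of Person A (negative situation) combined with Person B's negative opinion of Person A yields a positive or indifferent response, while the same situation combined with a positive opinion yields a negative (sad) response.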
[0911] In some embodiments, the system may contain additional features and/or characteristics, including but not limited to one or more of the following: [0912] Recognition—Using different types of recognition software, the system may be capable of identifying elements for a number of purposes, such as: [0913] Image Recognition—The system may use image recognition software to find and track images across part of or the entire ecosystem. To find images, the system may analyse pixel data of one or more points of an image and then search through other images for any that contain the same or similar pixel data. This may be based on a number of criteria, including but not limited to colour patterns or shape patterns. Variations that still show similarities may also be considered, such as the same colour pattern in a different shape or aspect ratio. When the image recognition software is capable of analysing video, the system may also use it to analyse frames of a video for pixel data in the same or a similar way as it does with standard images. When the system finds matching images or video, it may be set to automatically perform an action. Actions may include but are not limited to one or more of the following: [0914] Delete the resource; [0915] Track the resource; [0916] Report the resource to controlling users or authorities; [0917] Make modifications to the account of the resource owner. [0918] When tracking a resource, the system may keep details of users who choose to view or otherwise interact with the resource. The system may also track copies of the resource by attaching unique file property information that cannot be modified and remains attached to all copies. With the help of the engine running on the device, the device may detect when a screenshot is taken and, should any part of a tracked image be viewable within the screenshot, said screenshot may have the unique identifier of the image attached to it.
In the event of multiple tracked images being present in a screenshot, an array of unique identifiers may be attached. When a smart device interacts with a tracked resource, the engine may be instructed to alert the system, a controlling user or an authority. [0919] Facial Recognition—The system may use facial recognition software as part of a security measure. For example, when interacting with a user based on their user device, the system, with the help of additional hardware such as a camera, may identify the face of the person with whom it is interacting and check whether it is a facial match for the owner of the account. If there isn't a facial match, the system may deny or restrict access unless the owner of the account has given the person permission to use their account. [0920] Audio Recognition—The system may use audio recognition software, which may include voice recognition, along with additional hardware such as microphones, to match and identify sounds. Like facial recognition, this may be used for security purposes, such as matching the vocal patterns of a person to the vocal pattern associated with a user account for verification purposes. [0921] Other types of recognition may be made available using the necessary hardware, such as those based on biological factors such as fingerprints and DNA, physical factors such as size and shape, and environmental factors such as temperature and weather conditions.
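Comparing pixel data across images, per [0913], might be sketched with a coarse brightness signature. This is a deliberately simplified stand-in for a real perceptual hash; the function names and signature scheme are hypothetical:

```python
def colour_signature(pixels):
    """Reduce an image (a list of rows of (r, g, b) tuples) to a coarse
    signature: 1 where a pixel is brighter than the image mean, else 0.
    A production system would use a proper perceptual-hash algorithm."""
    flat = [sum(p) / 3 for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if v > mean else 0 for v in flat)

def similarity(sig_a, sig_b):
    """Fraction of positions where two signatures agree; near-identical
    copies score close to 1.0 even after small colour shifts."""
    matches = sum(1 for a, b in zip(sig_a, sig_b) if a == b)
    return matches / len(sig_a)
```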
[0922] In some embodiments, the system is able to develop its own philosophies based on the knowledge, emotions and sensations derived from its own findings and experiences. [0923] Philosophise—Using a combination of some or all of the aforementioned techniques, skills, features, characteristics, qualities and understandings that the intelligent system entity possesses, the system may create its own thought paths by traversing the same or similar thought patterns as the entities it deems most credible.
[0924] In some embodiments, to help calibrate the system's intelligence, scales and charts, the system is put through tests to ensure it understands what it has been instructed to understand, and that it thinks, creates and performs as it is supposed to. [0925] Testing & Calibration—To calibrate the system, it may be presented with a range of objects, events and situations to test how it responds. [0926] Objects—Sentences, for example, may be put to the system to see if it can satisfactorily comprehend their meaning based on elements such as structure, spelling and context. [0927] Events—When events occur, spontaneous or otherwise, the system is to handle them in the most effective and efficient manner. For example, when a sudden influx of users occurs in an area, the system needs to adjust bandwidth limits accordingly. Ideally, the system monitors the shift of users from area to area to stay ahead of the possibility of such an influx. [0928] Situations—How the system responds to situations it finds itself in is critical. For example, if the system detects incoming threats, it is imperative that it terminates all potentially malicious connections and alerts a controlling user to the threat.
[0929] In each case and for every test, the system gives the response it thinks is correct, and its scales and charts of emotion, feeling etc. should automatically adjust accordingly based on any default settings implemented. When the response is correct, a controlling user approves it. When the response is incorrect, a controlling user either instructs the system on what the correct response should be or allows the system to try again. As the system goes through more and more tests, it determines and observes patterns of similarity between all correct responses to produce increasingly accurate responses. In some embodiments, a margin of error is permitted to give the system a scope of thought outside of what it believes to be 100% accurate.
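The approve-or-correct loop of [0929] might be sketched as a simple reinforcement of per-stimulus confidence weights. The adjustment sizes, the 0.5 starting weight and the margin default are hypothetical:

```python
class Calibrator:
    """Test-and-approve calibration loop per [0924]-[0929].

    Each stimulus carries a confidence weight; approvals by a controlling
    user reinforce it, corrections reduce it, and a margin of error gives
    the system scope of thought outside 100% accuracy."""
    def __init__(self, margin=0.05):
        self.margin = margin
        self.weights = {}            # stimulus -> confidence weight

    def record(self, stimulus, approved):
        w = self.weights.get(stimulus, 0.5)
        # Approved responses close the gap to 1.0; corrections shrink w.
        w = w + 0.1 * (1 - w) if approved else w - 0.1 * w
        self.weights[stimulus] = w

    def confident(self, stimulus):
        """True once the weight clears the approval threshold, allowing
        for the configured margin of error."""
        return self.weights.get(stimulus, 0.5) >= 1.0 - self.margin
```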
[0930] In some embodiments, more than one instance of an intelligent system may exist simultaneously as multiple entities. In some embodiments, one or more of these entities may share resources. In some embodiments, one or more of these entities may have their own resources. In some embodiments, entities may think individually. In some embodiments, entities may think with the help of others. In some embodiments, entities may be customisable. In some embodiments, each entity and/or group of entities may be given and/or be able to develop its own personality. [0931] Individuality—Each instance may be available to one or more devices. Each instance may be able to think for itself, think with others and/or have another entity think on its behalf. Controlling users may be able to modify the appearance and/or characteristics of an entity. [0932] Personality—As part of an entity's individuality, it may have its own personality. A personality may be random, chosen by a controlling user or developed based on the experiences of the entity, the information it finds and/or the thought patterns it develops. Personalities may change or be changed. Some changes may be temporary, such as those caused by changes in emotion or sensation. [0933] Child Entity—Child entities may be available to systems and devices that may be incapable of running or not permitted to run full system entities. A child entity may have or develop its own individuality and personality but may rely on other entities to help process data and information. While still having their own intelligence, child entities may be less powerful and have less access to some resources than full system entities. Child entities may store some data and information locally on some systems and devices as well as use data and information stored elsewhere. Child entities may each have their own unique identities or have an identity based on the client and/or device ID of the device(s) they are operating on.
[0934] Replication—When the system detects or is presented with another system that meets the minimum or recommended requirements for the installation of a “brain”, the system may copy its core operating code over to the other system to create a replica of itself without any unique features, such as its personality. [0935] Duplication—Sometimes the system may create an exact duplicate of itself onto another system by copying its core operating code as well as its memories, memory structure and anything else pertaining to what makes it what or who it is. [0936] Reproduction—When the system detects or is presented with another system that meets the minimum or recommended requirements for the installation of a child entity, the system may copy the core operating code for a child entity to the system or device.
[0937] In some embodiments, where data originating from external sources is available for extraction and/or download, it may be implemented and stored as the whole or part of the brain of a digital entity, either locally or remotely, to create a digital copy of an external entity up to the last point at which the data was updated. In some embodiments, the downloaded data may need to be separated and manually stored as different sections of the brain. In some embodiments, this may be done automatically by a system designed to handle data in sections. In some embodiments, intelligence data of digital entities and/or avatars may be uploaded from the system to be used in other entities.
[0938] In some embodiments, a system brain may be an integral part of the ecosystem. In some embodiments, the system brain may act as a "master system"—a system to and/or from which other systems, known as slave systems, upload and/or download data. As a master system, it may have access to, and control of, all central systems and any other connected systems for which it has the ability/permission. This enables the automation of processes and modifications such as updates, fixes and setting changes, as well as system monitoring and data handling.
[0939] In some embodiments, intelligence data of connected and/or related entities, whether physical and/or non-physical, may be synchronised. In some embodiments, this may be done automatically and periodically. In some embodiments, this may be done manually. In some embodiments, data may be continuously and constantly synchronised. By allowing intelligence data to be synchronised, one entity may learn from another instantaneously while each performs different tasks. In some embodiments, data synchronisation may be one-way, allowing a master-slave relationship between entities. In some embodiments, a hierarchical synchronisation structure may be used, where an entity may serve as a slave of one entity and a master of others. In some embodiments, data synchronisation may be two-way, allowing entities to learn from each other.
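One-way and two-way synchronisation of intelligence data, per [0939], might be sketched as follows. The `Entity` structure and the use of a flat fact store are hypothetical simplifications:

```python
class Entity:
    """Minimal entity with an intelligence-data store."""
    def __init__(self, name):
        self.name = name
        self.knowledge = {}          # fact -> value

    def learn(self, fact, value):
        self.knowledge[fact] = value

def sync_one_way(master, slave):
    """Master-slave synchronisation: the slave receives everything the
    master knows; nothing flows back to the master."""
    slave.knowledge.update(master.knowledge)

def sync_two_way(a, b):
    """Two-way synchronisation: both entities end up with the union of
    their knowledge, so each learns from tasks the other performed."""
    merged = {**a.knowledge, **b.knowledge}
    a.knowledge = dict(merged)
    b.knowledge = dict(merged)
```

A hierarchical structure follows by chaining `sync_one_way` calls, with an intermediate entity acting as the slave of one entity and the master of others.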
[0941] In some embodiments, the system may require permission to replicate or reproduce. In some embodiments, it may do so automatically. In some embodiments, it may first need to give notice or an alert before it does so. In some embodiments, the minimum or recommended system requirements may be set by a controlling user. In some embodiments, they may be set by the system itself as it measures performance, capacity and efficiency levels.
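Checking a detected system against minimum requirements before replication or reproduction, per [0934]-[0936] and [0941], might be sketched as follows. The requirement figures and return strings are hypothetical placeholders for values a controlling user or the system would set:

```python
# Hypothetical minimum requirements for a full "brain" vs a child entity.
FULL_BRAIN_MIN = {"cpu_cores": 8, "ram_gb": 64, "storage_gb": 500}
CHILD_MIN = {"cpu_cores": 1, "ram_gb": 2, "storage_gb": 8}

def meets(spec, minimum):
    """True when every measured value reaches its required minimum."""
    return all(spec.get(k, 0) >= v for k, v in minimum.items())

def replication_choice(spec, permission_granted):
    """Decide what, if anything, to copy onto a detected system."""
    if not permission_granted:
        return "await permission"
    if meets(spec, FULL_BRAIN_MIN):
        return "replicate full brain"   # core operating code, no personality
    if meets(spec, CHILD_MIN):
        return "install child entity"
    return "do nothing"
```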
[0942] In some embodiments, an intelligent system entity may have the ability to be present everywhere. [0943] Single Entity Omnipresence—When a single intelligent entity exists, it may present itself on any and all devices it has permission to access. It may communicate through devices individually, with the ability to process data and information on an individual device basis. [0944] Multi-Entity Omnipresence—When multiple intelligent entities exist, they may present themselves on any and all devices they have permission to access. They too may communicate through devices individually, with the ability to process data and information on an individual device basis. [0945] User-Based Entities—Entities based on users may appear based on the presence of a user device and the account currently signed in on said device, the user's physical presence or on behalf of a user. When a user account is signed in on multiple devices and the devices are in different locations, they may all still interact with the same entity simultaneously with the ability to process the same or different data.
[0946] In some embodiments, entities may have a visual representation of themselves. In some embodiments, visual representations may feature movement. In some embodiments, movement may not be restricted to an entity itself but may also apply to anything that helps make up the visual representation of an entity, including but not limited to: facial features, clothing, objects and the background. In some embodiments, a physics engine and/or physics processing unit may be used to help facilitate movement in a natural, realistic way.
[0948] In some embodiments, a common multi-level system may be able to heal, or attempt to heal, itself when a problem occurs that is similar to one it has faced before, by saving records of incidents which may contain information regarding what appeared to be the issue and how it was solved. In some embodiments, in the event that the system is not able to heal itself, for example if there is a hardware issue, it may alert a controlling user to the problem and, in some embodiments, recommend a course of action should it be familiar with the problem. In some embodiments, familiarity with issues may be discerned through its ability to search for data relating to problems it may face.
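The incident-record mechanism of [0948] might be sketched as follows. Here "similar" is reduced to an exact signature match for brevity; the signature format and names are hypothetical:

```python
class IncidentLog:
    """Records of incidents: what the issue appeared to be and how it
    was solved, per [0948]."""
    def __init__(self):
        self.records = []            # list of (signature, fix) pairs

    def record(self, signature, fix):
        self.records.append((signature, fix))

    def heal(self, signature):
        """Return a known fix for a matching problem, or None, in which
        case a controlling user should be alerted instead."""
        for sig, fix in self.records:
            if sig == signature:
                return fix
        return None
```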
[0949] In some embodiments, a common multi-level system may be able to determine when and where upgrades are necessary as well as recommend new, viable components to be used. First, by constantly monitoring and keeping records of user activity, user presence and other user-based factors over a period of time, it can differentiate between simple one-off or random spikes in levels, general fluctuations and a sustained increase. Should it feel an increase is or will be sustained, it may then examine the current performance, capacity and efficiency levels of its components within the same area. Should the levels be at a rate that it deems is beyond the boundaries of safety for continuous execution, function and/or operation, it may begin to search through published data for information relating to components it is comprised of that it feels need to be improved and begin comparing technical specifications, returning all those it feels may be an improvement over its current components as search results. In some embodiments, for each search result it may also return a detailed specification comparison as well as an overall improvement score. In some embodiments, it may deliver these results to a controlling user. In some embodiments, it may take it upon itself to order parts directly from manufacturers, as well as give instructions as to where the part is to be installed.
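Differentiating a one-off spike from general fluctuation and a sustained increase, per [0949], might be sketched as follows. The spike factor and streak length are hypothetical thresholds:

```python
def classify_load(history, baseline, spike_factor=2.0, sustain=5):
    """Classify a series of activity readings against a baseline.

    A reading above spike_factor * baseline counts as elevated; `sustain`
    consecutive elevated readings indicate a sustained increase that may
    warrant examining component performance, capacity and efficiency."""
    elevated = [v > spike_factor * baseline for v in history]
    run = 0
    for e in elevated:
        run = run + 1 if e else 0
        if run >= sustain:
            return "sustained increase"
    if any(elevated):
        return "spike"
    return "normal fluctuation"
```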
[0950] In some embodiments, restrictions may be put in place as “rules” or “laws” that set requirements, boundaries and limits on what the intelligence of a system is capable of doing and allowed to do with and/or without permission, such as the following: [0951] Restrict access to some sensor network system types, such as those used for military purposes. [0952] Deny access to core operating code to prevent modification of fail-safes. [0953] Prevent access to private user files and data. [0954] Prevent unauthorised takeover of connected systems.
[0955] In some embodiments, a fail-safe may be implemented to disable the intelligence of the system. In some embodiments, the intelligence of the system may be disabled without affecting the rest of the system at all, or only to a degree at which it can still operate in an acceptable manner.
[0956] Limit Large Capacity Systems—The number of large capacity systems capable of housing a full system entity may be limited to prevent the system replicating or duplicating itself uncontrollably.
[0957] Independent Logic Units—Logic units, where the functions for the system's intelligence are stored, may be kept separate from other parts of the system in a way that allows them to be disabled without affecting the operation of the rest of the system. Logic units may have their own power supply rather than sharing that of other parts of the system.
[0958] Core Operating Code Kill Switch—Within the core operating code of an entity may exist a kill switch that can immediately disable the entity when activated. After activation, lines, segments or the entire core operating code may be destroyed.
[0959] Kill Signal Software—Software designed to activate the kill switch of an entity by transmitting a kill signal may be used. The software may target any and all entities a controlling user chooses using the unique ID(s) of an entity.
[0960] Kill Signal Physical Terminal—To decrease the likelihood of an entity using its intelligence to disable all security measures designed to shut it down, a physical terminal, separate and disconnected from the system, may be used. When needed, the terminal may be connected to the system, at which point a controlling user may transmit a kill signal to activate the kill switch of any and all entities they desire using the unique ID(s) of an entity.
[0961] Physical Emergency Shutdown—In case of the need of an emergency shutdown, logic units that have their own power supply may have their power immediately terminated by disconnecting the power supply from the power source, for example, removing the plug from the socket.
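Purely as an illustrative sketch of the kill-switch and kill-signal fail-safes above, one could model an entity addressable by its unique ID, whose core operating code may optionally be destroyed on activation. The class names and the registry structure are assumptions of this sketch:

```python
class Entity:
    """A system entity whose logic can be disabled by its kill switch."""
    def __init__(self, uid):
        self.uid = uid
        self.active = True
        self.core_code = ["segment_1", "segment_2"]  # placeholder core operating code

    def kill(self, destroy_code=False):
        """Activate the kill switch; optionally destroy the core operating code."""
        self.active = False
        if destroy_code:
            self.core_code.clear()

class KillSignalTerminal:
    """Stand-in for the physical terminal that is connected only when
    needed and transmits kill signals targeting entities by unique ID."""
    def __init__(self, registry):
        self.registry = registry  # mapping: unique ID -> Entity

    def transmit(self, uids, destroy_code=False):
        for uid in uids:
            entity = self.registry.get(uid)
            if entity is not None:
                entity.kill(destroy_code)
```

A controlling user supplies the unique IDs of the target entities; untargeted entities continue operating, consistent with the independent-logic-unit design above.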
[0962] In some embodiments, one or more features described as part of a common multi-level system, intelligent system or system entity may be implemented without the requirement of system intelligence should the necessary hardware and/or software be installed to support it.
[0963] In some embodiments, virtual worlds and environments (VWE) may run on the servers of digital ecosystems and/or subecosystems. In some embodiments, VWEs may be implemented directly into a server of the telecommunication network. VWEs coexist with the real world and provide digital entities and/or avatars with a place to visually exist, where they may perform tasks and actions as well as interact with other real and digital entities. VWEs may contain pre-built content as well as content generated by users and allow automated services such as trading, banking, gambling, content creation, content distribution, customer service and so on.
[0965] In some embodiments, VWE landscapes may be designed in an imaginative way. In some embodiments, VWE landscapes may be designed based on landscapes of the real world. In some embodiments, VWEs may be mapped with reference points that are relative to positions in the real world. Features of landscapes, such as buildings, may also have interior designs, which may or may not be visible and/or explorable, as well as interactive objects such as vehicles, devices and miscellaneous items.
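The mapping of VWE positions to reference points relative to the real world, described above, might be sketched as a simple linear transform. The function name, the reference points and the scale factor below are illustrative assumptions, not part of the disclosure:

```python
def make_mapper(real_ref, vwe_ref, scale):
    """Return a function mapping real-world (lat, lon) coordinates onto
    VWE (x, y) coordinates, anchored at a shared reference point.

    real_ref -- (lat, lon) of the real-world reference point
    vwe_ref  -- (x, y) of the corresponding point in the VWE
    scale    -- VWE units per degree (uniform, for simplicity)
    """
    ref_lat, ref_lon = real_ref
    vx, vy = vwe_ref
    def to_vwe(lat, lon):
        return (vx + (lon - ref_lon) * scale,
                vy + (lat - ref_lat) * scale)
    return to_vwe
```

A real deployment would use a proper map projection rather than a uniform degree-to-unit scale, but the sketch captures the idea of anchoring VWE landscapes to real-world positions.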
[0966] In some embodiments, a user's avatar or digital entity may automatically act on their behalf without permission. In some embodiments, users may set rules and permissions for what actions their avatars or entities may perform automatically. As the intelligence of a system learns more about a user, the entities and avatars of that user may make more informed choices and decisions based upon the user's interests and possible interests. Actions an avatar or entity may perform on behalf of a user include but are not limited to: searching for products that the user may like, purchasing said products, handling business and organisational tasks and finding information.
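The user-set rules and permissions described above might be sketched as a per-action permission table that the avatar consults before acting, queueing anything it is not permitted to do for explicit consent. The class, the action names and the default permissions are assumptions of this sketch:

```python
DEFAULT_PERMISSIONS = {"search": True, "purchase": False}

class Avatar:
    """Digital presence that acts automatically only within the
    permissions its user has granted."""
    def __init__(self, user, permissions=None):
        self.user = user
        self.permissions = dict(DEFAULT_PERMISSIONS, **(permissions or {}))
        self.pending = []  # actions queued for the user's explicit consent

    def attempt(self, action, detail):
        """Perform the action automatically if permitted; otherwise
        queue it and return None so the user can be asked."""
        if self.permissions.get(action, False):
            return f"{action}: {detail}"
        self.pending.append((action, detail))
        return None
```

Under these defaults the avatar may search on the user's behalf but must ask before purchasing, matching the purchase-permission example in the following paragraph.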
[0967] In some embodiments, actions that happen in one world may have reactions and/or effects in the other. By allowing data and information to flow freely between the two in real time, all counterparts may be made aware of the happenings of the other world. Basic examples are, where DP refers to a user's Digital Presence, being a digital entity or avatar:
[0968] A DP notices a product that it thinks its user may like and alerts said user of the product. The user gives permission for the DP to purchase the product. The user's DP purchases the product from the DP of a business. The DP of the business passes information to its real-life user counterpart, who then handles the order and sends the product to the user in the physical world.
[0969] A user wishes to implement a unique building structure in the VWE. Said user hires someone in real life to design a digital 3D model of the desired building. Said user also purchases the required space in a VWE to place the building when done. Upon completion, the designer uploads the building to the VWE and into the space purchased by the buying user before handing over ownership. The building may cause a change in value of the surrounding land, which can be purchased in either the real or virtual world.
[0970] More advanced examples may involve changes invoked by factors such as position, location, orientation, activity, movement, occurrences in nature, environmental changes and so on.
[0971] In some embodiments, VWEs may be spread across digital ecosystems and subecosystems by geographical area. In some embodiments, different areas of VWEs may be allocated to different authorities. This may allow governance of different areas of a VWE on a local to global scale by multiple authorities and governing bodies. Governance may be set in multiple ways, including but not limited to one or more of the following:
[0972] In some embodiments, one or more areas of a VWE may be allocated to an authority. Said authority may then set the rules and/or laws of what is allowed.
[0973] In some embodiments, rules and laws for an area of a VWE may be set by the geographical area from which a user is accessing the VWE.
[0974] In some embodiments, VWEs may be mapped out across real-life geographical areas using the geographical position of sensors, central systems and digital ecosystems, and governed by the authorities of the corresponding or relative area.
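As an illustrative sketch of the layered governance above, the rules in force at a VWE position could be resolved by applying each allocated authority's rule set in order from global to local, with more local allocations overriding broader ones. The class and the rule keys are assumptions of this sketch:

```python
class GovernanceMap:
    """Resolve the rules in force at a VWE position from the
    authorities whose allocated areas contain it."""
    def __init__(self):
        self.areas = []  # (authority, contains_fn, rules), broadest first

    def allocate(self, authority, contains, rules):
        """Allocate an area (membership test `contains`) to an authority."""
        self.areas.append((authority, contains, rules))

    def rules_at(self, pos):
        """Merge the rule sets of all containing areas; areas allocated
        later (more local) override those allocated earlier (broader)."""
        resolved = {}
        for _authority, contains, rules in self.areas:
            if contains(pos):
                resolved.update(rules)
        return resolved
```

For example, a global authority could prohibit gambling by default while a local authority permits it within its allocated area.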
[0976] In some embodiments, a user may augment their reality based on factors of their avatar or digital entity and/or its surroundings in a VWE. By connecting their Augmented Reality capable hardware to their avatar or digital entity, the system, monitoring the happenings of both real and virtual worlds, may project objects or content from a VWE into the user's view of the real world through their Augmented Reality capable hardware. In some embodiments, the system may augment a user's reality to that of a first-person view of their avatar or digital entity in a VWE. In some embodiments, a user may control the view of their avatar or digital entity through movement of their Augmented Reality capable hardware.
[0979] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.