Abstract
The digital command prompting device for dementia patients using augmented reality is an aid to help people, especially those who have special needs, particularly individuals who have diminished or diminishing brain function. The device is predominantly mobile but can also be stationary and can be programmed, by receiving and selecting pre-set commands, to operate and assist a user with their daily living standards or needs; it is particularly adapted for use when a user is having a disorientation episode as to place and time. The device also has various other features, including an illuminated display panel, a GPS tracking capability, an alarm, an illumination element, solar panel and battery backup components, time and date clocks, and water-resistant covers and/or material, among other things. The device may be used within the home environment, the outdoor environment or a restricted environment, e.g., an aged care facility, hospital, pre-school or school.
Claims
1: A digital command prompting device using augmented reality for assisting and orienting a user during a disorientation episode comprising means for electronically communicating said device with an electronic appliance and wherein the device can be activated by the user and further wherein the device is provided with means which can send and receive electronic signals and further wherein the device is provided with means to produce a wireless signal which is transmittable to the electronic appliance and further wherein said electronic appliance is provided with computer processing application software which can process any signal received by the device and further wherein the computer processing application software can produce a visual display on the electronic appliance to inform or command the user to respond to a question for assisting in the orientation of the user during a disorientation episode.
2: The digital command prompting device of claim 1 further provided with means for transmitting to and receiving a wireless signal from a navigational positioning system.
3: The digital command prompting device of claim 2 further provided with means in the form of a push button for activating the device by the user.
4: The digital command prompting device of claim 3 further provided with means for causing the device to vibrate.
5: The digital command prompting device of claim 4 further provided with means for causing the device to produce sounds or to receive voice commands or input.
6: The digital command prompting device of claim 5 further provided with a microcomputer sensor transmitter processor.
7: The digital command prompting device of claim 6 further provided with a power source.
8: The digital command prompting device of claim 7 further provided with a lanyard to allow the user to wear the device.
9: The digital command prompting device of claim 2 wherein the device is housed in a watch.
10: The digital command prompting device of claim 9 further provided with means for causing the device to vibrate.
11: The digital command prompting device of claim 10 further provided with means for causing the device to produce sounds or to receive voice commands or input and further wherein the device is provided with means for allowing the device to receive input from the user to assist in the orientation of the user.
12: The digital command prompting device of claim 11 further provided with a watchband.
13: The digital command prompting device of claim 12 further provided with augmented reality in the form of a series of reminders unique to the user for assisting in stimulating the user's memory.
14: The digital command prompting device of claim 1 wherein the appliance comprises an electronic tablet.
15: The digital command prompting device of claim 1 wherein the appliance comprises a smart phone.
16: The digital command prompting device of claim 1 further provided with augmented reality in the form of a series of reminders unique to the user for assisting in stimulating the user's memory.
17: The digital command prompting device of claim 1 further provided with means for assessing the user's orientation status.
18: The digital command prompting device of claim 1 further provided with means for providing a series of questions to be posed to the user in the event of the disorientation of the user.
19: A digital command prompting device using augmented reality for assisting and orienting a user during a disorientation episode, comprising a lanyard or necklace and wherein the lanyard or necklace is provided with means which can electronically communicate, by a wireless signal, with an electronic appliance in the form of an electronic tablet and/or smart phone and wherein the device is provided with a button which can be activated by the user and wherein the device can create and transmit an electronic signal, via a wireless signal transmitter embedded in the device in the form of a microcomputer sensor transmitter processor and further wherein upon activation the microcomputer sensor transmitter processor sends a wireless signal by a wireless technology standard for exchanging data over short distances and further wherein the wireless signal can be transmitted to a carer or emergency office or medical facility and further wherein the wireless signal is transmittable to the electronic tablet or smart phone and further wherein the electronic tablet or smart phone can send, receive, and process the wireless signal and is compatible with the Android mobile operating system and/or the iPhone cellular phone mobile operating system known as iOS and further wherein the microcomputer sensor transmitter processor can send, receive, process, and transmit datum and/or a signal to and from a navigational positioning system for geographically locating the user and further wherein the electronic tablet or smart phone is provided with or is accessible to computer processing application software which can process a signal or datum received by the device and further wherein the computer processing application software can produce an augmented reality to the user in the form of a visual display consisting of a series of reminders to assist in the orientation of the user and further wherein the series of reminders are unique to the user for assisting in stimulating the user's memory and further wherein the device is powered by a battery and further wherein the device is provided with means for causing the device to vibrate to act in an alarm or awakening mode and further wherein the device is provided with means for causing the device to produce a sound or to receive a voice command or input to assist in orienting the user in the event of a disorientation episode of the user and further wherein the device is provided with means for allowing the device to assess the user's orientation status and further wherein the device is provided with means for providing a series of questions to be posed to the user in the event of the disorientation of the user and further wherein the device is provided with means for allowing the device to receive input from the user to assist in the orientation of the user.
20: A digital command prompting device using augmented reality for assisting and orienting a user during a disorientation episode comprising a watch housing the device and further provided with a watchband capable of allowing the device to be worn by the user and further wherein the device can create and transmit an electronic signal via a wireless signal transmitter embedded in the device in the form of a microcomputer sensor transmitter processor and further wherein upon activation the microcomputer sensor transmitter processor is capable of sending a wireless signal to a carer or emergency office or medical facility and further wherein the device can send, receive, and process the wireless signal and is compatible with the Android mobile operating system and/or the iPhone cellular phone mobile operating system known as iOS and further wherein the microcomputer sensor transmitter processor can send, receive, process, and transmit datum and/or a signal to and from a navigational positioning system for geographically locating the user and further wherein the device is provided with or is accessible to computer processing application software which can process a datum or signal received by the device and further wherein the computer processing application software can produce an augmented reality to the user in the form of a visual display consisting of a series of reminders to assist in the orientation of the user and further wherein the series of reminders are unique to the user for assisting in stimulating the user's memory and further wherein the device is powered by a battery and further wherein the device is provided with means for causing the device to vibrate to act in an alarm or awakening mode and further wherein the device is provided with means for causing the device to produce a sound or to receive a voice command or input to assist in orienting the user in the event of a disorientation episode of the user and further wherein the device is provided with means for allowing the device to assess the user's orientation status and further wherein the device is provided with means for providing a series of questions to be posed to the user in the event of the disorientation of the user and further wherein the device is provided with means for allowing the device to receive input from the user to assist in the orientation of the user.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] FIG. 1 discloses a concept flow diagram of the DCP with AR system and devices.
[0039] FIG. 2 discloses a first embodiment of the DCP with AR devices 2, 3 and 4.
[0040] FIG. 3 discloses details of the DCP with AR devices of FIG. 2 in use.
[0041] FIG. 4 discloses a second embodiment of the DCP with AR device.
[0042] FIG. 5 discloses details of the DCP with AR device 2 of FIG. 2.
[0043] FIG. 6 discloses further details of the DCP with AR device 2 of FIG. 2.
[0044] FIG. 7 discloses the DCP with AR devices of FIG. 2 in use.
[0045] FIG. 8 discloses details of the DCP with AR device of FIG. 4 in use.
[0046] FIG. 9 discloses further details of the DCP with AR device of FIG. 4 in use.
[0047] FIG. 10 discloses further details of the DCP with AR device of FIG. 4 in use.
[0048] FIG. 11 discloses the DCP with AR device of FIG. 4 in use on a user.
[0049] FIG. 12 discloses a flow chart of the operation of the DCP with AR system and devices.
[0050] FIG. 13 discloses a questionnaire for use with the DCP with AR system and devices.
[0051] FIG. 14 discloses a sample photo for use with the DCP with AR system and devices.
[0052] FIG. 15 discloses a second sample photo for use with the DCP with AR system and devices.
[0053] FIG. 16 discloses a third sample photo for use with the DCP with AR system and devices.
[0054] FIG. 17 discloses a fourth sample photo for use with the DCP with AR system and devices.
[0055] FIG. 18 discloses the devices of FIG. 2 of the DCP with AR in use with the photo of FIG. 14.
[0056] FIG. 19 discloses the device of FIG. 4 of the DCP with AR in use with the photo of FIG. 14.
DETAILED DESCRIPTION OF THE INVENTION
[0057] Referring to FIG. 1, a concept flow diagram of the system and devices according to the preferred embodiments of the invention is shown. Particularly, FIG. 1 discloses the overall operation of the preferred embodiments of the invention, wherein the DCP with AR device, denoted in the diagram by the term device, may comprise a watch device, e.g., a wrist watchband, bracelet, or a lanyard attachment, etc. As FIG. 1 further discloses, the DCP with AR device and system, whether it is either embodiment discussed in reference to FIG. 2 or FIG. 4 of the instant invention, is provided with a sensor device capable of creating and transmitting electronic signals via a wireless signal transmitter embedded in the sensor device. In the DCP with AR device of FIG. 2, the wireless signals are transmittable to a mobile electronic appliance, such as a smart phone or an electronic tablet. The mobile electronic appliance is capable of sending, receiving, and processing the wireless signals and/or data and is also compatible with the Android mobile operating system, the iPhone cellular phone mobile operating system, i.e., iOS, and other commercially available mobile device operating systems. The DCP with AR device of FIG. 4 has a similar capacity for sending, receiving, and processing wireless signals. Additionally, the sensor devices of either of the DCP with AR devices of FIGS. 2 and 4 use radiofrequency microchip technology, in the form of microcomputer transmitter processors capable of sending, receiving, processing, and transmitting data and/or signals to and from navigational positioning systems, such as the Global Positioning System, for locating the sensor devices, which signals and/or data can in turn be transmitted to the mobile electronic appliance. As is further shown in FIG. 6D, the DCP with AR devices 2, 3 and 4 are provided with or are accessible to computer processing application software, i.e., otherwise known as an app, which app is capable of processing any signal and/or data received by the sensor device. The DCP with AR device of FIG. 4 has a similar app, as discussed in reference to FIG. 8C. DCP with AR devices 2 and 31 are further constructed to interact with a system comprising a centralized database constructed to electronically interact, via a wired or wireless electronic communication network, with a DCP with AR centralized computer system (not shown) which allows for the programming of user DCP with AR mobile devices for receiving and selecting pre-set commands to operate and assist a patient or user to help them with all aspects of their daily living. The whole DCP with AR system, and the DCP with AR devices, are operated by computer application software, hereinafter the DCP with AR app, constructed to operate the DCP with AR computer system and devices. The administrators using the DCP with AR administrative internet web page interfaces (not shown) are known as DCP with AR administrators and carers using the carer internet web page interfaces (not shown) are known as DCP with AR carers.
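By way of illustration only, the programming of a user's DCP with AR mobile device with pre-set commands from the centralized system might be sketched in Python as follows. The patent does not disclose any data format or API, so every name and field here (PresetCommand, UserProfile, program_device) is a hypothetical assumption.

```python
# Hypothetical sketch only: the patent does not specify data structures or APIs.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PresetCommand:
    label: str       # e.g. "Who Where When Update"
    schedule: str    # e.g. "on_demand" or "hourly" (illustrative values)

@dataclass
class UserProfile:
    user_id: str
    commands: List[PresetCommand] = field(default_factory=list)

def program_device(profile: UserProfile, selected: List[PresetCommand]) -> List[str]:
    """Attach administrator-selected pre-set commands to a user's device
    profile and return the labels now programmed on the device."""
    profile.commands.extend(selected)
    return [c.label for c in profile.commands]
```

In this sketch an administrator's selection is simply appended to the user's profile; a real centralized system would also synchronize the profile to the wearable over the wired or wireless network described above.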
[0058] FIG. 2 discloses a first embodiment of the DCP with AR device 2 being utilized by a User 1, in the form of a lanyard or necklace 9. DCP with AR device 2 is shown attached to lanyard or necklace 9 and is capable of electronically communicating, by wireless signals 7, as is disclosed and discussed below, with an electronic appliance in the form of either an electronic tablet 3 and/or a smart phone device 4. Additionally, as is further discussed in reference to FIG. 3, electronic tablet 3 and smart phone device 4 are provided with computer application software 5 and 6, respectively, used to assist in operating the electronic tablet and/or smart phone device.
[0059] FIG. 3 discloses further the operation of the DCP with AR device 2. Particularly, FIG. 3 shows that DCP with AR device 2 is provided with an activation button 20 which can be depressed by a user; button 20 can be used either for emergencies or to activate a set of commands pre-set by the User. Additionally, the DCP with AR device 2 is capable of creating and transmitting electronic signals, via a wireless signal transmitter embedded in the DCP with AR device, in the form of a microcomputer sensor transmitter processor 60, further disclosed and discussed in reference to FIG. 6. Upon activation, microcomputer sensor transmitter processor 60 sends a wireless signal 7 by various systems, such as a technology that allows electronic devices to connect to a network, i.e., Wi-Fi, or a wireless technology standard for exchanging data over short distances, commonly known as Bluetooth, or ZigBee, etc. Additionally, wireless signal 7 can be transmitted to an emergency alert system such as a medical alert staff desk, nursing station or emergency office or facility. The wireless signal 7 is also transmittable to a mobile electronic appliance, i.e., smart phone 4 or electronic tablet 3. The mobile electronic appliance, either electronic tablet 3 or smart phone 4, is capable of sending, receiving, and processing the wireless signals and/or data and is also compatible with the Android mobile operating system, the iPhone cellular phone mobile operating system, i.e., iOS, and other commercially available mobile device operating systems. Additionally, the microcomputer sensor transmitter processor 60 incorporates radiofrequency microchip technology, discussed further in reference to FIG. 6, capable of sending, receiving, processing, and transmitting data and/or signals 8 to and from navigational positioning systems, such as the Global Positioning System, i.e., GPS, or the Global Navigational Satellite System, i.e., GLONASS. Additionally, microcomputer sensor transmitter processor 60 is capable, via signal 8, of connecting with the DCP with AR centralized computer system discussed in reference to FIG. 1, supra. Microcomputer sensor transmitter processor 60 is further capable of being coded with information regarding the identity of the User, such as his or her birth date, owner, home location, medical records, etc. Additionally, the microcomputer sensor transmitter processor 60 is capable of creating a signal and/or data which can be transmitted to a satellite navigational technology system, i.e., GPS or GLONASS, for geographically locating the sensor transmitter device, which signals and/or data can in turn be transmitted to the mobile electronic appliance or smartphone. As is further shown in FIG. 3, the mobile electronic appliance, either electronic tablet 3 or smart phone 4, is provided with or is accessible to computer processing application software, i.e., otherwise known as an app, which app is capable of processing any signal and/or data received by the sensor device. For the electronic tablet 3, this app is shown as reference number 5, while for the smart phone 4, the app is shown as reference number 6. Additionally, apps 5 and 6 are capable of producing display or command prompts, displayed respectively as display 15 on electronic tablet 3 and display 16 on smartphone device 4; in the example of FIG. 2, this is the command Who Where When Update, to inform, remind or command a user to perform a certain task. Although DCP with AR device 2 is disclosed, as is discussed in reference to FIG. 6, as being powered by a battery, the device is also configured to allow it to be powered with solar energy. Additionally, app 5 or 6 is coded with artificial intelligence techniques for assisting in analyzing data received by DCP with AR device 2.
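By way of illustration only, the signal path just described (button press, wireless signal 7 with a GPS fix, app 5 or 6 producing the command prompt) might be sketched as follows. The message fields and function names are assumptions for illustration, not disclosed formats.

```python
# Hypothetical sketch of the device-to-app handoff described above.
# Field names and the prompt-selection rule are illustrative assumptions.

def build_signal(device_id: str, button_pressed: bool, gps_fix: tuple) -> dict:
    """Package a button activation and a GPS fix (lat, lon) into the
    wireless message sent to the electronic tablet or smart phone."""
    return {
        "device": device_id,
        "event": "activation" if button_pressed else "heartbeat",
        "lat": gps_fix[0],
        "lon": gps_fix[1],
    }

def render_prompt(message: dict) -> str:
    """The appliance app (5 or 6) turns an activation event into the
    on-screen command prompt shown in FIG. 2."""
    return "Who Where When Update" if message["event"] == "activation" else ""
```

A real implementation would carry this message over Wi-Fi, Bluetooth or ZigBee as described above; here a plain dictionary stands in for whatever wire format the system actually uses.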
[0060] FIG. 4 discloses a second embodiment of the DCP with AR device. Particularly, FIG. 4 discloses a DCP with AR device 31 housed in the form of a watch, having a watchband 32, capable of being worn on a hand of a user. Additionally, as FIG. 4 discloses, DCP with AR device 31 has a visual screen display 33, via a computer application program embedded in the device, similar to that of DCP with AR device 2, electronic tablet 3 app 5 and/or smart phone 4 app 6, capable of communicating a command message or prompt to the User, here, as an example, Who Where When Update. DCP with AR device 31 is configured to be touch activated on its screen, although it can also be activated by traditional buttons. Additionally, DCP with AR device 31 has an illumination display feature allowing the device to be seen at night or in low-light environments. Further, as is discussed in further detail in reference to FIG. 8, the DCP with AR system is provided with computer processing application software, i.e., otherwise known as the DCP with AR app 85, which app is capable of processing any signal, data, or input from the DCP with AR device 31, and the DCP with AR device app is capable of producing display or command prompts. In the example of FIG. 8, it is the prompt Who Where When Update, reference 37, to inform, remind or command the user to update his physical/mental information. Additionally, app 85 is coded with artificial intelligence techniques for assisting in analyzing data received by DCP with AR device 31. Additionally shown in FIG. 4 are a camera feature 46, a sensor 42 and a sensor connector 41.
[0061] FIG. 5 reveals additional features of the DCP with AR device 2. Particularly, FIG. 5A shows the overall construction of DCP with AR device 2, having a body 21, activation button 20 and camera 26, while FIG. 5B shows the reverse side of the device, showing sensor 28. FIG. 5C reveals the vibration feature 22 of DCP with AR device 2, allowing the device to act in an alarm or awakening mode. FIG. 5D reveals a transceiver 23 provided with DCP with AR device 2 capable of producing an alarm or awakening sound 24, with transceiver 23 also being capable of receiving voice commands or input.
[0062] FIG. 6 is a further detailed figure of DCP with AR device 2 showing details of the device construction. Particularly, FIG. 6A discloses that DCP with AR device 2 has a body 21, activation button 20 and camera 26, while FIG. 6B is a side view of FIG. 6A showing DCP with AR device 2 in profile with activation button 20, camera 26 and sensor 28. FIG. 6C is a cross-sectional view of FIG. 6A showing internal operational components of DCP with AR device 2, particularly microcomputer sensor transmitter processor 60 and power source 70 in the form of a disc battery. FIG. 6D shows the microcomputer sensor transmitter processor 60 and power source 70 individually, revealing their relative size, while FIG. 6E is a side profile view of microcomputer sensor transmitter processor 60 and power source 70. As previously discussed, microcomputer sensor transmitter processor 60 is capable of being coded, in app 65, with information regarding the identity of a user and is capable of creating and receiving signals and/or data which can be transmitted to a satellite navigational technology system. Microcomputer sensor transmitter processor 60 also can transmit and receive signals and/or data to and from the mobile electronic appliance, electronic tablet 3 or smart phone 4. And as previously discussed in reference to FIG. 1, app 65 incorporates artificial intelligence techniques to assist in analyzing data received by sensor 28 or input through electronic tablet 3 and smart phone 4.
[0063] FIG. 7 discloses the DCP with AR device of FIG. 2 in use. During its use, a user, their caretaker, or the DCP with AR device may detect or sense a change in the User's condition, particularly a mental condition or disorientation episode; in that event, the DCP with AR device activates the Who Where When Update, i.e., a vital physical statistics analysis mode, to assess potential user condition changes or disorientation status. In this mode, the DCP with AR device prompts the User to respond to a series of questions to assess any mental and/or physical condition changes which may then require altering the User's level of care. The Who Where When Update is further discussed in reference to FIG. 12 in an attempt to re-orient a User having a disorientation episode.
[0064] FIG. 8 shows further details of the DCP with AR device 31 of FIG. 4. Particularly, FIG. 8A shows DCP with AR device 31 with wristband 32, visual screen 33, visual screen display readout 37, sensor 42 and sensor connector 41. FIG. 8A further reveals the wireless communication signal feature producing and receiving electronic signals 38 from microcomputer sensor transceiver processor 80 of FIG. 8C. FIG. 8B, a rear view of DCP with AR device 31 of FIG. 8A, shows sensor probe 40 and sensor probe 44. Sensor probe 42 is located and positioned on band 32 such that it can be held against the chest area of a user to measure the user's respiratory rate, while sensor probe 44 is constructed and located on the DCP with AR device 31 to sense the heart rate at the wrist of a user. Sensor probe 44 is further capable of sensing a user's body temperature and is connected to DCP with AR device 31 by connector 43. FIG. 8C, a cross-sectional view of DCP with AR device 31 of FIG. 8A, reveals that DCP with AR device 31 is internally provided with electronic components 80 and 90, similar to microcomputer sensor transmitter processor 60 and power source 70, respectively, of DCP with AR device 2. As such, DCP with AR device 31 is capable of electronically communicating with GPS and GLONASS satellite systems for allowing location, via signal 38, of the wearer of the apparatus, i.e., the User, and is further capable of connecting with the DCP with AR centralized computer system discussed in reference to FIG. 1, supra. As with DCP with AR device 2, and as discussed supra in reference to FIG. 4, microcomputer sensor transceiver processor 80 of DCP with AR device 31 incorporates a MEMS inertial sensor capable of measuring 1 to 3 axis acceleration, i.e., a pedometer 87, which is processed by embedded software app 85 capable of calculating calorie levels or other calculations as previously discussed, supra.
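By way of illustration only, a minimal sketch of how a 1 to 3 axis MEMS acceleration stream might be reduced to a step count and calorie estimate by an embedded app such as app 85. The threshold-crossing detector and the 0.04 kcal/step factor are illustrative assumptions, not values from the disclosure.

```python
import math

def count_steps(samples, threshold=1.2):
    """Naive step detector: count upward crossings of an
    acceleration-magnitude threshold (in g), as a simple 3-axis
    MEMS pedometer might. `samples` is a sequence of (ax, ay, az)."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1      # rising edge -> one step
            above = True
        elif mag <= threshold:
            above = False
    return steps

def calories_from_steps(steps, kcal_per_step=0.04):
    """Rough calorie estimate; the per-step factor is an assumption."""
    return steps * kcal_per_step
```

Real pedometer firmware would filter gravity and noise before thresholding; this sketch only shows the shape of the calculation.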
[0065] FIG. 9 discloses an additional feature of DCP with AR device 31, particularly a vibration capability allowing the device to awaken, remind and/or alert the User to perform a particular task, here, as an example, the Who Where When Update as discussed in reference to FIGS. 7 and 12.
[0066] FIG. 10 discloses an additional feature of the DCP with AR device 31, particularly a sound transceiver 35 capable of emitting a sound 36 for alerting the User to awaken, reminding him or her to perform a particular task or to respond to re-orientation prompts, here, as an example, the Who Where When Update. Additionally, transceiver 35 is capable of receiving audible input which can be used to program DCP with AR device 31.
[0067] FIG. 11 shows the DCP with AR device 31 of FIG. 4 being utilized by a user, i.e., being worn on the wrist of hand 30 of a user. The DCP with AR device has a visual screen 33 and screen display 37 capable of communicating a readout to the User. The DCP with AR device 31 of FIG. 11 has all of the features and capabilities of the DCP with AR device 31 of FIGS. 8-10. Also seen in FIG. 11, the DCP with AR device 31 is provided with the camera scanner feature 46 along with sensor probe 52, which operates in a similar fashion to sensor probe 42 and sensor probe connector 41 of FIGS. 4 and 8, and is provided with electronic components 80 and 90 as well as app 85 as discussed in reference to FIG. 4, capable of producing all of the electronic information and processing all of the data of the DCP with AR device 31 of FIGS. 4 and 8-10.
[0068] FIG. 12 is a step-by-step flow chart of an example of various task routines to be performed by the DCP with AR device and system in responding to a user disorientation episode. Particularly, FIG. 12 shows that a user undergoes a physical and mental assessment, with the results being loaded into the DCP with AR device. Subsequently, the DCP with AR device is activated by the User and, during the course of use, the DCP with AR device and system will perform various physical monitoring tasks on the User. In the event either the User, their caretaker, or the DCP with AR device detects or senses a disorientation episode of the User, the DCP with AR device activates the Who Where When Update, i.e., a vital physical statistics analysis mode, to assess the User's potential orientation status. The Who Where When Update then prompts the User to respond to a series of questions to assess his/her orientation status, which responses are entered into the DCP with AR device. Using the User's profile and AI analytical processing techniques, i.e., various mathematical methods such as statistical analysis, search and mathematical optimization, fuzzy logic, etc., an assessment is made as to the User's orientation. In the event the DCP with AR device and system concludes the User is disoriented, a series of prompts including photos from the User's profile are displayed, in chronological order, in an attempt to orient the User. FIGS. 14 to 17 are samples of photos and/or images which can be used in the serial orientation prompts for the User, and depending on the responses by the User, suggestions can be made for further User operation of the DCP with AR device, particularly by the app 65 of DCP with AR device 2 and app 85 of DCP with AR device 31. Particularly, as discussed in reference to FIGS. 18 and 19, when the DCP with AR system and devices are in the disorientation mode, a series of prompts in the form of the User's photos are displayed with the objective of orienting him or her. Usually, the sequence of the displayed photos proceeds from the long-term memory in progressive chronological order. For example, as shown in FIG. 18, a childhood photo is displayed with a message or query bar, labeled KNOW THIS PHOTO, YES, NO, below the photo, asking whether the User recognizes it; the User then taps on the appropriate response. If the response is YES, the series of questions of FIG. 13 are again prompted with the objective of receiving positive responses indicating the User has regained orientation. If the response to the query of FIG. 18 is in the negative, a subsequent profile photo is displayed with a similar query bar; in the event several of the responses to the query bars are in the negative, a signal and/or notice is sent to the User's Carer and/or Administrator to activate an intervention alert for the User.
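By way of illustration only, the orientation assessment step described above might be reduced to the following minimal sketch. The string-matching comparison and the 0.7 cutoff stand in for the unspecified AI analytical processing and are purely hypothetical.

```python
# Hypothetical sketch of the orientation assessment; a real system would
# use the AI techniques named in the text rather than exact string matching.

def assess_orientation(responses: dict, expected: dict, threshold=0.7):
    """Compare the User's questionnaire answers against his or her stored
    profile, score the fraction answered correctly, and flag disorientation
    when the score falls below the (assumed) threshold."""
    correct = sum(
        1 for question, answer in expected.items()
        if responses.get(question, "").strip().lower() == answer.lower()
    )
    score = correct / len(expected)
    return {"score": score, "disoriented": score < threshold}
```

A disoriented result would then trigger the chronological photo prompts of FIGS. 18 and 19; a passing score would end the episode.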
[0069] FIG. 13 is a series of questions which can be used to assess a user's mental state when the DCP with AR device is in the Who Where When Update mode. For example, the User is queried: What is your name?, What is today's date and day of the week?, When and where were you born?, Where do you live?, Where are you right now?, Who is the President of the USA?, What is your mother's maiden name?, What is your father's name?, What is your spouse's name?, Where were you married?. These responses are processed using the apps 65 and 85 of DCP devices 2 and 31, respectively, incorporating artificial intelligence techniques which then suggest changes to the User's level and type of care and orientation.
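By way of illustration only, a sketch of how a questionnaire score might be mapped to a suggested change in the User's level of care, as the passage above describes. The score bands and suggestion labels are hypothetical assumptions, not values from the disclosure.

```python
# Hypothetical care-suggestion mapping; the bands below are assumptions.

def suggest_care_change(score: float) -> str:
    """Map an orientation score (fraction of FIG. 13 questions answered
    correctly, 0.0-1.0) to an illustrative care suggestion."""
    if score >= 0.8:
        return "no_change"
    if score >= 0.5:
        return "increase_monitoring"
    return "escalate_care"
```

In a deployed system these bands would presumably be tuned per user from the loaded physical and mental assessment rather than fixed.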
[0070] FIG. 14 discloses a sample photo or image for use in stimulating a user or patient's long-term memory. Particularly, FIG. 14 displays a childhood photo which includes the User or Patient and his childhood home.
[0071] FIG. 15 discloses a second sample photo or image for use in stimulating a user or patient's long-term memory. Particularly, FIG. 15 displays a school photo which includes the User or Patient and his high school.
[0072] FIG. 16 discloses a third sample photo or image for use in stimulating a user or patient's long-term memory. Particularly, FIG. 16 displays a photo which includes the User or Patient on his wedding day in front of his church.
[0073] FIG. 17 discloses a fourth sample photo or image for use in stimulating a user or patient's long-term memory in relation to a profession or hobby involving sports. Particularly, FIG. 17 displays a photo which includes various sports equipment, such as a basketball, tennis ball, hockey stick, tennis racket or American football, which may be relevant to the User or Patient's long-term memory.
[0074] FIG. 18 discloses the devices of FIG. 2 of the DCP with AR in use with the photo of FIG. 14 and its operation. Particularly, FIG. 18 discloses that in the disorientation mode, a series of prompts in the form of the User's photos are displayed with the objective of orienting him or her. Usually, the sequence of the displayed photos proceeds from the long-term memory in progressive chronological order; in FIG. 18, a childhood photo is displayed with a message or query bar, labeled KNOW THIS PHOTO, YES, NO, below the photo, asking whether the User recognizes it; the User then taps on the appropriate response. If the response is YES, the series of questions of FIG. 13 are again prompted with the objective of receiving positive responses thereto, indicating the User has regained orientation. If the response to the query of FIG. 18 is in the negative, a subsequent profile photo is displayed with a similar query bar; in the event several of the responses to the query bars are in the negative, a signal and/or notice is sent to the User's Carer and/or Administrator to activate an intervention alert for the User.
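By way of illustration only, the photo-prompt sequence of FIG. 18 might be sketched as the following loop. The recognition callback (standing in for the User's YES/NO tap), the three-negative threshold, and the return labels are hypothetical assumptions.

```python
# Hypothetical sketch of the FIG. 18 photo-prompt loop.

def run_photo_prompts(photos, recognizes, max_negatives=3):
    """Walk the profile photos in chronological order, asking
    KNOW THIS PHOTO, YES, NO for each. Return 'oriented' on the first
    YES (the FIG. 13 questionnaire would then be re-run), or
    'intervention_alert' after max_negatives NOs, i.e., notify the
    Carer and/or Administrator."""
    negatives = 0
    for photo in photos:
        if recognizes(photo):          # the User taps YES
            return "oriented"
        negatives += 1                 # the User taps NO
        if negatives >= max_negatives:
            return "intervention_alert"
    return "intervention_alert"        # photos exhausted without a YES
```

The chronological ordering (childhood first) mirrors the text's observation that long-term memories are probed before recent ones.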
[0075] FIG. 19 discloses the device of FIG. 4 of the DCP with AR in use with the photo of FIG. 14, operating in the same mode and manner as the devices of FIG. 2, as discussed in reference to FIG. 18.
[0076] Additionally, the principles of the invention disclosed herein could be practiced by those skilled in the art with equivalent alternative constructions. Although the present invention has been described in considerable detail with reference to certain preferred embodiments thereof, other embodiments are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred embodiment(s) contained herein. The invention may be embodied and practiced in other specific forms without departing from the spirit and essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; and all variations, substitutions and changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
[0077] When introducing elements of the present invention or the preferred embodiment(s) thereof, the articles a, an, the, and said are intended to mean there are one or more of the elements. The terms comprising, including, and having are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, any description of the exemplary or preferred embodiments is intended to be read in connection with the accompanying drawings, which are to be considered part of the entire written description.