MOTION AND GESTURE-BASED MOBILE ADVERTISING ACTIVATION
20180040026 · 2018-02-08
Inventors
CPC classification
G06F3/04842
PHYSICS
G06F3/017
PHYSICS
G06F3/011
PHYSICS
G06Q30/0252
PHYSICS
International classification
G06F3/0488
PHYSICS
Abstract
The presentation of advertisements to a user on a mobile communications device is disclosed. A first external input corresponding to a triggering of an advertisement delivery is received on a first input modality. An advertisement overlay is displayed in a graphical user interface in response to receiving the first external input. Advertisement invocation instructions are displayed within the advertisement overlay. A second external input is received on a second input modality different from the first input modality. The second external input is translated to a set of quantified values. An advertisement is then displayed within the advertisement overlay in response to a substantial match between the set of quantified values translated from the received second external input and a set of predefined values corresponding to the advertisement invocation instructions.
Claims
1-20. (canceled)
21. A method for presenting content to a user on a mobile communications device, the method comprising: displaying an overlay in a graphical user interface, the graphical user interface being receptive to a first external input on a first input modality of the mobile communications device; displaying, within the overlay, content invocation instructions; receiving a sequence of second external inputs on a second input modality of the mobile communications device different from the first input modality; displaying an input progress indicator during the receiving of the sequence of second external inputs, the input progress indicator being incrementally updated in response to translated values of a partial sequence of the sequence of second external inputs matching a predefined condition; and displaying, within the overlay, the content separate from the content invocation instructions upon an evaluation of a substantial match between quantified values translated from the received sequence of second external inputs and a set of predefined values corresponding to the content invocation instructions.
22. The method of claim 21, wherein: the first input modality is a touch screen; and the first external input is a haptic contact upon the touch screen from the user corresponding in position to an activatable element displayed therein.
23. The method of claim 21, wherein the second input modality is a sensor selected from a group consisting of: an accelerometer, a compass, and a gyroscope.
24. The method of claim 23, wherein the sensor is embedded in the mobile communications device, and the second external input is a sequence of motions applied to the mobile communications device by the user that are translated to the set of quantified values by the sensor.
25. The method of claim 23, wherein: the sensor is embedded in an external device wearable by the user and in communication with the mobile communications device; and the second external input is a sequence of motions applied to the external device by the user that are translated to the set of quantified values by the sensor.
26. The method of claim 23, wherein the second external input is steps walked by the user as measured by the accelerometer.
27. The method of claim 23, wherein the second external input is steps run by the user as measured by the accelerometer.
28. The method of claim 23, wherein the second external input is a physical gesture as measured by the gyroscope.
29. The method of claim 23, wherein the second external input is a direction in which the mobile communications device is pointed as measured by the compass.
30. The method of claim 23, wherein the second external input is steps walked in a defined direction as measured by a combination of the accelerometer and the compass.
31. The method of claim 21, wherein: the second input modality is an on-board microphone; and the second external input is a sequence of audio commands generated by the user and captured by the on-board microphone.
32. The method of claim 21, wherein: the second input modality is an on-board camera; and the second external input is one or more user gestures graphically captured by the on-board camera.
33. The method of claim 32, wherein the one or more gestures are graphically captured from a face of the user.
34. The method of claim 32, wherein the gestures are graphically captured from a hand of the user.
35. The method of claim 21, wherein the second external input is translated to multiple sets of quantified values each corresponding to a different time instant.
36. A mobile communications device, comprising: a processor; an input/output interface connected to a display, a first input modality, and a second input modality, the input/output interface being controlled by the processor; a memory encoded with a content delivery application that when executed on the processor performs a method for presenting content to a user on the mobile communications device, the method comprising: displaying an overlay in a graphical user interface, the graphical user interface being receptive to a first external input on a first input modality of the mobile communications device; displaying, within the overlay, content invocation instructions; receiving a sequence of second external inputs on a second input modality of the mobile communications device different from the first input modality; displaying an input progress indicator during the receiving of the sequence of second external inputs, the input progress indicator being incrementally updated in response to translated values of a partial sequence of the sequence of second external inputs matching a predefined condition; and displaying, within the overlay, the content separate from the content invocation instructions upon an evaluation of a substantial match between quantified values translated from the received sequence of second external inputs and a set of predefined values corresponding to the content invocation instructions.
37. An article of manufacture comprising a non-transitory program storage medium readable by a computing device, the medium tangibly embodying one or more programs of instructions executable by the device to perform a method for presenting content to a user on a mobile communications device, the method comprising: displaying an overlay in a graphical user interface, the graphical user interface being receptive to a first external input on a first input modality of the mobile communications device; displaying, within the overlay, content invocation instructions; receiving a sequence of second external inputs on a second input modality of the mobile communications device different from the first input modality; displaying an input progress indicator during the receiving of the sequence of second external inputs, the input progress indicator being incrementally updated in response to translated values of a partial sequence of the sequence of second external inputs matching a predefined condition; and displaying, within the overlay, the content separate from the content invocation instructions upon an evaluation of a substantial match between quantified values translated from the received sequence of second external inputs and a set of predefined values corresponding to the content invocation instructions.
38. The article of manufacture of claim 37, wherein the second input modality is a sensor selected from a group consisting of: an accelerometer, a compass, and a gyroscope.
39. The article of manufacture of claim 37, wherein the first input modality is a touch screen, and the first external input is a haptic contact upon the touch screen from the user corresponding in position to an activatable element displayed therein.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] These and other features and advantages of the various embodiments disclosed herein will be better understood with respect to the following description and drawings, in which like numbers refer to like parts throughout, and in which:
DETAILED DESCRIPTION
[0028] The present disclosure encompasses various embodiments of methods for motion and gesture-based mobile advertising activation. The detailed description set forth below in connection with the appended drawings is intended as a description of the several presently contemplated embodiments of these methods, and is not intended to represent the only form in which the disclosed invention may be developed or utilized. The description sets forth the functions and features in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions may be accomplished by different embodiments that are also intended to be encompassed within the scope of the present disclosure. It is further understood that the use of relational terms such as first and second and the like is solely to distinguish one entity from another without necessarily requiring or implying any actual such relationship or order between such entities.
[0030] The mobile communications device 10 is understood to implement a wide range of functionality through different software applications, which are colloquially known as apps in the mobile device context. The software applications are comprised of pre-programmed instructions that are executed by a central processor 14 and that may be stored on a memory 16. The results of these executed instructions may be output for viewing by a user, and the sequence/parameters of those instructions may be modified via inputs from the user. To this end, the central processor 14 interfaces with an input/output subsystem 18 that manages the output functionality of a display 20 and the input functionality of a touch screen 22 and one or more buttons 24.
[0031] In a conventional smartphone device, the user primarily interacts with a graphical user interface that is generated on the display 20 and includes various user interface elements that can be activated based on haptic inputs received on the touch screen 22 at positions corresponding to the underlying displayed interface element. One of the buttons 24 may serve a general purpose escape function, while another may serve to power up or power down the mobile communications device 10. Additionally, there may be other buttons and switches for controlling volume, limiting haptic entry, and so forth. Those having ordinary skill in the art will recognize other possible input/output devices that could be integrated into the mobile communications device 10, and the purposes such devices would serve. Other smartphone devices may include keyboards (not shown) and other mechanical input devices, and the presently disclosed interaction methods with the graphical user interface detailed more fully below are understood to be applicable to such alternative input modalities.
[0032] The mobile communications device 10 includes several other peripheral devices. One of the more basic is an audio subsystem 26 with an audio input 28 and an audio output 30 that allows the user to conduct voice telephone calls. The audio input 28 is connected to a microphone 32 that converts sound to electrical signals, and may include amplifier and ADC (analog to digital converter) circuitry that transforms the continuous analog electrical signals to digital data. Likewise, the audio output 30 is connected to a loudspeaker 34 that converts electrical signals to air pressure waves that result in sound, and may include amplifier and DAC (digital to analog converter) circuitry that transforms the digital sound data to a continuous analog electrical signal that drives the loudspeaker 34. Additionally, it is possible to capture still images and video via a camera 36 that is managed by an imaging module 38.
[0033] Due to its inherent mobility, users can access information and interact with the mobile communications device 10 practically anywhere. Additional context in this regard is discernible from inputs pertaining to location, movement, and physical and geographical orientation, which further enhance the user experience. Accordingly, the mobile communications device 10 includes a location module 40, which may be a Global Positioning System (GPS) receiver that is connected to a separate antenna 42 and generates coordinates data of the current location as extrapolated from signals received from the network of GPS satellites. Motions imparted upon the mobile communications device 10, as well as the physical and geographical orientation of the same, may be captured as data with a motion subsystem 44, in particular, with an accelerometer 46, a gyroscope 48, and a compass 50, respectively. Although in some embodiments the accelerometer 46, the gyroscope 48, and the compass 50 directly communicate with the central processor 14, more recent variations of the mobile communications device 10 utilize the motion subsystem 44 that is embodied as a separate co-processor to which the acceleration and orientation processing is offloaded for greater efficiency and reduced electrical power consumption. One exemplary embodiment of the mobile communications device 10 is the Apple iPhone with the M7 motion co-processor.
[0034] The components of the motion subsystem 44, including the accelerometer 46, the gyroscope 48, and the compass 50, while shown as integrated into the mobile communications device 10, may be incorporated into a separate, external device. This external device may be wearable by the user and communicatively linked to the mobile communications device 10 over the aforementioned data link modalities. The same physical interactions contemplated with the mobile communications device 10 to invoke various functions as discussed in further detail below may be possible with such an external wearable device.
[0035] There are other sensors 52 that can be utilized in the mobile communications device 10 for different purposes. For example, one of the other sensors 52 may be a proximity sensor to detect the presence or absence of the user to invoke certain functions, while another may be a light sensor that adjusts the brightness of the display 20 according to ambient light conditions. Those having ordinary skill in the art will recognize that other sensors 52 beyond those considered herein are also possible.
[0036] With reference to the flowchart of
[0037] In further detail, on the upper right hand corner of the content panel 58, there is an activatable graphic element 62 that is a part of an advertisement delivery sub-application embedded within the content 60. As referenced broadly herein, the term embedded with respect to the advertisement delivery sub-application may mean an executable or scripted module that is incorporated into the underlying app, a single instruction or reference that invokes the functionality of the advertisement delivery sub-application, or any other modality of calling a separate set of instructions that perform the function of advertisement delivery as contemplated. In order for the user to continue to experience the underlying app and/or content in the same manner as before, any other suitable unobtrusive location within the graphical user interface 54 that indicates an advertisement is available to be viewed can be substituted.
[0038] Continuing on, the method includes a step 202 of receiving a first external input corresponding to a triggering of the activatable graphic element 62 of the advertisement delivery sub-application. The first external input is received on a first input modality. In the context of the mobile communications device 10 with a touch user interface, this refers to receiving a haptic contact on the touch screen 22 at a location corresponding in position to the displayed activatable graphic element 62. Other ways of providing the same input are also possible.
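The position comparison in step 202 can be reduced to a bounds check of the haptic contact against the rectangle occupied by the activatable graphic element 62. The following is a minimal sketch; the coordinate convention and the specific tap and bounds values are illustrative assumptions, not taken from the disclosure:

```python
def is_element_triggered(tap, element_bounds):
    """Return True when a haptic contact (x, y) on the touch screen falls
    within an activatable graphic element's rectangle, given as
    (left, top, width, height) in the same screen coordinates."""
    x, y = tap
    left, top, width, height = element_bounds
    return left <= x < left + width and top <= y < top + height
```

For instance, a tap at (310, 20) would trigger a hypothetical element occupying (300, 0, 40, 40) in the upper right corner of the content panel, while a tap elsewhere would not.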
[0039] As best shown in
[0040] Although the step 204 of displaying the advertisement overlay 64 has been described in the context of one embodiment in which the step is responsive to receiving the first external input that triggers the activatable graphic element 62, alternative embodiments are not limited thereto. For example, the advertisement overlay 64 may be displayed in response to a first external input that is independent of the graphical user interface 54 and any user interactions therewith. The first input modality of the mobile communications device 10 could be an indoor positioning system (beacon) receiver. When the mobile communications device 10 is brought into proximity with an indoor positioning system transmitter such that its signal can be received, the receipt of the beacon signal may serve as the first external input. Similarly, establishing a network link over particular wireless local area networks, being in a particular location as detected by the location module 40, being in a location with a particular type of weather reported, and so forth, can invoke the display of the advertisement overlay 64. Additional context can be discerned from nearby connected devices such as thermostats, televisions, lights, door locks, vehicles, and the like. Furthermore, the interaction with the graphical user interface as in the previously described embodiment could be combined with location triggering to further refine the type of advertisements that are presented to the user.
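The GUI-independent triggers described above can be modeled as a set of independent predicates, any one of which invokes the overlay. A minimal sketch follows; the signal dictionary keys and the example network name are hypothetical placeholders, not details from the disclosure:

```python
def should_display_overlay(signals):
    """Evaluate GUI-independent first external inputs; any satisfied
    condition invokes the display of the advertisement overlay."""
    return any((
        signals.get("beacon_detected", False),         # indoor positioning signal received
        signals.get("wifi_ssid") in {"STORE_GUEST"},   # a particular wireless LAN joined
        signals.get("in_target_location", False),      # location module geofence hit
        signals.get("weather") == "rain",              # reported weather condition
    ))
```

In practice such predicates could also be combined conjunctively with a graphical user interface interaction to further refine which advertisements are presented.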
[0041] The present disclosure contemplates the invocation of advertisements in response to various motion/gesture inputs applied to the mobile communications device 10 by the user. Such inputs are detected, measured, and quantified by the motion subsystem 44. The conventions of certain mobile communications devices 10 dictate obtaining consent from the user prior to the use of this data due to privacy concerns. As shown in
[0042] Referring again to the flowchart of
[0043] The captured second external input is thereafter translated to at least a set of quantified values in accordance with step 210. The second external input could be one set of data captured in one time instant as would be the case for direction and orientation, or it could be multiple sets of data captured over multiple time instances that represent a movement action. Where multiple sets of data are required to detect an action from the user, a progress indicator towards the completion thereof may be displayed on the advertisement overlay 64.
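As one hedged illustration of step 210, a sequence of accelerometer magnitude samples captured over multiple time instants might be translated into a quantified step count by threshold crossing, with a completion fraction driving the progress indicator on the advertisement overlay 64. The threshold value and the sampling scheme are assumptions made for illustration only:

```python
def count_steps(magnitudes, threshold=1.2):
    """Translate a sequence of accelerometer magnitude samples (in g)
    into a quantified step count via upward threshold crossings."""
    steps, above = 0, False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1
            above = True
        elif m <= threshold:
            above = False
    return steps

def progress_fraction(steps_detected, steps_required):
    """Fraction of the requested action completed, suitable for
    incrementally updating the displayed progress indicator."""
    return min(steps_detected / steps_required, 1.0)
```

With each new batch of samples, the indicator would be redrawn from the updated fraction until the quantified values substantially match the predefined values of the invocation instructions.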
[0044] With additional reference to
[0045] The display of the advertisement 76 need not be static, and may be modified according to other inputs being concurrently provided to the mobile communications device 10. For instance, a different graphic may be displayed in instances where the mobile communications device 10 is truly stationary, versus instances where the mobile communications device 10 is stationary but within a moving object such as a train, automobile, and the like. Where animated graphics are used as the advertisement 76, the playback speed can also be adjusted depending on the circumstances.
[0046] Other types of gestures and motions that can be imparted on the mobile communications device 10 by the user are also contemplated. For example,
[0047]
[0048] Yet another motion/gesture can be requested as shown in the advertisement invocation instructions 70 of
[0049] The foregoing examples illustrate that multiple inputs to the motion subsystem 44 can be used in sequence to correlate various actions by the user. However, it is also possible to utilize and prompt for single actions.
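The correlation of multiple sensor inputs, such as the steps walked in a defined direction recited in claim 30, can be sketched by pairing per-step events derived from the accelerometer with concurrent compass headings. The sample format and the tolerance window below are illustrative assumptions:

```python
def steps_in_direction(samples, target_heading, window=20.0):
    """Count step events whose concurrent compass heading falls within
    `window` degrees of `target_heading`, handling 360-degree wraparound.
    Each sample is a pair (step_detected: bool, heading_degrees: float)."""
    count = 0
    for stepped, heading in samples:
        # Smallest signed angular difference, mapped into [-180, 180)
        diff = abs((heading - target_heading + 180.0) % 360.0 - 180.0)
        if stepped and diff <= window:
            count += 1
    return count
```

A prompt of "walk ten steps north" would then be satisfied once this count, as one of the quantified values, reaches the predefined value of ten.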
[0050] Although the previous advertisement invocation instructions 70 each involved a motion or gesture that is measured by the motion subsystem 44, it is possible to use the other inputs of the mobile communications device 10 to similarly activate the advertisement 76. For example, a sequence of audio commands generated by the user and captured by the on-board microphone 32 may serve as the second external input.
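For the microphone modality, a captured audio command could likewise be reduced to quantified values, for example a match score between the recognized words and the prompted phrase, evaluated against a threshold in keeping with the "substantial match" standard rather than requiring an exact match. The scoring scheme below is an assumption for illustration, not the disclosed implementation:

```python
def command_match_score(recognized_words, expected_words):
    """Quantify a captured audio command as the fraction of expected
    words found among the recognized words (case-insensitive)."""
    recognized = {w.lower() for w in recognized_words}
    return sum(1 for w in expected_words if w.lower() in recognized) / len(expected_words)

def invokes_advertisement(recognized_words, expected_words, threshold=0.75):
    """A substantial match: the score need only meet the threshold."""
    return command_match_score(recognized_words, expected_words) >= threshold
```

An analogous approach could reduce camera-captured face or hand gestures to scored feature values before the same threshold comparison.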
[0051] As indicated above, the outputs generated throughout the steps of the method are all within the advertisement overlay 64, and designed to complement the native environment. Thus, it is possible to show the instructions, receive the input, and display the advertisement within the same main screen area 56 of the underlying app, without the need for leaving or exiting out of the same. Along these lines, whenever the advertisement overlay 64 is displayed, the user has the option to close the same and return to the underlying app by tapping a close button 88. The familiar entertainment and discovery that the user has come to expect from the app are still readily accessible, as the underlying app continues to run in the background. It is contemplated that advertisers will be able to attain superior brand engagements and drive messaging that imaginatively involves the user on a deeper level.
[0052] The particulars shown herein are by way of example and for purposes of illustrative discussion of the embodiments of the present disclosure only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects. In this regard, no attempt is made to show details of the present invention with more particularity than is necessary, the description taken with the drawings making apparent to those skilled in the art how the several forms of the present invention may be embodied in practice.