PORTABLE DEVICE
20170371420 · 2017-12-28
Inventors
- Mårten Skogö (Danderyd, SE)
- John Elvesjö (Danderyd, SE)
- Jan-Erik Lundkvist (Danderyd, SE)
- Per Lundberg (Danderyd, SE)
CPC classification
- G06F3/017
- G06F3/011
- G06F2200/1637
- G06F1/1684
Abstract
A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to: control the first motion sensor to detect first motion information; control the second motion sensor to detect second motion information; and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
Claims
1. A portable computing device, comprising: a base element; a lid element; a first motion sensor disposed in the base element; a second motion sensor disposed in the lid element; a processor configured to: control the first motion sensor to detect first motion information; control the second motion sensor to detect second motion information; and determine final motion information based at least in part on the first motion information and the second motion information; and an eye tracking system configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
2. The portable computing device according to claim 1, wherein: the first motion information comprises a first angle value between the gravitational force exerted by the Earth and a plane of the lid element; the second motion information comprises a second angle value between the gravitational force exerted by the Earth and a plane of the base element; and the final motion information comprises a third angle value derived from at least the first angle value and the second angle value.
3. The portable computing device according to claim 1, wherein the first motion sensor and the second motion sensor each comprise an accelerometer.
4. The portable computing device according to claim 3, wherein the second motion sensor is located below a display unit of the lid element and proximate to a hinge means rotatably coupling the base element with the lid element.
5. The portable computing device according to claim 1, wherein the one or more control processes include a selection from a group consisting of: turning off the eye tracking system; pausing image collection of the eye tracking system; adjusting one or more photography parameters of the eye tracking system; and controlling one or more system events.
6. The portable computing device according to claim 1, wherein the predetermined condition is selected from a group consisting of: the gaze position cannot be determined; and the gaze position cannot be determined in a predetermined area with respect to a display unit of the lid element.
7. The portable computing device according to claim 6, wherein the predetermined area is smaller than an entirety of a display area of the display unit.
8. The portable computing device according to claim 1, wherein the processor is further configured to determine 3D coordinate values of three or more positions on a display unit of the lid element based at least in part on at least one of the first motion information, the second motion information, or the final motion information.
9. The portable computing device according to claim 8, wherein the three or more positions include: at least one position proximate to a top right corner of the display unit; at least one position proximate to a top left corner of the display unit; and at least one position proximate to a lower left corner or a lower right corner of the display unit.
10. The portable computing device according to claim 1, wherein the eye tracking system is further configured to determine head pose information of the user with respect to a display unit of the lid element.
11. The portable computing device according to claim 10, wherein the determination of the gaze position is further based on the head pose information.
12. A method for controlling an eye tracking system of a portable device, wherein the method comprises: detecting first motion information with at least a first motion sensor disposed in a first part of a portable device; detecting second motion information with at least a second motion sensor disposed in a second part of the portable device; determining final motion information based on the first motion information and the second motion information; determining a gaze position of a user based at least in part on the final motion information; and executing one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
13. The method of claim 12, wherein: the first part comprises a base element of the portable device; and the second part comprises a lid element of the portable device.
14. The method of claim 13, wherein: the first motion information comprises a first angle value between the gravitational force exerted by the Earth and a plane of the base element; the second motion information comprises a second angle value between the gravitational force exerted by the Earth and a plane of the lid element; and the final motion information comprises a third angle value derived from at least the first angle value and the second angle value.
15. The method of claim 12, wherein the first motion sensor and the second motion sensor each comprise an accelerometer.
16. The method of claim 13, wherein the second motion sensor is located below a display unit of the lid element and proximate to a hinge means rotatably coupling the base element with the lid element.
17. The method of claim 12, wherein the one or more control processes include a selection from a group consisting of: turning off an eye tracking system; pausing image collection of the eye tracking system; adjusting one or more photography parameters of the eye tracking system; and controlling one or more system events.
18. The method of claim 13, wherein the predetermined condition is selected from a group consisting of: the gaze position cannot be determined; and the gaze position cannot be determined in a predetermined area with respect to a display unit of the lid element.
19. The method of claim 18, wherein the predetermined area is smaller than an entirety of a display area of the display unit.
20. The method of claim 13, wherein the method further comprises: determining 3D coordinate values of three or more positions on a display unit of the lid element based at least in part on at least one of the first motion information, the second motion information, or the final motion information.
21. The method of claim 20, wherein the three or more positions include: at least one position substantially close to the top right corner of the display unit; at least one position substantially close to the top left corner of the display unit; and at least one position substantially close to the lower right corner or the lower left corner of the display unit.
22. The method of claim 13, wherein the method further comprises: determining head pose information of the user with respect to a display unit of the lid element.
23. The method of claim 22, wherein the determination of the gaze position is further based on the head pose information.
24. A non-transitory computer readable medium having stored thereon a program for controlling an eye tracking system of a portable device comprising the steps of: detecting first motion information with at least a first motion sensor disposed in a first part of a portable device; detecting second motion information with at least a second motion sensor disposed in a second part of the portable device; determining final motion information based on the first motion information and the second motion information; determining a gaze position of a user based at least in part on the final motion information; and executing one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
25. The non-transitory computer readable medium of claim 24, wherein: the first part comprises a base element of the portable device; and the second part comprises a lid element of the portable device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] Embodiments of the present invention are described in conjunction with the appended figures.
DETAILED DESCRIPTION OF THE INVENTION
[0035] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing one or more exemplary embodiments. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
[0036] For example, any detail discussed with regard to one embodiment may or may not be present in all contemplated versions of that embodiment. Likewise, any detail discussed with regard to one embodiment may or may not be present in all contemplated versions of other embodiments discussed herein. Finally, the absence of discussion of any detail with regard to an embodiment herein shall be an implicit recognition that such detail may or may not be present in any version of any embodiment discussed herein.
[0037] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other elements in the invention may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0038] Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process may be terminated when its operations are completed, but could have additional steps not discussed or included in a figure. Furthermore, not all operations in any particularly described process may occur in all embodiments. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
[0039] The term “computer readable medium” or “machine readable medium” includes, but is not limited to transitory and non-transitory, portable or fixed storage devices, optical storage devices, wireless channels and various other mediums capable of storing, containing or carrying instruction(s) and/or data. A code segment or machine-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
[0040] Furthermore, embodiments of the invention may be implemented, at least in part, either manually or automatically. Manual or automatic implementations may be executed, or at least assisted, through the use of machines, hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer or machine readable medium. A processor(s) may perform the necessary tasks. The terms “comprises,” “comprising,” “includes,” “including,” and other terms herein specify the presence of stated features, integers, steps, or components. However, these terms do not preclude the presence or addition of one or more additional features, integers, steps, and/or components or groups thereof.
[0041] We refer initially to
[0042] The proposed portable device includes a first part 110 (here represented by a laptop base element) and a second part 120 (here represented by a laptop lid element). The second part 120, in turn, includes the optical remote sensing system 300. As is common in laptops, the second part 120 is pivotably attached to the first part 110 via a hinge means 115, such that the portable device may be arranged in an open and a closed position respectively.
[0043] The first and second parts 110 and 120 have a respective essentially flat inner surface 111 and 121. When the portable device is arranged in the closed position, the essentially flat inner surfaces 111 and 121 are parallel and face one another, as can be seen in
[0044] The first essentially flat surface 111 of the first part 110 preferably also includes a keyboard configured to receive input commands from the user. Moreover, the second essentially flat surface 121 of the second part 120 preferably includes a display unit 122 (see
[0047] Again, the portable device has first and second parts 110 and 120 that are pivotably attached to one another, such that the portable device may be arranged in an open and a closed position respectively. In this case, however, the optical remote sensing system 300 is not co-located with the hinge means 115. Instead, the optical remote sensing system 300 is disposed in a projection 125b extending along a distal side of the second part 120, whereas the hinge means 115 are arranged along a proximal side of the second part 120, which proximal and distal sides are opposite to one another.
[0050] Irrespective of whether the optical remote sensing system 300 is co-located with the hinge means 115 (as in
[0051] Additionally, regardless of the location of the optical remote sensing system 300, according to some embodiments of the invention, the optical remote sensing system 300 includes an image registering unit and at least one illuminator configured to illuminate the user. The image registering unit, in turn, may contain a still and/or a video camera configured to capture image data representing the user of the portable device, such as images of his/her eyes.
[0052] It is further advantageous if at least one of the at least one illuminator is configured to produce structured light which, when reflected off the user and registered by the image registering unit, yields data adapted for generating a depth map of the user. Depth maps are advantageous both when interpreting gestures and during eye tracking, for instance when selecting a relevant image segment to process.
[0053] Moreover, one or more of the at least one illuminator may be configured to produce near-infrared (NIR) light. NIR light is advantageous because it is relatively uncomplicated to detect with a camera and because it is invisible to the human eye. Thus, NIR light does not disturb the user.
[0054] It is further advantageous if one or more of the at least one illuminator is configured to produce a light beam whose direction is controllable to track a varying position of the user. If at least one of the at least one illuminator is configured to produce coherent light, diffractive optical elements (DOE) may be used to transform the light beam into a desired spatial pattern. Thus, the illumination can be controlled very efficiently, for instance to follow a position of the user.
[0055] Alternatively, or as a complement, at least one of the at least one illuminator may be based on LED technology. LEDs are desirable light sources since they are energy-efficient, compact and reliable.
[0056] In a third embodiment, a portable device having two portions connected via a hinge, such as a laptop, is capable of opening to a large angle (e.g., over 120 degrees) with respect to the surface of the base element (shown as the first part 110). Some convertible laptops are even equipped with a rotatable display, such that when the laptop is in the open position, the display portion can be rotated, twisted and tilted. Where the portable device is equipped with an eye tracker, the eye tracker is normally mounted toward the base of the display portion, and a large open angle, or an extremely tilted, twisted, or rotated display, presents a problem in that the eye tracker (shown as optical remote sensing system 300) may not be able to achieve optimal eye tracking performance.
[0057] For the purpose of this document, the term “open angle” refers to the angle representing the orientation of the first part 110 relative to the second part 120; in other words, the degree of openness of a laptop or similar portable device.
[0058] The performance of the eye tracker, such as the precision or accuracy of eye tracking, may be severely affected when, for example, the open angle is too large, as the gaze positions may then be too close to the edge of the second part 120 or even outside the gaze tracking area on the display unit 122. Therefore, there is a need to determine the open angle of the portable device, and ultimately the orientation of the lid element (shown as the second part 120), and to dynamically control and/or recalibrate the eye tracker based on this and/or on any other system events associated with the performance of the eye tracker. The exact position and/or orientation of the second part 120 in relation to the optical remote sensing system 300 may be stored in any form of computer readable storage of the portable device. This may allow for improved power efficiency and more accurate and precise eye tracking performance.
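By way of a minimal sketch only, the third angle value of claim 2 could be derived from two static 3-axis accelerometer readings, one per part, roughly as follows. The coordinate conventions (each sensor's x-axis parallel to the hinge means 115, each z-axis normal to its part's inner surface) are assumptions for illustration, not taken from this disclosure.

    import math

    def part_pitch_deg(accel):
        # Pitch of one part about the hinge axis, from a static 3-axis
        # accelerometer reading (ax, ay, az) in that part's own frame.
        _, ay, az = accel
        return math.degrees(math.atan2(ay, az))

    def open_angle_deg(base_accel, lid_accel):
        # Open angle between base and lid: 0 = closed, 180 = laid flat.
        # Derived from the two per-part angle values, as in claim 2.
        return 180.0 - (part_pitch_deg(lid_accel) - part_pitch_deg(base_accel))

    # Example: base flat on a table (gravity along +z), lid raised vertical.
    print(open_angle_deg((0.0, 0.0, 1.0), (0.0, 1.0, 0.0)))  # 90.0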
[0060] For this non-limiting example, the motion sensor may be any kind of inertial measurement unit (IMU) or microelectromechanical systems (MEMS) sensor, such as an accelerometer, gyroscope, and/or magnetometer. The motion sensor can also be an inertial measurement module coupled to a plurality of the aforementioned motion sensors or integrated as a System-in-Package (SiP).
[0061] The accelerometer may be an electromechanical device that measures acceleration forces, as would be readily understood by a person of skill in the art. These forces may be static, like the constant force of gravity in a case where the accelerometer is not moved or vibrated for a period of time, or they may be dynamic, caused by moving or vibrating the accelerometer. The accelerometer may be of different types, such as a digital or an analog accelerometer. The specifications of the accelerometer, such as the number of measurement axes, output range, sensitivity and dynamic range, can be manually set by a user or automatically set according to the usage of the portable device.
[0062] The gyroscope (or gyro sensor) senses angular velocity from the Coriolis force applied to a vibrating object. The vibrating object may be the second part 120, the first part 110, or the portable device in general. The type of gyroscope is also not limited, and may include tuning fork gyroscopes, vibrating-wheel gyroscopes, wine glass resonator gyroscopes or Foucault pendulum gyroscopes. The gyroscope may be a stand-alone chip module communicatively coupled to the system bus of the circuitry of the portable device, or may be printed onto the circuit board (e.g., motherboard) of the portable device using photolithography. Again, the specifications of the gyroscope, such as measurement range, number of sensing axes, linearity or nonlinearity, shock survivability, angular random walk (ARW), bias, bias drift and bias instability, can be manually set by a user or automatically set according to the usage of the portable device.
[0063] Alternatively, the motion sensor 401 and/or 402 may be a composite module coupled with one or more accelerometers and/or one or more gyroscopes, enabling measurement of both acceleration forces and angular velocity.
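When both signals are available, a common way to combine them is a complementary filter: the gyroscope rate is integrated for fast short-term response while the accelerometer-derived angle corrects long-term drift. The following is a generic sketch of that well-known technique, not a procedure described in this disclosure.

    def fuse_angle(angle_deg, gyro_rate_dps, accel_angle_deg, dt_s, alpha=0.98):
        # Integrate the gyro rate over one time step, then pull the result
        # toward the accelerometer-derived angle to cancel gyro drift.
        integrated = angle_deg + gyro_rate_dps * dt_s
        return alpha * integrated + (1.0 - alpha) * accel_angle_deg

    # Example: one 10 ms update while the lid opens at 20 degrees/second.
    angle = fuse_angle(90.0, 20.0, 90.3, 0.01)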
[0064] The one or more motion sensors in the second part 120 are preferably placed at, or substantially close to, the bottom of the second part 120, where the bottom is the edge of the second part 120 located closest to the first part 110; this means the one or more motion sensors may be substantially close to the hinge means 115. “Substantially close” in this context means proximate to the hinge and distal to the opposite edge of the second part 120. In some embodiments, this may mean the one or more motion sensors are located below the screen of the second part 120. Such placement is advantageous because it minimizes the risk of low determination accuracy caused by accidental or unwanted vibration of the second part 120, which may affect the precision of angle determination (details are described in the following paragraphs). The second part 120 of a portable device (e.g., a laptop) may be of such limited thickness that minor vibrations cause the second part 120 to shake. However, the placement of the one or more motion sensors is not limited to the aforementioned position; it can be any place in the second part 120.
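Regardless of placement, vibration-induced noise in the measured angle can also be suppressed in software. One illustrative option, offered here only as an assumption rather than a technique specified in this disclosure, is a simple exponential low-pass filter over successive angle samples:

    def low_pass(prev_filtered_deg, new_sample_deg, beta=0.9):
        # Exponential smoothing: beta close to 1 rejects brief vibration
        # spikes at the cost of a slower response to real lid movement.
        return beta * prev_filtered_deg + (1.0 - beta) * new_sample_deg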
[0065] The motion sensor may be integrated into the motherboard of the portable device or into the circuitry of the first part 110 or the second part 120. Alternatively, the motion sensor may be placed externally with respect to the enclosure (e.g., either the second part 120 or the first part 110) of the portable device. In such circumstances, the motion sensor may be provided as a module of a Raspberry Pi® that is communicatively coupled to the laptop via any I/O interface (e.g., Universal Serial Bus (USB)), or preferably has wireless connectivity (e.g., Bluetooth™, WiFi) for data transmission. Other similar forms of embedded devices are possible.
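As a sketch of how such an externally attached sensor might be read over USB, the following uses the pyserial package; the port name and the comma-separated line format are assumptions for illustration, not specified by this disclosure.

    import serial  # pip install pyserial

    def read_accel_sample(port="/dev/ttyUSB0", baud=115200):
        # Read one "ax,ay,az" line from an external motion-sensor module.
        with serial.Serial(port, baud, timeout=1.0) as link:
            line = link.readline().decode("ascii").strip()
        ax, ay, az = (float(v) for v in line.split(","))
        return ax, ay, az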
[0067] In a non-limiting example, as shown in
[0068] Final motion information is then determined in the processor via arithmetic calculation using V_closeup and the 3D coordinate values of the at least three points. Optionally, the final motion information may be determined using only either V_closeup or the 3D coordinate values of the at least three points. The final motion information is sent to the eye tracker. Alternatively, the final motion information may first be checked to determine whether it is within a predetermined threshold value. The predetermined threshold value may indicate a range of open angle values; in an extreme circumstance, the open angle of the laptop or the twisting angle of the second part 120 may be too large for gaze determination (as described below). After the acquisition of the final motion information, the eye tracker is controlled to determine the user's gaze positions of at least one eye relative to the display unit 122. The eye tracker may also determine the head pose of the user with respect to the display unit 122 and take the head pose information into account when determining gaze positions. Here, head pose may be determined based on one or more images captured by the optical remote sensing system 300. The head pose may be defined by the position and the orientation of the head in three-dimensional space at some particular time. The head pose may be determined by examining one or more facial features (for instance mouth, nose, etc.) and their positions and orientations relative to one another. As non-limiting determination conditions, under the control of the processor, the eye tracker is controlled to (a) determine whether the gaze positions cannot be determined at all with respect to the display unit 122; or (b) determine whether the gaze positions cannot be determined in a predetermined area with respect to the display unit 122 (e.g., a corner of the display unit, or an area close to the edge of the display unit 122). Note that “gaze position” is used herein not only to describe actual determined gaze positions, but may also include a description of a scenario where the gaze position cannot be determined with respect to the display unit 122 or otherwise.
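A minimal sketch of determination conditions (a) and (b) above might look as follows; the pixel margin and the data shapes are illustrative assumptions only.

    def predetermined_condition_met(gaze_xy, display_w, display_h, margin=50):
        # Condition (a): no gaze position could be determined at all.
        if gaze_xy is None:
            return True
        # Condition (b): the gaze falls outside a predetermined area that
        # is smaller than the full display (here, `margin` pixels in from
        # every edge of the display unit 122).
        x, y = gaze_xy
        inside = (margin <= x <= display_w - margin
                  and margin <= y <= display_h - margin)
        return not inside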
[0069] If the result of the aforementioned determination is positive, then the eye tracker is controlled to perform one or both of the following executions: (i) the eye tracker may be powered on or off, or paused for image collection, or one or more photography parameters (e.g., frame rate, exposure, ISO, etc.) may be adjusted for image collection, or the calibration process of the eye tracker may be started over; (ii) one or more system events associated with the eye tracker may be controlled, such as a function of an application on the portable device. Other executions are possible, not limited to the aforementioned executions.
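Executions (i) and (ii) could be dispatched along the following lines; the tracker interface shown is hypothetical and only illustrates the mapping from a chosen control process to an action.

    def execute_control_process(tracker, action):
        # Map a chosen control process to a hypothetical eye-tracker API.
        if action == "power_off":
            tracker.set_power(False)
        elif action == "pause_image_collection":
            tracker.pause_image_collection()
        elif action == "adjust_photography_parameters":
            # For instance, lower the frame rate and shorten the exposure.
            tracker.set_photography_params(frame_rate=30, exposure_ms=4)
        elif action == "recalibrate":
            tracker.start_calibration()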
[0070] Next, referring to the flowchart of
[0071] At step 1040, the 3D coordinate values of three or more points on the display unit 122 are calculated in the processor. At step 1050, final motion information is determined using the calculated 3D coordinate values and the open angle value. At step 1060, after the determination of the final motion information, it is further determined whether the final motion information is within the threshold value. If the final motion information is within the threshold value, the method proceeds to step 1070; if it is not within the threshold value, gaze determination is not performed.
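Step 1040 might be realized as in the following sketch, in which the 3D coordinates of three display corners are obtained from the open angle and the physical display size. The coordinate conventions (origin on the hinge axis, x along the hinge, base in the x-y plane) are assumptions for illustration, not taken from this disclosure.

    import math

    def display_corner_coords(open_angle_deg, width_mm, height_mm):
        # Direction along the lid's surface in the (y, z) plane for the
        # given open angle (0 = closed onto the base, 90 = upright).
        t = math.radians(open_angle_deg)
        uy, uz = math.cos(t), math.sin(t)
        top_left = (0.0, height_mm * uy, height_mm * uz)
        top_right = (width_mm, height_mm * uy, height_mm * uz)
        lower_left = (0.0, 0.0, 0.0)  # on the hinge axis
        return top_left, top_right, lower_left

    # Example: a 300 mm x 200 mm display opened to 120 degrees.
    print(display_corner_coords(120.0, 300.0, 200.0))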
[0072] At step 1070, the eye tracker is controlled to determine gaze positions of the user with respect to the display unit 122. At step 1080, the gaze positions are used to (a) determine whether the gaze positions cannot be determined at all with respect to the display unit 122; or (b) determine whether the gaze positions cannot be determined in a predetermined area with respect to the display unit 122 (e.g., a corner of the display unit, or an area close to the edge of the display unit 122). If the determination result is positive, the method proceeds to step 1090, where the eye tracker is controlled to execute a corresponding control process; if the determination result is negative, no control process is executed.
[0073] Embodiments of the invention have now been described in detail for the purposes of clarity and understanding. However, it will be appreciated that certain changes and modifications may be practiced within the scope of the appended claims.