SANDBOXING FOR SEPARATING ACCESS TO TRUSTED AND UNTRUSTED WEARABLE PERIPHERALS
20250232028 · 2025-07-17
Inventors
CPC classification
G06F9/5027 (PHYSICS)
G06F3/011 (PHYSICS)
G09G5/397 (PHYSICS)
H04L67/52 (ELECTRICITY)
H04L67/12 (ELECTRICITY)
G09G2360/06 (PHYSICS)
G06F9/5038 (PHYSICS)
G09G2340/10 (PHYSICS)
G06F21/53 (PHYSICS)
G09G2320/0261 (PHYSICS)
G06F3/0481 (PHYSICS)
International classification
Abstract
Techniques include adding a trusted wearable services module to a sandbox/isolated module on the companion device, e.g., to a Private Compute Core. This trusted wearable services module has a secure connection to the camera on the wearable device and prevents other modules on the companion device from viewing the private data. The trusted wearable services module has the ability to encrypt and decrypt data from the camera and also performs the processing used to determine user context in an ambient sensing situation.
Claims
1. A method, comprising: receiving, by an isolated module of a companion device connected to a wearable device, a request from a manager module of the companion device to determine a user context of a user wearing the wearable device, the isolated module not sharing sensor data with the manager module; receiving, from the wearable device, encrypted sensor data acquired with a sensor of the wearable device, the encrypted sensor data being sensor data acquired from a sensor and encrypted; and determining the user context based on the sensor data.
2. The method as in claim 1, further comprising: establishing a secure connection with the wearable device.
3. The method as in claim 2, wherein the secure connection includes a transport layer security (TLS) protocol.
4. The method as in claim 1, wherein the sensor is a world-facing camera and the encrypted data includes a set of encrypted images, the images having been acquired with the world-facing camera.
5. The method as in claim 1, further comprising: sending data representing the user context to the manager module.
6. The method as in claim 1, wherein the user context includes an environment in which the user is driving.
7. The method as in claim 1, wherein determining the user context includes: inputting the sensor data into a machine learning engine, the machine learning engine being configured to determine the user context based on sensor data.
8. The method as in claim 7, wherein determining the user context further includes: decrypting the encrypted sensor data before inputting the sensor data into the machine learning engine.
9. A computer program product comprising a non-transitory storage medium, the computer program product including code that, when executed by at least one processor, causes the at least one processor to perform a method, in particular as claimed in any of the preceding claims, the method comprising: receiving, by an isolated module of a companion device connected to a wearable device, a request from a manager module of the companion device to determine a user context of a user wearing the wearable device, the isolated module not sharing sensor data with the manager module; receiving, from the wearable device, encrypted sensor data acquired with a sensor of the wearable device, the encrypted sensor data being sensor data acquired from a sensor and encrypted; and determining the user context based on the sensor data.
10. The computer program product as in claim 9, wherein the method further comprises: establishing a secure connection with the wearable device.
11. The computer program product as in claim 10, wherein the secure connection includes a transport layer security (TLS) protocol.
12. The computer program product as in claim 9, wherein the sensor is a world-facing camera and the encrypted data includes a set of encrypted images, the images having been acquired with the world-facing camera.
13. The computer program product as in claim 9, wherein the method further comprises: sending data representing the user context to the manager module.
14. The computer program product as in claim 9, wherein the user context includes an environment in which the user is driving.
15. The computer program product as in claim 9, wherein determining the user context includes: inputting the sensor data into a machine learning engine, the machine learning engine being configured to determine user context based on sensor data.
16. The computer program product as in claim 15, wherein determining the user context further includes: decrypting the encrypted sensor data before inputting the sensor data into the machine learning engine.
17. An apparatus, comprising: memory; and processing circuitry coupled to the memory, the processing circuitry being configured to: receive, by an isolated module of a companion device connected to a wearable device, a request from a manager module of the companion device to determine a user context of a user wearing the wearable device, the isolated module not sharing sensor data with the manager module; receive, from the wearable device, encrypted sensor data acquired with a sensor of the wearable device, the encrypted sensor data being sensor data acquired from a sensor and encrypted; and determine the user context based on the sensor data.
18. The apparatus as in claim 17, wherein the processing circuitry is further configured to: establish a secure connection with the wearable device.
19. The apparatus as in claim 18, wherein the secure connection includes a transport layer security (TLS) protocol.
20. The apparatus as in claim 17, wherein the sensor is a world-facing camera and the encrypted data includes a set of encrypted images, the images having been acquired with the world-facing camera.
21. The apparatus as in claim 17, wherein the processing circuitry is further configured to: send data representing the user context to the manager module.
22. The apparatus as in claim 17, wherein the user context includes an environment in which the user is driving.
23. The apparatus as in claim 17, wherein the processing circuitry configured to determine the user context is further configured to: input the sensor data into a machine learning engine, the machine learning engine being configured to determine user context based on sensor data.
24. The apparatus as in claim 23, wherein the processing circuitry configured to determine the user context is further configured to: decrypt the encrypted sensor data before inputting the sensor data into the machine learning engine.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0009]
[0010]
[0011]
[0012]
[0013]
[0014]
DETAILED DESCRIPTION
[0015] This disclosure relates to private data usage in a wearable device (e.g., head mounted device (HMD), augmented reality (AR) smartglasses). There are several privacy modes of data usage for a wearable device, including recording, intentional sensing, and ambient sensing.
[0016] Recording refers to the saving of photos or video generated by a world-facing camera of the wearable device. In this case, a bystander indicator such as an LED communicates to a bystander that an image is being taken and recorded; the bystander can then take action to protect their privacy.
[0017] Intentional sensing refers to usage such as object detection in which images are used for machine learning processing but are not being saved. A bystander indicator may be used in this case as well.
[0018] Ambient sensing refers to usage that determines a context in which the user is operating. For example, the user may be driving, and the wearable device may detect that the user is driving. Such detection may occur without the user doing anything or without the user's knowledge. Images taken with a world-facing camera are not saved. In this case, a bystander indicator is not used and the data should be kept private.
[0019] The need for privacy may complicate the ability to share data with a companion device in a split-compute architecture.
[0020] Wearable devices such as smartglasses can be configured to operate based on various constraints so that the smartglasses can be useful in a variety of situations. Example smartglasses constraints can include, for example, (1) smartglasses should amplify key services through wearable computing (this can include supporting technologies such as augmented reality (AR) and visual perception); (2) smartglasses should have sufficient battery life (e.g., last at least a full day of use on a single charge); and/or (3) smartglasses should look and feel like real glasses. Smartglasses can include AR and virtual reality (VR) devices. Fully stand-alone smartglasses solutions with mobile systems on chip (SoCs) that have the capability to support the desired features may not meet the power and industrial design constraints of smartglasses as described above. On-device compute solutions that meet constraints (1), (2) and/or (3) may be difficult to achieve with existing technologies.
[0021] A split compute architecture within smartglasses can be an architecture where the app runtime environment is at a remote compute endpoint, such as a mobile device, a server, the cloud, a desktop computer, or the like, hereinafter often referred to as a companion device for simplicity. In some implementations, data sources such as IMU, camera sensors, and microphones (for audio data) can be streamed from the wearable device to the companion device. In some implementations, display content can be streamed from the compute endpoint back to the wearable device. In some implementations, because the majority of the compute and rendering does not happen on the wearable device itself, the split compute architecture can allow leveraging low-power MCU based systems. In some implementations, this can allow keeping power and ID in check, meeting at least constraints (1), (2) and/or (3). With new innovation in codecs and networking, it is possible to sustain the required networking bandwidth in a low power manner. In some implementations, a wearable device could connect to more than one compute endpoint at a given time. In some implementations, different compute endpoints could provide different services. In some implementations, with low-latency, high-bandwidth 5G connections becoming mainstream, compute endpoints could operate in the cloud.
[0022] In some implementations, a split compute architecture can move the application runtime environment from the wearable device to a remote endpoint such as a companion device (phone, watch) or cloud. Wearable device hardware only does the bare minimum, such as streaming of data sources (Camera, IMU, audio), pre-processing of data (e.g., feature extraction, speech detection) and finally the decoding and presentation of visuals.
[0023] Doing less on the wearable device can enable reducing the hardware and power requirements. In some implementations, a split-compute architecture may reduce the size of the temples. In some implementations, a split-compute architecture may enable leveraging large ecosystems. In some implementations, a split-compute architecture may enable building experiences that are no longer limited by the hardware capabilities of the wearable device.
[0024] Some companion devices have a sandbox/isolated module in which private data is kept apart from other modules on the companion device. For example, an operating system may have an open source, secure environment that is isolated from the rest of the operating system and apps. For example, sensitive data (e.g., sensor data) processed in such a secure environment is not shared with any apps without the user taking an action. Along these lines, until the user sends an indication, the OS keeps the user's reply hidden from both the keyboard and the app into which the user is typing.
[0025] A technical problem with the above-described private data usage is that the sandboxes that run on a companion device do not work with data from wearable devices in a split-compute architecture. For example, an isolated, secure environment does not have a facility that recognizes data from wearable devices in a split-compute architecture. Accordingly, the data generated by the wearable device during ambient or intentional sensing may not be kept private.
[0026] A technical solution to the above technical problem includes adding a trusted wearable services module to a sandbox/isolated module on the companion device. This trusted wearable services module has a secure connection to the camera on the wearable device (or another sensor of the wearable device) and prevents other modules on the companion device from viewing the private data. The trusted wearable service module has the ability to encrypt and decrypt data from the camera and also performs the processing used to determine user context (e.g., in an ambient sensing situation).
[0027] A technical advantage of the above-described technical solution is that the technical solution allows ambient and other sensing to be performed on a wearable device in a split-compute architecture while keeping data private from other modules on the companion device.
[0028] User context as used herein is a classification of what a user is doing in their environment. In one example, a user context indicates whether a user is driving or not, walking or not, or running or not. In some implementations, a user context indicates whether a user is moving quickly or slowly. In another example, a user context indicates whether the user is performing an action such as viewing an object or speaking with another person. The user context thus may be data indicating an activity of a user and/or characterizing an activity of a user.
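The user-context classifications described above could be represented as a simple enumeration. The sketch below is illustrative only; the specific labels and the `describe` helper are assumptions, not part of the disclosure:

```python
from enum import Enum, auto

class UserContext(Enum):
    """Illustrative user-context labels; the description mentions contexts
    such as driving, walking, running, and speaking with another person."""
    DRIVING = auto()
    WALKING = auto()
    RUNNING = auto()
    SPEAKING = auto()
    UNKNOWN = auto()

def describe(context: UserContext) -> str:
    # Data "indicating an activity of a user", per paragraph [0028].
    return f"user context: {context.name.lower()}"
```

A manager module could then consume the classification without ever seeing the underlying sensor data.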
[0029]
[0030] As shown in
[0031] In some implementations, the at least one processor 114 is configured to capture and encrypt sensor data prior to transmission to a companion device. Moreover, in some implementations the at least one processor 114 is configured to transmit encrypted sensor data over a secure connection to the companion device.
[0032]
[0033] On the wearable device, there is at least one sensor 205. The at least one sensor 205 can include a world-facing camera, an inertial measurement unit (IMU), and the like. The at least one sensor 205 is configured to acquire sensor data, which in some cases has a privacy concern. For example, a world-facing camera may take an image of a bystander without the bystander's knowledge. Accordingly, such sensor data should be hidden from modules on the companion device that could use the sensor data in a way that would violate the privacy of the bystander.
[0034] As shown in
[0035] Moreover, the secure sensor datasource 210 is connected to the trusted wearable services 220 via a secure connection. In some implementations, the secure connection is a transport layer security (TLS) connection. In some implementations, the secure connection includes a QUIC protocol.
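A TLS-protected channel of the kind described above could be configured with standard library primitives. The sketch below is a generic client-side setup under assumed defaults, not code from the disclosure:

```python
import ssl

def make_secure_channel_context() -> ssl.SSLContext:
    """Build a client-side TLS context suitable for a secure connection
    between a sensor datasource and a trusted service (a generic sketch)."""
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    # Reject legacy protocol versions; require at least TLS 1.2.
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    # Verify the peer's certificate and hostname before sending sensor data.
    context.check_hostname = True
    context.verify_mode = ssl.CERT_REQUIRED
    return context

# Wrapping a connected socket would then look like (host/port assumed):
# with socket.create_connection((host, port)) as sock:
#     with make_secure_channel_context().wrap_socket(
#             sock, server_hostname=host) as tls_sock:
#         tls_sock.sendall(encrypted_sensor_data)
```

A QUIC-based channel, also mentioned above, would need a third-party library, since the Python standard library does not provide QUIC.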
[0036] On the companion device, the trusted wearable service 220 is configured to determine user context based on the sensor data received from a remote endpoint such as the secure sensor datasource 210. In some implementations, the trusted wearable service 220 includes a machine learning engine. In some implementations, the machine learning engine is configured to take as input encrypted sensor data and output a user context (e.g., user is driving or user is not driving). In some implementations, the machine learning engine is configured to take as input decrypted sensor data; in such an implementation, the trusted wearable service is further configured to decrypt the encrypted sensor data prior to input into the machine learning engine. In some implementations, the decryption is performed using a private key corresponding to the public key used by the secure sensor datasource 210 to encrypt the sensor data.
[0037] In some implementations, the trusted wearable service 220 is part of a private computing sandbox used to isolate private data from other modules on the companion device. For example, when the companion device uses an open source, isolated secure environment, the trusted wearable service 220 is an extension of such an environment used to isolate private data of the companion device from other modules of the companion device. Accordingly, the trusted wearable services 220 is an extension of a sandbox in that the isolation is extended to data received from the wearable device.
[0038] In some implementations, the trusted wearable service 220 is configured to send a request to the secure sensor datasource 210 for sensor data. Such a request may be sent in response to a request from the wearable manager 225 for user context.
[0039] The wearable manager 225 is configured to request user context from the trusted wearable services 220 and to receive the user context once determined by the trusted wearable services 220. The wearable manager 225 is also configured to control other wearable computation tasks on the companion device. For example, once the wearable manager 225 receives a user context indicating that the user is driving, the wearable manager 225 can send that user context to other wearable-core modules configured to use the user context to perform other functions.
[0040]
[0041] In some implementations, one or more of the components of the companion device 320 can be, or can include processors (e.g., processing units 324) configured to process instructions stored in the memory 326. Examples of such instructions as depicted in
[0042] The trusted wearable service 330 is configured to perform operations on private data in isolation from other wearable application modules (e.g., wearable manager 350) of the companion device 320 in a split-compute architecture. The trusted wearable service corresponds to the trusted wearable service 220 in
[0043] The decryption manager 332 is configured to perform decryption operations on sensor data (e.g., sensor data 342 of trusted wearable data 340). In some implementations, the encryption is public key encryption and the decryption operation is performed using the private key corresponding to the public key used for encryption. It is noted that the decryption operations cannot be performed outside of the trusted wearable service 330.
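The decryption step could be sketched as follows. This is a toy symmetric stand-in (a hash-derived XOR keystream) chosen so the example stays self-contained; the disclosure describes public-key encryption, for which a real implementation would use an asymmetric or hybrid scheme (e.g., RSA-OAEP wrapping an AES-GCM key), not this construction:

```python
import hashlib
from itertools import count

def _keystream(key: bytes, length: int) -> bytes:
    """Deterministic keystream derived from a shared secret (toy only)."""
    out = bytearray()
    for block in count():
        out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return bytes(out[:length])

def decrypt_sensor_data(key: bytes, ciphertext: bytes) -> bytes:
    """Toy stand-in for a decryption manager: XOR with a keystream.
    With XOR, the same operation also encrypts, which keeps the
    illustration compact; do not use this in a real system."""
    stream = _keystream(key, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, stream))
```

The essential property mirrored here is that only the module holding the key material can recover the plaintext sensor data.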
[0044] The machine learning engine 334 is configured to take as input sensor data (e.g., sensor data 342) and based on the input sensor data, produce user context data 344 representing a user context (e.g., is the user driving?). In some implementations, the machine learning engine 334 takes as input encrypted sensor data and the decryption manager 332 does not perform a decryption of the encrypted sensor data. In some implementations, the machine learning engine 334 includes a convolutional neural network.
[0045] The wearable manager 350 is configured to perform computations with regard to the wearable device connected to the companion device 320 in a split-compute environment. The wearable manager is isolated from the trusted wearable service 330 in that the wearable manager does not have access to private data used by or generated by the trusted wearable service. For example, the wearable manager 350 is configured to generate or receive wearable data 360 such as request data 362 representing a request for user context that is sent to the trusted wearable service 330. Also, the wearable manager 350 is configured to receive user context data 344 for use by wearable application modules on the companion device in the split-compute architecture.
The components (e.g., modules, processing units 324) of companion device 320 can be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that can include one or more types of hardware, software, firmware, operating systems, runtime libraries, and/or so forth. In some implementations, the components of the companion device 320 can be configured to operate within a cluster of devices (e.g., a server farm). In such an implementation, the functionality and processing of the components of the companion device 320 can be distributed to several devices of the cluster of devices.
[0047] The components of the companion device 320 can be, or can include, any type of hardware and/or software configured to process private data from a wearable device in a split-compute architecture. In some implementations, one or more portions of the components shown in the components of the companion device 320 in
[0048] The communication interface 322 includes, for example, wireless adaptors, and the like, for converting electronic and/or optical signals received from the network to electronic form for use by the companion device 320. The set of processing units 324 include one or more processing chips and/or assemblies. The memory 326 includes both volatile memory (e.g., RAM) and non-volatile memory, such as one or more ROMs, disk drives, solid state drives, and the like. The set of processing units 324 and the memory 326 together form processing circuitry, which is configured and arranged to carry out various methods and functions as described herein.
[0049] Although not shown, in some implementations, the components of the companion device 320 (or portions thereof) can be configured to operate within, for example, a data center (e.g., a cloud computing environment), a computer system, one or more server/host devices, and/or so forth. In some implementations, the components of the companion device 320 (or portions thereof) can be configured to operate within a network. Thus, the components of the companion device 320 (or portions thereof) can be configured to function within various types of network environments that can include one or more devices and/or one or more server devices. For example, the network can be, or can include, a local area network (LAN), a wide area network (WAN), and/or so forth. The network can be, or can include, a wireless network and/or wireless network implemented using, for example, gateway devices, bridges, switches, and/or so forth. The network can include one or more segments and/or can have portions based on various protocols such as Internet Protocol (IP) and/or a proprietary protocol. The network can include at least a portion of the Internet.
[0050] In some implementations, one or more of the components of the companion device 320 can be, or can include, processors configured to process instructions stored in a memory. For example, trusted wearable services 330 (and/or a portion thereof) and wearable manager 350 (and/or a portion thereof) are examples of such instructions.
[0051] In some implementations, the memory 326 can be any type of memory such as a random-access memory, a disk drive memory, flash memory, and/or so forth. In some implementations, the memory 326 can be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with the components of the companion device 320. In some implementations, the memory 326 can be a database memory. In some implementations, the memory 326 can be, or can include, a non-local memory. For example, the memory 326 can be, or can include, a memory shared by multiple devices (not shown). In some implementations, the memory 326 can be associated with a server device (not shown) within a network and configured to serve the components of the companion device 320. As illustrated in
[0052]
[0053] In some implementations, one or more of the components of the wearable device 420 can be, or can include processors (e.g., processing units 424) configured to process instructions stored in the memory 426. Examples of such instructions as depicted in
[0054] The sensor manager 430 is configured to generate sensor data 432 for use by the companion device. In one example, the sensor manager 430 acquires world-facing images from a world-facing camera on the wearable device 420; in this case, the sensor data is a world-facing image that may include a bystander. In another example, the sensor manager 430 acquires IMU data from an IMU of the wearable device 420.
[0055] The encryption manager 440 (corresponding to secure sensor datasource 210) is configured to perform an encryption operation on the sensor data 432 to produce encrypted sensor data 442. In some implementations, the encryption manager 440 uses a public key sent by an isolated trusted wearable services module (e.g., trusted wearable services 330) running on the companion device to effect the encryption. In some implementations, the encryption manager 440 is configured to send the encrypted sensor data 442 to the isolated trusted wearable services module over a secure connection, e.g., a transport layer security (TLS) connection.
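Once encrypted, each sensor payload must be delimited before it crosses the secure connection, since TLS delivers a byte stream rather than discrete records. The length-prefixed wire format below is an illustrative assumption, not part of the disclosure:

```python
import struct

def frame_payload(encrypted: bytes) -> bytes:
    """Prefix an encrypted sensor payload with a 4-byte big-endian length
    so the receiving side can split the stream back into records
    (an assumed wire format, for illustration)."""
    return struct.pack(">I", len(encrypted)) + encrypted

def unframe_payloads(stream: bytes) -> list[bytes]:
    """Recover the individual encrypted payloads from a received stream."""
    payloads, offset = [], 0
    while offset < len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        payloads.append(stream[offset:offset + length])
        offset += length
    return payloads
```

On the companion side, an isolated trusted service would unframe the stream and pass each record to its decryption step.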
The components (e.g., modules, processing units 424) of wearable device 420 can be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that can include one or more types of hardware, software, firmware, operating systems, runtime libraries, and/or so forth. In some implementations, the components of the wearable device 420 can be configured to operate within a cluster of devices (e.g., a server farm). In such an implementation, the functionality and processing of the components of the wearable device 420 can be distributed to several devices of the cluster of devices.
[0057] The communication interface 422 includes, for example, Ethernet adaptors, Token Ring adaptors, and the like, for converting electronic and/or optical signals received from the network to electronic form for use by the wearable device 420. The set of processing units 424 include one or more processing chips and/or assemblies. The memory 426 includes both volatile memory (e.g., RAM) and non-volatile memory, such as one or more ROMs, disk drives, solid state drives, and the like. The set of processing units 424 and the memory 426 together form processing circuitry, which is configured and arranged to carry out various methods and functions as described herein.
[0058] The components of the wearable device 420 can be, or can include, any type of hardware and/or software configured to acquire and encrypt sensor data for split compute environments. In some implementations, one or more portions of the components shown in the components of the wearable device 420 in
[0059] Although not shown, in some implementations, the components of the wearable device 420 (or portions thereof) can be configured to operate within, for example, a data center (e.g., a cloud computing environment), a computer system, one or more server/host devices, and/or so forth. In some implementations, the components of the wearable device 420 (or portions thereof) can be configured to operate within a network. Thus, the components of the wearable device 420 (or portions thereof) can be configured to function within various types of network environments that can include one or more devices and/or one or more server devices. For example, the network can be, or can include, a local area network (LAN), a wide area network (WAN), and/or so forth. The network can be, or can include, a wireless network and/or wireless network implemented using, for example, gateway devices, bridges, switches, and/or so forth. The network can include one or more segments and/or can have portions based on various protocols such as Internet Protocol (IP) and/or a proprietary protocol. The network can include at least a portion of the Internet.
[0060] In some implementations, one or more of the components of the wearable device 420 can be, or can include, processors configured to process instructions stored in a memory. For example, sensor manager 430 (and/or a portion thereof) and encryption manager 440 (and/or a portion thereof) are examples of such instructions.
[0061] In some implementations, the memory 426 can be any type of memory such as a random-access memory, a disk drive memory, flash memory, and/or so forth. In some implementations, the memory 426 can be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with the components of the wearable device 420. In some implementations, the memory 426 can be a database memory. In some implementations, the memory 426 can be, or can include, a non-local memory. For example, the memory 426 can be, or can include, a memory shared by multiple devices (not shown). In some implementations, the memory 426 can be associated with a server device (not shown) within a network and configured to serve the components of the wearable device 420. As illustrated in
[0062]
[0063] At 502, the isolated module receives a request from a manager module (e.g., wearable manager 350) of a companion device (e.g., companion device 320) to determine a user context (e.g., user context data 344) of a user wearing a wearable device (e.g., wearable device 420), the isolated module not sharing sensor data with the manager module, the user context including an environment in which the user is using the wearable device.
[0064] At 504, the isolated module receives, from the wearable device, encrypted sensor data (e.g., sensor data 342) acquired with a sensor of the wearable device, the encrypted sensor data being sensor data acquired from a sensor and encrypted.
[0065] At 506, the isolated module determines the user context based on the sensor data.
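The three steps above can be sketched as a single handler running in the isolated module. The cipher and the classifier here are toy stand-ins (the disclosure uses public-key encryption and a machine learning engine), and the speed-in-first-byte encoding is a hypothetical convention for illustration:

```python
def toy_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # Stand-in for the real decryption step: XOR with a repeating key,
    # so the same function also encrypts in this toy example.
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(ciphertext))

def toy_classify(sensor_data: bytes) -> str:
    # Stand-in for the machine learning engine: a trivial rule on a
    # hypothetical speed reading encoded in the first byte (m/s).
    return "driving" if sensor_data[0] > 10 else "not driving"

def determine_user_context(key: bytes, encrypted_sensor_data: bytes) -> str:
    """Steps 504-506: receive the encrypted sensor data, decrypt it inside
    the isolated module, and return only the derived user context, so the
    manager module never sees the raw sensor data (step 502's request is
    implicit in the call itself)."""
    sensor_data = toy_decrypt(key, encrypted_sensor_data)
    return toy_classify(sensor_data)
```

Only the returned string crosses the isolation boundary, which mirrors the privacy property claimed above.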
[0066] Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments, however, may be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
[0067] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments. As used herein, the singular forms a, an, and the are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms comprises, comprising, includes, and/or including, when used in this specification, specify the presence of the stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
[0068] It will be understood that when an element is referred to as being coupled, connected, or responsive to, or on, another element, it can be directly coupled, connected, or responsive to, or on, the other element, or intervening elements may also be present. In contrast, when an element is referred to as being directly coupled, directly connected, or directly responsive to, or directly on, another element, there are no intervening elements present. As used herein the term and/or includes any and all combinations of one or more of the associated listed items.
[0069] Spatially relative terms, such as beneath, below, lower, above, upper, and the like, may be used herein for ease of description to describe one element or feature in relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as below or beneath other elements or features would then be oriented above the other elements or features. Thus, the term below can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may be interpreted accordingly.
[0070] Example embodiments of the concepts are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of example embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the described concepts should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
[0071] It will be understood that although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a first element could be termed a second element without departing from the teachings of the present embodiments.
[0072] Unless otherwise defined, the terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which these concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0073] While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components, and/or features of the different implementations described.