Patent classifications
G06F1/163
Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
Presented are intelligent electronic footwear and apparel with controller-automated features, methods for making/operating such footwear and apparel, and control systems for executing automated features of such footwear and apparel. A method for operating an intelligent electronic shoe (IES) includes receiving, e.g., via a controller through a wireless communications device from a GPS satellite service, location data of a user. The controller also receives, e.g., from a backend server-class computer or other remote computing node, location data for a target object or site, such as a virtual shoe hidden at a virtual spot. The controller retrieves or predicts path plan data including a derived route for traversing from the user's location to the target's location within a geographic area. The controller then transmits command signals to a navigation alert system mounted to the IES's shoe structure to output visual, audio, and/or tactile cues that guide the user along the derived route.
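The guidance step can be pictured as a small control loop: compare the user's heading against the bearing toward the target and emit a cue. Below is a minimal sketch under that assumption; the class names, cue wording, and the simple straight-line bearing (standing in for the patent's derived route) are illustrative, not the disclosed implementation.

```python
# Hypothetical sketch of the IES navigation-assistance loop; names are assumptions.
import math
from dataclasses import dataclass

@dataclass
class Location:
    lat: float   # degrees
    lon: float   # degrees

def bearing_deg(origin: Location, target: Location) -> float:
    """Initial great-circle bearing from origin to target, in degrees."""
    lat1, lat2 = math.radians(origin.lat), math.radians(target.lat)
    dlon = math.radians(target.lon - origin.lon)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

class NavigationAlertSystem:
    """Stand-in for the shoe-mounted alert hardware (haptics, LEDs, speaker)."""
    def cue(self, message: str) -> None:
        print(f"[IES cue] {message}")   # real hardware would pulse haptics / light LEDs

def guide_step(user: Location, heading_deg: float, target: Location,
               alerts: NavigationAlertSystem) -> None:
    """Compare the current heading to the bearing toward the target and emit a cue."""
    desired = bearing_deg(user, target)
    error = (desired - heading_deg + 540) % 360 - 180   # signed error in [-180, 180)
    if abs(error) < 15:
        alerts.cue("continue straight")
    elif error > 0:
        alerts.cue("turn right")
    else:
        alerts.cue("turn left")

# Example: user walking due north toward a target to the north-east.
guide_step(Location(45.0, -122.0), heading_deg=0.0,
           target=Location(45.01, -121.99), alerts=NavigationAlertSystem())
```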
Method and device for calibrating a light smart watch, and light smart watch
A method for calibrating a light smart watch includes: providing an FPC (flexible printed circuit) board under the dial, a size of the FPC board matching a size of the dial, the FPC board being divided into a plurality of partitions, each partition insulated from the other partitions, such that each watch hand forms a capacitor with each partition in turn as the hands run; detecting a capacitance change amount of each partition, determining the partitions where the watch hands are currently located, and determining, from those positions, the current time indicated by the watch hands; comparing the time indicated by the watch hands with the current time of a mobile terminal to determine a time error; and adjusting the watch hands according to the time error so that they run in sync with the time of the mobile terminal.
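The partition-to-time mapping can be sketched as follows, assuming the FPC board is split into 60 equal partitions (one per minute mark); the partition count, the capacitance-delta representation, and the motor interface are assumptions for illustration only.

```python
# Hypothetical sketch of the calibration logic; constants and names are assumptions.
from datetime import datetime

PARTITIONS = 60  # assumed: one partition per minute position

def hand_partition(cap_deltas) -> int:
    """Index of the partition with the largest capacitance change,
    i.e. the partition the hand currently overlaps."""
    return max(range(PARTITIONS), key=lambda i: cap_deltas[i])

def indicated_minutes(minute_deltas, hour_deltas) -> int:
    """Time shown by the hands, in minutes past 12:00."""
    minute = hand_partition(minute_deltas)                     # 0..59
    hour = (hand_partition(hour_deltas) * 12) // PARTITIONS    # 0..11
    return hour * 60 + minute

def calibration_error(minute_deltas, hour_deltas, phone_time: datetime) -> int:
    """Signed error (minutes) between hand-indicated time and the phone's time."""
    shown = indicated_minutes(minute_deltas, hour_deltas)
    actual = (phone_time.hour % 12) * 60 + phone_time.minute
    return (actual - shown + 360) % 720 - 360   # wrap to [-360, 360)

# Example: minute hand over partition 30, hour hand just past the "3" mark.
mins = [0.0] * PARTITIONS
mins[30] = 1.0
hrs = [0.0] * PARTITIONS
hrs[16] = 1.0
print(calibration_error(mins, hrs, datetime(2024, 1, 1, 15, 32)))  # -> 2
# The controller would then advance (or hold) the hands by this signed error.
```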
Prestaging, gesture-based, access control system
A prestaging, gesture-based, access control system includes a local access assembly, a mobile device, a storage medium, and a processor. The assembly includes a controller to effect actuation between access and no-access states. The mobile device is carried by a user and includes a detection system configured to detect a prestaging event inherently performed by the user toward an intent to gain access, followed by detection of a primary intentional gesture specifically performed by the user toward that intent. The storage medium and the processor are configured to receive prestaging event information and primary intentional gesture information from the detection system and to execute an application that determines, from the prestaging event information, whether the prestaging event was performed and, if so, then determines from the primary intentional gesture information whether the primary intentional gesture was performed.
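The two-stage decision reads naturally as a small state machine: arm on the prestaging event, grant only if the intentional gesture follows while the system is armed. The sketch below assumes timestamped events from the mobile device's detection system; the event names, timeout, and lock callback are hypothetical.

```python
# Hypothetical sketch of the prestaging-then-gesture decision; names are assumptions.
import time

PRESTAGE_WINDOW_S = 10.0   # assumed validity window for a prestaging event

class AccessApplication:
    def __init__(self, actuate_lock):
        self._actuate_lock = actuate_lock     # e.g. sends the "access" command to the controller
        self._prestaged_at = None

    def on_prestaging_event(self, timestamp: float) -> None:
        # Inherent behavior (e.g. approaching the door) detected: arm the system.
        self._prestaged_at = timestamp

    def on_primary_gesture(self, timestamp: float) -> None:
        # The intentional gesture only grants access if the prestaging event
        # was determined to have occurred recently.
        if (self._prestaged_at is not None
                and timestamp - self._prestaged_at <= PRESTAGE_WINDOW_S):
            self._actuate_lock()              # move the assembly to the access state
        self._prestaged_at = None             # single use; require re-prestaging

# Example usage:
app = AccessApplication(actuate_lock=lambda: print("access granted"))
now = time.time()
app.on_prestaging_event(now)
app.on_primary_gesture(now + 2.0)   # -> "access granted"
```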
Controlled-environment facility resident wearables and systems and methods for use
Controlled-environment facility resident behavioral and/or health monitoring may employ controlled-environment facility resident wearables, each having a band configured to be affixed around a portion of a resident and irremovable by the resident. Each wearable may include sensor(s) configured to measure biometric(s) of the resident and one or more physical parameter(s) experienced by the wearable, along with a transmitter that transmits the biometric(s) and/or physical parameter(s) to a controlled-environment facility management system. The management system may predetermine one or more normal input levels for the biometric(s) and/or physical parameter(s), receive the transmitted biometric(s) and/or physical parameter(s), determine whether the received values rise above or fall below the predetermined normal input level(s), and alert controlled-environment facility personnel and/or law enforcement when they do.
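The management-system check amounts to comparing each received value against its predetermined normal range and alerting on any excursion. A minimal sketch follows; the parameter names, normal ranges, and alert channel are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the normal-level comparison; ranges are assumptions.
NORMAL_RANGES = {
    "heart_rate_bpm": (50, 120),      # predetermined normal input levels (assumed)
    "skin_temp_c": (35.0, 38.5),
    "band_strain": (0.0, 0.8),        # physical parameter experienced by the wearable
}

def check_reading(resident_id: str, readings: dict, alert) -> None:
    """Compare received biometrics/physical parameters against the
    predetermined normal levels and alert personnel on any excursion."""
    for name, value in readings.items():
        low, high = NORMAL_RANGES[name]
        if value < low or value > high:
            alert(f"resident {resident_id}: {name}={value} outside [{low}, {high}]")

# Example usage:
check_reading("R-1042",
              {"heart_rate_bpm": 148, "skin_temp_c": 37.1, "band_strain": 0.2},
              alert=print)
```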
Correlation of bio-impedance measurements and a physiological parameter for a wearable device
An apparatus may include a bio-impedance sensor configured to take a bio-impedance measurement from a body of an individual, an optical sensor configured to take an optical measurement from the body of the individual, and a processing device. The processing device is configured to receive a first bio-impedance measurement from the bio-impedance sensor taken during a first period of time and a first optical measurement from the optical sensor taken during the first period of time, receive first location information of the individual during the first period of time, determine a first correlation between a physiological parameter and at least one of the first location information, the first bio-impedance measurement, or the first optical measurement, and determine a first level of the physiological parameter based on the first correlation.
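One way to picture the correlation step is a simple fit between a reference physiological parameter and the bio-impedance measurements taken over the same periods, later reused to estimate new levels. The single-predictor linear fit below is an illustrative simplification under that assumption, not the patent's method, and the numbers are made up.

```python
# Hypothetical sketch: correlate bio-impedance with a reference parameter, then estimate.
def fit_linear(xs, ys):
    """Ordinary least-squares fit y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# First period: paired bio-impedance measurements and reference parameter values,
# collected while location information indicates the same context (e.g. at rest).
bioz_ohms = [480.0, 472.0, 465.0, 455.0, 450.0]
reference = [118.0, 121.0, 124.0, 128.0, 130.0]

slope, intercept = fit_linear(bioz_ohms, reference)

# Later: estimate the physiological parameter from a new bio-impedance reading
# using the learned correlation.
new_bioz = 460.0
print(round(slope * new_bioz + intercept, 1))
```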
Systems and methods for data visualization in virtual reality environments
A computer-implemented method is provided for visualizing multiple objects in a computerized visual environment. The method includes displaying to a user a virtual three-dimensional space via a viewing device worn by the user, and determining a data limit of the viewing device for object rendering. The method includes presenting an initial rendering of the objects within the virtual space, where the visualization data used for the initial rendering does not exceed the data limit of the viewing device. The method also includes tracking user attention relative to the objects as the user navigates through the virtual space and determining, based on the tracking of user attention, one or more select objects from the multiple objects to which the user is paying attention. The one or more select objects are located within a viewing range of the user.
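The method reads as a rendering-budget problem: keep the total visualization data under the device limit, then spend any remaining budget refining the objects the user attends to. The sketch below works under that reading; the object fields, attention score, and byte budgets are illustrative assumptions.

```python
# Hypothetical sketch of budgeted, attention-driven refinement; fields are assumptions.
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    coarse_bytes: int          # cost of the initial (low-detail) rendering
    detailed_bytes: int        # cost of the refined rendering
    attention: float = 0.0     # accumulated gaze/attention score
    in_view: bool = True
    detailed: bool = False

def initial_render(objects, data_limit: int) -> int:
    """Render everything coarsely, provided the coarse set fits the limit."""
    used = sum(o.coarse_bytes for o in objects)
    assert used <= data_limit, "initial rendering must not exceed the device limit"
    return used

def refine_by_attention(objects, data_limit: int, used: int) -> None:
    """Upgrade the most-attended, in-view objects while staying under the limit."""
    for obj in sorted(objects, key=lambda o: o.attention, reverse=True):
        extra = obj.detailed_bytes - obj.coarse_bytes
        if obj.in_view and obj.attention > 0 and used + extra <= data_limit:
            obj.detailed = True
            used += extra

# Example usage:
scene = [SceneObject("graph", 10_000, 80_000, attention=0.9),
         SceneObject("table", 12_000, 60_000, attention=0.1),
         SceneObject("label", 2_000, 5_000, in_view=False)]
used = initial_render(scene, data_limit=120_000)
refine_by_attention(scene, data_limit=120_000, used=used)
print([o.name for o in scene if o.detailed])   # -> ['graph']
```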
Reflective eyepiece optical system and head-mounted near-to-eye display device
The present invention relates to a reflective eyepiece optical system and a head-mounted near-to-eye display device. The system includes a first lens group, and a first optical element and a second lens group for transmitting and reflecting light from a miniature image displayer. The second lens group includes one optical reflection surface, which is the optical surface farthest from the human-eye viewing side in the second lens group; the optical reflection surface is concave toward the human-eye viewing direction. The first optical element reflects the light refracted by the first lens group to the second lens group, and then transmits the light refracted, reflected, and refracted again by the second lens group to the human eye. The combination of focal lengths among the respective lenses is negative, positive, and positive.
Activity-dependent audio feedback themes for touch gesture inputs
Systems and methods that provide audio feedback in response to gesture validity can offer a more intuitive interface that trains users to complete gestures correctly. Moreover, systems and methods that provide line-specific audio feedback can give more specific feedback, allowing a user to better understand which sensing line is being contacted. The systems and methods can further base the audio feedback at least in part on obtained activity data, such that feedback for valid and invalid gestures provides different sounds depending on the determined activity state.
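Selecting the cue can be thought of as a lookup keyed on gesture validity, sensing line, and activity state. The sketch below assumes exactly that; the activity themes, pitch values, and play() stub are illustrative, not the disclosed feedback design.

```python
# Hypothetical sketch of activity-dependent, line-specific feedback; values are assumptions.
BASE_PITCH_HZ = {"resting": 440, "walking": 523, "running": 659}   # per-activity theme

def select_feedback(valid: bool, line_index: int, activity: str) -> dict:
    """Pick an audio cue: valid gestures get a tone keyed to the sensing line;
    invalid gestures get a longer low tone in the same activity theme."""
    base = BASE_PITCH_HZ.get(activity, 440)
    if valid:
        return {"pitch_hz": base + 20 * line_index, "duration_ms": 120}
    return {"pitch_hz": base // 2, "duration_ms": 300}

def play(cue: dict) -> None:
    print(f"tone {cue['pitch_hz']} Hz for {cue['duration_ms']} ms")  # stand-in for audio output

# Example: a valid swipe across sensing line 3 while the wearer is running.
play(select_feedback(valid=True, line_index=3, activity="running"))
```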
Methods and devices for aligning miniaturized spectrometers and impedance sensors in wearable devices
A method, system, apparatus, and/or device to determine a condition of a user using multiple sensors. The method, system, apparatus, and/or device may include: a band configured to extend at least partially around a body part of a user having a subdermal feature within the body part; a light source configured in the band to emit light into the body part; a miniaturized spectrometer positioned in the band to press against the body part and receive the light, where the miniaturized spectrometer comprises an optical filter configured to isolate a relevant constituent wavelength of the light, a collimator configured to collimate the light, and an optical sensor configured to detect an intensity of the relevant constituent wavelength; and an impedance sensor integrated into the band and configured to be positioned against the same side of the body part as the miniaturized spectrometer.
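Downstream, the two co-located readings would be combined into a single condition estimate. The toy fusion below (an illustrative hydration-style index) is purely an assumption to show the data flow; the wavelength, scaling constants, and sensor stubs are not from the disclosure.

```python
# Hypothetical sketch of fusing the spectrometer and impedance readings; all values assumed.
from dataclasses import dataclass

@dataclass
class SpectrometerReading:
    wavelength_nm: float   # relevant constituent wavelength isolated by the filter
    intensity: float       # normalized intensity detected by the optical sensor

@dataclass
class ImpedanceReading:
    ohms: float            # measured on the same side of the body part

def condition_index(spec: SpectrometerReading, imp: ImpedanceReading) -> float:
    """Toy fusion: lower detected intensity (stronger absorption) and lower tissue
    impedance both push the index upward. Weights and scale are assumptions."""
    optical_term = max(0.0, 1.0 - spec.intensity)
    impedance_term = max(0.0, 1.0 - imp.ohms / 600.0)
    return round(0.5 * optical_term + 0.5 * impedance_term, 2)

print(condition_index(SpectrometerReading(970.0, 0.42), ImpedanceReading(480.0)))
```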
Virtually modeling clothing based on 3D models of customers
Three-dimensional models (or avatars) may be defined based on imaging data captured from a customer. The avatars may be based on a virtual mannequin having one or more dimensions in common with the customer, a body template corresponding to the customer, or the imaging data itself. The avatars can be displayed in user interfaces and used for any purpose, such as to depict how clothing will appear or behave while worn by the customer, alone or with other clothing. Customers may drag-and-drop images of clothing onto the avatars. One or more of the avatars may be shown on any display, such as a monitor or a virtual reality headset, in a static or dynamic mode. Images of avatars and clothing may also be used to generate print catalogs depicting the appearance or behavior of the clothing while worn by the customer.
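The drag-and-drop pairing can be modeled as attaching a clothing item to an avatar defined by the customer's dimensions. The sketch below assumes a very reduced data model; the field names and the simple dimension comparison are illustrative, not the disclosed fitting or rendering pipeline.

```python
# Hypothetical sketch of the avatar/clothing pairing; fields and fit check are assumptions.
from dataclasses import dataclass, field

@dataclass
class Avatar:
    customer_id: str
    dimensions_cm: dict              # e.g. {"chest": 96, "waist": 81, "hip": 99}
    outfit: list = field(default_factory=list)

@dataclass
class ClothingItem:
    sku: str
    image_url: str
    fits_up_to_cm: dict              # largest body dimensions the item accommodates

def drop_onto_avatar(avatar: Avatar, item: ClothingItem) -> str:
    """Handle a drag-and-drop of a clothing image onto the avatar."""
    tight = [k for k, v in item.fits_up_to_cm.items()
             if avatar.dimensions_cm.get(k, 0) > v]
    avatar.outfit.append(item.sku)
    return "fits" if not tight else f"tight at: {', '.join(tight)}"

# Example usage:
avatar = Avatar("c-77", {"chest": 96, "waist": 81, "hip": 99})
shirt = ClothingItem("SHIRT-M", "https://example.com/shirt.png",
                     {"chest": 100, "waist": 88})
print(drop_onto_avatar(avatar, shirt))   # -> "fits"
```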