A61B5/398

ACTIVE TITRATION OF ONE OR MORE NERVE STIMULATORS TO TREAT OBSTRUCTIVE SLEEP APNEA
20200069947 · 2020-03-05

The present disclosure generally relates to systems and methods for active titration of one or more cranial or peripheral nerve stimulators to treat obstructive sleep apnea. The active titration can be accomplished in an automated fashion by a closed-loop process. The closed-loop process can be executed by a computing device that includes a non-transitory memory storing instructions and a processor to execute the instructions to perform operations. The operations can include defining initial parameters for the one or more cranial or peripheral nerve stimulators for a patient; receiving sensor data from sensors associated with the patient based on a stimulation with the one or more cranial or peripheral nerve stimulators programmed according to the initial parameters; and adjusting the initial parameters based on the sensor data.
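The closed-loop define/sense/adjust cycle described above can be sketched in a few lines. This is a hedged illustration only: the parameter name `amplitude_mA`, the apnea-hypopnea index (AHI) target, the step size, and the callback structure are assumptions for illustration, not details taken from the abstract.

```python
# Hypothetical sketch of the closed-loop titration loop described above.
# The AHI target, step size, and parameter names are assumptions.

def titrate(initial_params, read_sensors, apply_stimulation,
            target_ahi=5.0, step=0.1, max_iterations=20):
    """Adjust stimulation amplitude until the sensor-derived
    apnea-hypopnea index (AHI) meets a target, or iterations run out."""
    params = dict(initial_params)
    for _ in range(max_iterations):
        apply_stimulation(params)       # program the stimulator(s)
        ahi = read_sensors()            # sensor-derived AHI estimate
        if ahi <= target_ahi:           # therapy goal reached
            break
        params["amplitude_mA"] += step  # escalate stimulation
    return params
```

In a real device the sensor reading would aggregate respiratory data over a sleep interval rather than return instantly, but the control structure is the same.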

Methods and Systems for Large Spot Retinal Laser Treatment
20200069463 · 2020-03-05

In some embodiments, a system for providing a therapeutic treatment to a patient's eye includes a treatment beam source configured to transmit a treatment beam along a treatment beam path. The system further includes a processor coupled to the treatment beam source, the processor being configured to direct the treatment beam onto retinal tissue of the patient's eye and deliver a series of short duration pulses from the treatment beam onto the retinal tissue at a first treatment spot to treat the retinal tissue. In some embodiments, a pre-treatment evaluation method using electroretinography (ERG) data may be used to predict effects of treatment beams at different power values and to determine optimal power values.
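The power-selection step in the pre-treatment evaluation can be sketched as a search over candidate powers. This is a minimal illustration under stated assumptions: the prediction function, the threshold, and the preference for the lowest sufficient power are not specified by the abstract.

```python
# Hypothetical sketch of choosing a treatment-beam power from ERG-based
# predictions. The predictor and the therapeutic threshold are assumptions.

def select_power(candidate_powers, predict_erg_response, threshold=0.5):
    """Return the lowest candidate power whose predicted ERG response
    indicates a therapeutic effect (response >= threshold), or None."""
    for power in sorted(candidate_powers):
        if predict_erg_response(power) >= threshold:
            return power
    return None
```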

Neuro-physiology and neuro-behavioral based stimulus targeting system

An example system includes a processor to determine a first distance between a first peak in a first frequency band of neuro-response data gathered from a subject while exposed to media and a second peak in the first frequency band; determine a second distance between a third peak in the first frequency band and either the second peak in the first frequency band or a fourth peak in the first frequency band; determine a first difference between the first distance and the second distance; generate a first response profile for the subject based on the first difference; and integrate the first response profile with a second response profile associated with a second subject to form an integrated response profile. A selector is to select an advertisement or entertainment for presentation based on the integrated response profile. The processor is to modify the media to present the advertisement or entertainment.
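The inter-peak distance arithmetic described above can be made concrete with a short sketch. Peak detection itself is omitted, and averaging is an assumed stand-in for the abstract's unspecified "integration" of response profiles.

```python
# Hedged sketch of the peak-distance computation described above.
# Input is assumed to be ordered peak times within one frequency band.

def peak_distance_difference(peak_times):
    """Return the difference between the first inter-peak distance
    (peak 1 to peak 2) and the second (peak 2 to peak 3)."""
    first = peak_times[1] - peak_times[0]
    second = peak_times[2] - peak_times[1]
    return first - second

def integrate_profiles(profile_a, profile_b):
    """Combine two subjects' response profiles; simple averaging is
    an assumption, the patent does not specify the integration."""
    return {k: (profile_a[k] + profile_b[k]) / 2 for k in profile_a}
```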

In-ear electrical potential sensor

Aspects of the present disclosure provide an audio product for obtaining biologically-relevant information associated with a user. The audio product includes at least a first electrode and a second electrode, a processor, and an electroacoustic transducer coupled to the processor. The processor is configured to receive at least one signal affected by an action of the user obtained via the first electrode or the second electrode and take one or more actions based on the at least one signal. The at least one action may control another device, in an effort to provide hands-free control of the other device.

ELECTRONIC DEVICE AND CONTROL METHOD THEREOF
20200064921 · 2020-02-27

An electronic device is disclosed. The electronic device includes a biological signal input unit for receiving the input of a user's biological signal detected through an electrode, and a voice input unit for receiving the input of a voice signal. The electronic device identifies the user on the basis of a biological signal inputted through the biological signal input unit and a voice signal inputted through a microphone.

Wearable device for eye movement detection
10568505 · 2020-02-25

A method, apparatus and computer program product are provided for detecting eye movement with a wearable device attachable to an eyelid. The wearable device may include an arrangement of sensors, including any combination of piezoelectric sensors, accelerometers, and/or any other type of sensor. An external device may be provided for processing sensor data. The wearable device and/or external device may analyze the sensor data to generate eyeball movement data, and to differentiate first directional data and second directional data. The first directional data may be considered as movement occurring substantially horizontally and the second directional data may be considered as movement occurring substantially vertically. The wearable device and/or external device may therefore provide data regarding sleep cycles of the user.
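The separation into first (substantially horizontal) and second (substantially vertical) directional data can be illustrated with a dominant-axis split. This is a sketch under assumptions: the abstract does not specify how sensor readings are represented or how the two directions are differentiated.

```python
# Minimal sketch of splitting eyelid-sensor motion into horizontal and
# vertical components; the (dx, dy) representation and the dominant-axis
# classification rule are assumptions for illustration.

def split_directions(samples):
    """samples: iterable of (dx, dy) displacement readings.
    Returns summed magnitudes of horizontally dominant and vertically
    dominant movements."""
    horizontal = sum(abs(dx) for dx, dy in samples if abs(dx) >= abs(dy))
    vertical = sum(abs(dy) for dx, dy in samples if abs(dy) > abs(dx))
    return horizontal, vertical
```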

User and object interaction with an augmented reality scenario

A method for generating virtual content for presentation in an AR system includes, under control of a hardware processor included in the AR system, analyzing pose data to identify a pose of a user of the AR system. The method also includes identifying a physical object in a 3D physical environment of the user based at least partly on the pose. The method further includes responsive to detecting a first gesture, presenting a first type of virtual content in a display of the AR system. Moreover, the method includes responsive to detecting a second gesture, presenting a pod user interface virtual construct comprising a navigable menu. In addition, the method includes responsive to detecting a selection of an application through the navigable menu, rendering, in the display of the AR system, within the pod user interface virtual construct, the particular application in a 3D view to the user.
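The gesture-driven flow in the abstract (first gesture to virtual content, second gesture to a pod menu, menu selection to a rendered application) can be sketched as a small dispatcher. The gesture names and state representation here are purely illustrative assumptions.

```python
# Hedged sketch of the gesture-to-display dispatch described above;
# gesture labels and the state dict are assumptions, not patent details.

def handle_gesture(gesture, state):
    """Update the AR display state in response to a detected gesture."""
    if gesture == "first":
        state["display"] = "virtual_content"          # first gesture
    elif gesture == "second":
        state["display"] = "pod_menu"                 # pod UI construct
    elif gesture.startswith("select:") and state.get("display") == "pod_menu":
        # Menu selection: render the chosen application in 3D.
        state["display"] = "app_3d:" + gesture.split(":", 1)[1]
    return state
```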

Hearing device comprising electrodes for picking up a physiological response

The application relates to a hearing device, e.g. a hearing aid, comprising a first part for being inserted in an ear canal or fully or partially implanted in the head of a user, the first part comprising at least one electrode unit, termed a PR-electrode unit, for making contact to skin or tissue of a user when mounted or implanted in an operational condition, the at least one PR-electrode unit being configured to pick up a physiological response from the user, and wherein the at least one PR-electrode unit comprises an electrically conductive material, e.g. a shape memory alloy. The first part may comprise an implanted part, e.g. in combination with an external part adapted for being located in an ear canal, wherein both parts comprise one or more PR-electrode units. The invention may e.g. be used in hearing aids, headsets, ear phones, active ear protection systems, or combinations thereof, e.g. to control processing of the hearing device or to monitor a condition of the user wearing the hearing device.

Augmented and virtual reality display systems and methods for delivery of medication to eyes

Configurations are disclosed for a health system to be used in various healthcare applications, e.g., for patient diagnostics, monitoring, and/or therapy. The health system may comprise a light generation module to transmit light or an image to a user, one or more sensors to detect a physiological parameter of the user's body, including their eyes, and processing circuitry to analyze an input received in response to the presented images to determine one or more health conditions or defects.