
Virtual and augmented reality instruction system
11694565 · 2023-07-04

A virtual and augmented reality instruction system may include a complete format and a portable format. The complete format may include a board system to capture all movement (including writing and erasing) on the board's surface, and a tracking system to capture all physical movements. The portable format may include a touch-enabled device or digital pen and a microphone, and is designed to capture a subset of the data captured by the complete format. In one embodiment of the complete format, the board system and the tracking system can communicate with each other through a network, and control devices (such as a laptop, desktop, mobile phone, or tablet) can be used to control the board system and tracking system through the network. In further embodiments of the complete format, augmented reality can be achieved within the tracking system through the combination of 3D sensors and see-through augmented reality glasses.
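As an illustrative (not authoritative) data model, the subset relationship between the two capture formats might be expressed as follows; every event-type name here is an assumption, not taken from the abstract:

```python
# Event types each format captures (all names are illustrative).
COMPLETE_EVENTS = {"board_write", "board_erase", "body_motion", "audio"}
PORTABLE_EVENTS = {"board_write", "board_erase", "audio"}  # no motion tracking

def portable_is_subset(portable=PORTABLE_EVENTS, complete=COMPLETE_EVENTS):
    """The portable format captures a strict subset of the complete format."""
    return portable < complete
```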

Vehicle user interface device and operating method of vehicle user interface device

The present invention relates to a vehicle user interface device including a display configured to display a first Augmented Reality (AR) graphic object at a point in a display area corresponding to a first point, and at least one processor configured to obtain distance data between a vehicle and the first point and change the first AR graphic object based on the distance data.
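The abstract does not specify how the AR graphic object changes with distance; one hedged sketch is an inverse scaling rule, with all names and the rule itself being assumptions:

```python
def scaled_ar_size(base_size_px, distance_m, min_distance_m=1.0):
    """Illustrative rule: shrink the first AR graphic object as the
    distance between the vehicle and the first point grows, clamping
    very small distances to avoid unbounded growth."""
    return base_size_px / max(distance_m, min_distance_m)
```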

Display control device, display control method, and storage medium storing program

A display control device is provided at a vehicle and includes a processor, a first display section, a second display section adjacent to the first display section, and a third display section adjacent to the second display section in a different direction from the first display section. The processor is configured to: control the first display section, the second display section, and the third display section; acquire notification information to be notified to a driver of the vehicle; and execute an animation such that the notification information currently being displayed on one of the first display section or the third display section is displayed on the second display section and then on the other of the first display section or the third display section.
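As a hedged illustration of the described animation order (the section names are placeholders), the notification always passes through the middle section on its way to the opposite end:

```python
def animation_sequence(start):
    """Return the display order for a notification starting on the
    'first' or 'third' section: it crosses 'second' before landing
    on the other end section."""
    end = "third" if start == "first" else "first"
    return [start, "second", end]
```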

AUGMENTED REALITY EYEWEAR WITH X-RAY EFFECT

Eyewear providing an interactive augmented reality experience to users in a first physical environment viewing objects in a second physical environment (e.g., an X-ray effect). The second environment may be a room positioned behind a barrier, such as a wall. The user views the second environment via a sensor system movable on the wall using a track system. As the user in the first environment moves the eyewear to face the outside surface of the wall along a line-of-sight (LOS) at a location (x, y, z), the sensor system on the track system repositions to the same location (x, y, z) on the inside surface of the wall. The image captured by the sensor system in the second environment is wirelessly transmitted to the eyewear for display on the eyewear displays, providing the user with an X-ray effect of looking through the wall to see the objects within the other environment.
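The core mirroring behavior could be sketched as follows; the coordinate convention (wall surfaces lying in the y-z plane, depth along x) and the helper name are assumptions, not taken from the abstract:

```python
def mirror_to_inside(outside_point, wall_thickness):
    """Map the LOS intersection point on the wall's outside surface to
    the matching sensor position on the inside surface: same (y, z),
    offset through the wall along x."""
    x, y, z = outside_point
    return (x + wall_thickness, y, z)
```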

Display control device, display control method, and storage medium storing program

A display control device provided at a vehicle and including a processor, a first display section, a second display section adjacent to the first display section, and a third display section adjacent to the second display section in a different direction from the first display section. The processor being configured to: control the first display section, the second display section, and the third display section; acquire notification information to be notified to a driver of the vehicle; and execute an animation such that the notification information currently being displayed on one of the first display section or the third display section is displayed on the second display section and then displayed on another of the first display section or the third display section.

VIRTUAL AND AUGMENTED REALITY INSTRUCTION SYSTEM
20220415197 · 2022-12-29

A virtual and augmented reality instruction system may include a complete format and a portable format. The complete format may include a board system to capture all movement (including writing and erasing) on the board's surface, and a tracking system to capture all physical movements. The portable format may include a touch-enabled device or digital pen and a microphone, and is designed to capture a subset of the data captured by the complete format. In one embodiment of the complete format, the board system and the tracking system can communicate with each other through a network, and control devices (such as a laptop, desktop, mobile phone, or tablet) can be used to control the board system and tracking system through the network. In further embodiments of the complete format, augmented reality can be achieved within the tracking system through the combination of 3D sensors and see-through augmented reality glasses.

EYE TRACKING SYSTEM FOR SMART GLASSES AND METHOD THEREFOR
20220413285 · 2022-12-29

The present invention provides an eye tracking system including: a pupil sensing unit sensing a pupil of a user by at least one sensor embedded in smart glasses; a display unit including smart lenses to which a plurality of target objects is floated; an eye tracking unit tracking an eye through at least one sensor and acquiring eye state information of the user for the plurality of target objects; and an input unit performing an input after selecting a specific object in which the eye of the user stays for a predetermined time or more among the plurality of target objects as an input object based on the eye state information, in which at least one of a size, a color, and a movement direction of the input object is changed based on the eye state information of the user.
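A minimal sketch of the dwell-time selection rule described above, assuming a fixed gaze-sampling interval and an illustrative one-second threshold (neither value is specified in the abstract):

```python
DWELL_THRESHOLD_S = 1.0  # assumed minimum dwell time before selection

def select_input_object(gaze_samples, dt):
    """gaze_samples: sequence of gazed object ids (or None) sampled at a
    fixed interval dt (seconds). Return the first object the eye stays on
    continuously for at least DWELL_THRESHOLD_S, else None."""
    current, elapsed = None, 0.0
    for obj in gaze_samples:
        if obj is not None and obj == current:
            elapsed += dt  # gaze continues on the same target object
        else:
            current, elapsed = obj, (0.0 if obj is None else dt)
        if current is not None and elapsed >= DWELL_THRESHOLD_S:
            return current  # becomes the input object
    return None
```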

SYSTEM AND METHOD FOR ESTABLISHING DENTAL TREATMENT ENVIRONMENT
20220415489 · 2022-12-29

A system for establishing a dental treatment environment includes: a head-mounted device provided at a dental clinic to be mounted on a patient's head, the head-mounted device having an image display unit and an ear-mounted speaker; a microphone for converting a sound including the voice of the medical staff in charge of the patient into an electric signal; a voice recognition module for recognizing the voice of the medical staff in charge from the electric signal input from the microphone; a content module storing multiple image contents for relaxing the patient mentally and physically; a user interface having a content selection unit configured such that the patient can select a play content provided to the image display unit from the multiple image contents; and an output signal generating module for generating an output signal that is output to the head-mounted device.

NAVIGATION METHOD, ELECTRONIC DEVICE AND READABLE STORAGE MEDIUM

A navigation method, an electronic device and a readable storage medium, which relate to the field of vehicle networking technology, and in particular to the field of navigation technology. The navigation method includes: acquiring a real-world image and navigation information; converting the real-world image to obtain a projection image, wherein the projection image is matched with an eyebox of at least one pair of vehicle-mounted glasses; superimposing the navigation information on the projection image to obtain a navigation image; and transmitting the navigation image to the vehicle-mounted glasses so that the navigation image is displayed by the vehicle-mounted glasses.
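The steps above can be sketched as a toy pipeline. A nearest-neighbour downscale stands in for the real eyebox-matching conversion (which would warp the image to the glasses' optics), and every name here is an assumption:

```python
def to_projection(image, scale):
    """Stand-in for the eyebox-matching conversion: nearest-neighbour
    downscale of a row-major pixel grid by an integer factor."""
    return [row[::scale] for row in image[::scale]]

def superimpose(projection, nav_points, marker=9):
    """Overlay navigation markers at (row, col) positions onto a copy
    of the projection image, yielding the navigation image."""
    out = [row[:] for row in projection]
    for r, c in nav_points:
        out[r][c] = marker
    return out
```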

APPARATUS AND METHOD FOR MITIGATING THE EFFECTS OF TRAVEL LAG VIA COLOR PALETTE TRANSITIONS IN A VIRTUAL SKY PROJECTED IN A DIGITAL SPACE
20220409851 · 2022-12-29

A method includes initiating an accelerated daily light and darkness cycle experience at a first color palette position within the accelerated daily light and darkness cycle that corresponds to a current time at a destination geographic location that a user will travel to. The accelerated daily light and darkness cycle experience is executed at a second color palette position within the accelerated daily light and darkness cycle that corresponds to the current time at the destination geographic location of the user. The accelerated daily light and darkness cycle lasts less than six minutes and includes projecting a dark virtual sky in a digital space, transitioning the dark virtual sky to a dawn virtual sky in the digital space, altering the dawn virtual sky to introduce a daytime virtual sky in the digital space, and modifying the daytime virtual sky to produce a dusk virtual sky in the digital space.
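One hedged reading of the palette-position mapping: the destination's local time of day is scaled proportionally into the sub-six-minute cycle. The five-minute cycle length and the equal dark/dawn/daytime/dusk phase splits are assumptions, not values from the abstract:

```python
CYCLE_S = 300.0  # assumed accelerated cycle length: 5 minutes (< 6 minutes)

def palette_position(dest_seconds_since_midnight):
    """Map the destination's current local time proportionally to an
    offset (seconds) into the accelerated cycle."""
    day_fraction = dest_seconds_since_midnight / 86400.0
    return day_fraction * CYCLE_S

def palette_phase(offset_s):
    """Coarse phase lookup over the dark -> dawn -> daytime -> dusk
    sequence, assuming equal quarters of the cycle."""
    f = offset_s / CYCLE_S
    if f < 0.25:
        return "dark"
    if f < 0.5:
        return "dawn"
    if f < 0.75:
        return "daytime"
    return "dusk"
```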