Patent classifications
G02B2027/014
DISPLAY DEVICE FOR A VEHICLE
There is provided a display device for a vehicle, the display device including: a display unit that is visible to an occupant of a vehicle; a memory; and a processor connected to the memory, the processor changing a content image displayed on the display unit so that the content image is displayed in the same direction as the direction in which the travel path of the vehicle curves.
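The control described in this abstract can be sketched as follows. All names and values here are hypothetical; the abstract does not specify how the shift is computed, so this sketch simply offsets the content image horizontally toward the curve direction, scaled by curvature:

```python
def content_offset(curve_direction: str, curvature: float, max_offset_px: int = 120) -> int:
    """Return a horizontal pixel offset for the content image so that it
    moves in the same direction as the upcoming curve (hypothetical sketch).

    curve_direction: "left" or "right", taken from the planned travel path.
    curvature: 0.0 (straight road) .. 1.0 (sharpest curve); scales the shift.
    """
    sign = -1 if curve_direction == "left" else 1
    # Clamp curvature to [0, 1] so out-of-range inputs cannot overshoot.
    return sign * round(max_offset_px * min(max(curvature, 0.0), 1.0))
```

For example, `content_offset("left", 0.5)` shifts the image 60 px to the left, and the shift saturates at `max_offset_px` for very sharp curves.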
VIRTUAL REALITY SYSTEM
The present disclosure provides a virtual reality system comprising a head-mounted display, an input device, and a positioning module. The head-mounted display comprises a central processing unit, a camera module connected with the central processing unit, and a wireless connection module; the camera module comprises a binocular fisheye camera, an IR camera, and a TOF camera. The positioning module comprises a first inertial measurement unit and an electromagnetic receiver provided on the head-mounted display, and a second inertial measurement unit and an electromagnetic transmitter provided on the input device. The central processing unit is configured to implement data interaction and command control with the binocular fisheye camera, the IR camera, the TOF camera, the wireless connection module, the first inertial measurement unit, and the electromagnetic receiver.
HEAD-MOUNTED DISPLAY SENSOR STATUS
An example device comprises: a head-mounted display; a housing for the head-mounted display, the housing including an external surface; sensors to monitor a wearer of the head-mounted display; a visual indicator at the external surface; and a controller. The controller is generally to: control subsets of the sensors to be on or off based on respective permissions for usage of subsets of the sensors; and control the visual indicator to indicate respective status of the subsets of the sensors.
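The controller logic described above reduces to a simple mapping from per-subset permissions to sensor power state and indicator status. A minimal sketch, with hypothetical subset names (the abstract does not enumerate the sensors):

```python
def update_sensors(permissions: dict[str, bool]) -> dict[str, str]:
    """Hypothetical controller step: enable each sensor subset only when
    its usage permission is granted, and return the per-subset status
    ("on" / "off") to be shown by the visual indicator on the housing."""
    return {subset: ("on" if allowed else "off")
            for subset, allowed in permissions.items()}
```

A wearer could then see at a glance, from the external indicator, which monitoring sensors are currently active.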
HEAD-UP DISPLAY DEVICE
According to an aspect, a head-up display device includes: a display panel configured to display a third image that is a composite of first and second images; a parallax generator configured to generate parallax of the first image and parallax of the second image; and a projection destination part onto which projection light emitted from a display surface side of the display panel and modulated by the parallax generator is projected.
SMART WEARABLE DEVICE FOR VISION ENHANCEMENT AND METHOD FOR REALIZING STEREOSCOPIC VISION TRANSPOSITION
The invention discloses a smart wearable device for vision enhancement and a method for realizing stereoscopic vision transposition. The wearable device body is provided with camera lenses, image sensors, an image information receiving and transmitting unit, image enhancement units, and near-to-eye optical systems. The optical axis and field angle of each near-to-eye optical system match the optical axis and field angle of the corresponding camera lens, and each image sensor is arranged behind its camera lens. The real scene enters the image sensor through the imaging optics for image acquisition, and the image enhancement unit enhances the low-light image collected by the smart wearable device in a low-light environment so that it is displayed clearly. The invention ensures enhanced real stereoscopic vision in dark environments and enables the exchange of remote, barrier-free stereoscopic real-scene vision.
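The abstract does not disclose a specific enhancement algorithm, so as a purely illustrative stand-in, a gamma correction with gamma below 1 lifts dark regions of a low-light grayscale image:

```python
def enhance_low_light(pixels: list[int], gamma: float = 0.5) -> list[int]:
    """Brighten a low-light grayscale image (pixel values 0-255) with
    gamma correction; gamma < 1 lifts dark regions toward mid-tones.
    Illustrative only -- the patent does not specify the enhancement."""
    return [round(255 * (p / 255) ** gamma) for p in pixels]
```

With the default gamma of 0.5, a dark pixel value of 64 is lifted to 128, while black (0) and white (255) are unchanged.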
HEAD-UP DISPLAY, HEAD-UP DISPLAY SYSTEM, AND MOVABLE BODY
A first input unit in a head-up display obtains a distance to an object. A second input unit obtains a user's eye position. An optical system projects, into the user's field of view, a virtual image of an image displayed on a display panel. A processor causes the display panel to display a parallax image. An optical element causes a first image displayed on the display panel to reach the user's first eye and a second image displayed on the display panel to reach the user's second eye. The processor causes the display panel to display an image element in the parallax image as at least partially superimposed on the object. The processor performs first control to fix, in response to the distance to the object being greater than or equal to a predetermined first distance, the parallax of the image element to a nonzero value corresponding to the first distance.
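The "first control" above can be sketched with a simple pinhole model: parallax falls off with object distance, but is clamped at the value corresponding to the first distance once the object is at or beyond it. The eye separation, focal length, and first distance used here are illustrative assumptions, not values from the patent:

```python
def parallax_for(distance_m: float, first_distance_m: float = 10.0,
                 eye_separation_m: float = 0.065, focal_px: float = 1000.0) -> float:
    """Parallax (in pixels) for an image element superimposed on an object
    at distance_m. When the object is at or beyond first_distance_m, the
    parallax is fixed to the nonzero value corresponding to first_distance_m,
    as in the 'first control' described above (hypothetical parameters)."""
    d = min(distance_m, first_distance_m)     # clamp: fix parallax beyond first distance
    return eye_separation_m * focal_px / d    # pinhole disparity ~ baseline * f / depth
```

Near objects thus receive larger parallax than far ones, and the parallax never decays to 0 for distant objects, matching the fixed nonzero value in the abstract.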
USING 6DOF POSE INFORMATION TO ALIGN IMAGES FROM SEPARATED CAMERAS
Techniques for aligning images generated by an integrated camera physically mounted to an HMD with images generated by a detached camera physically unmounted from the HMD are disclosed. A 3D feature map is generated and shared with the detached camera. Both the integrated camera and the detached camera use the 3D feature map to relocalize themselves and to determine their respective 6 DOF poses. The HMD receives the detached camera's image of the environment and the 6 DOF pose of the detached camera. A depth map of the environment is accessed. An overlaid image is generated by reprojecting a perspective of the detached camera's image to align with a perspective of the integrated camera and by overlaying the reprojected detached camera's image onto the integrated camera's image.
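The core geometric step above, relating the two relocalized 6 DOF poses, can be sketched as mapping a 3D point from the detached camera's frame into the integrated camera's frame. This is a minimal pure-Python sketch with 4x4 rigid-body matrices; the patent's full pipeline also involves the depth map and per-pixel reprojection, which are omitted here:

```python
def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def invert_rigid(pose):
    """Invert a rigid-body 4x4 pose [R|t]: inverse is [R^T | -R^T t]."""
    R = [[pose[r][c] for c in range(3)] for r in range(3)]
    t = [pose[r][3] for r in range(3)]
    Rt = [[R[c][r] for c in range(3)] for r in range(3)]            # transpose
    mt = [-sum(Rt[r][c] * t[c] for c in range(3)) for r in range(3)]
    return [Rt[0] + [mt[0]], Rt[1] + [mt[1]], Rt[2] + [mt[2]], [0, 0, 0, 1]]

def detached_to_integrated(point_detached, pose_detached, pose_integrated):
    """Map a 3D point from the detached camera's frame into the integrated
    (HMD) camera's frame: first into world coordinates via pose_detached,
    then back through the inverse of pose_integrated. Both poses are
    camera-to-world matrices obtained by relocalizing each camera against
    the shared 3D feature map."""
    p = list(point_detached) + [1.0]
    world = mat_vec(pose_detached, p)
    return mat_vec(invert_rigid(pose_integrated), world)[:3]
```

Applying this transform per pixel, using the depth map to lift pixels to 3D, yields the reprojected detached-camera image that is overlaid onto the integrated camera's image.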
HEAD-MOUNTED DISPLAY APPARATUS, IMAGE DISPLAY SYSTEM, AND IMAGE DISPLAY METHOD
An object is to provide a head-mounted display apparatus that makes it possible to accurately determine a mounting deviation. The technology provides a head-mounted display apparatus including a sensor that detects a change in position of the head-mounted display apparatus relative to a head. The technology further provides an image display system including the head-mounted display apparatus and an information processor that sends image data to the head-mounted display apparatus. The technology further provides an image display method including a detection step of detecting the change in position of the head-mounted display apparatus relative to the head, and an adjustment step of adjusting a position to which the head-mounted display apparatus projects image display light on the basis of the change in position detected in the detection step.
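The adjustment step described above amounts to compensating the projection position by the detected mounting deviation. A minimal sketch, assuming a 2D deviation in the display plane (the abstract does not specify the representation):

```python
def adjust_projection(base_position: tuple[float, float],
                      deviation: tuple[float, float]) -> tuple[float, float]:
    """Hypothetical adjustment step: shift the position to which the image
    display light is projected by the opposite of the detected mounting
    deviation, so the projected image stays fixed relative to the eye."""
    (bx, by), (dx, dy) = base_position, deviation
    return (bx - dx, by - dy)
```

If the sensor reports the display has slipped 1.5 mm right and 0.5 mm down, the projection point moves 1.5 mm left and 0.5 mm up to compensate.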
AUGMENTED REALITY DEVICE AND METHODS OF USE
Computer-implemented methods of operating an augmented reality device can involve capturing camera images, processing the camera images, and displaying virtual display images. The camera images can be captured automatically using a camera disposed within an augmented reality device worn by a user. The camera images can be processed automatically using a processor located within the augmented reality device. The virtual display images can be displayed automatically to the user within the augmented reality device while the user is looking through the augmented reality device and simultaneously viewing real objects through the augmented reality device. The virtual display images can be based on the processed camera images. Additional steps can include accepting a first user input, storing camera image(s) on a memory located within the augmented reality device based on the first input, accepting a second user input, and displaying stored image(s) to the user based on the second input.
MOVING CONTENT BETWEEN A VIRTUAL DISPLAY AND AN EXTENDED REALITY ENVIRONMENT
Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.
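The extraction flow above can be sketched as removing the object from the virtual display and re-placing it in the extended reality environment at a third distance distinct from the first two. The midpoint rule used here is an arbitrary illustrative choice; the patent only requires that the third distance differ from the other two:

```python
def extract_object(virtual_display: list[str], environment: dict[str, float],
                   name: str, first_d: float, second_d: float) -> float:
    """Move a named virtual object from the virtual display (at virtual
    distance first_d) into the extended reality environment (whose existing
    object sits at second_d), placing it at a third virtual distance that
    differs from both. Returns the chosen third distance (hypothetical rule)."""
    virtual_display.remove(name)
    third_d = (first_d + second_d) / 2.0
    if third_d in (first_d, second_d):        # degenerate case: distances equal
        third_d = first_d + 0.5
    environment[name] = third_d
    return third_d
```

For example, moving a chart from a virtual display at 2.0 m into an environment whose other object sits at 1.0 m places it at 1.5 m under this rule.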