Patent classifications
G09G5/377
SHARED VIEWING EXPERIENCE ENHANCEMENT
Methods and systems are provided for receiving media content for display in a shared activity session; receiving additional content corresponding to respective users of one or more other user equipment devices participating in the shared activity session; generating for display, using control circuitry, a display screen including the media content and at least some of the additional content; and, during the shared activity session, automatically adapting the display of the additional content using the control circuitry based on the media content and/or the additional content. For example, images, avatars or video of the users displayed alongside the media content may be adapted using backgrounds or filters reflecting the media content; additional content, such as audio or chat messages, provided by those users; and/or information in their user profiles. The shared activity may be, for example, a group watch session, a videoconference, video call, audio call, chat session or multi-player game session.
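A minimal sketch of the adaptation step this abstract describes: during the session, a background for each user's avatar is re-selected from the media content's metadata and keywords seen in that user's chat messages. The rule table, keywords and all names here are invented for illustration, not the patent's method.

```python
# Illustrative only: genre-to-background table and chat-keyword override
# are hypothetical examples of "adapting the display of the additional
# content based on the media content and/or the additional content".
GENRE_BACKGROUNDS = {"horror": "fog", "sports": "stadium", "comedy": "confetti"}

def adapt_avatar(media_genre, chat_messages):
    """Choose an avatar background from the genre, overridden by chat cues."""
    background = GENRE_BACKGROUNDS.get(media_genre, "neutral")
    text = " ".join(chat_messages).lower()
    if media_genre == "sports" and "goal" in text:
        background = "fireworks"  # react to additional content from the user
    return background
```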
Ship information display device and method of displaying ship information
A ship information display device is provided, which may include a first processor, a second processor, a graphic processor, and a display. The first processor may generate a first image based on first ship information received from a first ship sensor and generate a screen to be synthesized including the first image and a blank image. The second processor may generate a second image based on second ship information received from a second ship sensor. The graphic processor may generate a synthesized screen including the first image and the second image by replacing the blank image of the screen to be synthesized by the second image generated by the second processor. The display may display the synthesized screen.
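The screen-synthesis step above can be sketched as follows: the first processor emits a "screen to be synthesized" whose blank region is marked with a sentinel value, and the graphic processor overwrites that region with the second processor's image. The sentinel convention and all names are assumptions for illustration.

```python
BLANK = -1  # hypothetical sentinel marking the blank image region

def make_screen_to_be_synthesized(first_image, blank_h, blank_w):
    """First processor: first image on top, blank image below."""
    blank = [[BLANK] * blank_w for _ in range(blank_h)]
    return first_image + blank

def synthesize(screen, second_image):
    """Graphic processor: replace each blank row with the second image's rows."""
    out = [row[:] for row in screen]
    rows = iter(second_image)
    for y, row in enumerate(out):
        if all(p == BLANK for p in row):
            out[y] = next(rows)[: len(row)]
    return out
```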
DISPLAY TERMINAL DEVICE
In a display terminal device, a CPU determines an arrangement position of a virtual object in real space by software processing and outputs a first image, which is an image of the virtual object, and information indicating the arrangement position. An imaging unit captures a second image, which is an image of the real space. A synthesizer generates a synthetic image by combining the first image and the second image by hardware processing based on the arrangement position. A display is directly connected to the synthesizer and displays the synthetic image.
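The synthesizer stage described above can be sketched as a simple overlay: the CPU supplies the first image (the virtual object) plus its arrangement position, and the synthesizer composites it onto the captured second image at that position. The transparency convention (0 = transparent) and the function names are assumptions.

```python
TRANSPARENT = 0  # hypothetical: pixels of the first image that are not drawn

def composite(second_image, first_image, arrangement_position):
    """Overlay first_image onto second_image at (row, col) = arrangement_position."""
    r0, c0 = arrangement_position
    out = [row[:] for row in second_image]
    for r, row in enumerate(first_image):
        for c, pixel in enumerate(row):
            if pixel != TRANSPARENT:
                out[r0 + r][c0 + c] = pixel
    return out
```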
IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM
A device and a method are provided for performing AR image display control in which flicker in a boundary region between a virtual object and a real object is not noticeable. The device includes a real object detection unit that detects a real object in the real world, and an augmented reality (AR) image display control unit that generates an AR image, visually recognized such that a virtual object exists in the same space as the real object, and outputs the AR image to a display unit. When the position of the real object detected by the real object detection unit is on the near side, in the depth direction, of the position at which the virtual object is to be displayed, the AR image display control unit outputs an additional virtual object to the display unit on the near side of the real object so as to hide at least part of the boundary region between the virtual object and the real object.
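The display-control rule above can be sketched as a layer planner: when the real object is nearer than the virtual object, an additional virtual object is placed just in front of the real object to mask the boundary. The depth convention (smaller = nearer), the margin and the names are assumptions.

```python
def plan_layers(real_depth, virtual_depth, margin=0.1):
    """Return draw layers ordered far to near; add a masking object when
    the real object is on the near side of the virtual object."""
    layers = [("virtual_object", virtual_depth), ("real_object", real_depth)]
    if real_depth < virtual_depth:  # real object nearer than virtual object
        layers.append(("additional_virtual_object", real_depth - margin))
    return sorted(layers, key=lambda layer: -layer[1])
```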
TOUCH DISPLAY DRIVING MODULE, TOUCH DISPLAY DRIVING METHOD, AND DISPLAY DEVICE
The present disclosure provides a touch display driving module, a touch display driving method, and a display device. The touch display driving module includes a touch detection chip and a time sequence touch control chip. The time sequence touch control chip is configured to perform data calculation in accordance with a touch data signal to generate touch point coordinate information, generate touch point image information in accordance with the touch point coordinate information, and display the touch point image on a display screen in accordance with the touch point image information. The touch point coordinate information is information carrying the coordinates of a touch point; the touch point image is an image showing that the touch point has been touched; and the touch point image information is the display information corresponding to the touch point image.
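The pipeline above (touch data signal → touch point coordinates → touch point image information) can be sketched as follows. The centroid calculation and the circular-marker representation are assumptions for illustration, not the disclosure's method.

```python
def touch_coordinates(touch_data):
    """Centroid of sensor cells whose signal is above zero (assumed threshold)."""
    hits = [(r, c) for r, row in enumerate(touch_data)
            for c, v in enumerate(row) if v > 0]
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

def touch_point_image_info(coordinate, radius=2):
    """Display information for a marker drawn at the touch point."""
    return {"center": coordinate, "radius": radius}
```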
INFORMATION PROCESSING APPARATUS AND PROJECTION SYSTEM
An information processing apparatus includes a recognizer that, based on captured image information generated by capturing a region onto which a first image including an individual image is projected from a projector, recognizes an obstacle between the region and the projector; an identifier that, when the obstacle is recognized, identifies, based on first image information indicating the first image and a recognition result from the recognizer, the individual image whose projection is blocked by the obstacle as a specific image; a decider that decides to change a first position in the region, at which the specific image is projected, to a second position in the region toward which projection of the specific image is not blocked by the obstacle; and a generator that generates second image information indicating a second image including an auxiliary image teaching the association between the specific image located at the second position and the obstacle.
Augmented reality display systems with super-Lambertian LED source
Emissive display devices having LED sources with super-Lambertian radiation patterns are provided. An exemplary emission source may have a half-emission-cone angle of less than 40°. A system, such as an augmented reality display system, employing such an emissive display device may exhibit a reduction in power of up to three times relative to LED sources with a Lambertian radiation pattern. In some systems, such as augmented reality display systems, the optical path downstream of such an emissive display device may be simplified and/or dimensionally scaled, and/or manufactured to looser tolerances. For example, a discrete collimating lens may be eliminated from the optical path of such an emissive display device.
Wearable display system for portable computing devices
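Abstract below.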
A method and system for providing a wearable expanded display system for portable computing devices are disclosed. The method includes connecting a head-mounted display system to the portable computing device and anchoring a virtual projection of content from the portable computing device to the screen of the portable computing device, providing the user with an augmented reality viewing experience. The virtual projection can present the content in an enlarged view with richer content options than the smaller screen of the portable computing device. The position of the virtual projection adapts to changes in the pose of the screen to maintain the anchored relationship.
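The anchoring behaviour above can be sketched as a per-frame pose update: the virtual projection's pose is recomputed from the phone screen's current pose so the enlarged view stays attached to the device. The fixed offset, scale and names are assumptions for illustration.

```python
def projection_pose(screen_pos, screen_yaw, offset=(0.0, 0.3), scale=3.0):
    """Place the virtual projection at a fixed offset above the screen,
    matching the screen's yaw, so it tracks changes in the screen's pose."""
    x, y = screen_pos
    dx, dy = offset
    return {"pos": (x + dx, y + dy), "yaw": screen_yaw, "scale": scale}
```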