G06F3/1454

Multi-user extended reality viewing technique
11574435 · 2023-02-07 ·

In this patent, an improved multi-user extended reality viewing technique is disclosed. A first user and a second user can be geographically separated while viewing the same volume. The first user can manipulate a virtual object, and the second user can see the manipulated virtual object. A set of techniques is disclosed herein to cause the virtual object, as it is presented to each user, to be visually appealing and to prevent dizziness and nausea.

Method and apparatus for automatically creating mirrored views of the video feed of meeting participants in breakout rooms or conversation groups during a videoconferencing session
11595448 · 2023-02-28 ·

A mirrored gallery view is provided of a breakout room in a session view in an online meeting user interface associated with a videoconferencing session established in a videoconferencing system. The mirrored gallery view displays video feeds of meeting participants on their respective participant computers. The video feeds are camera-captured views of each of the meeting participants. The videoconferencing system creates a breakout room within the videoconferencing session for a subset of the meeting participants, thereby allowing the subset of the meeting participants to engage with one another within the breakout room during the videoconferencing session. A video processor automatically creates mirrored views of the video feed of each of the subset of meeting participants in the breakout room whose video feed in the videoconferencing session is not currently mirrored. The videoconferencing system generates instructions for a gallery view of the breakout room in the online meeting user interface using only mirrored views of the video feeds of the subset of meeting participants in the breakout room, including the mirrored views created by the video processor, and transmits instructions to display the gallery view of the breakout room in the online meeting user interface to all meeting participants in the breakout room on their respective participant computers. In this manner, all of the meeting participants in the breakout room are displayed as mirrored views of their respective video feeds. A similar process occurs with conversation groups in a virtual space view in an online meeting user interface associated with a videoconferencing session established in a videoconferencing system.
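The core operation the abstract describes — mirroring only those feeds that are not already mirrored, so the resulting gallery is uniformly mirrored — can be sketched as follows. This is a minimal illustration, not the patented implementation; the frame layout (height × width × channels NumPy arrays) and the function names are assumptions.

```python
import numpy as np

def mirror_frame(frame: np.ndarray) -> np.ndarray:
    """Horizontally flip a video frame (H x W x C) to produce a mirrored view."""
    return frame[:, ::-1, :]

def build_breakout_gallery(feeds: dict, already_mirrored: set) -> dict:
    """Assemble a gallery in which every participant's feed is mirrored.

    Feeds that arrive already mirrored are passed through unchanged; all
    other feeds are flipped, so the gallery shows only mirrored views.
    """
    return {
        participant_id: frame if participant_id in already_mirrored
        else mirror_frame(frame)
        for participant_id, frame in feeds.items()
    }
```

In a real videoconferencing pipeline the flip would be applied per frame by the video processor before compositing the gallery layout; the dictionary here stands in for that per-participant stream handling.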

Selective screen sharing
11593055 · 2023-02-28 ·

Disclosed are various examples for selective screen sharing. In one example, a computing device can generate a video stream based on a screen capture and transmit the video stream to a destination device. The computing device can also obtain a user-specified modification to an area of the screen capture within the video stream. The computing device can then update the video stream by applying a transformation to the screen capture based at least in part on the user-specified modification, after the video stream has started transmission to the destination device. In some cases, an update to the user-specified modification of the area is also obtained. The video stream can be updated by applying an updated transformation to the screen capture that obscures the updated area within the video stream.
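The transformation the abstract describes — obscuring a user-specified region of the captured screen while the stream is live — can be sketched per frame. This is a hedged illustration under assumed conventions: frames are NumPy arrays and the region is a hypothetical `(x, y, w, h)` rectangle in pixels.

```python
import numpy as np

def obscure_region(frame: np.ndarray, region: tuple) -> np.ndarray:
    """Black out a user-specified rectangle (x, y, w, h) in a captured frame.

    The original frame is left untouched; a transformed copy is returned so
    the transformation can be re-applied (or updated) on later frames.
    """
    x, y, w, h = region
    out = frame.copy()
    out[y:y + h, x:x + w] = 0  # replace the selected area with black pixels
    return out
```

Because the transformation is applied to each captured frame before encoding, the user can change the region mid-stream and the updated rectangle simply takes effect on subsequent frames, matching the "after the video stream has started transmission" behavior in the abstract.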

TRANSMISSION TERMINAL, TRANSMISSION METHOD, AND COMPUTER-READABLE RECORDING MEDIUM STORING TRANSMISSION PROGRAM
20180007317 · 2018-01-04 ·

A transmission terminal transmits video data and display data of a screen shared with another transmission terminal to the other transmission terminal via a predetermined relay apparatus. The transmission terminal includes a storage unit that stores relay apparatus information of the relay apparatus to which the transmission terminal transmits the video data; a receive unit that receives the display data from an external input apparatus connected to the transmission terminal; and a transmitting unit that transmits the display data received by the receive unit to the relay apparatus indicated by the relay apparatus information stored in the storage unit.

MANAGEMENT OF DISPLAY INPUTS

Examples relate to managing a display input. An example system to manage a display input is provided herein. Management of display input includes a determination of a selected display mode. Management of display input also includes control of connections and transfer of data between an external device and an internal device. Management of display input further includes adjustment of a display setting based on the display mode.

INTEGRATED DISPLAY DEVICE FOR MOBILE TERMINAL
20180004477 · 2018-01-04 ·

Disclosed herein is an integrated display device for a mobile terminal. The integrated display device includes: a terminal connection module including an accommodation part configured to accommodate a mobile terminal, and a docking part connected to the mobile terminal via a multimedia transmission and reception terminal and configured to receive image data output from the mobile terminal; and a display module including a display part configured to include a display device larger than a display device of the mobile terminal, and a controller configured to include a resizing module adapted to resize the image data received from the docking part and provide the resized image data to the display part; wherein the display module is integrated with one side surface of the terminal connection module; and wherein at least one operation button is located on the front surface of the accommodation part.
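The resizing module's job — scaling the mobile terminal's image data to the larger display while preserving its aspect ratio — reduces to a small geometric calculation. The sketch below is illustrative only; the function name and the fit-inside policy (rather than crop-to-fill) are assumptions, not details from the patent.

```python
def fit_resolution(src_w: int, src_h: int, dst_w: int, dst_h: int) -> tuple:
    """Scale a (src_w, src_h) image to fit inside a (dst_w, dst_h) display,
    preserving the source aspect ratio (letterboxing any leftover space)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)
```

For example, a 1080 × 1920 phone image shown on a 1600 × 2560 panel is constrained by the height ratio (2560 / 1920 = 4/3) and scales to 1440 × 2560, leaving side margins rather than distorting the image.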

MEDIA PRODUCTION TO OPERATING SYSTEM SUPPORTED DISPLAY

The rendering of media generated by media production systems on a display of a different computer system that operates an operating system. A display of a computer system that operates an operating system is sometimes referred to as a smart display. When the computer system receives the media from the media production system(s), the computer system formulates an operating system control that, when triggered, performs one or more operating system operations. The computer system then displays a visualization of the operating system control along with at least part of the received media on the display of the computer system. The operating system control is structured so as to be triggered when a user interacts in at least a particular way with the visualization of the operating system control. Thus, rather than simply render the media as provided, additional operating system level control is provided by the smart display.

Display Screen Front Panel of HMD for Viewing by Users Viewing the HMD Player
20180004478 · 2018-01-04 ·

A method for providing an image of an HMD user to a non-HMD user includes receiving a first image of a user, including the user's facial features, captured by an external camera when the user is not wearing a head mounted display (HMD). A second image capturing a portion of the facial features of the user when the user is wearing the HMD is received. Image overlay data is generated by mapping contours of facial features captured in the second image to contours of corresponding facial features captured in the first image. The image overlay data is forwarded to the HMD for rendering on a second display screen that is mounted on a front face of the HMD.

Audio-Visual Navigation and Communication
20180011621 · 2018-01-11 ·

Information is communicated through a user platform by representing, on a user platform visual display, spatial publishing objects as entities at locations within a three-dimensional spatial publishing object space. Each spatial publishing object is associated with information, and each presents a subset of the associated information. A user presence is established at a location within the spatial publishing object space. The user presence, in conjunction with a user point-of-view, is navigable by the user in at least a two-dimensional sub-space of the spatial publishing object space.

CONTROL SYSTEM FOR NAVIGATING A PRINCIPAL DIMENSION OF A DATA SPACE
20180011541 · 2018-01-11 ·

Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.