G06F3/14

Screen sharing system, screen sharing method, and display apparatus
11579832 · 2023-02-14

A screen sharing system includes a first display apparatus including first circuitry and a second display apparatus including second circuitry. Both the first display apparatus and the second display apparatus display an input screen. The first circuitry of the first display apparatus is configured to receive first hand drafted input data that is input to the first display apparatus, and to set an edit authority, of a user of the second display apparatus, for the first hand drafted input data. The second circuitry of the second display apparatus is configured to restrict editing of the first hand drafted input data based on the edit authority of the user set by the first display apparatus.
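The per-stroke authority model described in this abstract can be sketched as follows. This is an illustrative reading, not the patent's implementation; the names `StrokeData` and `can_edit` are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class StrokeData:
    """Hand drafted input data with edit authority set by the first apparatus."""
    owner: str
    points: list
    editable_by: set = field(default_factory=set)  # users granted edit authority

def can_edit(stroke: StrokeData, user: str) -> bool:
    """The second apparatus restricts editing unless authority was granted."""
    return user == stroke.owner or user in stroke.editable_by

stroke = StrokeData(owner="device1_user", points=[(0, 0), (5, 3)])
stroke.editable_by.add("device2_user")  # first apparatus grants authority
```

The second display apparatus would consult `can_edit` before applying any local modification to the shared stroke.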

Displaying a window of a remote desktop computer on a mobile device with a native layout

Embodiments generally enable a mobile device to display a window of a remote desktop with a native layout. In some embodiments, a method includes receiving a remote desktop display request from a mobile client device, wherein the remote desktop display request includes display information of the mobile client device. The method further includes generating a copy of a window process of a remote desktop computer. The method further includes generating a virtual display based at least in part on the copy of the window process of the remote desktop computer and on the display information of the mobile client device. The method further includes sending virtual display information to the mobile client device based at least in part on the virtual display.
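One way to read "generating a virtual display based on the display information of the mobile client device" is as a geometry-fitting step. The sketch below is a minimal assumption about that step (field names and the aspect-preserving scale are illustrative, not from the patent):

```python
def make_virtual_display(window: dict, client_info: dict) -> dict:
    """Fit a copy of a desktop window's geometry to the mobile client's
    reported display information, preserving aspect ratio."""
    scale = min(client_info["width"] / window["width"],
                client_info["height"] / window["height"])
    return {
        "width": round(window["width"] * scale),
        "height": round(window["height"] * scale),
        "dpi": client_info["dpi"],
    }

desktop_window = {"width": 1920, "height": 1080}
mobile = {"width": 390, "height": 844, "dpi": 460}
vd = make_virtual_display(desktop_window, mobile)
```

The resulting virtual display information, rather than the raw desktop frame, is what gets sent back to the mobile client.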

Intelligent interactive all-in-one machine

The present invention, belonging to the technical field of data transmission, relates to an integrated intelligent interaction machine comprising a signal transmission device, a processing device, an upper computer, and a network transmission device. The signal transmission device is connected to the processing device, the processing device is connected to the upper computer, and the upper computer is also connected to the processing device through the network transmission device. The signal transmission device receives a screen transmission signal and transmits it to the processing device, which processes the signal to obtain screen transmission data and transmits that data to the upper computer. The upper computer connects to an external network and outputs a network signal to the network transmission device, which transmits the network signal to the processing device, thereby realizing communication between the processing device and the external network. Because the screen transmission data and the network signal travel along different paths, in-device screen transmission and network sharing are carried out separately, improving the data transmission convenience of the integrated intelligent interaction machine.
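The dual-path arrangement above can be summarized as a simple dispatch: screen data and network signals traverse different device chains. The path labels below are illustrative shorthand for the components named in the abstract, not terminology from the patent.

```python
def route(message: dict) -> list:
    """Return the device chain a message traverses, keyed by its kind."""
    if message["kind"] == "screen":
        # screen transmission: signal device -> processor -> upper computer
        return ["signal_device", "processor", "upper_computer"]
    if message["kind"] == "network":
        # network sharing: upper computer -> network device -> processor
        return ["upper_computer", "network_device", "processor"]
    raise ValueError("unknown message kind")
```

Keeping the two chains disjoint is what lets screen transmission and network sharing proceed without contending for the same link.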

Color-sensitive virtual markings of objects
11582312 · 2023-02-14

Disclosed are systems, methods, and non-transitory computer readable media for making virtual colored markings on objects. Instructions may include receiving an indication of an object; receiving from an image sensor an image of a hand of an individual holding a physical marking implement; detecting in the image a color associated with the marking implement; receiving from the image sensor image data indicative of movement of a tip of the marking implement and locations of the tip; determining from the image data when the locations of the tip correspond to locations on the object; and generating, in the detected color, virtual markings on the object at the corresponding locations.
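The instruction sequence in this abstract amounts to a small pipeline: detect the implement's color, then emit markings only where the tip locations fall on the object. The sketch below assumes simple data shapes (RGB pixel samples, a set of object locations); none of these names come from the patent.

```python
def detect_color(pixels: list) -> tuple:
    """Average sampled implement pixels to a single RGB color."""
    n = len(pixels)
    return tuple(sum(px[i] for px in pixels) // n for i in range(3))

def generate_markings(tip_locations: list, object_region: set, color: tuple) -> list:
    """Keep only tip locations that correspond to locations on the object,
    paired with the detected color."""
    return [(loc, color) for loc in tip_locations if loc in object_region]

color = detect_color([(250, 10, 12), (246, 8, 10)])   # red implement
object_region = {(1, 1), (1, 2), (2, 2)}
marks = generate_markings([(0, 0), (1, 2), (2, 2)], object_region, color)
```

The filter step reflects the claim language: virtual markings are generated only when tip locations correspond to locations on the object.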

Artificial reality collaborative working environments

Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.

Interactive virtual reality system
11580708 · 2023-02-14

Provided herein are method, apparatus, and computer program products for generating first and second three-dimensional interactive environments. The first three-dimensional interactive environment may contain one or more engageable virtual interfaces that correspond to one or more items. Upon engagement with a virtual interface, the second three-dimensional interactive environment is produced to provide a virtual simulation related to the one or more items.

Systems and methods for controlling virtual scene perspective via physical touch input

Systems, methods, and non-transitory computer readable media for controlling perspective in an extended reality environment are disclosed. In one embodiment, a non-transitory computer readable medium contains instructions to cause a processor to perform the steps of: outputting, for presentation via a wearable extended reality appliance (WER-appliance), first display signals reflective of a first perspective of a scene; receiving first input signals caused by a first multi-finger interaction with a touch sensor; in response, outputting for presentation via the WER-appliance second display signals to modify the first perspective of the scene, causing a second perspective of the scene to be presented via the WER-appliance; receiving second input signals caused by a second multi-finger interaction with the touch sensor; and in response, outputting for presentation via the WER-appliance third display signals to modify the second perspective of the scene, causing a third perspective of the scene to be presented via the WER-appliance.
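The first/second/third perspective progression can be modeled as successive state updates driven by multi-finger gestures. The gesture semantics below (two-finger drag rotates, pinch zooms) are assumptions for illustration; the patent does not specify them.

```python
def apply_gesture(perspective: tuple, gesture: dict) -> tuple:
    """Return a new (yaw_degrees, zoom) perspective modified by a gesture."""
    yaw, zoom = perspective
    if gesture["type"] == "two_finger_drag":
        yaw = (yaw + gesture["dx"]) % 360
    elif gesture["type"] == "pinch":
        zoom = max(0.1, zoom * gesture["factor"])
    return (yaw, zoom)

p1 = (0.0, 1.0)                                                # first perspective
p2 = apply_gesture(p1, {"type": "two_finger_drag", "dx": 45})  # second perspective
p3 = apply_gesture(p2, {"type": "pinch", "factor": 2.0})       # third perspective
```

Each gesture on the touch sensor yields new display signals; the WER-appliance renders whichever perspective is current.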

Realistic virtual/augmented/mixed reality viewing and interactions

The present invention discloses systems and methods for both viewing and interacting with a virtual reality (VR), an augmented reality (AR), or a mixed reality (MR). More specifically, the systems and methods allow the user to interact with aspects of such realities, including virtual items presented in such realities or within such environments, by manipulating a control device that has an inside-out camera mounted on board. In determining the pose of the control device, the apparatus or system uses two distinct representations, including a reduced representation, and uses these representations to compute an interactive pose portion of the control device for interacting with the virtual item. The reduced representation is consonant with a constrained motion of the control device.
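The two-representation idea can be sketched as a projection and a lift: the reduced representation keeps only the degrees of freedom the constrained motion allows, and the interactive pose is reconstructed from it. The choice of a planar constraint (x, y, yaw) and the fixed height are assumptions for illustration only.

```python
def reduce_pose(full_pose: tuple) -> tuple:
    """Project a full 6-DoF pose (x, y, z, roll, pitch, yaw) onto an assumed
    constrained subspace (x, y, yaw), matching the device's constrained motion."""
    x, y, z, roll, pitch, yaw = full_pose
    return (x, y, yaw)

def interactive_pose(reduced: tuple, fixed_height: float = 1.2) -> tuple:
    """Lift the reduced representation back to a pose usable for interaction."""
    x, y, yaw = reduced
    return (x, y, fixed_height, 0.0, 0.0, yaw)

full = (0.5, -0.2, 1.0, 3.0, 1.5, 90.0)   # noisy full pose from the camera
r = reduce_pose(full)
pose = interactive_pose(r)
```

Estimating in the reduced space discards noise along the constrained axes, which is the practical payoff of keeping the representation consonant with the device's motion.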