H04N21/8545

Interactive Media Events
20230014831 · 2023-01-19

An Interactive Media Event (IME) system includes a sync server, a first user device, and a second user device, each device coupled to the server. The server executes computer instructions instantiating a content segment engine, which outputs a Party matter to the second user device, and an IME engine, which receives, from the second user device, a later reaction to the Party matter. The IME engine synchronizes the later reaction with the Party matter. The Party matter may include a media event and a prior reaction to the media event received from the first user device. The media event includes a primary content segment and synchronization information associated therewith. The prior reaction and/or the later reaction may be synchronized to the primary content segment and/or to each other using the synchronization information. A reaction may include chat data captured during the Party.
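As a rough illustration (not the patent's actual implementation), synchronizing prior and later reactions against a primary content segment might amount to keeping each reaction's offset on the segment's shared timeline; the class and field names below are invented for this sketch:

```python
from dataclasses import dataclass, field

@dataclass
class Reaction:
    """A captured reaction (e.g., chat data) with its offset into the segment."""
    user: str
    text: str
    offset_s: float  # seconds from the start of the primary content segment

@dataclass
class PartyMatter:
    """Primary content segment plus reactions aligned to its timeline."""
    segment_id: str
    duration_s: float
    reactions: list[Reaction] = field(default_factory=list)

    def add_reaction(self, reaction: Reaction) -> None:
        # Clamp the reaction onto the segment timeline so prior and later
        # reactions can be replayed in a consistent order.
        reaction.offset_s = max(0.0, min(reaction.offset_s, self.duration_s))
        self.reactions.append(reaction)
        self.reactions.sort(key=lambda r: r.offset_s)

    def reactions_at(self, playhead_s: float, window_s: float = 2.0) -> list[Reaction]:
        # Return reactions that fall within a short window behind the playhead.
        return [r for r in self.reactions
                if playhead_s - window_s <= r.offset_s <= playhead_s]

party = PartyMatter("episode-1", duration_s=600.0)
party.add_reaction(Reaction("alice", "great scene!", 120.0))   # prior reaction
party.add_reaction(Reaction("bob", "lol agreed", 121.5))       # later reaction
print([r.user for r in party.reactions_at(122.0)])  # ['alice', 'bob']
```

Storing offsets relative to the segment (rather than wall-clock time) is what lets a later viewer's reactions interleave correctly with reactions recorded in an earlier session.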

Reflective video display apparatus for interactive training and demonstration and methods of using same
11697056 · 2023-07-11

A smart mirror can show live or recorded streaming video of an instructor performing a workout in a package that is attractive and unobtrusive enough to hang in a living room. The smart mirror includes a mirror surface with a fully reflecting section and a partially reflecting section. A display behind the partially reflecting section shows the video when the smart mirror is on and is almost invisible when the smart mirror is off. The smart mirror also has a speaker, a microphone, and a camera to enable a user to view the video content and interact with the instructor. The smart mirror may connect to the user's smart phone, a peripheral device (e.g., a Bluetooth speaker) to augment user experience, a biometric sensor to provide biometric data to assess user performance, and/or a network router to connect the smart mirror to a content provider, an instructor, and/or other users.

Dynamic topology generation for branching narratives

A playback application is configured to dynamically generate topology for an interactive media title. The playback application obtains an initial topology and also collects various data associated with a user interacting with the interactive media title. The playback application then modifies the initial topology, based on the collected data, to generate a dynamic topology tailored to the user. The dynamic topology describes the set of choices available to the user during playback as well as which options can be selected by the user when making a given choice. In addition, the playback application also selectively buffers different portions of the interactive media title, based on the collected data, in anticipation of the user selecting particular options for available choices.
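A minimal sketch of the idea, assuming an invented node graph, preference model, and pruning threshold (none of which come from the patent): the topology maps each narrative node to its choice options, and collected user data drives both option pruning and buffering priority:

```python
# Hypothetical branching-narrative topology: node -> selectable options.
initial_topology = {
    "intro": ["fight", "flee"],
    "fight": ["win", "lose"],
    "flee":  ["hide", "run"],
}

def generate_dynamic_topology(topology, user_data):
    """Tailor the topology by keeping options the user is likely to pick."""
    dynamic = {}
    for node, options in topology.items():
        preferred = [o for o in options if user_data.get(o, 0.0) >= 0.3]
        dynamic[node] = preferred or options  # never strip every choice
    return dynamic

def buffer_candidates(dynamic, current_node, user_data):
    """Order the current node's options by likelihood, for pre-buffering."""
    options = dynamic.get(current_node, [])
    return sorted(options, key=lambda o: user_data.get(o, 0.0), reverse=True)

user_data = {"fight": 0.9, "flee": 0.1, "win": 0.5, "lose": 0.2,
             "hide": 0.6, "run": 0.4}
dyn = generate_dynamic_topology(initial_topology, user_data)
print(dyn["intro"])                                # ['fight']
print(buffer_candidates(dyn, "fight", user_data))  # ['win']
```

The same scores serve two purposes here, mirroring the abstract: shaping which choices are offered at all, and deciding which branches to fetch ahead of playback.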

Creating and distributing interactive addressable virtual content
11538213 · 2022-12-27

Systems and methods create and distribute addressable virtual content with interactivity. The virtual content may depict a live event and may be customized for each individual user based on dynamic characteristics (e.g., habits, preferences, etc.) of the user that are captured during user interaction with the virtual content. The virtual content is generated with low latency between the actual event and the delivered live content, allowing the user to interactively participate in actions related to the live event. The virtual content may represent a studio with multiple display screens that each show different live content (of the same or different live events), and may also include graphic displays with related data such as statistics corresponding to the live event, athletes at the event, and so on. The content of the display screens and graphics may be automatically selected based on the dynamic characteristics of the user.
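Preference-driven screen selection could be sketched as follows; the feed names, preference scores, and screen count are all invented for illustration:

```python
def select_screen_feeds(available_feeds, user_prefs, n_screens):
    """Rank candidate live feeds by the user's dynamic preference scores
    and fill the studio's display screens with the top-ranked ones."""
    ranked = sorted(available_feeds,
                    key=lambda f: user_prefs.get(f, 0.0), reverse=True)
    return ranked[:n_screens]

feeds = ["main_broadcast", "player_cam", "stats_overlay", "crowd_cam"]
prefs = {"player_cam": 0.8, "stats_overlay": 0.6, "main_broadcast": 0.9}
print(select_screen_feeds(feeds, prefs, 3))
# ['main_broadcast', 'player_cam', 'stats_overlay']
```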

System, apparatus and method for interactive reading

A system, apparatus and method for facilitating interactive reading can include an electronic device having a program or application thereon. In one embodiment, the application can recognize, in combination with an external data source, one or more cues that result from reading a story aloud and/or performing one or more acts.
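One way such cue recognition could work is simple phrase matching against transcribed speech, resolved through an external lookup; the cue phrases, action names, and the stand-in "external data source" below are all invented for this sketch:

```python
# Cue phrases the application listens for during read-aloud.
CUES = {
    "the wolf howled": "play_howl",
    "she knocked on the door": "play_knock",
}

# Stand-in for an external data source mapping actions to effects.
EXTERNAL_EFFECTS = {
    "play_howl": "howl.wav",
    "play_knock": "knock.wav",
}

def recognize_cue(transcribed_speech: str):
    """Match read-aloud speech against known cues and resolve the effect."""
    text = transcribed_speech.lower()
    for cue, action in CUES.items():
        if cue in text:
            return EXTERNAL_EFFECTS.get(action)
    return None

print(recognize_cue("And then the wolf howled at the moon"))  # howl.wav
```

A production system would presumably rely on speech recognition rather than pre-transcribed text, but the cue-to-action resolution step would look similar.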

Video file playing method and apparatus, and storage medium

This application discloses a video file playing method and apparatus, and a storage medium. The video file playing method includes playing an animation file frame by frame according to a playback time of a video file, the video file comprising at least one displayed object, and the animation file comprising an animation element generated according to the displayed object; determining click/tap position information of a screen clicking/tapping event in response to the screen clicking/tapping event being detected; determining an animation element display area corresponding to the click/tap position information of the screen clicking/tapping event in the animation file according to the click/tap position information; determining, according to the corresponding animation element display area, an animation element triggered by the screen clicking/tapping event; and determining an interactive operation corresponding to the triggered animation element and performing the interactive operation.
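The core of the method is hit-testing a tap position against animation element display areas and dispatching the matched element's interaction. A minimal sketch, with invented element names and rectangle-based display areas:

```python
from typing import Optional

class AnimationElement:
    """An animation element with a rectangular display area and a handler."""
    def __init__(self, name, x, y, w, h, on_tap):
        self.name = name
        self.rect = (x, y, w, h)
        self.on_tap = on_tap

    def contains(self, px: float, py: float) -> bool:
        x, y, w, h = self.rect
        return x <= px <= x + w and y <= py <= y + h

def handle_tap(elements, px, py) -> Optional[str]:
    """Find the element whose display area covers the tap, then trigger
    its interactive operation. Elements are checked front-most first."""
    for element in elements:
        if element.contains(px, py):
            return element.on_tap()
    return None

elements = [
    AnimationElement("subtitle", 0, 400, 640, 80, lambda: "toggle_subtitles"),
    AnimationElement("logo", 10, 10, 100, 40, lambda: "open_link"),
]
print(handle_tap(elements, 50, 20))    # open_link
print(handle_tap(elements, 300, 430))  # toggle_subtitles
```

Generating the elements from the video's displayed objects (as the abstract describes) keeps the hit-test areas in step with playback, so the same tap resolves to different interactions as frames advance.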