H04N5/222

Flash unit using adjustable diffusers
11538206 · 2022-12-27

Embodiments of systems and methods for using electronic diffusers to implement message indicators are described. A segment of a diffuser attached to an electronic device is configured to indicate an informational message in response to signals that result in a change to an optical property. A set of information to be displayed using the segment is determined, and a signal is transmitted to the segment to display the information.

Creating and distributing interactive addressable virtual content
11538213 · 2022-12-27

Systems and methods create and distribute addressable virtual content with interactivity. The virtual content may depict a live event and may be customized for each individual user based on dynamic characteristics (e.g., habits, preferences, etc.) of the user that are captured during user interaction with the virtual content. The virtual content is generated with low latency between the actual event and the live content that allows the user to interactively participate in actions related to the live event. The virtual content may represent a studio with multiple display screens that each show different live content (of the same or different live events), and may also include graphic displays that include related data such as statistics corresponding to the live event, athletes at the event, and so on. The content of the display screens and graphics may be automatically selected based on the dynamic characteristics of the user.

INTEGRATED LIGHTING AND SOUND DEVICE
20220407985 · 2022-12-22

An integrated lighting and sound device includes an LED lamp sheet, a beautifying live broadcast lamp front cover, a microphone output port, a beautifying live broadcast lamp rear cover, and a sound card circuit board. The sound card circuit board is embedded in the beautifying live broadcast lamp rear cover and forms part of a sound card module that provides a plurality of sound production modes. The sound card, microphone, and LED lamp sheet together allow switching among beautifying lighting, live talking, and live voice changing. The present disclosure saves the time of switching live broadcast devices back and forth and the space of the working environment, increases the working efficiency of a live broadcast, and reduces purchase cost.

Display control apparatus, method for controlling display control apparatus, and storage medium
11528462 · 2022-12-13

State information indicating the states of a plurality of imaging apparatuses 100-x used for generating a virtual viewpoint image is acquired. Based on the state information, at least one image type is determined from a plurality of image types indicating display formats for displaying the states of the plurality of imaging apparatuses 100-x. The states of the plurality of imaging apparatuses 100-x are displayed based on the determined image type.

CONTROLLING CHARACTERISTICS OF LIGHT OUTPUT FROM LED WALLS
20220382502 · 2022-12-01

A computer-generated scene is generated as background for a live action set, for display on a panel of light emitting diodes (LEDs). Characteristics of light output by the LED panel are controlled such that the computer-generated scene rendered on the LED panel, when captured by a motion picture camera, has high fidelity to the original computer-generated scene. Consequently, the scene displayed on the screen more closely simulates the rendered scene from the viewpoint of the camera. Thus, a viewpoint captured by the camera appears more realistic and/or truer to the creative intent.

Operating system integrated image capture guidance
11516384 · 2022-11-29

Systems and techniques for operating system integrated image capture guidance are described herein. An indication may be received of an object to be captured for completing a transaction. Configuration data may be obtained for an image of the object, indicating an orientation of the object in the image. An image of the object may be obtained from an imaging sensor of a device. A discrepancy may be determined between the orientation of the object in the captured image and the orientation indicated by the configuration data. Orientation guidance may be generated that indicates repositioning of the object in the image. It may be determined that the discrepancy has been eliminated. Capture guidance may be generated for output via the device based on a set of commands determined based on detection of an operating system executing on the device.
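
The orientation comparison the abstract describes can be illustrated with a minimal sketch. The function name, angle convention, and tolerance value below are hypothetical, not from the patent:

```python
def orientation_guidance(measured_deg: float, target_deg: float,
                         tolerance_deg: float = 2.0):
    """Compare the object's detected orientation with the configured one.

    Returns None when the discrepancy is within tolerance (capture may
    proceed); otherwise returns a guidance string telling the user which
    way to rotate the object. Tolerance is an illustrative tuning value.
    """
    # Wrap the difference into (-180, 180] so guidance takes the short way.
    diff = (measured_deg - target_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= tolerance_deg:
        return None
    direction = "counterclockwise" if diff > 0 else "clockwise"
    return f"rotate {abs(diff):.0f} degrees {direction}"

orientation_guidance(95.0, 90.0)  # "rotate 5 degrees counterclockwise"
orientation_guidance(90.5, 90.0)  # None — within tolerance, capture proceeds
```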

VARIED DEPTH DETERMINATION USING STEREO VISION AND PHASE DETECTION AUTO FOCUS (PDAF)
20220377209 · 2022-11-24

Disclosed are systems, methods, and non-transitory computer-readable media for varied depth determination using stereo vision and phase detection auto focus (PDAF). Computer stereo vision (stereo vision) is used to extract three-dimensional information from digital images. To utilize stereo vision, two optical sensors are displaced horizontally from one another and used to capture images depicting two differing views of a real-world environment from two different vantage points. The relative depth of the objects captured in the images is determined using triangulation by comparing the relative positions of the objects in the two images. For example, the relative positions of matching objects (e.g., features) identified in the captured images are used along with the known orientation of the optical sensors (e.g., distance between the optical sensors, vantage points of the optical sensors) to estimate the depth of the objects.
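
For rectified stereo cameras, the triangulation described above reduces to a similar-triangles relation: depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the sensors, and d the disparity between the matched feature positions. A minimal sketch (the function name and numeric values are illustrative, not from the patent):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate depth from the disparity between two rectified stereo views.

    Z = f * B / d: focal length f in pixels, baseline B in meters between
    the horizontally displaced optical sensors, disparity d in pixels
    between the matched feature's positions in the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A feature matched at x_left = 640 and x_right = 600, with a 700 px
# focal length and a 6 cm baseline:
depth = stereo_depth(focal_px=700.0, baseline_m=0.06, disparity_px=640 - 600)
# 700 * 0.06 / 40 = 1.05 m
```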

AUTOMATIC MEDIA CAPTURE BASED ON MOTION SENSOR DATA
20220375103 · 2022-11-24

Systems and methods herein describe a media capture system that receives sensor data from motion sensors coupled to a head-wearable apparatus, detects a trigger event corresponding to a head-wearable apparatus based on the sensor data, captures images using a camera coupled to the head-wearable apparatus, and transmits the captured images to a client device.
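
One way such a trigger event could be detected from motion-sensor data is a simple magnitude threshold on accelerometer samples. The function name and threshold below are hypothetical; the patent does not specify the detection rule:

```python
import math

def detect_trigger(accel_xyz, threshold_g: float = 2.5) -> bool:
    """Flag a capture trigger when the acceleration magnitude spikes.

    accel_xyz: an (x, y, z) acceleration sample in g units from the
    head-wearable apparatus's motion sensor. Returns True when the
    magnitude exceeds `threshold_g`, an illustrative tuning value.
    """
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    return magnitude > threshold_g

# At rest, gravity alone gives ~1 g, well below the threshold:
detect_trigger((0.0, 0.0, 1.0))   # False
# A sharp head movement produces a magnitude of about 2.69 g:
detect_trigger((1.5, 2.0, 1.0))   # True
```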

Tracking system for visual effect triggering using induced magnetic field and predictive analysis

A system comprises active magnetic emitters positioned within an area, passive magnetic emitters configured to be moved within the area, a magnetic field detector configured to measure a strength and direction of a magnetic field within the area, and a processor in communication with the magnetic field detector. The passive magnetic emitters are configured to be integrated in, coupled to, or secured to at least one tracked object or tracked subject within the area. The processor is configured to evaluate at least one change in the measured strength and direction of the magnetic field and send a signal to a visual effect actuator or visual effects display to initiate a visual effect based on the at least one change. A method and computer program product relating to the system are also provided.
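
The change evaluation the abstract describes can be sketched as a threshold check on successive detector readings. The function name, units, and threshold values are hypothetical assumptions, not taken from the patent:

```python
import math

def should_trigger_effect(prev, curr,
                          strength_delta: float = 5.0,
                          angle_delta_rad: float = 0.2) -> bool:
    """Decide whether a field change warrants signaling a visual effect.

    prev, curr: (strength_uT, (x, y, z) unit direction vector) readings
    from the magnetic field detector. Triggers when the strength changes
    by more than `strength_delta` microtesla or the direction rotates by
    more than `angle_delta_rad` radians (both illustrative thresholds).
    """
    (s0, d0), (s1, d1) = prev, curr
    dot = sum(a * b for a, b in zip(d0, d1))
    angle = math.acos(max(-1.0, min(1.0, dot)))  # angle between directions
    return abs(s1 - s0) > strength_delta or angle > angle_delta_rad

# A passive emitter moved near the detector: strength jumps by 8 uT,
# so the processor would signal the visual effect actuator.
should_trigger_effect((50.0, (0.0, 0.0, 1.0)),
                      (58.0, (0.0, 0.0, 1.0)))  # True
```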