H04N5/222

Method and Terminal Device for Matching Photographed Objects and Preset Text Information
20230066116 · 2023-03-02

A photographing method and a terminal device are disclosed. The method includes: receiving, by a terminal device, a first operation; starting, by the terminal device, a camera in response to the first operation; displaying, by the terminal device, a first preview screen including a first preview image, where the first preview image includes at least one photographed object, and the at least one photographed object in the first preview image matches preset first text information; and outputting, by the terminal device, first prompt information based on the first preview image and the first text information, where the first prompt information is used to indicate a missing or redundant photographed object in the first preview image. Guided by the prompt information, a user can adjust the position or angle of the terminal device.
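The matching and prompting step above can be sketched as a simple set comparison. The function name and object labels below are illustrative assumptions, not the patent's actual implementation; object detection in the preview image is abstracted into a set of labels.

```python
def object_prompt(detected: set[str], required: set[str]) -> str:
    """Compare objects detected in the preview image with those named
    in the preset text, and report missing/redundant ones."""
    missing = required - detected    # expected by the text, absent from the frame
    redundant = detected - required  # in the frame, not named in the text
    parts = []
    if missing:
        parts.append("missing: " + ", ".join(sorted(missing)))
    if redundant:
        parts.append("redundant: " + ", ".join(sorted(redundant)))
    return "; ".join(parts) or "frame matches text"
```

A frame containing a tree and a dog, checked against text expecting a tree and a person, would prompt "missing: person; redundant: dog", guiding the user to reframe the shot.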

INTERACTIVE IMAGE GENERATION

A content generation platform is generally described herein. More specifically, techniques and features of interactive image generation are disclosed herein. One or more sets of images of a scene are captured in an imaging studio. The captured one or more sets of images of the scene are processed using one or more machine learning based networks to generate an interactive image of the scene comprising a plurality of interactive features. One or more of the plurality of interactive features of the generated interactive image may be modified or edited according to user preferences.

CAMERA TRACKING VIA DYNAMIC PERSPECTIVES
20230117368 · 2023-04-20

A computer system may identify a first position of a physical camera corresponding to a first time period. The computer system may render a first virtual scene for the first time period. The system may project the first scene onto a display surface to determine a first rendered image for the first time period. The computer system may receive a first camera image of the display surface from the camera during the first time period. The system may determine a first corrected position of the camera by comparing the first rendered image to the first camera image. The system may predict a second position of the camera corresponding to a second time period. The computer system may render a second virtual scene for the second time period. The system may project the second virtual scene onto the display surface to determine a second rendered image for the second time period.
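The predict/correct cycle described above can be illustrated with a minimal one-dimensional sketch. Comparing the rendered image to the captured camera image is abstracted here into a measured position, and the constant-velocity prediction and blending gain are assumptions for illustration, not the system's actual method.

```python
def correct(predicted: float, measured: float, gain: float = 0.5) -> float:
    """Correct a predicted camera position toward the position implied by
    comparing the rendered image with the captured camera image."""
    return predicted + gain * (measured - predicted)

def predict(pos: float, prev_pos: float) -> float:
    """Extrapolate the camera position to the next time period
    (constant-velocity assumption)."""
    return pos + (pos - prev_pos)

prev_pos, pos = 0.0, 1.0           # identified positions from earlier periods
pos = correct(pos, measured=1.2)   # corrected position for the first period
next_pos = predict(pos, prev_pos)  # predicted position for the second period
```

The corrected position feeds the next render, so rendering, projection, and capture stay synchronized with the camera's motion.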

System and method for rendering free viewpoint video for studio applications

Systems and methods for foreground/background separation and for studio production of a FVV. A method includes projecting, onto objects in a filming area within a studio, a predefined pattern including a large set of features; generating, based on signals reflected off of the objects and captured by a plurality of depth cameras deployed in proximity to the filming area, a local point cloud for each depth camera; separating, based on the local point clouds, the foreground of the filming area from the background; creating, based on the local point clouds, a unified point cloud; meshing points in the unified point cloud to generate a 3D model of the objects; texturing the 3D model based on the separation and images captured by the depth cameras; and rendering the textured 3D model as a FVV including a series of video frames with respect to at least one viewpoint.
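The unified-point-cloud step can be sketched as transforming each camera's local cloud into a common world frame and concatenating. The 4x4 extrinsic matrices are assumed known from camera calibration; meshing and texturing are out of scope for this sketch.

```python
import numpy as np

def unify(clouds, extrinsics):
    """Transform each local Nx3 point cloud into world coordinates using
    its camera's 4x4 extrinsic matrix, then concatenate into one cloud."""
    world = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # Nx4 homogeneous
        world.append((homo @ T.T)[:, :3])                 # apply transform
    return np.vstack(world)
```

The resulting unified cloud is what the subsequent meshing stage would consume to build the 3D model.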

MODULAR UTILITY SYSTEM
20230065062 · 2023-03-02

A kit of parts for use in demountably configuring a variety of structural assemblies. A kit may comprise a plurality of cylindrical structural components configured for engagement with various types of male connector components along the sides or at the ends of the cylindrical structural components, each provided for demountable engagement of a female-end socket component at the end of another cylindrical structural component. The kit may additionally comprise components for one or more roller carriage assemblies and/or one or more sliding carriage assemblies configured for rolling or sliding movement along rail assemblies comprising a plurality of cylindrical, square, rectangular, or octagonal track components. The rail assemblies may be supported by bases comprising the cylindrical structural components and other components such as weight components, as well as threaded foot components or threaded caster components on which the rail assemblies can be adjustably levelled.

Multi Color LED Video Tile
20230067712 · 2023-03-02

An LED wall system with a camera for imaging an area illuminated by the LED wall system. The LED wall has a plurality of emitting pixels, each formed of at least one LED; the emitting pixels include primary-color pixels and at least one secondary-color pixel of a color other than the primary colors.

System and method for rendering free viewpoint video for sport applications

Methods and systems for generating free viewpoint videos (FVVs) based on images captured in a sports arena. A method includes projecting, onto objects within a filming area within the sports arena, a predefined pattern including a large set of features; generating, based on signals captured by each of a plurality of depth cameras, a point cloud for each depth camera, wherein the plurality of depth cameras is deployed in proximity to the filming area, wherein the captured signals are reflected off of the objects within the filming area; creating, based on the plurality of point clouds, a unified point cloud; meshing points in the unified point cloud to generate a three-dimensional (3D) model of the objects; texturing the 3D model based on images captured by the plurality of depth cameras; and rendering the textured 3D model as a FVV including a series of video frames with respect to a viewpoint.

Image processing device, imaging device, image processing method, and recording medium
11663710 · 2023-05-30

The image processing device includes an imaging unit that images a subject and a distance map acquisition unit that acquires information regarding a distance distribution of the subject as map data. The distance map acquisition unit acquires map data with an image deviation amount or a defocus amount related to a captured image, or distance map data in conformity with a TOF scheme or an imaging plane phase difference detection scheme using a pupil division type image sensor. An image processing unit generates data of a texture image in which a low-frequency component of a captured image is suppressed and combines the data of the texture image with the map data acquired by the distance map acquisition unit to generate image data in which a distance distribution of the subject is expressed.
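The texture-generation step (suppressing the low-frequency component) and its combination with the distance map can be sketched as follows. The box-blur high-pass and the fixed blending weight are illustrative choices, not the device's actual filters.

```python
import numpy as np

def texture_image(img: np.ndarray, k: int = 3) -> np.ndarray:
    """High-pass filter: subtract a kxk box blur from the captured image,
    suppressing its low-frequency component (edges handled by padding)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    blur = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blur += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return img - blur / (k * k)

def combine(texture: np.ndarray, depth_map: np.ndarray, alpha: float = 0.5):
    """Blend the high-frequency texture with the normalized distance map
    so the output expresses the subject's distance distribution."""
    d = (depth_map - depth_map.min()) / (np.ptp(depth_map) + 1e-9)
    return alpha * texture + (1 - alpha) * d
```

A perfectly flat image yields an all-zero texture, so the combined output then reflects only the distance distribution.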

Background display system

A background display system for a virtual image recording studio comprises a background display device which is configured to display, behind or above a real subject, a representation of a virtual background for a recording by means of an associated camera, and a control device which is configured to control the background display device. The control device comprises a data input for receiving lens data from the associated camera and is configured to adjust the representation of the virtual background in dependence on the received lens data.
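One plausible use of the received lens data is to recompute the camera's field of view and rescale the displayed virtual background accordingly. The pinhole model and the default full-frame sensor width below are assumptions for illustration, not the system's specified behavior.

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Pinhole-model horizontal field of view for the received focal length;
    a longer focal length narrows the view, so the background must enlarge."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))
```

For example, an 18 mm lens on a 36 mm-wide sensor yields a 90-degree horizontal field of view, and the control device could scale the background region shown behind the subject in proportion.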