Patent classifications
H04N5/2228
DISTRIBUTED COMMAND EXECUTION IN MULTI-LOCATION STUDIO ENVIRONMENTS
A content production management system within a distributed studio environment includes a command interface module and a command queue management module. The command interface module is configured to render a user interface for a set of content production entities associated with a set of content production volumes within the distributed studio environment. The command queue management module, upon execution of software instructions, is configured to perform the operations of receiving, from the command interface module, a command targeting a target content production entity, assigning a synchronized execution time to the command, enqueueing the command into a command queue associated with the target content production entity according to the synchronized execution time, and enabling the target content production entity to execute the command from the command queue according to the synchronized execution time.
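The enqueue-by-synchronized-time behavior described above can be sketched as a per-entity priority queue ordered by execution time. This is a minimal illustration only; the class, method, and entity names are assumptions, not the claimed implementation.

```python
import heapq
import itertools

class CommandQueueManager:
    """Sketch of a command queue keyed by a synchronized execution time.
    Each target entity gets its own heap, ordered by that time."""

    def __init__(self):
        self._queues = {}              # entity id -> heap of (exec_time, seq, command)
        self._seq = itertools.count()  # tie-breaker for commands with equal times

    def enqueue(self, entity, command, exec_time):
        # Assign the synchronized execution time and enqueue in time order.
        heap = self._queues.setdefault(entity, [])
        heapq.heappush(heap, (exec_time, next(self._seq), command))

    def due_commands(self, entity, now):
        # Return the commands whose synchronized execution time has arrived,
        # in time order, removing them from the queue.
        heap = self._queues.get(entity, [])
        ready = []
        while heap and heap[0][0] <= now:
            _, _, cmd = heapq.heappop(heap)
            ready.append(cmd)
        return ready

mgr = CommandQueueManager()
mgr.enqueue("camera-1", "pan_left", exec_time=10.0)
mgr.enqueue("camera-1", "zoom_in", exec_time=5.0)
```

A target entity would poll `due_commands` against a shared clock, so commands enqueued out of order still execute at their synchronized times.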
Broadcast lighting system and method of use thereof
Embodiments of a live broadcast lighting system are disclosed. In one example embodiment, the live broadcast lighting system includes a light emitting apparatus, a control box being connected to the light emitting apparatus, and a device holder coupled to the control box. The device holder can be configured to releasably retain a video recording device. The control box can include an electronic control circuit configured to control rotation of the light emitting apparatus. The device holder can be configured to be rotatable independent of the rotation of the light emitting apparatus.
Crowdsourced Cinematic Universe Model
A storytelling system that simplifies the filmmaking process to enable end-users, including, but not limited to, audiences of movies, TV shows, and commercials, and product-narrative owners, including, but not limited to, studios, TV networks, filmmakers, and brands, to co-produce audio-visual stories as extensions of existing product-narratives. The system includes, but is not limited to, story prompts that direct end-users' filmmaking activities, a protocol to frame the creation of audio-visual stories within the product-narrative, a function for end-users to choose a relationship status between one another within the product-narrative, and a method to catalogue activity within the system, which can be delivered in varying formats. The system further includes, but is not limited to, a method for the product-narrative owner to organize an end-user's audio-visual story, a plurality of end-user audio-visual stories, the product-narrative, and/or any combination thereof, to articulate the expanded product-narrative that results from the system.
Systems and methods for tracking objects in a field of view
Systems and methods for tracking objects in a field of view are disclosed. In one embodiment, a method may include capturing, via a camera, a real-world object in the field of view; generating first object data associating the real-world object with a first position of the real-world object in a real-world environment at a first time; generating a virtual object representative of the real-world object, depicting the real-world object in the first position at the first time; generating second object data associating the real-world object with a second position of the real-world object in the real-world environment at a second time; determining a displacement value of the real-world object between the first position and the second position; and modifying the virtual object to include an indication that the real-world object has been displaced when the displacement value is greater than a threshold value.
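The displacement comparison above can be illustrated in a few lines: compute the distance between the stored and newly observed positions and flag the virtual object when it exceeds a threshold. The data structure, function names, and the 0.5-metre threshold are illustrative assumptions.

```python
import math
from dataclasses import dataclass

DISPLACEMENT_THRESHOLD = 0.5  # metres; illustrative value

@dataclass
class VirtualObject:
    name: str
    position: tuple        # last known (x, y) in the real-world environment
    displaced: bool = False

def update_tracking(obj, new_position, threshold=DISPLACEMENT_THRESHOLD):
    """Compare the object's stored position against a newly observed one;
    mark the virtual object as displaced when the displacement value
    exceeds the threshold. Returns the computed displacement."""
    dx = new_position[0] - obj.position[0]
    dy = new_position[1] - obj.position[1]
    displacement = math.hypot(dx, dy)
    if displacement > threshold:
        obj.displaced = True
    obj.position = new_position
    return displacement

prop = VirtualObject("chair", (0.0, 0.0))
d1 = update_tracking(prop, (0.3, 0.4))   # displacement exactly 0.5, not above threshold
still_in_place = prop.displaced
d2 = update_tracking(prop, (1.0, 1.0))   # displacement ~0.92, above threshold
```

Note the strict comparison: a displacement equal to the threshold does not trigger the indication, matching "greater than a threshold value".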
Imaging apparatus and imaging method
An imaging apparatus and an imaging method that further facilitate recording of a video within an extraction range while moving the extraction range within an angle of view are provided. An imaging apparatus according to an aspect of the present invention includes an image sensor that captures a reference video which is a motion picture, a housing that accommodates the image sensor, a detection portion for detecting a motion of the housing, and a processor. The processor is configured to execute setting processing of setting an extraction range smaller than an angle of view within the angle of view in a case of capturing the reference video, extraction processing of extracting an extraction video within the extraction range from the reference video, movement processing of moving the extraction range within the angle of view over time in accordance with the motion detected by the detection portion, and recording processing of recording the extraction video during movement of the extraction range in the movement processing on a recording medium.
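The movement processing described above, shifting a smaller extraction range inside the full angle of view according to detected motion, can be sketched as a clamped crop-window update. Frame sizes, the linear motion-to-pixel mapping, and all names are assumptions for illustration.

```python
# Full angle of view and extraction range, in pixels (illustrative values).
FRAME_W, FRAME_H = 1920, 1080
CROP_W, CROP_H = 1280, 720    # extraction range, smaller than the angle of view

def clamp(v, lo, hi):
    return max(lo, min(v, hi))

def move_extraction_range(x, y, dx, dy):
    """Shift the crop origin by the detected housing motion while keeping
    the extraction range fully inside the angle of view."""
    x = clamp(x + dx, 0, FRAME_W - CROP_W)
    y = clamp(y + dy, 0, FRAME_H - CROP_H)
    return x, y

pos = (0, 0)
# Per-frame motion values as might be reported by the detection portion.
for motion in [(100, 50), (700, 400), (200, 100)]:
    pos = move_extraction_range(*pos, *motion)
```

The clamp keeps the recorded extraction video valid even when the detected motion would push the range past the sensor's edge.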
CONTROLLING CHARACTERISTICS OF LIGHT OUTPUT FROM LED WALLS
A computer-generated scene is generated as background for a live action set, for display on a panel of light emitting diodes (LEDs). Characteristics of light output by the LED panel are controlled such that the computer-generated scene rendered on the LED panel, when captured by a motion picture camera, has high fidelity to the original computer-generated scene. Consequently, the scene displayed on the screen more closely simulates the rendered scene from the viewpoint of the camera. Thus, a viewpoint captured by the camera appears more realistic and/or truer to the creative intent.
AUTOMATED COORDINATION IN MULTIMEDIA CONTENT PRODUCTION
Methods, apparatus and systems related to automated production of multimedia contents are described. In one example aspect, an automated production system includes a directing server configured to store production-stage information in a machine-readable script and manage production of a multimedia content according to the script. The system also includes a device management server configured to coordinate one or more shooting locations for the production of the multimedia content. The device management server is configured to receive the portion of the production-stage information extracted from the script that corresponds to its shooting location. The system further includes end devices connected to the device management server. The device management server is configured to track activities of the end devices and to provide the status of the end devices at a production time to the directing server, enabling the directing server to dynamically update the script for subsequent shooting activities at the production time.
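The coordination loop between the two servers can be sketched as follows: the device management server tracks end-device status and reports it, and the directing server updates the script for subsequent shooting activities based on that report. All class names, fields, and states are illustrative assumptions, not the patent's terminology.

```python
class DeviceManagementServer:
    """Tracks end-device activity and reports status at production time."""

    def __init__(self, devices):
        self.status = {d: "idle" for d in devices}

    def track(self, device, state):
        self.status[device] = state

    def report(self):
        return dict(self.status)

class DirectingServer:
    """Holds the machine-readable script and updates it from device status."""

    def __init__(self, script):
        self.script = script  # list of shooting activities

    def update_script(self, device_status):
        # Defer any activity whose assigned device is not ready;
        # schedule the rest for subsequent shooting.
        for activity in self.script:
            if device_status.get(activity["device"]) == "ready":
                activity["state"] = "scheduled"
            else:
                activity["state"] = "deferred"

dms = DeviceManagementServer(["camera-1", "mic-1"])
dms.track("camera-1", "ready")
director = DirectingServer([
    {"scene": 1, "device": "camera-1"},
    {"scene": 2, "device": "mic-1"},
])
director.update_script(dms.report())
```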
ANIMATION PRODUCTION METHOD
To enable animations to be shot in a virtual space, in an animation production method, a computer executes a step of placing a virtual camera to shoot a character in a virtual space; and a step of placing in the virtual space a grid representing the shooting range of the camera and a division line that divides the shooting range.
Camera signal monitoring apparatus and method
The present disclosure provides a camera signal monitoring apparatus and method. The camera signal monitoring apparatus includes a processor, which includes a vehicle information input unit for receiving a vehicle speed and a yaw rate signal of a vehicle; a camera information input unit for receiving a camera signal including a vehicle speed and a yaw rate signal from a vehicle front camera; and a monitoring unit for calculating a vehicle driving trajectory using the vehicle speed and the yaw rate signal input from the vehicle information input unit, calculating a reference curvature value based on the calculated vehicle driving trajectory, and determining reliability by comparing the calculated reference curvature value with a curvature value calculated from the camera signal input from the camera information input unit.
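One simple way to realize the comparison above uses the kinematic relation that trajectory curvature equals yaw rate divided by speed, then checks the camera-reported curvature against that reference within a tolerance. The formula choice, the tolerance value, and all names are assumptions for illustration, not the patented computation.

```python
def curvature_from_motion(speed_mps, yaw_rate_rps):
    """Reference curvature of the driving trajectory: kappa = yaw_rate / v.
    A guard avoids division by near-zero speed at standstill."""
    if abs(speed_mps) < 0.1:
        return 0.0
    return yaw_rate_rps / speed_mps

def camera_signal_reliable(vehicle_speed, vehicle_yaw_rate,
                           camera_curvature, tolerance=0.005):
    """Compare the reference curvature computed from vehicle sensors
    against the curvature derived from the front-camera signal."""
    reference = curvature_from_motion(vehicle_speed, vehicle_yaw_rate)
    return abs(reference - camera_curvature) <= tolerance

# At 20 m/s with a 0.1 rad/s yaw rate, the reference curvature is 0.005 1/m.
ok = camera_signal_reliable(20.0, 0.1, 0.0049)   # close to reference
bad = camera_signal_reliable(20.0, 0.1, 0.02)    # far from reference
```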
INFORMATION PROCESSING DEVICE
An information processing device includes an obtaining section configured to obtain position information of a plurality of parts of a photographing target, a deriving section configured to derive a length between two parts, and a production control section configured to perform production on the basis of the derived length.
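The obtain-derive-control pipeline above can be sketched as: look up the positions of two parts, derive the Euclidean length between them, and select a production effect based on that length. The part names, the threshold, and the effect are hypothetical examples.

```python
import math

def part_length(positions, part_a, part_b):
    """Derive the length between two tracked parts of the photographing
    target, given 3-D position information for each part."""
    return math.dist(positions[part_a], positions[part_b])

def production_effect(length, threshold=0.5):
    # Perform production on the basis of the derived length: trigger an
    # effect when the parts are far apart (threshold is an assumption).
    return "spark_effect" if length > threshold else None

# Hypothetical position information obtained for a photographing target.
pose = {"left_hand": (0.0, 1.0, 0.0), "right_hand": (0.8, 1.0, 0.0)}
length = part_length(pose, "left_hand", "right_hand")
effect = production_effect(length)
```

Here the derived hand-to-hand length of 0.8 exceeds the threshold, so the effect fires; a closed pose would yield no effect.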