Patent classifications
H04N2213/006
Truncated square pyramid geometry and frame packing structure for representing virtual reality video content
Techniques and systems are described for mapping 360-degree video data to a truncated square pyramid shape. A 360-degree video frame can include 360-degrees' worth of pixel data, and thus be spherical in shape. By mapping the spherical video data to the planes provided by a truncated square pyramid, the total size of the 360-degree video frame can be reduced. The planes of the truncated square pyramid can be oriented such that the base of the truncated square pyramid represents a front view and the top of the truncated square pyramid represents a back view. In this way, the front view can be captured at full resolution, the back view can be captured at reduced resolution, and the left, right, top, and bottom views can be captured at decreasing resolutions. Frame packing structures can also be defined for 360-degree video data that has been mapped to a truncated square pyramid shape.
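As a rough illustration of the pixel savings, the face areas of a truncated square pyramid can be summed and compared against a cube map of the same front-face resolution. The back-face scale and the packed side-face height below are illustrative assumptions, not the patent's actual packing:

```python
# Approximate pixel budget of a truncated-square-pyramid (TSP) frame.
# back_scale and side_height are illustrative assumptions; the patent
# defines its own frame packing structures.

def tsp_frame_pixels(front_px: int, back_scale: float = 0.5,
                     side_height: float = None) -> float:
    """Approximate pixel count of a TSP-packed 360-degree frame.

    front_px   -- side length of the full-resolution front face (base)
    back_scale -- linear scale of the reduced-resolution back face (top)
    """
    front = front_px * front_px                    # base: full resolution
    back = (front_px * back_scale) ** 2            # truncated top: reduced
    if side_height is None:
        side_height = front_px / 2                 # assumed packing height
    # Left, right, top, and bottom faces are trapezoids whose parallel
    # edges match the front and back face edges.
    side = 4 * 0.5 * (front_px + front_px * back_scale) * side_height
    return front + back + side


ratio = tsp_frame_pixels(1024) / (6 * 1024 * 1024)  # vs. a 6-face cube map
print(f"TSP frame is {ratio:.1%} of a cube map")
```

Under these assumptions the TSP frame needs under half the pixels of an equivalent cube map, which is the size reduction the abstract describes.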
Methods and system for simulated 3D videoconferencing
A system and method for manipulating images in a videoconferencing session provides users with a 3-D-like view of one or more presented sites, without the need for 3-D equipment. A plurality of cameras may record a room at a transmitting endpoint, and the receiving endpoint may select one of the received video streams based upon a point of view of a conferee at the receiving endpoint. The conferee at the receiving endpoint will thus experience a 3-D-like view of the presented site.
Adaptive streaming of an immersive video scene
A client configured to retrieve a video data representation of an immersive video scene streamed by a server using a streaming protocol, wherein the server is configured to provide a plurality of streams to the client, each stream comprising a portion of the immersive video scene. The client comprises: a sending interface; a reception interface; a viewing direction receiving unit; and a stream selecting unit. The sending interface is configured to transmit a streaming request for streaming one or more selected streams as the video data representation of the immersive video scene, and receives from the stream selecting unit selected stream information identifying the one or more selected streams, wherein the selected stream information is created by the stream selecting unit based on the viewing direction of the user of the client and based on the manifest.
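The selection step can be sketched as choosing, from a manifest, the stream whose advertised center direction best aligns with the user's current viewing direction. The manifest layout, field names, and the `select_stream` helper are all illustrative assumptions; the claim does not fix a manifest format:

```python
import math

# Hypothetical manifest: each stream covers a portion of the scene and
# advertises a center viewing direction (yaw, pitch in degrees).
MANIFEST = [
    {"id": "front", "yaw": 0.0, "pitch": 0.0},
    {"id": "left", "yaw": -90.0, "pitch": 0.0},
    {"id": "right", "yaw": 90.0, "pitch": 0.0},
    {"id": "back", "yaw": 180.0, "pitch": 0.0},
]

def to_unit(yaw_deg: float, pitch_deg: float):
    """Convert yaw/pitch angles to a unit direction vector."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

def select_stream(view_yaw: float, view_pitch: float, manifest=MANIFEST):
    """Return the manifest entry best aligned with the viewing direction."""
    v = to_unit(view_yaw, view_pitch)
    def alignment(entry):
        c = to_unit(entry["yaw"], entry["pitch"])
        return sum(a * b for a, b in zip(v, c))  # cosine of the angle
    return max(manifest, key=alignment)
```

The client would then place the chosen stream's identifier in the streaming request sent through the sending interface.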
Method of estimating parameter of three-dimensional (3D) display device and 3D display device using the method
A method of estimating a parameter of a three-dimensional (3D) display device includes estimating a transformation parameter between a camera and a display based on an image displayed on the display and an image reflected by a reflector.
Automated 3D photo booth
This fully-automated 3D photo booth uses a single- or dual-lens camera or other depth sensors. Depth algorithms process the pictures taken, which can be inserted into themed templates. A novel method of coarse segmentation separates the foreground and background. The photo booth dispenses an interlaced print and snap-in photo frame, or gives the option of viewing on a mobile device or sending via email or social media. On a mobile device, the 3D viewing uses a snap-on or adhesive optical overlay if available, with novel lenticules that solve Moiré artifacts. In the absence of an optical overlay, tilt sensors move the viewpoint of the 3D scene model to look around objects in the foreground, as if real objects were being tilted back and forth.
Image processing system with three-dimensional viewing and method of operation thereof
A system and method of operation of an image processing system includes: a get original image module for receiving an original image; a viewer detection module, coupled to the get original image module, for receiving a first position and a second position of a viewer; a crop image module, coupled to the viewer detection module, for calculating a cropping offset for the original image based on the first position and the second position, and for calculating a cropped image by cropping the original image by the cropping offset; and a display image module, coupled to the crop image module, for displaying the cropped image on a display unit.
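A minimal sketch of the cropping-offset idea, assuming the offset is simply proportional to the viewer's movement between the first and second positions; the gain value and the clamping logic are assumptions, since the abstract does not publish a formula:

```python
def crop_offset(first_pos, second_pos, gain: float = 0.5):
    """Cropping offset proportional to viewer movement (illustrative gain)."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return (-gain * dx, -gain * dy)   # crop window shifts against head motion

def crop_corner(image_w, image_h, crop_w, crop_h, offset):
    """Top-left corner of the crop window, clamped inside the original image."""
    cx = (image_w - crop_w) / 2 + offset[0]
    cy = (image_h - crop_h) / 2 + offset[1]
    cx = max(0, min(image_w - crop_w, cx))
    cy = max(0, min(image_h - crop_h, cy))
    return (cx, cy)
```

Shifting the crop window against the viewer's motion makes the displayed content pan as the viewer moves, which is what produces the three-dimensional viewing effect.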
Perspective altering display system
The perception of a displayed image is altered for viewers moving relative to the position of the display system screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed. A display generator generates a scene having foreground and background elements, and a display screen displaying the scene. A sensor detects the position of a viewer relative to the display screen, and a processor is operative to shift the relative position of the foreground and background elements in the displayed scene as a function of viewer position, such that the viewer's perspective of the scene changes as the viewer moves relative to the display screen. The foreground and background elements may be presented in the form of multiple superimposed graphics planes, and/or a camera may be used to record the scene through panning at sequential angles. The system may be used to implement virtual windows, virtual mirrors and other effects without requiring viewers or users to modify behavior or wear glasses, beacons, etc.
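The per-plane shift can be sketched with the standard motion-parallax relation, in which nearer planes shift more than distant ones as the viewer moves; the formula and parameter names are assumptions for illustration, not taken from the patent:

```python
def plane_shift(viewer_x: float, depth: float,
                screen_distance: float = 1.0) -> float:
    """Horizontal shift of a graphics plane at a given virtual depth.

    viewer_x        -- viewer's lateral offset from the screen center
    depth           -- virtual depth of the plane behind the screen
    screen_distance -- viewer-to-screen distance (assumed unit value)

    Planes at small depth (foreground) shift almost one-to-one with the
    viewer; planes at large depth (background) barely move, producing
    motion parallax without glasses or beacons.
    """
    return -viewer_x * screen_distance / (screen_distance + depth)
```

Rendering each superimposed graphics plane with its own `plane_shift` as the sensor reports new viewer positions yields the virtual-window effect the abstract describes.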
Simulated Fluoroscopy Images with 3D Context
A computer system that provides a simulated 2D fluoroscopy image is described. During operation, the computer system may generate the simulated 2D fluoroscopy image based on data in a predetermined 3D image associated with an individual's body. For example, generating the simulated 2D fluoroscopy image may involve a forward projection. Moreover, the forward projection may involve calculating accumulated absorption corresponding to density along X-ray lines through pixels in the predetermined 3D image. Then, the computer system may provide the simulated 2D fluoroscopy image with a 3D context associated with a predefined cut plane in the individual's body. Note that the providing may involve displaying, on a display, the simulated 2D fluoroscopy image with the 3D context.
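The accumulated-absorption forward projection can be sketched as a parallel-beam ray sum along one axis of a 3D density volume. A real fluoroscopy simulator would use the X-ray source geometry and Beer-Lambert attenuation, so this shows only the core accumulation step:

```python
def forward_project(volume):
    """Parallel-beam forward projection of a 3D density volume.

    volume -- nested list indexed [z][y][x]; each X-ray line runs along z.
    Returns a 2D image [y][x] of accumulated density along each line.
    """
    depth = len(volume)
    rows, cols = len(volume[0]), len(volume[0][0])
    return [[sum(volume[z][y][x] for z in range(depth))
             for x in range(cols)]
            for y in range(rows)]
```

Each output pixel is the sum of densities along one X-ray line, which is the accumulated absorption the abstract refers to.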
Three-dimensional image display device
A three-dimensional image display device includes: an image display screen 10; a motion parallax amount measurement unit 12 which measures a to-display-screen parallax angle φ1 and a to-display-screen separation distance Le with respect to the screen 10; and a display unit 8 which selects a two-dimensional image of the object to be observed that has rotated by an angle corresponding to the parallax angle φ1, and transmits the two-dimensional image to the screen 10. Each of the plurality of two-dimensional images is associated with three-dimensional information; the three-dimensional information includes at least the rotational angle θ and a virtual separation distance Lo between the object to be observed and the image display screen; and the rotational angle θ of the object to be observed is determined on the basis of the following formula:
θ = φ1 · Le / (Lo + Le).
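The formula scales the viewer's parallax angle by Le/(Lo + Le), so the displayed object rotates as if it sat a distance Lo behind the screen. A small numeric sketch, writing φ1 for the parallax angle and θ for the rotational angle:

```python
def rotation_angle(phi1: float, Le: float, Lo: float) -> float:
    """Rotational angle of the displayed object: theta = phi1 * Le / (Lo + Le).

    phi1 -- viewer's parallax angle with respect to the display screen
    Le   -- viewer-to-screen separation distance
    Lo   -- virtual separation distance between object and screen
    """
    return phi1 * Le / (Lo + Le)


# Object placed at the same virtual depth as the viewing distance:
# the object rotates by half the parallax angle.
print(rotation_angle(10.0, Le=1.0, Lo=1.0))
```

As Lo approaches zero the object rotates with the full parallax angle (it appears to lie in the screen plane); as Lo grows the rotation shrinks, mimicking a distant object.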