Camera control unit with stereoscopic video recording and archive
10104331 · 2018-10-16
Assignee
Inventors
- Marc R. Amling (Santa Barbara, CA)
- Timothy King (Goleta, CA, US)
- Grant deGoede (Solvang, CA, US)
- Lisa Blake (Santa Barbara, CA, US)
CPC classification
H04N13/239
ELECTRICITY
A61B1/04
HUMAN NECESSITIES
A61B1/0005
HUMAN NECESSITIES
International classification
A61B1/00
HUMAN NECESSITIES
H04N9/804
ELECTRICITY
A61B1/04
HUMAN NECESSITIES
Abstract
A video imaging system, and more particularly, a modular video imaging system having a control module connectable to multiple input modules. The input modules are each capable of receiving differing types of image data from different types of cameras, including stereoscopic camera configurations, and of processing the image data into a format recognizable by the control module. The control unit provides general functions, such as a user interface and general image processing that is not camera specific, including the generation of a 3D image stream formed by combining two image streams.
Claims
1. A video imaging system comprising: an endoscope having a shaft; a camera control unit including an input module and a control module disposed external to the input module, the control module being coupled to the input module and the input module being coupled to said endoscope; a first imager positioned on or within the endoscope, the first imager generating a first image stream transmitted to said input module, said first image stream comprising first lines of data from a first sensor; a second imager positioned on or within the endoscope, the second imager generating a second image stream transmitted to said input module, said second image stream comprising second lines of data from a second sensor; the input module supporting functions of the first imager and the second imager and being configured to process the first image stream and the second image stream into a first processed image stream and a second processed image stream; and a processor located in said control module for processing the first processed image stream and the second processed image stream into a 3D image stream, the 3D image stream being transmitted to a display to be presented on said display; wherein the 3D image stream comprises alternating lines of data taken from the first processed image stream and the second processed image stream; wherein, upon connection of the input module to the control module, said control module is configured to communicate to the input module a plurality of types of standard processed image data the control module is compatible with, and said input module is configured to communicate to said control module a plurality of types of standard processed image data the input module is configured to transmit; and wherein said control module and said input module are configured to, by said communicating, settle on a type of standard processed image data to transmit.
2. The video imaging system of claim 1 wherein the 3D image stream is saved in the storage of the camera control unit.
3. The video imaging system of claim 1 wherein the 3D image stream is saved on a removable storage.
4. The video imaging system of claim 1 wherein the 3D image stream is encoded.
5. The video imaging system of claim 4 wherein the 3D image stream is compressed prior to being saved in the storage.
6. The video imaging system of claim 4 wherein the encoding is Top and Bottom (TaB) frame compatible format.
7. The video imaging system of claim 4 wherein the encoding is Side-by-Side (SbS) frame compatible format or Alternating Line-by-Line (LbL) frame compatible format.
8. The video imaging system of claim 4 wherein the encoding is MPEG-2 video coding or AVC/H.264 video coding.
9. The video imaging system of claim 1 wherein the first and second imagers are positioned at a distal end of said shaft.
10. The video imaging system of claim 1 further comprising a light source generating illuminating light.
11. The video imaging system of claim 1 wherein said camera control unit further comprises a network connection.
12. The video imaging system of claim 11 further comprising a remote storage coupled to said network connection and the 3D image stream is stored on said remote storage.
13. The video imaging system of claim 11 further comprising a remote computer and the first image stream and the second image stream are accessible by said remote computer via said network connection.
14. The video imaging system of claim 1 wherein the 3D image stream is processed from the first image stream and the second image stream that are saved in the storage.
15. A method for generating a 3D image comprising the steps of: generating a first image stream with a first imager positioned on or within an endoscope; generating a second image stream with a second imager positioned on or within the endoscope; transmitting the first and the second image streams to an input module coupled to the endoscope; processing, via the input module, the first image stream and the second image stream into a first processed image stream and a second processed image stream; transmitting the first and second processed image streams to a control module, the control module being disposed external to the input module and coupled to the input module; processing the first processed image stream and the second processed image stream into a 3D image stream, the 3D image stream comprising alternating lines of data taken from the first processed image stream and the second processed image stream; and transmitting the 3D image stream to a display coupled to the control module for displaying the 3D image stream; wherein, upon connection of the input module to the control module, said control module is configured to communicate to the input module a plurality of types of standard processed image data the control module is compatible with, and said input module is configured to communicate to said control module a plurality of types of standard processed image data the input module is configured to transmit; and wherein said control module and said input module are configured to, by said communicating, settle on a type of standard processed image data to transmit.
16. The method of claim 15 further comprising the step of storing the 3D image stream in the storage.
17. The method of claim 16 further comprising the steps of transmitting the 3D image stream to a remote computer via a network connection and storing the 3D image stream on a remote storage.
18. The method of claim 15 further comprising the step of storing the 3D image stream in a removable storage.
19. The method of claim 15 further comprising the step of encoding the 3D image stream.
20. The method of claim 15 further comprising the step of compressing the 3D image stream.
21. A video imaging system comprising: an endoscope having a shaft; a camera control unit including an input module and a control module disposed external to the input module, the control module being coupled to the input module and the input module being coupled to said endoscope; a first imager positioned on or within the endoscope, the first imager generating a first image stream transmitted to said input module, said first image stream comprising first lines of data from a first sensor; a second imager positioned on or within the endoscope, the second imager generating a second image stream transmitted to said input module, said second image stream comprising second lines of data from a second sensor; the input module transmitting an input module identifier to the control module, the input module being configured to process the first image stream and the second image stream based on a command to generate a first processed image stream and a second processed image stream; and the control module being configured to determine the command based on the input module identifier and user input and transmit the command to the input module, the control module having a processor for processing the first processed image stream and the second processed image stream into a 3D image stream, the 3D image stream being transmitted to a display to be presented on said display; wherein the 3D image stream is encoded, wherein the encoding is selected from a group consisting of Top and Bottom (TaB) frame compatible format, Side-by-Side (SbS) frame compatible format and Alternating Line-by-Line (LbL) frame compatible format.
22. A modular video imaging system comprising: a first image stream; a second image stream; a camera control unit, the camera control unit comprising: an input module configured to process the first image stream and the second image stream into a first processed image stream and a second processed image stream; a control module disposed external to the input module and coupled to the input module, the control module having a processor; and a storage located within said camera control unit; wherein the first processed image stream is transmitted to said control module and stored in the storage; wherein the second processed image stream is transmitted to said control module and stored in the storage; wherein the processor interleaves the first processed image stream and the second processed image stream into a 3D image stream, the 3D image stream comprising alternating lines of data taken from the first processed image stream and the second processed image stream, and wherein the 3D image stream is transmitted to a display; wherein, upon connection of the input module to the control module, said control module is configured to communicate to the input module a plurality of types of standard processed image data the control module is compatible with, and said input module is configured to communicate to said control module a plurality of types of standard processed image data the input module is configured to transmit; and wherein said control module and said input module are configured to, by said communicating, settle on a type of standard processed image data to transmit.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(6) Accordingly, the invention involves a modular medical imaging system including several modules, such as an input module and a control module, which can be developed, sold and installed at different times. For example, a system may be initially installed with a control module and several input modules, and later additional modules can be added to the system.
(7) According to one embodiment, the inventive video imaging system allows later-developed modules, incorporating various technologies and industry-standard interfaces as they evolve, to be incorporated into an endoscopic system. By having modularity between the control module and the input module, manufacturers can avoid having to redesign an entirely new system for newer technologies, and end users can avoid purchasing entirely new systems. The inventive video imaging system provides the ability to accommodate future imaging system improvements and adaptations, as current technology limitations are overcome, by adding new input modules that are forward and backward compatible with the control module, without obsolescing initial customer investments in control modules. The system also provides the ability for a user to add a new control module to accommodate future improvements, which is forward and backward compatible with older input modules. This allows system users to take advantage of new features and functions of one module without requiring redesign and/or replacement of the entire system.
(8) For example, industry standards in display and recording infrastructure technologies evolve at a different rate than, say, video endoscope technology, imaging technology, or proximal camera head technology. Newer technologies often use differing imaging data and parameters, such as aspect ratio, timing, pixel rate, pixel resolution, and pixel encoding. By having an input module connected to a control module, where the input module is forward and backward compatible with the control module, new camera technologies may be provided to replace outdated camera technologies, while still being compatible with older control modules.
(9) Thus, a user can replace existing control modules with newer control modules that allow for a display having higher resolution, more color bit depth or being 3D compatible. Similarly, a user can replace an existing input module, which only supports a limited number of camera heads, without replacing the control module or the display.
(10) Such a system provides a competitive advantage by being able to provide newer technologies faster and affords users the benefit of the backwards and forwards compatibility between the control modules and input modules.
(11) The modular imaging system allows upgradeability and compatibility with a multitude of camera heads that are supported by a plurality of input modules, where the camera heads and input modules may be existing or yet to be developed. Formerly, when a new imaging technology became available, a CCU would be incompatible with the new technology due to a variety of constraints, for example, outdated hardware. By using a modular architecture, the new technology is supported by a new input module that is backward compatible with the existing control module. The modular architecture increases the likelihood that existing visualization technology and yet-to-be-developed visualization technology will be able to operate with some, if not all, of the same image processing hardware. This results in decreased capital costs for physicians' offices, surgical offices and/or hospitals.
(12) In various embodiments of the invention, the control module may be designed to accommodate general image processing and display functions. These general functions include, for example, supporting a separate user interface, overlaying a user interface onto an image, image capture and streaming functionality as well as input/output functionality for the display/monitor interfaces, system interface and control, and network connectivity. The control module may be designed to accommodate a single input module or multiple input modules. The control module may be connected to a display or the control module may include a display as a one piece unit. The control module may include a processor as well.
(13) For example, a user may only wish to purchase a control module and only one input module at a time. Thus, the overall modular system can be purchased at a lower initial cost. If the consumer wishes to purchase a new camera type, the modular system may be upgraded with a new input module to support the new imaging technology. The new input module may replace the old input module or be used together with the older input module.
(14) The input modules may support functions required for a group or family of image sources, such as cameras or auxiliary inputs. The input module may provide compatibility between the family of image sources and the control module. Over the life of the system, additional input modules may be purchased to support emerging imaging technology such as 3D imaging, advanced fluorescence imaging, solid-state variable direction of view endoscopes, wireless camera heads and so on.
(15) The group of input modules connected to the control module may include an auxiliary input module. The auxiliary input module may support a variety of video sources such as third party camera control units, C-Arm, X-Ray, Ultrasound, Personal Computers and the like. Supported input formats may include DVI, VGA, S-Video, Composite, 3G-SDI and the like. Inputs may be both automatically and manually selected. The auxiliary module may provide increased backward compatibility, forward compatibility and third party image source compatibility.
(16) It should be noted that as used herein, the categorization of Standard Definition (SD) or High Definition (HD) is not intended to limit the categories to a single signal format; rather, many differing signal formats may be used. Many different signal formats are categorized as SD, and many different signal formats may be categorized as HD. For instance, SD generally refers to a line count of up to approximately 720×480 (NTSC and PAL), while HD refers to systems that utilize a higher line count and may include, but is not limited to, 1280×720 progressive, 1920×1080 interlaced, or 1920×1080 progressive, which are only three of the commonly used HD resolutions. HD resolution also includes 1080p, or Full HD, resolution.
(17) In various embodiments, the modules are capable of sending digital video in the form of HD and SD video over the cable from module to module at fully run-time programmable image sizes, color spaces, bit-depths and frame-rates. The receiving and transmitting ends of the video signals can auto-negotiate these various parameters.
(18) There are commonly used types of signal formats, however, and it is contemplated that additional formats may be provided for; especially new signal formats that may become available. Two commonly used SD format types are NTSC and PAL. It should be noted that these are just two video signal formats and that there are many differing types and modifications to the above-listed types including, for example, a modified version Phase-Alternating Line (PAL-M).
(19) In addition to the standard NTSC and PAL SD (NTSC and PAL) composite, RGB, and s-video (Y/C) outputs, numerous other outputs may be used. The following examples are presented to further illustrate and explain the present invention and should not be taken as limiting in any regard.
(20) Serial Digital Interface (SDI), standardized in ITU-R BT.656 and SMPTE 259M, is a digital video interface used for broadcast-grade video. A related standard, known as High Definition Serial Digital Interface (HD-SDI), is standardized in SMPTE 292M and provides a nominal data rate of 1.485 Gbit/s. Another standard is 3G-SDI, which provides nominal data rates of 2.970 Gbit/s and 2.970/1.001 Gbit/s and works with 1080p resolution.
(21) Digital Visual Interface (DVI) is a video interface standard designed to maximize the visual quality of digital display devices such as flat panel LCD computer displays and digital projectors and is partially compatible with the HDMI standard in digital mode (DVI-D). The DVI interface uses a digital protocol in which the desired illumination of pixels is transmitted as binary data. When the display is driven at its native resolution, it will read each number and apply that brightness to the appropriate pixel. In this way, each pixel in the output buffer of the source device corresponds directly to one pixel in the display device.
(22) High-Definition Multimedia Interface (HDMI) is an all-digital audio/visual interface capable of transmitting uncompressed streams. HDMI is compatible with High-bandwidth Digital Content Protection (HDCP) Digital Rights Management technology. HDMI provides an interface between any compatible digital audio/video source and a compatible digital audio and/or video monitor, such as a digital television (DTV).
(23) The modular architecture of the present system allows buyers to progressively and economically upgrade their imaging technology, rather than being required to purchase a CCU that is compatible with the entire range of imagers that the buyer would wish to purchase in the future. The system allows for hardware upgrades through the modules as well as software feature upgrades. Further, the cost of ownership and upgrade, such as acquisition, back-up, and maintenance, is reduced.
(24) Referring now to the drawings, wherein like reference numerals designate corresponding structure throughout the views.
(25)
(26) Internal portions of input modules 2200, 2300 and 2400 are also shown in
(27) Input modules 2200, 2300, and 2400 may be configured to receive and process numerous types of image data 2204. Image data 2204 may include analog data, such as from CCD-based video endoscopes (1/10 CCDs) (pre-CDS analog), CMOS sensors, and/or 720p60 single-chip digital proximal heads (for smaller camera heads requiring less than 1080p resolution but better than Standard Definition (SD)). Image data 2204 may also be analog High Definition (HD) image data, such as from 3-chip HD CCD camera heads, or digital HD image data, such as from 1080p60 3-chip camera heads (CMOS) or 1080p60 1-chip camera heads (CMOS). Finally, image data 2204 may also come from advanced fluorescence imaging, solid-state variable direction of view endoscopes, wireless camera heads and so on.
(28) The camera head 4000 is connected to input module 2300 by a cable 4500. Cable 4500 has a connector 4550 that connects into a slot such as shown in input module 2200 as slot 2250. Camera head 4000 may send image data 2204 to the input module through the cable 4500.
(29) Control module 2100 is shown having an on/off switch 2102, which, in certain embodiments, can control the power of all of the input modules 2200, 2300 and 2400. Control module 2100 is also shown having input slots or ports 2104 and 2105 as well as a white balance control switch 2103.
(30)
(31)
(32) Input modules 2200, 2300 and 2400 each have a slot 2201, 2301, 2401 respectively for receiving the cable 1000 which transfers information between the input modules and the control module 2100, such as processed image data 2500. Input module 2400 has various input and output elements 2430, 2440, 2450, 2460 and 2470 to connect to various other input and output devices. Such input/output devices may include existing or third-party CCUs, C-Arm, X-Ray, Ultrasound, and personal computers. Such inputs may also include DVI, VGA, S-Video, Composite, 3G-SDI. Other additional input and output elements may be envisioned for the various input modules 2200, 2300 and 2400.
(33)
(34) More specifically, in this system, camera(1) 230 and camera(1+N) 330 output different types of image data, image data(1) 240 and image data(1+N) 340 respectively. Therefore, input module(1) 200 receives image data(1) 240 and processes it into processed image data 140 to be sent to the control module 100. Camera(1+N) 330 is not compatible with input module(1) 200, so it is connected to input module(1+N) 300, which supports image data(1+N) 340. Input module(1+N) 300 receives image data(1+N) 340 and processes it into processed image data 140 to be sent to the control module 100.
(35) It should be understood that input module 200, 300, 400 can be configured to receive multiple types of image data. Furthermore, image data may be for a single type of camera or a family of cameras. It should also be understood that the input modules may process the image data through hardware or software or some combination of hardware and software. For instance, input module(1) can implement a processor 210 running software 220 to process image data(1) 240 into processed image data 140. Similarly, input module(1+N) can implement a processor 310 running software 320 to process image data(1+N) 340 into processed image data 140.
(36) The system may also implement an auxiliary input module 400, which can support multiple auxiliary devices. In this case, Aux(1) 430 outputs Aux Data(1) 440 that is received by the auxiliary input module 400 and processed into processed image data 140. Aux(1+N) 450 outputs aux data(1+N) 460 that is received by the auxiliary input module 400 and processed into processed image data 140. It should also be understood that the auxiliary input module 400 may process the image data through hardware or software or some combination of hardware and software. In one embodiment, auxiliary input module 400 can implement a processor 410 running software 420 to process image data 440, 460 into processed image data 140.
(37) It should be understood that the terms input module and auxiliary input module can be used interchangeably, as the purpose of the input/auxiliary modules is to process differing types of image data into a standard format for the control module 100. It should also be understood that while
(38) Control module 100 receives processed image data 140 from either all or some of the input modules 200, 300, 400 and can carry out general image processing, user interface and connect with various outputs. For instance, the control module 100 can connect to a touch screen display which provides a user interface through which to control the module. The control module can further process the processed image data 140 and transmit the process/manipulated image data 150 to various places, such as displays 500, 510, outputs 520, 530, PCs, LANs, Storage devices, and printers, etc. The process/manipulated data 150 can be any combination of processed and/or manipulated data. Manipulation to the data can include overlaying a graphical user interface (GUI) on an image, zooming in on an image, and picture-in-picture of multiple sources including from other input modules. Manipulation to the data may also include image rotation, perspective correction, cropping, pan and scan, tilt and mirror in the horizontal and the vertical direction, and correcting for endoscope artifacts.
(39) The control module 100 may also be configured to provide artificial horizon support, wide-angle lens support, adaptive camera perspective to surgeon perspective, and intelligent image pan/scan controlled via surgeon movement.
(40) It should be understood that the control module 100 may further process the image data 140 through hardware or software or some combination of hardware and software. For instance, control module 100 can implement a processor 110 running software 120 to further process the processed image data 140 into manipulated image data 150.
(41) In order to be backwards and forwards compatible, the control module 100 and input modules 200, 300, 400 may have to communicate what types of standard processed image data 140 they are compatible with. For instance, control module 100 may be compatible with several types of standard processed image data (e.g., HD or SD) and may have to communicate this compatibility to each input module 200, 300, 400; in turn, the input modules may have to communicate what types of standard processed image data 140 they are capable of transmitting. By communicating this information, the control module 100 and each input module 200, 300, 400 can settle on a type of standard processed image data 140 to communicate. Such functionality allows for the use of newer control modules with older input modules and newer input modules with older control modules. For instance, if an input module was made for a newer imaging technology (e.g., HD), the input module may be capable of transmitting processed image data in HD or SD formats, so that the new HD input module could function with an older SD control module. Likewise, if a user had a newer HD control module, the control module would be able to receive both HD and SD image data, such that the HD control module would be backwards compatible with SD input modules.
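This capability exchange can be sketched as follows. The format names and the preference-order selection rule are illustrative assumptions; the patent does not prescribe a particular selection policy.

```python
def negotiate_format(control_supported: list[str], module_offered: list[str]) -> str:
    """Settle on a standard processed-image-data type.

    Each side advertises the types it handles; the first entry in the
    control module's preference order that the input module can also
    transmit is chosen. Format labels here are hypothetical.
    """
    for fmt in control_supported:  # control module's preference order
        if fmt in module_offered:
            return fmt
    raise RuntimeError("no common standard processed-image-data type")

# A newer HD input module still works with an older SD-only control module:
assert negotiate_format(["SD"], ["HD", "SD"]) == "SD"
# A newer HD control module remains backward compatible with an SD input module:
assert negotiate_format(["HD", "SD"], ["SD"]) == "SD"
```

The intersection-plus-preference rule is one simple way to realize the "settle on a type" behavior recited in the claims; a real system would exchange these lists over the module interconnect during the connection handshake.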
(42) In certain embodiments, the control module 100 is connected to, for example, an Intranet, the Internet and/or the like. In certain embodiments, the input modules 200, 300, 400 and/or the control module 100 includes WI-FI and/or a way to receive information directly from the Internet, either wired or wirelessly. In certain embodiments, any of the input modules may wirelessly connect to a related camera.
(43) In certain embodiments, upon connection of control module 100 to, for example, input module 200, an input module identifier/program stored on input module 200 may be transmitted to the control module. It is contemplated that the input module identifier may comprise discrete data or may comprise a program that provides information relating to the input module 200 to the control module 100. In addition, it is contemplated that the control module 100 may also transmit a control module identifier/program stored on the control module 100 to the input module 200. It is contemplated that the control module identifier may comprise discrete data or may comprise a program that provides information relating to the control module 100 to the input module 200.
(44) In certain embodiments, the control module 100 may send commands to the input module 200, which may include, for example, adjusting color balance, light, focal distance, resolution, zoom, focus, shading, and other optical characteristics if the input is a camera video or video endoscope. Input module 200 may then generate and transmit processed image data 140 to control module 100.
(45) Referring now to
(46) The shaft 5004 of endoscope 5000 may comprise either a rigid or flexible shaft or may comprise a combination of the two (e.g., the proximal end of the shaft 5004 coupled to the housing 5020 may comprise a rigid portion, while the distal end may comprise a flexible portion). Also shown in
(47) The first and second image streams 5006, 5008 are further transmitted to the control unit 2000 for processing as indicated in
(48) Also shown at the distal end of the shaft 5004 is a light source 5014, which may comprise, for example, an LED that receives power via a line 5016. Illuminating light from the light source 5014 impinges on the area to be viewed and reflected light is then picked up by first and second imagers 5010, 5012 which generate corresponding first and second image streams 5006, 5008. Alternatively, it is contemplated that the light source 5014 may be positioned in the housing 5020 and the illuminating light is transmitted down the shaft via fiber optic cables to illuminate the area to be viewed. Still further, the light source could further be positioned in the control unit 2000 and the illuminating light could be transmitted from the control unit 2000 via fiber optic cables through the shaft 5004 to illuminate the area to be viewed. In the latter embodiment, the illuminating light could travel through the housing 5020 or a light cable extending from the control unit 2000 could couple directly to the shaft 5004 (e.g., a lateral connection). In other embodiments, an external light source (not shown) may be provided. In other embodiments, the external light source is directly coupled to shaft 5004 or to housing 5020.
(49) Line 5018 is provided to illustrate that various information and/or energy is transmitted between the endoscope 5000 and the control unit 2000. For example, various command and control information may be transmitted down line 5018, including identification information from the endoscope 5000 to the control unit 2000, allowing the control unit 2000 to configure itself to function with the particular type of endoscope attached. Likewise, various command data may be transmitted to the endoscope to facilitate the proper functioning of the endoscope 5000. Line 5018 may also be used to provide power to endoscope 5000, and electronic circuitry 5022 may include a battery (not shown) that charges up to provide uninterrupted power to endoscope 5000. Additionally, while various lines (5006, 5008, 5018) are shown in
(50) Turning now to the control unit 2000, it should be understood that any configuration as previously discussed herein is applicable to the control unit of
(51) Referring to
(52) The 3D image stream is a composite of the first and second image streams 5006, 5008. The image stream is composed of lines of data, where every other line of data from the first and second image streams 5006, 5008 is combined (interleaved) to generate the 3D image stream.
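A minimal sketch of this line interleaving, assuming two equally sized frames and taking even lines from the first stream and odd lines from the second (the patent does not fix which stream supplies which line parity):

```python
import numpy as np

def interleave_lbl(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Combine two frames into one 3D frame whose rows alternate
    between the first and second processed image streams."""
    if first.shape != second.shape:
        raise ValueError("both frames must have the same dimensions")
    frame = np.empty_like(first)
    frame[0::2] = first[0::2]    # even lines from the first stream
    frame[1::2] = second[1::2]   # odd lines from the second stream
    return frame
```

Each view contributes half its vertical resolution, so the interleaved frame has the same dimensions as either input, which is what makes the format "frame compatible."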
(53) In certain embodiments, the 3D image stream is formatted in a line-by-line interleave format using AVC/H.264 Video coding.
(54) There are various forms of multiplexing that can be used to generate the 3D image stream such as Top and Bottom (TaB) frame compatible format multiplexing. TaB formatting must be used with progressive (720p and 1080p) HD video formats and may be used with MPEG-2 or with AVC/H.264 Video coding. TaB formatting must also be oriented with the Left-eye image on the top half of the frame and Right-eye image on the bottom half of the frame, without any inversion or mirroring. For 720p formats, the Left-eye image occupies lines 26 to 385, and the Right-eye image occupies lines 386 to 745, for example. For 1080p formats, the Left-eye image occupies lines 42 to 581, and the Right-eye image occupies lines 582 to 1121. TaB formatting is coded using any anti-aliased resizing algorithm that reduces resolution and alias components only in the vertical direction without specific line structure orientation between left and right views. This means that a simple 2-dimensional image processed in this way will produce exactly the same reduced image for the left and right views.
(55) In certain embodiments, such TaB frame multiplexing involves de-interleaving the 3D image frame and repacking it into a Top half-of-frame (Right Image) and Bottom half-of-frame (Left Image) before passing the image to a display. In certain embodiments, the de-interleaving of the 3D image is done before passing the image to standard H.264 encoders. In certain embodiments, this is done without using the Multiview Video Coding (MVC) features of H.264 encoders. In certain embodiments, the viewed playback of an H.264 recording is performed on a commercial 3D display that supports 3D Top/Bottom display modes.
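The repacking step described above might be sketched as follows; which line parity carries which eye's view is an assumption, since the patent does not specify it.

```python
import numpy as np

def repack_tab(interleaved: np.ndarray) -> np.ndarray:
    """De-interleave a line-by-line 3D frame and repack it
    Top-and-Bottom: right-eye lines in the top half of the
    frame, left-eye lines in the bottom half."""
    right = interleaved[0::2]  # assumed: even lines hold the right-eye view
    left = interleaved[1::2]   # assumed: odd lines hold the left-eye view
    return np.vstack([right, left])
```

Because the interleaved frame already holds each view at half vertical resolution, this repack is a pure row permutation; no further resizing is needed before handing the frame to a standard H.264 encoder.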
(56) In certain embodiments, the various forms of multiplexing involve data that is saved to storage as well as data that is sent to a display. Moreover, if the system has a high bandwidth, the data could be sent directly to a display in certain embodiments.
(57) Another form of multiplexing that can be used to generate the 3D image stream is the Side-by-Side (SbS) frame compatible format. SbS formatting may be used with interlaced or progressive HD video formats, such as 1080i or 1080p. SbS formatting may be used with MPEG-2 or with AVC/H.264 video coding. SbS formatting is oriented with the Left-eye image on the left half of the frame and the Right-eye image on the right half of the frame, without any inversion or mirroring. SbS formatting is also coded using any anti-aliased resizing algorithm that reduces resolution and alias components only in the horizontal direction, without specific column structure orientation between left and right views. This means that a simple 2-dimensional image processed in this way will produce exactly the same reduced image for the left and right views.
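The horizontal-only resize and SbS packing might look like the following sketch; the 2-tap box filter is a minimal stand-in for whatever anti-aliased resizing algorithm an implementation actually uses.

```python
import numpy as np

def pack_sbs(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack two full-width views into one Side-by-Side frame.

    Each view is halved in width with a 2-tap horizontal average
    (resolution reduced only in the horizontal direction), then
    placed left-eye on the left, right-eye on the right."""
    def half_width(img: np.ndarray) -> np.ndarray:
        # average adjacent horizontal pixel pairs to suppress aliasing
        pairs = img.astype(np.uint32)
        return ((pairs[:, 0::2] + pairs[:, 1::2]) // 2).astype(img.dtype)
    return np.hstack([half_width(left), half_width(right)])
```

Assumes an even frame width; the resulting frame has the same dimensions as either input view, mirroring the vertical-direction reduction used by the TaB format.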
(58) Another form of multiplexing that can be used to generate the 3D image stream is Alternating Line-by-Line interleave (LbL). Alternating Line-by-Line interleave involves alternating lines from the first and second image streams into a 3D image stream.
(59) While the invention has been specifically described in connection with certain specific embodiments thereof, it is to be understood that this is by way of illustration and not of limitation and that various changes and modifications in form and details may be made thereto, and the scope of the appended claims should be construed as broadly as the prior art will permit.
(60) The description of the invention is merely exemplary in nature, and thus, variations that do not depart from the gist of the invention are intended to be within the scope of the invention. Such variations are not to be regarded as a departure from the spirit and scope of the invention.