Apparatus and method for adding synchronization information to an auxiliary data space in a video signal and synchronizing a video
09723259 · 2017-08-01
Assignee
Inventors
- Suk Hee Cho (Daejeon, KR)
- Se Yoon Jeong (Daejeon, KR)
- Jong Ho Kim (Daejeon, KR)
- Sung Hoon Kim (Daejeon, KR)
- Jin Soo Choi (Daejeon, KR)
- Jin Woong Kim (Daejeon, KR)
CPC classification
H04N21/242
ELECTRICITY
H04N21/235
ELECTRICITY
H04N5/0675
ELECTRICITY
International classification
H04N21/242
ELECTRICITY
H04N21/235
ELECTRICITY
Abstract
A video synchronization apparatus and method by adding sync data to an ancillary data space of a video signal are provided. A sync data adding apparatus may include a sync data generator to generate sync data; and a sync data adder to add the sync data to an ancillary data space of a video signal.
Claims
1. An encoding system, comprising: a sync data adding apparatus to add sync data to an ancillary data space of a video signal; an encoding apparatus to generate a bitstream in which the sync data is added by encoding the video signal in which the sync data is added; and a multiplexing apparatus to synchronize a plurality of bitstreams based on the sync data included in the bitstream, and to multiplex the synchronized bitstreams, wherein the encoding apparatus identifies the sync data in the ancillary data space of the video signal, and adds the identified sync data to a user data syntax among syntaxes of the bitstream, and wherein the multiplexing apparatus determines an interval for verifying the sync data based on the user data syntax to which the sync data is added, and synchronizes the plurality of bitstreams by verifying the sync data at the determined intervals.
2. An encoding method, comprising: adding sync data related to time information to an ancillary data space of a video signal; and generating an MPEG-2 bitstream by encoding the video data using an MPEG-2 video encoding scheme, wherein the MPEG-2 bitstream includes a video_sequence( ) syntax, and wherein the video_sequence( ) syntax includes ancillary data corresponding to an extension_and_user_data( ) syntax inserted in the MPEG-2 bitstream based on a group of pictures (GOP) unit of the video data, or a picture unit.
3. An encoding method, comprising: adding sync data to an ancillary data space of a video signal; generating a bitstream in which the sync data is added by encoding the video signal in which the sync data is added; synchronizing a plurality of bitstreams based on the sync data included in the bitstream; and multiplexing the synchronized bitstreams, wherein the generating comprises identifying the sync data in the ancillary data space of the video signal, and adding the identified sync data to a user data syntax among syntaxes of the bitstream, and wherein the synchronizing comprises determining an interval for verifying the sync data based on the user data syntax to which the sync data is added, and synchronizing the plurality of bitstreams by verifying the sync data at the determined intervals.
Description
BRIEF DESCRIPTION OF DRAWINGS
BEST MODE FOR CARRYING OUT THE INVENTION
(11) Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures. A synchronization data (sync data) adding method according to an embodiment of the present invention may be performed by a sync data adding apparatus.
(13) Referring to
(14) The left camera 110 may generate a first video signal that is a video signal of a left image by photographing a left image of a photographing target, and may transmit the first video signal to the sync data adding apparatus 100.
(15) The right camera 120 may generate a second video signal that is a video signal of a right image by photographing a right image of the photographing target, and may transmit the second video signal to the sync data adding apparatus 100.
(16) Here, the left camera 110 and the right camera 120 may be gen-locked by the signal generating apparatus 130, and may simultaneously transmit the first video signal and the second video signal based on a synchronization signal of the signal generating apparatus 130.
(17) The sync data adding apparatus 100 may generate sync data, and may add the sync data to an ancillary data space of the first video signal and an ancillary data space of the second video signal. Here, the sync data may be information required to synchronize the left image and the right image such as a time code. Also, the ancillary data space may be one of a vertical ancillary data space (VANC) positioned between video frames of the video signal and a data space for transmission at a vertical blanking interval (VBI).
(18) Here, the sync data adding apparatus 100 may transmit the first video signal and the second video signal in which the sync data is added, to the left image recording medium 115 and the right image recording medium 125, respectively, and thereby store the first video signal and the second video signal therein.
(19) The sync data adding apparatus 100 may generate sync data using at least a predetermined number of bits, for example, a time code that includes hour data, minute data, and second data, thereby preventing the same sync data from being added more than once within single video content or a connected video sequence.
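As an illustrative sketch (the function name and fixed frame rate are assumptions, not part of any standard), a time code of this kind may be derived from a frame index so that the value cannot repeat within content shorter than 24 hours:

```python
def frame_time_code(frame_index, fps=30):
    """Derive an (hour, minute, second, frame) time code from a frame index.

    Because the code includes hour, minute, and second data, the same
    value cannot recur within a single piece of video content that is
    shorter than 24 hours.
    """
    total_seconds, frames = divmod(frame_index, fps)
    total_minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(total_minutes, 60)
    return (hours % 24, minutes, seconds, frames)

# Example: frame 107892 at 30 fps -> (0, 59, 56, 12)
```

Any monotonically increasing value of sufficient width would serve the same purpose; a time code is used here because it matches the hour/minute/second structure described above.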
(20) Also, as shown in
(21) That is, the sync data adding apparatus 100 may add sync data to an ancillary data space included in a video signal. Through this, a decoding apparatus may synchronize images corresponding to the video signal without a need to transmit a separate synchronization signal or to perform a separate synchronization process.
(23) Referring to
(24) The sync data generator 210 may generate the sync data by including a time code based on a time when a video signal is received, or a playback time of a video frame. Here, the sync data generator 210 may generate the sync data using at least a predetermined number of bits, for example, a time code that includes hour data, minute data, and second data, thereby preventing the same sync data from being added more than once within single video content or a connected video sequence.
(25) The sync data adder 220 may add, to the ancillary data space of the video signal, the sync data that is generated by the sync data generator 210. Here, the sync data adder 220 may determine, as the ancillary data space, one of a VANC, a data space for transmission at a VBI, and an ancillary data space positioned between video frames of a video signal in the national television system committee (NTSC), SD-serial digital interface (SD-SDI), and HD-SDI standards, and may add the sync data to the data space that is determined as the ancillary data space.
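As a minimal model of this step (the class and helper names are hypothetical and not tied to any SDI library), the sync data is written into, and read back from, an ancillary region that is kept separate from the picture data:

```python
class VideoFrame:
    """Toy frame model: active picture lines plus an ancillary region."""

    def __init__(self, active_lines):
        self.active_lines = active_lines  # picture content, left untouched
        self.ancillary = {}               # ancillary data space (e.g. VANC)


def add_sync_data(frame, time_code):
    # Only the ancillary space changes; the picture data is not modified.
    frame.ancillary["time_code"] = time_code


def read_sync_data(frame):
    return frame.ancillary.get("time_code")


frame = VideoFrame(active_lines=[b"\x10" * 16] * 4)
add_sync_data(frame, (0, 59, 56, 12))
assert read_sync_data(frame) == (0, 59, 56, 12)
```

The point of the model is that the receiver can recover the sync data from the frame itself, so no side channel is needed.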
(26) The ancillary data space will be further described with reference to
(28) An image display apparatus may use a time for resetting or setting an image display function in order to display a subsequent screen after displaying a single screen. Here, the image display apparatus may not use the image display function and thus, data received at that point in time may not be displayed on a screen.
(29) Accordingly, as shown in
(30) By adding sync data to the ancillary data space 330 included in the video signal 300, the sync data adding apparatus 100 may transmit information used to synchronize a video signal without a need to transmit a separate synchronization signal or additional information.
(32) An encoding apparatus 410 may receive, from the left image recording medium 115, a first video signal in which sync data is added, and may receive, from the right image recording medium 125, a second video signal in which sync data is added, and thereby encode the first video signal and the second video signal based on a frame unit.
(33) Specifically, the encoding apparatus 410 may identify the sync data in the ancillary data space of the video signal, and may divide the video signal based on a frame unit.
(34) Next, the encoding apparatus 410 may generate a bitstream by encoding the video signal divided based on the frame unit together with sync data corresponding to a corresponding frame. Here, the encoding apparatus 410 may store sync data in an elementary stream (ES) of a bitstream. Also, the encoding apparatus 410 may additionally define flag information indicating whether sync data is present in a transport stream (TS) syntax, so that a multiplexing apparatus 420 may readily verify whether the sync data is present.
(35) Next-generation digital broadcasting, such as 3DTV, free-view broadcasting, multi-view broadcasting, super-resolution TV, and the like, may need to maintain compatibility with existing digital television (DTV) broadcasting.
(36) That is, when at least one of a plurality of video signals is generated as a bitstream according to an advanced television systems committee (ATSC) digital broadcasting standard used in the existing DTV broadcasting, the encoding apparatus 410 may maintain the compatibility with the existing DTV broadcasting. A method of adding, by the encoding apparatus 410, sync data to a bitstream according to an ATSC digital broadcasting standard will be described with reference to
(37) The encoding apparatus 410 may transmit the generated bitstream to the multiplexing apparatus 420.
(38) The encoding apparatus 410 may generate a bitstream by encoding a video signal and sync data included in the video signal. Through this, without a need to synchronize and multiplex bitstreams using the multiplexing apparatus 420, a terminal receiving a bitstream may synchronize and thereby play back the plurality of bitstreams. For example, as in a non-real time (NRT) 3D broadcasting, even though a left image signal and a right image signal are separately transmitted without being multiplexed, a terminal receiving a signal may synchronize and thereby play back the left image signal and the right image signal using sync data included in the signals.
(39) The multiplexing apparatus 420 may synchronize and thereby multiplex a plurality of bitstreams based on sync data that is included in a bitstream received from the encoding apparatus 410.
(40) Specifically, the multiplexing apparatus 420 may identify sync data of each frame by interpreting an ES syntax of the bitstream that is received from the encoding apparatus 410.
(41) Next, the multiplexing apparatus 420 may search for a frame of another bitstream having the same sync data as the identified sync data.
(42) Next, the multiplexing apparatus 420 may synchronize a frame of the bitstream retrieved based on the sync data with a frame of the received bitstream.
(43) The multiplexing apparatus 420 may multiplex and thereby output the synchronized bitstreams.
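The search-and-match behavior of the multiplexing apparatus 420 can be sketched in plain Python (the data structures here are hypothetical; a real multiplexer would obtain the sync data by parsing the ES syntax of each bitstream):

```python
def synchronize(left_frames, right_frames):
    """Pair frames from two bitstreams that carry the same sync data.

    Each input is a list of (sync_data, payload) tuples. The output is a
    list of (sync_data, left_payload, right_payload) tuples, so frames do
    not need to arrive at the same time in order to be matched.
    """
    right_by_sync = {sync: payload for sync, payload in right_frames}
    pairs = []
    for sync, left_payload in left_frames:
        if sync in right_by_sync:
            pairs.append((sync, left_payload, right_by_sync[sync]))
    return pairs


left = [((0, 0, 0, 1), "L1"), ((0, 0, 0, 2), "L2")]
right = [((0, 0, 0, 2), "R2"), ((0, 0, 0, 3), "R3")]
assert synchronize(left, right) == [((0, 0, 0, 2), "L2", "R2")]
```

Because matching is keyed on the sync data rather than on arrival order, this is the property that removes the one-encoder-per-bitstream constraint discussed below.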
(44) A conventional multiplexing apparatus needs to receive, at the same time, the bitstreams that are to be synchronized and multiplexed; thus, a number of encoding apparatuses corresponding to the number of bitstreams to be multiplexed may be required. For example, as shown in
(45) On the other hand, the multiplexing apparatus 420 according to the present invention only needs to search for the bitstreams to be synchronized based on the sync data included in each bitstream. Accordingly, even though the bitstreams to be synchronized are not received at the same time, the multiplexing apparatus 420 may search for and thereby multiplex a bitstream having the same sync data from among initially received bitstreams. Accordingly, regardless of the number of encoding apparatuses, the multiplexing apparatus 420 according to the present invention may synchronize and multiplex a plurality of bitstreams.
(46) For example, as shown in
(47) Also, a decoding apparatus according to the present invention may decode a received bitstream, and may synchronize the decoded bitstream using the same method as the multiplexing apparatus 420.
(48) Specifically, the decoding apparatus may identify sync data of each frame by interpreting an ES syntax of the decoded bitstream.
(49) Next, the decoding apparatus may search for a frame of another bitstream having the same sync data as the identified sync data.
(50) Next, the decoding apparatus may synchronize a frame of the bitstream retrieved based on the sync data with a frame of the received bitstream.
(52) Referring to
(53) The sync data adding apparatus 100 may generate sync data, may add the sync data to an ancillary data space of a first video signal that is received from the left camera 110, and may transmit the first video signal to the first encoding apparatus 510. Also, the sync data adding apparatus 100 may add the sync data to an ancillary data space of a second video signal that is received from the right camera 120, and may transmit the second video signal to the second encoding apparatus 520.
(54) The first encoding apparatus 510 may generate a first bitstream by encoding, based on a frame unit, the first video signal that is received from the sync data adding apparatus 100. Here, the first encoding apparatus 510 may identify the sync data from the ancillary data space of the first video signal, and may add the identified sync data to a user data syntax among syntaxes of the first bitstream. The first encoding apparatus 510 may transmit, to the multiplexing apparatus 530, the first bitstream in which the sync data is added.
(55) The second encoding apparatus 520 may generate a second bitstream by encoding, based on a frame unit, the second video signal that is received from the sync data adding apparatus 100. Here, the second encoding apparatus 520 may identify the sync data from the ancillary data space of the second video signal, and may add the identified sync data to a user data syntax among syntaxes of the second bitstream. The second encoding apparatus 520 may transmit, to the multiplexing apparatus 530, the second bitstream in which the sync data is added.
(56) The multiplexing apparatus 530 may synchronize and multiplex a frame in which the same sync data is added to the first bitstream and the second bitstream, based on the sync data that is included in each of the first bitstream and the second bitstream.
(58) An ATSC digital broadcasting standard encodes a video signal using the MPEG-2 video encoding standard. Therefore, video_sequence( ), which is the top-level syntax of an MPEG-2 bitstream generated by the encoding apparatus 410 according to the ATSC digital broadcasting standard, may include extension_and_user_data(2) 610.
(59) Also, the encoding apparatus 410 may include extension_and_user_data(1) in video_sequence( ), instead of including extension_and_user_data(2) 610.
(60) Here, in extension_and_user_data(1), ancillary data may be inserted based on a group of pictures (GOP) unit. In extension_and_user_data(2), ancillary data may be inserted into an encoded bitstream based on a picture unit. Also, the GOP unit may include encoded pictures of a plurality of frames and thus be a randomly accessible independent encoding unit. A picture may be an image in which a single video frame is encoded. The plurality of frames corresponding to the GOP unit may include the frames from a single key frame up to a subsequent key frame.
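As a toy illustration of the GOP unit (the frame-type list representation is an assumption made only for this sketch), a frame sequence splits into randomly accessible groups at each key frame:

```python
def split_into_gops(frames):
    """Split a frame-type sequence into GOPs, each starting at a key ('I') frame."""
    gops = []
    for frame_type in frames:
        if frame_type == "I" or not gops:
            gops.append([])  # a key frame opens a new, independently decodable group
        gops[-1].append(frame_type)
    return gops


assert split_into_gops(["I", "B", "P", "I", "B", "P"]) == [["I", "B", "P"],
                                                           ["I", "B", "P"]]
```

Under this view, GOP-level ancillary data (extension_and_user_data(1)) occurs once per group, while picture-level ancillary data (extension_and_user_data(2)) occurs once per list element.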
(61) As shown in
(62) The encoding apparatus 410 may store sync data in user_data( ) 710 that is included in extension_and_user_data(1) or extension_and_user_data(2).
(63) For example, when the sync data adding apparatus 100 generates sync data using a time code expressed as hours, minutes, seconds, and a frame count, the encoding apparatus 410 may store the sync data in user_data( ) 710 based on a time code syntax according to the Society of Motion Picture and Television Engineers (SMPTE) 334-2 caption distribution packet definition standard, as shown in the following Table 1:
(64) TABLE 1

  Syntax             Bits  Meaning
  time_code( ) {
    tc_10hrs          2    Value of the ten-hour unit
    tc_1hrs           4    Value of the one-hour unit
    reserved          1    '1'
    tc_10min          3    Value of the ten-minute unit
    tc_1min           4    Value of the one-minute unit
    tc_field_flag     1    In an interlaced picture, set to '0' for the first
                           field and '1' for the second field. In a progressive
                           picture, used for counting the number of frames,
                           interpreted as (2*frame + flag).
    tc_10sec          3    Value of the ten-second unit
    tc_1sec           4    Value of the one-second unit
    drop_frame_flag   1    Set to '1' when the time code count compensates for
                           drop-frame; '0' otherwise.
    zero              1    '0'
    tc_10fr           3    Counted value of the ten-frame unit
    tc_1fr            4    Counted value of the one-frame unit
  }
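As an unofficial sketch, the bit layout of Table 1 can be packed and unpacked as follows. The field widths are taken from the table; the helper names are assumptions, and this is not a drop-in SMPTE 334-2 implementation:

```python
# (name, bits, fixed) triples following Table 1; fixed is None for real fields.
TIME_CODE_FIELDS = [
    ("tc_10hrs", 2, None), ("tc_1hrs", 4, None), ("reserved", 1, 1),
    ("tc_10min", 3, None), ("tc_1min", 4, None), ("tc_field_flag", 1, None),
    ("tc_10sec", 3, None), ("tc_1sec", 4, None), ("drop_frame_flag", 1, None),
    ("zero", 1, 0), ("tc_10fr", 3, None), ("tc_1fr", 4, None),
]


def pack_time_code(values):
    """Pack a dict of field values into one integer, most significant field first."""
    word = 0
    for name, bits, fixed in TIME_CODE_FIELDS:
        value = fixed if fixed is not None else values[name]
        assert 0 <= value < (1 << bits), name
        word = (word << bits) | value
    return word


def unpack_time_code(word):
    """Reverse of pack_time_code; fixed marker fields are not returned."""
    total = sum(bits for _, bits, _ in TIME_CODE_FIELDS)
    values, consumed = {}, 0
    for name, bits, fixed in TIME_CODE_FIELDS:
        consumed += bits
        value = (word >> (total - consumed)) & ((1 << bits) - 1)
        if fixed is None:
            values[name] = value
    return values


tc = {"tc_10hrs": 1, "tc_1hrs": 2, "tc_10min": 3, "tc_1min": 4,
      "tc_field_flag": 0, "tc_10sec": 5, "tc_1sec": 6,
      "drop_frame_flag": 0, "tc_10fr": 2, "tc_1fr": 9}
assert unpack_time_code(pack_time_code(tc)) == tc
```

The field widths sum to 31 bits, so the packed value fits comfortably in a 32-bit word.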
(65) When the multiplexing apparatus 420 multiplexes, to a single MPEG-2 TS, bitstreams that are received from the plurality of encoding apparatuses 410, or bitstreams that are received from a single encoding apparatus 410, the multiplexing apparatus 420 may multiplex the bitstreams using sync data that is stored in user_data( ) 710.
(66) Here, the multiplexing apparatus 420 may determine an interval for verifying sync data based on the user data syntax in which the sync data is added, and may synchronize the plurality of bitstreams by verifying the sync data at the determined intervals. For example, when the sync data is stored in user_data( ) 710 of extension_and_user_data(1), the multiplexing apparatus 420 may perform multiplexing by verifying the sync data based on a GOP unit. Also, when the sync data is stored in user_data( ) 710 of extension_and_user_data(2), the multiplexing apparatus 420 may perform multiplexing by verifying the sync data based on a picture unit.
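The interval rule described above can be sketched as follows (the helper and its string arguments are hypothetical; an actual multiplexer makes this decision while parsing the bitstream syntax):

```python
def sync_verify_interval(user_data_location, gop_size):
    """Return how many pictures apart the sync data should be verified.

    extension_and_user_data(1) carries GOP-level ancillary data, so sync
    data is checked once per GOP; extension_and_user_data(2) carries
    picture-level data, so it is checked every picture.
    """
    if user_data_location == "extension_and_user_data(1)":
        return gop_size  # once per group of pictures
    if user_data_location == "extension_and_user_data(2)":
        return 1         # every picture
    raise ValueError("sync data was not found in a user data syntax")


assert sync_verify_interval("extension_and_user_data(1)", gop_size=15) == 15
assert sync_verify_interval("extension_and_user_data(2)", gop_size=15) == 1
```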
(67) Also, extension_and_user_data(1) or extension_and_user_data(2) may be a syntax for storing ancillary data instead of image information. Accordingly, even though a decoding apparatus that receives and decodes a signal output from the multiplexing apparatus 420 is incapable of interpreting the sync data that is stored in user_data( ) 710 of extension_and_user_data(2), the decoding apparatus may restore an image by decoding the syntax in which the image information is encoded. That is, even a conventional decoding apparatus incapable of identifying sync data may restore an image of an encoded signal according to the present invention and thus, the encoding system according to the present invention may be compatible with the conventional decoding apparatus.
(69) An ATSC digital broadcasting standard defines a picture user data syntax that indicates closed caption data or bar data in user_data( ) of an MPEG-2 video encoded bitstream.
(70) The encoding apparatus 410 according to the present invention may transmit sync data according to the ATSC digital broadcasting standard by adding time_code( ) 810 of Table 1 to the picture user data syntax of user_data( ) 710. Here, a syntax 820 associated with user_data_type_code may be one of the ATSC_reserved_user_data values, and may be a conditional syntax for identifying time_code( ) 810.
(73) A caption data adding apparatus 900 may generate sync data and caption data associated with images photographed by the left camera 110 and the right camera 120, and may add the caption data including the sync data to an ancillary data space of a first video signal and an ancillary data space of a second video signal. Here, the caption data adding apparatus 900 may generate the caption data, for example, closed captions, using SMPTE 333M or SMPTE 334M, which are standards with respect to EIA-608 (analog and SD-SDI closed captioning) and EIA-708 (HD-SDI closed captioning).
(74) An MPEG-2 encoding apparatus 910 may encode caption data together with a video signal, and may output the same to an MPEG-2 TS.
(75) Here, the MPEG-2 encoding apparatus 910 may not generate a bitstream by inserting the sync data in user_data( ) within extension_and_user_data(1) or extension_and_user_data(2). Accordingly, the MPEG-2 encoding apparatus 910 may store the sync data by adding time_code( ) 810 to cc_data( ), which indicates a position of a caption broadcasting data pair.
(76) When using an MPEG-2 encoding method, the video encoding apparatus 920 may encode the video signal and the caption data, and thereby output the same to an MPEG-2 TS. Here, the video encoding apparatus 920 may store the sync data by adding time_code( ) 810 to cc_data( ).
(77) Also, when using an advanced video coding (AVC) or high efficiency video coding (HEVC) based encoding method, the video encoding apparatus 920 may generate a bitstream by encoding the caption data based on a frame unit. Here, the video encoding apparatus 920 may identify the sync data from the ancillary data space of the video signal, and may add the identified sync data to a user data syntax among syntaxes of the bitstream. The video encoding apparatus 920 may transmit, to the multiplexing apparatus 930, the bitstream in which the sync data is added.
(78) The multiplexing apparatus 930 may identify sync data of each frame by verifying cc_data( ) of MPEG-2 TS that is received from the MPEG-2 encoding apparatus 910 and user_data( ) of a bitstream that is received from the video encoding apparatus 920, and may synchronize and thereby multiplex the MPEG-2 TS received from the MPEG-2 encoding apparatus 910 and the bitstream received from the video encoding apparatus 920. Through this, the multiplexing apparatus 930 may generate and output an MPEG-2 3D TV TS signal about 3D TV broadcasting content.
(80) In operation S1010, the sync data adding apparatus 100 may generate sync data including a time code based on a time when a video signal is received, or a playback time of a video frame.
(81) In operation S1020, the sync data adding apparatus 100 may add the generated sync data to an ancillary data space of the video signal.
(82) In operation S1030, the encoding apparatus 410 may generate a bitstream by encoding the video signal in which the sync data is added, based on a frame unit.
(83) Specifically, the encoding apparatus 410 may generate a bitstream by identifying the sync data from the ancillary data space of the video signal, and by encoding the video signal divided based on a frame unit together with sync data corresponding to a corresponding frame.
(84) In operation S1040, the multiplexing apparatus 420 may synchronize and thereby multiplex a plurality of bitstreams based on the sync data that is included in the bitstream generated in operation S1030.
(85) Specifically, in operation S1040, the multiplexing apparatus 420 may identify the sync data of a bitstream and may search for another bitstream having the same sync data as the identified sync data. Next, the multiplexing apparatus 420 may synchronize and thereby multiplex a frame of the bitstream retrieved based on the sync data with a frame of the bitstream generated in operation S1030.
(86) According to embodiments of the present invention, by adding sync data in an ancillary data space of a video signal, a decoding apparatus may synchronize images corresponding to the video signal without a need to transmit a separate synchronization signal or to perform a separate synchronization process.
(87) According to embodiments of the present invention, it is possible to generate a bitstream by encoding a video signal and sync data included in the video signal. Therefore, even though a plurality of bitstreams is transmitted without a synchronization process, a terminal receiving the bitstream may synchronize and play back the plurality of bitstreams.
(88) According to embodiments of the present invention, bitstreams to be synchronized may be retrieved and be synchronized based on sync data included in a bitstream. Accordingly, even though bitstreams to be synchronized are not simultaneously received, it is possible to retrieve and synchronize a bitstream to be synchronized among initially received bitstreams.
(89) Although a few embodiments of the present invention have been shown and described, the present invention is not limited to the described embodiments. Instead, it would be appreciated by those skilled in the art that changes may be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.