INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
20230164305 · 2023-05-25
Abstract
An information processing device according to the present technology includes a display processing unit that performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
Claims
1. An information processing device comprising: a display processing unit that performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
2. The information processing device according to claim 1, wherein the display processing unit performs processing of filtering and displaying camerawork information according to a keyword as the input information on the camerawork designation screen.
3. The information processing device according to claim 1, wherein filtering condition information indicating a filtering condition of camerawork information is displayed on the camerawork designation screen, and the display processing unit performs processing of filtering and displaying the camerawork information according to the filtering condition indicated by the selected filtering condition information as the input information.
4. The information processing device according to claim 1, wherein the display processing unit performs processing of displaying information obtained by visualizing a movement trajectory of the viewpoint on the camerawork designation screen.
5. The information processing device according to claim 1, wherein the display processing unit performs processing of displaying, on the camerawork designation screen, camera arrangement position information indicating arrangement positions of a plurality of cameras that perform imaging for generating a free viewpoint image.
6. The information processing device according to claim 5, wherein the display processing unit performs processing of displaying, on the camerawork designation screen, start point arrangement position information and end point arrangement position information indicating respective positions of a camera serving as a movement start point and a camera serving as a movement end point of the viewpoint among the plurality of cameras.
7. The information processing device according to claim 6, wherein the display processing unit performs processing of displaying the start point arrangement position information and the end point arrangement position information, and arrangement position information of cameras other than the camera serving as the movement start point and the camera serving as the movement end point among the plurality of cameras in different modes.
8. The information processing device according to claim 4, wherein the display processing unit performs processing of displaying information obtained by visualizing the moving speed of the viewpoint on the camerawork designation screen.
9. The information processing device according to claim 8, wherein the display processing unit performs processing of displaying information indicating a period in which the moving speed decreases as information obtained by visualizing the moving speed of the viewpoint.
10. The information processing device according to claim 4, wherein the display processing unit performs processing of displaying information obtained by visualizing a field of view from the viewpoint on the camerawork designation screen.
11. The information processing device according to claim 4, wherein the display processing unit performs processing of displaying a target that defines a line-of-sight direction from the viewpoint on the camerawork designation screen.
12. The information processing device according to claim 11, further comprising: a camerawork editing processing unit that updates information on the position of the target in camerawork information according to a change in the position of the target on the camerawork designation screen.
13. The information processing device according to claim 1, wherein the display processing unit performs processing of displaying an image obtained by observing a three-dimensional space from the viewpoint on the camerawork designation screen.
14. The information processing device according to claim 13, wherein the display processing unit performs processing of displaying an image obtained by rendering a virtual three-dimensional model of a real space as an image obtained by observing a three-dimensional space from the viewpoint.
15. The information processing device according to claim 5, wherein the display processing unit performs processing of displaying information notifying a camera in which a change in the field of view has been detected among the plurality of cameras.
16. An information processing method in which an information processing device performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
17. A program readable by a computer device, the program causing the computer device to implement a function of performing processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
Description
BRIEF DESCRIPTION OF DRAWINGS
MODE FOR CARRYING OUT THE INVENTION
1. System Configuration
[0118] The image processing system includes an image creation controller 1, a free viewpoint image server 2, a video server 3, a plurality of (for example, four) video servers 4A, 4B, 4C, and 4D, a network attached storage (NAS) 5, a switcher 6, an image conversion unit 7, a utility server 8, and a plurality of (for example, sixteen) imaging devices 10.
[0119] Hereinafter, the term “camera” refers to the imaging device 10. For example, “camera arrangement” means arrangement of a plurality of imaging devices 10.
[0120] In addition, when the video servers 4A, 4B, 4C, and 4D are collectively referred to without being particularly distinguished, they are referred to as “video servers 4”.
[0121] In this image processing system, a free viewpoint image corresponding to an observation image from an arbitrary viewpoint in a three-dimensional space can be generated on the basis of captured images (for example, image data V1 to V16) acquired from a plurality of imaging devices 10, and an output clip including the free viewpoint image can be created.
[0122] In the drawing, the forms of connection between the above devices are indicated as follows.
[0123] A solid line indicates a serial digital interface (SDI) connection; SDI is an interface standard for connecting broadcast devices such as cameras and switchers, and supports 4K, for example. Image data is mainly transmitted and received between the devices over the SDI wiring.
[0124] A double line indicates a connection conforming to a communication standard for constructing a computer network, for example, 10 Gigabit Ethernet. The image creation controller 1, the free viewpoint image server 2, the video servers 3, 4A, 4B, 4C, and 4D, the NAS 5, and the utility server 8 are connected by the computer network, so that image data and various control signals can be transmitted and received among them.
[0125] A broken line between the video servers 3 and 4 indicates that the video servers 3 and 4, each equipped with an inter-server file sharing function, are connected via, for example, a 10G network. As a result, each of the video server 3 and the video servers 4A, 4B, 4C, and 4D can preview and send materials held in the other video servers. That is, a system using a plurality of video servers is constructed, and efficient highlight editing and sending can be realized.
[0126] Each imaging device 10 is configured as a digital camera device including an imaging element such as a charge coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor, for example, and obtains captured images (image data V1 to V16) as digital data. In this example, each imaging device 10 obtains a captured image as a moving image.
[0127] In this example, each imaging device 10 captures an image of a competition such as basketball or soccer being held, and each imaging device is arranged in a predetermined orientation at a predetermined position in the competition site where the competition is held. In this example, the number of imaging devices 10 is sixteen, but at least two imaging devices 10 are sufficient to enable generation of a free viewpoint image. By increasing the number of imaging devices 10 and imaging a target subject from a larger number of angles, the accuracy of three-dimensional restoration of the subject can be improved, and the image quality of the virtual viewpoint image can thereby be improved.
[0129] The image creation controller 1 includes an information processing device. The image creation controller 1 can be realized using, for example, a dedicated workstation, a general-purpose personal computer, a mobile terminal device, or the like.
[0130] The image creation controller 1 performs control/operation management of the video servers 3 and 4 and processing for creating clips.
[0131] As an example, the image creation controller 1 is a device operable by the operator OP1. For example, the operator OP1 selects a clip content and gives an instruction to create the clip.
[0132] The free viewpoint image server 2 is configured as an information processing device that actually performs a process of creating a free viewpoint image (free view (FV) clip to be described later) according to an instruction from the image creation controller 1 or the like. The free viewpoint image server 2 can also be realized using, for example, a dedicated workstation, a general-purpose personal computer, a mobile terminal device, or the like.
[0133] As an example, the free viewpoint image server 2 is a device that can be operated by the operator OP2. The operator OP2 performs, for example, work related to creation of an FV clip as a free viewpoint image. Specifically, the operator OP2 performs a designation operation (selection operation) of the camerawork for generating the free viewpoint image. In this example, the operator OP2 also performs work of creating a camerawork.
[0134] Configurations and processes of the image creation controller 1 and the free viewpoint image server 2 will be described later in detail. In addition, the operators OP1 and OP2 perform operations, but for example, the image creation controller 1 and the free viewpoint image server 2 may be arranged side by side and operated by one operator.
[0135] Each of the video servers 3 and 4 is an image recording device, and includes, for example, a data recording unit such as a solid state drive (SSD) or a hard disk drive (HDD), and a control unit that controls data recording and reproduction of the data recording unit.
[0136] Each of the video servers 4A, 4B, 4C, and 4D can receive inputs from four systems, for example, and each video server simultaneously records images captured by the four imaging devices 10.
[0137] For example, the video server 4A records the image data V1, V2, V3, and V4. The video server 4B records the image data V5, V6, V7, and V8. The video server 4C records the image data V9, V10, V11, and V12. The video server 4D records the image data V13, V14, V15, and V16.
[0138] As a result, all the images captured by the sixteen imaging devices 10 are simultaneously recorded.
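The fixed assignment of camera feeds to video servers described above can be expressed as a simple lookup table. The following Python sketch is purely illustrative; the names `VIDEO_SERVER_ASSIGNMENT` and `server_for_feed` are hypothetical and not part of this disclosure:

```python
# Hypothetical sketch: each video server records four of the sixteen camera feeds,
# mirroring the assignment described for video servers 4A to 4D.
VIDEO_SERVER_ASSIGNMENT = {
    "4A": ["V1", "V2", "V3", "V4"],
    "4B": ["V5", "V6", "V7", "V8"],
    "4C": ["V9", "V10", "V11", "V12"],
    "4D": ["V13", "V14", "V15", "V16"],
}

def server_for_feed(feed: str) -> str:
    """Return the video server that records a given camera feed."""
    for server, feeds in VIDEO_SERVER_ASSIGNMENT.items():
        if feed in feeds:
            return server
    raise KeyError(feed)
```

With this table, all sixteen feeds are accounted for, four per server.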
[0139] The video servers 4A, 4B, 4C, and 4D perform constant recording, for example, during a sports game to be broadcast.
[0140] The video server 3 is directly connected to the image creation controller 1, for example, and can receive inputs from two systems and output data to two systems, for example. Pieces of image data Vp and Vq are illustrated as inputs of two systems. As the image data Vp and Vq, images captured by any two imaging devices 10 (any two of the pieces of image data V1 to V16) can be selected. Naturally, the captured image may be an image captured by another imaging device.
[0141] The image creation controller 1 can display the image data Vp and Vq on the display as monitor images. The operator OP1 can check the situation of the scene captured and recorded for broadcasting, for example, by the image data Vp and Vq input to the video server 3.
[0142] In addition, since the video servers 3 and 4 are connected in a file sharing state, the image creation controller 1 can also monitor and display the images captured by the imaging devices 10 and recorded in the video servers 4A, 4B, 4C, and 4D, and the operator OP1 can sequentially check these captured images.
[0143] Note that, in this example, a time code is attached to an image captured by each imaging device 10, and frame synchronization can be achieved in processing in the video servers 3, 4A, 4B, 4C, and 4D.
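Time-code-based frame synchronization of this kind can be sketched as follows. The function names and the 60 fps default are illustrative assumptions, not specifics of this disclosure:

```python
def timecode_to_frame(tc: str, fps: int = 60) -> int:
    """Convert an 'HH:MM:SS:FF' time code to an absolute frame index."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def is_synchronized(tc_a: str, tc_b: str, fps: int = 60) -> bool:
    """Frames from different servers are synchronized when their time codes
    map to the same absolute frame index."""
    return timecode_to_frame(tc_a, fps) == timecode_to_frame(tc_b, fps)
```

Because every imaging device 10 stamps the same time code, the servers can align frames by index rather than by arrival time.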
[0144] The NAS 5 is a storage device arranged on the network, and includes, for example, an SSD, an HDD, or the like. In this example, when some frames of the image data V1, V2, ..., V16 recorded in the video servers 4A, 4B, 4C, and 4D are transferred for generating a free viewpoint image, the NAS 5 stores those frames for processing in the free viewpoint image server 2 and also stores the created free viewpoint image.
[0145] The switcher 6 is a device that receives the images output via the video server 3 and selects the main line image PGMout to be finally broadcast. For example, a broadcast director or the like performs the necessary operations.
[0146] The image conversion unit 7 performs, for example, resolution conversion and composition of the image data from the imaging devices 10, generates a monitoring image of the camera arrangement, and supplies the monitoring image to the utility server 8. For example, the sixteen systems of image data (V1 to V16), which are 8K images, are resolution-converted into 4K images and combined into four systems of images arranged in a tile shape, and these four systems of images are supplied to the utility server 8.
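The grouping of sixteen systems into four tiled monitoring systems can be sketched at the level of feed identifiers as follows; pixel-level resolution conversion and composition are omitted, and all names are illustrative, not part of this disclosure:

```python
def tile_layouts(feeds, per_tile=4):
    """Group sixteen 4K-converted feeds into four 2x2 tiled output systems.

    Returns, for each output system, a 2x2 grid of feed identifiers.
    The actual pixel composition is omitted in this sketch.
    """
    groups = [feeds[i:i + per_tile] for i in range(0, len(feeds), per_tile)]
    return [[g[0:2], g[2:4]] for g in groups]

feeds = [f"V{i}" for i in range(1, 17)]   # V1 .. V16
layouts = tile_layouts(feeds)             # four 2x2 tile layouts
```

Each layout corresponds to one of the four image systems supplied to the utility server 8.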
[0147] The utility server 8 is a computer device capable of performing various related processes. In this example, the utility server 8 is a device that performs a process of detecting camera movement for calibration. For example, the utility server 8 monitors the image data from the image conversion unit 7 to detect camera movement. The camera movement is, for example, movement of any one of the arrangement positions of the imaging devices 10 arranged as illustrated in the drawing.
2. Configuration of Image Creation Controller and Free Viewpoint Image Server
[0148] The image creation controller 1, the free viewpoint image server 2, the video servers 3 and 4, and the utility server 8 having the above configuration can each be realized as an information processing device 70 having the configuration illustrated in the drawing.
[0149] In the drawing, the information processing device 70 includes a central processing unit (CPU) 71, a read only memory (ROM) 72, and a random access memory (RAM) 73.
[0150] The CPU 71, the ROM 72, and the RAM 73 are connected to one another via a bus 74. An input/output interface 75 is also connected to the bus 74.
[0151] An input unit 76 including an operator and an operation device is connected to the input/output interface 75.
[0152] For example, as the input unit 76, various operators and operation devices such as a keyboard, a mouse, a key, a dial, a touch panel, a touch pad, and a remote controller are assumed.
[0153] An operation of the user is detected by the input unit 76, and a signal corresponding to the input operation is interpreted by the CPU 71.
[0154] Furthermore, a display unit 77 configured of a liquid crystal display (LCD), an organic electro-luminescence (EL) panel, or the like, and a sound output unit 78 configured of a speaker or the like are integrally or separately connected to the input/output interface 75.
[0155] The display unit 77 is a display unit that performs various displays, and includes, for example, a display device provided in a housing of the information processing device 70, a separate display device connected to the information processing device 70, or the like.
[0156] The display unit 77 executes display of an image for various types of image processing, a moving image to be processed, and the like on a display screen on the basis of an instruction from the CPU 71. In addition, the display unit 77 displays various operation menus, icons, messages, and the like, that is, displays as a graphical user interface (GUI) on the basis of an instruction from the CPU 71.
[0157] In some cases, a storage unit 79 including a hard disk, a solid-state memory, or the like, and a communication unit 80 configured of a modem or the like are connected to the input/output interface 75.
[0158] The communication unit 80 performs communication processing via a transmission path such as the Internet, wired/wireless communication with various devices, bus communication, and the like.
[0159] A drive 82 is also connected to the input/output interface 75 as necessary, and a removable recording medium 81 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is appropriately mounted.
[0160] A data file such as an image file MF, various computer programs, and the like can be read from the removable recording medium 81 by the drive 82. The read data file is stored in the storage unit 79, and images and sounds included in the data file are output by the display unit 77 and the sound output unit 78. Furthermore, the computer program and the like read from the removable recording medium 81 are installed in the storage unit 79 as necessary.
[0161] In the information processing device 70, software can be installed via network communication by the communication unit 80 or the removable recording medium 81. Alternatively, the software may be stored in advance in the ROM 72, the storage unit 79, or the like.
[0162] In a case where the image creation controller 1 and the free viewpoint image server 2 are realized using such an information processing device 70, the processing functions described below are realized in the CPU 71 by, for example, software.
[0163] The image creation controller 1 has functions as a section identification processing unit 21, a target image transmission control unit 22, and an output image generation unit 23.
[0164] The section identification processing unit 21 performs processing of identifying a generation target image section as a generation target of a free viewpoint image for a plurality of captured images (image data V1 to V16) simultaneously captured by the plurality of imaging devices 10. For example, in response to the operator OP1 performing an operation of selecting a scene to be replayed in the image, processing of specifying a time code for the scene, in particular, a section (generation target image section) of the scene to be a free viewpoint image, and notifying the free viewpoint image server 2 of the time code is performed.
[0165] Here, the generation target image section refers to a frame section actually used as a free viewpoint image. In a case where a free viewpoint image is generated for one frame in a moving image, the one frame is the generation target image section. In this case, the in-point and the out-point for the free viewpoint image have the same time code.
[0166] Furthermore, in a case where a free viewpoint image is generated for a section of a plurality of frames in a moving image, the plurality of frames is the generation target image section. In this case, the in-point and the out-point for the free viewpoint image are different time codes.
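The relation between single-frame and multi-frame generation target image sections can be sketched as follows. The `Section` class and the frame indices are hypothetical illustrations, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Section:
    """A frame section designated by in/out points (frame indices here)."""
    in_point: int
    out_point: int

    @property
    def frame_count(self) -> int:
        return self.out_point - self.in_point + 1

# A free viewpoint image for one frame: in-point and out-point coincide.
still_fv = Section(in_point=1200, out_point=1200)
# A free viewpoint image over a section of frames: in-point and out-point differ.
moving_fv = Section(in_point=1200, out_point=1259)
```

A one-frame section yields a free viewpoint image of a stopped moment, while a multi-frame section yields one in which the scene itself also progresses.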
[0167] Note that, although the structure of the clip will be described later, it is assumed that the in-point/out-point of the generation target image section is different from the in-point/out-point as the output clip to be finally generated. This is because a front clip and a rear clip, which will be described later, are coupled.
[0168] The target image transmission control unit 22 performs control to transfer image data of the generation target image section in each of the plurality of captured images, that is, one or a plurality of frames of the image data V1 to V16, as image data to be used for generating a free viewpoint image in the free viewpoint image server 2. Specifically, control is performed to transfer the image data of the generation target image section from the video servers 4A, 4B, 4C, and 4D to the NAS 5.
[0169] The output image generation unit 23 performs a process of generating an output image (output clip) including the free viewpoint image (FV clip) generated and received by the free viewpoint image server 2.
[0170] For example, by the processing of the output image generation unit 23, the image creation controller 1 combines, on the time axis, a front clip that is an actual moving image at a previous time point and a rear clip that is an actual moving image at a subsequent time point with an FV clip that is a virtual image generated by the free viewpoint image server 2 to obtain an output clip. That is, the front clip + the FV clip + the rear clip are set as one output clip.
[0171] Naturally, the front clip + FV clip may be one output clip.
[0172] Alternatively, the FV clip + the rear clip may be one output clip.
[0173] Further, an output clip of only the FV clip may be generated without coupling the front clip and the rear clip.
[0174] In any case, the image creation controller 1 generates an output clip including the FV clip, outputs the output clip to the switcher 6, and can use the output clip for broadcasting.
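The output clip variations described above (front clip + FV clip + rear clip, and the shorter combinations) can be sketched as follows; the function and clip names are illustrative assumptions, not part of this disclosure:

```python
def build_output_clip(fv_clip, front=None, rear=None):
    """Concatenate an optional front clip, the FV clip, and an optional
    rear clip on the time axis, in that order."""
    parts = []
    if front is not None:
        parts.append(front)
    parts.append(fv_clip)
    if rear is not None:
        parts.append(rear)
    return parts

clip = build_output_clip("FV", front="front", rear="rear")
```

Omitting `front`, `rear`, or both yields the FV clip + rear clip, front clip + FV clip, or FV-clip-only output clips mentioned above.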
[0175] The free viewpoint image server 2 has functions as a target image acquisition unit 31, an image generation processing unit 32, a transmission control unit 33, and a camerawork generation processing unit 34.
[0176] The target image acquisition unit 31 performs processing of acquiring image data of the generation target image section, which is the generation target of a free viewpoint image, in each of the plurality of captured images (image data V1 to V16) simultaneously captured by the plurality of imaging devices 10. That is, the image data of one or a plurality of frames designated by the in-point/out-point of the generation target image section, specified by the image creation controller 1 using the function of the section identification processing unit 21, can be acquired from the video servers 4A, 4B, 4C, and 4D via the NAS 5 and used for generating a free viewpoint image.
[0177] For example, the target image acquisition unit 31 acquires image data of one or a plurality of frames of the generation target image section for all of the pieces of image data V1 to V16. The image data of the generation target image section is acquired for all of the pieces of image data V1 to V16 in order to generate a high-quality free viewpoint image. As described above, a free viewpoint image can be generated using images captured by at least two imaging devices 10. However, by increasing the number of imaging devices 10 (that is, the number of viewpoints), a finer 3D model can be generated and a higher-quality free viewpoint image can be obtained. Therefore, for example, in a case where sixteen imaging devices 10 are arranged, the image data of the generation target image section is acquired for all the pieces of image data (V1 to V16) of the sixteen imaging devices 10.
[0178] The image generation processing unit 32 is a function of generating a free viewpoint image, that is, an FV clip in this example, using the image data acquired by the target image acquisition unit 31.
[0179] For example, the image generation processing unit 32 performs modeling processing including 3D model generation and subject analysis, and processing such as rendering for generating a free viewpoint image that is a two-dimensional image from the 3D model.
[0180] The 3D model generation is processing of generating 3D model data representing the subject in a three-dimensional space (that is, the three-dimensional structure of the subject is restored from the two-dimensional image) on the basis of the image captured by each imaging device 10 and the camera parameter for each imaging device 10 input from the utility server 8 or the like, for example. Specifically, the 3D model data includes data representing the subject in a three-dimensional coordinate system of (X, Y, Z).
[0181] In the subject analysis, a position, an orientation, and a posture of a subject as a person (player) are analyzed on the basis of the 3D model data. Specifically, estimation of the position of the subject, generation of a simple model of the subject, estimation of the orientation of the subject, and the like are performed.
[0182] Then, a free viewpoint image is generated on the basis of the 3D model data and the subject analysis information. For example, a free viewpoint image is generated such that the viewpoint is moved with respect to a 3D model in which a player as a subject is stationary.
[0183] The viewpoint of the free viewpoint image will be described with reference to the drawings.
[0186] For example, the viewpoint is gradually moved in the direction of the arrow C from the illustrated state.
[0187] Here, the free viewpoint image server 2 (CPU 71) of this example has a function as the display processing unit 32a as a part of the function of the image generation processing unit 32.
[0188] The display processing unit 32a performs processing of displaying the camerawork designation screen Gs that receives the designation operation of the camerawork information used for generating the free viewpoint image. Note that details of the camerawork related to the free viewpoint image and the camerawork designation screen Gs will be described later again.
[0189] Furthermore, the free viewpoint image server 2 in this example also has a function as the camerawork editing processing unit 32b as a part of the function of the image generation processing unit 32, but the function as the camerawork editing processing unit 32b will also be described later.
[0190] The transmission control unit 33 performs control to transmit the free viewpoint image (FV clip) generated by the image generation processing unit 32 to the image creation controller 1 via the NAS 5. In this case, the transmission control unit 33 also controls to transmit accompanying information for generating an output image to the image creation controller 1. The accompanying information is assumed to be information designating images of the front clip and the rear clip. That is, it is information designating which image of the image data V1 to V16 is used to create (cut out) the front clip and the rear clip. In addition, information designating the time length of the front clip or the rear clip is also assumed as the accompanying information.
[0191] The camerawork generation processing unit 34 performs processing related to generation of camerawork information used for generation of a free viewpoint image. In creating the free viewpoint image, a plurality of candidate cameraworks is created in advance to cope with various scenes. In order to enable such pre-creation of the camerawork, a software program for creating the camerawork is installed in the free viewpoint image server 2 of this example. The camerawork generation processing unit 34 is a function realized by this software program, and performs a camerawork generation process on the basis of an operation input of the user.
[0192] The camerawork generation processing unit 34 has a function as a display processing unit 34a. The display processing unit 34a performs processing of displaying the creation operation screen Gg so as to enable reception of various operation inputs for creating the camerawork by the user (the operator OP2 in this example).
3. Outline of GUI
[0193] With reference to the drawings, the camerawork designation screen Gs and the creation operation screen Gg as the GUI in this example will be outlined.
[0194] On the camerawork designation screen Gs, a scene window 41, a scene list display unit 42, a camerawork window 43, a camerawork list display unit 44, a parameter display unit 45, and a transmission window 46 are arranged.
[0195] In the scene window 41, for example, monitor display of the image of the generation target image section is performed, and the operator OP2 can check the content of the scene in which the free viewpoint image is generated.
[0196] For example, a list of scenes designated as the generation target image section is displayed on the scene list display unit 42. The operator OP2 can select a scene to be displayed in the scene window 41 on the scene list display unit 42.
[0197] In the camerawork window 43, the positions of the arranged imaging devices 10, the selected camerawork, a plurality of selectable cameraworks, and the like are displayed.
[0198] Here, the camerawork information is information indicating at least a movement trajectory of the viewpoint in the free viewpoint image. For example, in a case of creating an FV clip in which the position of the viewpoint, the line-of-sight direction, and the angle of view (focal length) are changed with respect to the subject for which the 3D model has been generated, the parameters necessary for defining the movement trajectory of the viewpoint, the change mode of the line-of-sight direction, and the change mode of the angle of view are the camerawork information.
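As a rough illustration, camerawork information of this kind could be represented as keyframes holding a viewpoint position and a focal length, with intermediate viewpoint positions obtained by interpolation along the movement trajectory. The class name, fields, and linear interpolation below are assumptions for illustration only, not the format used in this disclosure:

```python
from dataclasses import dataclass

@dataclass
class CameraworkKeyframe:
    position: tuple        # viewpoint position (x, y, z) in 3D space
    focal_length: float    # determines the angle of view

def interpolate(a, b, t):
    """Linearly interpolate the viewpoint position between two keyframes
    (t = 0.0 at keyframe a, t = 1.0 at keyframe b)."""
    return tuple(pa + (pb - pa) * t for pa, pb in zip(a.position, b.position))

start = CameraworkKeyframe(position=(0.0, 2.0, 10.0), focal_length=35.0)
end = CameraworkKeyframe(position=(10.0, 2.0, 10.0), focal_length=50.0)
midpoint = interpolate(start, end, 0.5)
```

A real implementation would likewise parameterize the change modes of the line-of-sight direction and the angle of view along the trajectory.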
[0199] In the camerawork window 43, at least information visualizing and indicating the movement trajectory of the viewpoint is displayed as the display of the camerawork.
[0200] The camerawork list display unit 44 displays a list of pieces of information of various types of cameraworks created and stored in advance. The operator OP2 can select and designate the camerawork to be used for FV clip generation from among the cameraworks displayed on the camerawork list display unit 44.
[0201] Various parameters related to the selected camerawork are displayed on the parameter display unit 45.
[0202] In the transmission window 46, information related to transmission of the created FV clip to the image creation controller 1 is displayed.
[0203] Next, the creation operation screen Gg will be described.
[0204] On the creation operation screen Gg, a preset list display unit 51, a camerawork list display unit 52, a camerawork window 53, an operation panel unit 54, and a preview window 55 are arranged.
[0205] The preset list display unit 51 can selectively display a preset list of cameras, a preset list of targets, and a preset list of 3D models.
[0206] The preset list of cameras is list information of the position information (position information in a three-dimensional space) of each camera, preset by the user in accordance with the camera arrangement positions at the site. As will be described later, when the preset list of cameras is selected, information indicating the position of each camera, together with its identification information (for example, camera1, camera2, ..., camera16), is displayed in list form on the preset list display unit 51.
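Such a preset list can be illustrated as a mapping from camera identification information to a preset three-dimensional position; the coordinates below are placeholders, not values from this disclosure:

```python
# Illustrative preset list: identification info mapped to a 3D position.
camera_presets = {
    f"camera{i}": (float(i), 0.0, 5.0)  # placeholder (x, y, z) coordinates
    for i in range(1, 17)
}
```

Selecting an entry from such a list would give both the identification information and the preset position to display.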
[0207] Furthermore, in the preset list of targets, the target means a target position that determines a line-of-sight direction from a viewpoint in the free viewpoint image. In the generation of the free viewpoint image, the line-of-sight direction from the viewpoint is determined to face the target.
[0208] When the preset list of targets is selected, the preset list display unit 51 displays a list of pieces of identification information about targets preset by the user and information indicating the positions of the targets.
[0209] Hereinafter, the target that determines the line-of-sight direction from the viewpoint in the free viewpoint image as described above is referred to as a “target Tg”.
[0210] The preset list of 3D models is a preset list of 3D models to be displayed as a background of the camerawork window 53, and when the preset list of 3D models is selected, the preset list display unit 51 displays a list of pieces of identification information of the preset 3D models.
[0211] The camerawork list display unit 52 can display a list of pieces of information of the camerawork created through the creation operation screen Gg and information (entry to be described later) of the camerawork to be newly created through the creation operation screen Gg.
[0212] In the camerawork window 53, at least information visualizing and indicating the movement trajectory of the viewpoint is displayed as the display of the camerawork.
[0213] The operation panel unit 54 is a region that receives various operation inputs in camerawork creation.
[0214] In the preview window 55, an observation image from a viewpoint is displayed. In a case where an operation of moving the viewpoint on the movement trajectory is performed, observation images from respective viewpoint positions on the movement trajectory are sequentially displayed in the preview window 55. In addition, as will be described later, in a case where an operation of designating a camera from the preset list of cameras is performed in a state where the preset list of cameras is displayed on the preset list display unit 51, an observation image from the arrangement position of the camera is displayed in the preview window 55 of this example.
[0215] Note that details of the camerawork designation screen Gs and a specific procedure of the camerawork designation illustrated in
4. Clip Including Free Viewpoint Image
[0216] Next, an output clip including an FV clip as a free viewpoint image will be described.
[0217]
[0218] For example, the front clip is an actual moving image in a section of time codes TC1 to TC2 in certain image data Vx among the pieces of image data V1 to V16.
[0219] Further, the rear clip is an actual moving image in a section of time codes TC5 to TC6 in certain image data Vy among the pieces of image data V1 to V16.
[0220] It is normally assumed that the image data Vx is the image data of the imaging device 10 before the start of the viewpoint movement by the FV clip, and the image data Vy is the image data of the imaging device 10 at the end of the viewpoint movement by the FV clip.
[0221] In this example, the front clip is a moving image having a time length t1, the FV clip is a free viewpoint image having a time length t2, and the rear clip is a moving image having a time length t3. The reproduction time length of the entire output clip is t1 + t2 + t3. For example, as the output clip for 5 seconds, a 1.5-second moving image, a 2-second free viewpoint image, a 1.5-second moving image, and the like can be considered.
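The reproduction time length of the whole output clip is simple arithmetic over the three sections; a minimal sketch using the 5-second example above (the function name is an assumption):

```python
def output_clip_length(t1: float, t2: float, t3: float) -> float:
    """Reproduction time length of the output clip: the front clip (t1),
    the FV clip (t2), and the rear clip (t3) joined on the time axis."""
    return t1 + t2 + t3
```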
[0222] Here, the FV clip is illustrated as a section of time codes TC3 to TC4, but this may or may not correspond to the number of frames of the actual moving image.
[0223] That is, as the FV clip, there are a case where the viewpoint is moved in a state where the time of the moving image is stopped (TC3 = TC4) and a case where the viewpoint is moved without stopping the time of the moving image (TC3 ≠ TC4).
[0224] For description, an FV clip in a case where the viewpoint is moved in a state where the time of the moving image is stopped is referred to as a “still image FV clip”, and an FV clip in a case where the viewpoint is moved without stopping the time of the moving image is referred to as a “moving image FV clip”.
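The distinction between the two clip types reduces to whether the two time codes coincide. A minimal sketch, with the function name and the frame-count representation of time codes as assumptions:

```python
def fv_clip_kind(tc3: int, tc4: int) -> str:
    """Classify an FV clip by its time-code span (time codes as frame counts).

    TC3 == TC4: the viewpoint moves while the time of the moving image is
    stopped (still image FV clip).
    TC3 != TC4: the viewpoint moves without stopping the time of the moving
    image (moving image FV clip).
    """
    if tc4 < tc3:
        raise ValueError("end time code precedes start time code")
    return "still image FV clip" if tc3 == tc4 else "moving image FV clip"
```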
[0225]
[0226] That is, this is the case of generating a free viewpoint image in which the viewpoint moves with respect to the still image of one frame of the frame F82.
[0227] Meanwhile, the moving image FV clip is as illustrated in
[0228] That is, this is a case where a free viewpoint image in which the viewpoint moves is generated for a moving image in a section of a plurality of frames from frame F102 to frame F302.
[0229] Therefore, the generation target image section determined by the image creation controller 1 is a section of one frame of the frame F82 in the case of creating the still image FV clip of
[0230]
[0231] In
[0232] For example, the output clip including the FV clip is generated in this manner and used as an image to be broadcast.
5. Clip Creation Process
[0233] Hereinafter, a processing example of output clip creation performed in the image processing system of
[0234] First, a flow of processing including operations of the operators OP1 and OP2 will be described with reference to
Step S1: Scene Selection
[0235] When an output clip is created, first, the operator OP1 selects a scene to be an FV clip. For example, the operator OP1 searches for a scene desired to be an FV clip while monitoring the captured image displayed on the display unit 77 on the image creation controller 1 side. Then, a generation target image section of one or a plurality of frames is selected.
[0236] The information of the generation target image section is transmitted to the free viewpoint image server 2, and the operator OP2 can recognize the generation target image section by the GUI on the display unit 77 on the free viewpoint image server 2 side.
[0237] Specifically, the information on the generation target image section is information on the time codes TC3 and TC4 in
Step S2: Scene Image Transfer Instruction
[0238] In response to the designation of the generation target image section, the operator OP2 performs an operation of giving an instruction to transfer the image of the corresponding scene. In response to this operation, the free viewpoint image server 2 transmits a transfer request for image data in the section of the time codes TC3 and TC4 to the image creation controller 1.
Step S3: Synchronous Cutout
[0239] In response to the image data transfer request, the image creation controller 1 controls the video servers 4A, 4B, 4C, and 4D, and causes the video servers to cut out the section of the time codes TC3 and TC4 for each of the sixteen systems of image data from the image data V1 to the image data V16.
Step S4: NAS Transfer
[0240] Then, the image creation controller 1 transfers the data in the section of the time codes TC3 and TC4 of all the pieces of image data V1 to V16 to the NAS 5.
Step S5: Thumbnail Display
[0241] The free viewpoint image server 2 displays the thumbnails of the pieces of image data V1 to V16 in the section of the time codes TC3 and TC4 transferred to the NAS 5.
Step S6: Scene Check
[0242] The operator OP2 checks the scene content of the section indicated by the time codes TC3 and TC4 on the camerawork designation screen Gs by the free viewpoint image server 2.
Step S7: Select Camerawork
[0243] The operator OP2 selects (designates) the camerawork considered to be appropriate on the camerawork designation screen Gs according to the scene content.
Step S8: Generation Execution
[0244] After selecting the camerawork, the operator OP2 performs an operation to execute generation of the FV clip.
Step S9: Modeling
[0245] The free viewpoint image server 2 generates a 3D model of the subject, performs subject analysis, and the like, using the frame data in the section of the time codes TC3 and TC4 in each of the pieces of image data V1 to V16, and parameters such as the arrangement position of each imaging device 10 input in advance.
Step S10: Rendering
[0246] The free viewpoint image server 2 generates a free viewpoint image on the basis of the 3D model data and the subject analysis information. At this time, a free viewpoint image is generated so that the viewpoint movement based on the camerawork selected in step S7 is performed.
Step S11: Transfer
[0247] The free viewpoint image server 2 transfers the generated FV clip to the image creation controller 1. At this time, not only the FV clip but also the designation information of the front clip and the rear clip and the designation information of the time lengths of the front clip and the rear clip can be transmitted as accompanying information.
Step S12: Quality Check
[0248] Note that, on the free viewpoint image server 2 side, the quality check by the operator OP2 can be performed before or after the transfer in step S11. That is, the free viewpoint image server 2 reproduces and displays the generated FV clip on the camerawork designation screen Gs so that the operator OP2 can check the FV clip. In some cases, it is also possible that the operator OP2 performs the generation of the FV clip again without executing the transfer.
Step S13: Playlist Generation
[0249] The image creation controller 1 generates an output clip using the transmitted FV clip. In this case, one or both of the front clip and the rear clip are coupled to the FV clip on the time axis to generate the output clip.
[0250] The output clip may be generated as stream data in which each frame as the front clip, each frame virtually generated as the FV clip, and each frame as the rear clip are actually connected in time series. However, in this processing example, the frames are virtually connected as the playlist.
[0251] That is, the playlist is generated such that the FV clip is reproduced following the reproduction of the frame section as the front clip, and the frame section as the rear clip is reproduced thereafter, so that the output clip can be reproduced without generating stream data in which the clips are actually connected.
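The playlist idea above can be sketched as an ordered list of in/out entries that a player follows in sequence, so no physically concatenated stream has to be rendered. The `PlaylistEntry` name, the source identifiers, and the frame-based time codes below are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PlaylistEntry:
    source: str  # e.g. "Vx" (front clip), "FV" (FV clip), "Vy" (rear clip)
    tc_in: int   # in-point time code, in frames
    tc_out: int  # out-point time code, in frames

def build_output_playlist(front: PlaylistEntry, fv: PlaylistEntry,
                          rear: PlaylistEntry) -> List[PlaylistEntry]:
    """Virtually connect the front clip, FV clip, and rear clip on the time
    axis: playback simply follows the entry order, so no stream data in
    which the clips are actually concatenated needs to be generated."""
    return [front, fv, rear]

def total_frames(playlist: List[PlaylistEntry]) -> int:
    """Total reproduction length of the output clip, in frames."""
    return sum(e.tc_out - e.tc_in for e in playlist)
```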
Step S14: Quality Check
[0252] The GUI on the image creation controller 1 side performs reproduction based on the playlist, and the operator OP1 checks the content of the output clip.
Step S15: Reproduction Instruction
[0253] The operator OP1 gives a reproduction instruction by a predetermined operation according to the quality confirmation. The image creation controller 1 recognizes the input of the reproduction instruction.
Step S16: Reproduction
[0254] In response to the reproduction instruction, the image creation controller 1 supplies the output clip to the switcher 6. As a result, the broadcast of the output clip can be executed.
6. Camera Variation Detection
[0255] Since a 3D model is generated using the image data V1, V2, ..., V16 in order to generate a free viewpoint image, parameters including the position information of each imaging device 10 are important.
[0256] For example, in a case where the position of a certain imaging device 10 is moved or the imaging direction is changed in the panning direction, the tilt direction, or the like in the middle of broadcasting, it is necessary to calibrate the parameter corresponding thereto. Therefore, in the image processing system of
[0257] A processing procedure of the image creation controller 1 and the utility server 8 at the time of detecting the variation of the camera will be described with reference to
Step S30: HD Output
[0258] The image creation controller 1 controls the image conversion unit 7 to output image data from the video servers 4A, 4B, 4C, and 4D for camera movement detection. The images from the video servers 4A, 4B, 4C, and 4D, that is, the images of the sixteen imaging devices 10 are subjected to resolution conversion by the image conversion unit 7 and supplied to the utility server 8.
Step S31: Background Generation
[0259] The utility server 8 generates a background image on the basis of the supplied image. Since the background image is an image that does not change unless there is a change in the camera, for example, a background image excluding a subject such as a player is generated for image data of sixteen systems (V1 to V16).
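One common way to build such a background image, which this disclosure does not specify, is a per-pixel temporal median over a stack of frames: a moving subject such as a player occupies any given pixel only briefly, so the median suppresses it. A sketch under that assumption:

```python
import numpy as np

def estimate_background(frames: np.ndarray) -> np.ndarray:
    """Estimate a static background from a stack of frames by taking the
    per-pixel median over time.

    frames: array of shape (T, H, W) or (T, H, W, C), one entry per frame.
    Returns an array of shape (H, W) or (H, W, C).
    """
    return np.median(frames, axis=0)
```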
Step S32: Difference Check
[0260] The background image is displayed as a GUI so that the operator OP2 can check a change in the image.
Step S33: Automatic Variation Detection
[0261] It is also possible to automatically detect a variation of the camera by performing comparison processing on the background image at each time point.
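The comparison processing could, for instance, flag a camera variation when a sufficient fraction of background pixels differ beyond a threshold between two time points. Both threshold values below are illustrative assumptions, not values from this disclosure:

```python
import numpy as np

def camera_moved(bg_ref: np.ndarray, bg_now: np.ndarray,
                 pixel_thresh: float = 25.0, area_ratio: float = 0.05) -> bool:
    """Flag a camera variation when the fraction of background pixels whose
    absolute difference exceeds pixel_thresh is larger than area_ratio.
    Thresholds are hypothetical tuning parameters."""
    diff = np.abs(bg_now.astype(np.float64) - bg_ref.astype(np.float64))
    changed_fraction = np.mean(diff > pixel_thresh)
    return bool(changed_fraction > area_ratio)
```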
Step S34: Camera Variation Detection
[0262] As a result of step S33 or step S32, a variation of a certain imaging device 10 is detected.
Step S35: Image Acquisition
[0263] Calibration is required in response to detection of a variation in the imaging device 10. Therefore, the utility server 8 requests the image creation controller 1 for the image data in the changed state.
Step S36: Clip Cutout
[0264] The image creation controller 1 controls the video servers 4A, 4B, 4C, and 4D in response to a request for image acquisition from the utility server 8, and causes the video servers to execute clip cutout for the image data V1 to V16.
Step S37: NAS Transfer
[0265] The image creation controller 1 controls the video servers 4A, 4B, 4C, and 4D to transfer the image data cut out as a clip to the NAS 5.
Step S38: Feature Point Correction
[0266] By the transfer to the NAS 5, the utility server 8 can refer to and display the image in the state after the camera is changed. The operator OP2 performs an operation necessary for calibration such as feature point correction.
Step S39: Recalibration
[0267] The utility server 8 re-executes the calibration for creating the 3D model using the image data (V1 to V16) in the state after the camera variation.
Step S40: Background Reacquisition
[0268] After the calibration, in response to the operation of the operator OP2, the utility server 8 requests reacquisition of image data for the background image.
Step S41: Clip Cutout
[0269] The image creation controller 1 controls the video servers 4A, 4B, 4C, and 4D in response to a request for image acquisition from the utility server 8, and causes the video servers to execute clip cutout for the image data V1 to V16.
Step S42: NAS Transfer
[0270] The image creation controller 1 controls the video servers 4A, 4B, 4C, and 4D to transfer the image data cut out as a clip to the NAS 5.
Step S43: Background Generation
[0271] The utility server 8 generates a background image using the image data transferred to the NAS 5. This is, for example, a background image serving as a reference for subsequent camera variation detection.
[0272] By performing camera variation detection and calibration as in the above procedure, even in a case where the position or the imaging direction of the imaging device 10 is changed during broadcasting, the parameter is corrected accordingly, so that an accurate FV clip can be continuously generated.
7. GUI for Creating Camerawork
[0273] Hereinafter, details of the creation operation screen Gg illustrated in
[0274]
[0275] As described above, the preset list display unit 51, the camerawork list display unit 52, the camerawork window 53, the operation panel unit 54, and the preview window 55 are arranged on the creation operation screen Gg.
[0276] As illustrated, the preset list display unit 51 is provided with a camera button B1, a target button B2, and a 3D model button B3. The camera button B1 is a button for giving an instruction to display the preset list of cameras described above on the preset list display unit 51, and the target button B2 and the 3D model button B3 are buttons for giving an instruction to display the preset list of targets described above and the preset list of 3D models as the background on the preset list display unit 51.
[0277] In the drawing, an underline mark is illustrated in the camera button B1, which means that the preset list display of the camera is selected.
[0278] The preset list display unit 51 is provided with a folder reference button B4. By operating the folder reference button B4, the user can refer to a folder storing data desired to be displayed as a list on the preset list display unit 51.
[0279] A new creation button B5 is provided in the camerawork list display unit 52. The user can issue an instruction to add a new entry of camerawork by operating the new creation button B5. The added entry of the camerawork is displayed on the camerawork list display unit 52.
[0280] The camerawork window 53 is provided with an X viewpoint button B6, a Y viewpoint button B7, a Z viewpoint button B8, a Ca viewpoint button B9, and a Pe viewpoint button B10. Each of these viewpoint buttons is a button for instructing an observation viewpoint for an object to be displayed in the camerawork window 53. Specifically, the X viewpoint button B6, the Y viewpoint button B7, and the Z viewpoint button B8 are buttons for respectively instructing the viewpoint on the X axis, the viewpoint on the Y axis, and the viewpoint on the Z axis as viewpoints for observing the visualization information of the camerawork information displayed in the camerawork window 53, and the Pe viewpoint button B10 is a button for instructing the shift to a mode for changing the observation viewpoint of the visualization information of the camerawork information to an arbitrary position. The Ca viewpoint button B9 is a button for giving an instruction to display an image obtained by observing the target three-dimensional space from the viewpoint movement trajectory defined as the camerawork information. Note that, for images of the X-axis viewpoint, the Y-axis viewpoint, the Z-axis viewpoint, the Pe viewpoint, and the Ca viewpoint, refer to
[0281] Here, in the creation operation screen Gg, the display image in the camerawork window 53 or the preview window 55 can be enlarged or reduced according to a predetermined operation such as a wheel operation of a mouse, for example. Furthermore, in the camerawork window 53 and the preview window 55, the display image can be scrolled according to a predetermined operation such as a drag operation, for example. Note that the enlargement, reduction, and scrolling of the display image can be performed according to an operation of a button provided on the screen.
[0282] The operation panel unit 54 is provided with a reproduction button B11, a pause button B12, a stop button B13, a timeline operation unit 54a, a speed adjustment operation unit 56, and a trajectory shape adjustment operation unit 57.
[0283] The reproduction button B11, the pause button B12, and the stop button B13 are buttons for respectively instructing reproduction, pause, and stop of the visualization image of the camerawork information displayed in the camerawork window 53 and the observation image from the viewpoint displayed in the preview window 55. The reproduction button B11, the pause button B12, and the stop button B13 are enabled at least when the information on the movement trajectory of the viewpoint is determined as the camerawork information.
[0284] The timeline operation unit 54a is a region that receives an operation related to the creation of the camerawork on the timeline representing the movement period of the viewpoint of the free viewpoint image. Examples of the operation on the timeline operation unit 54a include an operation of dragging and dropping one of the cameras listed on the preset list display unit 51 to an arbitrary position on the timeline (that is, an arbitrary time point within the viewpoint movement period) (see
[0285] Various operation buttons for adjusting the moving speed of the viewpoint are arranged in the speed adjustment operation unit 56. In the trajectory shape adjustment operation unit 57, various operation buttons for adjusting the shape of the movement trajectory of the viewpoint are arranged.
[0286] The speed adjustment operation unit 56 and the trajectory shape adjustment operation unit 57 will be described later.
[0287] A camerawork creation procedure and various functions related to the camerawork creation will be described.
[0288] First, the user (operator OP2 in this example) performs an operation for acquiring a preset list of cameras as illustrated in
[0289] The preset list of cameras is acquired by operating the folder reference button B4 to designate the corresponding folder.
[0290] When a folder is designated, the display processing unit 34a performs processing of displaying a preset list of cameras according to the data content of the designated folder on the preset list display unit 51.
[0291] At the same time, the display processing unit 34a performs processing of displaying information visually indicating the arrangement of each camera on the camerawork window 53 on the basis of the acquired position information of the camera. Specifically, processing of displaying a camera position mark Mc indicating the position of each camera is performed.
[0292] Note that, regarding the display of the camera position mark Mc, display for identifying each camera by color coding may be performed. For example, each camera can be color-coded in the preset list of cameras, and each camera position mark Mc can be displayed with the same color coding in the camerawork window 53.
[0293] Furthermore, in the camerawork window 53, it is also conceivable to display the identification information of the camera (for example, camera1, camera2, ..., cameraX, and the like) for the camera position mark Mc over which the mouse is hovered.
[0294] In this example, the 3D model displayed as the background can be changed in the camerawork window 53.
[0295] Describing the change of the background 3D model with reference to
[0296]
[0297] In starting the creation of the camerawork, the user operates a new creation button B5 as illustrated in
[0298] In response to the operation of the new creation button B5, the display processing unit 34a displays a new entry of camerawork on the camerawork list display unit 52. In this entry, an operation unit for designating the In-camera as a start point of the viewpoint movement and the Out-camera as an end point of the viewpoint movement is displayed.
[0299] In this example, in the generation of the free viewpoint image, since the texture based on the image captured by the camera is pasted to the 3D model of the subject, it is desirable to create a movement trajectory that passes through the camera positions as much as possible as the movement trajectory of the viewpoint. In particular, since the start point and the end point of the viewpoint movement in the free viewpoint image are switching points of the image with the front and rear clips, the start point and the end point of the viewpoint movement should coincide with camera positions. Therefore, the start point and the end point of the viewpoint movement are designated as the camera positions of the In-camera and the Out-camera, respectively.
[0300] Note that the movement start point and the movement end point of the viewpoint are not necessarily limited to the camera position, and may be any position other than the camera position.
[0301] As illustrated in the drawing, the operation unit for designating the In-camera and the Out-camera is, for example, an operation unit for designating the camera in a pull-down format. When pull-down is instructed by a user operation, information (in this example, the camera number information) indicating each camera listed in a preset list of specifiable cameras, that is, cameras designated by the user is displayed (see
[0302] Furthermore, according to the operation of the new creation button B5, the new entry of the camerawork is displayed in the camerawork list display unit 52 as described above, and a mark (hereinafter referred to as a “target mark Mt”) indicating the position of the target Tg set by the user is displayed in the camerawork window 53.
[0303] Here, in the camerawork creation work, the position of the target Tg is set to an appropriate position assumed for the target scene. For example, in a case where it is desired to generate an image of a shooting scene as a free viewpoint image, the position is set near the goal in the target three-dimensional space (for example, a soccer ground in the case of soccer). Here, it is assumed that the position of the target Tg can be set in the free viewpoint image server 2 in advance by the user.
[0304] As described above, the free viewpoint image can be generated such that the line-of-sight direction from the viewpoint faces the target Tg. Specifically, the free viewpoint image of this example can be generated such that the target Tg continues to be located at a predetermined position (for example, the center position) in the image frame within at least a partial period during the movement period of the viewpoint.
[0305] Note that, in the generation of the free viewpoint image, continuing to position the target Tg at a predetermined position in the image frame as described above is expressed as “following the target Tg”. This “following the target Tg” is synonymous with the fact that the line-of-sight direction from the viewpoint continues to face the target Tg while the viewpoint is moving.
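"Following the target Tg" thus amounts to keeping the line-of-sight vector pointed from the current viewpoint position toward Tg at every moment of the viewpoint movement. A minimal sketch (the function name is an assumption):

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def line_of_sight(viewpoint: Vec3, target: Vec3) -> Vec3:
    """Unit vector from the viewpoint toward the target Tg. Orienting the
    virtual camera along this vector at each viewpoint position keeps the
    target at a fixed position in the image frame."""
    dx, dy, dz = (target[i] - viewpoint[i] for i in range(3))
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    if n == 0.0:
        raise ValueError("viewpoint coincides with target")
    return (dx / n, dy / n, dz / n)
```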
[0306] Here, in a state in which a preset list of cameras is displayed in the camerawork window 53, an observation image from a field of view or a viewpoint (an image obtained by observing a three-dimensional space from the viewpoint) in a case where the viewpoint is set at the position of the designated camera is displayed in the camerawork window 53 and the preview window 55, respectively, according to the designation operation of the camera from the preset list of cameras.
[0307] Specifically,
[0308] In this case, the camerawork window 53 displays field-of-view information Fv that visualizes and indicates the field of view from the camera for the camera designated from the preset list of cameras. As illustrated in the drawing, in this example, information representing the field of view as a figure is displayed as the field-of-view information Fv.
[0309] Furthermore, in the camerawork window 53 in this case, the camera position mark Mc for the designated camera is highlighted more than the camera position marks Mc for other cameras (in the figure, an example of increasing the size is illustrated), so that the user can easily grasp at which position the designated camera is located.
[0310] In the preview window 55, an image obtained by observing a three-dimensional space from a designated camera is displayed.
[0311] Here, in this example, it is assumed that the camerawork creation work is performed prior to the generation of the free viewpoint image. That is, it is assumed that the captured images used to generate the free viewpoint image have not yet been acquired. Therefore, the image obtained by observing the three-dimensional space from the viewpoint mentioned here is not an image obtained by rendering, as a two-dimensional image, a 3D model (hereinafter described as a “real three-dimensional model” for the sake of description) generated by performing object detection or the like from the images captured by the cameras that capture the target real space, but is an image obtained by rendering, as a two-dimensional image, a virtual 3D model (referred to as a “virtual three-dimensional model”) that simulates the target real space. In the process of generating the observation image from the viewpoint in this case, since the captured images have not been acquired, the process of pasting a texture generated from the captured images to the 3D model is not performed.
[0312]
[0313] As illustrated in
[0314]
[0315] Furthermore, in the camerawork window 53, the camera position mark Mc for camera1 is highlighted, and the field-of-view information Fv for camera1 is displayed.
[0316] Note that, in comparison with
[0317] In the preview window 55, an image obtained by observing a three-dimensional space from camera1 is displayed.
[0318] In a case where the Out-camera is designated, as illustrated in
[0319]
[0320] Here, the movement trajectory of the viewpoint is determined by designating the In-camera and the Out-camera. Therefore, in the camerawork window 53 in this case, information indicating the movement trajectory of the viewpoint connecting the positions of the In-camera and the Out-camera, which is indicated as the movement trajectory information Mm in the figure, is displayed. Here, the movement trajectory information Mm is information obtained by visualizing the movement trajectory of the viewpoint.
[0321] Specifically, in the camerawork window 53 in this case, the camera position mark Mc for the camera9 designated as the Out-camera is highlighted, and linear movement trajectory information Mm connecting the positions of the In-camera and the Out-camera is additionally displayed from the case of
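The linear movement trajectory connecting the In-camera and Out-camera positions can be sketched as straightforward linear interpolation between the two positions; the sampling scheme below is an assumption for illustration:

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def linear_trajectory(in_cam: Vec3, out_cam: Vec3, steps: int) -> List[Vec3]:
    """Sample the straight-line movement trajectory of the viewpoint from the
    In-camera position to the Out-camera position, inclusive of both ends."""
    if steps < 2:
        raise ValueError("need at least the start and end points")
    points = []
    for k in range(steps):
        t = k / (steps - 1)
        points.append(tuple(a + (b - a) * t for a, b in zip(in_cam, out_cam)))
    return points
```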
[0322] Although illustration is omitted, in the creation operation screen Gg, the preview of the camerawork can be displayed after at least the In-camera and the Out-camera are designated as described above and the movement trajectory of the viewpoint is formed.
[0323] Specifically, this preview display can be instructed to start by operating the reproduction button B11 on the operation panel unit 54. In the camerawork window 53, as the preview display of the camerawork, an image in which the field-of-view information Fv changes from moment to moment with movement of the viewpoint is displayed. In addition, in conjunction with such preview display of the camerawork, in the preview window 55, an observation image in a three-dimensional space (observation image from the viewpoint) that changes from moment to moment with movement of the viewpoint is displayed.
[0324] Furthermore, in this example, such preview display of the camerawork and preview display of the observation image in the three-dimensional space can be performed not only by the operation of the reproduction button B11 but also by the drag operation of the seek bar B17 in the timeline operation unit 54a.
[0325]
[0326] While the drag operation of the seek bar B17 is being performed, the position of the seek bar B17 on the timeline, that is, on the time axis from the start timing to the end timing of the free viewpoint image changes from moment to moment. Therefore, while the drag operation is being performed, the field-of-view information Fv corresponding to the viewpoint position at the timing indicated by the seek bar B17 is sequentially displayed in the camerawork window 53, and is visually recognized by the user as an image in which the field-of-view information Fv changes from moment to moment with movement of the viewpoint. Similarly, in the preview window 55, an observation image of the three-dimensional space from a viewpoint changing from moment to moment is displayed according to the movement of the seek bar B17.
[0327] Next, the via-point of the viewpoint will be described.
[0328] On the creation operation screen Gg, it is possible to designate a via-point of the viewpoint and a timing at which the viewpoint passes through the via-point.
[0329]
[0330] In this example, the via-point and the timing at which the viewpoint passes through the via-point can be designated by an operation of dragging and dropping a camera desired to be designated as the via-point on the timeline in the timeline operation unit 54a.
[0331]
[0332] First, as illustrated in
[0333] The camera selected in this manner is dragged on the screen as illustrated in
[0334] In response to the completion of the designation, as illustrated in
[0335] In this example, the via-point mark Mv is displayed as a square mark in the initial state as illustrated in the drawing.
[0336] In addition, in a state in which the designation of the via-point and the via-timing is completed, the camerawork window 53 highlights the camera position mark Mc for the camera (camera6 in this case) designated as the via-point, and displays the field-of-view information Fv indicating the field of view from the camera. Furthermore, the preview window 55 displays an image obtained by observing the three-dimensional space from the viewpoint of the camera position designated as the via-point.
[0337] By selecting a camera from the preset list and dragging and dropping the selected camera, as described above, it is possible to designate a via-point of the viewpoint and a timing at which the viewpoint passes through the via-point.
[0338]
[0339] Note that, although the example in which the camera position is designated as the via-point of the viewpoint has been described here, an arbitrary position other than the camera position can be designated as the via-point.
[0340] Furthermore, on the creation operation screen Gg, it is possible to designate the type of the shape of the movement trajectory of the viewpoint.
[0341] With reference to
[0342] First, as illustrated in
[0343] An operation button provided in the trajectory shape adjustment operation unit 57 is operated to designate the type of the shape of the movement trajectory. For example, as illustrated in
[0344] In response to the operation of the curve interpolation button B18, the camerawork generation processing unit 34 performs curve interpolation of the movement trajectory for the partial designated range of the viewpoint movement trajectory designated in
[0345] By forming the movement trajectory of the viewpoint into the curved shape in this manner, it is possible to prevent the distance from the target subject to the viewpoint from greatly changing even if the viewpoint moves. In other words, the size of the target subject in the free viewpoint image can be prevented from greatly changing.
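As a non-limiting illustration (the embodiment does not specify a particular interpolation method; Catmull-Rom is one common choice that passes through the via-points themselves), the curve interpolation of the movement trajectory between via-points could be sketched as follows:

```python
# Hypothetical sketch: Catmull-Rom spline segment between via-points
# p1 and p2, using neighbors p0 and p3 to shape the curve.
def catmull_rom(p0, p1, p2, p3, t):
    """Point between p1 and p2 at parameter t in [0, 1]."""
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t ** 2
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )
```

Because the spline interpolates its control points, the viewpoint still passes exactly through each designated via-point while moving along a smooth curve between them.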
[0346] In this example, in a case where the curve interpolation as described above is performed, processing of changing the shape of the via-point mark Mv displayed on the timeline in the timeline operation unit 54a to a shape corresponding to the curve interpolation is performed. Specifically, in this example, as illustrated in the figure, the shape of the via-point mark Mv is changed from a square mark to a round mark. As a result, it is possible to notify the user that curve interpolation is being performed for the movement trajectory of the viewpoint connecting the via-points also on the timeline.
[0347] Note that, in this example, an operation button for instructing to make the movement trajectory shape linear is arranged in the trajectory shape adjustment operation unit 57, and when the operation button is operated, the shape of the corresponding via-point mark Mv is changed to a square mark.
[0348] Here, as the movement trajectory shape, shapes other than a curve and a straight line can be set. For example, as illustrated in the movement trajectory information Mm of
[0349] In addition, in a case where there is a variation in the movement trajectory shape as described above, the via-point mark Mv may be displayed not only in the two types of display forms of the straight line and the curved line as illustrated above but also in different display forms corresponding to each variation.
[0350] Furthermore, in the creation operation screen Gg of this example, it is possible to designate the moving speed of the viewpoint.
[0351] In the designation of the moving speed, first, as illustrated in
[0352] In response to the operation of the speed adjustment button B19, the camerawork generation processing unit 34 adjusts the speed of the viewpoint corresponding to the operated button for a partial range of the designated viewpoint movement trajectory.
[0353] In this example, in response to such speed adjustment, the display processing unit 34a performs processing of changing the shape of the corresponding via-point mark Mv on the timeline to a shape according to the mode of the executed speed adjustment, as illustrated in
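For illustration only (the specific speed-adjustment modes and their names here are hypothetical, not taken from the embodiment), speed adjustment over a partial range of the viewpoint movement trajectory can be modeled as a remapping of normalized time, i.e., an easing function:

```python
# Hypothetical sketch: remap normalized time t in [0, 1] to realize
# simple viewpoint speed-adjustment modes over a trajectory range.
def ease(t, mode="linear"):
    """Return the eased progress value for input time t."""
    if mode == "slow_in":      # viewpoint starts slowly, then speeds up
        return t * t
    if mode == "slow_out":     # viewpoint slows down toward the end
        return 1.0 - (1.0 - t) ** 2
    return t                   # constant speed
```

Feeding the eased value into the trajectory interpolation leaves the spatial path unchanged while altering how quickly the viewpoint traverses it.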
[0354] Next, adjustment of the target position will be described.
[0355] First, the significance of the target Tg will be described with reference to
[0356] As described above, the target Tg is used to determine the line-of-sight direction from the viewpoint in the free viewpoint image.
[0357] As described above, in the generation of the free viewpoint image in this example, it is possible to designate the period of facing the target Tg in the movement period of the viewpoint. In other words, it is possible to designate a period during which the target Tg continues to be positioned at a predetermined position in the image frame as a period for following the target Tg. In this example, the following of the target Tg is performed such that the target Tg continues to be positioned at the center position in the image frame as illustrated in
[0358]
[0359] In this example, the position of the target Tg can be adjusted on the creation operation screen Gg. As the adjustment operation of the target Tg, for example, it is conceivable to perform an operation of adjusting the position of the target mark Mt displayed in the camerawork window 53.
[0360] When the position of the target Tg is changed by adjustment, the camerawork generation processing unit 34 sets the line-of-sight direction Dg and the field of view Rf at each viewpoint position so as to keep the changed target Tg at a predetermined position in the image frame.
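As a minimal sketch (assuming only that the line-of-sight direction Dg is the unit vector from the viewpoint toward the target Tg, which is what keeps the target at the center of the image frame; the function name is hypothetical):

```python
import math

# Hypothetical sketch: line-of-sight direction keeping the target Tg
# centered in the image frame at a given viewpoint position.
def look_at(viewpoint, target):
    """Unit vector from the viewpoint toward the target."""
    d = [tv - vv for vv, tv in zip(viewpoint, target)]
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)
```

Recomputing this direction at every position on the viewpoint movement trajectory after the target position changes corresponds to the update performed by the camerawork generation processing unit 34.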
[0361] Here, in the creation operation screen Gg of this example, it is possible to perform designation of moving the position of the target Tg with the lapse of time during the viewpoint movement period.
[0362] A specific operation procedure of the movement designation of the target Tg will be described with reference to
[0363] First, as illustrated in
[0364] By such an operation, the new point Ptn of the target Tg is designated.
[0365] Next, the user operates the target button B2 provided on the preset list display unit 51 to bring the preset list display unit 51 into a display state of the list of target Tg. In the display state, as illustrated in
[0366] In response to the operation of the addition button B20, identification information (“Target0” in the drawing) and position information (position information indicating the new point Ptn) about the added target Tg are displayed on the preset list display unit 51 as illustrated in the drawing. Furthermore, in the camerawork window 53, an additional target mark Mtt as a mark representing the added target Tg is displayed at the position of the target position designation mark Mtn.
[0367] Next, the user performs an operation of adding a new target to the timeline as illustrated as transitions from
[0368] The arrival target timing mark Mem as illustrated in
[0369] After performing the operation of adding a new target to the timeline as described above, the user performs an operation of designating a period of time to face the target Tg as illustrated in
[0370] In designating the period of facing the target Tg, first, as illustrated in
[0371]
[0372] In this case, the movement of the target Tg is performed so as to reach the new point Ptn at the timing indicated by the arrival target timing mark Mem on the timeline within the period from the start to the end of the movement of the viewpoint. Therefore, the target mark Mt gradually approaches the target position designation mark Mtn (the additional target mark Mtt in the camerawork window 53) as illustrated in
[0373] In this example, the period of facing the target Tg is designated as the entire period from the start to the end of movement of the viewpoint. That is, a period exceeding the period until the timing indicated by the arrival target timing mark Mem is designated as the period of facing the target Tg.
[0374] In this example, in a case where the period of facing the target Tg is designated as a period exceeding the period up to the arrival target timing mark Mem, the position of the target Tg is gradually returned from the new point Ptn to the movement start position during the portion of the period that exceeds the arrival target timing mark Mem.
[0375] Therefore, in a period exceeding the period up to the arrival target timing mark Mem in the period of facing the target Tg, the target mark Mt gradually approaches the target initial position mark Mst indicating the movement start position with the lapse of time as illustrated in
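The go-and-return movement of the target described above can be sketched as follows (illustrative only; the linear motion and the function signature are assumptions, not details of the embodiment):

```python
# Hypothetical sketch: target Tg moves from its start position to the
# new point Ptn by the arrival timing t_arrive (normalized), then
# gradually returns to the start position over the remaining period.
def target_position(start, new_point, t, t_arrive):
    """Target position at normalized time t in [0, 1]."""
    if t <= t_arrive:
        f = t / t_arrive if t_arrive > 0 else 1.0
    else:
        f = (1.0 - t) / (1.0 - t_arrive) if t_arrive < 1.0 else 1.0
    return tuple(s + (n - s) * f for s, n in zip(start, new_point))
```

Here f is the blend factor toward the new point: it rises to 1 at the arrival timing and falls back to 0 at the end of the viewpoint movement period, matching the behavior of the target mark Mt on the screen.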
[0376] By enabling the movement of the target Tg as described above, the degree of freedom in creating a free viewpoint image can be improved as compared with a case where the position of the target Tg is fixed.
[0377] In the above description, in a case where designation to move the position of the target Tg is performed, a case where the period exceeding the period up to the arrival target timing mark Mem is designated as the period of facing the target Tg has been illustrated. However, as illustrated in
[0378] Furthermore, in the above description, an example has been described in which one target Tg is added as the addition of the new target Tg using the target position designation mark Mtn, but a plurality of targets Tg can also be added.
[0379] Then, in that case, as illustrated in
[0380] In this case, in the period indicated by the period designation bar B22-1 within the movement period of the viewpoint, a free viewpoint image in which the position of the target Tg gradually moves from the initial position (position of the target Tg at the viewpoint movement start time point) to the position of the target Tg-1 is generated, and in the period indicated by the period designation bar B22-2, for example, a free viewpoint image in which the position of the target Tg gradually moves from the initial position to the position of the target Tg-2 is generated.
[0381] Here, as understood from the description of
[0382] As a result, it is possible to generate a free viewpoint image that follows the target A in a certain period and follows the target B in another period within the viewpoint movement period, so that the target to be followed can be set freely.
[0383] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0384] Note that, in the above description, as the designation of the position of the target Tg, an example of designating the position of the target Tg as the movement destination point in a case where the position of the target Tg is moved during the viewpoint movement period has been described. However, the designation of the position of the target Tg can also be performed as designation of the position of the target Tg that is not moved during the viewpoint movement period.
[0385] Processing related to generation and display of a movement trajectory according to an operation input on the creation operation screen Gg will be described with reference to the flowcharts of
[0386] Note that the processing illustrated in
[0387]
[0388] First, in step S101, the CPU 71 waits for the designation operation of the In-camera. In this example, this designation operation is performed as an operation of designating the camera number described in the pull-down list of In-cameras in the entry of the camerawork displayed on the camerawork list display unit 52 as illustrated in
[0389] In a case where the designation operation of the In-camera is performed, the CPU 71 performs various types of display processing related to the In-camera as described with reference to
[0390] In step S103 subsequent to step S102, the CPU 71 waits for the designation operation of the Out-camera (see the description of
[0391] In step S104, the CPU 71 performs processing of generating a viewpoint movement trajectory connecting the In-camera and the Out-camera.
[0392] Then, in subsequent step S105, the CPU 71 executes processing of displaying the Out-camera and the viewpoint movement trajectory. That is, various display processes related to the Out-camera and the display process of the movement trajectory information Mm of the viewpoint as described with reference to
[0393] The CPU 71 terminates the series of processing illustrated in
[0394]
[0395] In step S110, the CPU 71 waits for the designation operation of the via-point. In this example, the designation operation is a series of operations including the operation on the timeline as described with reference to
[0396] When the via-point is designated, the CPU 71 generates a viewpoint movement trajectory through the designated point in step S111. That is, a movement trajectory of the viewpoint connecting the In-camera, the designated point, and the Out-camera is generated.
[0397] Then, in subsequent step S112, the CPU 71 performs processing of displaying the via-point and the viewpoint movement trajectory. That is, for the designated via-point, processing for highlighting the camera position mark in the camerawork window 53, displaying the field-of-view information Fv, and displaying the via-point mark Mv in the timeline as described in
[0398] The CPU 71 terminates the series of processing illustrated in
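The trajectory generation of steps S110 to S111 can be illustrated by a minimal sketch (the list-based waypoint representation and the function name are assumptions for illustration, not the actual data structures of the embodiment):

```python
# Hypothetical sketch of step S111: build a viewpoint movement
# trajectory connecting the In-camera, the designated via-points
# (ordered by their via-timing on the timeline), and the Out-camera.
def build_trajectory(in_camera, out_camera, via_points):
    """Ordered waypoint list; via_points are (timing, position) pairs."""
    ordered = sorted(via_points, key=lambda vp: vp[0])
    return [in_camera] + [pos for _, pos in ordered] + [out_camera]
```

Sorting by the via-timing ensures that a via-point dropped later on the timeline but at an earlier timing is still traversed in timeline order.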
8. GUI for Creating Free Viewpoint Image
[0399] Next, details of the camerawork designation screen Gs illustrated in
[0400]
[0401] As described above, the camerawork designation screen Gs is provided with the scene window 41, the scene list display unit 42, the camerawork window 43, the camerawork list display unit 44, the parameter display unit 45, and the transmission window 46.
[0402] Furthermore, on the camerawork designation screen Gs, a camera designation operation unit 47, a still image import button B31, and a moving image import button B32 are provided for the scene window 41, and a reproduction button B33, a pause button B34, and a stop button B35 are provided at the lower part of the screen.
[0403] Furthermore, on the camerawork designation screen Gs, an X-axis viewpoint button B36, a Y-axis viewpoint button B37, a Z-axis viewpoint button B38, a Ca viewpoint button B39, a Pe viewpoint button B40, a display path restriction button B41, and a restriction release button B42 are provided for the camerawork window 43, and a filtering operation unit 48 is provided for the camerawork list display unit 44. The filtering operation unit 48 is provided with a pull-down button B43 and a reset button B44.
[0404] In generating the free viewpoint image on the camerawork designation screen Gs, first, the user performs an operation for importing an image of a generation target section of the free viewpoint image, in other words, an image of a scene as a generation target of the free viewpoint image, as the image data V1 to V16 described above. In performing this import, the user operates either the still image import button B31 or the moving image import button B32 in the drawing. The still image import button B31 is a button for giving an instruction to import the image data V1 to V16 by a still image for generating the still image FV clip described above as a free viewpoint image, and the moving image import button B32 is a button for giving an instruction to import the image data V1 to V16 by a moving image for generating the moving image FV clip described above as a free viewpoint image.
[0405] In response to the operation of any one of the still image import button B31 and the moving image import button B32, a pop-up window W1 as illustrated in
[0406] When the OK button is operated, information on the imported scene is added in the scene list display unit 42 on the camerawork designation screen Gs as illustrated in
[0407] The camera designation operation unit 47 is provided with a camera selection button for selecting which camera displays an image for each camera for the imported scene, that is, image data V1 to V16.
[0408] On the camerawork designation screen Gs, the camerawork can be previewed in the camerawork window 43.
[0409] In the camerawork window 43, the observation viewpoint of the camerawork in the three-dimensional space can be switched by the X-axis viewpoint button B36, the Y-axis viewpoint button B37, the Z-axis viewpoint button B38, the Ca viewpoint button B39, and the Pe viewpoint button B40.
[0410] The X-axis viewpoint button B36, the Y-axis viewpoint button B37, and the Z-axis viewpoint button B38 are buttons for switching the observation viewpoint in the three-dimensional space to the viewpoint on the X-axis, the viewpoint on the Y-axis, and the viewpoint on the Z-axis, respectively. Here, the X axis, the Y axis, and the Z axis are three axes that define a three-dimensional space. In this example, the X axis is an axis that defines a horizontal direction, the Y axis is an axis that defines a vertical direction, and the Z axis is an axis that defines a direction orthogonal to both the X axis and the Y axis.
[0411] The Pe viewpoint button B40 is a button for switching the observation viewpoint of the three-dimensional space to an arbitrary viewpoint designated by the user.
[0412] The Ca viewpoint button B39 is a button for switching the observation viewpoint in the three-dimensional space to the viewpoint (point on the viewpoint movement trajectory) in the camerawork.
[0413]
[0414] Here, the camerawork window 43 displays information indicating the camerawork displayed on the camerawork list display unit 44. The camerawork displayed on the camerawork list display unit 44 is the camerawork created through the creation operation screen Gg described above, and is a candidate for the camerawork used for generating the free viewpoint image. In other words, it is a camerawork that can be designated as a camerawork used for generating a free viewpoint image.
[0415] In the camerawork window 43, as the information indicating the camerawork, the movement trajectory information Mm, the camera position mark Mc, and the field-of-view information Fv described above (also in this case, the field of view is a piece of information indicated by a figure) are displayed. Furthermore, as particularly illustrated in
[0416] Furthermore, in the camerawork window 43, the target mark Mt indicating the position of the target Tg described above is displayed as the information indicating the camerawork.
[0417] Note that, in the case of
[0418] In the drawing, two target marks Mt are displayed as the target marks Mt (particularly, refer to
[0419] In the camerawork window 43, dynamic preview reproduction of the camerawork can be performed. Such dynamic preview reproduction can be instructed by the operation of the reproduction button B33.
[0420] When the reproduction button B33 is operated, in each case of the viewpoint on the X-axis, the viewpoint on the Y-axis, the viewpoint on the Z-axis, and any viewpoint illustrated in
[0421] Note that the pause button B34 and the stop button B35 are buttons for instructing pause and stop of the dynamic preview reproduction as described above, respectively.
[0422] Here, in a case of preview reproduction of an observation image from a viewpoint as in the case of
[0423] However, since the generation of the free viewpoint image as the FV clip requires a considerable processing load and processing time, the time the user must wait until the start of the preview reproduction becomes long, and there is a possibility that rapid generation of the free viewpoint image is hindered.
[0424] Therefore, in this example, in the case of preview reproduction of the observation image from the viewpoint as in the case of
[0425] As a result, the processing time required to display the preview of the observation image from the viewpoint can be shortened, and the work of creating the free viewpoint image can be quickly executed.
[0426] Here, as described above, in the camerawork window 43, it is possible to display the selected camerawork or a plurality of selectable cameraworks.
[0427] Although the case where the camerawork information for only the selected camerawork is displayed is illustrated in
[0428] In this manner, switching of the number of cameraworks to be displayed in the camerawork window 43 can be instructed by the display path restriction button B41 and the restriction release button B42. The display path restriction button B41 is a button for instructing display of only the selected camerawork, and the restriction release button B42 is a button for releasing the state of being limited to the display of only the selected camerawork, and functions as an instruction button for displaying the camerawork information of all the plurality of cameraworks displayed on the camerawork list display unit 44.
[0429] Next, the camerawork list display unit 44 will be described.
[0430] The camerawork as a candidate that can be used to generate the free viewpoint image is displayed on the camerawork list display unit 44 (see, for example,
[0431] Furthermore, in this example, the thumbnail image of the movement trajectory information Mm is displayed for each camerawork in the camerawork list display unit 44. By displaying such thumbnail images, it is possible to cause the user to confirm what viewpoint movement trajectory each camerawork has even on the camerawork list.
[0432] Here, the tag information is information that can be added to each created camerawork when the camerawork is created through the creation operation screen Gg described above, and is text information in this example. The tag information for the camerawork can be set by, for example, inputting information to a field of “Tag” (See, for example,
[0433] Hereinafter, this tag information is referred to as “tag information I1”.
[0434] The camerawork list display unit 44 is provided with the filtering operation unit 48 for filtering the camerawork to be displayed in the list display, that is, the camerawork to be displayed on the camerawork list display unit 44.
[0435] A function related to filtering of the camerawork using the filtering operation unit 48 will be described with reference to
[0436] First, when filtering is performed, the user operates a pull-down button B43 on the filtering operation unit 48 as illustrated in
[0437] The user can give an instruction to display only the camerawork in which the tag information I1 is set on the camerawork list display unit 44 by performing an operation (for example, a click operation or the like) to designate the tag information I1 displayed in the pull-down list 48a.
[0438] Note that, as can be understood from this point, the display portion of each piece of tag information I1 in the pull-down list 48a corresponds to filtering condition information indicating a filtering condition for filtering and displaying the camerawork information.
[0439]
[0440] In this case, since there is only one camerawork to which the “CW, Right” is set, information of the camerawork to which the “CW, Right” is set is displayed in the camerawork window 43. Note that the camerawork in which “CW, Right” is set is the camerawork in which the target Tg whose position is indicated by the right target mark Mt among the targets Tg whose positions are indicated by the two target marks Mt illustrated in the camerawork window 43 of
[0441] Note that, in a case where there is a plurality of cameraworks to which the designated tag information I1 is set, for example, it is conceivable to display information of the camerawork displayed at a predetermined position such as a head position on the list in the camerawork window 43.
[0442] By filtering the camerawork based on the tag information I1 as described above, filtering based on an arbitrary standard can be realized depending on information content set as the tag information I1. For example, if team information (for example, team A, team B, or the like) is set as the tag information I1, filtering based on a criterion such as whether the camerawork is for a shooting scene of team A or a shooting scene of team B can be realized. Alternatively, by setting information (for example, clockwise rotation, counterclockwise rotation, and the like) indicating the movement direction of the viewpoint as the tag information I1, filtering based on the movement direction of the viewpoint can be realized.
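The filtering based on the tag information I1 can be sketched as a simple list filter (illustrative only; the dictionary layout of the camerawork entries is a hypothetical representation):

```python
# Hypothetical sketch: filter the camerawork list by a designated tag,
# as done when a tag is selected in the pull-down list 48a.
def filter_cameraworks(cameraworks, tag):
    """Return only cameraworks whose tags contain the given tag.

    An empty tag (the reset case) returns the full list.
    """
    if not tag:
        return list(cameraworks)
    return [cw for cw in cameraworks if tag in cw.get("tags", ())]
```

Operating the reset button B44 would correspond to calling this with an empty tag, restoring the unfiltered candidate list.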
[0443] Furthermore, by setting the camera closest to the visual field of interest, such as the visual field closest to the subject to be the target Tg, as the tag information I1, it is possible to realize filtering of the camerawork with the visual field of interest as a reference.
[0444] In the filtering operation unit 48, the reset button B44 is a button for instructing resetting of filtering. In a case where the reset button B44 is operated, as illustrated as the screen transition from
[0445] Here, although the filtering based on the tag information I1 has been illustrated above with respect to the filtering of the camerawork, the filtering of the camerawork can be performed on the basis of the information of the In-camera or the Out-camera included in the information of the camerawork.
[0446] Although the example in which the information indicating the filtering condition such as the tag information I1 is displayed in the pull-down list 48a has been described above, the information indicating the filtering condition (filtering condition information) can also be displayed as a button as illustrated in
[0447] At this time, the information displayed as the button can be determined on the basis of history information of the camerawork used for generation of the free viewpoint image in the past. For example, the tag information I1 of predetermined high-rank cameraworks that have been used frequently in the past can be displayed as a button.
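Determining the button contents from the usage history could be sketched as a frequency count over past tag usage (illustrative only; the history record format is an assumption):

```python
from collections import Counter

# Hypothetical sketch: pick the tags of the most frequently used
# cameraworks from history, for display as filtering-condition buttons.
def top_tag_buttons(usage_history, n=3):
    """Return the n most frequent tags, most frequent first."""
    counts = Counter(tag for entry in usage_history for tag in entry["tags"])
    return [tag for tag, _ in counts.most_common(n)]
```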
[0448] In
[0449] The button displaying the filtering condition information may be customized by the user.
[0450]
[0451] Furthermore, the information indicating the filtering condition can be received as input keyword information by the user.
[0452] In this case, for example, the keyword input unit 48b illustrated in
[0453] Note that, in the above description, the designation of the camerawork used for generating the free viewpoint image is performed as the designation of the camerawork displayed on the camerawork list display unit 44. However, the designation of the camerawork may be performed as the designation of the camerawork displayed on the camerawork window 43.
[0454] Here, in the present embodiment, information obtained by visualizing the moving speed of the viewpoint is displayed for the camerawork information displayed in the camerawork window 43.
[0455]
[0456] In this example, information indicating a period during which the moving speed of the viewpoint decreases is displayed as the information visualizing the moving speed of the viewpoint.
[0457]
[0458] Although not illustrated, the moving speed of the viewpoint can also be expressed by the density of points in a case where the moving trajectory is indicated by a dotted line. For example, it is conceivable to perform display in which the density of points increases as the moving speed increases.
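The dot-density display convention just described can be sketched as follows (illustrative only; the base spacing and function names are hypothetical), with a higher viewpoint speed producing a denser dotted line:

```python
# Hypothetical sketch: express viewpoint moving speed by the density
# of dots on a dotted movement trajectory (denser = faster).
def dot_spacing(speed, base_spacing=1.0):
    """Spacing between dots; shrinks as speed grows."""
    return base_spacing / max(speed, 1e-9)

def dots_for_segment(length, speed):
    """Number of dots drawn on a trajectory segment of the given length."""
    return max(1, int(length / dot_spacing(speed)))
```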
[0459] Note that, although the observation image from the viewpoint on the Y-axis is illustrated in
[0460] Furthermore, in the present embodiment, with respect to the information of the camerawork displayed in the camerawork window 43, processing of updating the information of the position of the target Tg in the camerawork information according to the operation of changing the position of the target mark Mt is performed.
[0461] This processing is processing of the camerawork editing processing unit 32b illustrated in
[0462] With reference to
[0463]
[0464] It is assumed that an operation of changing the position of the target mark Mt is performed in the camerawork window 43 as illustrated in the drawing. The operation of changing the position of the target mark Mt may be, for example, a drag & drop operation of the target mark Mt.
[0465] Here, the position of the target Tg after the change by such a change operation is referred to as “position Pta”, and the position of the target Tg before the change is referred to as “position Ptb”.
[0466] In response to the operation of changing the position of the target mark Mt as described above, the camerawork editing processing unit 32b performs processing of updating the information on the position of the target Tg from the position Ptb to the position Pta with respect to the camerawork information displayed in the camerawork window 43.
[0467] By updating the position information of the target Tg in the camerawork information in this manner, in the free viewpoint image generation processing using the camerawork information, the free viewpoint image is generated such that the line-of-sight direction Dg from each position on the viewpoint movement trajectory faces the position of the updated target Tg.
[0468] Since the camerawork information can be edited according to the operation on the camerawork designation screen Gs as described above, it is not necessary to start software for generating the camerawork information when it is desired to edit the camerawork information at the stage of designating the camerawork to be used for generating the free viewpoint image.
[0469] Therefore, even when it is necessary to edit the camerawork information, it is possible to quickly execute the work of creating the free viewpoint image.
[0470] In addition, in the present embodiment, on the camerawork designation screen Gs, the above-described processing of displaying information notifying the camera that requires calibration is performed.
[0471] Specifically, the display processing unit 32a determines whether there is a camera whose variation has been detected among the cameras whose camera position marks Mc are displayed in the camerawork window 43, on the basis of the result of the variation detection (for example, the automatic variation detection in step S33) of each camera by the utility server 8 described above with reference to
[0472] Note that the camera in which the variation is detected can be rephrased as a camera in which the change in the field of view is detected.
[0473]
[0474] As for the display of the notification information, as illustrated in the drawing, the camera position mark Mc of the corresponding camera is displayed in a display mode different from that of the other camera position marks Mc (even in this case, it is conceivable to vary the color, size, shape, and the like). Furthermore, it is also conceivable to display information for calling attention in the vicinity of the corresponding camera position mark Mc, like the exclamation mark illustrated in the figure.
[0475] In generating a free viewpoint image, each camera must maintain its assumed position and orientation in order to accurately generate three-dimensional information from the images captured by the plurality of cameras. In a case where a change in position or orientation occurs in any camera, the parameters used for generating the three-dimensional information must be calibrated. By notifying the user of the camera in which the change in the field of view is detected as described above, it is possible to inform the user of the camera that requires calibration.
[0476] Therefore, it is possible to generate a free viewpoint image based on accurate three-dimensional information, and to improve the image quality of the free viewpoint image.
[0477] Processing related to the filtering of the camerawork described above will be described with reference to flowcharts of
[0478]
[0479] First, in step S201, the CPU 71 performs processing of acquiring tag information I1 in each piece of camerawork information as a candidate. That is, each piece of camerawork information as a candidate that can be used for generation of the free viewpoint image is acquired. Here, the camerawork information as a candidate is stored in a readable storage device inside or outside the free viewpoint image server 2. In the processing of step S201, the camerawork information as the candidate stored in this manner is acquired.
[0480] In step S202 subsequent to step S201, the CPU 71 performs a process of displaying the tag information I1. That is, in the case of displaying in the pull-down list 48a as illustrated in
[0481] In step S203 subsequent to step S202, the CPU 71 waits for the designation operation of the tag information I1, and in a case where the designation operation of the tag information I1 has been performed, proceeds to step S204 and performs processing of filtering and displaying the camerawork to which the designated tag information I1 is attached. That is, processing of displaying the camerawork information including the designated tag information I1 (in which the designated tag information I1 is set) among the camerawork information as candidates on the camerawork list display unit 44 is performed. As illustrated in
[0482] The CPU 71 ends the series of processes illustrated in
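As an illustrative sketch (not part of the embodiment), the tag-based filtering of steps S201 to S204 can be expressed as follows. The record layout (dictionaries with `name` and `tags` fields) and the function name are assumptions for illustration; the actual storage format of the camerawork information is not specified here.

```python
# Illustrative sketch of the tag filtering in steps S201-S204.
# The camerawork-record layout (dicts with "name" and "tags") is an
# assumption; the real storage format is not specified in this text.

def filter_by_tag(candidates, designated_tag):
    """Return only the camerawork records to which the designated tag is attached."""
    return [cw for cw in candidates if designated_tag in cw["tags"]]

# Candidate camerawork information acquired in step S201 (hypothetical data).
candidates = [
    {"name": "orbit_goal", "tags": ["goal", "orbit"]},
    {"name": "dolly_bench", "tags": ["bench"]},
    {"name": "rise_goal", "tags": ["goal", "rise"]},
]

# Step S204: designating the tag "goal" narrows the list to the two
# records in which that tag is set.
filtered = filter_by_tag(candidates, "goal")
```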
[0483]
[0484] In
[0485] Then, in subsequent step S212, the CPU 71 executes processing of displaying the selected camerawork. That is, processing of displaying the selected camerawork information on the camerawork list display unit 44 is performed.
[0486] In response to the execution of the process of step S212, the CPU 71 ends the series of processes illustrated in
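The keyword-based filtering described for the camerawork designation screen can likewise be sketched as follows. Case-insensitive substring matching against the name and tags of each candidate is an assumption for illustration; the embodiment does not fix the matching rule.

```python
# Hypothetical sketch of keyword-based filtering of camerawork
# information. Case-insensitive substring matching against the name
# and tags is an assumption, not the embodiment's specified behavior.

def filter_by_keyword(candidates, keyword):
    """Return records whose name or tags contain the keyword."""
    kw = keyword.lower()
    return [
        cw for cw in candidates
        if kw in cw["name"].lower() or any(kw in t.lower() for t in cw["tags"])
    ]

candidates = [
    {"name": "Orbit around goal", "tags": ["goal"]},
    {"name": "Bench dolly", "tags": ["bench", "slow"]},
]

# A keyword entered by the user filters the displayed candidates.
hits = filter_by_keyword(candidates, "goal")
```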
[0487]
[0488] In step S301, the CPU 71 waits for the camera variation notification. That is, the CPU 71 waits for the variation notification transmitted when the utility server 8 detects variation of the camera by the automatic variation detection (step S33 in
[0489] When there is a camera variation notification, the CPU 71 determines in step S302 whether or not the notified camera is a camera being displayed. That is, it is determined whether or not the camera indicated by the variation notification is a camera whose camera position mark Mc is displayed in the camerawork window 43. When the camera is not being displayed, the CPU 71 ends the series of processes illustrated in
[0490] On the other hand, if the camera is currently displayed, the CPU 71 proceeds to step S303 and executes the variation notification process. That is, for the corresponding camera position mark Mc being displayed in the camerawork window 43, for example, processing of displaying information notifying the variation in the display mode as illustrated in
[0491] In response to the execution of the process of step S303, the CPU 71 ends the series of processes illustrated in
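The variation-notification handling of steps S301 to S303 can be sketched as follows. The notification payload, the set of displayed cameras, and the mark table are illustrative assumptions; only the branch structure mirrors the flow described above.

```python
# Sketch of the variation-notification handling in steps S301-S303.
# The notification payload and the data structures are assumptions
# for illustration; only the branching mirrors the described flow.

def on_camera_variation(notification, displayed_cameras, marks):
    """If the notified camera's position mark Mc is being displayed,
    change its display mode to notify the user of the variation."""
    cam_id = notification["camera_id"]
    if cam_id not in displayed_cameras:
        return False                      # step S302: not displayed, nothing to do
    marks[cam_id]["style"] = "variation_alert"   # step S303: change display mode
    return True

marks = {3: {"style": "normal"}, 5: {"style": "normal"}}
handled = on_camera_variation({"camera_id": 5}, {3, 5}, marks)
```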
9. Modification
[0492] Note that the embodiment is not limited to the specific example described above, and configurations as various modifications can be adopted.
[0493] For example, in the above description, an example has been described in which a device that performs the processing of displaying the creation operation screen Gg and the acceptance of the operation input for generating the camerawork and a device that performs the processing of displaying the camerawork designation screen Gs and the acceptance of the operation input for generating the free viewpoint image are common devices as the free viewpoint image server 2. However, it is also possible to adopt a mode in which these devices are separate devices.
[0494] Furthermore, in the above description, with respect to the filtering and displaying of the camerawork on the camerawork designation screen Gs, an example has been described in which the filtering is performed according to the operated portion such as the button indicating the filtering condition. However, for example, the filtering and displaying of the camerawork can also be performed according to the designation of the target Tg, such as designation of the target mark Mt displayed in the camerawork window 43. Specifically, only the camerawork in which the designated target Tg is set among the cameraworks as the candidates is filtered and displayed.
[0495] In addition, on the creation operation screen Gg and the camerawork designation screen Gs, in a case where there is a range (for example, a range in which the resolution is equal to or less than a predetermined value) in which image quality cannot be secured due to, for example, the real camera being too far from the subject on the movement trajectory of the viewpoint, information for notifying the range can also be displayed.
10. Summary of Embodiments
[0496] As described above, a first information processing device according to the embodiment includes the display processing unit (34a) that performs processing of displaying a screen, as the camerawork information creation operation screen (Gg) that is information indicating at least the movement trajectory of the viewpoint in a free viewpoint image, including a designation operation reception region (camerawork list display unit 44, operation panel unit 54, and the like) that receives operation input for designating at least partial information of the camerawork information and a camerawork display region (camerawork window 53) that visualizes and displays the movement trajectory of the viewpoint based on the camerawork information reflecting the designation content by the operation input.
[0497] As a result, the user can perform the creation operation of the camerawork while visually recognizing the visualized movement trajectory of the viewpoint on the camerawork creation operation screen.
[0498] Therefore, the efficiency of the camerawork creation work can be improved.
[0499] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operations of the start point and the end point of the movement trajectory (see
[0500] As a result, it is possible to set an arbitrary start point and an arbitrary end point of the movement trajectory of the viewpoint rather than a fixed point.
[0501] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0502] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of a via-point of a viewpoint (see
[0503] Thus, it is possible to set, as the movement trajectory of the viewpoint, a trajectory passing through the designated point, instead of a linear trajectory connecting two points of the start point and the end point.
[0504] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0505] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of the timing at which the viewpoint passes through the via-point (see
[0506] Thus, it is possible to set not only the via-point of the viewpoint but also the timing at which the viewpoint passes through the via-point.
[0507] Therefore, it is possible to improve the degree of freedom in setting the position through which the viewpoint passes and the degree of freedom in setting the timing at which the viewpoint passes through the via-point, and improve the degree of freedom in creating the free viewpoint image.
[0508] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of the type of the shape of the movement trajectory (see
[0509] As a result, the shape type of the movement trajectory of the viewpoint can be made variable instead of fixed.
[0510] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0511] For example, if the shape type of the movement trajectory is a curved shape, it is possible to prevent the distance from the target subject to the viewpoint from greatly changing even if the viewpoint moves. In other words, the size of the target subject in the free viewpoint image can be prevented from greatly changing.
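The distance-preserving property of a curved trajectory can be checked with a small worked example. A circular arc around the target (placed at the origin, with start and end viewpoints 10 m away) is one possible curved shape; the specific geometry here is an assumption for illustration.

```python
# Worked sketch: a circular-arc trajectory around the target keeps the
# subject-to-viewpoint distance stable, while the straight line between
# the same two endpoints brings the viewpoint noticeably closer.
# The target at the origin and the 10 m radius are assumptions.
import math

def arc_point(radius, angle):
    """Point on a circle of the given radius around the origin."""
    return (radius * math.cos(angle), radius * math.sin(angle))

def dist(p):
    """Distance from the target at the origin."""
    return math.hypot(p[0], p[1])

start, end = arc_point(10, 0.0), arc_point(10, math.pi / 2)

# Midpoint of the straight trajectory: the distance shrinks to ~7.07 m.
mid_line = ((start[0] + end[0]) / 2, (start[1] + end[1]) / 2)
# Midpoint of the arc trajectory: the distance stays at 10 m.
mid_arc = arc_point(10, math.pi / 4)
```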
[0512] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of the moving speed of the viewpoint (see
[0513] As a result, the moving speed of the viewpoint can be made variable instead of fixed.
[0514] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0515] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of a section for changing the moving speed in the movement trajectory (see
[0516] Thus, it is possible to dynamically change the moving speed of the viewpoint in the movement trajectory.
[0517] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0518] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive operation input for a timeline indicating a period from the movement start time point to the movement end time point of the viewpoint (see the timeline operation unit 54a).
[0519] By accepting operation input on the timeline, for example, a via-point and the timing at which the viewpoint passes through it can be designated at the same time by dragging and dropping a camera icon onto the timeline, and a section to which a predetermined effect is applied, such as a section in which curve interpolation of the movement trajectory is performed, can be designated by a drag operation that designates a range on the timeline. Thus, it is possible to facilitate designation operation of various types of information related to the camerawork.
[0520] Therefore, the efficiency of the camerawork creation work can be improved.
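The kind of data such a timeline operation could produce can be sketched as below; the class and field names are hypothetical, introduced only to show how a via-point and its passage timing are set in one operation.

```python
# Minimal sketch of camerawork data that a timeline operation could
# produce. All class and field names are assumptions for illustration.
from dataclasses import dataclass, field

@dataclass
class ViaPoint:
    position: tuple      # (x, y, z) position in the three-dimensional space
    time_s: float        # passage timing on the timeline, in seconds

@dataclass
class Camerawork:
    start: tuple
    end: tuple
    duration_s: float    # movement start to movement end of the viewpoint
    via_points: list = field(default_factory=list)

    def add_via_point(self, position, time_s):
        """Dropping a camera icon onto the timeline designates both the
        via-point and its passage timing in a single operation."""
        if not 0.0 <= time_s <= self.duration_s:
            raise ValueError("timing outside the viewpoint movement period")
        self.via_points.append(ViaPoint(position, time_s))
        self.via_points.sort(key=lambda v: v.time_s)

cw = Camerawork(start=(0, 0, 0), end=(10, 0, 0), duration_s=5.0)
cw.add_via_point((5, 3, 0), time_s=2.0)
```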
[0521] Furthermore, in the first information processing device according to the embodiment, the display processing unit performs processing of displaying information (field-of-view information Fv) obtained by visualizing the field of view from the viewpoint in the camerawork display region (see
[0522] Since the field of view is visually indicated, it is possible to facilitate grasping of the camerawork by the user.
[0523] Therefore, it is possible to allow the user to easily grasp how the camerawork changes by the operation input, and improve the efficiency of the camerawork creation work.
[0524] Furthermore, in the first information processing device according to the embodiment, the display processing unit performs processing of displaying information in which the field of view from the viewpoint is represented by a figure in the camerawork display region.
[0525] Since the field of view is graphically illustrated, the user can easily grasp the camerawork.
[0526] Therefore, it is possible to allow the user to easily grasp how the camerawork changes by the operation input, and improve the efficiency of the camerawork creation work.
[0527] Furthermore, in the first information processing device according to the embodiment, the display processing unit performs processing of displaying an image obtained by observing a three-dimensional space from a viewpoint on the creation operation screen (see the preview window 55).
[0528] As a result, an image similar to the free viewpoint image generated on the basis of the camerawork information can be displayed as a preview to the user, and grasping of the camerawork can be facilitated.
[0529] Therefore, the efficiency of the camerawork creation work can be improved.
[0530] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of a position of a target that defines a line-of-sight direction from a viewpoint (see
[0531] As a result, an image following the target can be generated as the free viewpoint image. The image following the target means an image in which the target continues to be positioned at a predetermined position (for example, a center position) in the image frame.
[0532] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
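How a target position defines the line-of-sight direction can be sketched with the standard "look-at" computation: at each moment, the unit vector from the viewpoint toward the target gives the direction that keeps the target at the image center. The function below is illustrative, not the embodiment's implementation.

```python
# Sketch of how a target position defines the line-of-sight direction:
# recomputing the viewpoint-to-target unit vector each frame keeps the
# target at a fixed position (e.g. the center) in the image frame.
import math

def line_of_sight(viewpoint, target):
    """Unit vector pointing from the viewpoint toward the target."""
    d = [t - v for v, t in zip(viewpoint, target)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        raise ValueError("viewpoint coincides with the target")
    return tuple(c / norm for c in d)

# A viewpoint 10 m above the target looks straight down the z axis.
direction = line_of_sight((0.0, 0.0, 10.0), (0.0, 0.0, 0.0))
```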
[0533] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive designation operation of a period of facing the target (see
[0534] The period of facing the target means a period in which the target is continuously positioned at a predetermined position in the image frame of the free viewpoint image. Since the designation operation of the period of facing the target is enabled as described above, it is possible to generate, as the free viewpoint image, an image that follows the target position in a certain period of the viewpoint movement period and does not follow it in other periods, and to improve the degree of freedom in setting the period of following the target position.
[0535] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0536] Furthermore, in the first information processing device according to the embodiment, the designation operation reception region can receive a designation operation of a plurality of target positions as a designation operation of target positions (see
[0537] As a result, it is possible to generate a free viewpoint image that follows the target A in a certain period and follows the target B in another period of the viewpoint movement period, and to improve the degree of freedom in setting the target to be followed.
[0538] Therefore, the degree of freedom in creating a free viewpoint image can be improved.
[0539] Furthermore, a first information processing method according to an embodiment is an information processing method in which an information processing device performs processing of displaying a screen, as the camerawork information creation operation screen that is information indicating at least the movement trajectory of the viewpoint in a free viewpoint image, including a designation operation reception region that receives operation input for designating at least partial information of the camerawork information and a camerawork display region that visualizes and displays the movement trajectory of the viewpoint based on the camerawork information reflecting the designation content by the operation input.
[0540] According to such a first information processing method, it is possible to obtain the same operations and effects as those of the first information processing device described above.
[0541] In addition, a second information processing device according to the embodiment includes a display processing unit (32a) that performs processing of displaying a screen indicating, by filtering, camerawork information according to user input information among a plurality of pieces of camerawork information as a camerawork designation screen (Gs) that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
[0542] By filtering and displaying the camerawork information according to the input information of the user, it is possible to easily find the camerawork information desired by the user, and shorten the time required for designating the camerawork information.
[0543] Therefore, the work of creating the free viewpoint image can be executed quickly.
[0544] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of filtering and displaying camerawork information according to a keyword as the input information on the camerawork designation screen (see
[0545] Thus, it is possible to perform appropriate filtering of the camerawork information reflecting the intention of the user.
[0546] Therefore, it is possible to make it easier for the user to find desired camerawork information, and further shorten the time required for designating the camerawork information.
[0547] Furthermore, in the second information processing device according to the embodiment, the operated portion indicating the filtering condition of the camerawork information is arranged on the camerawork designation screen, and the display processing unit performs processing of filtering and displaying the camerawork information according to the filtering condition indicated by the operated portion according to the operation of the operated portion (see
[0548] As a result, the operation required for the filtering and displaying of the camerawork information can be reduced to only the operation of selecting the filtering condition information.
[0549] Therefore, it is possible to reduce the user’s operation burden required for filtering and displaying the camerawork information.
[0550] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying information obtained by visualizing the movement trajectory of the viewpoint on the camerawork designation screen (see
[0551] By displaying the information visualizing the movement trajectory of the viewpoint, it becomes easy for the user to picture the camerawork.
[0552] Therefore, in designating the camerawork information to be used for creating the free viewpoint image, it is possible to make it easier for the user to find desired camerawork information, and shorten the time required for designating the camerawork information.
[0553] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying camera arrangement position information indicating arrangement positions of a plurality of cameras that performs imaging for generating a free viewpoint image on the camerawork designation screen (see
[0554] By displaying the information indicating the arrangement position of each camera, the user can easily envision what kind of image will be generated as the free viewpoint image.
[0555] Therefore, the work of creating the free viewpoint image can be executed quickly.
[0556] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying, on the camerawork designation screen, start point arrangement position information and end point arrangement position information indicating the respective positions of the camera serving as the movement start point and the camera serving as the movement end point of the viewpoint among the plurality of cameras (see
[0557] As a result, it is possible to allow the user to grasp from which camera position the movement of the viewpoint starts and ends at which camera position in the camerawork.
[0558] Therefore, when designating the camerawork information used for creating the free viewpoint image, it is possible to more easily find the camerawork information desired by the user. In particular, in the case of generating an image in which the front clip and the rear clip are connected to the free viewpoint image as described above, it is desirable that the camera serving as the movement start point of the viewpoint matches the imaging camera of the front clip and the camera serving as the movement end point matches the imaging camera of the rear clip, so that the connection between the clips is natural. By displaying the respective positions of the camera serving as the movement start point and the camera serving as the movement end point as described above, it is possible to easily designate appropriate camerawork corresponding to the imaging camera of the front clip and the imaging camera of the rear clip.
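The matching of a camerawork's start and end cameras to the imaging cameras of the front and rear clips can be sketched as a simple filter; the record fields and camera identifiers are hypothetical.

```python
# Hedged sketch of selecting camerawork whose viewpoint start camera
# matches the front clip's imaging camera and whose end camera matches
# the rear clip's, so that the clip connections are natural.
# The record fields and camera IDs are assumptions for illustration.

def match_clip_cameras(candidates, front_cam, rear_cam):
    """Keep only camerawork bridging the given front and rear clip cameras."""
    return [
        cw for cw in candidates
        if cw["start_cam"] == front_cam and cw["end_cam"] == rear_cam
    ]

candidates = [
    {"name": "cw_A", "start_cam": 1, "end_cam": 4},
    {"name": "cw_B", "start_cam": 1, "end_cam": 7},
    {"name": "cw_C", "start_cam": 2, "end_cam": 4},
]

# Front clip imaged by camera 1, rear clip by camera 4: only cw_A fits.
natural = match_clip_cameras(candidates, front_cam=1, rear_cam=4)
```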
[0559] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying the start point arrangement position information and the end point arrangement position information, and arrangement position information of cameras other than the camera serving as the movement start point and the camera serving as the movement end point among the plurality of cameras, in different modes.
[0560] As a result, it is possible to allow the user to intuitively grasp from which camera position the movement of the viewpoint starts and ends at which camera position in the camerawork.
[0561] Therefore, when designating the camerawork information used for creating the free viewpoint image, it is possible to more easily find the camerawork information desired by the user.
[0562] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying information obtained by visualizing the moving speed of the viewpoint on the camerawork designation screen (see
[0563] The manner in which the moving speed of the viewpoint changes during the viewpoint movement period is an important factor in the rendering of the free viewpoint image.
[0564] Therefore, by displaying the visualization information of the moving speed of the viewpoint as described above, the camerawork information desired by the user can be more easily found, and the time required for designating the camerawork information can be shortened.
[0565] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying information indicating a period during which the moving speed decreases as information obtained by visualizing the moving speed of the viewpoint.
[0566] The period in which the moving speed of the viewpoint decreases during the viewpoint movement period is an important factor in the rendering of the free viewpoint image.
[0567] Therefore, by displaying the information indicating the period in which the moving speed of the viewpoint decreases as described above, it is possible to more easily find the camerawork information desired by the user, and to shorten the time required for designating the camerawork information.
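One way such deceleration periods could be derived for visualization is sketched below; uniform sampling of the viewpoint speed and the simple "speed drops between samples" rule are assumptions for illustration.

```python
# Sketch of detecting periods of decreasing moving speed from sampled
# viewpoint speeds, so those periods can be visualized on the screen.
# Uniform sampling (interval dt) and the per-sample comparison rule
# are assumptions, not the embodiment's specified method.

def decreasing_periods(speeds, dt=1.0):
    """Return (start_time, end_time) spans in which the speed decreases."""
    periods, start = [], None
    for i in range(1, len(speeds)):
        if speeds[i] < speeds[i - 1]:
            if start is None:
                start = (i - 1) * dt      # deceleration begins
        elif start is not None:
            periods.append((start, (i - 1) * dt))
            start = None                  # deceleration ended
    if start is not None:
        periods.append((start, (len(speeds) - 1) * dt))
    return periods

# Speed ramps up, slows around a highlight, then rises and slows again.
spans = decreasing_periods([1.0, 2.0, 1.5, 1.0, 1.2, 0.8])
```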
[0568] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying information obtained by visualizing the field of view from the viewpoint on the camerawork designation screen (see
[0569] Since the field of view is visually indicated, it is possible to facilitate grasping of the camerawork by the user.
[0570] Therefore, it is possible to make it easier for the user to find desired camerawork information, and shorten the time required for designating the camerawork information.
[0571] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying a target that defines a line-of-sight direction from a viewpoint on the camerawork designation screen (see
[0572] As a result, it is possible to allow the user to easily grasp which subject position in the three-dimensional space the camerawork targets.
[0573] Therefore, it is possible to make it easier for the user to find desired camerawork information, and shorten the time required for designating the camerawork information.
[0574] Furthermore, the second information processing device according to the embodiment includes a camerawork editing processing unit (32b) that updates information on the position of the target in the camerawork information according to a change in the position of the target on the camerawork designation screen (see
[0575] As a result, when it is desired to edit the camerawork information at the stage of designating the camerawork information to be used for generation of the free viewpoint image, it is not necessary to start software for generating the camerawork information.
[0576] Therefore, even when it is necessary to edit the camerawork information, it is possible to quickly execute the work of creating the free viewpoint image.
[0577] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying an image obtained by observing a three-dimensional space from a viewpoint on the camerawork designation screen (see
[0578] As a result, an image similar to the free viewpoint image generated on the basis of the camerawork information can be displayed as a preview to the user, and grasping of the camerawork can be facilitated.
[0579] Therefore, it is possible to make it easier for the user to find desired camerawork information, and shorten the time required for designating the camerawork information.
[0580] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying an image obtained by rendering a virtual three-dimensional model of a real space as an image obtained by observing a three-dimensional space from a viewpoint (see
[0581] As a result, when the preview display of the observation image from the viewpoint is realized, it is not necessary to perform the rendering processing using the three-dimensional model generated from the captured image of the target real space.
[0582] Therefore, the processing time required to display the preview of the observation image from the viewpoint can be shortened, and the work of creating the free viewpoint image can be quickly executed.
[0583] Furthermore, in the second information processing device according to the embodiment, the display processing unit performs processing of displaying information notifying a camera in which a change in the field of view is detected among the plurality of cameras (see
[0584] In generating a free viewpoint image, in order to accurately generate three-dimensional information from images captured by a plurality of cameras, each camera needs to maintain its assumed position and orientation, and in a case where a change in position or orientation occurs in any camera, the parameters used for generating the three-dimensional information need to be calibrated. By issuing a notification for the camera in which a change in the field of view has been detected as described above, it is possible to inform the user of the camera that requires calibration.
[0585] Therefore, it is possible to generate a free viewpoint image based on accurate three-dimensional information, and to improve the image quality of the free viewpoint image.
[0586] Furthermore, a second information processing method according to the embodiment is an information processing method in which an information processing device performs processing of displaying a screen indicating, by filtering, camerawork information according to user input information among a plurality of pieces of camerawork information as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
[0587] According to such a second information processing method, it is possible to obtain the same operations and effects as those of the second information processing device described above.
[0588] Here, as an embodiment, it is possible to consider a program for causing a CPU, a digital signal processor (DSP), or the like, or a device including the CPU, the DSP, or the like, to execute the processing by the display processing unit 34a described in
[0589] That is, a first program of the embodiment is a program that can be read by a computer device, and is a program that causes the computer device to realize a function of performing processing of displaying a screen, as the camerawork information creation operation screen that is information indicating at least the movement trajectory of the viewpoint in a free viewpoint image, including a designation operation reception region that receives operation input for designating at least partial information of the camerawork information and a camerawork display region that visualizes and displays the movement trajectory of the viewpoint based on the camerawork information reflecting the designation content by the operation input.
[0590] With such a program, the above-described display processing unit 34a can be realized in a device as the information processing device 70.
[0591] Furthermore, as an embodiment, a program for causing a CPU, a DSP, or a device including the CPU, the DSP, or the like to execute the processing by the display processing unit 32a described in
[0592] That is, a second program of the embodiment is a program that can be read by a computer device, and is a program that causes the computer device to execute a function of performing processing of displaying a screen indicating, by filtering, camerawork information according to user input information among a plurality of pieces of camerawork information as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
[0593] With such a program, the above-described display processing unit 32a can be realized in a device as the information processing device 70.
[0594] These programs can be recorded in advance in an HDD as a recording medium built in a device such as a computer device, a ROM in a microcomputer having a CPU, or the like.
[0595] Alternatively, the program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disk, a compact disc read only memory (CD-ROM), a magneto-optical (MO) disk, a digital versatile disc (DVD), a Blu-ray disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card. Such a removable recording medium can be provided as so-called packaged software.
[0596] Furthermore, such a program can be installed from a removable recording medium to a personal computer or the like, or can be downloaded from a download site via a network such as a local area network (LAN) or the Internet.
[0597] In addition, such a program is suitable for providing the display processing unit 34a and the display processing unit 32a of the embodiment in a wide range. For example, by downloading the program to a personal computer, a portable information processing device, a mobile phone, a game device, a video device, a personal digital assistant (PDA), or the like, the personal computer or the like can be caused to function as a device that realizes processing as the display processing unit 34a or the display processing unit 32a of the present disclosure.
[0598] Note that the effects described in the present specification are merely examples and are not limited, and other effects may be provided.
11. Present Technology
[0599] Note that the present technology can also have the following configurations. [0600] (1) An information processing device comprising: [0601] a display processing unit that performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image. [0602] (2) The information processing device according to (1), in which [0603] the display processing unit [0604] performs processing of filtering and displaying camerawork information according to a keyword as the input information on the camerawork designation screen. [0605] (3) The information processing device according to (1) or (2), in which [0606] filtering condition information indicating a filtering condition of camerawork information is displayed on the camerawork designation screen, and [0607] the display processing unit [0608] performs processing of filtering and displaying the camerawork information according to the filtering condition indicated by the selected filtering condition information as the input information. [0609] (4) The information processing device according to any one of (1) to (3), in which [0610] the display processing unit [0611] performs processing of displaying information obtained by visualizing a movement trajectory of the viewpoint on the camerawork designation screen. [0612] (5) The information processing device according to any one of (1) to (4), in which [0613] the display processing unit [0614] performs processing of displaying, on the camerawork designation screen, camera arrangement position information indicating arrangement positions of a plurality of cameras that performs imaging for generating a free viewpoint image. 
[0615] (6) The information processing device according to (5), in which [0616] the display processing unit [0617] performs processing of displaying, on the camerawork designation screen, start point arrangement position information and end point arrangement position information indicating respective positions of a camera serving as a movement start point and a camera serving as a movement end point of the viewpoint among the plurality of cameras. [0618] (7) The information processing device according to (6), in which [0619] the display processing unit [0620] performs processing of displaying the start point arrangement position information and the end point arrangement position information, and arrangement position information of cameras other than the camera serving as the movement start point and the camera serving as the movement end point among the plurality of cameras in different modes. [0621] (8) The information processing device according to any one of (4) to (7), in which [0622] the display processing unit [0623] performs processing of displaying information obtained by visualizing the moving speed of the viewpoint on the camerawork designation screen. [0624] (9) The information processing device according to (8), in which [0625] the display processing unit [0626] performs processing of displaying information indicating a period in which the moving speed decreases as information obtained by visualizing the moving speed of the viewpoint. [0627] (10) The information processing device according to any one of (4) to (9), in which [0628] the display processing unit [0629] performs processing of displaying information obtained by visualizing a field of view from the viewpoint on the camerawork designation screen. 
[0630] (11) The information processing device according to any one of (4) to (10), in which
[0631] the display processing unit
[0632] performs processing of displaying a target that defines a line-of-sight direction from the viewpoint on the camerawork designation screen.
[0633] (12) The information processing device according to (11), further including
[0634] a camerawork editing processing unit that updates information on the position of the target in camerawork information according to a change in the position of the target on the camerawork designation screen.
[0635] (13) The information processing device according to any one of (1) to (12), in which
[0636] the display processing unit
[0637] performs processing of displaying an image obtained by observing a three-dimensional space from the viewpoint on the camerawork designation screen.
[0638] (14) The information processing device according to (13), in which
[0639] the display processing unit
[0640] performs processing of displaying not a three-dimensional model generated from a captured image of a real space, but an image obtained by rendering a virtual three-dimensional model of a real space as an image obtained by observing a three-dimensional space from the viewpoint.
[0641] (15) The information processing device according to (5), in which
[0642] the display processing unit
[0643] performs processing of displaying information notifying a camera in which a change in the field of view has been detected among the plurality of cameras.
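In configurations (11) and (12), the target defines the line-of-sight direction from the viewpoint, so moving the target on the designation screen implicitly re-aims the view. A minimal sketch of that relationship, assuming 3-D positions as coordinate tuples (the function name and error handling are this example's choices, not the document's):

```python
import math

def line_of_sight(viewpoint, target):
    """Unit vector from the viewpoint toward the target,
    i.e. the line-of-sight direction Dg defined by the target position."""
    d = [t - v for v, t in zip(viewpoint, target)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0.0:
        raise ValueError("target coincides with viewpoint")
    return tuple(c / norm for c in d)
```

When the camerawork editing processing unit of configuration (12) records a new target position, recomputing this vector per trajectory sample would keep the rendered view aimed at the moved target.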
[0644] (16) An information processing method in which
[0645] an information processing device
[0646] performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of specifiable pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
[0647] (17) A program readable by a computer device, the program causing the computer device to implement a function of performing processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of specifiable pieces of camerawork information, as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image.
REFERENCE SIGNS LIST
[0648]
2 Free viewpoint image server
8 Utility server
10 Imaging device
21 Section identification processing unit
22 Target image transmission control unit
23 Output image generating unit
31 Target image acquisition unit
32 Image generation processing unit
32a Display processing unit
32b Camerawork editing processing unit
33 Transmission control unit
Gs Camerawork designation screen
41 Scene window
42 Scene list display unit
43 Camerawork window
44 Camerawork list display unit
70 Information processing device
71 CPU
72 ROM
73 RAM
74 Bus
75 Input/output interface
76 Input unit
77 Display unit
78 Sound output unit
79 Storage unit
80 Communication unit
81 Removable recording medium
82 Drive
Tg Target
Mc Camera position mark
Fv Field of view information
Mt Target mark
Mm Movement trajectory information
Mv Via-point mark
Mtn Target position designation mark
Mtt Additional target mark
Mem Arrival target timing mark
Mst Target initial position mark
Rf Field of view
Dg Line-of-sight direction
48 Filtering operation unit
48a Pull-down list
48b Keyword input unit
B33 Reproduction button
B34 Pause button
B35 Stop button
B36 X-axis viewpoint button
B37 Y-axis viewpoint button
B38 Z-axis viewpoint button
B39 Ca viewpoint button
B40 Pe viewpoint button
B43 Pull-down button
B44 Reset button