IMAGE PROJECTION METHOD AND PROJECTOR
20230124225 · 2023-04-20
Abstract
An image projection method includes projecting an image including a plurality of adjustment points on a screen, determining positions of the plurality of adjustment points on the screen, determining whether each of a plurality of sides connecting the adjustment points adjacent to each other is a linear line or a curved line, obtaining a geometrically-corrected image by performing geometric correction on the image in a range corresponding to an area defined by the plurality of sides including the linear line and the curved line based on the plurality of sides defining the area, and projecting the geometrically-corrected image on the screen.
Claims
1. An image projection method comprising: projecting an image including a plurality of adjustment points on a screen; determining positions of the plurality of adjustment points on the screen; determining whether each of a plurality of sides connecting the adjustment points adjacent to each other is a linear line or a curved line; obtaining a geometrically-corrected image by performing geometric correction on the image in a range corresponding to an area defined by the plurality of sides including the linear line and the curved line based on the plurality of sides defining the area; and projecting the geometrically-corrected image on the screen.
2. The image projection method according to claim 1, further comprising detecting input for setting of types of lines of the respective plurality of sides, wherein the determining whether each of the plurality of sides connecting the adjustment points adjacent to each other is the linear line or the curved line includes determining whether each of the plurality of sides is the linear line or the curved line according to the types.
3. The image projection method according to claim 2, further comprising determining a shape of the side based on the adjustment points in a number corresponding to the type and adjacent to each other on a line having the side as a part.
4. The image projection method according to claim 1, wherein the plurality of adjustment points are projected on the screen as a plurality of grid points in a two-dimensional grid pattern.
5. The image projection method according to claim 1, further comprising: generating a non-linear coordinate system of the area by non-linear interpolation; and generating a linear coordinate system of the area by linear interpolation, wherein the performing the geometric correction includes performing geometric correction on the image in a range corresponding to the area by synthesizing the non-linear coordinate system and the linear coordinate system based on weighting according to the plurality of sides defining the area.
6. The image projection method according to claim 1, further comprising: defining virtual adjustment points linearly extrapolated with respect to all the linear lines of the plurality of sides defining the area; and defining a linear line passing through the adjustment points and the virtual adjustment points on both ends as a virtual curve in each of all the linear lines, wherein the performing the geometric correction includes performing geometric correction on the image in a range corresponding to the area by generating a non-linear coordinate system of the area by non-linear interpolation with reference to all the curved lines of the plurality of sides defining the area and all the virtual curves.
7. A projector comprising: a projection device projecting an image including a plurality of adjustment points on a screen; an input interface detecting an input by a user for adjusting positions of the plurality of adjustment points on the screen; a processing circuit determining whether each of a plurality of sides connecting the adjustment points adjacent to each other is a linear line or a curved line; and a correction circuit performing geometric correction on the image in a range corresponding to an area defined by the plurality of sides including the linear line and the curved line based on the plurality of sides defining the area.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0008]
[0009] FIG. is a flowchart for explanation of an example of an image projection method in the projection system.
[0010]
[0011]
[0012]
[0013]
[0014]
[0015]
[0016]
[0017]
DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0018] As below, an embodiment of the present disclosure will be explained with reference to the drawings. The embodiment exemplifies a system for implementation of the technical idea of the present disclosure and an apparatus and a method used for the system. The technical idea of the present disclosure does not limit types and configurations of the respective apparatuses, network topology, a series of processing, etc. to those described as below. In the drawings, the same or similar elements may respectively have the same or similar signs and the overlapping explanation may be omitted.
[0019] As shown in
[0020] The projector 10 includes an input interface (I/F) 11, a control circuit 12, and a projection device 13. The input I/F 11 includes e.g. a communication I/F 111 and an input device 112. The input I/F 11 detects various kinds of input by a user and outputs signals according to the input by the user to the control circuit 12.
[0021] The communication I/F 111 establishes a communication link via the network NW with the control apparatus 20 according to the control by the control circuit 12, and thereby, communicably couples to the control apparatus 20. The communication I/F 111 includes a communication circuit processing a signal transmitted in the communication link. The communication link may be wired, wireless, or a combination of wired and wireless connection. That is, the communication I/F 111 may be coupled to the control apparatus 20 directly or indirectly via another relay device. The communication I/F 111 may include e.g. an antenna transmitting and receiving radio signals and a receptacle into which a plug of a communication cable is inserted.
[0022] For example, the communication I/F 111 sequentially acquires image data transmitted from the control apparatus 20 and outputs the data to the control circuit 12. The communication I/F 111 may acquire image data reproduced in a reproduction apparatus 30. For the purpose, the projection system 1 may include the reproduction apparatus 30. As the reproduction apparatus 30, an arbitrary apparatus having a function of supplying image data to the projector 10 e.g. a personal computer, a tablet terminal, a smartphone, a digital media player, a camera, a movie player, a wireless display adapter, a television tuner, a video game machine, or the like can be employed.
[0023] The input device 112 detects input by the user and outputs a signal according to the input by the user to the control circuit 12. For example, as the input device 112, various input devices including various switches such as a push button and a touch sensor, a pointing device such as a mouse and a touch panel, and a keyboard can be employed. To detect voice of the user as input using a voice recognition technology, a microphone may be employed as the input device 112. Or, as the input device 112, a gesture sensor detecting gesture of the user as input may be employed. The input device 112 may be a pointing device detecting a position of a pointer on the screen SC. The input device 112 may include a wired or wireless remote controller. The input I/F 11 may detect input by the user to an input unit 22 of the control apparatus 20 via the communication I/F 111.
[0024] The projection device 13 includes a light source 131, a display panel 132, and an optical system 133. The light source 131 includes e.g. a light emitting device such as a discharge lamp or a solid-state light source. The display panel 132 is a light modulation device having a plurality of pixels. The display panel 132 modulates light emitted from the light source 131 according to an image signal output from the control circuit 12. The display panel 132 is e.g. a transmissive or reflective liquid crystal light valve. The display panel 132 may be a digital micromirror device controlling reflection of light with respect to each pixel. Obviously, the display panel 132 in the single projection device 13 may include a plurality of display panels modulating lights having different wavelengths from one another. The optical system 133 projects an image D on the screen SC by radiating the light modulated by the display panel 132 onto the screen SC. The optical system 133 may include various kinds of lenses, mirrors, and drive mechanisms.
[0025] The control circuit 12 includes an OSD processing circuit 121, a geometric correction circuit 122, a processing circuit 123, and a storage medium 124. The control circuit 12 controls the projection device 13 to project the image D based on the image data input from the communication I/F 111 on the screen SC. The control circuit 12 performs geometric correction on the image D projected on the screen SC based on the input by the user detected by the input unit 22 or the input device 112.
[0026] The OSD processing circuit 121 generates a plurality of adjustment points to be projected on the screen SC using e.g. an on-screen display (OSD) technique for geometric correction on the image D. The OSD processing circuit 121 projects the plurality of adjustment points via the projection device 13. The respective positions of the plurality of adjustment points on the screen SC are adjusted according to the input by the user to the input unit 22 or the input device 112. That is, the input I/F 11 detects the input for adjustment of the positions of the plurality of adjustment points.
[0027] The geometric correction circuit 122 executes the geometric correction of the image by controlling the display panel 132 based on the positions of the plurality of adjustment points. Specifically, the geometric correction circuit 122 executes the geometric correction based on a transformation factor calculated by the processing circuit 123. As a series of processing for the geometric correction by the geometric correction circuit 122 and the processing circuit 123, various kinds of two-dimensional coordinate transformation including affine transformation and homography transformation, and various kinds of interpolation including bilinear interpolation and bicubic interpolation may be appropriately executed.
[0028] The processing circuit 123 forms a processing device of a computer processing calculations necessary for the operation of the projector 10. For example, the processing circuit 123 executes a control program stored in the storage medium 124 and realizes various functions described in the embodiment. At least part of the OSD processing circuit 121 and the geometric correction circuit 122 may be realized by the processing circuit 123. As a processing device forming at least a part of the processing circuit 123, various arithmetic logic circuits including e.g. a central processing unit (CPU), a digital signal processor (DSP), a programmable logic device (PLD), and application specific integrated circuits (ASIC) can be employed. The processing circuit 123 may be formed using integrated hardware or individual pieces of hardware.
[0029] The storage medium 124 is a computer-readable storage device storing a control program representing a series of processing necessary for the operation of the processing circuit 123 and various kinds of data. As the storage medium 124, e.g. a semiconductor memory or various disk media can be employed. The storage medium 124 is not limited to a non-volatile auxiliary storage device, but may include a volatile main storage device such as a register or a cache memory. At least a part of the storage medium 124 may be formed using a part of the processing circuit 123. The storage medium 124 may be formed using integrated hardware or individual pieces of hardware.
[0030] The control apparatus 20 includes e.g. a communication unit 21, the input unit 22, a display unit 23, and a control section 24. The communication unit 21 includes a communication circuit that establishes a communication link with the projector 10 and processes a signal transmitted in the communication link. The communication unit 21 establishes the communication link via a network NW with the projector 10 according to the control by the control section 24, and thereby, communicably couples to the projector 10. The communication unit 21 may include e.g. an antenna transmitting and receiving radio signals and a receptacle into which a plug of a communication cable is inserted.
[0031] The input unit 22 is an input device detecting input by the user and outputting a signal according to the input by the user to the control section 24. At least one of various input devices employable as the input device 112 can be employed as the input unit 22.
[0032] The display unit 23 is a display displaying an image on a screen according to the control by the control section 24. The display unit 23 is e.g. a flat panel display. The control section 24 may transmit image data representing an image to the projector 10 via the communication unit 21, and thereby, the image D projected by the projector 10 may be used as the image of the display unit 23. The input unit 22 and the display unit 23 may form a touch-panel display.
[0033] The control section 24 includes a processing unit 25 and a memory unit 26. The processing unit 25 forms a processing device of a computer processing calculations necessary for the operation of the control apparatus 20. For example, the processing unit 25 executes a program stored in the memory unit 26 and realizes various functions of the control apparatus 20 described in the embodiment. As a processing device forming at least a part of the processing unit 25, various arithmetic logic circuits including e.g. a CPU, a DSP, a PLD, and an ASIC can be employed. The processing unit 25 may be formed using integrated hardware or individual pieces of hardware.
[0034] The memory unit 26 is a computer-readable storage device storing a program representing a series of processing necessary for the operation of the control apparatus 20 and various kinds of data. As the memory unit 26, e.g. a semiconductor memory or various disk media can be employed. The memory unit 26 is not limited to a non-volatile auxiliary storage device, but may include a volatile main storage device such as a register or a cache memory. At least a part of the memory unit 26 may be formed using a part of the processing unit 25. The memory unit 26 may be formed using integrated hardware or individual pieces of hardware.
[0035] As below, referring to a flowchart in
[0036] At step S101, the projection device 13 projects a plurality of adjustment points defined relative to the image D as an adjustment point pattern on the screen SC. The plurality of adjustment points are projected on the screen SC as a plurality of grid points in a two-dimensional grid pattern. For example, the processing circuit 123 determines an adjustment point pattern in the initial state based on basic information on image projection stored in the storage medium 124. The OSD processing circuit 121 generates the adjustment point pattern determined by the processing circuit 123 and projects the pattern on the screen SC via the projection device 13.
[0037] For example, as shown in
[0038] In the example of
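The initial adjustment point pattern at step S101 can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual implementation; the function name, the 4 x 4 grid size, and the normalized-coordinate convention (matching the P.sub.11 to P.sub.44 naming) are assumptions.

```python
# Hypothetical sketch: an initial adjustment point pattern as grid points of a
# two-dimensional 4 x 4 grid, in coordinates normalized to [0, 1].

def make_adjustment_grid(rows=4, cols=4):
    """Return a dict mapping (i, j) grid indices, starting at (1, 1), to (x, y)
    positions evenly spaced over the unit square."""
    points = {}
    for i in range(rows):        # i indexes the vertical direction
        for j in range(cols):    # j indexes the horizontal direction
            points[(i + 1, j + 1)] = (j / (cols - 1), i / (rows - 1))
    return points

grid = make_adjustment_grid()
# grid[(1, 1)] is one corner of the pattern, grid[(4, 4)] the opposite corner;
# the user then moves these points to fit the shape of the projection surface.
```

The grid is the starting state only; steps S102 onward adjust the individual point positions.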
[0039] At step S102, the input I/F 11 determines the positions of the plurality of adjustment points P.sub.11 to P.sub.44 on the screen SC by detecting input for adjustment of the positions of the plurality of adjustment points P.sub.11 to P.sub.44. For example, the processing unit 25 prompts the user to adjust the positions of the adjustment points P.sub.11 to P.sub.44 by displaying a message, for request to arrange the positions of the adjustment points P.sub.11 to P.sub.44 to fit the shape of the projection surface, on the display unit 23. For example, the OSD processing circuit 121 adjusts the positions of the plurality of adjustment points P.sub.11 to P.sub.44 by the input I/F 11 detecting input by the user to the input unit 22. Accordingly, the display unit 23 may display an input window having a plurality of points corresponding to the plurality of adjustment points P.sub.11 to P.sub.44. The input I/F 11 detects input of completion of the adjustment of the positions of the plurality of adjustment points P.sub.11 to P.sub.44, and thereby, the processing circuit 123 judges that the positions of the plurality of adjustment points P.sub.11 to P.sub.44 are determined and proceeds with the processing to step S103.
[0040] As shown in
[0041] At step S103, the input I/F 11 determines types of the lines of the respective sides by detecting input for setting of the types of the respective lines of the plurality of sides connecting the adjustment points adjacent to each other. For example, as the line types, not only linear lines by linear interpolation such as bilinear interpolation but also various curved lines by non-linear interpolation (curve interpolation) such as polynomial interpolation, spline interpolation, or cubic interpolation can be set. For example, the processing unit 25 prompts the user to set the line types by displaying a message for request to select the line types of the respective sides on the display unit 23. The line types may be selected from e.g. a list displayed on the display unit 23 via the input unit 22 or cyclically selected from predetermined options according to the operation on the respective sides.
[0042] The processing circuit 123 determines whether the respective plurality of sides are linear lines or curved lines according to the line types set via the input I/F 11. In this regard, the OSD processing circuit 121 may calculate the shapes of the respective sides and project the respective sides on the screen SC via the projection device 13 according to the line types determined in the processing circuit 123. For example, when the line type of the side E.sub.1 shown in
[0043] In the example shown in
[0044] At step S104, the processing circuit 123 calculates a transformation factor in the non-linear interpolation according to the curved line determined at step S103. That is, the processing circuit 123 calculates a transformation factor g(x,y) for obtaining coordinates Dst(x,y) after interpolation from initial coordinates Src(x,y) before the interpolation of the image D according to the type of the curved line indicating the type of the interpolation. The processing circuit 123 calculates the transformation factor g(x,y) with respect to at least an area defined by four sides including a curved line like e.g. an area A defined by the plurality of sides F.sub.1, F.sub.2, G.sub.3, F.sub.4 including the side G.sub.3 of the curved line in
[0045] At step S105, the processing circuit 123 executes non-linear interpolation processing using the transformation factor g(x,y) calculated at step S104. That is, the processing circuit 123 generates a non-linear coordinate system by non-linear interpolation according to the type of curved line determined at step S103 based on the positions of the plurality of adjustment points.
[0046] For example, as shown in
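One concrete form of the non-linear interpolation at steps S104 and S105 can be sketched with a Catmull-Rom cubic segment, which passes through its inner control points; this is an illustrative stand-in for the polynomial, spline, or cubic interpolation the embodiment allows, and the function name and point layout are assumptions, not the patent's implementation.

```python
# Hypothetical sketch: evaluating a curved side as a Catmull-Rom cubic segment
# between adjustment points p1 and p2, with p0 and p3 as neighboring points on
# the same grid line. t runs from 0 (at p1) to 1 (at p2).

def catmull_rom(p0, p1, p2, p3, t):
    """Return the 2D point on the Catmull-Rom segment p1->p2 at parameter t."""
    def axis(a0, a1, a2, a3):
        return 0.5 * (2 * a1
                      + (a2 - a0) * t
                      + (2 * a0 - 5 * a1 + 4 * a2 - a3) * t * t
                      + (3 * a1 - a0 - 3 * a2 + a3) * t * t * t)
    return (axis(p0[0], p1[0], p2[0], p3[0]),
            axis(p0[1], p1[1], p2[1], p3[1]))
```

Sampling the segment at many values of t yields the curved-line coordinates from which the non-linear coordinate system of an area can be built.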
[0047] At step S106, the processing circuit 123 calculates a transformation factor in linear interpolation according to the linear line determined at step S103. That is, the processing circuit 123 calculates a transformation factor f(x,y) for obtaining coordinates Dst(x,y) after interpolation from initial coordinates Src(x,y) before the interpolation of the image D according to the type of the linear line indicating the type of the interpolation. The processing circuit 123 calculates the transformation factor f(x,y) with respect to at least an area defined by four sides including a linear line like e.g. the area A defined by the plurality of sides F.sub.1, F.sub.2, G.sub.3, F.sub.4 including the sides F.sub.1, F.sub.2, F.sub.4 of the linear lines in
[0048] At step S107, the processing circuit 123 executes linear interpolation processing using the transformation factor f(x,y) calculated at step S106. That is, the processing circuit 123 generates a linear coordinate system by linear interpolation according to the types of the linear lines determined at step S103 based on the positions of the plurality of adjustment points.
[0049] For example, as shown in
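The linear interpolation of steps S106 and S107 corresponds to the bilinear case: each normalized position inside an area maps to a weighted mix of the area's four corner adjustment points. The sketch below illustrates this, assuming hypothetical names and a corner ordering not taken from the patent.

```python
# Hypothetical sketch: bilinear mapping of normalized (u, v) coordinates in the
# unit square onto a quadrilateral area whose corners are adjustment points.

def bilinear_map(c00, c10, c01, c11, u, v):
    """Map (u, v) in [0, 1] x [0, 1] to the quad with corners
    c00 (u=0, v=0), c10 (u=1, v=0), c01 (u=0, v=1), c11 (u=1, v=1)."""
    def axis(k):
        return ((1 - u) * (1 - v) * c00[k] + u * (1 - v) * c10[k]
                + (1 - u) * v * c01[k] + u * v * c11[k])
    return (axis(0), axis(1))
```

Evaluating this map over all pixels of the range corresponding to the area yields the linear coordinate system of that area.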
[0050] At step S108, the processing circuit 123 synthesizes the interpolation results at step S105 and step S107 according to the line types determined at step S103. Hereinafter, an area defined by a plurality of sides including a linear line and a curved line like the area A defined by the sides F.sub.1, F.sub.2, F.sub.4 of the linear lines and the side G.sub.3 of the curved line is referred to as a “specific area”. The processing circuit 123 synthesizes the non-linear coordinate system and the linear coordinate system using weighting according to the plurality of sides defining the specific area. Thereby, the processing circuit 123 generates a coordinate system of the specific area as a result of the nonlinear interpolation and the linear interpolation.
[0051] For example, as shown in
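The weighted synthesis at step S108 can be sketched as a per-point blend of the two coordinate systems. The weight scheme here (ramping toward the curved side) is only one plausible choice of the "weighting according to the plurality of sides"; the function names and the specific weight are assumptions.

```python
# Hypothetical sketch: synthesizing the non-linear and linear coordinate
# systems of a specific area by weighted blending.

def synthesize(linear_pt, nonlinear_pt, w):
    """Blend one coordinate pair; w in [0, 1], where w = 0 keeps the linear
    interpolation result and w = 1 keeps the non-linear result."""
    return tuple((1.0 - w) * l + w * n for l, n in zip(linear_pt, nonlinear_pt))

def curved_side_weight(v):
    """Assumed weight: v is the normalized coordinate across the area, 0 at the
    side opposite the curved side and 1 at the curved side itself."""
    return max(0.0, min(1.0, v))
```

Near the curved side the synthesized coordinate system follows the non-linear result, while near the opposite linear side it follows the linear result, so the specific area joins smoothly to both of its neighbors.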
[0052] At step S109, the geometric correction circuit 122 performs geometric correction of the image D using the coordinate systems respectively generated at steps S105, S107, S108. The geometric correction circuit 122 performs the geometric correction to realize the plurality of sides defining the specific area on the image D in the range corresponding to the specific area using the coordinate system of the specific area generated at step S108 as a result of the interpolation processing.
[0053] Further, of the interpolation results obtained at steps S104 and S105, the geometric correction circuit 122 maintains the result of the non-linear interpolation for the area B defined only by sides that are curved lines, and employs the result for the geometric correction. That is, the geometric correction circuit 122 performs geometric correction on the image D in the range corresponding to the area B to realize the plurality of curved lines defining the area B. Similarly, of the interpolation results obtained at steps S106 and S107, the geometric correction circuit 122 maintains the result of the linear interpolation for the area C defined only by sides that are linear lines, and employs the result for the geometric correction. That is, the geometric correction circuit 122 performs geometric correction on the image D in the range corresponding to the area C to realize the plurality of linear lines defining the area C.
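The per-area selection described above reduces to a simple classification by side type, which can be sketched as follows; the function and label names are hypothetical, not terms from the patent.

```python
# Hypothetical sketch: selecting, for each area, which interpolation result
# the geometric correction should use, based on the types of its four sides.

def select_correction(side_types):
    """side_types: the four sides' types, each 'linear' or 'curved'."""
    if all(t == 'curved' for t in side_types):
        return 'non-linear'     # e.g. area B: keep the non-linear result
    if all(t == 'linear' for t in side_types):
        return 'linear'         # e.g. area C: keep the linear result
    return 'synthesized'        # specific area, e.g. area A: blend both
```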
[0054] In the above described manner, as shown in
[0055] According to the projection system 1 of the embodiment, regarding the attributes of the sides connecting the adjustment points, whether each side is a linear line or a curved line may be determined, and the interpolation processing may be performed using the two kinds of interpolation methods of the linear interpolation and the curve interpolation. Therefore, according to the projection system 1, appropriate geometric correction can be performed even for a projection surface having a complex shape including a plurality of flat surfaces and a plurality of curved surfaces.
[0056] As above, as shown in the flowchart in
[0057] For example, as shown in the flowchart in
[0058] At step S204, the processing circuit 123 defines virtual adjustment points by linear extrapolation on all linear lines determined at step S203. Particularly, the processing circuit 123 defines a virtual curve by defining the linearly extrapolated virtual adjustment points for the sides as the linear lines of the plurality of sides.
[0059] For example, as shown in
[0060] Similarly, the processing circuit 123 defines virtual adjustment points Q.sub.13, Q.sub.44 linearly extrapolated on both sides of the side F.sub.2 as the linear line. The processing circuit 123 defines a linear line passing through the adjustment points P.sub.23, P.sub.33 and the virtual adjustment points Q.sub.13, Q.sub.44 on both ends in the side F.sub.2 as a virtual curve H.sub.2. The virtual adjustment points Q.sub.13, Q.sub.44 are temporarily used in place of the adjustment points P.sub.13, P.sub.44 as reference values only in the curve interpolation with respect to the side F.sub.2. The processing circuit 123 defines virtual adjustment points Q.sub.12, Q.sub.42 linearly extrapolated on both sides of the side F.sub.4 as the linear line. The processing circuit 123 defines a linear line passing through the adjustment points P.sub.22, P.sub.32 and the virtual adjustment points Q.sub.12, Q.sub.42 on both ends in the side F.sub.4 as a virtual curve H.sub.4. The virtual adjustment points Q.sub.12, Q.sub.42 are temporarily used in place of the adjustment points P.sub.12, P.sub.42 as reference values only in the curve interpolation with respect to the side F.sub.4.
[0061] At step S205, the processing circuit 123 calculates a transformation factor in the non-linear interpolation based on the curved line determined at step S203 and the virtual curve defined at step S204. That is, the processing circuit 123 calculates a transformation factor for obtaining coordinates Dst(x,y) after interpolation from initial coordinates Src(x,y) before the interpolation of the image D according to the type of the curved line. The processing circuit 123 calculates the transformation factor g(x,y) by handling the linear line as the virtual curve with respect to the specific area defined by four sides including a curved line and a linear line like e.g. the area A in
[0062] At step S206, the processing circuit 123 executes non-linear interpolation processing using the transformation factor g(x,y) calculated at step S205. That is, the processing circuit 123 generates a non-linear coordinate system by non-linear interpolation with reference to the curved line determined at step S203 and the virtual curve defined at step S204. Particularly, the processing circuit 123 generates the non-linear coordinate system of the area A by non-linear interpolation with reference to the side G.sub.3 as the curved line of the plurality of sides defining the area A and all the virtual curves H.sub.1, H.sub.2, H.sub.4.
[0063] At step S207, the geometric correction circuit 122 performs geometric correction of the image D using the coordinate system generated at step S206. The geometric correction circuit 122 performs the geometric correction to realize the plurality of sides defining the specific area on the image D in the range corresponding to the specific area using the coordinate system of the specific area as a result of the interpolation processing. At step S208, the projection device 13 projects the geometrically-corrected image D on the screen SC.
[0064] Note that, for example, the virtual adjustment points may be determined in the following manner. In the example shown in
hdx=x2−x1 (1)
hdy=y2−y1 (2)
[0065] h(x0,y0) as the coordinates of the virtual adjustment point Q.sub.21 and h(x3,y3) as the coordinates of the virtual adjustment point Q.sub.24 are calculated from the following expression (3) to expression (6).
x0=x1−hdx (3)
y0=y1−hdy (4)
x3=x2+hdx (5)
y3=y2+hdy (6)
[0066] On the other hand, coordinates of the virtual adjustment point Q.sub.13 are v(x0,y0), coordinates of the adjustment point P.sub.23 are v(x1,y1), coordinates of the adjustment point P.sub.33 are v(x2,y2), and coordinates of the virtual adjustment point Q.sub.44 are v(x3,y3) with respect to the grid line in the vertical direction. A difference vdx in the horizontal direction and a difference vdy in the vertical direction of the adjustment points P.sub.23, P.sub.33 are expressed by the following expression (7) and expression (8), respectively.
vdx=x2−x1 (7)
vdy=y2−y1 (8)
[0067] v(x0,y0) as the coordinates of the virtual adjustment point Q.sub.13 and v(x3,y3) as the coordinates of the virtual adjustment point Q.sub.44 are calculated from the following expression (9) to expression (12).
x0=x1−vdx (9)
y0=y1−vdy (10)
x3=x2+vdx (11)
y3=y2+vdy (12)
[0068] As described above, the virtual adjustment points are calculated by simple additions and subtractions and may be easily defined for all sides as linear lines. Therefore, according to the projection system 1 of a modified example of the embodiment, compared to the case where the two kinds of interpolation methods of the linear interpolation and the non-linear interpolation are executed as in the flowchart in
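The extrapolation of expressions (1) to (6) (and, symmetrically, (7) to (12)) can be sketched directly in code. The function name is hypothetical, but the arithmetic follows the expressions above.

```python
# Sketch of expressions (1)-(6): linearly extrapolating the two virtual
# adjustment points of a linear side from its two inner adjustment points.

def extrapolate_virtual_points(p1, p2):
    """p1 = (x1, y1) and p2 = (x2, y2) are the adjustment points on a linear
    side; returns the virtual adjustment points (x0, y0) and (x3, y3)."""
    dx = p2[0] - p1[0]                      # expression (1): hdx = x2 - x1
    dy = p2[1] - p1[1]                      # expression (2): hdy = y2 - y1
    q_before = (p1[0] - dx, p1[1] - dy)     # expressions (3), (4)
    q_after = (p2[0] + dx, p2[1] + dy)      # expressions (5), (6)
    return q_before, q_after
```

The four resulting points are collinear and evenly spaced, so a curve interpolation passing through them reproduces the original linear side as a virtual curve.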
[0069] According to the projection system of the modified example of the embodiment, regarding the attributes of the sides connecting the adjustment points, whether the linear lines or the curved lines may be determined and the linearly extrapolated virtual adjustment points may be selectively defined for the linear lines. Therefore, in the projection system 1, the equal results to the results of the geometric correction using the plurality of kinds of interpolation methods may be obtained by execution of curve interpolation with the linear lines as the virtual curves. That is, according to the projection system 1, appropriate geometric correction can be performed even for a projection surface having a complex shape.
Other Embodiments
[0070] The embodiments are described as above, however, the present disclosure is not limited to the embodiments. The configurations of the respective parts may be replaced by any configurations having the same functions, and any configuration may be omitted or added in the respective embodiments within the technical scope of the present disclosure. From the present disclosure, various alternative embodiments would be clear to a person skilled in the art.
[0071] For example, in the flowchart shown in
[0072] In addition, obviously, the present disclosure includes various embodiments not described as above such as configurations in which arbitrary configurations described in the above described embodiments apply each other. The technical scope of the present disclosure is defined only by the matters used to specify the invention according to claims appropriate from the above described explanation.