INFORMATION PROCESSING APPARATUS, RECORDING MEDIUM, AND POSITIONING METHOD
20230078960 · 2023-03-16
CPC classification
G06V10/60
PHYSICS
Abstract
An information processing apparatus comprises a processing unit that acquires, from images that are captured by an imaging unit provided to a movable body and that include markers disposed in a space, positions of the markers in an image coordinate system, acquires primary positioning information regarding the imaging unit in a world coordinate system based on the positions of the markers in the image coordinate system and positions of the markers in the world coordinate system, and determines positioning information regarding the imaging unit based on the primary positioning information regarding the imaging unit in the world coordinate system and the positions of the markers in the world coordinate system.
Claims
1. An information processing apparatus comprising: one or more processors configured to: acquire, from an image captured by an imaging unit and including a marker disposed in a space, a position of the marker in an image coordinate system, acquire primary positioning information regarding the imaging unit in a world coordinate system, based on the position of the marker in the image coordinate system and a position of the marker in the world coordinate system, and determine positioning information regarding the imaging unit, based on the primary positioning information regarding the imaging unit in the world coordinate system and the position of the marker in the world coordinate system.
2. The information processing apparatus according to claim 1, wherein the image further includes an auxiliary marker in addition to the marker, and wherein the one or more processors: acquire the position of the marker in the image coordinate system and a position of the auxiliary marker in the image coordinate system, associate, based on the primary positioning information regarding the imaging unit and a position of the auxiliary marker in the world coordinate system, the position of the auxiliary marker in the image coordinate system with the position of the auxiliary marker in the world coordinate system, and determine the primary positioning information regarding the imaging unit, based on the position of the auxiliary marker in the image coordinate system and the position of the auxiliary marker in the world coordinate system associated with each other.
3. The information processing apparatus according to claim 1, wherein the one or more processors: identify, from the image captured, a light source that is identifiable based on a light emitting mode thereof as the marker, and acquire the position of the marker in the image coordinate system.
4. The information processing apparatus according to claim 1, wherein the one or more processors: identify the marker comprising two or more markers from the image captured, acquire a plurality of positions of markers in the image coordinate system, and acquire the primary positioning information regarding the imaging unit, based on the plurality of positions of markers in the image coordinate system and positions in the world coordinate system that respectively correspond to the two or more markers identified.
5. The information processing apparatus according to claim 2, wherein the one or more processors: specify an imaging position and an imaging direction for the image captured, based on the primary positioning information, calculate a new position of the auxiliary marker by performing projection calculation processing on an assumption that the position of the auxiliary marker in the world coordinate system is projected onto the image captured, and associate the position of the auxiliary marker in the world coordinate system with the calculated new position of the auxiliary marker so as to associate the position of the auxiliary marker in the image coordinate system with the position of the auxiliary marker in the world coordinate system.
6. The information processing apparatus according to claim 2, wherein the auxiliary marker has a polygonal shape, and wherein the one or more processors: acquire the position of the auxiliary marker in the image coordinate system, based on positions of vertexes of the polygonal shape of the auxiliary marker in the image captured.
7. The information processing apparatus according to claim 6, wherein the one or more processors: acquire a center position of the auxiliary marker in a case where the auxiliary marker in the image captured is of a size smaller than a predetermined size, and acquire the position of the auxiliary marker in the image coordinate system, based on the center position.
8. A non-transitory computer-readable storage medium storing a program that causes a computer to perform operations comprising: acquiring, from an image captured by an imaging unit and including a marker disposed in a space, a position of the marker in an image coordinate system, acquiring primary positioning information regarding the imaging unit in a world coordinate system, based on the position of the marker in the image coordinate system and a position of the marker in the world coordinate system, and determining positioning information regarding the imaging unit, based on the primary positioning information regarding the imaging unit in the world coordinate system and the position of the marker in the world coordinate system.
9. A positioning method executable by a computer, the positioning method comprising: acquiring, from an image captured by an imaging unit and including a marker disposed in a space, a position of the marker in an image coordinate system, acquiring primary positioning information regarding the imaging unit in a world coordinate system, based on the position of the marker in the image coordinate system and a position of the marker in the world coordinate system, and determining positioning information regarding the imaging unit, based on the primary positioning information regarding the imaging unit in the world coordinate system and the position of the marker in the world coordinate system.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
[0033] Embodiments of the present invention will be described with reference to the drawings.
(Positioning System 1)
[0034] First, a positioning system 1 will be outlined with reference to
[0035] As illustrated in
[0036] The imaging unit 2 captures images 7 in a state where the movable body 3 has the imaging unit 2 mounted thereto. The imaging unit 2 of the present embodiment consecutively captures images of the ceiling 4 above the imaging unit 2 at a predetermined frame rate, thereby obtaining a plurality of temporally consecutive images 7 of the ceiling 4.
[0037] The imaging unit 2 is connected to a communication device 28 for communicating with the information processing apparatus 10. The images 7 captured by the imaging unit 2 or various pieces of information based on the images 7 are transmitted to the information processing apparatus 10 via the communication device 28. In this example, the movable body 3 having the imaging unit 2 mounted thereto is a work vehicle such as a forklift.
[0038] The marker 5 is an object including either a device or an indicator that enables three-dimensional position information, in the world coordinate system 70 in which the marker 5 is disposed, to be acquired from a database 19, either by means of data transmitted by the marker 5 itself or by way of image processing on an image of the marker 5 captured by the imaging unit 2.
[0039] The marker 5 of the present embodiment is, for example, a light emitting device capable of controlling light emitting modes thereof, such as a color of visible light and timing for emitting visible light. The marker 5 optically transmits identification information by, for example, changing the color of visible light or blinking visible light according to predetermined patterns. The marker 5 may emit near infrared light instead of visible light. In other words, it is only necessary for the marker 5 to emit light within a light wave band that can be captured by a camera.
[0040] In the present embodiment, a plurality of markers 5 are provided in order to specify the position and orientation of the imaging unit 2. In the example illustrated in
[0041] Each auxiliary marker 6 does not itself transmit data. Therefore, even in a case where the imaging unit 2 captures an auxiliary marker 6 in an image, three-dimensional position information in the world coordinate system 70 in which the auxiliary marker 6 is disposed cannot be acquired from the database 19 by way of image processing alone. The auxiliary markers 6 are, for example, a plurality of light sources or lighting devices installed on the ceiling 4. In the present specification, each auxiliary marker 6 indicates a region that can be specified from the captured image 7 by way of image processing. The region is specified based on, for example, luminance, chromaticity, or the like.
[0042] The building 9 has a window 8. In a case where the window 8 is captured in the image 7, the window 8 may satisfy a luminance condition depending on a time zone. In this respect, image processing is executed such that control is performed so as not to process the region of the window 8 as the auxiliary marker 6. The auxiliary markers 6 of the present embodiment are not limited to auto-luminous objects. A reflective white material that reflects light, a green colored region existing on a brown plane, or the like may be specified as the auxiliary marker 6 from the image 7 by way of image processing.
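The luminance-based region specification described above can be sketched as follows. This is an illustrative assumption rather than the embodiment's actual image processing: bright connected regions are extracted from a small luminance grid and reduced to centroid positions, while pixels in an excluded region (such as the window 8) are ignored.

```python
# Hypothetical sketch of specifying auxiliary markers 6 from a luminance
# image; the grid representation, threshold, and exclusion set are assumptions.

def find_auxiliary_markers(luminance, threshold, excluded):
    """Return centroids of bright connected regions, skipping excluded
    pixels (e.g. the region of a window that satisfies the luminance
    condition but must not be treated as an auxiliary marker)."""
    rows, cols = len(luminance), len(luminance[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or luminance[r][c] < threshold or (r, c) in excluded:
                continue
            # flood-fill the connected bright region
            stack, pixels = [(r, c)], []
            seen[r][c] = True
            while stack:
                y, x = stack.pop()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx]
                            and luminance[ny][nx] >= threshold
                            and (ny, nx) not in excluded):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            cy = sum(p[0] for p in pixels) / len(pixels)
            cx = sum(p[1] for p in pixels) / len(pixels)
            centroids.append((cy, cx))
    return centroids
```

Passing the window region as the exclusion set realizes the control under which the window 8 is not processed as an auxiliary marker 6.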
[0043] The information processing apparatus 10 measures a position of the imaging unit 2 based on known positions of the markers 5 and the auxiliary markers 6, positions 51 of the markers 5 in the image 7 (hereinafter referred to as the in-image marker positions 51), and the positions 61 of the auxiliary markers 6 in the image 7 (hereinafter referred to as the in-image auxiliary marker positions 61). The known positions of the markers 5 and the auxiliary markers 6 may be specified from, for example, a design drawing in advance, or may be specified by another positioning device. In the present specification, the position of each marker 5 and the position of each auxiliary marker 6 are represented in terms of the three-dimensional world coordinate system 70.
[0044] The information processing apparatus 10 of the present embodiment is connected to a communication device 18 for communicating with the imaging unit 2. Establishing communication between the communication device 18 of the information processing apparatus 10 and the communication device 28 of the imaging unit 2 allows the information processing apparatus 10 to acquire the images 7 from the imaging unit 2.
(Hardware Configuration)
[0046] In the example illustrated in
[0047] The processor 11 performs various calculations and processing. The processor 11 is, for example, a central processing unit (CPU), a micro processing unit (MPU), a system on a chip (SoC), a digital signal processor (DSP), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a programmable logic device (PLD), or a field-programmable gate array (FPGA). Alternatively, the processor 11 may be a combination of two or more of the foregoing components. Further, the processor 11 may be a combination of two or more of the foregoing components and a hardware accelerator or the like.
[0048] The processor 11, the ROM 12, and the RAM 13 are connected to one another via a bus. The processor 11 executes various types of processing in accordance with a program recorded in the ROM 12 or a program loaded to the RAM 13. A part or the entirety of the program may be incorporated in the circuitry of the processor 11.
[0049] The input/output unit 14 includes a keyboard, various buttons, a microphone, and the like, and inputs various kinds of information according to a user's instruction operation. The input/output unit 14 further includes a display, a speaker, and the like, and outputs images and sounds. The input/output unit 14 includes an information terminal with a touch panel. The communication unit 15 is a network interface for communicating with the imaging unit 2 and other devices via the communication device 18. The storage unit 16 is an area for storing various kinds of information such as the captured images 7 and known information.
[0050] Next, an example of the hardware configuration of the imaging unit 2 will be described. The imaging unit 2 includes an optical lens unit 21 and an image sensor 22.
[0051] The optical lens unit 21 includes a lens that condenses light in order to capture an image of a subject, such as a focus lens and a zoom lens. The focus lens is for forming a subject image on a light-receiving surface of the image sensor 22. The zoom lens is for freely changing the focal length within a certain range. The optical lens unit 21 is optionally provided with peripheral circuits for adjusting setting parameters such as focus, exposure, and white balance.
[0052] The image sensor 22 includes, for example, a photoelectric conversion element, an analog front end (AFE), etc. The photoelectric conversion element includes, for example, a complementary metal oxide semiconductor (CMOS) type photoelectric conversion element. An image of a subject is incident on the photoelectric conversion element from the optical lens unit 21. In response, the photoelectric conversion element photoelectrically converts (i.e., captures) the image of the subject, accumulates image signals for a certain period of time, and sequentially supplies the accumulated image signals to the AFE as analog signals. The AFE performs various signal processing such as analog/digital (A/D) conversion processing on the analog image signals. By way of the various signal processing, digital signals are generated, and the image 7 is output in the form of output signals from the imaging unit 2.
[0053] (Functional Configuration of Information Processing Apparatus)
[0054]
[0055] The information processing apparatus 10 performs various kinds of control by means of a processing unit 100 that is implemented by the processor 11 executing arithmetic processing based on predetermined programs.
[0056] The processing unit 100 of the present embodiment includes, on a function-to-function basis, an image processing unit 101, a marker identification unit 102, a known information acquisition unit 103, a primary positioning unit 104, a calculation unit 105, an auxiliary marker identification unit 106, a positioning finalization unit 107, and an output unit 108.
[0057] The image processing unit 101 acquires the image 7 and performs preprocessing such as distortion correction. The marker identification unit 102 is capable of acquiring the in-image marker positions 51 by executing processing for identifying the markers 5 from the image 7.
[0058] The known information acquisition unit 103 acquires, from the database 19, preset known information such as imaging unit internal parameters 23, imaging unit installation parameters 24, known marker positions 52 in the world coordinate system 70 (hereinafter referred to as the in-WC-system marker known positions 52), and known auxiliary marker positions 62 in the world coordinate system 70 (hereinafter referred to as the in-WC-system auxiliary marker known positions 62). The database 19 may be built in the storage unit 16 of the information processing apparatus 10, or may be built in a server outside the information processing apparatus 10.
[0059] The primary positioning unit 104 performs primary positioning of the imaging unit 2 based on the in-image marker positions 51 acquired from the image 7 and the in-WC-system marker known positions 52 acquired from the database 19, to thereby acquire a primary position of the imaging unit 2 in the world coordinate system 70 (hereinafter referred to as the in-WC-system imaging primary position). The in-WC-system imaging primary position includes the position where the imaging unit 2 is and the direction in which the imaging unit 2 is oriented.
[0060] The calculation unit 105 performs projection calculation processing on an assumption that the in-WC-system auxiliary marker known positions 62 are projected onto an image 7 captured by the imaging unit 2, thereby calculating the projected positions. The auxiliary marker identification unit 106 performs identification processing by associating the positions 61 of the auxiliary markers 6 acquired from the captured image 7 by way of image processing (hereinafter referred to as the in-image auxiliary marker positions 61) with the in-WC-system auxiliary marker known positions 62. In the present embodiment, this associating processing is performed so that the position of the movable body 3 can be calculated as a PnP problem including the in-image auxiliary marker positions 61 and the in-WC-system auxiliary marker known positions 62, in addition to the in-image marker positions 51 and the in-WC-system marker known positions 52. Thus, even in a case where a small number of markers 5 (e.g., two markers 5) are disposed, the position of the movable body 3 can be calculated with high accuracy by utilizing the auxiliary markers 6 such as existing lighting devices.
[0061] The positioning finalization unit 107 performs, based on, for example, the primary position of the imaging unit 2, the in-WC-system marker known positions 52, and the in-WC-system auxiliary marker known positions 62, processing for determining and acquiring the finalized position of the imaging unit 2 in the world coordinate system 70 (hereinafter referred to as the in-WC-system imaging unit determined position). The in-WC-system imaging unit determined position includes the position where the imaging unit 2 is and the direction in which the imaging unit 2 is oriented. The output unit 108 performs processing for outputting the results of the positioning.
[0062] In this example, the information processing apparatus 10 is installed at a location away from the imaging unit 2, but the present invention is not limited to this configuration. For example, the information processing apparatus 10 may be configured to be mounted to the movable body 3 or the imaging unit 2.
[0063] (Known Information)
[0064] Next, examples of known information registered in advance in the database 19 will be described with reference to
[0067] (Positioning Process)
[0068] Next, a positioning process using the images 7 will be described.
[0069] In response to a start of the positioning process, the processing unit 100 acquires the image 7 captured by the imaging unit 2, and performs distortion correction processing (Step S101). In the distortion correction processing, barrel aberration of the image captured by, for example, a wide-angle lens is corrected.
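The barrel-aberration correction of Step S101 can be illustrated with a standard two-term radial distortion model. This is a generic sketch, not the embodiment's specific correction: the coefficients k1 and k2 are assumed values, the forward model maps an undistorted normalized point to its distorted position, and the correction inverts that model by fixed-point iteration.

```python
# Hedged sketch of radial (barrel) distortion correction for one
# normalized image point; k1 and k2 are assumed calibration coefficients.

def distort(xu, yu, k1, k2):
    """Forward radial model: x_d = x_u * (1 + k1*r^2 + k2*r^4)."""
    r2 = xu * xu + yu * yu
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return xu * s, yu * s

def undistort(xd, yd, k1, k2, iters=20):
    """Invert the radial model by fixed-point iteration: repeatedly
    re-estimate the radius from the current guess and divide it out."""
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / s, yd / s
    return xu, yu
```

A negative k1 corresponds to the barrel aberration of a wide-angle lens mentioned above; the iteration converges quickly for moderate distortion.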
[0070] After Step S101, the processing unit 100 performs processing for specifying the positions of the auxiliary markers 6 (Step S102).
[0071] An example of the processing for specifying the positions of the auxiliary markers will be described with reference to
[0073] The in-image auxiliary marker positions 61 in the captured image 7 are specified with reference to two-dimensional coordinates set on the captured image 7. In the present specification, the two-dimensional coordinate system on the captured image 7 is referred to as the image coordinate system 71.
[0075] Next, the marker identification unit 102 performs marker identification processing (Step S103).
[0077] An example of the marker identification processing will be described. The marker identification unit 102 determines a light emission pattern based on captured consecutive images 7, and compares the light emission pattern with a preset light emission pattern to specify the positions of the markers 5 in the captured images 7.
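The comparison of an observed light emission pattern against preset patterns can be sketched as below. The pattern table and the on/off sampling (one sample per consecutive frame, assumed aligned with the start of each emission cycle) are illustrative assumptions, not the embodiment's actual encoding.

```python
# Hypothetical sketch of marker identification by blink pattern.
# PRESET_PATTERNS maps a marker id to an assumed on/off emission sequence.

PRESET_PATTERNS = {
    "5a": (1, 0, 1, 1),   # hypothetical blink sequence for marker 5a
    "5b": (1, 1, 0, 1),   # hypothetical blink sequence for marker 5b
}

def identify_marker(observed):
    """Compare an on/off sequence observed over consecutive images 7
    against the preset light emission patterns; return the matching
    marker id, or None if no pattern matches."""
    for marker_id, pattern in PRESET_PATTERNS.items():
        if tuple(observed) == pattern:
            return marker_id
    return None
```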
[0078] Next, the primary positioning unit 104 performs positioning possibility determination processing (Step S104). As the positioning possibility determination processing, for example, the primary positioning unit 104 determines whether two or more markers 5 are present in the captured image 7 and positioning can be performed based on the markers 5.
[0079] When it is determined that positioning can be performed based on the markers 5 (Step S104: YES), the primary positioning unit 104 performs primary positioning processing (Step S105).
[0081] An example of the primary positioning processing will be now described. As described above, since the markers 5a and 5b can be individually identified from the captured image 7, the primary positioning unit 104 acquires the in-WC-system marker known positions 52 of the markers 5a and 5b based on the correspondence shown in
[0082] As the primary positioning processing, for example, the primary positioning unit 104 performs primary positioning of the imaging unit 2 by way of the PnP positioning processing, based on the in-image marker positions 51 and the in-WC-system marker known positions 52.
[0083] As is conventionally known, when three-dimensional positions of a group of points and their corresponding positions in a captured image 7 are given, the position, the posture, and the like of the imaging unit 2 can be determined by way of the PnP positioning processing. In a case where positioning is performed based on six points, the position and the imaging direction (an in-plane azimuth, an elevation angle) of the imaging unit 2 can be determined. Even in a case where the number of points is less than six, for example, when the imaging unit 2 is on a plane parallel to the ceiling 4 on which the markers 5 and the auxiliary markers 6 are disposed, the position and elevation angle of the imaging unit 2 can be determined by way of positioning based on only two points, even though the resulting accuracy is not high.
[0084] Since the movable body 3 moves on the floor surface parallel to the ceiling 4, the primary positioning unit 104 can determine, by way of a P2P method, the in-WC-system imaging primary position that includes the primary position and elevation angle of the imaging unit 2. The primary positioning unit 104 determines the in-WC-system imaging primary position, which is the primary position of the imaging unit 2, by way of the P2P method and using the in-image marker positions 51 denoted by identification numbers M1 and M2 and the in-WC-system marker known positions 52 denoted by identification numbers 5a and 5b shown in
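Under the planar assumption above, the two-point primary positioning can be sketched as a 2D rigid registration: the imaged baseline between the two markers fixes the yaw, and back-projecting one marker fixes the camera position. The pinhole model with an upward-facing camera and a known focal-length-to-height ratio (f/h) is an illustrative assumption.

```python
# Hedged sketch of the P2P primary positioning: the camera is assumed to
# look straight up at the ceiling plane, so a ceiling point P projects to
# u = (f/h) * R(-yaw) * (P - C), where C is the camera position.

import math

def p2p_position(p1_w, p2_w, u1, u2, f_over_h):
    """Recover the 2D camera position and yaw from two marker
    correspondences (world positions p*_w, image positions u*)."""
    # yaw: angle between the world baseline and the imaged baseline
    ang_w = math.atan2(p2_w[1] - p1_w[1], p2_w[0] - p1_w[0])
    ang_i = math.atan2(u2[1] - u1[1], u2[0] - u1[0])
    yaw = ang_w - ang_i
    # position: back-project marker 1 and subtract from its world position
    c, s = math.cos(yaw), math.sin(yaw)
    qx = (u1[0] * c - u1[1] * s) / f_over_h
    qy = (u1[0] * s + u1[1] * c) / f_over_h
    return (p1_w[0] - qx, p1_w[1] - qy), yaw
```

This mirrors the statement above that two points suffice to determine the position and orientation when the imaging unit 2 moves on a plane parallel to the ceiling 4.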
[0085] In a case where the positioning cannot be performed based on the markers 5 (Step S104: NO), the primary positioning unit 104 performs position estimation processing (Step S106). As the position estimation processing, for example, the primary positioning unit 104 estimates the position of the imaging unit 2 based on a previously-acquired in-WC-system imaging unit determined position and a movement vector history, and defines the estimated position as a primary positioning result.
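The fallback position estimation of Step S106 can be sketched as simple dead reckoning. Averaging the last few movement vectors is an illustrative assumption; the embodiment only specifies that the previous determined position and the movement vector history are used.

```python
# Hedged sketch of the position estimation processing (Step S106):
# extrapolate from the previously determined position using recent motion.

def estimate_position(last_position, movement_history, window=3):
    """Extrapolate one step ahead by averaging the last `window`
    movement vectors; with no history, hold the last position."""
    recent = movement_history[-window:]
    if not recent:
        return last_position
    mx = sum(v[0] for v in recent) / len(recent)
    my = sum(v[1] for v in recent) / len(recent)
    return (last_position[0] + mx, last_position[1] + my)
```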
[0086] Next, the calculation unit 105 performs auxiliary marker recognition processing (Step S107). As the auxiliary marker recognition processing, for example, the calculation unit 105 determines whether or not an auxiliary marker 6 is present in the captured image 7.
[0087] When there is no auxiliary marker 6 in the captured image 7 (Step S107: NO), the primary positioning unit 104 performs primary positioning result application processing (Step S111). As the primary positioning result application processing, for example, the primary positioning unit 104 applies the primary positioning result as a determined positioning result.
[0088] When the auxiliary markers 6 are present in the captured image 7 (Step S107: YES), the calculation unit 105 performs position calculation processing for the auxiliary markers 6 (hereinafter referred to as the auxiliary marker position calculation processing) (Step S108). The auxiliary marker position calculation processing is performed in the following manner, for example. On an assumption that the in-WC-system auxiliary marker known positions 62 are captured in an image 7 from the in-WC-system imaging primary position determined based on the primary positioning result, the calculation unit 105 calculates positions where the in-WC-system auxiliary marker known positions 62 would appear on the captured image 7, in other words, positions 63 at which the in-WC-system auxiliary marker known positions 62 would be projected (hereinafter referred to as the calculative auxiliary marker positions 63).
[0089] This calculation is well-known processing for inversely calculating, from the position and posture of the imaging unit 2, the position at which an arbitrarily-designated three-dimensional point is drawn on an image. The calculation itself is a simple matrix calculation. For this reason, the projection extends over an infinite field of view, and points on the back side of the in-WC-system imaging unit position are also plane-projected. Performing the PnP positioning processing in this state as it is would make the calculation illogical or cause a significant error. To address these inconveniences, the calculation according to the present embodiment is performed after sorting out only the auxiliary markers 6 at the in-WC-system auxiliary marker known positions 62 that would be approximately within the imaging angle of view of an image captured from the in-WC-system imaging primary position determined by the primary positioning.
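The projection with the sorting-out step can be sketched as follows. The upward-facing pinhole model, the yaw-only rotation, and the angle-of-view gate are assumptions for illustration; the essential point is that points behind the camera and points outside the angle of view are discarded before any PnP calculation.

```python
# Hedged sketch of the auxiliary marker position calculation processing:
# project world points through an assumed pinhole model and keep only
# those in front of the camera and within the angle of view.

import math

def project_visible(points_w, cam_pos, yaw, f, half_fov_tan):
    """Project 3D world points onto the image plane of an upward-facing
    camera at cam_pos = (x, y, z); return calculative image positions."""
    projected = []
    c, s = math.cos(yaw), math.sin(yaw)
    for (X, Y, Z) in points_w:
        dx, dy, dz = X - cam_pos[0], Y - cam_pos[1], Z - cam_pos[2]
        # camera frame: rotate about the vertical (optical) axis
        qx = dx * c + dy * s
        qy = -dx * s + dy * c
        if dz <= 0:
            continue  # behind the camera: would otherwise be plane-projected
        if abs(qx / dz) > half_fov_tan or abs(qy / dz) > half_fov_tan:
            continue  # outside the assumed angle of view
        projected.append((f * qx / dz, f * qy / dz))
    return projected
```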
[0091] The positioning finalization unit 107 performs linking processing (Step S109). As the linking processing, for example, the positioning finalization unit 107 associates the in-image auxiliary marker positions 61 with the calculative auxiliary marker positions 63, based on the positional relationship in one captured image, and then links the auxiliary markers 6 to the in-WC-system auxiliary marker known positions 62 based on which the calculative auxiliary marker positions 63 are determined.
[0093] FIGS. 13A and 13B are superimposed on each other. The calculative auxiliary marker positions 63 and the in-image auxiliary marker positions 61 are similar to each other but deviate from each other to some extent due to an error in the positioning. Among these points, ones that are closest to, or in proximity to, each other and that are in a positional relationship not disrupting the mutual positional relationship are linked to each other. In
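The linking of closest pairs can be sketched as a greedy mutual-nearest matching. The distance gate and the closest-pairs-first strategy are assumptions standing in for the "proximity without disrupting the mutual positional relationship" criterion described above.

```python
# Hedged sketch of the linking processing (Step S109): pair each observed
# in-image auxiliary marker position with its nearest calculative position,
# closest pairs first and mutually exclusively; max_dist is an assumed gate.

def link_markers(in_image, calculative, max_dist):
    """Return index pairs (observed, projected) linking the two point sets."""
    candidates = []
    for i, p in enumerate(in_image):
        for j, q in enumerate(calculative):
            d = ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
            if d <= max_dist:
                candidates.append((d, i, j))
    candidates.sort()  # closest pairs claim their links first
    used_i, used_j, links = set(), set(), []
    for d, i, j in candidates:
        if i not in used_i and j not in used_j:
            used_i.add(i)
            used_j.add(j)
            links.append((i, j))
    return sorted(links)
```

Each returned link ties an in-image auxiliary marker position 61 back to the in-WC-system auxiliary marker known position 62 from which the matched calculative position 63 was projected.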
[0095] The primary positioning unit 104 calculates the in-WC-system imaging primary position of the imaging unit 2 based on the two markers 5. In this respect, the state in which the auxiliary markers 6 are linked to the in-WC-system auxiliary marker known positions 62 as shown in
[0096] The positioning finalization unit 107 further performs the PnP positioning processing (Step S110). For example, the positioning finalization unit 107 performs the PnP positioning processing, based on the in-WC-system marker known positions 52, the in-WC-system auxiliary marker known positions 62, the in-image marker positions 51, and the in-image auxiliary marker positions 61.
[0097] While the primary positioning has been performed according to the P2P positioning calculation, the processing in this step can be performed as a P18P problem in which positioning is performed based on the imaging results of the eighteen three-dimensional known points, whereby the accuracy is improved.
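A minimal illustration of how the additional linked points improve the primary result: the planar pose (x, y, yaw) is refined by minimizing the total reprojection error over all linked points. The projection model, the coordinate-descent solver, and the step schedule are assumptions, not the embodiment's actual PnP solver.

```python
# Hedged sketch of refining the primary pose with all linked known points
# (the many-point PnP case above), using the same planar projection model:
# u = (f/h) * R(-yaw) * (P - C).

import math

def reproject_error(pose, world_pts, image_pts, f_over_h):
    """Sum of squared reprojection errors for a planar pose (cx, cy, yaw)."""
    cx, cy, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    err = 0.0
    for (X, Y), (u, v) in zip(world_pts, image_pts):
        dx, dy = X - cx, Y - cy
        pu = f_over_h * (dx * c + dy * s)
        pv = f_over_h * (-dx * s + dy * c)
        err += (pu - u) ** 2 + (pv - v) ** 2
    return err

def refine_pose(initial, world_pts, image_pts, f_over_h, step=0.25, iters=200):
    """Coordinate descent with a shrinking step, starting from the
    primary positioning result."""
    pose = list(initial)
    best = reproject_error(pose, world_pts, image_pts, f_over_h)
    for _ in range(iters):
        improved = False
        for k in range(3):
            for delta in (step, -step):
                trial = list(pose)
                trial[k] += delta
                e = reproject_error(trial, world_pts, image_pts, f_over_h)
                if e < best:
                    pose, best, improved = trial, e, True
        if not improved:
            step *= 0.5
            if step < 1e-9:
                break
    return tuple(pose), best
```

With exact synthetic correspondences, the refinement recovers the true pose; with noisy measurements, the larger point count averages out the per-point error, which is the accuracy gain described above.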
[0098] Then, the output unit 108 outputs the results of the PnP positioning processing as the determined positioning results.
[0099] The processing unit 100 performs positioning end determination processing (Step S112).
[0100] As the positioning end determination processing, the processing unit 100 ends the positioning process when positioning is to be ended (Step S112: YES). When the positioning is not to be ended (Step S112: NO), the processing unit 100 returns the positioning process to the stage denoted by “A” in the flowchart.
[0101] In the first embodiment, the P2P positioning processing is employed in the primary positioning, and two markers 5 are used as the minimum number of markers required for the primary positioning. The minimum number of the markers 5 may be set to six, i.e., P6P may be set as the minimum requirement for the positioning. This is because the position and imaging direction of the imaging unit 2 can be derived by way of the P6P positioning. Furthermore, the present embodiment is not limited to the above-described positioning based on two or six points, and is effective for all types of the PnP positioning processing.
Effects of the Embodiment
[0102] In the first embodiment, as the primary positioning method, the markers 5 are specified and identified by the imaging unit 2, thereby eliminating the need to input an initial state, such as the position and imaging direction of the imaging unit 2, to the information processing apparatus 10, and the positioning of the imaging unit 2 can be determined at any point where the markers 5 can be captured in an image. The function of identifying the markers 5 is effective even in a case where the markers 5 appear as very small points in a captured image. Therefore, the positioning system 1 can be suitably constructed by additional installation of a minimum number of markers 5 for practically implementing the primary positioning. As a result, the positioning system 1 can be constructed at low cost and in a short period of time.
[0103] The information processing apparatus 10 includes the processing unit 100 that acquires, by way of image processing, from images 7 that are captured by the imaging unit 2 provided to the movable body 3 and that include the markers 5 and the auxiliary markers 6 disposed in a space, the in-image marker positions 51 indicating the positions of the markers 5 in the image coordinate system 71 and the in-image auxiliary marker positions 61 indicating the positions of the auxiliary markers 6 in the image coordinate system 71, acquires the in-WC-system imaging primary position indicating a three-dimensional position of the imaging unit 2 in the world coordinate system 70, based on the in-image marker positions 51 and the in-WC-system marker known positions 52 indicating the known three-dimensional positions of the markers 5 in the world coordinate system 70, associates the in-image auxiliary marker positions 61 with the in-WC-system auxiliary marker known positions 62, based on the in-WC-system imaging primary position and the in-WC-system auxiliary marker known positions 62 indicating three-dimensional positions of the auxiliary markers 6 in the world coordinate system 70, and determines the position of the imaging unit 2 based on the in-image auxiliary marker positions 61 and the in-WC-system auxiliary marker known positions 62.
[0104] The provided information processing apparatus 10 functions with a smaller number of markers 5 and derives positioning results with high positioning accuracy.
[0105] The information processing apparatus 10 does not suffer from the difficulty in highly accurate tracking that can arise in the case of using radio waves. Further, unlike Visual-SLAM and LiDAR-SLAM, the information processing apparatus 10 is free from the disadvantage that determination of positioning is less easily assured, and is free from the problem of loop closing. As a result, a practical positioning system can be constructed.
[0106] The processing unit 100 included in the information processing apparatus 10 identifies, from the captured images 7, light sources that emit visible light and are identifiable based on light emission modes thereof, as the markers 5, and acquires the in-image marker positions 51.
[0107] Each marker 5 can be reliably identified based on its unique light emission mode captured in images.
[0108] The processing unit 100 included in the information processing apparatus 10 identifies two or more markers 5 from the captured images 7, acquires a plurality of in-image marker positions 51, and acquires the in-WC-system imaging primary position based on the plurality of in-image marker positions 51 and the in-WC-system marker known positions 52 respectively corresponding to the markers 5.
[0109] Even a small number of markers 5 is sufficient for the processing unit 100 to acquire the in-WC-system imaging primary position. Because only a small number of markers 5 needs to be provided, the cost, the number of man-hours required for start-up, and the like can be reduced.
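The primary positioning step described above is, in essence, a perspective-n-point (PnP) problem: recovering the camera position from correspondences between the in-image marker positions 51 and the in-WC-system marker known positions 52. The following is a purely illustrative sketch using a Direct Linear Transform (DLT), not the claimed implementation; the intrinsic matrix and point values are hypothetical, and this simple variant needs at least six correspondences, whereas dedicated PnP solvers work with fewer.

```python
import numpy as np

def estimate_camera_position(world_pts, image_pts):
    """Recover the camera centre in world coordinates from 3D-2D marker
    correspondences via a Direct Linear Transform (needs >= 6 points)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    P = Vt[-1].reshape(3, 4)        # projection matrix, recovered up to scale
    _, _, Vp = np.linalg.svd(P)
    C = Vp[-1]                      # camera centre is the null vector of P
    return C[:3] / C[3]             # dehomogenize to a 3D world position
```

In practice, a robust PnP solver with outlier rejection would normally be preferred over plain DLT, but the sketch shows why known world positions of the markers are sufficient to fix the imaging position.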
[0110] The processing unit 100 included in the information processing apparatus 10 specifies the imaging position and imaging direction for the captured images 7 based on the in-WC-system imaging primary position, calculates the calculative auxiliary marker positions 63 by performing projection calculation processing on an assumption that the in-WC-system auxiliary marker known positions 62 are projected onto the captured image 7, and associates the in-image auxiliary marker positions 61 with the in-WC-system auxiliary marker known positions 62 by associating the in-image auxiliary marker positions 61 with the calculative auxiliary marker positions 63.
[0111] The positions of the auxiliary markers 6, from which the in-image auxiliary marker positions 61 originate, are identified as the in-WC-system auxiliary marker known positions 62. The auxiliary markers 6 thereby become usable in the PnP positioning processing, just like the markers 5.
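The projection calculation processing described in [0110] can be sketched under a standard pinhole camera model. The intrinsic matrix K and the pose (R, t), which would be derived from the in-WC-system imaging primary position, are hypothetical values here; this is an illustration, not the claimed implementation.

```python
import numpy as np

def project_to_image(world_pts, K, R, t):
    """Pinhole projection of world-coordinate points into the image plane.

    K: 3x3 intrinsic matrix; R, t: world-to-camera rotation and translation.
    Returns an (N, 2) array of pixel coordinates (the calculative positions).
    """
    pts = []
    for X in world_pts:
        x = K @ (R @ np.asarray(X, float) + t)   # homogeneous image point
        pts.append(x[:2] / x[2])                 # perspective division
    return np.array(pts)
```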
[0112] As a result of the projection calculation processing, the processing unit 100 included in the information processing apparatus 10 performs associating processing by associating one in-WC-system auxiliary marker known position 62, based on which the calculative auxiliary marker position 63 located at or in proximity to one in-image auxiliary marker position 61 is calculated, with the one in-image auxiliary marker position 61.
[0113] The associating processing is reliably performed by a simple process of acquiring the in-image auxiliary marker position 61 located in proximity to the calculative auxiliary marker position 63 in the image coordinate system 71.
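The proximity-based associating processing can be sketched as a nearest-neighbour search in the image coordinate system 71. The distance threshold below is a hypothetical tuning parameter, and the function is an illustrative simplification rather than the claimed implementation.

```python
import numpy as np

def associate(in_image_pts, calculative_pts, max_dist=20.0):
    """Pair each detected in-image auxiliary marker position with the nearest
    calculative (projected) position, rejecting pairs farther apart than
    max_dist pixels. Returns (image_index, calculative_index) pairs."""
    calc = np.asarray(calculative_pts, float)
    pairs = []
    for i, p in enumerate(np.asarray(in_image_pts, float)):
        d = np.linalg.norm(calc - p, axis=1)   # distances to all projections
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            pairs.append((i, j))
    return pairs
```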
[0114] A program causes a computer to perform functions that include: a function of acquiring positions of the markers 5 and the auxiliary markers 6 by acquiring, from images 7 captured by the imaging unit 2 that is provided to the movable body 3 and including the markers 5 and the auxiliary markers 6 that are disposed in a space, the in-image marker positions 51 indicating the positions of the markers 5 in the image coordinate system 71 and the in-image auxiliary marker positions 61 indicating the positions of the auxiliary markers 6 in the image coordinate system 71 by way of image processing; a primary positioning function of acquiring the in-WC-system imaging primary position indicating a three-dimensional position of the imaging unit 2 in the world coordinate system 70, based on the in-image marker positions 51 and the in-WC-system marker known positions 52 indicating known three-dimensional positions of the markers 5 in the world coordinate system 70; and a positioning determination function of associating the in-image auxiliary marker positions 61 with the in-WC-system auxiliary marker known positions 62, based on the in-WC-system imaging primary position and the in-WC-system auxiliary marker known positions 62 indicating three-dimensional positions of the auxiliary markers 6 in the world coordinate system 70, and determining the position of the imaging unit 2 based on the in-image auxiliary marker positions 61 and the in-WC-system auxiliary marker known positions 62.
[0115] The provided program functions with a small number of markers 5 and derives positioning results with high positioning accuracy.
[0116] A positioning method performable by a computer includes: acquiring positions of the markers 5 and the auxiliary markers 6 by acquiring, from images 7 captured by the imaging unit 2 that is provided to the movable body 3 and including the markers 5 and the auxiliary markers 6 that are disposed in a space, in-image marker positions 51 indicating the positions of the markers 5 in the image coordinate system 71 and in-image auxiliary marker positions 61 indicating the positions of the auxiliary markers 6 in the image coordinate system 71 by way of image processing; performing primary positioning by acquiring the in-WC-system imaging primary position indicating the three-dimensional position of the imaging unit 2 in the world coordinate system 70, based on the in-image marker positions 51 and the in-WC-system marker known positions 52 indicating the known three-dimensional positions of the markers 5 in the world coordinate system 70; and determining positioning by associating the in-image auxiliary marker positions 61 with the in-WC-system auxiliary marker known positions 62, based on the in-WC-system imaging primary position and the in-WC-system auxiliary marker known positions 62 indicating the three-dimensional positions of the auxiliary markers 6 in the world coordinate system 70, and definitively determining the position of the imaging unit 2 based on the in-image auxiliary marker positions 61 and the in-WC-system auxiliary marker known positions 62.
[0117] The provided method functions with a small number of markers 5 and derives positioning results with high positioning accuracy.
[0118] Next, embodiments different from the first embodiment will be described. In the following description, components that are the same or similar to those described in the first embodiment are denoted by the same reference numerals, and description thereof may be omitted.
Second Embodiment
[0119] In the first embodiment, an example in which the position of the circular auxiliary marker 6 is set based on the center of the region of the auxiliary marker 6 has been described. In the second embodiment described below, the position of the auxiliary marker 6 is specified by a different method.
[0122] Processing for Extracting In-WC-System Auxiliary Marker Known Positions
[0123] Upon a start of creation of a table of the in-image auxiliary marker positions 61, the image processing unit 101 first performs auxiliary marker recognition processing (Step S201). As the auxiliary marker recognition processing, for example, the image processing unit 101 extracts regions with saturated luminance from a captured image 7, and assigns primary identification numbers to the extracted regions.
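The extraction of saturated-luminance regions can be sketched as a threshold-and-label pass over the captured image. The flood-fill labelling and the min_area parameter below are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def extract_saturated_regions(gray, sat=255, min_area=4):
    """Label 4-connected regions of saturated pixels (candidate auxiliary
    markers). Returns {primary_id: [(y, x), ...]} for sufficiently large regions."""
    h, w = gray.shape
    labels = np.zeros((h, w), int)
    regions, next_id = {}, 1
    for y in range(h):
        for x in range(w):
            if gray[y, x] >= sat and labels[y, x] == 0:
                stack, px = [(y, x)], []
                labels[y, x] = next_id
                while stack:                      # flood fill one region
                    cy, cx = stack.pop()
                    px.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and gray[ny, nx] >= sat and labels[ny, nx] == 0:
                            labels[ny, nx] = next_id
                            stack.append((ny, nx))
                if len(px) >= min_area:           # drop speckle-sized regions
                    regions[next_id] = px
                next_id += 1
    return regions
```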
[0124] Further, the image processing unit 101 performs auxiliary marker sorting processing (Step S202). As the auxiliary marker sorting processing, for example, the image processing unit 101 performs sorting based on the shapes of the extracted regions. The image processing unit 101 retains, for example, elliptic shapes and quadrangular shapes, and excludes shapes that overlap the edge of the image.
[0125] Subsequently, the image processing unit 101 performs table creation processing (Step S203). As the table creation processing, for example, the image processing unit 101 registers the extracted candidate regions in a primary table.
[0126] The auxiliary marker identification unit 106 performs selection processing (Step S204). As the selection processing, for example, the auxiliary marker identification unit 106 selects one of the regions.
[0127] The auxiliary marker identification unit 106 performs size determination processing. As the size determination processing, for example, the auxiliary marker identification unit 106 determines whether or not the size of the auxiliary marker 6 in the captured image 7 is equal to or larger than a predetermined size. The predetermined size is, for example, a size of 7×7 in terms of pixels of the captured image 7.
[0128] The size determination processing will be described with reference to
[0129] When the extracted region has a size equal to or larger than the predetermined size (Step S205: YES), the auxiliary marker identification unit 106 performs vertex registration processing (Step S206). As the vertex registration processing, for example, the auxiliary marker identification unit 106 registers the four vertexes of the auxiliary marker 6 having a quadrangular shape, as the in-image outline positions 64. As shown in
[0130] Subsequently, the auxiliary marker identification unit 106 performs center registration processing (Step S207). As the center registration processing, for example, the auxiliary marker identification unit 106 registers the center of the extracted region as the in-image auxiliary marker position 61, regardless of the size of the extracted region.
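The size determination, vertex registration, and center registration steps (Steps S205 to S207) can be sketched as follows. Using bounding-box corners as the four vertexes is a deliberate simplification of actual quadrangle vertex detection, and the 7-pixel threshold merely follows the example in [0127]; both are illustrative assumptions.

```python
def register_marker(region_pixels, min_size=7):
    """Return (vertexes, centre) for an extracted region given as (y, x) pixels.

    The centre is always registered (Step S207). Vertexes are registered only
    when the region is at least min_size x min_size pixels (Steps S205-S206);
    bounding-box corners stand in for the four quadrangle vertexes here.
    """
    ys = [y for y, _ in region_pixels]
    xs = [x for _, x in region_pixels]
    centre = (sum(xs) / len(xs), sum(ys) / len(ys))
    if (max(ys) - min(ys) + 1 >= min_size) and (max(xs) - min(xs) + 1 >= min_size):
        vertexes = [(min(xs), min(ys)), (max(xs), min(ys)),
                    (max(xs), max(ys)), (min(xs), max(ys))]
        return vertexes, centre
    return None, centre
```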
[0131] The processing unit 100 performs process end determination processing (Step S208).
[0132] When the process is not to be ended (Step S208: NO), the processing unit 100 returns the positioning process to Step S204. When the process is to be ended (Step S208: YES), the processing unit 100 ends the creation of the table of the in-image auxiliary marker positions 61.
[0134] In the present embodiment, lighting devices or light sources installed in the environment are used as the auxiliary markers 6, which can complement the markers 5 in a case where only a small number of markers 5 are installed. When an indoor lighting device serving as an auxiliary marker 6 is captured in an image under normal exposure, its luminance appears as a saturated region in the image. Therefore, the first extraction of candidates for the auxiliary marker regions requires only simple binarizing processing using the saturation value, which yields highly stable image signals. In addition, lighting devices are generally disposed in a space with good visibility, and many of them have a simple shape such as a round shape or a square shape, so they can be suitably used as the auxiliary markers 6 of the present embodiment. The auxiliary marker 6 is not limited to such a lighting device, and may be any other device or object as long as it can be detected stably by a simple method in the scene of use. For example, a high-chroma object disposed in a low-chroma environment with good visibility is conceivable, because binarizing processing can then be performed with a specific high-chroma threshold value.
[0135] As described above, in the second embodiment, the auxiliary marker 6 having a polygonal shape is used, and an increased number of the in-WC-system outline known positions 65 are obtained as a form of the in-WC-system auxiliary marker known position 62. This feature makes it possible to provide a plurality of in-WC-system auxiliary marker known positions 62 by means of one auxiliary marker or lighting device.
[0136] The processing unit 100 included in the information processing apparatus 10 acquires the in-image auxiliary marker positions 61 of the auxiliary marker 6 based on the positions of the vertexes of the polygonal shape of the auxiliary marker 6 in the captured image 7.
[0137] This feature makes it possible to increase the number of the in-WC-system auxiliary marker known positions 62 without increasing the number of auxiliary markers 6, thereby improving the accuracy of the PnP positioning processing.
[0138] In a case where the auxiliary marker 6 in the captured image 7 is of a size smaller than the predetermined size, the processing unit 100 included in the information processing apparatus 10 acquires the center position of the auxiliary marker 6 and acquires the in-image auxiliary marker position 61 based on the center position.
[0139] Consequently, the in-WC-system auxiliary marker known position 62 for use in the PnP positioning processing is optimized, thereby achieving higher positioning accuracy.
(Modifications of Marker)
[0140] In the positioning process described above, one primary positioning method uses the markers 5, which are adapted to be identified. The primary positioning may instead be performed by other means. Examples of such means are briefly described below.
[0141] In a case where the positioning determination unit 107 has just completed the PnP positioning processing, the current position can be estimated from the most recent position and the latest motion vector, and the estimated position can be used as the in-WC-system imaging primary position. Nevertheless, this process is not available in the initial state.
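The dead-reckoning estimate described above can be sketched as a linear extrapolation from the two most recent position fixes, under a constant-velocity assumption; this is an illustration of the idea, not the claimed implementation.

```python
def extrapolate_primary_position(pos_prev, pos_curr):
    """Predict the next primary position as the latest fix plus the most
    recent displacement vector (constant-velocity assumption)."""
    return tuple(2 * c - p for p, c in zip(pos_prev, pos_curr))
```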
[0142] An image of a two-dimensional code is captured by the imaging unit 2, and primary setting of the position of the imaging unit 2 can be performed based on the code and the imaging angle of the imaging unit 2. For the positioning performed by recognizing such a two-dimensional code or the like, it is preferable to display the two-dimensional code in a relatively large area on an imaging screen.
[0143] For example, an infrared sensor may be provided as a specific point, and primary setting of the position of the imaging unit 2 can be performed based on detection of passage through the specific point. In a case where the imaging unit 2 is mounted to the movable body 3, and it is guaranteed that the starting point of the movable body 3 is the specific point, primary setting of the position of the imaging unit 2 can be performed based on the position of the specific point. The information regarding the specific point may be stored as imaging unit installation parameters 24 in the database 19 illustrated in
[0144] As described above, the present invention is not limited to a marker that performs optical communication by changing the light emission modes as in the case of the markers 5 of the above embodiment, and other types of markers 5 can be used as described in the modifications.
[0145] Next, an example of processing performable after the in-WC-system auxiliary marker known positions 62 are identified in the captured image 7 will be described.
[0147] For the sake of convenience,
[0148] In a similar manner as described in the first embodiment, the in-WC-system imaging primary position of the imaging unit 2 is calculated based on the in-image marker positions 51 and the in-WC-system marker known positions 52. Then, the calculative auxiliary marker positions 63 are calculated on the assumption that the auxiliary markers 6 located at the in-WC-system auxiliary marker known positions 62 are captured in an image from the in-WC-system imaging primary position. In other words, projection calculation processing is performed. Thereafter, for example, by associating the neighboring points, the auxiliary markers 6 from which the in-image auxiliary marker positions 61 originate are linked to the in-WC-system auxiliary marker known positions 62 on which the calculation is based. Thus, the auxiliary markers 6 shown in the captured image 7 are recognized as identified auxiliary markers 6 whose positions are known. Here, the auxiliary markers 6 serve as identifiable indicators, similarly to the markers 5. To describe this situation, the auxiliary markers 6 indicated by hollow circles in
[0149] When the movable body 3 continuously moves from this state, the images 7 captured by the imaging unit 2 change as illustrated in
[0150] The process described above is repeated.
[0151] Thus, the above-described configuration, in which the positions of the auxiliary markers 6 are continuously measured and identified subsequent to the identification of the markers 5 disposed only in the vicinity of the initial position of movement, makes it practical, even in a large area, to install the markers 5 only near the initial position. That is, after the processing based on the markers 5, even if the markers 5 are no longer included in the captured images 7, the auxiliary markers 6 included in the captured images 7 allow the in-WC-system imaging unit determined position of the imaging unit 2 to be derived.
[0152] In the foregoing, the positioning process based on the markers 5 has been described. It should be noted that the embodiments and modifications described above are not intended to limit the present invention, and the present invention encompasses improvements and the like within the range where the object of the present invention can be achieved.
[0153] Further, in the above embodiments, the information processing apparatus 10 to which the present invention is applied has been described by referring to the forklift as an example of the movable body 3, but the present invention is not limited thereto. For example, the present invention can be applied to general electronic apparatuses having an image processing function. Specifically, the present invention can be applied to, for example, a notebook personal computer, a portable navigation device, a mobile phone, a smartphone, and a portable game console.
[0154] The processing sequence described above can be executed by hardware, and can also be executed by software. In other words, the functional configuration of
[0155] In addition, a single functional block may be configured by a single piece of hardware, a single installation of software, or a combination thereof. The functional configurations of the present embodiment are realized by a processor executing arithmetic processing, and processors that can be used in the present embodiment include a unit configured by any of a variety of processing devices, such as a single processor, a multi-processor, or a multi-core processor, as well as a unit in which such processing devices are combined with a processing circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
[0156] In a case where the series of processing is executed by software, the program constituting the software is installed from a network or a recording medium onto a computer or the like. The computer may be a computer equipped with dedicated hardware, or may be a computer capable of executing various functions by installing various programs, e.g., a general-purpose personal computer.
[0157] The storage medium containing such a program can not only be distributed separately from the apparatus main body to supply the program to a user, but can also be supplied to the user in a state incorporated in the apparatus main body in advance. A removable medium is composed of, for example, a magnetic disk (including a floppy disk), an optical disk, a magneto-optical disk, or the like. The optical disk is composed of, for example, a CD-ROM (Compact Disk-Read Only Memory), a DVD (Digital Versatile Disk), a Blu-ray (Registered Trademark) disc, or the like. The magneto-optical disk is composed of an MD (Mini-Disk) or the like. The storage medium supplied to the user in a state incorporated in the apparatus main body in advance is constituted by, for example, the ROM 12 in which the program is recorded, or a hard disk (not shown).
[0158] It should be noted that, in the present specification, the steps describing the program recorded in the storage medium include not only processing executed in time series following the described order, but also processing executed in parallel or individually, which is not necessarily executed in time series. Further, in the present specification, the term "system" means an entire apparatus composed of a plurality of apparatuses, a plurality of units, and the like.
[0159] The embodiments of the present invention described above are only illustrative, and are not to limit the technical scope of the present invention. The present invention can assume various other embodiments. Additionally, it is possible to make various modifications thereto such as omissions or replacements within a scope not departing from the spirit of the present invention. These embodiments or modifications thereof are within the scope and the spirit of the invention described in the present specification, and within the scope of the invention recited in the claims and equivalents thereof.