Method and device to determine the camera position and angle
11715232 · 2023-08-01
Assignee
Inventors
CPC classification
G06V20/588
PHYSICS
International classification
Abstract
The present disclosure provides a method and an apparatus for determining an attitude angle of a camera, capable of improving the accuracy of the attitude angle of the camera, and in turn the accuracy of the attitude of the camera that is obtained based on the attitude angle of the camera. The present disclosure can also improve the accuracy of object distance measurement and vehicle positioning based on the attitude angle of the camera. In the method for determining an attitude angle of a camera, the camera is fixed to one and the same rigid object in a vehicle along with an Inertial Measurement Unit (IMU). The method includes: obtaining IMU attitude angles outputted from the IMU and images captured by the camera; determining a target IMU attitude angle corresponding to each frame of image based on respective capturing time of the frames of images and respective outputting time of the IMU attitude angles; and determining an attitude angle of the camera corresponding to each frame of image based on a predetermined conversion relationship between a camera coordinate system for the camera and an IMU coordinate system for the IMU and the target IMU attitude angle corresponding to each frame of image.
Claims
1. A method of positioning a vehicle, comprising: determining, based on a plurality of inertial measurement unit (IMU) attitude angles, a target IMU attitude angle corresponding to each image of a plurality of images; and determining, based on the target IMU attitude angle, a position of the vehicle, wherein each of the plurality of IMU attitude angles has a corresponding output time and each of the plurality of images has a corresponding capture time, and wherein determining the target IMU attitude angle, for each of the plurality of images, comprises: selecting, from the plurality of IMU attitude angles, at least one IMU attitude angle whose output time matches the capture time of a current image, and determining, based on the at least one IMU attitude angle, the target IMU attitude angle corresponding to the current image.
2. The method of claim 1, further comprising: obtaining, from an inertial measurement unit (IMU), the plurality of IMU attitude angles; and obtaining, from a camera, the plurality of images.
3. The method of claim 2, wherein the camera and the IMU are fixed to a common rigid object.
4. The method of claim 2, further comprising: determining an attitude angle of the camera corresponding to each image of the plurality of images based on a predetermined conversion relationship between a camera coordinate system of the camera and an IMU coordinate system of the IMU and the target IMU attitude angle corresponding to each image.
5. The method of claim 4, wherein determining the position of the vehicle comprises: adjusting an external parameter between the camera coordinate system and a vehicle coordinate system; and determining an attitude angle of the vehicle in a world coordinate system based on a predetermined conversion relationship between the camera coordinate system and the vehicle coordinate system, a predetermined conversion relationship between the vehicle coordinate system and the world coordinate system, and the attitude angle of the camera.
6. The method of claim 5, wherein the external parameter is a matrix representation of the attitude angle of the camera or a translation value for an X-axis, a Y-axis and a Z-axis between the camera coordinate system and the world coordinate system.
7. The method of claim 1, wherein the selecting the at least one IMU attitude angle comprises: generating a sequence of IMU attitude angles by sorting the plurality of IMU attitude angles in an order based on the corresponding output time of each of the plurality of IMU attitude angles; moving a sliding window over the sequence of IMU attitude angles such that the sliding window is centered at the capture time of the current image; and determining at least one of the sequence of IMU attitude angles within the sliding window as the at least one IMU attitude angle whose output time matches the capture time of the current image.
8. The method of claim 7, wherein a length of the sliding window is T seconds, wherein the capture time of the current image is t1, and wherein the sliding window spans a time range determined as [t1−T/2, t1+T/2].
9. A system for positioning a vehicle, comprising: a camera coupled to a rigid object on the vehicle; an inertial measurement unit (IMU) coupled to the rigid object; and a processor, coupled to the camera and the IMU, configured to: determine, based on a plurality of inertial measurement unit (IMU) attitude angles, a target IMU attitude angle corresponding to each image of a plurality of images, and determine, based on the target IMU attitude angle, a position of the vehicle, wherein each of the plurality of IMU attitude angles has a corresponding output time and each of the plurality of images has a corresponding capture time, and wherein the processor is further configured, as part of determining the target IMU attitude angle, for each of the plurality of images, to: select, from the plurality of IMU attitude angles, at least one IMU attitude angle whose output time matches the capture time of a current image, and determine, based on the at least one IMU attitude angle, the target IMU attitude angle corresponding to the current image.
10. The system of claim 9, wherein the processor is further configured, as part of determining the target IMU attitude angle corresponding to the current image, to: determine a movement curve of the camera or the rigid object; determine an IMU attitude angle curve by fitting the movement curve based on the at least one IMU attitude angle selected; and determine, from the IMU attitude angle curve, the target IMU attitude angle.
11. The system of claim 10, wherein the movement curve is a straight line.
12. The system of claim 10, wherein the movement curve is a parabola.
13. The system of claim 9, wherein the processor is further configured, as part of determining the target IMU attitude angle corresponding to the current image, to: calculate an average of the at least one IMU attitude angle selected to determine the target IMU attitude angle.
14. The system of claim 13, wherein the average is an arithmetic average.
15. The system of claim 13, wherein the average is a geometric average.
16. A non-transitory computer-readable storage medium having instructions stored thereupon for positioning a vehicle, comprising: instructions for obtaining a plurality of inertial measurement unit (IMU) attitude angles and a plurality of images, wherein each of the plurality of IMU attitude angles has a corresponding output time and each of the plurality of images has a corresponding capture time; instructions for determining, based on the plurality of IMU attitude angles, a target IMU attitude angle corresponding to each image of the plurality of images; and instructions for determining, based on the target IMU attitude angle, a position of the vehicle, wherein the instructions for determining the target IMU attitude angle, for each of the plurality of images, comprises: instructions for selecting, from the plurality of IMU attitude angles, at least one IMU attitude angle whose output time matches the capture time of a current image, and instructions for determining, based on the at least one IMU attitude angle, the target IMU attitude angle corresponding to the current image.
17. The non-transitory computer-readable storage medium of claim 16, further comprising: instructions for obtaining, from an inertial measurement unit (IMU), the plurality of IMU attitude angles; and instructions for obtaining, from a camera, the plurality of images.
18. The non-transitory computer-readable storage medium of claim 17, further comprising: instructions for determining an attitude angle of the camera corresponding to each image of the plurality of images based on a predetermined conversion relationship between a camera coordinate system of the camera and an IMU coordinate system of the IMU and the target IMU attitude angle corresponding to each image.
19. The non-transitory computer-readable storage medium of claim 18, wherein the instructions for determining the position of the vehicle comprise: instructions for adjusting an external parameter between the camera coordinate system and a vehicle coordinate system; and instructions for determining an attitude angle of the vehicle in a world coordinate system based on a predetermined conversion relationship between the camera coordinate system and the vehicle coordinate system, a predetermined conversion relationship between the vehicle coordinate system and the world coordinate system, and the attitude angle of the camera.
20. The non-transitory computer-readable storage medium of claim 19, wherein the position of the vehicle is based on the attitude angle of the vehicle in the world coordinate system and positioning information obtained from a positioning sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The figures are provided for facilitating further understanding of the present disclosure. The figures constitute a portion of the description and can be used in combination with the embodiments of the present disclosure to interpret, rather than limit, the present disclosure. In the figures:
DETAILED DESCRIPTION OF THE EMBODIMENTS
(19) In the following, the solutions according to the embodiments of the present disclosure will be described clearly and completely with reference to the figures, such that the solutions can be better understood by those skilled in the art. Obviously, the embodiments described below are only some, rather than all, of the embodiments of the present disclosure. All other embodiments that can be obtained by those skilled in the art based on the embodiments described in the present disclosure without any inventive efforts are to be encompassed by the scope of the present disclosure.
(20) In the embodiments of the present disclosure, the conversion relationship between the camera coordinate system and the vehicle coordinate system, the conversion relationship between the camera coordinate system and the world coordinate system, the conversion relationship between the image coordinate system and the imaging plane coordinate system of the camera and the conversion relationship between the imaging plane coordinate system of the camera and the camera coordinate system are all known in the prior art. The conversions between the coordinate systems are not an inventive concept of the present disclosure.
(21) The method and apparatus for determining an attitude angle of a camera according to the embodiments of the present disclosure may be applied to a vehicle side or an aerial vehicle (e.g., unmanned aerial vehicle), an unmanned ship, a robot or the like. The present disclosure is not limited to any specific application scenarios.
Embodiment 1
(22) Referring to
(23) At step 101, IMU attitude angles outputted from the IMU and images captured by the camera are obtained.
(24) In the step 101, the obtaining of the IMU attitude angles outputted from the IMU and the images captured by the camera may be started at one and the same time point. For example, starting from a particular time point, n IMU attitude angles outputted from the IMU and m images captured by the camera may be obtained within a time period T. The images (or IMU attitude angles) may be obtained from the camera (or IMU) by transmitting a data obtaining request to the camera (or IMU). Alternatively, the camera (or IMU) may actively transmit the images (or IMU attitude angles) in accordance with a capturing period (or outputting period). The present disclosure is not limited to any specific scheme for obtaining the images and IMU attitude angles.
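For illustration only, the buffering described in the step 101 may be sketched in Python as follows. The names `ImuSample`, `Frame` and `collect`, and the idea of filtering two timestamped streams to a common time period, are illustrative assumptions and not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    t: float       # outputting time, in seconds
    angles: tuple  # (W_ix, W_iy, W_iz) deflection angles

@dataclass
class Frame:
    t: float       # capturing time, in seconds
    image_id: int  # stand-in for the actual pixel data

def collect(imu_stream, camera_stream, t0, T):
    """Starting from time point t0, keep the n IMU attitude angles and
    m images produced within the time period [t0, t0 + T]."""
    imu = [s for s in imu_stream if t0 <= s.t <= t0 + T]
    frames = [f for f in camera_stream if t0 <= f.t <= t0 + T]
    return imu, frames
```

For example, with an IMU output period of 10 ms and a camera capture period of 100 ms, one second of data yields many more IMU samples than frames, which is why a per-frame matching step (the step 102) is needed.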
(25) At step 102, a target IMU attitude angle corresponding to each frame of image is determined based on respective capturing time of the frames of images and respective outputting time of the IMU attitude angles.
(26) At step 103, an attitude angle of the camera corresponding to each frame of image is determined based on a predetermined conversion relationship between a camera coordinate system for the camera and an IMU coordinate system for the IMU and the target IMU attitude angle corresponding to each frame of image.
(27) In an embodiment of the present disclosure, when the camera and the IMU are fixed to one and the same rigid object, there will be a fixed spatial position relationship between the camera and the IMU. As the camera and the IMU are fixed to one and the same rigid object, they may move in accordance with one and the same rigid movement model. Thus, when the vehicle is bumping, swaying, suddenly braking or sharply turning, the spatial position relationship between the camera and the IMU may be maintained unchanged.
(28) In an embodiment of the present disclosure, assuming that the camera coordinate system for the camera is represented as X.sub.cY.sub.cZ.sub.c and the IMU coordinate system for the IMU is represented as X.sub.iY.sub.iZ.sub.i, each of the IMU attitude angles outputted from the IMU and the target IMU attitude angle contains deflection angles on X-axis, Y-axis and Z-axis in X.sub.iY.sub.iZ.sub.i, denoted as W.sub.ix, W.sub.iy and W.sub.iz, respectively. In an implementation of the step 103, W.sub.ix, W.sub.iy and W.sub.iz may be converted into deflection angles on X-axis, Y-axis and Z-axis in X.sub.cY.sub.cZ.sub.c, denoted as W.sub.cx, W.sub.cy and W.sub.cz, respectively, based on the predetermined conversion relationship between the camera coordinate system for the camera and the IMU coordinate system for the IMU. Here, W.sub.cx, W.sub.cy and W.sub.cz are the attitude angle of the camera.
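The conversion from (W.sub.ix, W.sub.iy, W.sub.iz) to (W.sub.cx, W.sub.cy, W.sub.cz) may be sketched as follows. The Z-Y-X rotation order and the convention that the fixed camera-to-IMU rotation `R_ic` encodes the predetermined conversion relationship are assumptions for illustration; the disclosure does not fix a particular parameterization:

```python
import math
import numpy as np

def euler_to_matrix(wx, wy, wz):
    """Rotation matrix for deflection angles (radians) about X, Y and Z,
    composed as Rz @ Ry @ Rx (one common convention)."""
    cx, sx = math.cos(wx), math.sin(wx)
    cy, sy = math.cos(wy), math.sin(wy)
    cz, sz = math.cos(wz), math.sin(wz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def matrix_to_euler(R):
    """Inverse of euler_to_matrix, valid for |wy| < pi/2."""
    wy = math.asin(-R[2, 0])
    wx = math.atan2(R[2, 1], R[2, 2])
    wz = math.atan2(R[1, 0], R[0, 0])
    return wx, wy, wz

def imu_to_camera_angles(W_i, R_ic):
    """Convert the target IMU attitude angle W_i = (W_ix, W_iy, W_iz) into
    the camera attitude angle (W_cx, W_cy, W_cz) using the fixed
    camera-to-IMU rotation R_ic."""
    R_wi = euler_to_matrix(*W_i)  # IMU frame -> world
    R_wc = R_wi @ R_ic            # camera frame -> world
    return matrix_to_euler(R_wc)
```

When the camera and IMU axes happen to coincide (`R_ic` is the identity), the camera attitude angle equals the IMU attitude angle, as expected.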
(29) Preferably, in an embodiment of the present disclosure, in a non-limiting implementation of the above step 102, the following steps A˜B may be performed for each frame of image.
(30) At step A, at least one IMU attitude angle whose outputting time matches the capturing time of the current frame of image is selected from the IMU attitude angles.
(31) At step B, the target IMU attitude angle corresponding to the current frame of image is determined based on the selected at least one IMU attitude angle.
(32) In particular, as shown in
(33) At step 201, at least one IMU attitude angle whose outputting time matches the capturing time of the current frame of image is selected from the IMU attitude angles.
(34) At step 202, the target IMU attitude angle corresponding to the current frame of image is determined based on the selected at least one IMU attitude angle.
(35) At step 203, it is determined whether the current frame of image is the last frame of image in the set. If so, the process ends; otherwise the process proceeds with step 204.
(36) At step 204, the next frame of image is used as the current frame of image and the step 201 is performed for the current frame of image.
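The loop of the steps 201 to 204 may be sketched as follows. The callables `select`, `fuse` and `convert` stand for the scheme-specific operations described below (Schemes A1 to A3 and B1 to B3) and the step 103 conversion; they are placeholder names, not names from the disclosure:

```python
def camera_angles_per_frame(frames, imu_samples, select, fuse, convert):
    """frames: list of capturing times; imu_samples: list of
    (outputting_time, angle) pairs. For each frame: select the matching IMU
    attitude angles (step 201), fuse them into the target IMU attitude angle
    (step 202), convert it to a camera attitude angle, and proceed to the
    next frame until the last one (steps 203-204)."""
    result = []
    for t1 in frames:
        matched = select(imu_samples, t1)
        target = fuse(matched, t1)
        result.append(convert(target))
    return result
```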
(37) Preferably, in an embodiment of the present disclosure, the step A may be, but not limited to be, implemented in any of the following schemes (Schemes A1˜A3).
(38) Scheme A1: The IMU attitude angles are sorted in an order according to their respective outputting time, to obtain a sequence of IMU attitude angles. A predetermined sliding window is moved over the sequence of IMU attitude angles such that the sliding window is centered at the capturing time of the current frame of image, and at least one IMU attitude angle within the sliding window is determined as the at least one IMU attitude angle whose outputting time matches the capturing time of the current frame of image.
(39) It is assumed that the sliding window has a length of T seconds and the capturing time of the current frame of image is t.sub.1. In this case, the sliding window spans a time range of [t.sub.1−T/2, t.sub.1+T/2].
(40) Scheme A2: The IMU attitude angles are sorted in an order according to their respective outputting time, to obtain a sequence of IMU attitude angles. The frames of images are sorted in an order according to their respective capturing time, to obtain a sequence of images, as shown in
(41) Scheme A3: An absolute difference between the outputting time of each of the IMU attitude angles and the capturing time of the current frame of image is calculated. One or more IMU attitude angles having the smallest absolute differences may be determined as the at least one IMU attitude angle whose outputting time matches the capturing time of the current frame of image.
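Schemes A1 and A3 may be illustrated with the following sketch (Scheme A2 differs mainly in that the images are also pre-sorted by capturing time). The `(outputting_time, angle)` pair representation is an assumption for illustration:

```python
def select_by_window(imu, t1, T):
    """Scheme A1: sort the samples by outputting time and keep those that
    fall inside the sliding window [t1 - T/2, t1 + T/2] centered at the
    capturing time t1 of the current frame."""
    seq = sorted(imu, key=lambda s: s[0])
    return [s for s in seq if t1 - T / 2 <= s[0] <= t1 + T / 2]

def select_nearest(imu, t1, k=1):
    """Scheme A3: keep the k samples with the smallest absolute difference
    |outputting_time - t1|."""
    return sorted(imu, key=lambda s: abs(s[0] - t1))[:k]
```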
(42) Preferably, in an embodiment of the present disclosure, the above step B may be, but not limited to be, implemented in any of the following schemes (Schemes B1˜B3).
(43) Scheme B1: A movement curve of the camera or the rigid object is obtained (in an embodiment of the present disclosure, as the camera and the IMU are fixed to the rigid object, it may be considered that the camera, the rigid object and the IMU have the same movement curve). An IMU attitude angle curve is obtained by fitting the movement curve based on the selected at least one IMU attitude angle. An IMU attitude angle corresponding to the capturing time of the current frame of image is obtained from the IMU attitude angle curve, as the target IMU attitude angle corresponding to the current frame of image.
(44) In Scheme B1, the movement curve of the camera, rigid object or IMU may be predetermined or may be calculated based on a movement trajectory of the camera, rigid object or IMU in accordance with an algorithm. The movement curve of the camera may be a straight line, as shown in
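Scheme B1 amounts to fitting the selected (outputting time, angle) samples with a low-degree curve and reading the angle at the capturing time off the fitted curve. A sketch, under the assumption that a polynomial fit is acceptable (degree 1 for a straight line, degree 2 for a parabola):

```python
import numpy as np

def fit_target_angle(samples, t1, degree):
    """Fit the selected (t, angle) samples with a polynomial movement curve
    and evaluate it at the capturing time t1 to obtain the target IMU
    attitude angle for the current frame."""
    t = np.array([s[0] for s in samples])
    w = np.array([s[1] for s in samples])
    coeffs = np.polyfit(t, w, degree)
    return float(np.polyval(coeffs, t1))
```

Because the curve interpolates between IMU output times, this scheme can produce a target angle even when no IMU sample was output exactly at the capturing time.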
(45) Scheme B2: An average (e.g., arithmetic average or geometric average, as non-limiting examples) of the selected at least one IMU attitude angle is calculated, as the target IMU attitude angle corresponding to the current frame of image.
(46) Scheme B3: An absolute difference between the outputting time of each of the selected at least one IMU attitude angle and the capturing time of the current frame of image is calculated. The IMU attitude angle having the smallest absolute difference is determined as the target IMU attitude angle corresponding to the current frame of image.
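Schemes B2 and B3 may be sketched as follows, again assuming the `(outputting_time, angle)` pair representation:

```python
import math

def fuse_by_average(samples, geometric=False):
    """Scheme B2: arithmetic (default) or geometric average of the selected
    angles; the geometric average is only meaningful for positive values."""
    vals = [a for _, a in samples]
    if geometric:
        return math.prod(vals) ** (1.0 / len(vals))
    return sum(vals) / len(vals)

def fuse_by_nearest(samples, t1):
    """Scheme B3: the angle whose outputting time has the smallest absolute
    difference from the capturing time t1."""
    return min(samples, key=lambda s: abs(s[0] - t1))[1]
```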
Embodiment 2
(47) Based on the same concept as the method for determining an attitude angle of a camera according to the above Embodiment 1, according to Embodiment 2, an apparatus for determining an attitude angle of a camera is provided. The apparatus has a structure shown in
(48) An obtaining unit 51 is configured to obtain IMU attitude angles outputted from an IMU and images captured by a camera. The camera and the IMU are fixed to one and the same rigid object in a vehicle.
(49) A calculating unit 52 is configured to determine a target IMU attitude angle corresponding to each frame of image based on respective capturing time of the frames of images and respective outputting time of the IMU attitude angles.
(50) A determining unit 53 is configured to determine an attitude angle of the camera corresponding to each frame of image based on a predetermined conversion relationship between a camera coordinate system for the camera and an IMU coordinate system for the IMU and the target IMU attitude angle corresponding to each frame of image.
(51) Preferably, the calculating unit 52 is configured to perform the following steps for each frame of image.
(52) At step A, at least one IMU attitude angle whose outputting time matches the capturing time of the current frame of image is selected from the IMU attitude angles.
(53) At step B, the target IMU attitude angle corresponding to the current frame of image is determined based on the selected at least one IMU attitude angle.
(54) The step A may be, but not limited to be, implemented in any of Schemes A1˜A3 as described above in connection with Embodiment 1 and details thereof will be omitted here.
(55) The step B may be, but not limited to be, implemented in any of Schemes B1˜B3 as described above in connection with Embodiment 1 and details thereof will be omitted here.
(56) In Embodiment 2 of the present disclosure, assuming that the camera coordinate system for the camera is represented as X.sub.cY.sub.cZ.sub.c and the IMU coordinate system for the IMU is represented as X.sub.iY.sub.iZ.sub.i, each of the IMU attitude angles outputted from the IMU and the target IMU attitude angle contains deflection angles on X-axis, Y-axis and Z-axis in X.sub.iY.sub.iZ.sub.i, denoted as W.sub.ix, W.sub.iy and W.sub.iz, respectively. In a specific implementation, the determining unit 53 may be configured to convert W.sub.ix, W.sub.iy and W.sub.iz into deflection angles on X-axis, Y-axis and Z-axis in X.sub.cY.sub.cZ.sub.c, denoted as W.sub.cx, W.sub.cy and W.sub.cz, respectively, based on the predetermined conversion relationship between the camera coordinate system for the camera and the IMU coordinate system for the IMU. Here, W.sub.cx, W.sub.cy and W.sub.cz are the attitude angle of the camera.
(57) With the method for determining an attitude angle of a camera according to the above Embodiment 1 and the apparatus for determining an attitude angle of a camera according to Embodiment 2, the accuracy of the attitude angle of the camera may be improved. Accordingly, the accuracy of a pose of the camera determined based on the attitude angle of the camera may be improved. Moreover, results of other processes based on the attitude angle of the camera, e.g., object distance measurement and vehicle positioning, may be more accurate. In the following, some embodiments will be described in detail, in which the attitude angle of the camera is used in application scenarios such as object distance measurement and vehicle positioning.
Embodiment 3
(58) According to Embodiment 3 of the present disclosure, a method for measuring a distance to an object is provided. The method includes the following steps for each frame of image captured by a camera, as shown in the flowchart of
(59) At step 601, an attitude angle of the camera corresponding to the current frame of image is obtained by using the method for determining the attitude angle of the camera as shown in
(60) At step 602, at least one external parameter between a camera coordinate system for the camera and a world coordinate system is adjusted based on the attitude angle of the camera.
(61) At step 603, respective image coordinates of position points on a lane line on a map in the current frame of image and respective distance information associated with the position points (the respective distance information indicating distances from the position points to the camera) are determined based on a predetermined conversion relationship between an image coordinate system and the world coordinate system and the at least one external parameter between the camera coordinate system and the world coordinate system.
(62) At step 604, the position point on the lane line on the map in the current frame of image that is closest to a target object is determined, and a distance to the target object is determined based on the distance information associated with the position point.
(63) For details of the above step 601, reference can be made to Embodiment 1 and description thereof will be omitted here.
(64) In an embodiment of the present disclosure, the at least one external parameter between the camera coordinate system for the camera and the world coordinate system may include R and t, where R is a matrix representation of the attitude angle of the camera, and t is a translation value for X-axis, Y-axis and Z-axis between the camera coordinate system and the world coordinate system and is constant. Before the vehicle moves, R has an initial value of 0. While the vehicle is moving, the value of R will vary as the attitude angle of the camera varies. Hence, R may be obtained by converting the attitude angle of the camera corresponding to each frame of image. In a specific implementation of the above step 602, R of the external parameter between the camera coordinate system for the camera and the world coordinate system is adjusted based on the attitude angle of the camera.
(65) In an embodiment of the present disclosure, a relationship between the image coordinate system of the camera, denoted as uv, and the imaging plane coordinate system of the camera, denoted as xy, is shown in
(66) [u, v, 1].sup.T=K[x, y, 1].sup.T
(67) where K is an internal parameter of the camera, which is determined at manufacture.
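For illustration, the mapping from imaging-plane coordinates xy to image coordinates uv via the internal parameter K may be sketched as follows. The particular K below (scale factor 800 and principal point (320, 240)) is a made-up example value, not a parameter from the disclosure:

```python
import numpy as np

def plane_to_pixel(x, y, K):
    """Map imaging-plane coordinates (x, y) to image coordinates (u, v)
    using the camera's internal parameter K in homogeneous form."""
    u, v, w = K @ np.array([x, y, 1.0])
    return u / w, v / w

# Illustrative internal parameter: scale factors on the diagonal and the
# principal point (u0, v0) in the last column.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
```

The point at the origin of the imaging plane maps to the principal point, consistent with K being fixed at manufacture.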
(68) In an embodiment of the present disclosure, the relationship among the imaging plane coordinate system of the camera (xy), the camera coordinate system (X.sub.cY.sub.cZ.sub.c) and the world coordinate system (X.sub.wY.sub.wZ.sub.w) is shown in
(69) [X.sub.c, Y.sub.c, Z.sub.c].sup.T=R[X.sub.w, Y.sub.w, Z.sub.w].sup.T+t  Equation (3)
Z.sub.c[u, v, 1].sup.T=K(R[X.sub.w, Y.sub.w, Z.sub.w].sup.T+t)  Equation (4)
(70) In Equations (3) and (4), R and t are the external parameters between the camera coordinate system and the world coordinate system, where R is a matrix representation of the attitude angle of the camera, and t is a translation value between the camera coordinate system and the world coordinate system.
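The projection chain of Equations (3) and (4), and its use in the steps 603 and 604, may be sketched as follows. Modeling the "distance information" of a lane-line position point as its camera-frame depth Z.sub.c is an assumption for illustration, as are the example values of K:

```python
import numpy as np

def world_to_image(P_w, K, R, t):
    """Equation (3): camera point P_c = R P_w + t; then Equation (4):
    image point (u, v) obtained by dividing K P_c by the depth Z_c."""
    P_c = R @ P_w + t
    uvw = K @ P_c
    return uvw[0] / uvw[2], uvw[1] / uvw[2], P_c[2]

def nearest_lane_point(lane_points_w, target_uv, K, R, t):
    """Steps 603-604: project the map's lane-line position points into the
    current frame, keep each point's camera-frame depth as its distance
    information, and return the depth of the projected point closest to the
    target object's image position."""
    best_depth, best_d2 = None, float("inf")
    for P in lane_points_w:
        u, v, depth = world_to_image(np.asarray(P, float), K, R, t)
        d2 = (u - target_uv[0]) ** 2 + (v - target_uv[1]) ** 2
        if d2 < best_d2:
            best_depth, best_d2 = depth, d2
    return best_depth

# Illustrative internal parameter (made-up values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
```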
Embodiment 4
(71) Based on the same concept as the method for measuring a distance to an object according to the above Embodiment 3, according to Embodiment 4, an apparatus for measuring a distance to an object is provided. The apparatus is connected to the apparatus for determining an attitude angle of a camera according to the above Embodiment 2. The apparatus has a structure shown in
(72) A first pose obtaining unit 81 is configured to obtain, from the apparatus for determining the attitude angle of the camera, the attitude angle of the camera corresponding to a current frame of image captured by the camera.
(73) A first adjusting unit 82 is configured to adjust at least one external parameter between a camera coordinate system for the camera and a world coordinate system based on the attitude angle of the camera.
(74) A mapping unit 83 is configured to determine respective image coordinates of position points on a lane line on a map in the current frame of image and respective distance information associated with the position points based on a predetermined conversion relationship between an image coordinate system and the world coordinate system and the at least one external parameter between the camera coordinate system and the world coordinate system.
(75) A distance determining unit 84 is configured to determine the position point on the lane line on the map in the current frame of image that is closest to a target object, and determine a distance to the target object based on the distance information associated with the position point.
(76) For details of the first adjusting unit 82, reference can be made to the step 602 and description thereof will be omitted here.
(77) For details of the mapping unit 83, reference can be made to the above Equation (4) and description thereof will be omitted here.
Embodiment 5
(78) According to Embodiment 5 of the present disclosure, a method for vehicle positioning is provided. The method includes the following steps, as shown in the flowchart of
(79) At step 901, an attitude angle of a camera corresponding to each frame of image captured by the camera is obtained by using the method for determining the attitude angle of the camera as shown in
(80) At step 902, at least one external parameter between a camera coordinate system and a vehicle coordinate system is adjusted based on the attitude angle of the camera.
(81) At step 903, an attitude angle of the vehicle in a world coordinate system is obtained based on a conversion relationship between the camera coordinate system and the vehicle coordinate system, a predetermined conversion relationship between the vehicle coordinate system and the world coordinate system, and the attitude angle of the camera.
(82) In an embodiment of the present disclosure, the attitude angle of the vehicle in the world coordinate system and position information obtained by a positioning sensor (e.g., Global Positioning System (GPS) or Global Navigation Satellite System (GNSS)) in the vehicle are used as positioning information of the vehicle.
(83) In an embodiment of the present disclosure, the at least one external parameter between the camera coordinate system and the vehicle coordinate system may include R and t, where R is a matrix representation of the attitude angle of the camera, and t is a translation value between the camera coordinate system and the vehicle coordinate system. Here, t is constant while the vehicle is moving and R varies as the attitude angle of the camera varies. That is, R may be obtained by converting the attitude angle of the camera corresponding to the current frame of image.
(84) In an embodiment of the present disclosure, the vehicle coordinate system is represented as X.sub.vY.sub.vZ.sub.v. The conversion relationship between the camera coordinate system and the vehicle coordinate system is represented as Equation (5). The conversion relationship between the camera coordinate system and the world coordinate system is represented as Equation (3).
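The chaining of the step 903 may be sketched as follows. The Z-Y-X rotation order and the convention that Equation (5)'s R maps vehicle coordinates into camera coordinates are assumptions for illustration:

```python
import math
import numpy as np

def euler_to_matrix(wx, wy, wz):
    """Rotation matrix Rz @ Ry @ Rx for angles in radians."""
    cx, sx = math.cos(wx), math.sin(wx)
    cy, sy = math.cos(wy), math.sin(wy)
    cz, sz = math.cos(wz), math.sin(wz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def matrix_to_euler(R):
    """Inverse of euler_to_matrix, valid for |wy| < pi/2."""
    wy = math.asin(-R[2, 0])
    wx = math.atan2(R[2, 1], R[2, 2])
    wz = math.atan2(R[1, 0], R[0, 0])
    return wx, wy, wz

def vehicle_attitude_in_world(camera_angles, R_cv):
    """Step 903: build R_wc (camera frame -> world) from the camera attitude
    angle, chain it with the vehicle-to-camera rotation R_cv of Equation (5),
    and extract the vehicle attitude angle in the world coordinate system."""
    R_wc = euler_to_matrix(*camera_angles)
    return matrix_to_euler(R_wc @ R_cv)
```

When the camera and vehicle axes coincide, the vehicle attitude equals the camera attitude; a pure yaw offset between the two frames simply adds to the yaw component.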
(85) [X.sub.c, Y.sub.c, Z.sub.c].sup.T=R[X.sub.v, Y.sub.v, Z.sub.v].sup.T+t  Equation (5)
Embodiment 6
(86) Based on the same concept as the method for vehicle positioning according to the above Embodiment 5, according to Embodiment 6, an apparatus for vehicle positioning is provided. The apparatus is connected to the apparatus for determining an attitude angle of a camera according to the above Embodiment 2. The apparatus has a structure shown in
(87) A second pose obtaining unit 10 is configured to obtain, from the apparatus for determining the attitude angle of the camera, the attitude angle of the camera corresponding to each frame of image captured by the camera.
(88) A second adjusting unit 11 is configured to adjust at least one external parameter between a camera coordinate system and a vehicle coordinate system based on the attitude angle of the camera.
(89) A positioning unit 12 is configured to obtain an attitude angle of the vehicle in a world coordinate system based on a conversion relationship between the camera coordinate system and the vehicle coordinate system, a predetermined conversion relationship between the vehicle coordinate system and the world coordinate system, and the attitude angle of the camera.
(90) For details of the positioning unit 12, reference can be made to the step 903 and description thereof will be omitted here.
(91) The basic principles of the present disclosure have been described above with reference to the embodiments. However, it can be appreciated by those skilled in the art that all or any of the steps or components of the method or apparatus according to the present disclosure can be implemented in hardware, firmware, software or any combination thereof in any computing device (including a processor, a storage medium, etc.) or a network of computing devices. This can be achieved by those skilled in the art using their basic programming skills based on the description of the present disclosure.
(92) It can be appreciated by those skilled in the art that all or part of the steps in the method according to the above embodiment can be implemented in hardware following instructions of a program. The program can be stored in a computer readable storage medium. The program, when executed, may include one or any combination of the steps in the method according to the above embodiment.
(93) Further, the functional units in the embodiments of the present disclosure can be integrated into one processing module or can be physically separate, or two or more units can be integrated into one module. Such integrated module can be implemented in hardware or software functional units. When implemented in software functional units and sold or used as a standalone product, the integrated module can be stored in a computer readable storage medium.
(94) It can be appreciated by those skilled in the art that the embodiments of the present disclosure can be implemented as a method, a system or a computer program product. The present disclosure may include pure hardware embodiments, pure software embodiments and any combination thereof. Also, the present disclosure may include a computer program product implemented on one or more computer readable storage media (including, but not limited to, magnetic disk storage and optical storage) containing computer readable program codes.
(95) The present disclosure has been described with reference to the flowcharts and/or block diagrams of the method, device (system) and computer program product according to the embodiments of the present disclosure. It can be appreciated that each process and/or block in the flowcharts and/or block diagrams, or any combination thereof, can be implemented by computer program instructions. Such computer program instructions can be provided to a general computer, a dedicated computer, an embedded processor or a processor of any other programmable data processing device to constitute a machine, such that the instructions executed by a processor of a computer or any other programmable data processing device can constitute means for implementing the functions specified by one or more processes in the flowcharts and/or one or more blocks in the block diagrams.
(96) These computer program instructions can also be stored in a computer readable memory that can direct a computer or any other programmable data processing device to operate in a particular way. Thus, the instructions stored in the computer readable memory constitute a manufacture including instruction means for implementing the functions specified by one or more processes in the flowcharts and/or one or more blocks in the block diagrams.
(97) These computer program instructions can also be loaded onto a computer or any other programmable data processing device, such that the computer or the programmable data processing device can perform a series of operations/steps to achieve a computer-implemented process. In this way, the instructions executed on the computer or the programmable data processing device can provide steps for implementing the functions specified by one or more processes in the flowcharts and/or one or more blocks in the block diagrams.
(98) While the embodiments of the present disclosure have been described above, further alternatives and modifications can be made to these embodiments by those skilled in the art in light of the basic inventive concept of the present disclosure. The claims as attached are intended to cover the above embodiments and all these alternatives and modifications that fall within the scope of the present disclosure.
(99) Obviously, various modifications and variants can be made to the present disclosure by those skilled in the art without departing from the spirit and scope of the present disclosure. Therefore, these modifications and variants are to be encompassed by the present disclosure if they fall within the scope of the present disclosure as defined by the claims and their equivalents.