Abstract
[Problem] To provide real-time interpretation assistance for a back portion image using a general-purpose computer for any case of scoliosis.
[Solution] An interpretation assistance apparatus includes a center line creating unit that creates a center line C in a back portion image of a subject, a measurement interval designating unit that accepts designation of an arbitrary measurement interval I along the center line C, a measurement width designating unit that accepts designation of an arbitrary measurement width W toward intersecting directions of the center line C, and a measurement point coordinate obtaining unit that obtains depth coordinates of respective measurement points that are apart from the center line on both the right and left sides by the measurement width W at every measurement interval I.
Claims
1. An interpretation assistance apparatus of a back portion image, comprising: a center line creating unit that creates a center line in a back portion image of a subject; a measurement interval designating unit that accepts designation of an arbitrary measurement interval along the center line; a measurement width designating unit that accepts designation of an arbitrary measurement width toward intersecting directions of the center line; and a measurement point coordinate obtaining unit that obtains depth coordinates of respective measurement points that are apart from the center line on both right and left sides by the measurement width at every measurement interval.
2. The interpretation assistance apparatus according to claim 1, further comprising a plot designating unit that accepts designation of an arbitrary plurality of plots in the back portion image, wherein the center line creating unit creates the center line so as to pass through the plots in the back portion image.
3. The interpretation assistance apparatus according to claim 1, further comprising a right-left difference calculating unit that calculates a right-left difference between depth coordinates of paired right and left measurement points arranged across the center line at every measurement interval.
4. The interpretation assistance apparatus according to claim 3, further comprising a curved portion identifying unit that identifies a position where the right-left difference is at its maximum as a curved portion where a curve degree of a back portion of the subject is high by comparing the right-left difference at each measurement interval.
5. The interpretation assistance apparatus according to claim 1, further comprising a height difference calculating unit that calculates height differences between depth coordinates of paired right and left measurement points arranged across the center line at every measurement interval and a depth coordinate on the center line between the right and left measurement points.
6. The interpretation assistance apparatus according to claim 1, further comprising: a three-dimensional data obtaining unit that obtains three-dimensional data of a back portion of the subject; a moire image converting unit that converts the three-dimensional data into a two-dimensional moire image; an integrated image creating unit that creates an integrated image associating the three-dimensional data with the moire image; and a display unit that displays the integrated image as the back portion image on a display screen.
7. The interpretation assistance apparatus according to claim 4, wherein the center line is sectioned into regions corresponding to a cervical spine, a thoracic spine, a lumbar spine, and a sacral spine of the subject, and the curved portion identifying unit identifies the curved portion, and subsequently further determines to which region of the center line the curved portion belongs.
8. The interpretation assistance apparatus according to claim 2, wherein in a case where a plurality of the back portion images of the subject exist, the plot designating unit preliminarily accepts designation of the plots common to the plurality of images, the measurement interval designating unit preliminarily accepts designation of the measurement interval common to the plurality of images, the measurement width designating unit preliminarily accepts designation of the measurement width common to the plurality of images, the center line creating unit creates the center line that passes through the plots in each of the plurality of images, and the measurement point coordinate obtaining unit collectively obtains the depth coordinates in each of the plurality of images.
9. A program for causing a computer to function as the interpretation assistance apparatus according to claim 1.
10. An interpretation assistance method of a back portion image, comprising: a step of creating a center line in a back portion image of a subject; a step of accepting designation of an arbitrary measurement interval along the center line; a step of accepting designation of an arbitrary measurement width toward intersecting directions of the center line; and a step of obtaining depth coordinates of respective measurement points that are apart from the center line on both right and left sides by the measurement width at every measurement interval.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] FIG. 1 is a block diagram illustrating a function configuration of an interpretation assistance apparatus.
[0022] FIG. 2 schematically illustrates an exemplary display screen of the interpretation assistance apparatus.
[0023] FIG. 3 schematically illustrates an operation method of the interpretation assistance apparatus.
[0024] FIG. 4 schematically illustrates an operation method of the interpretation assistance apparatus.
[0025] FIG. 5 illustrates respective areas of the spine in a human body and their names.
[0026] FIG. 6(a) indicates a measuring method of a height difference (h) in the prior art (mainly Patent Document 1), and FIG. 6(b) indicates a measuring method of a height difference (h) in the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0027] The following describes embodiments for performing the present invention using the drawings. The present invention is not limited to the embodiments described below, and includes modifications appropriately made by those skilled in the art from the following embodiments within an obvious scope.
[0028] FIG. 1 illustrates a function configuration of one embodiment of an interpretation assistance apparatus 1 according to the present invention. The interpretation assistance apparatus 1 is an apparatus for assisting interpretation of a back portion image of a subject (image capturing the back of the subject in front view). As illustrated in FIG. 1, the interpretation assistance apparatus 1 includes a central processing device 2, and a three-dimensional sensor 3, a display 4, and an input device 5 connected to this central processing device 2. The central processing device 2 causes the display 4 to display the back portion image read by the three-dimensional sensor 3, and accepts input of a predetermined operation and a predetermined setting for reading the back portion image from an operator via the input device 5. Then, the central processing device 2 performs predetermined computation processing based on this input information, and reflects the computation result in the back portion image displayed on the display 4. Thus, the interpretation assistance apparatus 1 basically processes the back portion image so as to allow the operator to easily interpret the back portion image, and displays predetermined assistance information in this back portion image.
[0029] The central processing device 2, which is in charge of controlling the entire interpretation assistance apparatus 1, performs predetermined computation processing for interpretation assistance based on information obtained by the three-dimensional sensor 3 and the input device 5, and displays the result thereof on the display 4. As the central processing device 2, a general-purpose computer can be used. The central processing device 2 includes a processor, such as a CPU, and a storage unit 10. By executing a program for interpretation assistance stored in the storage unit 10 with the processor, the general-purpose computer is caused to function as the central processing device 2 specific to the present invention. The central processing device 2 mainly includes the storage unit 10 and functional elements 11 to 22 executed by the processor. The storage function of the storage unit 10 can be achieved by a non-volatile memory, such as an HDD or an SSD. In addition, the storage unit 10 may have a function as a memory for writing in or reading out interim progress and the like of computation processing by the processor. The memory function of the storage unit 10 can be achieved by a volatile memory, such as a RAM or a DRAM. Details of the respective functional elements 11 to 22 included in the central processing device 2 will be described later.
[0030] The three-dimensional sensor 3 is a device for capturing a back portion of the subject and obtaining its three-dimensional data. As illustrated in FIG. 1, the three-dimensional sensor 3 is disposed face to face with the back portion of the subject, and by capturing this back portion, obtains an uneven state of the back portion as three-dimensional data. As the three-dimensional sensor 3, a general medical device “3D BACK SCANNER™” manufactured and sold by the present applicant can be employed. The “3D BACK SCANNER™” uses an LED light source, captures the back portion of the subject in three dimensions, and can convert it into a moire pattern visual image to visually depict the back portion symmetry. Besides that, as the three-dimensional sensor 3, for example, one having a publicly known Time of Flight (TOF) system, or one having a laser pattern projection system can be employed. The three-dimensional TOF sensor includes a light source that irradiates an object body (subject) with an invisible inspection light, such as infrared rays, and an image sensor that receives the inspection light reflected by the object body. The TOF sensor irradiates the range of the angle of view with the pulse-modulated inspection light, measures a phase lag of this pulse using the image sensor, and thereby obtains a round-trip distance to the object. The three-dimensional sensor having the laser pattern projection system irradiates the object body with an infrared ray pattern and obtains a distance image by triangulation. As the three-dimensional sensor having the laser pattern projection system, for example, the Kinect sensor (registered trademark) manufactured by Microsoft Corporation can be employed. Information detected by the three-dimensional sensor 3 is input to the central processing device 2 via a bus, such as a USB.
[0031] The display 4 is a device for displaying mainly the back portion image of the subject and the computation result of the central processing device 2. As the display 4, a publicly known display device, such as a liquid crystal display or an organic EL display, can be employed.
[0032] The input device 5 is a device for accepting input of information from the operator to the central processing device 2. The information input via the input device 5 is input to the central processing device 2 via a bus, such as a USB. As the input device 5, various devices used as computer peripheral devices can be employed. Examples of the input device 5 include a touch panel, a keyboard, a computer mouse, a stylus pen, a button, a cursor, and a microphone, but the device is not limited to these. In addition, a touch panel display combining the input device 5 as the touch panel and the display 4 may be employed.
[0033] Subsequently, a process of displaying an integrated image that associates the three-dimensional data with the moire image on the display 4 will be described. A three-dimensional data obtaining unit 11 of the central processing device 2 obtains the three-dimensional data of the back portion of the subject. The three-dimensional data is information that records the uneven state of the back portion of the subject as, for example, xyz three-dimensional coordinate values. Therefore, with the three-dimensional data, a certain point on the back portion of the subject can be identified by its x-y coordinates, and the depth (that is, the height) of that point with respect to the three-dimensional sensor 3 can be identified by its z-coordinate. In the present embodiment, the three-dimensional data obtaining unit 11 obtains the three-dimensional data of the back portion of the subject based on information measured by the three-dimensional sensor 3. However, the three-dimensional data obtaining unit 11 can also read out preliminarily recorded three-dimensional data of the back portion from the local storage unit 10. In addition, in a case where the central processing device 2 is connected to another server device via a network, such as the Internet or an intranet, the three-dimensional data obtaining unit 11 may obtain preliminarily recorded three-dimensional data of the back portion from this server device.
[0034] A moire visual image converting unit 12 converts the three-dimensional data obtained by the three-dimensional data obtaining unit 11 into a moire visual image. An example of the moire visual image is illustrated in FIG. 2. That is, the moire visual image converting unit 12 refers to the three-dimensional coordinates of the back portion of the subject and, by superimposingly displaying contour lines (moire fringes) connecting points whose z-coordinate values belong to the same levels (predetermined threshold value ranges) in the captured image of the back portion of the subject, generates a moire visual image as indicated in FIG. 2. The generated moire visual image is stored in the storage unit 10.
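The contour-line generation described above can be sketched as follows (an illustrative Python sketch, not the applicant's implementation; the level height and the fringe-marking rule are assumptions):

```python
import numpy as np

def moire_fringes(depth_map, level_height=2.0):
    """Quantize the z-coordinates into discrete levels and mark pixels
    where the level changes; the marked pixels form the moire fringes
    (contour lines of equal depth)."""
    levels = np.floor(depth_map / level_height).astype(int)
    fringe = np.zeros_like(levels, dtype=bool)
    # a pixel lies on a fringe where its level differs from a neighbor
    fringe[:, 1:] |= levels[:, 1:] != levels[:, :-1]
    fringe[1:, :] |= levels[1:, :] != levels[:-1, :]
    return fringe
```

Superimposing the marked pixels on the captured image then yields a moire visual image of the kind shown in FIG. 2.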
[0035] An integrated image creating unit 13 creates an integrated image associating the moire visual image created by the moire visual image converting unit 12 with the three-dimensional data obtained by the three-dimensional data obtaining unit 11. That is, while the moire visual image itself does not include three-dimensional coordinate data, the integrated image creating unit 13 associates the three-dimensional data that has become the base of the moire visual image with this moire visual image, and thereby matches the xyz three-dimensional coordinates to the respective points in the moire visual image. Accordingly, for example, by designating a certain point on the moire visual image, an integrated image that allows obtaining an xyz-coordinate of the point can be obtained.
[0036] A display unit 14 causes the integrated image created by the integrated image creating unit 13 to be displayed on the display screen of the display 4. Accordingly, as illustrated in FIG. 2, the integrated image associating the moire visual image with the three-dimensional data is displayed as a back portion image. The operator confirms this integrated image (moire visual image) by visual observation while performing designation operations of plots, a measurement interval, and a measurement width, which will be described next.
[0037] Subsequently, the designation operations of the plots, the measurement interval, and the measurement width will be described. FIG. 3 schematically illustrates back portion images (integrated images) displayed on the display 4. As described above, the moire visual images are superimposingly displayed in the back portion images, but here the moire fringes are omitted to avoid complicating the drawing.
[0038] As illustrated in Step 1 of FIG. 3, the operator designates a plurality of plots P1, P2 in the back portion image via the input device 5. A plot designating unit 15 of the central processing device 2 accepts the designation of the plots P1, P2 from the input device 5. The plots need to be on at least two positions, but can be designated on three positions or more, or four positions or more. The plots on two positions are basically designated near the upper end (P1) and near the lower end (P2) of the spine of the subject. The plots P1, P2 on two positions can be arbitrarily designated by the operator. However, for example, a function of assisting the plot P1 and the plot P2 to be aligned on the x-axis of the back portion image that is an x-y plane may be provided in the plot designating unit 15. In addition, the plot designating unit 15 may further accept designation of one or a plurality of intermediary plots between these plots P1, P2 on the two positions near the upper end and near the lower end. Alternatively, the plot designating unit 15 may have a function of scanning the obtained three-dimensional data, extracting points having z-coordinates indicating the depth that are equal to or less than a predetermined value as the outline of the back portion of the subject, among these, obtaining the respective coordinates on the left and right ends of the outline on the upper side in the y-axis direction (such as the neck or shoulder line) and on the left and right ends of the outline on the lower side (such as the waist line), and automatically designating the plots P1, P2 on two positions as midpoints of these left and right ends.
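The automatic designation variant described at the end of the preceding paragraph can be sketched as follows (a minimal Python sketch under assumed conventions: a depth value at or below a single threshold marks the subject, and the function name is hypothetical):

```python
import numpy as np

def auto_plots(depth_map, threshold):
    """Extract the back outline as pixels whose z-coordinate is at or
    below `threshold`, then designate plots P1, P2 as the midpoints of
    the left/right ends of the topmost and bottommost body rows."""
    body = depth_map <= threshold            # True where the subject is
    rows = np.where(body.any(axis=1))[0]     # rows containing the body
    top, bottom = rows[0], rows[-1]

    def midpoint(row):
        cols = np.where(body[row])[0]
        return ((cols[0] + cols[-1]) / 2.0, float(row))  # (x, y)

    return midpoint(top), midpoint(bottom)
```

A real implementation would use the neck/shoulder and waist outlines rather than the raw extreme rows, as the text describes.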
[0039] Next, as illustrated in Step 2 of FIG. 3, a center line creating unit 16 creates a rectilinear center line C in the back portion image so as to pass through the plots P1, P2 whose designation has been accepted by the plot designating unit 15. This center line C is assumed to be a line along the spine of the subject in the back portion image. However, depending on the case of scoliosis, the center line C may deviate from the actual spine. Note that, as long as the center line C passes through the plots P1, P2, it may be a line inclined with respect to the x-axis of the back portion image as an x-y plane, or may be a vertical line parallel to the x-axis of the back portion image. In addition, as described above, in a case where the plot designating unit 15 has a function of assisting the plot P1 and the plot P2 to be aligned on the x-axis of the back portion image, the center line C is always a vertical line parallel to the x-axis of the back portion image.
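A straight center line through the two designated plots can be expressed, for example, as a simple linear interpolation (an illustrative sketch; the function name and the (x, y) coordinate convention are assumptions, and plots with distinct y-coordinates are assumed):

```python
def center_line(p1, p2):
    """Return x on the straight center line C through plots p1 = (x1, y1)
    and p2 = (x2, y2), as a function of y (assumes y1 != y2)."""
    (x1, y1), (x2, y2) = p1, p2

    def x_at(y):
        t = (y - y1) / (y2 - y1)        # fractional position along C
        return x1 + t * (x2 - x1)

    return x_at
```

When the two plots share the same x-coordinate, `x_at` returns that constant value, corresponding to the vertical center line described above.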
[0040] Next, as illustrated in Step 3 of FIG. 3, the operator designates an arbitrary measurement interval I along the center line C via the input device 5. A measurement interval designating unit 17 of the central processing device 2 accepts the designation of the measurement interval I from the input device 5. In the present embodiment, the measurement interval I can be designated by, for example, inputting an arbitrary length (mm or cm). In addition, by designating one measurement interval I on the center line C, all the measurement intervals I are set to that same value. However, the respective measurement intervals I may be set one by one individually, or may be divided into a plurality of groups to be set for each group. In addition, as a designation method of the measurement interval I, for example, input of the number of intervals (N) disposed on the center line C may be accepted. In this case, the measurement interval I can be automatically calculated by dividing the length of the center line C by the input number of intervals (N).
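The alternative designation by number of intervals (N) reduces to a single division (minimal sketch; the function name is an assumption):

```python
import math

def measurement_interval(p1, p2, n_intervals):
    """Compute the measurement interval I by dividing the length of the
    center line C (from plot p1 to plot p2) by the designated number of
    intervals N."""
    length = math.dist(p1, p2)   # Euclidean length of the center line
    return length / n_intervals
```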
[0041] Next, as illustrated in Step 4 of FIG. 3, the operator designates an arbitrary measurement width W toward the intersecting directions (specifically, perpendicular directions) of the center line via the input device 5. A measurement width designating unit 18 of the central processing device 2 accepts the designation of the measurement width W from the input device 5. In the present embodiment, the measurement width W is equal on the right and left sides, and all the measurement intervals have the same measurement width W. Therefore, the operator only needs to input one measurement width W. However, the measurement width W may be set one by one individually for each measurement interval, or the respective intervals may be divided into a plurality of groups, and the measurement width may be set for each group.
[0042] Steps 1 to 4 illustrated in FIG. 3 are processes accompanying the input operation of the operator. In the processes after this, the coordinate value of each measurement point and the like is automatically calculated based on the plots P1, P2, the measurement interval I, and the measurement width W input in the respective steps hitherto.
[0043] As illustrated in Step 5 of FIG. 4, a measurement point coordinate obtaining unit 19 of the central processing device 2 obtains three-dimensional coordinates (xyz-coordinates) of respective measurement points L, R that are apart from the center line on both the right and left sides by the measurement width W at every measurement interval I. Especially, in the present invention, a depth coordinate (z-coordinate) of each measurement point is necessary. Note that, in FIG. 4, the measurement point on the left side of the center line C is indicated by reference sign L, the measurement point on the right side is indicated by reference sign R, and numbers such as L1 and L2 or R1 and R2 are assigned in the order from the upper measurement points. In addition, for convenience, the n-th right and left measurement points from the top are respectively assumed to be Ln and Rn. As described above, the integrated image displayed as the back portion image includes three-dimensional data. Therefore, by identifying the measurement points L, R in the back portion image, the three-dimensional coordinate of each measurement point L, R can be obtained from the three-dimensional data included in this integrated image. In addition, the measurement point coordinate obtaining unit 19 may obtain an xyz-coordinate of an intermediate point on the center line C between the paired right and left measurement points L, R. In FIG. 4, the intermediate points of the center line C are assigned reference numerals, such as C1 and C2, in order from the top. The coordinate value of each measurement point obtained by the measurement point coordinate obtaining unit 19 is recorded in the storage unit 10.
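Step 5 can be sketched as follows (an illustrative Python sketch; the left/right assignment depends on the image coordinate convention and is an assumption, as are the function and variable names). The depth coordinate (z) of each emitted point would then be looked up in the three-dimensional data associated with the integrated image:

```python
import math

def measurement_points(p1, p2, interval, width):
    """Walk along the center line C from p1 to p2 at every measurement
    interval I and emit (Ln, Cn, Rn): the left and right measurement
    points offset by the measurement width W perpendicular to C, plus
    the intermediate point Cn on C. The plots themselves are skipped."""
    x1, y1 = p1
    x2, y2 = p2
    length = math.dist(p1, p2)
    ux, uy = (x2 - x1) / length, (y2 - y1) / length   # unit vector along C
    px, py = -uy, ux                                  # perpendicular unit vector
    points = []
    d = interval
    while d < length:
        cx, cy = x1 + ux * d, y1 + uy * d             # intermediate point Cn
        left = (cx + px * width, cy + py * width)     # Ln
        right = (cx - px * width, cy - py * width)    # Rn
        points.append((left, (cx, cy), right))
        d += interval
    return points
```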
[0044] Next, as illustrated in Step 6 of FIG. 4, a right-left difference calculating unit 20 calculates a difference (referred to as “right-left difference”) between the depth coordinates (z-coordinates) of the paired right and left measurement points L, R disposed at symmetrical positions across the center line C at every measurement interval I. The right-left difference calculating unit 20 only needs to obtain, for example, an absolute value of the difference between the depth coordinates of the right and left measurement points L, R by the formula: |Ln(z)−Rn(z)|.
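The formula above can be applied directly to the recorded depth coordinates (a minimal sketch; the helper name and the list inputs of Ln(z) and Rn(z) values are assumptions):

```python
def right_left_differences(z_left, z_right):
    """Compute the right-left difference |Ln(z) - Rn(z)| at every
    measurement interval, given the depth coordinates of the left and
    right measurement points as parallel lists."""
    return [abs(l - r) for l, r in zip(z_left, z_right)]
```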
[0045] Next, as illustrated in Step 7 of FIG. 4, a curved portion identifying unit 21 compares the right-left differences between the right and left measurement points L, R calculated by the right-left difference calculating unit 20 at the respective measurement intervals, and thereby identifies the position where this right-left difference is at its maximum as a curved portion M where the curve degree of the back portion of the subject is high. Identifying the curved portion M where the right-left difference is at its maximum is an important process in the examination of scoliosis. By automating this process, the interpretation of the back portion image by the operator can be efficiently assisted.
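The comparison performed by the curved portion identifying unit 21 amounts to taking the index of the maximum right-left difference (a minimal sketch; breaking ties toward the upper measurement point is an assumption):

```python
def identify_curved_portion(diffs):
    """Return the index n (0 = topmost interval) of the maximum
    right-left difference, i.e. the curved portion M; the first
    maximum wins on ties."""
    return max(range(len(diffs)), key=lambda i: diffs[i])
```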
[0046] In addition, as illustrated in FIG. 5, the spine of a human body is generally sectioned into regions of the cervical spine, the thoracic spine, the lumbar spine, and the sacral spine. Therefore, also in the present embodiment, the center line C in the back portion image is sectioned into four regions (cervical spine, thoracic spine, lumbar spine, and sacral spine) according to the regions of the spine of a human body. In this case, the curved portion identifying unit 21 may identify the curved portion M as described above, and then determine to which region of the center line this curved portion M belongs. In the examination of scoliosis, the region to which the curved portion M belongs is diagnosed, and by automating this process, the interpretation of the back portion image by the operator can be assisted even more efficiently.
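The region determination can be sketched as a lookup over positions along the center line (an illustrative sketch; the boundary fractions below are invented placeholders, not values from the text, and real boundaries would come from the anatomical sectioning of C):

```python
def region_of(y, boundaries=(0.1, 0.5, 0.8)):
    """Map a normalized position y along the center line C (0 = top,
    1 = bottom) to a spinal region. The default boundary fractions are
    purely illustrative assumptions."""
    names = ("cervical spine", "thoracic spine", "lumbar spine", "sacral spine")
    for name, upper in zip(names, boundaries):
        if y < upper:
            return name
    return names[-1]
```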
[0047] Next, as illustrated in Step 8 of FIG. 4, a height difference calculating unit 22 calculates, at every measurement interval, the height differences between the depth coordinates of the respective paired right and left measurement points disposed at symmetrical positions across the center line C and the depth coordinate of the intermediate point on the center line between the right and left measurement points. For example, it is assumed that the depth coordinate of the left side measurement point Ln is Ln(z), the depth coordinate of the right side measurement point Rn is Rn(z), and the depth coordinate of the intermediate point Cn positioned between these right and left measurement points Ln, Rn is Cn(z). In this case, the height difference calculating unit 22 performs computations of the formula: Ln(z)−Cn(z) and the formula: Rn(z)−Cn(z). Accordingly, the degree of height difference of the respective right and left measurement points compared with the intermediate point is quantified.
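The two formulas in this paragraph can be computed per interval as follows (a minimal sketch; the helper name and list inputs are assumptions):

```python
def height_differences(z_left, z_center, z_right):
    """Compute (Ln(z) - Cn(z), Rn(z) - Cn(z)) at every measurement
    interval: the height of each left/right measurement point relative
    to the intermediate point Cn on the center line."""
    return [(l - c, r - c) for l, c, r in zip(z_left, z_center, z_right)]
```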
[0048] As illustrated in the block diagram of FIG. 1, the information obtained by the right-left difference calculating unit 20, the curved portion identifying unit 21, and the height difference calculating unit 22 is transmitted to the display unit 14, and output from this display unit 14 to the display 4. FIG. 2 illustrates an exemplary screen displayed on the display 4. First, among the right-left differences between the right and left measurement points obtained by the right-left difference calculating unit 20, the curved portion where the right-left difference is at its maximum is highlighted by a bold line or the like in the back portion image. Then, the value of this right-left difference is displayed in a column “Automatic Calculation: Maximum Difference: XX.X mm”. In addition, the region to which the curved portion identified by the curved portion identifying unit 21 belongs is displayed in a column “Portion: thoracic spine” (“thoracic spine” is an exemplification). Further, the height differences between the right and left measurement points and the intermediate point obtained by the height difference calculating unit 22 are displayed in association with the respective measurement points in the back portion image. In addition, among the height differences obtained by the height difference calculating unit 22, the values of the height differences of the left side and right side measurement points corresponding to the curved portion are displayed in columns “Left: XX.X mm” and “Right: XX.X mm”. Thus, on the display 4, along with the back portion image of the subject to which the moire fringes are given, various information useful for the examination of scoliosis is displayed in list form.
In addition, in a case where the user determines that the curved portion where the automatically calculated right-left difference is at its maximum is displaced, the bold line can be finely adjusted by shifting it up and down in the y-axis direction through operation of the input device 5. The result is displayed in a column “Manual Calculation: Maximum Difference: XX.X mm”.
[0049] FIG. 6 schematically illustrates a difference between the prior art (Patent Document 1: WO2013/081030) and the present invention. As indicated in FIG. 6(a), in the prior art, projected peaks positioned on both the right and left sides of the spine in the back portion of the subject are identified, and the height difference h between these peak positions is calculated. However, as indicated in FIG. 6(b), in a case where a projected peak does not exist on both the right and left sides of the spine (center line C), the prior art does not function. In contrast, in the present invention, as indicated in FIG. 6(b), the center line C can be arbitrarily designated, and the measurement width W in both the right and left side directions from this center line C can also be arbitrarily designated. Subsequently, in the present invention, the height difference h (that is, the right-left difference between the depth coordinates) between the right and left measurement points Ln, Rn identified by the measurement width W is obtained. Therefore, the present invention does not depend on projected peaks, and even in a case where a peak does not exist on both the right and left sides of the center line C, the height difference h between the right and left measurement points Ln, Rn can be obtained. Accordingly, the interpretation assistance apparatus 1 of the present invention is generally applicable to any case of scoliosis.
[0050] In addition, the interpretation assistance apparatus 1 of the present invention has a function of collectively processing a plurality of back portion images as interpretation targets in a case where such a plurality of back portion images exist. Note that, while the plurality of back portion images as the interpretation targets are preferably all of the same subject, there is no problem in the process even if a back portion image of a different subject is included. First, the display unit 14 displays a representative back portion image (representative image) among the plurality of back portion images on the display 4. The plot designating unit 15 preliminarily accepts designation of a plurality of plots common to the plurality of back portion images as the interpretation targets in the representative image. The measurement interval designating unit 17 preliminarily accepts designation of a measurement interval common to the plurality of back portion images. The measurement width designating unit 18 preliminarily accepts designation of a measurement width common to the plurality of back portion images. The center line creating unit 16 creates a center line that passes through the designated plurality of plots in each of the plurality of back portion images. Then, the measurement point coordinate obtaining unit 19 collectively obtains the three-dimensional coordinate values (especially the depth coordinates) of the measurement points in each of the plurality of back portion images. Subsequently, the right-left difference calculating unit 20, the curved portion identifying unit 21, and the height difference calculating unit 22 perform processes similar to those described above on each of the plurality of back portion images as the interpretation targets.
Thus, by accepting designation of the plots, the measurement interval, and the measurement width common to the plurality of back portion images, and collectively obtaining the coordinate values of the measurement points in the respective back portion images, the plurality of back portion images can be simultaneously processed with few input operations. Accordingly, the interpretation processing time can be substantially shortened.
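The batch mode described above can be sketched as a loop applying the common settings to every image (an illustrative sketch; `analyze` is a hypothetical stand-in for the per-image pipeline of Steps 5 to 8):

```python
def process_images(images, plots, interval, width, analyze):
    """Apply the common plots, measurement interval, and measurement
    width to every back portion image and collect the per-image
    analysis results."""
    return [analyze(image, plots, interval, width) for image in images]
```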
[0051] In the present application, the embodiments of the present invention have been described above by referring to the drawings to express the contents of the present invention. However, the present invention is not limited to the embodiments described above, and includes changed embodiments and improved embodiments obvious to those skilled in the art based on the matters described in the present application.
DESCRIPTION OF REFERENCE SIGNS
[0052] 1 . . . interpretation assistance apparatus
[0053] 2 . . . central processing device
[0054] 3 . . . three-dimensional sensor
[0055] 4 . . . display
[0056] 5 . . . input device
[0057] 10 . . . storage unit
[0058] 11 . . . three-dimensional data obtaining unit
[0059] 12 . . . moire visual image converting unit
[0060] 13 . . . integrated image creating unit
[0061] 14 . . . display unit
[0062] 15 . . . plot designating unit
[0063] 16 . . . center line creating unit
[0064] 17 . . . measurement interval designating unit
[0065] 18 . . . measurement width designating unit
[0066] 19 . . . measurement point coordinate obtaining unit
[0067] 20 . . . right-left difference calculating unit
[0068] 21 . . . curved portion identifying unit
[0069] 22 . . . height difference calculating unit