Endoscope guidance from interactive planar slices of a volume image
11596292 · 2023-03-07
CPC classification: G16Z99/00 (PHYSICS); A61B2090/365, A61B34/20, A61B90/37, A61B90/36, A61B5/066, A61B2034/301 (HUMAN NECESSITIES)
International classification: A61B1/00, A61B90/00, A61B34/00, A61B5/06, A61B34/20 (HUMAN NECESSITIES)
Abstract
An endoscopic imaging system (10) employing an endoscope (20) and an endoscope guidance controller (30). In operation, the endoscope (20) generates an endoscopic video (23) of an anatomical structure within an anatomical region. Responsive to a registration between the endoscopic video (23) and a volume image (44) of the anatomical region, the endoscope guidance controller (30) controls a user interaction (50) with a graphical user interface (31) including one or more interactive planar slices (32) of the volume image (44), and responsive to the user interaction (50) with the graphical user interface (31), the endoscope guidance controller (30) controls a positioning of the endoscope (20) relative to the anatomical structure derived from the interactive planar slices (32) of the volume image (44). A robotic endoscopic imaging system (11) incorporates a robot (25) into the endoscopic imaging system (10), whereby the endoscope guidance controller (30) controls a positioning by the robot (25) of the endoscope (20) relative to the anatomical structure.
Claims
1. An endoscopic imaging system, comprising: an endoscope for generating an endoscopic video of an anatomical structure within an anatomical region; and an endoscope guidance controller operably connected to the endoscope, the endoscope guidance controller structurally configured to: based on a registration between the endoscopic video and a volume image of the anatomical region, control a graphical user interface to facilitate a user interaction with a slice selector configured to select and display at least one interactive planar slice of the volume image, the selected at least one interactive planar slice corresponding to a selected planar view of the anatomical structure, the user interaction comprising a user selection of a point of interest within the at least one interactive planar slice, and based on the selected planar view and the selected point of interest, guide the endoscope to a location within the anatomical region and position the endoscope to adjust a field of view of the endoscope to align with the selected planar view of the anatomical structure for imaging the selected point of interest within the endoscopic video.
2. The endoscopic imaging system of claim 1, wherein the at least one interactive planar slice comprises at least one of: an interactive axial slice of the volume image; an interactive coronal slice of the volume image; and an interactive sagittal slice of the volume image.
3. The endoscopic imaging system of claim 1, wherein: the endoscopic guidance controller is responsive to a user navigation through a plurality of interactive planar slices of the volume image, and the endoscopic guidance controller is further responsive to a user selection of the at least one interactive planar slice of the volume image derived from the user navigation through the plurality of interactive planar slices of the volume image.
4. The endoscopic imaging system of claim 1, wherein: the graphical user interface further comprises a display of the endoscopic video; the endoscopic guidance controller is responsive to the user selection of a point of interest within the at least one interactive planar slice of the volume image; and the endoscopic guidance controller controls a display of the user selection of the point of interest within the endoscopic video.
5. The endoscopic imaging system of claim 1, wherein: the graphical user interface further comprises a display of a volume segmentation of the anatomical structure derived from the volume image; and the endoscopic guidance controller is further configured to display a clipping of the volume segmentation of the anatomical structure derived from the at least one interactive planar slice of the volume image.
6. The endoscopic imaging system of claim 5, wherein the endoscopic guidance controller is further responsive to a user selection of a point of interest within the clipping of the volume segmentation of the anatomical structure.
7. The endoscopic imaging system of claim 1, wherein the graphical user interface further comprises: the endoscopic video of the anatomical structure within the anatomical region; and an augmented endoscopic view of the volume image of the anatomical region.
8. The endoscopic imaging system according to claim 1, further comprising: a robot operably connected to the endoscopic guidance controller and the endoscope, the robot configured to position the endoscope relative to the anatomical structure within the anatomical region based on control by the endoscopic guidance controller.
9. The endoscopic imaging system according to claim 1, wherein the endoscope guidance controller is configured to determine a spatial differential between a current planar view of the endoscopic video and the selected planar view; and wherein the endoscope guidance controller is configured to guide the endoscope to adjust the field of view based on the determined spatial differential.
10. The endoscopic imaging system according to claim 1, wherein the endoscope guidance controller is configured to determine an angular differential between a current planar view of the endoscopic video and the selected planar view; and wherein the endoscope guidance controller is configured to guide the endoscope to adjust the field of view based on the determined angular differential.
11. The endoscopic imaging system according to claim 1, wherein the endoscope guidance controller is configured to determine a spatial differential and an angular differential between a current planar view of the endoscopic video and the selected planar view; and wherein the endoscope guidance controller is configured to guide the endoscope to adjust the field of view based on the determined spatial differential and angular differential.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
(11) To facilitate an understanding of the present disclosure, the following description of
(12) Referring to
(13) For the cardiac surgery, endoscope 20 is inserted into the patient's body through a small port into a cardiac region of the patient as known in the art. In operation, endoscope 20 is strategically positioned within the port to generate an endoscopic image of a heart within the cardiac region of the patient, and video capture device 21 converts the endoscopic image from endoscope 20 into an endoscopic video 23 comprising a computer-readable temporal sequence of digital frames 22.
(14) Endoscope guidance controller 30 processes both the digital frames of endoscopic video 23 and a volume image 42 of the heart segmented from the cardiac region (e.g., a preoperative or intraoperative CT, MRI or X-ray 3D image of the segmented heart), and executes a registration routine for registering endoscopic video 23 and volume image 42 whereby a transformation between an imaging coordinate system of endoscope 20 and an imaging coordinate system of the volume imaging modality is determined.
(15) In practice, any type of registration routine may be executed by endoscope guidance controller 30. In one embodiment for coronary artery bypass grafting as known in the art, endoscope guidance controller 30 executes a registration routine involving a matching of an arterial tree extracted from the segmented heart of volume image 42 to branches of the arterial tree viewed within endoscopic video 23. For this embodiment, endoscope guidance controller 30 may extract the arterial tree from the segmented heart of volume image 42 as known in the art, or process a volume image 44 of an extracted arterial tree.
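In rough outline, once corresponding landmark points (e.g., arterial branch points) have been identified in both the endoscopic view and the segmented volume, a rigid registration of the kind described above can be estimated with the Kabsch/Procrustes algorithm. A minimal sketch under that simplifying assumption of paired 3-D points; the function name is illustrative and not from the patent:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch algorithm: find rotation R and translation t such that
    (R @ src.T).T + t best matches dst for paired (N, 3) point sets."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In practice the arterial-tree matching would first have to establish the correspondences themselves (e.g., by graph matching of the tree topology), which is the harder part of the registration routine.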
(16) Based on the image registration, a user selection of a point of interest on volume image 42 may be used to transform that point of interest into an associated point within the imaging coordinate system of endoscope 20 whereby endoscope 20 may be guided towards the associated point for imaging such point within the endoscopic video. To this end, endoscope guidance controller 30 controls a user interaction (“UI”) 50 with a graphical user interface 31 including one or more interactive planar slice(s) 32 of volume image 44. Examples of an interactive planar slice 32 include, but are not limited to, an interactive axial slice 32a, an interactive coronal slice 32b and an interactive sagittal slice 32c as shown.
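The transformation established by the registration can then map a user-selected point from the volume-image coordinate system into the imaging coordinate system of the endoscope, as the paragraph above describes. A hedged sketch, assuming a 4x4 homogeneous registration transform and a voxel spacing in millimetres (all names hypothetical):

```python
import numpy as np

def volume_to_endoscope(point_vox, voxel_spacing, T_endo_from_vol):
    """Map a user-selected voxel index (i, j, k) of the volume image to a
    3-D point in the endoscope imaging coordinate system, given the 4x4
    homogeneous transform obtained from image registration."""
    p_mm = np.asarray(point_vox, dtype=float) * np.asarray(voxel_spacing)
    p_h = np.append(p_mm, 1.0)               # homogeneous coordinates
    return (T_endo_from_vol @ p_h)[:3]
```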
(17) To facilitate user interaction 50, in practice, graphical user interface 31 displays graphical icons, visual indicators, etc. for interfacing with interactive planar slice(s) 32. In one embodiment, graphical user interface 31 displays a slice selector of segmented volume image 42 and/or a screen slider for each displayed planar view of segmented volume image 42 whereby a hardware input device (e.g., a mouse, a keyboard, a joystick, a touchpad, etc.) may be user operated to manipulate the slice selector and/or screen slider as known in the art.
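Assuming the volume image is held as a 3-D array in (Z, Y, X) order, the axial, coronal and sagittal views the slice selector steps through are single-index slices of that array. A minimal sketch; the axis convention is an assumption, not taken from the patent:

```python
import numpy as np

def planar_slice(volume, plane, index):
    """Return one planar slice of a volume image stored as a (Z, Y, X)
    array: 'axial' fixes Z, 'coronal' fixes Y, 'sagittal' fixes X."""
    if plane == "axial":
        return volume[index, :, :]
    if plane == "coronal":
        return volume[:, index, :]
    if plane == "sagittal":
        return volume[:, :, index]
    raise ValueError(f"unknown plane: {plane}")
```

A screen slider of the kind described would simply vary `index` along the chosen axis.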
(18) User interaction 50 involves a user navigation through interactive planar slice(s) 32 of volume image 42 whereby the user may select particular planar view(s) of the segmented heart within volume image 42 to guide endoscope 20 within the cardiac region for positioning endoscope 20 with a field-of-view of endoscope 20 corresponding to the selected planar view(s) of the segmented heart. Alternatively, from the user selection of particular planar view(s) of interest of the segmented heart, the user may further select particular point(s) of interest within the selected planar view(s) for positioning endoscope 20 with a field-of-view of endoscope 20 corresponding to the selected point(s) of interest of the segmented heart.
(19) To supplement user interaction 50, in practice as known in the art, graphical user interface 31 may further include a display of an endoscopic video 23 of digital frames 22, an expanded endoscopic video 24 of volume image 42, and a volume image 45 of a rotatable segmented heart. In one embodiment, the extracted arterial tree of volume image 44 may be overlaid on endoscopic video 23 and/or endoscopic video 24 as known in the art, and volume image 42 may be shown with a different coloring of the segmented heart and the extracted arterial tree as known in the art.
(20) For guidance purposes of endoscope 20, in practice as known in the art, endoscope guidance controller 30 ascertains any spatial differential and/or any angular differential between a current planar view of endoscopic video 23, and a user selection of a planar view of interest and/or a point of interest via interactive planar slice(s) 32 of volume image 42. From an ascertained spatial differential and/or an ascertained angular differential, endoscope guidance controller 30 determines an endoscopic path to position endoscope 20 for imaging the user selected planar view and/or the user selected point of interest. For the determination of the endoscopic path, endoscope guidance controller 30 finds a center location of the user selected planar view and/or a location of the user selected point of interest from a camera calibration matrix of endoscope 20 as known in the art, or from an uncalibrated visual servoing of endoscope 20 as known in the art involving an overlay of the extracted arterial tree of volume image 44 upon endoscopic video 23 and/or the segmented heart of volume image 42.
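The spatial and angular differentials described above could be computed, for instance, as the Euclidean distance from the current camera position to the selected view center and as the angle between the current optical axis and the selected plane's normal, once both are expressed in a common coordinate system via the registration. A simplified sketch under those assumptions (all names hypothetical):

```python
import numpy as np

def view_differentials(cam_pos, cam_axis, target_pos, target_normal):
    """Spatial differential: distance from the current camera position to
    the target view center. Angular differential: acute angle (degrees)
    between the current optical axis and the selected plane's normal
    (absolute dot product, since the normal's sign is ambiguous)."""
    spatial = float(np.linalg.norm(np.asarray(target_pos, float)
                                   - np.asarray(cam_pos, float)))
    a = np.asarray(cam_axis, float)
    a = a / np.linalg.norm(a)
    n = np.asarray(target_normal, float)
    n = n / np.linalg.norm(n)
    angular = float(np.degrees(np.arccos(np.clip(abs(a @ n), 0.0, 1.0))))
    return spatial, angular
```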
(21) For a manual guidance of endoscope 20, the determined endoscopic path is visually and/or audibly communicated 33 to the user of system 10 whereby the user may, by hand and/or a manipulation of an articulated platform/robot, linearly displace, pivot and/or revolve endoscope 20 relative to the incision port of the patient by a specified distance and/or specified degrees to position endoscope 20 for imaging the user selected planar view and/or the user selected point of interest.
(22) Referring to
(23) To facilitate a further understanding of the inventive principles of the present disclosure, the following description of
(24) Referring to
(25) Referring to
(26) An intraoperative phase involves oblique endoscope 20a being inserted into patient 90 through a small port into a cardiac region of patient 90 as known in the art. In operation, oblique endoscope 20a is strategically positioned within the port to initiate a generation by oblique endoscope 20a of an endoscopic video of heart 92 of patient 90, and endoscope guidance controller 30a controls a further positioning by spherical robot 25a of oblique endoscope 20a relative to heart 92 for imaging a user selected planar view and/or a user selected point of interest of heart 92 in accordance with the inventive principles of graphical user interface 31a of the present disclosure.
(27) Specifically, an intraoperative stage S64 of flowchart 60 encompasses a user interaction via keyboard 14 with graphical user interface 31a as displayed on monitor 13 of workstation 12.
(28) In one embodiment as shown in
(29) (1) an endoscopic video 70 of heart 92 (
(30) (2) an augmented endoscopic video 71 of volume image 44a;
(31) (3) a volume image 72 of a rotatable segmented heart from volume image 42a; and
(32) (4) interactive axial slice 73a, interactive coronal slice 73b and interactive sagittal slice 73c of volume image 42a (
(33) In a second embodiment, volume image 72 may be clipped by a user selection of an interactive planar slice as shown
(34) In a third embodiment, volume image 72 may be clipped by a user selection of an interactive planar slice as shown
(35) The user interaction may involve a user navigation through one of interactive planar slices 73a-73c of volume image 42a via activation of a slice selector as known in the art whereby the user may select a particular planar view of the segmented heart within volume image 42a to guide oblique endoscope 20a within the cardiac region for positioning oblique endoscope 20a via spherical robot 25a with a field-of-view of oblique endoscope 20a corresponding to the selected planar view of the segmented heart.
(36) In practice, endoscope guidance controller 30a may display a user selection of the planar view of volume image 42a for user guidance contemplation purposes without any movement of oblique endoscope 20a by robot 25a until receiving a user confirmation of the user selection of the planar view. Furthermore, for user guidance contemplation purposes, the user selected planar view may be displayed within augmented endoscopic video 71 and/or the segmented heart of volume image 42a may be rotated to centralize the user selected planar view.
(37) The user interaction may further involve a user navigation through one of interactive planar slices 73a-73c of volume image 42a via the slice selector as known in the art whereby the user may select a particular point of interest within a selected planar view for positioning oblique endoscope 20a via spherical robot 25a with a field-of-view of oblique endoscope 20a corresponding to the selected point of interest of the segmented heart. For example, as shown in
(38) In practice, endoscope guidance controller 30a may display a user selection of a point of interest on an interactive planar slice of volume image 42a for user guidance contemplation purposes without any movement of oblique endoscope 20a by spherical robot 25a until receiving a user confirmation of the user selection of the point of interest. Furthermore, for user guidance contemplation purposes, the user selected point of interest may be displayed within augmented endoscopic video 71 and/or the segmented heart of volume image 42a may be rotated to centralize the user selected point of interest.
(39) The user interaction may further involve a user navigation through one or more interactive planar slices 73a-73c of volume image 42a via the slice selector as known in the art whereby the user may select a particular point of interest corresponding to an orthogonal intersection of interactive planar slices 73a-73c for positioning oblique endoscope 20a via spherical robot 25a with a field-of-view of oblique endoscope 20a corresponding to the selected point of interest of the segmented heart. For example, as shown in
(40) In practice, endoscope guidance controller 30a may display an orthogonal intersection of interactive planar slices 73a-73c of volume image 42a for user guidance contemplation purposes without any movement of oblique endoscope 20a by robot 25a until receiving a user confirmation of the user selection of the point of interest. Furthermore, for user guidance contemplation purposes, the orthogonal intersection of interactive planar slices 73a-73c may be displayed within augmented endoscopic video 71 and/or the segmented heart of volume image 42a may be rotated to centralize the orthogonal intersection of interactive planar slices 73a-73c.
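Because an axial slice fixes the Z index, a coronal slice the Y index and a sagittal slice the X index, the orthogonal intersection of the three selected slices determines a single voxel. A small sketch converting that voxel to physical coordinates; the (Z, Y, X) spacing order is an assumption, not from the patent:

```python
def slice_intersection(axial_idx, coronal_idx, sagittal_idx, voxel_spacing):
    """The orthogonal intersection of one axial (Z), one coronal (Y) and
    one sagittal (X) slice is a single voxel; return that point in
    millimetres as (x, y, z), given spacing ordered (sz, sy, sx)."""
    z, y, x = axial_idx, coronal_idx, sagittal_idx
    sz, sy, sx = voxel_spacing
    return (x * sx, y * sy, z * sz)
```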
(41) Referring back to
(42) Specifically, endoscope guidance controller 30a ascertains any spatial differential and/or any angular differential between a current planar view of endoscopic video 70, and a user selection of a planar view of interest and/or a point of interest via interactive planar slice(s) 73a-73c of volume image 42a. From an ascertained spatial differential and/or an ascertained angular differential, endoscope guidance controller 30a determines an endoscopic path to position oblique endoscope 20a for imaging the user selected planar view and/or the user selected point of interest. For the determination of the endoscopic path, endoscope guidance controller 30a finds a center location of the user selected planar view and/or a location of the user selected point of interest from a camera calibration matrix of oblique endoscope 20a as known in the art, or from an uncalibrated visual servoing of oblique endoscope 20a as known in the art involving an overlay of the extracted arterial tree of volume image 44a upon endoscopic video 70 and/or the segmented heart of volume image 42a.
(43) For automatic robotic guidance of oblique endoscope 20a, endoscope guidance controller 30a communicates robot actuation commands as known in the art to spherical robot 25a for pivoting and/or revolving oblique endoscope 20a relative to the incision port of the patient by specified degrees to position oblique endoscope 20a for imaging the user selected planar view and/or the user selected point of interest. For example, as shown in
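For a robot pivoting the endoscope about a fixed incision port (a remote center of motion, as a spherical robot of this kind enforces), the commanded motion can be expressed as two angles that point the endoscope axis at the target. A simplified spherical-coordinate sketch; these are illustrative geometry, not the patent's actual actuation commands:

```python
import math

def pivot_angles(target, port=(0.0, 0.0, 0.0)):
    """Yaw and pitch (degrees) that aim an endoscope pivoting about the
    incision port at a target point expressed in port coordinates."""
    dx, dy, dz = (t - p for t, p in zip(target, port))
    yaw = math.degrees(math.atan2(dy, dx))       # rotation about vertical
    r = math.hypot(dx, dy)                       # horizontal range
    pitch = math.degrees(math.atan2(dz, r))      # elevation toward target
    return yaw, pitch
```

A real controller would additionally respect joint limits, insertion depth and the registration-derived target location before issuing such commands.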
(44) Referring back to
(45) Referring to
(46) Furthermore, as one having ordinary skill in the art will appreciate in view of the teachings provided herein, features, elements, components, etc. described in the present disclosure/specification and/or depicted in the
(47) Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (e.g., any elements developed that can perform the same or substantially similar function, regardless of structure). Thus, for example, it will be appreciated by one having ordinary skill in the art in view of the teachings provided herein that any block diagrams presented herein can represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, one having ordinary skill in the art should appreciate in view of the teachings provided herein that any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.
(48) Furthermore, exemplary embodiments of the present disclosure can take the form of a computer program product or application module accessible from a computer-usable and/or computer-readable storage medium providing program code and/or instructions for use by or in connection with, e.g., a computer or any instruction execution system. In accordance with the present disclosure, a computer-usable or computer-readable storage medium can be any apparatus that can, e.g., include, store, communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus or device. Such exemplary medium can be, e.g., an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include, e.g., a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), flash (drive), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk read only memory (CD-ROM), compact disk read/write (CD-R/W) and DVD. Further, it should be understood that any new computer-readable medium which may hereafter be developed should also be considered as computer-readable medium as may be used or referred to in accordance with exemplary embodiments of the present disclosure.
(49) Having described preferred and exemplary embodiments of novel and inventive endoscope guidance from volume image slices, (which embodiments are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons having ordinary skill in the art in light of the teachings provided herein, including the
(50) Moreover, it is contemplated that corresponding and/or related systems incorporating and/or implementing the device or such as may be used/implemented in a device in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure. Further, corresponding and/or related method for manufacturing and/or using a device and/or system in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.