IMAGE ACQUISITION VISUALS FOR AUGMENTED REALITY
20220392607 · 2022-12-08
Inventors
CPC classification
A61B2090/365
HUMAN NECESSITIES
G16H40/20
PHYSICS
A61B34/20
HUMAN NECESSITIES
G16H20/40
PHYSICS
G16H50/20
PHYSICS
A61B90/37
HUMAN NECESSITIES
International classification
G16H20/40
PHYSICS
A61B34/20
HUMAN NECESSITIES
A61B90/00
HUMAN NECESSITIES
Abstract
Various embodiments of the present disclosure encompass a visual image sequence controller (40) for controlling an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of an image-guided intervention by an augmented reality device (30). The visual imaging sequence acquisition (42) includes, for each interventional image of the interventional imaging sequence (23), an interactive virtual indicator of a sequence number of a corresponding interventional image within the interventional imaging sequence (23) and of an imaging parameter of the imaging modality (20) (e.g., a direction, a rotation and/or an angulation of the imaging modality (20)) during an acquisition of the corresponding interventional image by the imaging modality (20). The controller (40) interfaces with the imaging modality (20) and/or a display (24, 31) responsive to a user interaction with at least one of the interactive virtual indicators.
Claims
1. A visual image sequence controller (40) for an augmented image-guided intervention system including an augmented reality device (30) operable to render an augmented view of an image-guided intervention, the visual image sequence controller (40) comprising: a non-transitory machine-readable storage medium encoded with instructions for execution by at least one processor to control an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of the image-guided intervention by the augmented reality device (30), wherein the non-transitory machine-readable storage medium includes instructions to: access geometric information (41) of an imaging modality (20) associated with an acquisition of an interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; generate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) including, for each interventional image of the interventional imaging sequence (23), an interactive virtual indicator of a sequence number of a corresponding interventional image within the interventional imaging sequence and of an imaging parameter of the imaging modality (20) during an acquisition of the corresponding interventional image by the imaging modality (20); augment the view of the image-guided intervention by the augmented reality device (30) with the interactive virtual indicators as virtual objects indicative of the acquisition of the interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; and interface with the at least one of the imaging modality (20) and a display (24, 31) responsive to a user interaction with at least one of the interactive virtual indicators.
2. The visual image sequence controller (40) of claim 1, wherein the non-transitory machine-readable storage medium further includes instructions to annotate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23).
3. The visual image sequence controller (40) of claim 1, wherein the non-transitory machine-readable storage medium further includes instructions to control an image display interaction between the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) and a display (24, 31).
4. The visual image sequence controller (40) of claim 1, wherein the non-transitory machine-readable storage medium further includes instructions to control an imaging modality posing interaction between the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) and the imaging modality (20).
5. The visual image sequence controller (40) of claim 1, wherein the visual image sequence controller (40) is installable within the augmented reality device (30).
6. The visual image sequence controller (40) of claim 1, wherein the visual image sequence controller (40) is installable within an imaging modality (20); and wherein, when installed within an imaging modality (20), the non-transitory machine-readable storage medium further includes instructions to communicate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) from the imaging modality (20) to the augmented reality device (30).
7. The visual image sequence controller (40) of claim 1, wherein the visual image sequence controller (40) is installable within an auxiliary intervention device; and wherein, when installed within an auxiliary intervention device, the non-transitory machine-readable storage medium further includes instructions to communicate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) from the auxiliary intervention device to the augmented reality device (30).
8. An augmented image-guided intervention system, comprising: an augmented reality device (30) operable to render an augmented view of an image-guided intervention; and a visual image sequence controller (40) configured to control an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of the image-guided intervention by the augmented reality device (30), wherein the visual image sequence controller (40) is configured to: access geometric information (41) of an imaging modality (20) associated with an acquisition of an interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; generate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) including, for each interventional image of the interventional imaging sequence (23), an interactive virtual indicator of a sequence number of a corresponding interventional image within the interventional imaging sequence and of an imaging parameter of the imaging modality (20) during an acquisition of the corresponding interventional image by the imaging modality (20); augment the view of the image-guided intervention by the augmented reality device (30) with the interactive virtual indicators as virtual objects indicative of the acquisition of the interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; and interface with the at least one of the imaging modality (20) and a display (24, 31) responsive to a user interaction with at least one of the interactive virtual indicators.
9. The augmented image-guided intervention system of claim 8, wherein the visual image sequence controller (40) is further configured to annotate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23).
10. The augmented image-guided intervention system of claim 8, wherein the visual image sequence controller (40) is further configured to control an image display interaction between the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) and a display (24, 31).
11. The augmented image-guided intervention system of claim 8, wherein the visual image sequence controller (40) is further configured to control an imaging modality posing interaction between the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) and the imaging modality (20).
12. The augmented image-guided intervention system of claim 8, wherein the augmented reality device (30) comprises the visual image sequence controller (40).
13. The augmented image-guided intervention system of claim 8, wherein the visual image sequence controller (40) is installable within an imaging modality (20); and wherein the visual image sequence controller (40) is further configured to communicate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) from the imaging modality (20) to the augmented reality device (30).
14. The augmented image-guided intervention system of claim 8, wherein the visual image sequence controller (40) is installable within an auxiliary intervention device; and wherein the visual image sequence controller (40) is further configured to communicate the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) from the auxiliary intervention device to the augmented reality device (30).
15. An augmented image-guided intervention method, comprising: rendering, by an augmented reality device (30), an augmented view of an image-guided intervention; and controlling, by a visual image sequence controller (40), an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of the image-guided intervention by the augmented reality device (30) including accessing, by the visual image sequence controller (40), geometric information (41) of an imaging modality (20) associated with an acquisition of an interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; generating, by the visual image sequence controller (40), the visual imaging sequence acquisition (42) of the interventional imaging sequence (23) including, for each interventional image of the interventional imaging sequence (23), an interactive virtual indicator of a sequence number of a corresponding interventional image within the interventional imaging sequence and of an imaging parameter of the imaging modality (20) during an acquisition of the corresponding interventional image by the imaging modality (20); augmenting, by the visual image sequence controller (40), the view of the image-guided intervention by the augmented reality device (30) with the interactive virtual indicators as virtual objects indicative of the acquisition of the interventional imaging sequence (23) of interventional images by the imaging modality (20) during the image-guided intervention; and interfacing, by the visual image sequence controller (40), with the at least one of the imaging modality (20) and a display (24, 31) responsive to a user interaction with at least one of the interactive virtual indicators.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The following detailed description of exemplary embodiments is presented with reference to the accompanying Figures.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0036] The present disclosure is applicable to image-guided intervention implementing minimally-invasive imaging to diagnose and/or treat anatomical diseases (e.g., X-ray interventional imaging of organs, ultrasound interventional imaging of organs, etc.).
[0037] The present disclosure improves upon the prior art of image-guided intervention by providing unique embodiments of 2D/3D interactive virtual indicators in an augmented reality of an image-guided intervention (live or recorded) for graphically displaying, within a physical world, geometric information of an imaging modality associated with a sequential acquisition of 2D/3D interventional images to thereby illustrate a sequence of a portion or an entirety of the image-guided intervention to a clinician in a natural way.
[0038] To facilitate an understanding of the present disclosure, the following description of
[0039] Referring to
[0040] For purposes of describing and claiming the present disclosure, the term “imaging modality” encompasses all systems, as known in the art of the present disclosure and hereinafter conceived, for implementing an image-guided intervention procedure by directing energy (e.g., X-ray beams, ultrasound, radio waves, magnetic fields, light, electrons, lasers, and radionuclides) into an anatomy for purposes of generating images of the anatomy (e.g., biological tissues and bone). Examples of an imaging modality include, but are not limited to, interventional X-ray imaging systems and interventional ultrasound systems.
[0041] In practice, imaging modality 20 includes an interventional imaging controller 22 for controlling an activation/deactivation of an interventional imaging device 21 (e.g., an X-ray C-arm, an ultrasound probe, etc.) to systematically direct energy into an anatomy via operator-generated commands and/or image guided procedural-generated commands for purposes of generating images of the anatomy as known in the art of the present disclosure.
[0042] For purposes of describing and claiming the present disclosure, the term “augmented reality device” encompasses all devices, as known in the art of the present disclosure and hereinafter conceived, for implementing an interactive experience of overlaying virtual object(s) in a physical world of image-guided intervention based on an accurate spatial registration of the augmented reality device to the physical world of image-guided intervention that facilitates a consistent positioning and orienting of the virtual object(s) in the real-world image-guided intervention. Examples of an augmented reality device include, but are not limited to, Microsoft Hololens®, Microsoft HoloLens®v2, DAQRI Smart Glasses ®, Magic Leap®, Vusix Blade® and Meta2®.
[0043] In practice, augmented reality device 30 includes an augmented reality controller 32 for controlling augmented reality display 31 to display virtual object(s) in a physical world of image-guided intervention based on an accurate spatial registration of augmented reality display 31 to imaging modality 20 and/or the subject anatomy that facilitates a consistent positioning and orienting of the virtual object(s) relative to the subject anatomy and an operator interaction with the virtual object(s).
[0044] For purposes of describing and claiming the present disclosure, the term “visual image sequence controller” encompasses all structural configurations, as understood in the art of the present disclosure and as exemplary described in the present disclosure, of a main circuit board or an integrated circuit for controlling an application of various principles of the present disclosure for an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of the image-guided intervention by an augmented reality device as exemplary described in the present disclosure. The structural configuration of the controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
[0045] For purposes of describing and claiming the present disclosure, the term “application module” broadly encompasses an application incorporated within or accessible by a controller consisting of an electronic circuit (e.g., electronic components and/or hardware) and/or an executable program (e.g., executable software stored on non-transitory computer readable medium(s) and/or firmware) for executing a specific application associated with implementing an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence in a rendered augmented view of the image-guided intervention by the augmented reality device.
[0046] In practice, as exemplary described in the present disclosure, visual image sequence controller 40 accesses an intervention file 41 informative of geometric information of imaging modality 20 associated with an acquisition of 2D/3D interventional images to thereby control an augmentation of a visual imaging sequence acquisition of an interventional imaging sequence by imaging modality 20 in a rendered augmented view of the image-guided intervention by augmented reality device 30. The visual imaging sequence acquisition includes, for each interventional image of the interventional imaging sequence, an interactive virtual indicator of a sequence number and an imaging direction during an acquisition of the interventional image by imaging modality 20.
[0047] More particularly, augmented reality controller 32 facilitates a rendering of computer-generated content as an overlay on an augmented view of an image-guided intervention. When the pose of the viewer's eyes is known by augmented reality controller 32, the visual imaging sequence acquisition 42 can be rendered such that it appears to be part of the image-guided intervention. For this to happen, augmented reality device 30 and the subject anatomy need to be accurately tracked. This may be accomplished via a camera (e.g., visual or ToF) of augmented reality device 30 as known in the art of the present disclosure and/or a surgical navigation system as known in the art of the present disclosure. Additional detection of augmented reality device 30 and the subject anatomy may be appropriate for added accuracy, such as, for example, fiducial marker(s), active marker(s), etc.
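The disclosure does not prescribe a particular data representation for the visual imaging sequence acquisition. As a hypothetical sketch only, the per-image record described above (a sequence number paired with the imaging geometry recorded at acquisition time) might be modeled as follows; the names `VirtualIndicator` and `build_indicators` are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class VirtualIndicator:
    """One interactive virtual indicator of the visual imaging sequence acquisition."""
    sequence_number: int  # position of the image within the imaging sequence
    source: Vec3          # e.g., projection center on the patient
    detector: Vec3        # e.g., detector center during the acquisition


def build_indicators(geometry_log: List[Tuple[Vec3, Vec3]]) -> List[VirtualIndicator]:
    """Create one indicator per acquired image, numbered in acquisition order."""
    return [
        VirtualIndicator(sequence_number=i + 1, source=src, detector=det)
        for i, (src, det) in enumerate(geometry_log)
    ]
```

Each indicator would then be rendered by the augmented reality controller as a virtual object registered to the subject anatomy.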
[0048] Further in practice, visual image sequence controller 40 may interface with a display (not shown), as known in the art of the present disclosure or hereinafter conceived, for displaying an interventional image as commanded by a user-activation of an interactive visual indicator of the present disclosure. Visual image sequence controller 40 may interface with interventional imaging controller 22, as known in the art of the present disclosure or hereinafter conceived, for controlling a re-posing of interventional imaging device 21 as commanded by a user-activation of an interactive visual indicator of the present disclosure.
[0049] An augmented reality image-guided intervention method of the present disclosure involves, subsequent to or during an acquisition of an interventional imaging sequence by imaging modality 20, augmented reality controller 32 controlling an augmentation of an augmented view of the image-guided intervention via augmented reality display 31 of a visual imaging sequence acquisition 42 generated by visual image sequence controller 40 to include interactive virtual indicators of sequence numbers and imaging directions of the interventional imaging sequence.
[0050] For example,
[0051] Referring to
[0052] Still referring to
[0053] In practice, an interactive virtual indicator may have any form, shape and color to convey a corresponding sequence number and imaging direction. For this example, the interactive virtual indicators are formed as vectors of an arrow shape starting from the center of the projection on the patient and extending outward to the center of the detector position. Next to the end-point of the vector, the sequence number of the interventional image appears.
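The arrow geometry described above (a vector from the center of the projection on the patient to the center of the detector position, with the sequence number placed next to the end-point) can be sketched as follows; this is an illustrative computation under assumed 3D coordinates, not an implementation mandated by the disclosure:

```python
import numpy as np


def arrow_for_image(projection_center, detector_center, label_offset=0.02):
    """Compute the arrow for one interactive virtual indicator.

    Returns the arrow start, its unit direction, its length, and a point
    just beyond the arrow tip where the sequence-number label is placed.
    """
    start = np.asarray(projection_center, dtype=float)
    end = np.asarray(detector_center, dtype=float)
    direction = end - start
    length = float(np.linalg.norm(direction))
    if length == 0.0:
        raise ValueError("projection center and detector center coincide")
    unit = direction / length
    label_anchor = end + unit * label_offset  # label sits next to the end-point
    return start, unit, length, label_anchor
```

The `label_offset` (here in the same units as the coordinates) is an assumed rendering parameter; any form, shape, and color conveying the sequence number and imaging direction would serve equally.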
[0054] An interactive virtual indicator may be any virtual indicator with which a user, such as a clinician, is able to interact, allowing a (two-way) communication or flow of information between a user and one or more computers, one or more processors, or any device embedding or connected to such computer(s) or processor(s). A clinician may interact with an interactive virtual indicator via gestures, voice commands or other interactive modes of augmented reality display 31a as known in the art of the present disclosure or hereinafter conceived.
[0055] The interventional images of the interventional image sequence 23 are hidden by default, so as not to clutter augmented reality display 31a. Only a ‘selected’ interactive virtual indicator will show a pictorial image 33 corresponding to the ‘selected’ interactive virtual indicator within a display as known in the art of the present disclosure or hereinafter disclosed (e.g., a display of imaging modality 20, augmented reality display 31 or an additional display monitor).
[0056] While the interactive virtual indicators may correspond to interventional images acquired at different poses of C-arm 21a, the image-guided intervention may involve an acquisition of a temporal series of interventional images at the same pose of C-arm 21a, as will be further discussed in the present disclosure. For this embodiment, the interactive virtual indicator may be a vector showing the imaging direction of the C-arm 21a with an associated stack of sequence numbers.
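The stacking of sequence numbers on a single vector can be sketched as a grouping of acquisitions by (approximately) equal pose. The function name and the choice of rounded pose angles as a grouping key are illustrative assumptions:

```python
from collections import OrderedDict


def stack_by_pose(acquisitions, decimals=3):
    """Group sequence numbers of images acquired at (approximately) the same pose.

    `acquisitions` is a list of (sequence_number, pose) pairs, where `pose` is a
    tuple of modality parameters (e.g., rotation and angulation angles).  Poses
    are rounded so that repeated acquisitions at one pose share a single vector
    carrying a stack of sequence numbers.
    """
    groups = OrderedDict()
    for seq, pose in acquisitions:
        key = tuple(round(p, decimals) for p in pose)
        groups.setdefault(key, []).append(seq)
    return groups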
[0057] To facilitate an understanding of the present disclosure, the following description of
[0058] The first exemplary image-guided intervention involves a diagnostic X-ray imaging phase as shown in
[0059] More particularly,
[0060] Additionally, if a clinician wishes to return the C-arm to a previous pose for purposes of reacquiring a better interventional image at that pose, the clinician may tap the correct interactive virtual indicator or a virtual object C-arm (not shown), and an interventional imaging controller will move the C-arm to the recalled pose. For example, while C-arm 21a is in the pose of
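The pose-recall interaction described above can be sketched as a lookup from an indicator's sequence number to the modality pose recorded at acquisition time, followed by a command to the interventional imaging controller. The class name, the `move_to` method on the controller, and the pose representation are illustrative assumptions:

```python
class PoseRecall:
    """Replay a recorded modality pose when its virtual indicator is tapped."""

    def __init__(self):
        self._poses = {}  # sequence number -> pose recorded at acquisition

    def record(self, sequence_number, pose):
        """Store the modality pose used to acquire one interventional image."""
        self._poses[sequence_number] = pose

    def on_indicator_tapped(self, sequence_number, imaging_controller):
        """Handle a tap on an indicator: re-pose the C-arm (or probe).

        Returns the recalled pose, or None if the indicator is unknown.
        """
        pose = self._poses.get(sequence_number)
        if pose is not None:
            imaging_controller.move_to(pose)  # command the recalled pose
        return pose
```

The same sketch applies unchanged to the robotic ultrasound probe embodiment discussed later, with the controller re-posing the probe instead of the C-arm.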
[0062] The second exemplary image-guided intervention involves a diagnostic ultrasound imaging phase as shown in
[0063] More particularly,
[0064] Additionally, if a clinician wishes to return the robotic ultrasound probe to a previous pose for purposes of reacquiring a better interventional image at that pose, the clinician may tap the correct interactive virtual indicator or a virtual object robotic ultrasound probe (not shown), and an interventional imaging controller will move the robotic ultrasound probe to the recalled pose. For example, while robotic ultrasound probe 21b is in the pose of
[0066] To facilitate a further understanding of the various inventions of the present disclosure, the following description of
[0067] Referring to
[0068] Referring to
[0069] Referring to
[0070] To facilitate a further understanding of the various inventions of the present disclosure, the following description of
[0071] Referring to
[0072] Each processor 141 may be any hardware device, as known in the art of the present disclosure or hereinafter conceived, capable of executing instructions stored in memory 142 or storage or otherwise processing data. In a non-limiting example, the processor(s) 141 may include a microprocessor, field programmable gate array (FPGA), application-specific integrated circuit (ASIC), or other similar devices.
[0073] The memory 142 may include various memories, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, L1, L2, or L3 cache or system memory. In a non-limiting example, the memory 142 may include static random access memory (SRAM), dynamic RAM (DRAM), flash memory, read only memory (ROM), or other similar memory devices.
[0074] The user interface 143 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with a user such as an administrator. In a non-limiting example, the user interface may include a command line interface or graphical user interface that may be presented to a remote terminal via the network interface 144.
[0075] The network interface 144 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with other hardware devices. In a non-limiting example, the network interface 144 may include a network interface card (NIC) configured to communicate according to the Ethernet protocol. Additionally, the network interface 144 may implement a TCP/IP stack for communication according to the TCP/IP protocols. Various alternative or additional hardware or configurations for the network interface 144 will be apparent.
[0076] The storage 145 may include one or more machine-readable storage media, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, or similar storage media. In various non-limiting embodiments, the storage 145 may store instructions for execution by the processor(s) 141 or data upon which the processor(s) 141 may operate. For example, the storage 145 may store a base operating system for controlling various basic operations of the hardware. The storage 145 also stores application modules 147 in the form of executable software/firmware for implementing the various functions of the visual image sequence controller 140 as previously described in the present disclosure including, but not limited to, a virtual object generator 148 and a virtual object augmenter 149 for executing a flowchart 240 representative of a visual imaging sequencing method of the present disclosure as shown in
[0077] Referring to
[0078] Referring to
[0079] Further, as one having ordinary skill in the art will appreciate in view of the teachings provided herein, structures, elements, components, etc. described in the present disclosure/specification and/or depicted in the Figures may be implemented in various combinations of hardware and software, and provide functions which may be combined in a single element or multiple elements. For example, the functions of the various structures, elements, components, etc. shown/illustrated/depicted in the Figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software for added functionality. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared and/or multiplexed. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, memory (e.g., read only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.) and virtually any means and/or machine (including hardware, software, firmware, combinations thereof, etc.) which is capable of (and/or configurable) to perform and/or control a process.
[0080] Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (e.g., any elements developed that can perform the same or substantially similar function, regardless of structure). Thus, for example, it will be appreciated by one having ordinary skill in the art in view of the teachings provided herein that any block diagrams presented herein can represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, one having ordinary skill in the art should appreciate in view of the teachings provided herein that any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.
[0081] The terms “signal”, “data” and “command” as used in the present disclosure broadly encompasses all forms of a detectable physical quantity or impulse (e.g., voltage, current, or magnetic field strength) as understood in the art of the present disclosure and as exemplary described in the present disclosure for transmitting information and/or instructions in support of applying various inventive principles of the present disclosure as subsequently described in the present disclosure. Signal/data/command communication between various components of the present disclosure may involve any communication method as known in the art of the present disclosure including, but not limited to, signal/data/command transmission/reception over any type of wired or wireless datalink and a reading of signal/data/commands uploaded to a computer-usable/computer readable storage medium.
[0082] It will be appreciated that the term “comprising” does not exclude other elements or steps and that the indefinite article “a” or “an” does not exclude a plurality. A single processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to an advantage. Any reference signs in the claims should not be construed as limiting the scope of the claims.
[0083] Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combinations of features disclosed herein either explicitly or implicitly or any generalisation thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of features during the prosecution of the present application or of any further application derived therefrom.