Patent classifications
A61N2005/1062
IRRADIATION PLANNING APPARATUS AND IRRADIATION PLAN CORRECTION METHOD
Provided is an irradiation planning apparatus including: a three-dimensional CT value data acquisition unit (36); a prescription data input processing unit (32) which acquires prescription data; a stopping power ratio conversion unit (37) and a nuclear reaction effective density conversion unit (38) which respectively generate first conversion data (41) and second conversion data (42) on the basis of the three-dimensional CT value data; and a calculation unit (33) which calculates a dose distribution on the basis of the prescription data, the first conversion data (41), and the second conversion data (42). The stopping power ratio conversion unit (37) and the nuclear reaction effective density conversion unit (38) perform correction processing on the data obtained from the three-dimensional CT value data, using a physical quantity indicative of the likelihood that the incident charged particle beam (3) produces spallation particles, before the dose distribution is determined. Thus, when calculating an in-body dose distribution in particle beam irradiation planning, the dose error introduced by the difference between the probability that incident particles initiate a nuclear reaction in the body and the corresponding probability in water is corrected simply and accurately.
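The CT-value conversion step can be illustrated with a minimal sketch. The piecewise-linear calibration table below (`HU_POINTS`, `SPR_POINTS`) uses placeholder values, not clinical data, and the function is only a generic HU-to-relative-stopping-power lookup, not the patented correction processing itself:

```python
import numpy as np

# Illustrative piecewise-linear CT-to-stopping-power-ratio calibration.
# Breakpoints and ratios are placeholder values, not clinical data.
HU_POINTS = np.array([-1000.0, 0.0, 1000.0, 3000.0])
SPR_POINTS = np.array([0.001, 1.0, 1.5, 2.5])

def ct_to_stopping_power_ratio(ct_values):
    """Map 3-D CT values (HU) to stopping-power ratios relative to water."""
    ct = np.asarray(ct_values, dtype=float)
    return np.interp(ct, HU_POINTS, SPR_POINTS)

# A tiny stand-in for a 3-D CT value volume (here 2x2 for brevity).
ct_volume = np.array([[-1000.0, 0.0], [500.0, 1000.0]])
spr = ct_to_stopping_power_ratio(ct_volume)
```

A second, analogous lookup table would give the nuclear reaction effective density; the dose engine then consumes both conversion volumes.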
POSITIONING APPARATUS AND METHOD OF POSITIONING
A positioning apparatus and a positioning method have a control element and function 40 that includes: a radiograph acquisition element 41 that acquires radiograph data detected by two radiography systems selected from a group consisting of a flat panel detector; a DRR (Digital Reconstructed Radiograph) generation element 42 that generates DRRs in two different directions by virtually performing fluoroscopic projection on the 3-dimensional CT data obtained through the network 17; a positioning element 43 that registers the CT to the X-ray fluoroscopic radiographs obtained from the two radiography systems; and a displacement distance calculation element 44 that calculates a displacement distance of the tabletop 31, based on the gap between the radiographs, for improved positioning. The positioning element 43 has a multidimensional optimization element 45 and a 1-dimensional optimization element 46 that optimize parameters for rotation and translation of the fluoroscopic projection so as to maximize an evaluation function that evaluates the degree of matching between the DRR and the X-ray fluoroscopic radiograph.
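The optimization performed by the positioning element can be sketched generically. The code below runs an exhaustive search over integer 2-D translations that maximizes normalized cross-correlation between a DRR and a fluoroscopic image; the patent additionally optimizes rotations and uses its own evaluation function, so `ncc` and the grid search are illustrative choices only:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation as a matching-degree evaluation function."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def register_translation(drr, xray, max_shift=3):
    """Coarse grid search over integer 2-D translations maximizing NCC."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(drr, dy, axis=0), dx, axis=1)
            score = ncc(shifted, xray)
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best, best_score

rng = np.random.default_rng(0)
drr = rng.random((32, 32))
xray = np.roll(drr, 2, axis=1)  # simulated 2-pixel lateral displacement
shift, score = register_translation(drr, xray)
```

The recovered shift plays the role of the tabletop displacement distance; a real system would refine it with subpixel and rotational optimization.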
OFFLINE ANGLE SELECTION IN ROTATIONAL IMAGING AND TRACKING SYSTEMS
A method of operating an imaging and tracking system. The method includes determining, for each angle of a plurality of angles from which tracking images can be generated by an imaging device, a value of a tracking quality metric for tracking a target, based on an analysis of a projection generated at that angle. The method also includes selecting, by a processing device, a subset of the plurality of angles whose tracking quality metric values satisfy a tracking quality metric criterion, one or more angles of the subset being used to generate a tracking image of the target during a treatment stage, wherein the subset comprises at least a first angle and a second angle that is separated from the first angle by at least a minimum threshold.
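A minimal sketch of the selection step, assuming the metric values are already computed per angle. The greedy best-first strategy and the name `select_tracking_angles` are illustrative assumptions, not taken from the patent:

```python
def select_tracking_angles(metric_by_angle, threshold, min_separation):
    """Pick angles whose tracking-quality metric meets the criterion,
    greedily enforcing a minimum angular separation between picks."""
    # Candidates satisfying the criterion, best metric first.
    candidates = sorted(
        (angle for angle, m in metric_by_angle.items() if m >= threshold),
        key=lambda a: -metric_by_angle[a],
    )
    selected = []
    for angle in candidates:
        if all(abs(angle - s) >= min_separation for s in selected):
            selected.append(angle)
    return sorted(selected)

# Illustrative per-angle metric values (degrees -> quality).
metrics = {0: 0.9, 10: 0.85, 45: 0.7, 90: 0.95, 135: 0.4}
angles = select_tracking_angles(metrics, threshold=0.6, min_separation=30)
```

Here 10° is rejected for being within 30° of the already-selected 0°, and 135° fails the quality criterion.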
Subject positioning systems and methods
Subject positioning systems and methods are provided. A method may include obtaining first information of at least part of a subject when the subject is located at a preset position, and determining, based on the first information, a first position of each of one or more feature points located on the at least part of the subject. The method may include obtaining, using an imaging device, second information of the at least part of the subject when the subject is located at a candidate position. The method may further include determining, based on the second information, a second position of each of the one or more feature points, a first distance between the first position and the second position for each feature point of the one or more feature points, and a target position of the subject based at least in part on the one or more first distances.
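The distance computation can be sketched as follows. The rule of choosing the candidate position with the smallest mean feature-point distance is an illustrative assumption, since the abstract only says the target position is determined "based at least in part" on the first distances:

```python
import numpy as np

def positioning_distances(first_positions, second_positions):
    """Per-feature-point Euclidean distances between preset and candidate poses."""
    first = np.asarray(first_positions, dtype=float)
    second = np.asarray(second_positions, dtype=float)
    return np.linalg.norm(first - second, axis=1)

def pick_target_position(first_positions, candidates):
    """Choose the candidate pose whose mean feature-point distance is smallest."""
    means = [positioning_distances(first_positions, c).mean() for c in candidates]
    return int(np.argmin(means))

# Two feature points in 3-D at the preset position, and two candidate poses.
preset = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
cand_a = [[0.5, 0.0, 0.0], [1.5, 0.0, 0.0]]  # offset by 0.5
cand_b = [[0.1, 0.0, 0.0], [1.1, 0.0, 0.0]]  # offset by 0.1
best = pick_target_position(preset, [cand_a, cand_b])
```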
Compact proton therapy systems and methods
A proton treatment system including a proton accelerator structured to generate a proton beam, a plurality of beamline pathways configured to direct the proton beam from the proton accelerator to a corresponding plurality of treatment rooms, a rotatable bending magnet located between the proton accelerator and the plurality of treatment rooms, the rotatable bending magnet being structured to selectively rotate to direct the beam to any of the multiple treatment rooms, and an upright patient positioning mechanism disposed in each of the treatment rooms, the upright patient positioning mechanism being structured to support a patient within a particular treatment room and to rotate the patient between a fixed imaging source and an imaging panel.
Determination of Dynamic DRRs
A computer-implemented method for determining a two-dimensional DRR referred to as dynamic DRR based on a 4D-CT, the 4D-CT describing a sequence of three-dimensional medical computer tomographic images of an anatomical body part of a patient, the images being referred to as sequence CTs, the 4D-CT representing the anatomical body part at different points in time, the anatomical body part comprising at least one primary anatomical element and secondary anatomical elements, the computer-implemented method comprising the following steps: acquiring the 4D-CT; acquiring a planning CT, the planning CT being a three-dimensional image used for planning of a treatment of the patient, the planning CT being acquired based on at least one of the sequence CTs or independently from the 4D-CT; acquiring a three-dimensional image, referred to as undynamic CT, from the 4D-CT, the undynamic CT comprising at least one first image element representing the at least one primary anatomical element and second image elements representing the secondary anatomical elements; acquiring at least one trajectory, referred to as primary trajectory, based on the 4D-CT, the at least one primary trajectory describing a path of the at least one first image element as a function of time; acquiring trajectories of the second image elements, referred to as secondary trajectories, based on the 4D-CT; for the image elements of the undynamic CT, determining trajectory similarity values based on the at least one primary trajectory and the secondary trajectories, the trajectory similarity values respectively describing a measure of similarity between a respective one of the secondary trajectories and the at least one primary trajectory; and determining the dynamic DRR by using the determined trajectory similarity values, and, in case the planning CT is acquired independently from the 4D-CT, further using a transformation referred to as planning transformation from the undynamic CT to the planning CT, at least a part of image values of image elements of the dynamic DRR being determined by using the trajectory similarity values.
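A trajectory similarity value could, for example, be the correlation between a secondary trajectory and the primary trajectory across the 4D-CT phases. Pearson correlation is an illustrative choice here, not the similarity measure specified by the patent:

```python
import numpy as np

def trajectory_similarity(primary, secondary):
    """Similarity between a secondary trajectory and the primary trajectory,
    here the Pearson correlation of their displacement time series."""
    p = np.asarray(primary, dtype=float).ravel()
    s = np.asarray(secondary, dtype=float).ravel()
    p = p - p.mean()
    s = s - s.mean()
    denom = np.sqrt((p * p).sum() * (s * s).sum())
    return float((p * s).sum() / denom) if denom > 0 else 0.0

# Synthetic 1-D displacement trajectories over 50 breathing phases.
t = np.linspace(0, 2 * np.pi, 50)
primary = np.sin(t)                 # motion of the primary element (e.g. target)
in_phase = 0.5 * np.sin(t)          # secondary element moving with the target
out_of_phase = np.sin(t + np.pi)    # secondary element moving oppositely
```

Image elements whose trajectories score high would then contribute to the dynamic DRR, while dissimilarly moving elements are suppressed.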
Offline angle selection in rotational imaging and tracking systems
A processing device determines a plurality of angles from which tracking images can be generated by an imaging device. The processing device generates a plurality of projections of a treatment planning image of a patient, the treatment planning image comprising a delineated target, wherein each projection of the plurality of projections has an angle that corresponds to one of the plurality of angles from which the tracking images can be taken. The processing device determines, for each angle of the plurality of angles, a value of a tracking quality metric for tracking the target based on an analysis of a projection generated at that angle. The processing device selects a subset of the plurality of angles that have a tracking quality metric value that satisfies a tracking quality metric criterion.
Online angle selection in rotational imaging and tracking systems
A method of operating a radiation apparatus is described that selects at least a first angle and a second angle from a set of angles for a first rotation of the gantry. The method generates, using an imaging device mounted to the gantry, a first tracking image of the target from the first angle during the first rotation of the gantry. The method generates, using the imaging device, a second tracking image of the target from the second angle during the first rotation of the gantry. The method performs target tracking based on the first tracking image and the second tracking image.
Object tracking device
An object tracking device includes a superimposed image creation unit configured to create a plurality of superimposed images in which each of a plurality of non-tracking object images which do not include a tracking object image feature is superimposed on a tracking object section image which includes a tracking object image feature; a discriminator creation unit configured to learn at least one of an image feature and position information of the tracking object, based on the plurality of superimposed images to create a discriminator; and a tracking object specifying unit configured to specify at least one of the image feature and the position information of the tracking object in a tracked image including the respective image features of the tracking object and an obstacle, based on the discriminator and the tracked image.
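The superimposed-image creation step can be sketched as a simple alpha blend of a tracking-object section image over each non-tracking background. The blending rule and the `alpha` parameter are illustrative assumptions; the patent only requires that each non-tracking image be combined with the tracking-object section image to form training data for the discriminator:

```python
import numpy as np

def create_superimposed_images(tracking_patch, non_tracking_images, alpha=0.5):
    """Blend the tracking-object patch over each non-tracking background,
    yielding one superimposed training image per background."""
    patch = np.asarray(tracking_patch, dtype=float)
    out = []
    for bg in non_tracking_images:
        bg = np.asarray(bg, dtype=float)
        assert bg.shape == patch.shape, "patch and background must align"
        out.append(alpha * patch + (1.0 - alpha) * bg)
    return out

rng = np.random.default_rng(1)
patch = np.ones((8, 8))                       # stand-in tracking-object section image
backgrounds = [rng.random((8, 8)) for _ in range(3)]  # obstacle-only images
training_set = create_superimposed_images(patch, backgrounds)
```

The resulting set pairs the same object appearance with varied obstacle backgrounds, which is what lets the learned discriminator separate the object's image feature from occluders.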