Patent classifications
G06T2207/30104
CLASSIFICATION DISPLAY METHOD OF ULTRASOUND DATA AND ULTRASOUND IMAGING SYSTEM
Disclosed are a method for displaying ultrasonic data and an ultrasound imaging system. The method may include: acquiring ultrasonic video data to be displayed; obtaining at least one representative frame from the ultrasonic video data; classifying the representative frame to obtain a category of the representative frame, and determining a category of the ultrasonic video data according to the category of the representative frame; and displaying the ultrasonic video data by category according to the category of the ultrasonic video data.
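The abstract leaves the per-frame classifier and the frame-to-video aggregation rule open. A minimal sketch, assuming the classifier is an external black box and that the video's category is taken by majority vote over its representative frames (one plausible reading, not the claimed method):

```python
from collections import Counter

def classify_video(frames, classify_frame):
    """Assign a category to an ultrasound video from its representative frames.

    `classify_frame` is a hypothetical stand-in for any per-frame classifier;
    the video category is the most common frame category (majority vote).
    """
    categories = [classify_frame(f) for f in frames]
    return Counter(categories).most_common(1)[0][0]

def display_by_category(videos, classify_frame):
    """Group videos under their inferred categories for display."""
    grouped = {}
    for name, frames in videos.items():
        grouped.setdefault(classify_video(frames, classify_frame), []).append(name)
    return grouped
```

Any tie-breaking or confidence-weighted aggregation would replace the simple majority vote here.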
AUTOMATICALLY DETECTING AND QUANTIFYING ANATOMICAL STRUCTURES IN AN ULTRASOUND IMAGE USING A CUSTOMIZED SHAPE PRIOR
A facility for detecting a target structure is described. The facility receives an ultrasound image. It subjects the ultrasound image to a detection model to obtain, for each of one or more occurrences of a target structure appearing in the ultrasound image, a set of parameter values fitting a distinguished shape to the target structure occurrence. The facility stores the obtained one or more parameter value sets in connection with the ultrasound image.
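The detection model itself is a black box in the abstract; what it outputs is a set of parameter values fitting a "distinguished shape" to each structure occurrence. As an illustrative sketch only, a circle can stand in for the distinguished shape, with its parameter set (cx, cy, r) fitted by a least-squares (Kåsa) fit to boundary points of one detected occurrence:

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) fit of a circle to 2-D boundary points.

    Rewrites (x-cx)^2 + (y-cy)^2 = r^2 as the linear system
    2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2) = x^2 + y^2
    and solves it for the parameter set (cx, cy, r).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x * x + y * y
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = p[0] / 2.0, p[1] / 2.0
    r = np.sqrt(p[2] + cx * cx + cy * cy)
    return cx, cy, r
```

The stored parameter value set would then be `(cx, cy, r)` for each occurrence; an ellipse or other customized shape prior would simply enlarge the parameter vector.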
MEDICAL IMAGE PROCESSING APPARATUS AND MEDICAL IMAGE PROCESSING METHOD
There is provided a medical image processing apparatus which includes a first extraction unit configured to extract coronary arteries depicted in images of a plurality of time phases relating to the heart, and to extract at least one stenosed part depicted in each coronary artery; a calculation unit configured to calculate a pressure gradient of each of the extracted coronary arteries, based on tissue blood flow volumes of the coronary arteries; a second extraction unit configured to extract an ischemic region depicted in the images; and a specifying unit configured to specify a responsible blood vessel of the ischemic region by referring, for the extracted ischemic region, to a dominance map in which each of the extracted coronary arteries is associated with a dominance territory, and to specify a responsible stenosis, based on the pressure gradient corresponding to a stenosed part in the specified responsible blood vessel.
Method and system for image processing to determine blood flow
Systems and methods are disclosed for evaluating cardiovascular treatment options for a patient. One method includes creating a three-dimensional model representing a portion of the patient's heart based on patient-specific data regarding a geometry of the patient's heart or vasculature; and for a plurality of treatment options for the patient's heart or vasculature, modifying at least one of the three-dimensional model and a reduced order model based on the three-dimensional model. The method also includes determining, for each of the plurality of treatment options, a value of a blood flow characteristic, by solving at least one of the modified three-dimensional model and the modified reduced order model; and identifying one of the plurality of treatment options that solves a function of at least one of: the determined blood flow characteristics of the patient's heart or vasculature, and one or more costs of each of the plurality of treatment options.
Device and method for cancer detection, diagnosis and treatment guidance using active thermal imaging
The present invention discloses means and methods for detecting cellular irregularities throughout otherwise healthy tissue. The method generally relates to cancer detection, diagnosis and treatment, and more specifically pertains to detection, diagnosis and treatment guidance of cancerous or precancerous conditions through the use of thermal imaging technology and analysis.
Method and system for image processing to determine blood flow
Embodiments include a system for determining cardiovascular information for a patient. The system may include at least one computer system configured to receive patient-specific data regarding a geometry of the patient's heart, and create a three-dimensional model representing at least a portion of the patient's heart based on the patient-specific data. The at least one computer system may be further configured to create a physics-based model relating to a blood flow characteristic of the patient's heart and determine a fractional flow reserve within the patient's heart based on the three-dimensional model and the physics-based model.
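In the claimed system, the pressures come from solving the physics-based blood-flow model on the patient-specific 3-D geometry. As background only, a sketch of the final quantity: fractional flow reserve is conventionally the ratio of mean pressure distal to a stenosis to mean aortic pressure under hyperemia (the function name and plain-number inputs here are illustrative assumptions, not the patented computation):

```python
def fractional_flow_reserve(p_distal, p_aortic):
    """FFR = mean distal pressure / mean aortic pressure (both in mmHg).

    In the described system both pressures would be outputs of the solved
    physics-based model; here they are given directly for illustration.
    """
    return p_distal / p_aortic
```

An FFR below roughly 0.80 is the conventional clinical threshold for a hemodynamically significant stenosis.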
ACCOUNTING FOR ERRORS IN OPTICAL MEASUREMENTS
Apparatus and methods are described including preparing a blood sample for analysis by depositing the blood sample within a sample chamber (52), and placing the sample chamber, with the blood sample deposited therein, within a microscopy unit (24). One or more microscopic images of the sample chamber (52) with the blood sample deposited therein are acquired, using a microscope of the microscopy unit. Based upon the one or more images, an amount of one or more cell types within the sample chamber that had already settled within the sample chamber prior to acquisition of the one or more microscopic images, is determined. A characteristic of the sample is determined, at least partially in response thereto. Other applications are also described.
Systems and methods for video-based patient monitoring during surgery
The present invention relates to the field of medical monitoring, and in particular non-contact monitoring of one or more physiological parameters in a region of a patient during surgery. Systems, methods, and computer readable media are described for generating a pulsation field and/or a pulsation strength field of a region of interest (ROI) in a patient across a field of view of an image capture device, such as a video camera. The pulsation field and/or the pulsation strength field can be generated from changes in light intensities and/or colors of pixels in a video sequence captured by the image capture device. The pulsation field and/or the pulsation strength field can be combined with indocyanine green (ICG) information regarding ICG dye injected into the patient to identify sites where blood flow has decreased and/or ceased and that are at risk of hypoxia.
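The abstract says the pulsation strength field is generated from changes in pixel light intensities across the video sequence, without prescribing the exact statistic. A minimal sketch, assuming (as one simple choice) that per-pixel temporal standard deviation serves as the strength measure:

```python
import numpy as np

def pulsation_strength_field(frames):
    """Per-pixel pulsation strength over a video sequence.

    `frames` is a (T, H, W) array of grayscale intensities. As a simple
    stand-in for the patent's analysis, strength is the temporal standard
    deviation of each pixel: regions whose brightness varies with the
    cardiac cycle score high, while static regions score near zero.
    """
    video = np.asarray(frames, dtype=float)
    return video.std(axis=0)
```

Regions where this field drops toward zero would be candidates for the decreased- or ceased-flow sites that the ICG information helps confirm.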
Super-resolution reconstruction preprocessing method and super-resolution reconstruction method for contrast-enhanced ultrasound images
A super-resolution reconstruction preprocessing method of contrast-enhanced ultrasound images includes: acquiring an image set to be preprocessed; acquiring a grayscale fluctuation signal of a pixel point in the registered contrast-enhanced ultrasound images to be preprocessed; performing a denoising reconstruction operation on the image set to be preprocessed to obtain a reconstructed feature parameter image, and performing interpolation calculation on the reconstructed feature parameter image to obtain a sparse microbubble image. By analyzing the grayscale fluctuation signals of the collocated pixel point set in the plurality of frames of the registered contrast-enhanced ultrasound images to be preprocessed, the signal-to-noise ratio and signal-to-background ratio are improved. By performing an interpolation operation on the reconstructed feature parameter image, spatial decoupling of overlapping microbubbles is realized, and the influence of strong noise and high microbubble concentration on the accuracy of super-resolution reconstruction is reduced.
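The two preprocessing stages above can be sketched under simplifying assumptions: the "reconstructed feature parameter image" is approximated here by the temporal standard deviation of each collocated pixel's grayscale fluctuation signal (flickering microbubbles fluctuate strongly, static tissue does not), and the interpolation step by a 2x bilinear upsampling. Both choices are illustrative stand-ins, not the patented operations:

```python
import numpy as np

def feature_parameter_image(frames):
    """Per-pixel temporal grayscale fluctuation across registered CEUS frames.

    `frames` is a (T, H, W) stack; each pixel's feature value is the
    standard deviation of its grayscale signal over time.
    """
    return np.asarray(frames, dtype=float).std(axis=0)

def upsample2x(img):
    """2x bilinear interpolation of a feature image onto a finer grid."""
    h, w = img.shape
    yi = np.linspace(0, h - 1, 2 * h)
    xi = np.linspace(0, w - 1, 2 * w)
    y0 = np.clip(np.floor(yi).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xi).astype(int), 0, w - 2)
    fy = (yi - y0)[:, None]
    fx = (xi - x0)[None, :]
    # Blend the four neighboring coarse-grid values for each fine-grid point.
    a = img[np.ix_(y0, x0)]
    b = img[np.ix_(y0, x0 + 1)]
    c = img[np.ix_(y0 + 1, x0)]
    d = img[np.ix_(y0 + 1, x0 + 1)]
    return (1 - fy) * (1 - fx) * a + (1 - fy) * fx * b \
        + fy * (1 - fx) * c + fy * fx * d
```

The finer grid is what allows overlapping microbubbles, unresolvable at the native pixel pitch, to be separated before localization.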
Machine learning systems for training encoder and decoder neural networks
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for jointly training an encoder neural network and a decoder neural network. In one aspect, a method comprises: updating current values of a set of encoder parameters and current values of a set of decoder parameters using gradients of a reconstruction loss function that measures an error in a reconstruction of multi-modal data from a training example, wherein: the reconstruction loss function comprises a plurality of scaling factors that each scale a respective term in the reconstruction loss function that measures an error in the reconstruction of a corresponding proper subset of feature dimensions of the multi-modal data from the training example.
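The claimed loss is a sum of per-subset reconstruction-error terms, each scaled by its own factor. A minimal sketch of that structure, assuming squared error as the per-term error measure and leaving the encoder and decoder networks external (the function name and mean-squared-error choice are assumptions for illustration):

```python
import numpy as np

def scaled_reconstruction_loss(x, x_hat, subsets, scales):
    """Reconstruction loss with per-subset scaling factors.

    `subsets` lists index sets, each a proper subset of the feature
    dimensions of the multi-modal data (e.g. one per modality); `scales`
    gives the scaling factor applied to the corresponding error term.
    """
    x, x_hat = np.asarray(x, dtype=float), np.asarray(x_hat, dtype=float)
    return sum(
        s * np.mean((x[idx] - x_hat[idx]) ** 2)
        for idx, s in zip(subsets, scales)
    )
```

Gradients of this scalar with respect to encoder and decoder parameters would then drive the joint update described in the claim; the scaling factors let poorly reconstructed modalities be up- or down-weighted.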