Patent classifications
A61B8/469
Systems and methods for automatic detection and visualization of turbulent blood flow using vector flow data
A system for visualization and quantification of ultrasound imaging data according to embodiments of the present disclosure may include a display unit, and a processor communicatively coupled to the display unit and to an ultrasound imaging apparatus for generating an image from ultrasound data representative of a bodily structure and fluid flowing within the bodily structure. The processor may be configured to estimate axial and lateral velocity components of the fluid flowing within the bodily structure, determine a plurality of flow directions within the image based on the axial and lateral velocity components, differentially encode the flow directions based on flow direction angle to generate a flow direction map, and cause the display unit to concurrently display the image including the bodily structure overlaid with the flow direction map.
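The core of the abstract — converting axial and lateral velocity components into differentially encoded flow directions — can be sketched as follows. This is an illustrative reading, not the patented implementation; the function name, the use of `arctan2`, and the `n_bins` parameter are assumptions, and the abstract does not specify a particular encoding scheme.

```python
import numpy as np

def flow_direction_map(v_axial, v_lateral, n_bins=8):
    """Differentially encode per-pixel flow direction by angle.

    v_axial, v_lateral: 2-D arrays of velocity components (same shape).
    Returns the per-pixel flow direction angle in radians and a bin
    index map that a display stage could map to distinct overlay colors.
    Sketch only; the encoding scheme is an assumption.
    """
    angle = np.arctan2(v_axial, v_lateral)                  # [-pi, pi]
    bins = ((angle + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    return angle, bins
```

A display stage could then alpha-blend a color per bin over the B-mode image, matching the abstract's "image ... overlaid with the flow direction map".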
SYSTEMS AND METHODS FOR FINDING HEPATO-RENAL INDICES
Systems and methods for calculating hepato-renal index (HRI) values from radiofrequency (RF) data are disclosed herein. The RF data may include fundamental frequency components, harmonic frequency components, or a combination thereof. Signal intensities within regions of interest may be calculated from the RF data. The signal intensities may be averaged to arrive at an average signal intensity value for each region of interest. In some examples, some of the highest and/or lowest signal intensity values may be removed prior to averaging. The ratio of the average signal intensities from the different regions of interest may then be taken to arrive at the HRI values.
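The averaging-and-ratio procedure in the abstract amounts to a trimmed mean per region of interest followed by a ratio. A minimal sketch, assuming the two ROIs are liver and kidney and that `trim_frac` (the fraction of extreme values to discard on each side) is a free parameter the abstract leaves unspecified:

```python
import numpy as np

def hri(liver_intensities, kidney_intensities, trim_frac=0.1):
    """Hepato-renal index from per-ROI signal intensities.

    Sorts each ROI's intensities, optionally discards the highest and
    lowest trim_frac fraction (the abstract mentions removing extreme
    values before averaging), averages the remainder, and returns the
    ratio of the liver average to the kidney average.
    """
    def trimmed_mean(values):
        x = np.sort(np.asarray(values, dtype=float))
        k = int(len(x) * trim_frac)            # samples to drop per side
        return x[k:len(x) - k].mean() if k else x.mean()
    return trimmed_mean(liver_intensities) / trimmed_mean(kidney_intensities)
```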
Ultrasound Detection System
Disclosed herein is an ultrasound imaging system configured to guide medical device insertion. The ultrasound imaging system includes an ultrasound probe having an ultrasound generation device configured to detect one or more anatomical targets within a target area, and one or more projectors configured to project one or more icons within the target area. The ultrasound imaging system can also include a console configured to generate the one or more icons. The console can be coupled to the ultrasound probe, and be in communication with each of the ultrasound generation device and the one or more projectors.
MEDICAL IMAGE PROCESSING DEVICE, ULTRASONIC DIAGNOSTIC DEVICE, AND STORAGE MEDIUM
A medical image processing device of an embodiment has processing circuitry. The processing circuitry performs a position transformation to represent, in a plane of the same scale, both a position in a region of interest set in a captured mammography image of a breast of a subject and an examination position on the body of the subject in an ultrasound examination, determines whether or not the position in the region of interest and the examination position match in the plane, and saves or outputs an ultrasonic image corresponding to the examination position upon determining that they match.
Relative backscatter coefficient in medical diagnostic ultrasound
In backscatter coefficient imaging, the backscatter coefficient of one region of interest relative to another region of interest is used to avoid calibration. The system effects are removed by using a frequency-dependent measure of the backscatter. The relative frequency-dependent backscatter coefficient is determined by an ultrasound scanner.
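The calibration-free idea can be illustrated numerically: the system's frequency response multiplies the spectra of both ROIs identically, so it cancels when their averaged power spectra are divided. A sketch under the assumption that each ROI is supplied as an array of RF lines (one line per row); the function name is hypothetical:

```python
import numpy as np

def relative_backscatter(rf_roi_a, rf_roi_b):
    """Frequency-dependent relative backscatter between two ROIs.

    Averages the power spectra of the RF lines within each ROI and
    takes their per-frequency-bin ratio. The system-dependent transfer
    function is common to both ROIs and cancels in the ratio, so no
    calibration phantom is needed.
    """
    def mean_power_spectrum(rf):
        spectra = np.abs(np.fft.rfft(rf, axis=-1)) ** 2
        return spectra.mean(axis=0)            # average over RF lines
    return mean_power_spectrum(rf_roi_a) / mean_power_spectrum(rf_roi_b)
```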
Information processing apparatus, program for operating information processing apparatus, method for operating information processing apparatus, and mammography apparatus
A control device of a mammography apparatus includes an acquisition unit that acquires a radiographic image of the breast as radiography information when the radiographic image is captured, and a generation condition setting unit that sets generation conditions for generating an ultrasound image of the breast on the basis of the radiographic image acquired by the acquisition unit. The generation condition setting unit analyzes the radiographic image to detect the amount of mammary glands in the breast and sets, as the generation conditions, an amplification factor of the ultrasound image signal and a dynamic range, which is the width of grayscale values of the ultrasound image assigned to values of the ultrasound image signal, according to the detected amount of mammary glands.
COMPUTER-READABLE RECORDING MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, AND METHOD FOR PROCESSING INFORMATION
There is disclosed a non-transitory computer-readable recording medium having stored therein an information processing program for causing a computer to execute a process. The process includes: specifying a first region, a second region and a third region from a nondestructive inspection image, the first region corresponding to a recess-free shape, the second region corresponding to a detection target included in the recess-free shape, the third region corresponding to a reference object included in the recess-free shape; specifying a first straight line that divides the first region into two and that passes through the third region; obtaining two intersections of the first straight line and an outer circumference of the recess-free shape; specifying a second straight line that passes through a center point of the two intersections and that is orthogonal to the first straight line; and outputting information indicative of a position of the detection target in the nondestructive inspection image, using a coordinate system using the first straight line and the second straight line as axes.
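The geometric construction in the abstract (two intersections of the first line with the outer circumference, their midpoint as origin, an orthogonal second line as the other axis) can be sketched concretely. This is a simplified reading that assumes the outer circumference is a circle of known center and radius; the function name and argument layout are hypothetical:

```python
import numpy as np

def target_coordinates(center, radius, line_point, line_dir, target):
    """Position of a detection target in the constructed coordinate system.

    Finds the two intersections of the first straight line (through
    line_point, e.g. the reference object, with direction line_dir)
    with a circular outer circumference, takes their midpoint as the
    origin, forms the orthogonal second line, and returns the target's
    coordinates on those two axes.
    """
    c = np.asarray(center, float)
    p = np.asarray(line_point, float)
    d = np.asarray(line_dir, float)
    d = d / np.linalg.norm(d)
    # Solve |p + t*d - c|^2 = r^2 for the two intersection parameters t.
    f = p - c
    b = f @ d
    disc = b * b - (f @ f - radius ** 2)
    t1, t2 = -b + np.sqrt(disc), -b - np.sqrt(disc)
    origin = p + 0.5 * (t1 + t2) * d           # midpoint of the intersections
    v = np.array([-d[1], d[0]])                # second line, orthogonal to the first
    rel = np.asarray(target, float) - origin
    return float(rel @ d), float(rel @ v)
```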
Medical image processing device, endoscope system, medical image processing method, and program
There are provided a medical image processing device, an endoscope system, a medical image processing method, and a program which detect an optimal lesion region according to an in-vivo position of a captured image. Images at a plurality of in-vivo positions of a subject are acquired from medical equipment that sequentially captures and displays in real time the images; positional information indicating the in-vivo position of the acquired image is acquired; from among a plurality of region-of-interest detection units that detect a region of interest from an input image and correspond to the plurality of in-vivo positions, respectively, a region-of-interest detection unit corresponding to the position indicated by the positional information is selected; and the selected region-of-interest detection unit detects a region of interest from the acquired image.
Automatically identifying anatomical structures in medical images in a manner that is sensitive to the particular view in which each image is captured
A facility for processing a medical imaging image is described. The facility applies to the image a first machine learning model trained to recognize a view to which an image corresponds, and a second machine learning model trained to identify any of a set of anatomical features visualized in an image. The facility accesses a list of permitted anatomical features for images corresponding to the recognized view, and filters the identified anatomical features to exclude any not on the accessed list. The facility causes the accessed image to be displayed, overlaid with a visual indication of each of the filtered identified anatomical features.
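The two-stage pipeline described above reduces to a view classification, a feature detection, and an allow-list filter. A minimal sketch, where `view_model` and `feature_model` are assumed to be callables standing in for the two trained models, and `permitted` maps each view name to its set of allowed features:

```python
def filter_features_by_view(view_model, feature_model, image, permitted):
    """Detect anatomical features, then keep only those permitted for
    the recognized view.

    view_model(image) -> view name; feature_model(image) -> list of
    feature names; permitted: dict mapping view name to the set of
    features plausible in that view. All names here are illustrative.
    """
    view = view_model(image)
    features = feature_model(image)
    allowed = permitted.get(view, set())
    return view, [f for f in features if f in allowed]
```

The filtered list is what the display stage would overlay, so a feature implausible for the recognized view (e.g. one belonging to a different scan plane) is never shown.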
APPARATUS FOR CORRECTING POSTURE OF ULTRASOUND SCANNER FOR ARTIFICIAL INTELLIGENCE-TYPE ULTRASOUND SELF-DIAGNOSIS USING AUGMENTED REALITY GLASSES, AND REMOTE MEDICAL DIAGNOSIS METHOD USING SAME
An apparatus for correcting a posture of an ultrasound scanner for artificial intelligence-type ultrasound self-diagnosis includes an ultrasound scanner including an ultrasound probe configured to acquire and transmit an ultrasound image of a patient; a mapper configured to acquire a body map of the patient in which a plurality of virtual organs of interest is arranged on a body image; a scanner navigator configured to calculate current position coordinates of the ultrasound scanner on the body map and the ultrasound image; augmented reality glasses configured to display the ultrasound image and a virtual object image; and a processor configured to determine whether the patient has a disease, and a risk degree of the disease, based on the result of a deep learning neural network that is trained on ultrasound training images and provided with the ultrasound image.