Patent classifications
A61B8/468
SYSTEMS AND METHODS FOR CONTEXTUAL IMAGING WORKFLOW
A hierarchical workflow is configured to associate examination information captured using an imaging platform with contextual metadata. The examination information may include ultrasound image data, which may be associated with annotations, measurements, pathology, body markers, and/or the like. The hierarchical workflow may comprise templates associated with respective anatomical regions, locations, volumes, and/or surfaces. A template may define configuration data to automatically adapt the imaging platform to capture imaging data in the corresponding anatomical region. The template may further include guidance information for the operator, including processing steps for capturing relevant examination information. Additional examination information may be captured and included in the hierarchical workflow.
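The abstract above describes templates that map anatomical regions to acquisition configuration and operator guidance. A minimal sketch of that idea follows; every name, setting, and guidance step here is an illustrative assumption, not taken from the patent.

```python
# Illustrative templates: each anatomical region maps to acquisition settings
# that adapt the imaging platform, plus guidance steps for the operator.
TEMPLATES = {
    "carotid": {
        "config": {"depth_cm": 4.0, "frequency_mhz": 7.5, "mode": "B"},
        "guidance": [
            "Position the probe longitudinally over the vessel",
            "Capture the bifurcation",
            "Measure intima-media thickness",
        ],
    },
    "abdomen": {
        "config": {"depth_cm": 16.0, "frequency_mhz": 3.5, "mode": "B"},
        "guidance": ["Sweep the liver", "Capture the kidney long axis"],
    },
}

def configure_for_region(region):
    """Return the acquisition settings the platform would adopt for a region."""
    return TEMPLATES[region]["config"]
```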
IMAGE DISPLAY METHOD AND ULTRASOUND IMAGING SYSTEM
The present application provides an image display method and an ultrasound imaging system. The image display method includes performing a position adjustment operation on an initial volume view in an image display interface to obtain at least one reference volume view, storing position information of the at least one reference volume view, and creating at least one label to point to the position information, the at least one label being displayed in the image display interface.
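The mechanism above — storing position information for reference volume views and creating labels that point to it — can be sketched as a small store keyed by label. The field names and class names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ViewPosition:
    """Position information of a volume view (illustrative fields)."""
    rotation_deg: tuple  # rotation about the x, y, z axes
    zoom: float

class LabeledViews:
    """Holds reference-view positions and the labels that point to them."""

    def __init__(self, initial: ViewPosition):
        self.current = initial
        self._labels = {}

    def save_reference(self, label: str) -> None:
        # Store the current position and create a label pointing to it.
        self._labels[label] = self.current

    def recall(self, label: str) -> ViewPosition:
        # Selecting a displayed label restores the associated reference view.
        self.current = self._labels[label]
        return self.current
```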
ULTRASOUND DIAGNOSIS APPARATUS FOR SELF-DIAGNOSIS AND REMOTE-DIAGNOSIS, AND METHOD OF OPERATING THE ULTRASOUND DIAGNOSIS APPARATUS
An ultrasound diagnosis apparatus and method enabling general users to easily acquire ultrasound images even when the users are unskilled at using ultrasound diagnosis apparatuses, and a non-transitory computer-readable storage medium having the ultrasound diagnosis method recorded thereon are provided. The ultrasound diagnosis apparatus includes a probe configured to acquire ultrasound data of an object; an image generation unit configured to generate an ultrasound image of the object by using the ultrasound data; a probe location acquisition unit configured to acquire a location of the probe on the object; a display unit configured to display the location of the probe and a reference location on an image representing the object; and a control unit configured to determine whether the location of the probe corresponds to the reference location.
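The control unit's check — whether the probe location corresponds to the reference location — reduces to a distance-within-tolerance test in the coordinate space of the body image. A minimal sketch, with the tolerance value an assumption:

```python
import math

def probe_at_reference(probe_xy, reference_xy, tolerance_cm=0.5):
    """Return True when the acquired probe location matches the reference
    location on the body image, within an illustrative tolerance."""
    dx = probe_xy[0] - reference_xy[0]
    dy = probe_xy[1] - reference_xy[1]
    return math.hypot(dx, dy) <= tolerance_cm
```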
ULTRASOUND IMAGING SYSTEM TOUCH PANEL WITH MULTIPLE DIFFERENT CLUSTERS OF CONTROLS
An ultrasound imaging system (102) includes a touch screen user interface (122) and a touch screen controller (148). The touch screen user interface includes a touch panel (124). The touch panel includes a plurality of different clusters (510-522) of controls, including a first cluster (512) in a first sub-region and with a tactile control, and one or more other clusters (510 and 514-522) in one or more other different sub-regions and with soft controls. The touch screen controller visually renders the one or more other clusters in the one or more other different sub-regions, spatially arranged with respect to each other based on a predetermined control cluster configuration for the touch screen user interface. The one or more other clusters include controls that correspond to different groupings of ultrasound imaging operations of the ultrasound imaging system.
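A predetermined control cluster configuration of the kind described above can be sketched as a mapping from cluster names to sub-regions and control groupings. The cluster names, controls, and coordinates below are assumptions for illustration only.

```python
# Illustrative configuration: one tactile cluster plus soft-control clusters,
# each assigned a sub-region (row, column) of the touch panel.
CLUSTER_CONFIG = {
    "tgc":     {"sub_region": (0, 0), "kind": "tactile", "controls": ["tgc_sliders"]},
    "b_mode":  {"sub_region": (0, 1), "kind": "soft", "controls": ["depth", "focus", "gain"]},
    "doppler": {"sub_region": (1, 1), "kind": "soft", "controls": ["prf", "baseline", "angle"]},
}

def render_order(config):
    """Soft clusters in the spatial order a touch screen controller might render."""
    ordered = sorted(config.items(), key=lambda kv: kv[1]["sub_region"])
    return [name for name, cluster in ordered if cluster["kind"] == "soft"]
```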
Apparatus and method for automatic ultrasound segmentation for visualization and measurement
A system and method for performing ultrasound scans is provided. One embodiment of the ultrasonographic system acquires sonogram information from a series of ultrasonic scans of a human subject. The series of ultrasound scans is taken over a portion of interest on the human subject whose underlying bone structure, or other ultrasound-discernible organ, is under examination. The data from the series of scans are synthesized into a single data file that corresponds to a three-dimensional (3D) image and/or 3D model of the underlying bone structure or organ of the examined human subject.
Medical scan interface feature evaluating system
A medical scan interface feature evaluator system is operable to generate an ordered image-to-prompt mapping by selecting a set of user interface features to be displayed with each of an ordered set of medical scans. The set of medical scans and the ordered image-to-prompt mapping are transmitted to a set of client devices. A set of responses is generated by each client device in response to sequentially displaying each of the set of medical scans in conjunction with a mapped user interface feature indicated in the ordered image-to-prompt mapping via a user interface. Response score data is generated by comparing each response to truth annotation data of the corresponding medical scan. Interface feature score data corresponding to each user interface feature is generated based on aggregating the response score data, and is used to generate a ranking of the set of user interface features.
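The scoring-and-ranking pipeline above can be sketched in a few lines: score each response against the truth annotation, aggregate per feature, and sort. The 0/1 scoring and the tuple shape of a response are simplifying assumptions.

```python
from collections import defaultdict

def rank_interface_features(responses, truth):
    """Score responses against truth annotations, aggregate per UI feature,
    and rank features by mean score, highest first.

    `responses` holds (feature, scan_id, answer) triples; the 0/1 scoring is
    an illustrative stand-in for the patent's response score data.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for feature, scan_id, answer in responses:
        totals[feature][0] += 1.0 if answer == truth[scan_id] else 0.0
        totals[feature][1] += 1
    mean = {f: s / n for f, (s, n) in totals.items()}
    return sorted(mean, key=mean.get, reverse=True)
```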
Percutaneous coronary intervention (PCI) planning interface and associated devices, systems, and methods
A method of evaluating a vessel of a patient is provided. The method includes outputting, to a display device, a screen display including: a visualization based on pressure measurements obtained from a first instrument and a second instrument positioned within the vessel of the patient while the second instrument is moved longitudinally through the vessel and the first instrument remains stationary within the vessel; and a visual representation of a vessel; receiving a user input to modify the visualization to simulate a therapeutic procedure; and updating the screen display, in response to the user input, including modifying the visualization based on the user input. A system for evaluating a vessel of a patient is also provided. The system includes first and second instruments sized and shaped for introduction into the vessel of the patient; and a processing system communicatively coupled to the first and second instruments and a display device.
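With one pressure instrument held stationary and the other moved longitudinally through the vessel, a natural visualization is the ratio of the two pressures at each pullback position. The sketch below computes such a ratio curve; it is a simplification for illustration, not the patented visualization.

```python
def pressure_ratio_pullback(moving_pressures, stationary_pressures):
    """Ratio of the moving (distal) to stationary (proximal) pressure at each
    pullback position; a sustained drop below 1.0 localizes a pressure loss."""
    return [m / s for m, s in zip(moving_pressures, stationary_pressures)]
```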
ULTRASONIC DIAGNOSTIC APPARATUS, CONTROL METHOD OF ULTRASONIC DIAGNOSTIC APPARATUS, AND CONTROL PROGRAM OF ULTRASONIC DIAGNOSTIC APPARATUS
An ultrasonic diagnostic apparatus includes: an ultrasonic probe that transmits/receives ultrasonic waves; a camera that captures a subject; an ultrasonic image generator that generates an ultrasonic image of the subject based on a reception signal acquired from the ultrasonic probe; a first hardware processor that generates a display image including a camera image acquired from the camera and the ultrasonic image acquired from the ultrasonic image generator, and reproducibly saves the display image in a storage; and a second hardware processor that decides a display style of the camera image to be arranged in the display image in accordance with an intended usage when the display image is read.
ULTRASONIC DIAGNOSTIC APPARATUS, CONTROL METHOD OF ULTRASONIC DIAGNOSTIC APPARATUS, AND CONTROL PROGRAM OF ULTRASONIC DIAGNOSTIC APPARATUS
An ultrasonic diagnostic apparatus includes: an ultrasonic probe that transmits/receives ultrasonic waves; a camera that captures a subject; an ultrasonic image generator that generates an ultrasonic image of the subject based on a reception signal acquired from the ultrasonic probe; a first hardware processor that generates a display image including a camera image acquired from the camera and an ultrasonic image acquired from the ultrasonic image generator; and a second hardware processor that decides a display style of the camera image to be arranged in the display image in accordance with an operation mode of the ultrasonic diagnostic apparatus.
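Deciding the camera image's display style from the operation mode, as in the abstract above, amounts to a mode-to-style lookup. The mode names and style parameters below are hypothetical.

```python
# Hypothetical mapping from operation mode to the display style of the camera
# image within the composite display image (placements and scales illustrative).
STYLE_BY_MODE = {
    "live_exam": {"placement": "corner_inset", "scale": 0.25},
    "review":    {"placement": "side_by_side", "scale": 0.5},
}

def decide_display_style(operation_mode):
    """Pick a camera-image display style, falling back to a small inset."""
    return STYLE_BY_MODE.get(operation_mode, {"placement": "corner_inset", "scale": 0.25})
```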
Processing a user input in relation to an image
A method for processing a user input in relation to an image is provided. In an embodiment, the method includes receiving first input data derived from user input in relation to a first image, the first input data indicating a selection of at least a part of the first image; performing a determination process to determine, at least partly based upon the first input data received and previous action data, one or more candidate actions to perform in relation to the first image, the previous action data relating to previous image processing actions performed in relation to one or more images; and providing output data indicating at least a first action of the one or more candidate actions determined.
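One simple way to realize the determination process above is to rank candidate actions by how often each action followed the same kind of selection in the previous action data. This frequency model and the action names are illustrative assumptions.

```python
from collections import Counter

def candidate_actions(selection_kind, previous_actions, top_n=2):
    """Rank candidate actions for an image selection by how often each action
    followed the same kind of selection before (a simplified frequency model)."""
    counts = Counter(
        action for kind, action in previous_actions if kind == selection_kind
    )
    return [action for action, _ in counts.most_common(top_n)]
```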