G06T7/0016

CROSS SECTION VIEWS OF WOUNDS
20230094442 · 2023-03-30

A non-transitory computer readable medium storing data and computer implementable instructions that, when executed by at least one processor, cause the at least one processor to perform operations for generating cross section views of a wound, the operations including receiving 3D information of a wound based on information captured using an image sensor associated with an image plane substantially parallel to the wound; generating a cross section view of the wound by analyzing the 3D information; and providing data configured to cause a presentation of the generated cross section view of the wound.
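
As one minimal sketch of the cross-section step (the patent does not disclose an implementation), the "3D information" can be represented as a depth map whose values are distances from the image plane, and a cross-section view reduces to a depth profile along a chosen scan line. The function name, the margin width, and the baseline heuristic are all assumptions for illustration:

```python
import numpy as np

def wound_cross_section(depth_map, row):
    """Derive a cross-section profile of a wound from a depth map.

    depth_map stands in for the "3D information": per-pixel distances
    measured from an image plane assumed roughly parallel to the wound.
    `row` selects the scan line along which the cut is taken. Returns
    depth below the surrounding intact skin, so positive values
    indicate how deep the wound is at each point along the line.
    """
    profile = depth_map[row, :].astype(float)
    # Estimate the intact-skin baseline from the outer margins of the
    # scan line (assumes the wound lies near the middle of the line).
    margin = np.concatenate([profile[:10], profile[-10:]])
    baseline = np.median(margin)
    return profile - baseline
```

The returned profile is the data one would hand to a plotting routine to present the generated cross-section view.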

OBTAINING HIGH-RESOLUTION OCULOMETRIC PARAMETERS

Disclosed are systems and methods for extracting high resolution oculometric parameters. A video stream having a video of a face of a user is processed to obtain a set of oculometric parameters, such as eyelid data, iris data (e.g., iris translation, iris radius and iris rotation), and pupil data (e.g., pupil center and pupil radius) at a first resolution. A deconvolution process is performed on the video stream to improve accuracy or resolution of the oculometric parameters, based on stimulus information of a video stimulus displayed on a client device associated with the user, environment data of an environment in which the user is located, device data of the client device, etc. The oculometric parameters are then processed using a prediction model that is trained based on high resolution oculometric parameters obtained using eye tracking devices to predict oculometric parameters at a resolution greater than the first resolution.
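
The deconvolution step could be realized in many ways; the following is a generic frequency-domain Wiener deconvolution sketch, not the patent's method. Here `kernel` is a hypothetical blur/response that would be estimated from the stimulus, environment, and device data, and `noise_power` is an assumed regularization level:

```python
import numpy as np

def wiener_deconvolve(measured, kernel, noise_power=1e-3):
    """Frequency-domain Wiener deconvolution of a 1-D signal (e.g. a
    pupil-radius time series). The kernel and noise level are
    assumptions; the patent does not prescribe this particular filter.
    """
    n = len(measured)
    H = np.fft.rfft(kernel, n)      # transfer function of the blur
    Y = np.fft.rfft(measured, n)    # spectrum of the measured signal
    # Wiener gain: inverse filter regularized by the noise level.
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(G * Y, n)
```

With a known smoothing kernel, the deconvolved series recovers fine temporal detail that the raw measurement blurred away, which is the sense in which deconvolution "improves resolution" here.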

SYSTEMS AND METHODS FOR VISION DIAGNOSTICS
20230096623 · 2023-03-30

Systems and methods for improved vision diagnostics are disclosed. Some embodiments relate to machine learning models for the analysis of vision diagnostics data. Some embodiments relate to improved, robust regression methods that can be used with vision diagnostics data to detect trends that may warrant medical intervention.
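
The abstract does not name a specific robust regression method; a classic choice for trend detection in noisy time series is the Theil-Sen estimator (median of pairwise slopes), sketched here purely as an illustration of how a robust trend can be extracted from vision diagnostics readings:

```python
import numpy as np

def theil_sen_slope(t, y):
    """Median of all pairwise slopes: a classic robust trend
    estimator, shown only as an example of robust regression on a
    diagnostics time series (the patent does not name an estimator).
    A few outlying readings barely move the median.
    """
    slopes = [
        (y[j] - y[i]) / (t[j] - t[i])
        for i in range(len(t))
        for j in range(i + 1, len(t))
        if t[j] != t[i]
    ]
    return float(np.median(slopes))
```

A steadily worsening measurement with one corrupted reading still yields the underlying slope, which is the property that makes robust trend estimates useful for deciding whether a trend warrants medical intervention.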

RADIOPHARMACEUTICAL DISTRIBUTION IMAGE GENERATION SYSTEM AND METHOD USING DEEP LEARNING
20230029695 · 2023-02-02

The present invention relates to a radiopharmaceutical distribution image generation system and method using deep learning and, more specifically, to a system and method in which dynamic medical images collected from multiple patients, together with a time-radiation dose distribution curve for each organ, are learned by a deep learning network so that a spatial distribution image of a radiopharmaceutical can be generated from a static medical image acquired from a specific patient. According to the present invention, even when a medical image is acquired only at a specific time after a radiopharmaceutical is injected into a patient, a spatial distribution image of the radiopharmaceutical can be acquired across the entire time span by using the deep learning network, and quantitative analysis of the radiopharmaceutical can be performed by calculating a time-radiation dose distribution curve on that basis.

Method And System For Evaluating Efficacy Of A Therapeutic Intervention
20230033034 · 2023-02-02

A method for evaluating efficacy of a therapeutic intervention includes obtaining, from each of paired liver biopsy samples of a subject comprising a first sample prior to the therapeutic intervention and a second sample after the therapeutic intervention, a first set of image data indicative of a first histopathological feature and a second set of image data indicative of a second histopathological feature. For each of the first and second samples, the second histopathological feature is quantified, the first and second sets of image data are overlapped based on a common reference frame, and the first histopathological feature present in an overlapping area of the first and second sets of image data is quantified. The method also includes determining the efficacy of the therapeutic intervention based on a comparison of the quantified second histopathological feature between the first and second samples and/or a comparison of the quantified first histopathological feature in the overlapping area between the first and second samples. The first and second histopathological features include features selected from the group consisting of fibrosis, inflammation, ballooning and steatosis.
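
The overlap-and-quantify step can be sketched with boolean masks: a validity mask per image set (already registered to the common reference frame, which this sketch does not attempt) and a mask marking the feature of interest. All names and the mask representation are assumptions for illustration:

```python
import numpy as np

def feature_fraction_in_overlap(valid_a, valid_b, feature):
    """Quantify a histopathological feature (e.g. fibrosis) inside the
    overlapping area of two registered image sets. Inputs are boolean
    masks already aligned to a common reference frame; the
    registration itself is outside this sketch.
    """
    overlap = valid_a & valid_b
    if not overlap.any():
        return 0.0
    # Area fraction of the feature within the overlapping region.
    return float(feature[overlap].mean())

def efficacy_change(pre, post):
    """Compare the quantified feature between the pre- and
    post-intervention samples; a negative value suggests the
    intervention reduced the feature."""
    return feature_fraction_in_overlap(*post) - feature_fraction_in_overlap(*pre)
```

A drop in the feature's area fraction between the paired biopsies is the kind of comparison the method uses to infer efficacy.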

Image guided surgical methodology and system employing patient movement detection and correction

A method and system utilize an imaging device that generates images of target tissue of a patient during a surgical procedure that acts on the target tissue imaged by the imaging device. The method and system enable visual detection of patient movement during the surgical procedure by marking at least one spatial attribute of one or more identifiable features of the target tissue illustrated in an image presented in a display window. Prior to acting on the target tissue, a visual indicator of the spatial attribute(s) is superimposed on one or more subsequent images captured by the imaging device and displayed to the operator. The operator can visually compare the position of the visual indicator to the position of the operator-identified feature in order to detect movement of the patient during the procedure. The system and methodology also facilitate realignment that corrects for detected patient movement.
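
The core comparison (marker position versus the feature's position in a later frame) can be reduced to a displacement check. Locating the feature in the new frame, whether by operator inspection or template matching, is outside this sketch, and the tolerance is a hypothetical value, not one from the patent:

```python
import numpy as np

def movement_alert(marked_xy, current_xy, tol_px=5.0):
    """Flag patient movement by comparing the stored marker position
    with the same feature's position in a subsequent frame. tol_px is
    an assumed clinical tolerance in pixels; feature localization in
    the new frame is handled elsewhere.
    """
    dx, dy = np.subtract(current_xy, marked_xy)
    shift = float(np.hypot(dx, dy))
    return shift > tol_px, shift
```

When the flag trips, the measured shift could also drive the realignment step, translating the superimposed indicator (or the imaging geometry) by the detected displacement.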

Systems and methods for cultivating and distributing aquatic organisms
11612119 · 2023-03-28

Systems and methods for monitoring the growth of an aquatic plant culture and detecting real-time characteristics associated with the aquatic plant culture. The systems and methods may include a control unit configured to perform an analysis of at least one image of an aquatic plant culture. The analysis may include processing at least one collected image to determine at least one physical characteristic or state of an aquatic plant culture. Systems and methods for distributing aquatic plant cultures are also provided. The distribution systems and methods may track and control the distribution of an aquatic plant culture based on information received from various sources. Systems and methods for growing and harvesting aquatic plants in a controlled and compact environment are also provided. The systems may include a bioreactor having a plurality of vertically stacked modules designed to contain the aquatic plants and a liquid growth medium.

Systems and methods for automated assessment of embryo quality using image-based features

Systems and methods for automated imaging and evaluation of image-based features are disclosed herein. A method for automated imaging and evaluation of image-based features can include receiving time-lapse images of at least one human embryo contained in a multi-well culture dish that can have a plurality of micro-wells. Image-based features can be automatically generated from the time-lapse images of the human embryo. The image-based features, which can include a cavitation feature, can be inputted into a classifier. The classifier can automatically and directly generate a viability prediction from the image-based features.
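
As a toy stand-in for the final classification step (the patent does not disclose the classifier's form), a logistic model maps a feature vector directly to a viability probability. The weights below are hypothetical placeholders, not parameters learned from embryo data:

```python
import numpy as np

def viability_probability(features, weights, bias):
    """Logistic scoring of image-based embryo features (e.g. a
    cavitation feature) into a viability probability. Weights and
    bias are illustrative assumptions, not trained values.
    """
    z = float(np.dot(features, weights) + bias)
    return 1.0 / (1.0 + np.exp(-z))
```

In practice the classifier would be trained on labeled outcomes; the point of the sketch is only the shape of the mapping from features to a single viability score.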

OPTIMUM WEIGHTING OF DSA MASK IMAGES
20230037260 · 2023-02-02

A method for generating a subtraction image for digital subtraction angiography that reduces noise and movement artifacts. A plurality of mask images of an object is obtained before a contrast agent is administered into the object, and a map of the object is obtained after the contrast agent is administered. A first sum image is formed from the plurality of mask images by summing the mask images, each multiplied by an individual weighting. The individual weightings for each of the plurality of mask images are determined automatically by an optimization method, and the subtraction image is ascertained by subtracting the sum image from the map.
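
The weighted-sum-and-subtract structure can be sketched directly; the abstract leaves the optimization method open, so ordinary least squares is used below as one plausible criterion (fit the weighted mask sum to the post-contrast map so that background and noise cancel in the subtraction):

```python
import numpy as np

def weighted_dsa_subtraction(masks, fill_image):
    """Form a DSA subtraction image from individually weighted mask
    images. Least squares is an assumed stand-in for the patent's
    unspecified optimization of the weightings.
    """
    A = np.stack([m.ravel() for m in masks], axis=1)   # pixels x n_masks
    b = fill_image.ravel()
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)    # per-mask weightings
    sum_image = (A @ weights).reshape(fill_image.shape)
    return fill_image - sum_image, weights             # map minus sum image
```

Because the weights adapt to the content of each mask, background that has shifted slightly between masks is suppressed better than with a single unweighted mask.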

Platform For Co-Culture Imaging To Characterize In Vitro Efficacy Of Heterotypic Effector Cellular Therapies In Cancer

A method for characterizing cancer organoid response to an immune cell based therapy includes providing a panel of different combinations of cancer organoid cells and immune cells to culturing wells and culturing the different combinations under conditions that support organoid growth. Brightfield and corresponding fluorescence images of the culturing wells are captured and provided to one or more trained machine learning algorithms that identify and distinguish cancer organoid cells from immune cells and characterize cancer organoid morphology changes caused by an immune cell based therapy. From this analysis, an analytical report including a characterization of cancer organoid cell death caused by the immune cell based therapy is provided.