
Systems and Methods for Processing Spatially Related Sequence Data Received from a Sequencing Device
20230161810 · 2023-05-25

Methods, systems, and computer-readable media for processing spatially related sequence data received from a sequencing device are presented. In one or more embodiments, a computing platform may receive, from a sequencing device, image data associated with a sample. The computing platform may identify, based on the image data received from the sequencing device, a first sequence located at first spatial coordinates. Subsequently, the computing platform may store, in a spatially searchable database, a first data element comprising the first spatial coordinates and a first identifier corresponding to the first sequence to spatially relate the first sequence to other sequences present in the sample. In some instances, the image data received from the sequencing device may include spatial information, temporal information, and color information associated with the sample, and the computing platform may present, on a display device, information identifying a presence of the first sequence at the first spatial coordinates.
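The core data structure here is a store that relates sequences to one another through their spatial coordinates. A minimal sketch of such a spatially searchable store, using a linear scan in place of a real spatial index (the `SpatialIndex` class and its method names are illustrative, not from the patent; a production system would use an R-tree or similar):

```python
import math

class SpatialIndex:
    """Toy spatially searchable store. Each data element pairs
    spatial coordinates with a sequence identifier, so sequences
    found in the same sample can be related by distance."""

    def __init__(self):
        self._elements = []

    def insert(self, coords, seq_id):
        # Store a data element: (spatial coordinates, identifier).
        self._elements.append((coords, seq_id))

    def query_radius(self, center, radius):
        # Return identifiers of all sequences within `radius` of
        # `center`, i.e. sequences spatially related to that point.
        cx, cy = center
        return [sid for (x, y), sid in self._elements
                if math.hypot(x - cx, y - cy) <= radius]

db = SpatialIndex()
db.insert((10.0, 20.0), "seq-A")
db.insert((11.0, 21.0), "seq-B")
db.insert((90.0, 90.0), "seq-C")
```

A query near (10.5, 20.5) with a small radius then returns only the two neighbouring sequences, which is the kind of lookup a display of "sequences present at these coordinates" would rely on.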

Planar waveguide apparatus with diffraction element(s) and system employing same

A waveguide apparatus includes a planar waveguide and at least one optical diffraction element (DOE) that provides a plurality of optical paths between an exterior and interior of the planar waveguide. A phase profile of the DOE may combine a linear diffraction grating with a circular lens, to shape a wave front and produce beams with desired focus. Waveguide apparatuses may be assembled to create multiple focal planes. The DOE may have a low diffraction efficiency, and planar waveguides may be transparent when viewed normally, allowing passage of light from an ambient environment (e.g., the real world), which is useful in AR systems. Light may be returned for temporally sequential passes through the planar waveguide. The DOE(s) may be fixed or may have dynamically adjustable characteristics. An optical coupler system may couple images to the waveguide apparatus from a projector, for instance a biaxially scanning cantilevered optical fiber tip.
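A phase profile combining a linear grating with a circular lens can be written, in a simple paraxial model, as φ(x, y) = 2πx/Λ + π(x² + y²)/(λf): the linear term deflects the beam like a grating, and the quadratic term curves the wavefront like a lens of focal length f. A numeric sketch of that model (all parameter values hypothetical, not taken from the patent):

```python
import math

def doe_phase(x, y, grating_period, wavelength, focal_length):
    """Phase (radians) of a DOE whose profile superimposes a
    linear grating term on a quadratic lens term. Illustrative
    paraxial model only."""
    linear = 2.0 * math.pi * x / grating_period  # grating: deflects the beam
    lens = math.pi * (x**2 + y**2) / (wavelength * focal_length)  # lens: focuses
    return linear + lens
```

At the origin both terms vanish; one grating period away in x, the linear term alone contributes a full 2π, with the lens term adding a small positive curvature contribution on top.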
APPARATUS AND METHOD FOR DETECTING TAB FOLDING, AND IMAGE ANALYZER

This application provides an apparatus and method for detecting tab folds, and an image analyzer. The apparatus for detecting tab folds includes: a first image obtaining module, configured to obtain a first image of a first lateral face of tabs of a battery cell; a second image obtaining module, configured to obtain a second image of a second lateral face of the tabs, where the second lateral face is different from the first lateral face; and an image analyzer, configured to obtain, based on the first image, a first number of layers of the tabs corresponding to the first lateral face, and obtain, based on the second image, a second number of layers of the tabs corresponding to the second lateral face, and determine, based on at least one of the first number of layers or the second number of layers, whether the tabs are in a folded state.
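The decision logic reduces to counting layers along a scan line on each lateral face and flagging a fold when either count deviates from the design count. A sketch under that reading (the threshold-crossing counter and the decision rule are illustrative assumptions, not the patent's algorithm):

```python
def count_layers(profile, threshold):
    """Count tab layers along a 1-D intensity profile across the
    cell edge: each layer shows up as a bright run, so we count
    rising edges through `threshold`."""
    count, above = 0, False
    for v in profile:
        if v >= threshold and not above:
            count += 1
            above = True
        elif v < threshold:
            above = False
    return count

def tabs_folded(layers_face1, layers_face2, expected_layers):
    """Flag a fold when the layer count seen on at least one
    lateral face deviates from the design count."""
    return layers_face1 != expected_layers or layers_face2 != expected_layers
```

A folded tab typically hides a layer on one face (fewer bright runs) or doubles one on the other, which is why checking both faces against the expected count suffices in this toy model.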

METHODS OF DETERMINING ABERRATIONS OF A CHARGED PARTICLE BEAM, AND CHARGED PARTICLE BEAM SYSTEM

A method of determining aberrations of a charged particle beam (11) focused by a focusing lens (120) toward a sample (10) in a charged particle beam system is described. The method includes: (a) taking one or more images of the sample at one or more defocus settings to provide one or more taken images (h_1 … h_N); (b) simulating one or more images of the sample taken at the one or more defocus settings based on a set of beam aberration coefficients (C^i) and a focus image of the sample to provide one or more simulated images; (c) comparing the one or more taken images and the one or more simulated images for determining a magnitude (R_i) of a difference therebetween; and (d) varying the set of beam aberration coefficients (C^i) to provide an updated set of beam aberration coefficients (C^(i+1)) and repeating (b) and (c) using the updated set of beam aberration coefficients (C^(i+1)) in an iterative process for minimizing said magnitude (R_i). Alternatively, in (b), one or more beam cross sections may be simulated, and, in (c), the simulated beam cross sections may be compared with one or more retrieved beam cross sections retrieved from the one or more taken images for determining a magnitude (R_i) of a difference therebetween. Further, a charged particle beam system for imaging and/or inspecting a sample that is configured for any of such methods is provided.
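The loop (b)–(d) is a residual-minimization fit: simulate images from the current coefficients, measure the mismatch R_i against the taken images, and update the coefficients until R_i stops shrinking. A minimal sketch, with a stand-in linear "simulator" and naive coordinate descent in place of the wave-optics simulation and optimizer a real system would use (all names hypothetical):

```python
def residual(taken, simulated):
    """Magnitude R_i: sum of squared pixel differences."""
    return sum((t - s) ** 2 for t, s in zip(taken, simulated))

def fit_aberrations(taken, simulate, coeffs, step=0.1, iters=200):
    """Iteratively perturb each aberration coefficient and keep
    changes that shrink R_i, mirroring steps (b)-(d)."""
    best = residual(taken, simulate(coeffs))
    coeffs = list(coeffs)
    for _ in range(iters):
        for k in range(len(coeffs)):
            for delta in (step, -step):
                trial = list(coeffs)
                trial[k] += delta
                r = residual(taken, simulate(trial))
                if r < best:
                    best, coeffs = r, trial
    return coeffs, best

# Toy "simulator": the image is a linear function of the coefficients.
simulate = lambda c: [c[0] + 2.0 * c[1], c[0] - c[1]]
taken = simulate([1.0, 0.5])   # image produced by ground-truth coefficients
fitted, r = fit_aberrations(taken, simulate, [0.0, 0.0])
```

Starting from zeros, the fit walks the coefficients toward the ground-truth values (1.0, 0.5), driving the residual toward zero.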

Rapid On-Site Evaluation Using Artificial Intelligence for Lung Cytopathology

A system and method are presented for applying convolutional neural networks (CNNs) to aid in rapid on-site evaluation cytopathology. Image data are acquired from a biopsy slide. Areas of interest are determined using a first CNN. The image data is segmented into image tiles, and tiles showing the areas of interest are analyzed using a second CNN to assign a histologic category to the slide. The second CNN also utilizes site-specific data relating to the biopsy location. Layered image data can be acquired from multiple focal planes of the slide and used as input to the second CNN. Categorized tiles are sorted and presented to a remote computing system for cytopathology determinations, aided by the results of applying the second CNN. Semantic segmentation can also be developed, both as input to the second CNN and as data presented to the remote computing system.
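The first stage of the pipeline amounts to tiling the slide image and keeping only tiles flagged as areas of interest, with each tile's origin retained so results can be mapped back to the slide. A sketch of that tiling step, with a trivial mean-intensity test standing in for the first CNN (both function names and the threshold rule are hypothetical):

```python
def tile(image, tile_h, tile_w):
    """Split a 2-D image (list of rows) into non-overlapping tiles,
    each tagged with its (row, col) origin on the slide."""
    tiles = []
    for r in range(0, len(image) - tile_h + 1, tile_h):
        for c in range(0, len(image[0]) - tile_w + 1, tile_w):
            patch = [row[c:c + tile_w] for row in image[r:r + tile_h]]
            tiles.append(((r, c), patch))
    return tiles

def is_area_of_interest(patch, threshold=0.5):
    """Hypothetical stand-in for the first CNN: flag tiles whose
    mean intensity exceeds a threshold."""
    vals = [v for row in patch for v in row]
    return sum(vals) / len(vals) > threshold

image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
interesting = [origin for origin, patch in tile(image, 2, 2)
               if is_area_of_interest(patch)]
```

Only the flagged tiles would then be passed to the second CNN, which keeps the expensive per-tile classification off empty regions of the slide.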

METHOD AND IMAGE-PROCESSING DEVICE FOR DETECTING FOREIGN OBJECTS ON A TRANSPARENT PROTECTIVE COVER OF A VIDEO CAMERA
20220335706 · 2022-10-20

A method for determining whether or not a transparent protective cover of a video camera comprising a lens-based optical imaging system is partly covered by a foreign object is disclosed. The method comprises: obtaining (402) a first captured image frame captured by the video camera with a first depth of field; obtaining (404) a second captured image frame captured by the video camera with a second depth of field which differs from the first depth of field; and determining (406) whether or not the protective cover is partly covered by the foreign object by analysing whether or not the first and second captured image frames are affected by the presence of the foreign object on the protective cover, such that the difference between the first depth of field and the second depth of field results in a difference in a luminance pattern of corresponding pixels of a first image frame and a second image frame. The first image frame is based on the first captured image frame and the second image frame is based on the second captured image frame.
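The key observation is that an object sitting on the cover, very close to the lens, blurs differently under the two depths of field, so its pixels change luminance between the frames while the distant scene does not. A minimal sketch of that comparison (the per-pixel threshold and pixel-count decision rule are illustrative assumptions, not the patent's analysis):

```python
def luminance_diff_mask(frame_a, frame_b, threshold):
    """Mark pixels whose absolute luminance difference between the
    two depth-of-field captures exceeds a threshold."""
    return [[abs(a - b) > threshold for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

def cover_partly_covered(frame_a, frame_b, threshold, min_pixels):
    """Hypothetical decision rule: flag the cover as partly covered
    when enough pixels differ between the two captures."""
    mask = luminance_diff_mask(frame_a, frame_b, threshold)
    return sum(v for row in mask for v in row) >= min_pixels

clean = [[10, 10], [10, 10]]
smudged = [[10, 40], [10, 10]]   # one pixel changed by a close object
```

Comparing a frame against itself yields no differing pixels, while the smudged pair trips the rule on the single affected pixel.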

AUTOFOCUS SYSTEMS AND METHODS FOR PARTICLE ANALYSIS IN BLOOD SAMPLES

Particles such as blood cells can be categorized and counted by a digital image processor. A digital microscope camera can be directed into a flowcell defining a symmetrically narrowing flowpath in which the sample stream flows in a ribbon flattened by flow and viscosity parameters between layers of sheath fluid. A contrast pattern for autofocusing is provided on the flowcell, for example at an edge of a rear illumination opening. The image processor assesses focus accuracy from pixel data contrast. A positioning motor moves the microscope and/or flowcell along the optical axis for autofocusing on the contrast pattern target. The processor then displaces microscope and flowcell by a known distance between the contrast pattern and the sample stream, thus focusing on the sample stream. Blood cell images are collected from that position until autofocus is reinitiated, periodically, by input signal, or when detecting temperature changes or focus inaccuracy in the image data.
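The autofocus step can be sketched as: score each motor position by an image-contrast metric on the fixed contrast target, pick the position that maximizes it, then displace by the known target-to-stream offset. The metric below (sum of squared adjacent-pixel differences) and the callback-style `capture` function are illustrative stand-ins, not the patent's implementation:

```python
def contrast(pixels):
    """Focus metric: sum of squared differences between adjacent
    pixels; a sharp image of the target has high local contrast."""
    return sum((b - a) ** 2 for a, b in zip(pixels, pixels[1:]))

def autofocus(positions, capture, offset):
    """Scan candidate motor positions along the optical axis, focus
    where the contrast target scores highest, then apply the known
    displacement from the target to the sample stream."""
    best = max(positions, key=lambda p: contrast(capture(p)))
    return best + offset
```

With a simulated capture whose sharpness peaks at position 3 and a known offset of 7, the routine lands on position 10 for imaging the sample stream.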

PSEUDO-DATA GENERATION APPARATUS, PSEUDO-DATA GENERATION METHOD, LEARNING APPARATUS AND LEARNING METHOD
20230143991 · 2023-05-11

According to one embodiment, a pseudo-data generation apparatus includes processing circuitry. The processing circuitry acquires one or more pieces of partial observation data that form part of whole observation data. The processing circuitry generates pseudo-whole observation data by inputting the one or more pieces of partial observation data to a function, the pseudo-whole observation data being pseudo-data of the whole observation data. The function is optimized by training so that partial observation data for training and pseudo-partial observation data for training resemble each other, the pseudo-partial observation data for training being obtained by converting the pseudo-whole observation data for training.
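The training criterion reads as: feed partial data through the generator to get pseudo-whole data, re-apply the observation operator to get pseudo-partial data, and penalize the mismatch with the real partial data. A toy sketch of that loss (the index-selection `mask` operator and the tiling "generator" are hypothetical stand-ins for the patent's function):

```python
def mask(whole, observed_idx):
    """Observation operator: keep only the observed entries of the
    whole signal (here, plain index selection)."""
    return [whole[i] for i in observed_idx]

def training_loss(generator, partial, observed_idx):
    """Squared error between the real partial observation and the
    pseudo-partial data recovered from the generator's output."""
    pseudo_whole = generator(partial)
    pseudo_partial = mask(pseudo_whole, observed_idx)
    return sum((p - q) ** 2 for p, q in zip(partial, pseudo_partial))

# Toy generator that tiles the partial observation to full length.
gen = lambda partial: (partial * 3)[:6]
```

When the observed indices line up with the generator's copy of the partial data the loss is zero; shifting the observation operator exposes a mismatch that training would then reduce.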

Augmented reality pulse oximetry

One embodiment is directed to a system comprising a head-mounted member removably coupleable to a user's head; one or more electromagnetic radiation emitters coupled to the head-mounted member and configured to emit light with at least two different wavelengths toward at least one of the user's eyes; one or more electromagnetic radiation detectors coupled to the head-mounted member and configured to receive light reflected after encountering at least one blood vessel of the eye; and a controller operatively coupled to the one or more electromagnetic radiation emitters and detectors and configured to cause the one or more electromagnetic radiation emitters to emit pulses of light while also causing the one or more electromagnetic radiation detectors to detect levels of light absorption related to the emitted pulses of light, and to produce an output that is proportional to an oxygen saturation level in the blood vessel.
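Two-wavelength pulse oximetry conventionally converts the detected absorption levels into the "ratio of ratios" R = (AC_red/DC_red)/(AC_ir/DC_ir) and maps R to oxygen saturation through an empirical calibration. A sketch using the textbook linear approximation SpO2 ≈ 110 − 25·R (a generic calibration, not one stated in the patent; real devices use device-specific calibration curves):

```python
def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """Classic two-wavelength ratio: pulsatile (AC) over baseline
    (DC) absorption at each wavelength."""
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    """Textbook linear calibration SpO2 ~ 110 - 25 * R; purely
    illustrative, not a clinical-grade mapping."""
    return 110.0 - 25.0 * ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir)
```

Equal normalized pulsatile signals at both wavelengths give R = 1, which this calibration maps to 85% saturation; healthier readings correspond to smaller R.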