SORTING ANIMALS BASED ON NON-INVASIVE DETERMINATION OF ANIMAL CHARACTERISTICS
20230157263 · 2023-05-25
Inventors
- Ayal BRENNER (Birkirkara, MT)
- Mordekhay SHNIBERG (Birkirkara, MT)
- Christoph PODES (Birkirkara, MT)
- Ana Catarina Marcelino Costa DA SILVA (Birkirkara, MT)
CPC classification
A61B8/5223
HUMAN NECESSITIES
A61B5/7264
HUMAN NECESSITIES
B07C5/34
PERFORMING OPERATIONS; TRANSPORTING
International classification
A61B5/00
HUMAN NECESSITIES
A61B8/00
HUMAN NECESSITIES
Abstract
Methods and systems are disclosed for improvements in aquaculture that allow for increasing the number and growth efficiency of fish in an aquaculture setting (or other animals in other settings) by identifying or predicting characteristics of the animals based on visual or ultrasound images of the animals obtained through non-invasive means. The animals are sorted based on the characteristics.
Claims
1. A system for sorting animals, comprising: an ultrasound transducer configured to obtain an ultrasound image of an animal located on a path, the ultrasound transducer configured to obtain the ultrasound image while the animal moves along a portion of the path; a sorter configured to sort the animal into a group; and control circuitry configured to: determine, based on the ultrasound image, a characteristic of the animal; and control the sorter to sort the animal into the group based on the characteristic.
2. The system of claim 1, further comprising: a conveyor comprising a plurality of compartments configured to receive animals and move the animals along the path; and a camera configured to obtain a visual image of the animal in a compartment on the conveyor as the animal moves past the camera; wherein: the ultrasound transducer is configured to obtain the ultrasound image of the animal in the compartment on the conveyor, the ultrasound transducer configured to obtain the ultrasound image while the ultrasound transducer moves along a portion of the path with the animal; and the control circuitry is configured to: determine, based on the visual image, a starting point on the animal for the ultrasound transducer, and control the ultrasound transducer to move along the portion of the path based on the starting point to obtain the ultrasound image.
3. The system of claim 1, wherein the control circuitry is configured to determine the characteristic of the animal based on the ultrasound image by inputting the ultrasound image to an artificial neural network, which is trained to output the characteristic based on the ultrasound image.
4. The system of claim 3, wherein the artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics.
5. The system of claim 1, wherein the characteristic is gender of the animal, presence of disease in the animal, size of the animal, early maturation of the animal, mature parr, presence of bacterial kidney disease in the animal, heart and/or skeletal muscle inflammation in the animal, or a fat percentage of the animal.
6. The system of claim 1, wherein the control circuitry is configured to determine a starting point on the animal for the ultrasound image by providing a visual image of the animal to a machine vision algorithm, which is trained to determine the starting point based on the visual image.
7. The system of claim 1, wherein the ultrasound transducer is configured to move in at least two dimensions, the at least two dimensions comprising: a first dimension along the path; and a second dimension along a body of the animal, the second dimension substantially perpendicular to the first dimension and the path; wherein the ultrasound transducer is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining the ultrasound image, starting from a starting point; and wherein a width of the ultrasound transducer and the movement in the second dimension define an image area on the body of the animal, the image area including target anatomy of the animal.
8. The system of claim 1, further comprising a plurality of ultrasound transducers, each controlled by the control circuitry to obtain ultrasound images of a plurality of animals in a plurality of compartments on a conveyor at the same time.
9. The system of claim 1, wherein the sorter comprises a mechanical arm controlled by the control circuitry to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from a conveyor to a same physical location as other animals in the group.
10. The system of claim 1, wherein the animal is a fish and a starting point for the ultrasound image corresponds to a start of an operculum of the fish.
11. The system of claim 1, the system further comprising a camera, wherein the camera is configured to obtain a red green blue (RGB) image set that includes a visual image; the ultrasound transducer is configured to obtain an ultrasound image set of the animal that includes the ultrasound image; and the control circuitry is configured to: determine a starting point for the ultrasound transducer based on the RGB image set, and determine the characteristic based on the ultrasound image set.
12. The system of claim 11, wherein the animal is a fish and the visual image is a red green blue (RGB) image, and wherein the control circuitry is further configured to determine, based on the RGB image, a short operculum in the fish, damage to gills of the fish, disease resistance, growth performance, and/or current diseases of the fish.
13. A method for sorting animals with a sorting system, the sorting system including an ultrasound transducer, a sorter, and control circuitry, the method comprising: obtaining, with the ultrasound transducer, an ultrasound image of an animal located on a path, the ultrasound transducer configured to obtain the ultrasound image while the animal moves along a portion of the path; determining, with the control circuitry, based on the ultrasound image, a characteristic of the animal; and controlling, with the control circuitry, the sorter to sort the animal into a group based on the characteristic.
14. The method of claim 13, the sorting system further including a conveyor and a camera, the method further comprising: receiving animals with a plurality of compartments of the conveyor, and moving the animals along the path; obtaining, with the camera, a visual image of an animal in a compartment on the conveyor as the animal moves past the camera; obtaining, with the ultrasound transducer, the ultrasound image of the animal in the compartment on the conveyor, the ultrasound transducer configured to obtain the ultrasound image while the ultrasound transducer moves along the portion of the path with the animal; and determining, with the control circuitry, based on the visual image, a starting point on the animal for the ultrasound transducer, and controlling the ultrasound transducer to move along the portion of the path based on the starting point to obtain the ultrasound image.
15. The method of claim 13, wherein the control circuitry is configured to determine the characteristic of the animal based on the ultrasound image by inputting the ultrasound image to an artificial neural network, which is trained to output the characteristic based on the ultrasound image.
16. The method of claim 15, wherein the artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics.
17. The method of claim 13, wherein the characteristic is gender of the animal, presence of disease in the animal, size of the animal, early maturation of the animal, mature parr, presence of bacterial kidney disease in the animal, heart and/or skeletal muscle inflammation in the animal, or a fat percentage of the animal.
18. The method of claim 13, wherein the control circuitry is configured to determine a starting point on the animal for the ultrasound image by providing a visual image of the animal to a machine vision algorithm, which is trained to determine the starting point based on the visual image.
19. The method of claim 13, further comprising moving the ultrasound transducer in at least two dimensions, the at least two dimensions comprising: a first dimension along the path; and a second dimension along a body of the animal, the second dimension substantially perpendicular to the first dimension and the path; wherein the ultrasound transducer is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining the ultrasound image, starting from a starting point; and wherein a width of the ultrasound transducer and the movement in the second dimension define an image area on the body of the animal, the image area including target anatomy of the animal.
20. The method of claim 13, wherein the sorting system includes a plurality of ultrasound transducers, and wherein the method further comprises controlling, with the control circuitry, the plurality of ultrasound transducers to obtain ultrasound images of a plurality of animals in a plurality of compartments on a conveyor at the same time.
21. The method of claim 13, wherein the sorter comprises a mechanical arm, and wherein the method further comprises controlling, with the control circuitry, the mechanical arm to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from a conveyor to a same physical location as other animals in the group.
22. The method of claim 13, wherein the animal is a fish and a starting point for the ultrasound image corresponds to a start of an operculum of the fish.
23. The method of claim 13, the sorting system further including a camera, wherein: the camera is configured to obtain a red green blue (RGB) image set that includes a visual image; the ultrasound transducer is configured to obtain an ultrasound image set of the animal that includes the ultrasound image; and the control circuitry is configured to: determine a starting point for the ultrasound transducer based on the RGB image set, and determine the characteristic based on the ultrasound image set.
24. The method of claim 23, wherein the animal is a fish and the visual image is a red green blue (RGB) image, and wherein the method further comprises determining, with the control circuitry, based on the RGB image, a short operculum in the fish, damage to gills of the fish, disease resistance, growth performance, and/or current diseases of the fish.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0033] In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It will be appreciated, however, by those having skill in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other cases, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
[0034] In contrast to conventional approaches that identify animal characteristics through invasive procedures, methods and systems are described herein that identify animal characteristics non-invasively and efficiently, and sort the animals based on those characteristics, for farming or other applications. A sufficiently high throughput (animals per hour) is provided in a system of limited size (so that the system can be transported and potentially shared between farms, and because producers have limited space). Advantageously, even at this high throughput, the system produces ultrasound images that are readable by a machine learning model (such as a neural network), so that the model can produce predictions with very high accuracy on features such as gender determination, for example, in real time or near real time.
[0035] Sufficiently high throughput is achieved by, among other things, configuring the system so that the animals are positioned side by side (instead of nose to tail) on a conveyor, which reduces the time between animals. Ultrasound can be limited in its image acquisition speed; therefore, the speed at which an ultrasound transducer can scan an animal is limited. To overcome these or other limitations related to scanning speed, the ultrasound transducer(s) of the present system is (are) configured to travel with an animal along the conveyor or path during a scan. This provides a faster processing time for animals relative to prior approaches. In addition, a high-frequency ultrasound transducer is used, but the speed of a given scan is limited to produce blur-free images.
[0036] In some embodiments, ultrasound with increased acquisition speed is used so that ultrasound transducers of the present systems and methods need not travel with the animal along a conveyor or path during a scan.
[0037] The system also uses machine vision to determine the starting point of the ultrasound scan based on red green blue (RGB) images of an animal. Based on the width of an ultrasound transducer (e.g., that corresponds to a width on the animal), a certain window on the animal is scanned (e.g., over a certain length of the body of the animal), which may be used to assess an organ (e.g., a gonad) or other characteristics of the animal.
[0039] Conveyor 102 is configured to receive animals (e.g., such as fish 101) and move the animals along a path 110. The animals may be placed one by one on conveyor 102, aligned laterally (relative to an axis of conveyor 102) in compartments 112 so that the animals travel with one of their sides facing toward camera 104 or an ultrasound transducer 106a, 106b, or 106c. The animals move on conveyor 102 to camera 104 and ultrasound transducers 106a, 106b, or 106c, where they are imaged (visually and ultrasonically respectively). Once the visual and ultrasound images are processed, the animals are sorted into (e.g., three or more) groups by sorter 108.
[0040] Path 110 is aligned along conveyor 102, starting at one end of conveyor 102 and extending to an opposite end of conveyor 102 and sorter 108. In some embodiments, path 110 begins at a feeder input zone, where an operator places one animal after another, oriented in a specific direction, into compartments 112 of conveyor 102. In some embodiments, conveyor 102 is configured with an axial tilt angle 111 so that the animals travel aligned to one side of conveyor 102 (e.g., caused by gravity). For example, in some embodiments, conveyor 102 comprises a plurality of compartments 112 configured to receive and hold the animals while they move along path 110. In some embodiments, compartments 112 are oriented at an angle relative to a surface of conveyor 102 to ensure a repeating position of an animal in each compartment 112. A given compartment 112 may have one or more sides that extend a certain distance from a surface of conveyor 102 at a certain angle. The distance or angle may be determined based on the type or dimensions of the animal, or other information. The distance or angle may be configured to be sufficient to separate one animal from the next on conveyor 102.
[0041] In some embodiments, conveyor 102 or the surfaces of compartments 112 may be formed by or coated with a material configured to reduce slippage or other movement of an animal in a compartment 112 on conveyor 102. For example, the material may include cloth, sponge, rubber or another polymer, or other materials. However, in some embodiments, one or more surfaces of conveyor 102 or compartments 112 may be metallic or be formed from other materials. In some embodiments conveyor 102 is supported by a frame 150 and/or other components.
[0042] By way of a non-limiting example, in some embodiments, the animals may be fish 101. Compartments 112 may be configured to hold a given fish perpendicular to path 110 of conveyor 102 (shown in
[0043] In some embodiments, the image set may be created using an imaging device such as camera 104 that detects electromagnetic radiation with wavelengths between about 400 nanometers to about 1100 nanometers. In some embodiments, the image set may be created using an imaging device such as camera 104 that detects electromagnetic radiation with wavelengths between 400 to 500 nanometers, between 500 to 600 nanometers, between 700 to 900 nanometers, or between 700 to 1100 nanometers.
[0044] Camera housing 105 is configured to define an imaging area for camera 104. The imaging area may be an area where an amount of artificial or ambient light is controlled when images are taken by camera 104, for example. Camera housing 105 may be formed by one or more walls at a location just above conveyor 102. In some embodiments, camera 104 and camera housing 105 remain stationary relative to conveyor 102 and compartments 112 as animals move beneath camera 104 along path 110.
[0045] Ultrasound transducers 106a, 106b, and 106c are configured to obtain ultrasound images of the animals in compartments 112 on conveyor 102. In some embodiments, ultrasound transducers 106a, 106b, and 106c are configured to obtain an ultrasound image set of the animal that includes the ultrasound images. In some embodiments, an individual ultrasound transducer 106a, 106b, or 106c is configured to obtain one or more ultrasound images of a given animal on conveyor 102. For example, the ultrasound images may include an ultrasound image set of an animal. The ultrasound image set may include one or more ultrasound images of the animal. If the ultrasound image set includes multiple ultrasound images, the multiple ultrasound images may be captured from different angles (e.g., a top view, side view, bottom view, etc.) and/or may be captured substantially simultaneously. The views may also include plan, elevation, and/or section views. The one or more views may create a standardized series of orthographic two-dimensional images that represent the form of the three-dimensional animal. For example, six views of the animal may be used, with each projection plane parallel to one of the coordinate axes of the animal. The views may be positioned relative to each other according to either a first-angle projection scheme or a third-angle projection scheme. The ultrasound images in the ultrasound image set may include separate images (e.g., images stored separately, but linked by a common identifier such as a serial number) or images stored together. An ultrasound image in an ultrasound image set may also be a composite image (e.g., an image created by cutting, cropping, rearranging, and/or overlapping two or more ultrasound images).
[0046] Ultrasound transducers 106a, 106b, and 106c are configured to obtain the ultrasound images while the ultrasound transducers 106a, 106b, and 106c move along a portion of path 110 with the animals. Ultrasound transducers 106a, 106b, and 106c are configured to move in at least two dimensions. The at least two dimensions comprise a first dimension along path 110 and a second dimension along a body of a given animal. The second dimension is substantially perpendicular to the first dimension and path 110, for example. In some embodiments, an ultrasound transducer 106a, 106b, or 106c is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining ultrasonic images, starting from the starting point.
[0047] Movement in two dimensions occurs at a controlled speed over defined distances that correspond to movement of an animal on conveyor 102, and a length of a given animal. The controlled speed and distances facilitate acquisition of standardized images for each animal carried by conveyor 102. A mechanical system comprising independent electrical linear actuators may be configured to move each ultrasound transducer 106a, 106b, and 106c along path 110, or perpendicular to path 110, along the body of a given fish 101, or in other directions, for example. Such actuators may also be used to move a given transducer toward or away from the body of a fish 101 (e.g. to place pressure on the body of a fish 101 during a scan as described herein).
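The synchronized two-axis motion described in this paragraph can be sketched in a few lines. The following Python sketch is illustrative only: the function name and the conveyor speed, scan length, and scan duration values are assumptions, not values from the disclosure.

```python
def transducer_velocities(conveyor_speed_mm_s, scan_length_mm, scan_duration_s):
    """Velocity components for a transducer that tracks an animal along
    the path (first dimension) while sweeping across its body (second
    dimension), so that both motions occur substantially simultaneously."""
    # First dimension: match the conveyor so the transducer stays over
    # the same compartment for the entire scan.
    v_path = conveyor_speed_mm_s
    # Second dimension: cover the scan window in the time available.
    v_body = scan_length_mm / scan_duration_s
    return v_path, v_body

# Example: a 50 mm/s conveyor and a 32 mm scan window covered in 2 s.
v_path, v_body = transducer_velocities(50.0, 32.0, 2.0)
```

Matching the path-dimension velocity to the conveyor is what removes relative motion along the path during the scan; only the body-dimension sweep moves the transducer relative to the animal.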
[0048] In some embodiments, a width of an ultrasound transducer 106a, 106b, or 106c and the movement in the second dimension define an image area on the body of the animal. The image area includes target anatomy of the animal. In some embodiments, a width of an ultrasound transducer 106a, 106b, or 106c is at least 10 mm. In some embodiments, a width of an ultrasound transducer 106a, 106b, or 106c is at least 20 mm. In some embodiments, a width of an ultrasound transducer 106a, 106b, or 106c is at least 30 mm. In some embodiments, an ultrasound transducer 106a, 106b, or 106c is configured to move along the body of an animal over a distance of about 15 mm. In some embodiments, an ultrasound transducer 106a, 106b, or 106c is configured to move along the body of an animal over a distance of about 32 mm. In some embodiments, an ultrasound transducer 106a, 106b, or 106c is configured to move along the body of an animal over a distance of about 45 mm. It should be understood that the width of an ultrasound transducer and the distance that the ultrasound transducer moves along the length of the body of an animal may be adjusted based on the size or type of animal being scanned, for example.
[0049] In some embodiments, an ultrasound transducer 106a, 106b, or 106c is configured to contact an animal in a compartment 112 and keep pressure on the animal while ultrasound images are acquired. This may be accomplished through, for example, rolls located before and after a transducer. The rolls and transducer can be spring-loaded or otherwise configured to maintain steady pressure without damaging a fish.
[0050] By way of a non-limiting example, ultrasound transducers 106a, 106b, and 106c are configured to scan three fish 101 moving along conveyor 102 at the same time. The group of transducers moves back and forth, measuring one group of fish 101 and then moving to a next group of fish 101. When the ultrasound scans are being performed, the transducers move at the same speed as the fish along path 110 of conveyor 102. Movements of the group of transducers and conveyor 102 are synchronized. Independent electrical linear actuators are configured to move each ultrasound transducer 106a, 106b, and 106c along and perpendicular to path 110, along the body of a given fish 101, during ultrasound imaging. Such actuators may also be used to move a given transducer toward or away from the body of a fish 101 (e.g., to place pressure on the body of a fish 101 during a scan).
[0051] Sorter 108 is configured to sort the animals into groups as the animals come off of conveyor 102. In some embodiments, sorter 108 comprises a mechanical arm controlled by control circuitry to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from conveyor 102 to a same physical location as other animals in the group.
[0053] Control circuitry 200 is configured to determine, based on the visual images from camera 104, starting points on the animals for the ultrasound transducers. In some embodiments, control circuitry 200 is configured to control the ultrasound transducers to move along the portion of path 110 based on the starting point to obtain the ultrasound images. However, in some embodiments, the ultrasound transducers are controlled to move along the portion of path 110 by mechanical means. Control circuitry 200 is configured to determine the starting point on a given animal by providing the visual image(s) for that animal to a machine vision algorithm, which is trained to determine the starting point based on the visual image(s).
[0054] In some embodiments, determining a starting point comprises generating a pixel array based on the visual images or image set of the animal. The pixel array may refer to computer data that describes an image (e.g., pixel by pixel). In some embodiments, this may include one or more vectors, arrays, and/or matrices that represent either an RGB or a grayscale image. Furthermore, in some embodiments, control circuitry 200 may additionally convert the image set from a set of one or more vectors, arrays, and/or matrices to another set of one or more vectors, arrays, and/or matrices. For example, control circuitry 200 may convert an image set having a red color array, a green color array, and a blue color array to a grayscale color array. In some embodiments, for example, the animal is a fish and the starting point, determined based on the pixel array, corresponds to a start of an operculum of the fish.
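The RGB-to-grayscale conversion mentioned above can be illustrated with a short NumPy sketch. The luma weights used here (ITU-R BT.601) are a common choice and an assumption; the disclosure does not specify a particular conversion.

```python
import numpy as np

def rgb_to_grayscale(rgb):
    """Convert an H x W x 3 RGB pixel array to an H x W grayscale array
    by taking a weighted sum over the red, green, and blue channels."""
    weights = np.array([0.299, 0.587, 0.114])  # BT.601 luma weights (assumed)
    return rgb @ weights

# Toy 2 x 2 image: white, black, pure red, and pure green pixels.
img = np.array([[[255, 255, 255], [0, 0, 0]],
                [[255, 0, 0], [0, 255, 0]]], dtype=float)
gray = rgb_to_grayscale(img)
```

A machine vision algorithm could then locate the starting point (e.g., the start of an operculum) in the resulting grayscale pixel array.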
[0055] Control circuitry 200 is configured to determine, based on the visual images, the ultrasound images, and/or other information, characteristics of the animals. In some embodiments, control circuitry 200 is configured to receive the visual images from camera 104 (
[0056] In some embodiments, control circuitry 200 is configured to determine the starting point based on the RGB image set, and determine the characteristics based on the RGB image set and/or the ultrasound image set. A characteristic may be or describe a condition, feature, or quality of an animal that may be used to sort an animal into a group. The characteristics may include a current physiological condition (e.g., a condition occurring normally in the body of the animal) such as a gender of the animal (e.g., as determined by the development of sex organs) and/or a stage of development in the animal (e.g., the state of smoltification in a fish). The characteristics may include a predisposition to a future physiological condition such as a growth rate, maturity date, and/or behavioral traits. The characteristics may include a pathological condition (e.g., a condition centered on an abnormality in the body of the animal based on a response to a disease) such as whether or not the animal is suffering from a given disease and/or is currently infected with a given disease. The characteristics may include a genetic condition (e.g., a condition based on the formation of the genome of the animal) such as whether or not the animal includes a given genotype. The characteristics may include a presence of a given biomarker (e.g., a measurable substance in an organism whose presence is indicative of a disease, infection, current internal condition, future internal condition, and/or environmental exposure). The characteristics may include phenotype characteristics (e.g., one or more observable characteristics of an animal resulting from the interaction of its genotype with the environment). These externally visible traits may include traits corresponding to physiological changes in the animal.
For example, during smoltification in a fish (i.e., the series of physiological changes where juvenile salmonid fish adapt from living in fresh water to living in seawater), externally visible traits related to this physiological change may include altered body shape, increased skin reflectance (silvery coloration), and increased enzyme production (e.g., sodium-potassium adenosine triphosphatase) in the gills.
[0057] By way of several specific examples, the characteristics (which again may be determined based on ultrasound images, RGB images, and/or other information) may include a gender of an animal, presence of disease in an animal, size of an animal, early maturation of an animal, mature parr, presence of bacterial kidney disease in an animal, heart or skeletal muscle inflammation in an animal, a fat percentage of an animal, a size of an animal, a shape of an animal, a weight of an animal, and/or other characteristics. In some embodiments, the animal is a fish and the visual image is a red green blue (RGB) image. In some embodiments, control circuitry 200 is configured to determine, based on the RGB image, characteristics such as a short operculum in the fish and/or damage to gills of the fish, diseases resistance, growth performance, current diseases, smoltification status, and/or other characteristics (e.g., see other examples of characteristics described herein).
[0058] Control circuitry 200 is configured to determine the characteristics of an animal based on one or more ultrasound images (and/or visual images) of that animal by inputting the one or more ultrasound images (and/or visual images) to a machine learning model such as an artificial neural network, which is trained to output the characteristics based on the one or more ultrasound (or visual) images. The artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics.
[0059] As shown in
[0060] By way of a non-limiting example,
[0061] Each of these devices may also include memory in the form of electronic storage. The electronic storage may include non-transitory storage media that electronically stores information. The electronic storage media of the electronic storages may include one or both of (i) system storage that is provided integrally (e.g., substantially non-removable) with servers or client devices or (ii) removable storage that is removably connectable to the servers or client devices via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). The electronic storages may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storages may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). The electronic storage may store software algorithms, information determined by the processors, information obtained from servers, information obtained from client devices, or other information that enables the functionality as described herein.
[0063] In some embodiments, computing system 300 may use one or more prediction models to predict characteristics based on visual images, ultrasound images, or other information. For example, as shown in
[0064] As an example, with respect to
[0065] Machine learning model 322 may be trained to detect the characteristics in animals based on a set of ultrasound images. For example, ultrasound transducers 106a, 106b, or 106c (
[0066] The system may then receive an ultrasound image set of a second fish. Computing system 300 may input one or more of the ultrasound images in the set into machine learning model 322. Computing system 300 may then receive an output from machine learning model 322 indicating that the second fish has the same characteristic (e.g., genotype biomarker) as the first. For example, computing system 300 may input a second data set (e.g., ultrasound image sets of fish for which characteristics are not known) into machine learning model 322. Machine learning model 322 may then classify the image sets of fish based on the images.
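The train-then-classify flow in the preceding paragraphs can be sketched with a deliberately simple stand-in classifier. The nearest-centroid model, feature vectors, and labels below are all illustrative assumptions; the disclosure contemplates a trained artificial neural network operating on ultrasound image sets.

```python
import numpy as np

def train_centroids(features, labels):
    """Fit a minimal nearest-centroid classifier: one mean feature
    vector per class, standing in for a trained neural network."""
    classes = sorted(set(labels))
    return {c: np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
            for c in classes}

def classify(centroids, feature):
    """Assign the class whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda c: np.linalg.norm(feature - centroids[c]))

# Toy vectors standing in for descriptors derived from ultrasound image
# sets of fish with known characteristics (the first data set).
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = ["female", "female", "male", "male"]
model = train_centroids(X, y)

# A second fish with an unknown characteristic (the second data set).
pred = classify(model, np.array([0.85, 0.85]))
```

The same two-phase structure (fit on labeled image sets, then classify new image sets) applies whatever model is substituted for the centroid classifier.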
[0069] In some embodiments, model 450 may implement an inverted residual structure in which the input and output of a residual block (e.g., block 454) are thin bottleneck layers. A residual layer may feed into the next layer and directly into layers that are one or more layers downstream. A bottleneck layer (e.g., block 458) is a layer that contains fewer neural units than the preceding layers. Model 450 may use a bottleneck layer to obtain a representation of the input with reduced dimensionality. An example of this is the use of autoencoders with bottleneck layers for nonlinear dimensionality reduction. Additionally, model 450 may remove non-linearities in a narrow layer (e.g., block 458) in order to maintain representational power. In some embodiments, the design of model 450 may also be guided by computational complexity (e.g., the number of floating point operations). In some embodiments, model 450 may increase the feature map dimension at all units to involve as many locations as possible, instead of sharply increasing the feature map dimensions at neural units that perform downsampling. In some embodiments, model 450 may decrease the depth and increase the width of residual layers in the downstream direction.
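The inverted residual structure described above can be illustrated numerically. The sketch below follows the spirit of MobileNetV2-style blocks: a thin bottleneck input is expanded to a wide representation, transformed per channel, and linearly projected back down, with a residual connection between the bottlenecks and no non-linearity on the projection. For brevity the spatial (depthwise) convolution is collapsed to a per-channel scaling, and the widths and random weights are illustrative assumptions, not parameters of model 450.

```python
import numpy as np

rng = np.random.default_rng(0)
thin, wide = 16, 96  # bottleneck width and expanded width (expansion factor 6)

# Illustrative random weights; a trained model would learn these.
w_expand = rng.normal(size=(wide, thin)) * 0.1
w_depthwise = rng.normal(size=wide) * 0.1
w_project = rng.normal(size=(thin, wide)) * 0.1

def relu6(x):
    """Bounded non-linearity commonly used with expanded layers."""
    return np.minimum(np.maximum(x, 0.0), 6.0)

def inverted_residual(x):
    """One inverted residual block: thin -> wide -> thin, residual on the thin ends."""
    h = relu6(w_expand @ x)       # 1x1 expansion: 16 -> 96 units
    h = relu6(w_depthwise * h)    # per-channel ("depthwise") transform
    out = w_project @ h           # linear projection back: 96 -> 16, no activation,
                                  # preserving representational power in the narrow layer
    return x + out                # residual connection between bottleneck layers

y = inverted_residual(np.ones(thin))
```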
[0070] Returning to
[0072] In some embodiments, method 500 may be implemented, at least in part, in one or more processing devices such as one or more processing devices described herein (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 500 in response to instructions (e.g., machine readable instructions) stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 500.
[0073] At an operation 502, animals are received with a plurality of compartments of the conveyor, and moved along a path. In some embodiments, operation 502 is performed by a conveyor the same as or similar to conveyor 102 (shown in
[0074] At an operation 504, a visual image (or set of visual images) of an animal in a compartment on the conveyor is obtained as the animal moves past a camera. In some embodiments, the camera is configured to obtain a red green blue (RGB) image set that includes the visual image. In some embodiments, operation 504 is performed by a camera the same as or similar to camera 104 (shown in
[0075] At an operation 506, an ultrasound image (or set of ultrasound images) of the animal is obtained with an ultrasound transducer. The ultrasound image of the animal is obtained with the animal in the compartment on the conveyor. The ultrasound transducer is configured to obtain the ultrasound image while the ultrasound transducer moves along a portion of the path with the animal. In some embodiments, the ultrasound transducer is configured to obtain an ultrasound image set of the animal that includes the ultrasound image. In some embodiments, operation 506 is performed by one or more ultrasound transducers the same as or similar to ultrasound transducers 106a, 106b, or 106c (shown in
[0076] At an operation 508, a starting point on the animal is determined for the ultrasound transducer, with the control circuitry, based on the visual image. The control circuitry is configured to determine the starting point on the animal by providing the visual image to a machine vision algorithm, which is trained to determine the starting point based on the visual image. In some embodiments, the animal is a fish and the starting point corresponds to a start of an operculum of the fish.
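The starting-point determination of operation 508 can be sketched as follows. The disclosed system uses a trained machine vision algorithm; the heuristic below is only a hypothetical stand-in that scans image columns head-to-tail and returns the first column with enough bright pixels, as a proxy for where the operculum begins. The threshold values are illustrative assumptions.

```python
import numpy as np

def find_starting_point(gray_image, pixel_threshold=0.5, column_fraction=0.25):
    """Hypothetical heuristic standing in for the trained machine vision
    algorithm: return the index of the first column (scanning head-to-tail)
    in which more than `column_fraction` of pixels exceed `pixel_threshold`,
    taken as the start of the operculum. Returns None if no column qualifies.
    """
    mask = np.asarray(gray_image, dtype=float) > pixel_threshold
    for col in range(mask.shape[1]):
        if mask[:, col].mean() > column_fraction:
            return col
    return None

# Usage on a synthetic image whose bright region begins at column 4.
img = np.zeros((8, 10))
img[:, 4:] = 1.0
start_col = find_starting_point(img)
```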
[0077] Operation 508 also includes controlling the ultrasound transducer to move along the portion of the path based on the starting point to obtain the ultrasound image. Controlling the ultrasound transducer comprises controlling the ultrasound transducer to move in at least two dimensions. The at least two dimensions comprise a first dimension along the path and a second dimension along a body of the animal. The second dimension is substantially perpendicular to the first dimension and the path. The ultrasound transducer is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining the ultrasonic image, starting from the starting point. A width of the ultrasonic transducer and the movement in the second dimension defines an image area on the body of the animal. The image area includes a target anatomy of the animal. In some embodiments, the sorting system includes a plurality of ultrasound transducers, and operation 508 includes controlling, with the control circuitry, the plurality of ultrasound transducers to obtain ultrasound images of a plurality of animals in a plurality of compartments on the conveyor at the same time. In some embodiments, operation 508 is performed by control circuitry the same as or similar to control circuitry 200 (shown in
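The two-dimensional transducer motion described in operation 508 can be sketched as a trajectory computation: the transducer tracks the animal along the path (first dimension) at the conveyor speed while simultaneously sweeping across the body (second dimension) from the starting point, and the transducer width times the second-dimension travel gives the imaged area. All function names, speeds, and dimensions below are hypothetical; the sketch only illustrates the geometry, not the disclosed control circuitry.

```python
def scan_trajectory(start_path, start_body, conveyor_speed,
                    sweep_speed, sweep_length, dt=0.01):
    """Transducer positions sampled every dt seconds during a scan.

    First coordinate advances along the path with the conveyor so the
    transducer stays over the animal; second coordinate sweeps across the
    body from the starting point until sweep_length is covered.
    """
    steps = round(sweep_length / (sweep_speed * dt)) + 1
    positions = []
    for i in range(steps):
        t = i * dt
        positions.append((start_path + conveyor_speed * t,   # along the path
                          start_body + sweep_speed * t))     # across the body
    return positions

def image_area(transducer_width, sweep_length):
    """Transducer width times second-dimension travel defines the imaged area."""
    return transducer_width * sweep_length

# Usage: a 0.1 m sweep at 0.1 m/s while the conveyor moves at 0.5 m/s.
traj = scan_trajectory(0.0, 0.0, 0.5, 0.1, 0.1, dt=0.1)
```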
[0078] At an operation 510, a characteristic of the animal is determined. The characteristic is determined with the control circuitry, based on the ultrasound image. The characteristic is gender of the animal, presence of disease in the animal, size of the animal, early maturation of the animal, mature parr, presence of bacterial kidney disease in the animal, heart and/or skeletal muscle inflammation in the animal, a fat percentage of the animal, and/or other characteristics.
[0079] The control circuitry is configured to determine the characteristic of the animal based on the ultrasound image by inputting the ultrasound image to an artificial neural network, which is trained to output the characteristic based on the ultrasound image. The artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics. In some embodiments, the control circuitry is configured to determine the starting point based on the RGB image set, and determine the characteristic based on the ultrasound image set. In some embodiments, the animal is a fish and the visual image is a red green blue (RGB) image, and operation 510 comprises determining, with the control circuitry, based on the RGB image, a short operculum in the fish and/or damage to gills of the fish, disease resistance, growth performance, current diseases, and/or other characteristics. In some embodiments, operation 510 is performed by control circuitry the same as or similar to control circuitry 200 (shown in
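The two-stage inference described above — phenotype characteristics identified from the image, then biomarker presence inferred from those phenotypes — can be sketched as a lookup over phenotype signatures. The association table below is entirely hypothetical (invented phenotype names and marker identifiers for illustration); a trained network would learn these associations rather than hard-code them.

```python
# Hypothetical associations: a set of phenotype characteristics identified in
# the ultrasound image -> a biomarker whose presence they indicate.
PHENOTYPE_BIOMARKERS = {
    ("enlarged kidney", "pale organ tissue"): "bkd_marker",
    ("early gonad development",): "maturation_marker",
}

def biomarkers_indicated(phenotypes):
    """Second inference stage: given phenotype characteristics output by the
    network, return the biomarkers whose full signature is present."""
    found = []
    for signature, marker in PHENOTYPE_BIOMARKERS.items():
        if all(p in phenotypes for p in signature):
            found.append(marker)
    return found

markers = biomarkers_indicated(["enlarged kidney", "pale organ tissue"])
```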
[0080] At an operation 512, the sorter is controlled to sort the animal into a group based on the characteristic. In some embodiments, the sorter comprises a mechanical arm. The mechanical arm is controlled, with the control circuitry, to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from the conveyor to a same physical location as other animals in a group. In some embodiments, operation 512 is performed by control circuitry the same as or similar to control circuitry 200 (shown in
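The sorting step of operation 512 can be sketched as a mapping from the determined characteristic to a group, and from the group to a mechanical arm position. The grouping rules and arm positions below are illustrative assumptions (the actual grouping policy is application-specific), and `MechanicalArm` is a hypothetical name, not a component of the disclosed sorter.

```python
def choose_group(characteristic):
    """Illustrative rules mapping a determined characteristic to a group."""
    if characteristic in ("disease", "bacterial kidney disease", "inflammation"):
        return "treatment"
    if characteristic in ("early maturation", "mature parr"):
        return "cull"
    return "grow-out"

class MechanicalArm:
    """Sketch of a sorter arm that moves between fixed positions, one per group."""

    def __init__(self, group_positions):
        self.group_positions = group_positions  # group -> arm position (e.g., degrees)
        self.position = None

    def direct(self, characteristic):
        """Move the arm so the animal leaves the conveyor at its group's location."""
        self.position = self.group_positions[choose_group(characteristic)]
        return self.position

# Usage: sort animals into three physical locations based on their characteristic.
arm = MechanicalArm({"treatment": 30, "cull": 60, "grow-out": 0})
pos = arm.direct("disease")
```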
[0081] In block diagrams such as
[0082] It is contemplated that the steps or descriptions of
[0083] To the extent that it aids understanding of the concepts described above,
[0084] Other embodiments of the present system are contemplated.
[0085] For example,
[0088] Although the present invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but on the contrary, is intended to cover modifications and equivalent arrangements that are within the scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
[0089] The present techniques will be better understood with reference to the following enumerated embodiments:
1. A system for sorting animals, comprising: an ultrasound transducer configured to obtain an ultrasound image of an animal located on a path, the ultrasound transducer configured to obtain the ultrasound image while the animal moves along a portion of the path; a sorter configured to sort the animal into a group; and control circuitry configured to: determine, based on the ultrasound image, a characteristic of the animal; and control the sorter to sort the animal into the group based on the characteristic.
2. The system of embodiment 1, further comprising: a conveyor comprising a plurality of compartments configured to receive animals and move the animals along the path; and a camera configured to obtain a visual image of the animal in a compartment on the conveyor as the animal moves past the camera; wherein: the ultrasound transducer is configured to obtain the ultrasound image of the animal in the compartment on the conveyor, the ultrasound transducer configured to obtain the ultrasound image while the ultrasound transducer moves along a portion of the path with the animal; and the control circuitry is configured to: determine, based on the visual image, a starting point on the animal for the ultrasound transducer, and control the ultrasound transducer to move along the portion of the path based on the starting point to obtain the ultrasound image.
3. The system of any of the previous embodiments, wherein the control circuitry is configured to determine the characteristic of the animal based on the ultrasound image by inputting the ultrasound image to an artificial neural network, which is trained to output the characteristic based on the ultrasound image.
4. The system of any of the previous embodiments, wherein the artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics.
5. The system of any of the previous embodiments, wherein the characteristic is gender of the animal, presence of disease in the animal, size of the animal, early maturation of the animal, mature parr, presence of bacterial kidney disease in the animal, heart and/or skeletal muscle inflammation in the animal, or a fat percentage of the animal.
6. The system of any of the previous embodiments, wherein the control circuitry is configured to determine a starting point on the animal for the ultrasound image by providing a visual image of the animal to a machine vision algorithm, which is trained to determine the starting point based on the visual image.
7. The system of any of the previous embodiments, wherein the ultrasound transducer is configured to move in at least two dimensions, the at least two dimensions comprising: a first dimension along the path; and a second dimension along a body of the animal, the second dimension substantially perpendicular to the first dimension and the path; wherein the ultrasound transducer is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining the ultrasonic image, starting from a starting point; and wherein a width of the ultrasonic transducer and the movement in the second dimension defines an image area on the body of the animal, the image area including target anatomy of the animal.
8. The system of any of the previous embodiments, further comprising a plurality of ultrasound transducers, each controlled by the control circuitry to obtain ultrasound images of a plurality of animals in a plurality of compartments on a conveyor at the same time.
9. The system of any of the previous embodiments, wherein the sorter comprises a mechanical arm controlled by the control circuitry to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from the conveyor to a same physical location as other animals in the group.
10. The system of any of the previous embodiments, wherein the animal is a fish and a starting point for the ultrasound image corresponds to a start of an operculum of the fish.
11. The system of any of the previous embodiments, the system further comprising a camera, wherein the camera is configured to obtain a red green blue (RGB) image set that includes a visual image; the ultrasound transducer is configured to obtain an ultrasound image set of the animal that includes the ultrasound image; and the control circuitry is configured to: determine a starting point for the ultrasound transducer based on the RGB image set, and determine the characteristic based on the ultrasound image set.
12. The system of any of the previous embodiments, wherein the control circuitry is further configured to determine characteristics based on RGB images and/or other information. For example, in some embodiments, the control circuitry is further configured to determine, based on the RGB image, a short operculum in the fish and/or damage to gills of the fish, disease resistance, growth performance, current diseases of the fish, a smoltification status, and/or other characteristics (e.g., see other examples of characteristics described herein).
13. A method for sorting animals with a sorting system, the sorting system including an ultrasound transducer, a sorter, and control circuitry, the method comprising: obtaining, with the ultrasound transducer, an ultrasound image of an animal located on a path, the ultrasound transducer configured to obtain the ultrasound image while the animal moves along a portion of the path; determining, with the control circuitry, based on the ultrasound image, a characteristic of the animal; and controlling, with the control circuitry, the sorter to sort the animal into a group based on the characteristic.
14. The method of embodiment 13, the sorting system further including a conveyor and a camera, the method further comprising: receiving animals with a plurality of compartments of the conveyor, and moving the animals along the path; obtaining, with the camera, a visual image of an animal in a compartment on the conveyor as the animal moves past the camera; obtaining, with the ultrasound transducer, the ultrasound image of the animal in the compartment on the conveyor, the ultrasound transducer configured to obtain the ultrasound image while the ultrasound transducer moves along the portion of the path with the animal; and determining, with the control circuitry, based on the visual image, a starting point on the animal for the ultrasound transducer, and controlling the ultrasound transducer to move along the portion of the path based on the starting point to obtain the ultrasound image.
15. The method of any of the previous embodiments, wherein the control circuitry is configured to determine the characteristic of the animal based on the ultrasound image by inputting the ultrasound image to an artificial neural network, which is trained to output the characteristic based on the ultrasound image.
16. The method of any of the previous embodiments, wherein the artificial neural network is trained to identify one or more phenotype characteristics of the animal based on the ultrasound image, and determine presence of a biomarker in the animal indicative of the characteristic output by the artificial neural network based on the one or more phenotype characteristics.
17. The method of any of the previous embodiments, wherein the characteristic is gender of the animal, presence of disease in the animal, size of the animal, early maturation of the animal, mature parr, presence of bacterial kidney disease in the animal, heart and/or skeletal muscle inflammation in the animal, a smoltification status, or a fat percentage of the animal.
18. The method of any of the previous embodiments, wherein the control circuitry is configured to determine a starting point on the animal for the ultrasound image by providing a visual image of the animal to a machine vision algorithm, which is trained to determine the starting point based on the visual image.
19. The method of any of the previous embodiments, further comprising moving the ultrasound transducer in at least two dimensions, the at least two dimensions comprising: a first dimension along the path; and a second dimension along a body of the animal, the second dimension substantially perpendicular to the first dimension and the path; wherein the ultrasound transducer is configured to move in the first dimension and the second dimension substantially simultaneously while obtaining the ultrasonic image, starting from the starting point; and wherein a width of the ultrasonic transducer and the movement in the second dimension defines an image area on the body of the animal, the image area including target anatomy of the animal.
20. The method of any of the previous embodiments, wherein the sorting system includes a plurality of ultrasound transducers, and wherein the method further comprises controlling, with the control circuitry, the plurality of ultrasound transducers to obtain ultrasound images of a plurality of animals in a plurality of compartments on a conveyor at the same time.
21. The method of any of the previous embodiments, wherein the sorter comprises a mechanical arm, and wherein the method further comprises controlling, with the control circuitry, the mechanical arm to move between multiple positions such that sorting the animal into a group comprises moving the mechanical arm to direct the animal from the conveyor to a same physical location as other animals in the group.
22. The method of any of the previous embodiments, wherein the animal is a fish and a starting point for the ultrasound image corresponds to a start of an operculum of the fish.
23. The method of any of the previous embodiments, the system further comprising a camera, wherein the camera is configured to obtain a red green blue (RGB) image set that includes a visual image; the ultrasound transducer is configured to obtain an ultrasound image set of the animal that includes the ultrasound image; and the control circuitry is configured to: determine a starting point for the ultrasound transducer based on the RGB image set, and determine the characteristic based on the ultrasound image set.
24. The method of any of the previous embodiments, wherein the animal is a fish and the visual image is a red green blue (RGB) image, and wherein the method further comprises determining, with the control circuitry, based on the RGB image, a short operculum in the fish and/or damage to gills of the fish, disease resistance, growth performance, and/or current diseases of the fish.
25. A tangible, non-transitory, machine-readable medium storing instructions that, when executed by a data processing apparatus of the control circuitry, cause the control circuitry to perform one or more operations of any of embodiments 1-24.