G06V20/05

TURBIDITY DETERMINATION USING COMPUTER VISION

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, that generate at least two distance distributions of aquatic livestock within an enclosure from a first pair and a second pair of images of the livestock, taken at different times using a stereoscopic camera. The distance distributions can be used to determine a measure associated with an optical property of the water within the enclosure. A signal associated with the measure can be provided.
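The abstract does not specify how a distance distribution maps to the optical measure. One plausible reading, sketched below under assumptions not stated in the patent, is that as turbidity rises, fish far from the stereoscopic camera stop being detected, so the upper tail of the observed distance distribution shrinks and can serve as a visibility proxy. All function names here are hypothetical.

```python
def visibility_proxy(distances_m, quantile=0.95):
    """Upper-tail distance of detected fish as a crude visibility proxy.

    Assumes fish are spread throughout the enclosure, so the farthest
    reliably detected fish approximates the camera's visible range.
    """
    xs = sorted(distances_m)
    idx = min(len(xs) - 1, int(quantile * len(xs)))
    return xs[idx]

def turbidity_signal(distances_t1, distances_t2, drop_threshold=0.8):
    """Compare two distance distributions taken at different times.

    Returns True (raise a signal) when the visible range at the later
    time has fallen below drop_threshold of the earlier one.
    """
    v1 = visibility_proxy(distances_t1)
    v2 = visibility_proxy(distances_t2)
    return v2 < drop_threshold * v1
```

A real system would likely compare full distributions (e.g. a divergence measure) rather than a single quantile; the quantile keeps the sketch minimal.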

RADAR AND COLOCATED CAMERA SYSTEMS AND METHODS

Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The radar assembly includes an imaging system coupled to or within the radar assembly and configured to provide image data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly and image data corresponding to the radar returns from the imaging system, and then generate radar image data based on the radar returns and the image data. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
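Generating radar image data from radar returns plus camera data implies associating each return with a region of the image. A minimal sketch of that association, assuming a colocated, boresighted pinhole camera (geometry and names are illustrative, not from the patent):

```python
import math

def bearing_to_pixel_column(bearing_deg, image_width_px, hfov_deg):
    """Map a radar return's bearing (relative to the camera boresight)
    to an image column, assuming a colocated pinhole camera.

    Returns None when the target lies outside the horizontal field of
    view, so the caller can fall back to radar-only rendering.
    """
    if abs(bearing_deg) > hfov_deg / 2:
        return None
    # Pinhole projection: x = f * tan(bearing), with f in pixels
    # derived from the horizontal field of view.
    f_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    x = f_px * math.tan(math.radians(bearing_deg))
    return int(round(image_width_px / 2 + x))
```

With this mapping, an image patch around the returned column can be cropped and blended into the radar display at the target's range and bearing.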

Machine learning based automated object recognition for unmanned autonomous vehicles

A platform is positioned within an environment. The platform includes an image capture system connected to a controller implementing a neural network. The neural network is trained to associate visual features within the environment with a target object utilizing a known set of input data examples and labels. The image capture system captures input images from the environment and the neural network recognizes features of one or more of the input images that at least partially match one or more of the visual features within the environment associated with the target object. The input images that contain the visual features within the environment that at least partially match the target object are labeled, a geospatial position of the target object is determined based upon pixels within the labeled input images, and a class activation map is generated, which is then communicated to a supervisory system for action.
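The abstract's step of determining a geospatial position from pixels in a labeled image can be illustrated with a simple projection, assuming a nadir-pointing camera on a hovering platform over flat ground. The geometry and parameter names below are assumptions for the sketch, not details from the patent:

```python
import math

def pixel_to_ground(px, py, img_w, img_h, hfov_deg, vfov_deg,
                    uav_east, uav_north, altitude_m, heading_deg=0.0):
    """Project a labeled pixel from a downward-looking camera to a
    local east/north ground position (flat-ground assumption).
    """
    # Angle of the pixel ray from the optical axis, per image axis.
    ax = math.radians((px / img_w - 0.5) * hfov_deg)
    ay = math.radians((py / img_h - 0.5) * vfov_deg)
    # Ground-plane offsets in the camera frame.
    dx = altitude_m * math.tan(ax)
    dy = -altitude_m * math.tan(ay)  # image y grows downward
    # Rotate by platform heading into east/north coordinates.
    h = math.radians(heading_deg)
    east = uav_east + dx * math.cos(h) + dy * math.sin(h)
    north = uav_north - dx * math.sin(h) + dy * math.cos(h)
    return east, north
```

In practice the pixels used would be those highlighted by the class activation map, and a full implementation would use the camera's calibrated intrinsics and attitude rather than this idealized model.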

Method and system for external fish parasite monitoring in aquaculture

A method for external fish parasite monitoring in aquaculture, comprising the steps of: submerging a camera (52) in a sea pen containing fish, the camera having a field of view; capturing images of the fish with the camera (52); and identifying external fish parasites on the fish by analyzing the captured images, characterized in that a target region within the field of view of the camera (52) is illuminated from above and below with light of different intensities and/or spectral compositions.

METHOD OF REMOVING UNDERWATER BUBBLES AND A DEVICE THEREFOR

There is disclosed a method of removing underwater gas bubbles from underwater devices such as cameras, optical lenses, or domes comprising acquiring a camera image or a sequence of images, applying a computer vision algorithm for detecting bubbles that stick to underwater cameras, optical lenses or domes and vibrating a motor for a time period sufficient to remove the majority of the bubbles, and optionally repeating vibrating the motor until complete removal or dismissal of bubbles is achieved.

There is further disclosed a method of moving a moving element with respect to an underwater surface associated with an underwater acquisition device, wherein a motion of the moving element enables at least part of the moving element to interact with at least one of the surface or the underwater gas bubbles, thereby dismissing or removing underwater gas bubbles from at least part of the surface.

Corresponding systems are also disclosed.
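The detect-vibrate-recheck procedure in this abstract reduces to a short control loop. The sketch below stands in for the vision algorithm and motor with injected callables, since the patent does not specify either; thresholds and names are illustrative:

```python
def clear_bubbles(detect_bubbles, vibrate, max_attempts=5,
                  vibrate_seconds=2.0, clear_threshold=0.01):
    """Detect-vibrate-recheck loop over an underwater lens or dome.

    detect_bubbles() -> float   fraction of the surface covered by
                                bubbles (stand-in for the computer
                                vision detector)
    vibrate(seconds)            runs the motor for the given duration

    Returns True once bubble coverage falls below clear_threshold,
    False if it never does within max_attempts vibrations.
    """
    for _ in range(max_attempts):
        if detect_bubbles() < clear_threshold:
            return True
        vibrate(vibrate_seconds)
    return detect_bubbles() < clear_threshold
```

The optional repetition "until complete removal or dismissal" maps onto `max_attempts`; a deployed system might also escalate vibration duration or amplitude on each pass.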

FISH COUNTING SYSTEM, FISH COUNTING METHOD, AND PROGRAM

This fish counting system comprises: an image acquisition unit configured to acquire a plurality of images obtained by capturing, over time, images of a photographing area in which a fluid including a fish flows; an extraction unit configured to extract a fish in each image; and a counting unit configured to count the number of fish. The photographing area has a first area and a second area. The counting unit is configured to count the fish when the fish in the first area moves to the second area.
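The counting rule, count a fish when its track moves from the first area into the second, can be sketched directly. The example below assumes the two areas are split along one axis and that the extraction unit's per-image detections have already been linked into per-fish tracks; both assumptions are mine, not the patent's:

```python
def count_fish(tracks, boundary_x):
    """Count each fish once when its track crosses from the first
    area (x < boundary_x) into the second area (x >= boundary_x).

    tracks: dict mapping a track id to a time-ordered list of x
    positions for that fish.
    """
    count = 0
    for xs in tracks.values():
        was_first = xs[0] < boundary_x
        for x in xs[1:]:
            if was_first and x >= boundary_x:
                count += 1  # crossed area 1 -> area 2
                break       # count this fish only once
            was_first = x < boundary_x
    return count
```

Requiring presence in the first area before counting in the second is what makes the scheme robust to fish lingering on, or jittering around, the boundary.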

FISH BIOMASS, SHAPE, SIZE, OR HEALTH DETERMINATION

Methods, systems, and apparatuses, including computer programs encoded on a computer-readable storage medium for estimating the shape, size, mass, and health of fish are described. A pair of stereo cameras may be utilized to obtain off-axis images of fish in a defined area. The images may be processed, enhanced, and combined. Object detection may be used to detect and track a fish in images. A pose estimator may be used to determine key points and features of the detected fish. Based on the key points, a model of the fish is generated that provides an estimate of the size and shape of the fish. A regression model or neural network model can be applied to the fish model to determine characteristics of the fish.
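The final steps, key points to a size estimate to a mass estimate via a regression model, can be illustrated with the standard allometric length-weight relationship W = a·L^b used in fisheries. The key-point names and the coefficient values below are placeholders, not from the patent:

```python
import math

def fork_length_m(keypoints):
    """Euclidean distance between two pose-estimator key points.

    keypoints: dict of 3D positions in metres from the stereo rig,
    e.g. {"snout": (x, y, z), "fork": (x, y, z)} (names hypothetical).
    """
    return math.dist(keypoints["snout"], keypoints["fork"])

def estimate_mass_kg(length_m, a=11.5, b=3.0):
    """Allometric length-weight regression W = a * L**b.

    a and b are species-specific coefficients fit offline; the
    defaults here are illustrative only.
    """
    return a * length_m ** b
```

A neural-network regressor, as the abstract also allows, would simply replace `estimate_mass_kg` with a learned model taking the full key-point set as input.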

Estimating fish size, population density, species distribution and biomass

A computerized system for performing fish censuses, a task that otherwise requires a high level of domain knowledge and expertise, is described. Divers with minimal knowledge of fish can obtain high-quality population and species distribution measurements using a stereo camera rig and the fish video analyzer software that was developed. The system has two major components: a camera rig and software for fish size, density, and biomass estimation. The camera rig consists of a simple stand on which one to four pairs of stereo cameras are mounted to take videos of the benthic floor for a few minutes. The collected videos are uploaded to a server, which performs stereo analysis and image recognition. The software produces video clips containing estimates of fish size, density, and species biodiversity, and a log report with information about the individual fishes for further end-user analysis.
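The stereo analysis step behind the size estimates rests on two standard pinhole relations: depth from disparity in a rectified pair, Z = f·B/d, and real length from pixel extent at that depth, L = Z·Δx/f. A minimal sketch (the abstract does not give the rig's calibration, so the parameters are illustrative):

```python
def stereo_depth_m(disparity_px, focal_px, baseline_m):
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def fish_length_m(x1_px, x2_px, depth_m, focal_px):
    """Real length of a fish spanning pixels x1..x2 at known depth,
    via the pinhole relation L = Z * pixels / f."""
    return depth_m * abs(x2_px - x1_px) / focal_px
```

Per-fish lengths computed this way, combined with the species labels from image recognition and species-specific length-weight coefficients, would yield the density and biomass figures reported in the log.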