AI-BASED AUTONOMOUS SYSTEM AND METHOD FOR PREDICTING AND REMOVING FLOATING WASTE FROM WATER BODIES

20260092421 · 2026-04-02


    Abstract

    An artificial intelligence (AI)-based autonomous system integrated with an autonomous water-cleaning vehicle that predicts and selectively removes floating waste by accurately analyzing algae type, volume, and vitality, thereby ensuring targeted removal to maintain ecological balance and enhance water quality in water bodies. The AI-based autonomous system comprises a computing device, an autonomous water-cleaning vehicle, a network, a server, a database, and a user device. The autonomous water-cleaning vehicle comprises a floating body, a plurality of propelling wheels, one or more motors, a primary supporter, a secondary supporter, a first extended column, a second extended column, a supporting frame, a driving unit, a first roller, a second roller, an inclined conveyor belt, a counterweight, a container, an AI imaging unit, a water-salinity sensor, an optical fluorescence-sensing unit, a navigation unit, and a controller.

    Claims

    1. An artificial intelligence (AI)-based autonomous system for predicting and selectively removing floating waste from water bodies, comprising: a computing device having a controller and a non-transitory memory for storing instructions that are executable by the controller, wherein the computing device is disposed on a floating body of an autonomous water-cleaning vehicle, and is configured to automatically maneuver the autonomous water-cleaning vehicle across a water surface for performing a cleaning operation to collect the floating waste that includes organic waste and recyclable waste, wherein the computing device is configured to execute a selective cleaning operation by monitoring and analyzing types of the floating waste and parameters of the organic waste, wherein the parameters include type, volume, and vitality of the organic waste, and wherein the selective-removal process enables retention of ecologically beneficial organic waste and facilitates removal of harmful or excessive organic waste from the water surface that negatively impacts water quality, wherein the controller is configured to: receive image data generated by an artificial intelligence (AI) imaging unit, water-salinity levels obtained from a water-salinity sensor, fluorescence-decay lifetime characteristics, temporal-attenuation profiles, and ranging data of the organic waste produced by an optical fluorescence-sensing unit, wherein the AI imaging unit and the optical fluorescence-sensing unit are configured to be disposed on a support frame affixed to the floating body, and wherein the water-salinity sensor is affixed to a front-bottom region of the floating body to enable submerged environmental sampling; generate a unified, time-synchronized, multi-parameter dataset by performing normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data, thereby producing a reduced-dimensional computational 
representation that eliminates redundant data and compresses salient feature vectors into a compact form suitable for real-time inference and control execution on the controller, wherein the normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data reduces inference latency and improves selective-removal accuracy; classify the organic waste into ecologically beneficial and ecologically harmful types using the AI imaging unit, wherein the AI imaging unit is pre-programmed with a dataset of images categorizing the organic waste into ecologically beneficial and ecologically harmful types, enabling precise identification and classification of the organic waste on the water surface; determine a vitality level of the organic waste by analyzing the fluorescence-decay lifetime characteristics and the temporal-attenuation profiles using a machine learning model, wherein the controller is pre-programmed with fluorescence decay pattern data for determining organic waste vitality; compute the volume of the organic waste using the fluorescence intensity, the ranging data, and data processing techniques to generate a three-dimensional biomass model, thereby enabling structured volumetric estimation and optimized memory usage during point-cloud reconstruction; compare the types, the vitality level, and the volume of the organic waste with salinity-dependent ecological threshold values stored in the non-transitory memory to generate an organic-waste removal decision using a decision model, wherein dynamic ecological thresholding reduces unnecessary removal of beneficial organic waste, and improves selective-removal accuracy compared to fixed-threshold systems; activate a conveyor unit of the autonomous water-cleaning vehicle when the removal decision indicates harmful, dead, or excessive organic waste by the decision model, and inhibit activation when the organic waste is determined to be 
beneficial and below threshold levels, thereby reducing mechanical load and decreasing energy consumption; and command a navigation unit to maneuver the autonomous water-cleaning vehicle toward or away from detected organic-waste regions based on the removal decision, thereby increasing propulsion efficiency and reducing energy consumption through optimized path-planning, wherein the computing device is configured to selectively activate the conveyor unit to remove the harmful or excessive organic waste as determined by the organic-waste removal decision from the water surface, thereby producing a tangible modification to the water body.

    2. The AI-based autonomous system of claim 1, wherein the AI imaging unit is rotatable through 360 degrees, and wherein the AI imaging unit comprises a waterproof and corrosion-resistant enclosure configured for outdoor aquatic operation.

    3. The AI-based autonomous system of claim 1, wherein the organic waste comprises algae, and wherein the dataset of images comprises images of green algae, diatoms, red algae, blue-green algae, dinoflagellates, and golden algae.

    4. The AI-based autonomous system of claim 1, wherein the optical fluorescence-sensing unit comprises a laser source emitting at 450-532 nanometers, a beam steering unit, a fluorescence detector, a spectral filter, and a timing and ranging module.

    5. The AI-based autonomous system of claim 1, wherein the water-salinity sensor is a toroidal conductivity salinity meter partially immersed in water, and wherein the controller is an NVIDIA Jetson Orin Nano 8 GB module disposed within the floating body.

    6. The AI-based autonomous system of claim 1, wherein the machine learning model comprises at least one of support vector machine (SVM), k-means clustering, DBSCAN clustering, or long short-term memory (LSTM) networks.

    7. The AI-based autonomous system of claim 1, wherein the data processing techniques comprise at least one of regression analysis, point cloud processing, or a Kalman filter model.

    8. The AI-based autonomous system of claim 1, wherein the decision model comprises at least one of a fuzzy logic model and a random forest classifier model.

    9. The AI-based autonomous system of claim 1, wherein the temporal-attenuation profiles are fluorescence decay patterns with decay times between 650 and 1100 nanoseconds for live algae and under 500 nanoseconds for dead algae.

    10. The AI-based autonomous system of claim 1, wherein the point-cloud reconstruction techniques utilize fluorescence intensity and LiDAR ranging data to generate three-dimensional biomass estimates.

    11. The AI-based autonomous system of claim 1, wherein the organic-waste removal decision is generated by comparing the computed volume against salinity-dependent threshold values stored in the non-transitory memory.

    12. The AI-based autonomous system of claim 1, wherein the navigation unit comprises a global positioning system (GPS) and a global navigation satellite system (GNSS) for real-time geolocation tracking, an inertial measurement unit (IMU) with accelerometers and gyroscopes for precise orientation and motion control, and an obstacle detection and avoidance system that utilizes sonar, LiDAR (Light Detection and Ranging), and AI-based path-planning model.

    13. The AI-based autonomous system of claim 1, wherein the computing device is configured to communicate with a user device via a network through IoT communication modules.

    14. The AI-based autonomous system of claim 1, wherein the AI-based autonomous system is configured to enable a user to manually operate the autonomous water-cleaning vehicle through the user device via a multi-platform application.

    15. The AI-based autonomous system of claim 1, wherein the salinity-dependent ecological threshold values define maximum beneficial algae volumes of 10-30 g/m³ for salinity below 0.5 ppt, 30-50 g/m³ for salinity 0.5-5 ppt, 50-100 g/m³ for salinity 5-18 ppt, 100-200 g/m³ for salinity 18-30 ppt, 200-500 g/m³ for salinity 30-35 ppt, and over 500 g/m³ for salinity above 35 ppt.

    16. A method for predicting and selectively removing floating waste from water bodies using an artificial intelligence (AI)-based autonomous system, comprising: receiving, by a controller, image data generated by an artificial intelligence (AI) imaging unit, water-salinity levels obtained from a water-salinity sensor, fluorescence-decay lifetime characteristics, temporal-attenuation profiles, and ranging data of organic waste produced by an optical fluorescence-sensing unit; generating, by the controller, a unified, time-synchronized, multi-parameter dataset by performing normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data; classifying, by the controller, organic waste into ecologically beneficial and ecologically harmful types using the AI imaging unit; determining, by the controller, a vitality level of the organic waste based on the fluorescence-decay lifetime characteristics and the temporal-attenuation profiles using a machine learning model, wherein the organic waste comprises algae, wherein the temporal-attenuation profiles are fluorescence decay patterns with decay times between 650 and 1100 nanoseconds, and wherein the controller is pre-programmed with fluorescence decay pattern data for determining organic waste vitality; computing, by the controller, the volume of the organic waste using the fluorescence intensity, the ranging data, and point-cloud reconstruction techniques to generate a three-dimensional biomass model, thereby enabling structured volumetric estimation and optimized memory usage during point-cloud reconstruction; comparing, by the controller, the types, the vitality level, and the volume of the organic waste with salinity-dependent ecological threshold values stored in a non-transitory memory to generate an organic-waste removal decision using a decision model, wherein the organic-waste removal decision is generated by comparing the computed 
volume against salinity-dependent threshold values stored in the non-transitory memory; activating, by the controller, a conveyor unit of the autonomous water-cleaning vehicle when the removal decision indicates harmful, dead, or excessive organic waste by the decision model, and inhibiting activation when the organic waste is determined to be beneficial and below threshold levels, thereby reducing mechanical load and decreasing energy consumption; commanding, by the controller, a navigation unit to maneuver the autonomous water-cleaning vehicle toward or away from detected organic-waste regions based on the removal decision, thereby increasing propulsion efficiency and reducing energy consumption through optimized path-planning; and enabling, by a computing device, selective activation of the conveyor unit to remove the harmful or excessive organic waste as determined by the organic-waste removal decision from the water surface, thereby producing a tangible modification to the water body.

    17. The method of claim 16, wherein the AI imaging unit is pre-programmed with a dataset of images comprising images of green algae, diatoms, red algae, blue-green algae, dinoflagellates, and golden algae.

    18. The method of claim 16, wherein the machine learning model comprises support vector machine (SVM), k-means clustering, DBSCAN clustering, or long short-term memory (LSTM) networks.

    19. The method of claim 16, wherein the point-cloud reconstruction techniques comprise regression analysis, point cloud processing, or a Kalman filter model, wherein the decision model comprises fuzzy logic and random forest classifier models.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0041] The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate an embodiment of the invention, and, together with the description, explain the principles of the invention.

    [0042] FIG. 1A illustrates a block diagram of an artificial intelligence (AI)-based autonomous system for predicting and selectively removing floating waste from water bodies, in accordance with embodiments of the invention.

    [0043] FIG. 1B illustrates a block diagram of the optical fluorescence-sensing unit of the autonomous water-cleaning vehicle, in accordance with embodiments of the invention.

    [0044] FIG. 2A illustrates an isometric view of the autonomous water-cleaning vehicle, in accordance with embodiments of the invention.

    [0045] FIG. 2B illustrates a bottom view of the autonomous water-cleaning vehicle, in accordance with embodiments of the invention.

    [0046] FIG. 3 illustrates a schematic view of the autonomous water-cleaning vehicle while predicting and removing recyclable waste in water bodies, in accordance with embodiments of the invention.

    [0047] FIG. 4 illustrates a schematic view of the autonomous water-cleaning vehicle while removing the harmful or excessive organic waste as determined by the organic-waste removal decision from the water surface, in accordance with embodiments of the invention.

    [0048] FIG. 5 illustrates a flow chart of a method for predicting and selectively removing floating waste from water bodies using the artificial intelligence (AI)-based autonomous system, in accordance with embodiments of the invention.

    [0049] FIG. 6 illustrates a flow chart of a method for predicting and selectively removing floating waste from water bodies using the artificial intelligence (AI)-based autonomous system, in accordance with embodiments of the invention.

    DETAILED DESCRIPTION

    [0050] Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numerals are used in the drawings and the description to refer to the same or like parts.

    [0051] FIG. 1A refers to a block diagram of an artificial intelligence (AI)-based autonomous system 100 for predicting and removing floating waste 10 in water bodies. In one embodiment herein, the artificial intelligence (AI)-based autonomous system 100 integrated with an autonomous water-cleaning vehicle 128 predicts and selectively removes floating waste 10 by accurately analyzing type, volume, and vitality, thereby ensuring targeted removal to maintain ecological balance and enhance water quality in water bodies. In one embodiment herein, the AI-based autonomous system 100 comprises a computing device 170, an autonomous water-cleaning vehicle 128, a network 120, a server 122, a database 124, and a user device 125.

    [0052] In one embodiment herein, the computing device 170 comprises a controller 172 and a non-transitory memory 176 for storing instructions executable by the controller 172. The computing device 170 is disposed on a floating body 130 of the autonomous water-cleaning vehicle 128 and functions as a control unit that controls the operation of the vehicle. The computing device 170 is configured to automatically maneuver the autonomous water-cleaning vehicle 128 across a water surface to perform a cleaning operation to collect floating waste 10 comprising organic waste 12 and recyclable waste 14, as shown in FIG. 4. The computing device 170 is configured to execute a selective cleaning operation by monitoring and analyzing types of the floating waste and parameters of the organic waste. The parameters include type, volume, and vitality of the organic waste 12. The selective-removal process enables retention of ecologically beneficial organic waste and facilitates removal of harmful or excessive organic waste from the water surface, thereby improving water quality. The organic waste 12 can be, but is not limited to, algae.

    [0053] In one embodiment herein, the controller 172 is configured to receive image data generated by an artificial intelligence (AI) imaging unit 160, water-salinity levels obtained from a water-salinity sensor 162, fluorescence-decay lifetime characteristics, temporal-attenuation profiles, and ranging data produced by an optical fluorescence-sensing unit 164.

    [0054] The AI imaging unit 160 and the optical fluorescence-sensing unit 164 are disposed on a support frame 140 affixed to the floating body 130, and the water-salinity sensor 162 is affixed to a front-bottom region of the floating body 130 to enable submerged environmental sampling. In one embodiment herein, the water-salinity sensor 162 is a toroidal conductivity salinity meter partially immersed in water.

    [0055] In one embodiment herein, the AI imaging unit 160 is rotatable through 360 degrees. The AI imaging unit 160 is pre-programmed with a dataset of images categorizing the organic waste 12 into ecologically beneficial and ecologically harmful types, enabling precise identification and classification of the organic waste on the water surface. In one embodiment, the dataset of images can be, but is not limited to, images of green algae (Chlorophyta), diatoms (Bacillariophyta), red algae (Rhodophyta), blue-green algae (Cyanobacteria), dinoflagellates (Dinophyta), and golden algae (Chrysophyta). The AI imaging unit 160 comprises a waterproof and corrosion-resistant enclosure configured for outdoor aquatic operation.

    [0056] As used herein, the term type of the organic waste refers to the categorical classification output generated by the AI imaging unit based on image-feature extraction and comparison against a pre-programmed dataset comprising labeled species groups including Chlorophyta, Bacillariophyta, Rhodophyta, Cyanobacteria, Dinophyta, and Chrysophyta.

    [0057] In one embodiment herein, the optical fluorescence-sensing unit 164 comprises a laser source 164A emitting at 450-532 nanometers, a beam steering unit 164B, a fluorescence detector 164C, a spectral filter 164D, and a timing and ranging module 164E, as shown in FIG. 1B, to improve accuracy and efficiency in detecting contaminants.

    [0058] In one embodiment herein, the controller 172 is an NVIDIA Jetson Orin Nano 8 GB module housed within the floating body 130. The controller 172 is configured to generate a unified, time-synchronized, multi-parameter dataset by performing normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data, thereby producing a reduced-dimensional computational representation that eliminates redundant data and compresses salient feature vectors into a compact form suitable for real-time inference and control execution on the controller 172. The normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data reduces inference latency and improves selective-removal accuracy.
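    The normalization-and-synchronization step described above can be sketched in a few lines of Python. The resampling-to-a-common-tick strategy, the min-max scaling, and all function names here are illustrative assumptions, not the disclosed implementation; each sensor stream is modeled as a list of (timestamp, value) pairs.

```python
from bisect import bisect_left

def nearest(samples, t):
    """Return the value of the sample whose timestamp is closest to t.

    `samples` is a list of (timestamp, value) pairs sorted by timestamp.
    """
    times = [s[0] for s in samples]
    i = bisect_left(times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(samples)]
    return min((abs(samples[j][0] - t), samples[j][1]) for j in candidates)[1]

def normalize(values):
    """Min-max scale a list of readings to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def fuse(image_feats, salinity, decay, ranging, tick_times):
    """Build a time-synchronized, per-tick feature matrix from
    asynchronously sampled sensor streams, then normalize each channel."""
    rows = [[nearest(image_feats, t),
             nearest(salinity, t),
             nearest(decay, t),
             nearest(ranging, t)] for t in tick_times]
    # Normalize column-wise so every channel shares a common scale.
    cols = list(zip(*rows))
    normed = [normalize(list(c)) for c in cols]
    return [list(r) for r in zip(*normed)]
```

    A dimensionality-reduction stage (e.g. feature compression) would follow this fusion step in practice; it is omitted here for brevity.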

    [0059] In one embodiment herein, the controller 172 is configured to classify the organic waste 12 into ecologically beneficial and ecologically harmful types using the AI imaging unit 160. The controller 172 is configured to determine a vitality level of the organic waste by analyzing the fluorescence-decay lifetime characteristics and the temporal-attenuation profiles using a machine learning model, wherein the controller 172 is pre-programmed with fluorescence decay pattern data for determining organic waste vitality. The machine learning model can be, but is not limited to, support vector machine (SVM), k-means clustering, DBSCAN clustering, or long short-term memory (LSTM) networks.

    [0060] In one embodiment herein, the temporal-attenuation profiles are fluorescence decay patterns with decay times between 650-1100 nanoseconds for live algae and under 500 nanoseconds for dead algae. The term vitality refers to the metabolic activity state of the organic waste, quantified by analyzing fluorescence-decay lifetime characteristics and temporal-attenuation profiles obtained from the optical fluorescence-sensing unit 164, wherein vitality is categorized as live when decay-time T lies within the 650-1100 nanosecond range and categorized as dead when decay-time T falls below 500 nanoseconds, with intermediate values classified using the trained machine-learning vitality-classification model.
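    The decay-time bands above translate directly into a simple vitality classifier. This Python sketch is a stand-in for the trained machine-learning vitality-classification model: the function name and the treatment of intermediate decay times (delegating to an injected callable) are assumptions for illustration.

```python
LIVE_RANGE_NS = (650.0, 1100.0)  # live-algae decay-time band from the disclosure
DEAD_MAX_NS = 500.0              # decay times below this indicate dead algae

def classify_vitality(decay_ns, intermediate_model=None):
    """Categorize vitality from a fluorescence decay time in nanoseconds.

    Decay times within LIVE_RANGE_NS map to "live" and times below
    DEAD_MAX_NS map to "dead"; intermediate values are delegated to a
    trained classifier when one is supplied, else flagged.
    """
    if LIVE_RANGE_NS[0] <= decay_ns <= LIVE_RANGE_NS[1]:
        return "live"
    if decay_ns < DEAD_MAX_NS:
        return "dead"
    if intermediate_model is not None:
        return intermediate_model(decay_ns)
    return "indeterminate"
```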

    [0061] In one embodiment herein, the controller 172 is configured to compute the volume of the organic waste 12 using the fluorescence intensity, the ranging data, and data processing techniques to generate a three-dimensional biomass model, thereby enabling structured volumetric estimation and optimized memory usage during point-cloud reconstruction. The data processing techniques can include, but are not limited to, regression analysis, point cloud processing, and a Kalman filter model. The point-cloud reconstruction techniques utilize fluorescence intensity and LiDAR ranging data to generate three-dimensional biomass estimates.
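    The voxel-based integration used for volumetric estimation can be illustrated as follows: points recovered from the fluorescence-LiDAR returns are binned into a voxel grid, and the occupied-voxel count times the voxel volume approximates the biomass volume. The point format, voxel size, and function name are assumptions for illustration.

```python
def voxel_volume(points, voxel_size=0.1):
    """Estimate biomass volume (m³) from a 3-D point cloud.

    `points` is an iterable of (x, y, z) tuples in metres, e.g. derived
    from fluorescence-LiDAR range returns. Each point is mapped to the
    voxel cell containing it; the volume is the number of distinct
    occupied voxels times the per-voxel volume.
    """
    occupied = {
        (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        for x, y, z in points
    }
    return len(occupied) * voxel_size ** 3
```

    Surface fitting and clustering, also mentioned above, would refine this raw occupancy estimate; counting voxels keeps memory usage bounded regardless of point density.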

    [0062] In one embodiment herein, the controller 172 is configured to compare the types, the vitality level, and the volume of the organic waste 12 with salinity-dependent ecological threshold values stored in the non-transitory memory 176 to generate an organic-waste removal decision using a decision model. The dynamic ecological thresholding reduces unnecessary removal of beneficial organic waste and improves selective-removal accuracy compared to fixed-threshold systems. The decision model can be, but is not limited to, a fuzzy logic model or a random forest classifier model. The organic-waste removal decision is generated by comparing the computed volume against salinity-dependent threshold values stored in the non-transitory memory 176. The term volume refers to the three-dimensional biomass magnitude computed by the controller 172 using fluorescence-LiDAR range measurements and fluorescence-intensity values to generate a point-cloud representation of the detected organic waste, wherein the controller 172 calculates the biomass volume by performing point-cloud clustering, surface fitting, and voxel-based integration.

    [0063] In one embodiment herein, the salinity-dependent ecological threshold values define maximum beneficial organic waste volumes of 10-30 g/m³ for salinity below 0.5 ppt, 30-50 g/m³ for salinity 0.5-5 ppt, 50-100 g/m³ for salinity 5-18 ppt, 100-200 g/m³ for salinity 18-30 ppt, 200-500 g/m³ for salinity 30-35 ppt, and over 500 g/m³ for salinity above 35 ppt.
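    The salinity bands above map naturally onto a lookup table. In this sketch the upper figure of each disclosed range is taken as the removal threshold, the bands are treated as lower-inclusive, and the open-ended top band is represented as unbounded; all three choices are assumptions, as is the function name.

```python
# (salinity upper bound in ppt, maximum beneficial algae volume in g/m³),
# transcribed from the disclosed salinity bands; taking the upper figure of
# each range as the limit is an illustrative assumption.
SALINITY_BANDS = [
    (0.5, 30.0),
    (5.0, 50.0),
    (18.0, 100.0),
    (30.0, 200.0),
    (35.0, 500.0),
]

def volume_threshold(salinity_ppt):
    """Return the maximum beneficial algae volume (g/m³) for a salinity."""
    for upper_ppt, limit in SALINITY_BANDS:
        if salinity_ppt < upper_ppt:
            return limit
    # Above 35 ppt the disclosure gives "over 500 g/m³"; treated here as
    # effectively unbounded.
    return float("inf")
```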

    [0064] In one embodiment herein, the controller 172 is configured to activate a conveyor unit 151 of the autonomous water-cleaning vehicle 128 when the removal decision indicates harmful, dead, or excessive organic waste by the decision model, and inhibit activation when the organic waste is determined to be beneficial and below threshold levels, thereby reducing mechanical load and decreasing energy consumption. The conveyor unit 151 comprises an inclined conveyor belt 152 with drainage perforations 154, a first roller 148, and a second roller 150, as shown in FIG. 2A. The first roller 148 is connected to a driving unit 142. The second roller 150 is positioned at a submerged front end of the floating body 130.
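    The activate/inhibit rule just described reduces to a short predicate. This is a rule-based stand-in for the disclosed fuzzy-logic or random-forest decision model: the string labels, function names, and the precomputed threshold parameter are illustrative assumptions.

```python
def removal_decision(waste_type, vitality, volume_g_m3, threshold_g_m3):
    """Return True when the conveyor should run.

    Harmful or dead organic waste is always removed; beneficial, live
    waste is removed only when its volume exceeds the salinity-dependent
    threshold supplied by the caller.
    """
    if waste_type == "harmful":
        return True
    if vitality == "dead":
        return True
    return volume_g_m3 > threshold_g_m3

def conveyor_command(decision):
    """Map the removal decision to a conveyor actuation command."""
    return "ACTIVATE" if decision else "INHIBIT"
```

    Keeping the conveyor inhibited for beneficial, below-threshold waste is what yields the reduced mechanical load and energy savings noted above.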

    [0065] In one embodiment herein, the controller 172 is configured to command a navigation unit 168 to maneuver the autonomous water-cleaning vehicle 128 toward or away from detected organic-waste regions based on the removal decision, thereby increasing propulsion efficiency and reducing energy consumption through optimized path-planning. The navigation unit 168 can include, but is not limited to, a global positioning system (GPS) and a global navigation satellite system (GNSS) for real-time geolocation tracking, an inertial measurement unit (IMU) with accelerometers and gyroscopes for precise orientation and motion control, and an obstacle detection and avoidance system that utilizes sonar, LiDAR, and an AI-based path-planning model.
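    The toward-or-away maneuver can be illustrated with a planar heading computation: steer at the waste region when removal is required, and directly away from it otherwise. The east-referenced degree convention and function name are assumptions; a real path planner would also account for obstacles and current.

```python
import math

def navigation_heading(vehicle_xy, region_xy, remove):
    """Return a heading in degrees (0° = +x axis, counter-clockwise)
    pointing toward the waste region when `remove` is True, or directly
    away from it when False."""
    dx = region_xy[0] - vehicle_xy[0]
    dy = region_xy[1] - vehicle_xy[1]
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return heading if remove else (heading + 180.0) % 360.0
```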

    [0066] In one embodiment herein, the computing device 170 is configured to selectively activate the conveyor unit 151 to remove harmful or excessive organic waste from the water surface, as determined by the organic-waste removal decision, thereby producing a tangible modification to the water body.

    [0067] In one embodiment herein, the computing device 170 is configured to communicate with a user device 125 via a network 120 through IoT communication modules 174. The AI-based autonomous system 100 is configured to enable a user to manually operate the autonomous water-cleaning vehicle 128 through the user device 125 via a multi-platform application.

    [0068] In one embodiment herein, the user device 125 is configured to enable a user to remotely monitor and control the autonomous water-cleaning vehicle 128. The user device 125 can be, but is not limited to, a smartphone, a laptop, a tablet, or any suitable electronic device capable of wireless communication.

    [0069] In one embodiment herein, the user device 125 comprises a user interface 126, which is configured to allow the user to input commands, view system status, and operate the autonomous water-cleaning vehicle 128 when manual control is required. As shown in FIG. 1A, the user interface 126 presents buttons for Start, Stop, and directional controls for forward, backward, left, and right movement, enabling direct manual maneuvering of the autonomous water-cleaning vehicle 128. The user interface 126 can be, but is not limited to, a display unit. The user interface 126 further displays visuals from the AI imaging unit 160, system alerts, and organic waste removal decisions generated by the controller 172.

    [0070] In one embodiment herein, the computing device 170 is configured to establish a persistent bidirectional communication link with at least one remote user device 125 via the network 120 utilizing the IoT communication modules 174. This link facilitates the real-time streaming of telemetry data from the autonomous water-cleaning vehicle 128 to the user device 125 and the concurrent transmission of operational commands from the user device 125 back to the vehicle 128.

    [0071] In one embodiment herein, the AI-based autonomous system 100 includes a multi-platform control application comprising executable instructions stored on the user device 125.

    [0072] The application provides a dual-mode operational interface that integrates manual vehicle control with the autonomous AI decision-making framework. The interface mode is selectively switchable by the user.

    [0073] In one embodiment herein, the multi-platform application operates in a first autonomous monitoring mode. In this mode, the application renders a graphical interface presenting a synthesized data overlay on a live video feed from the AI imaging unit 160. This overlay visually distinguishes regions of ecologically beneficial organic waste from ecologically harmful organic waste as classified by the controller 172, and further displays the projected navigation path of the autonomous water-cleaning vehicle 128.

    [0074] In one embodiment herein, the multi-platform application operates in a second manual override mode. In this mode, the application renders a direct teleoperation interface, wherein user input via the user device 125 generates real-time command signals for the propulsion and steering mechanisms of the autonomous water-cleaning vehicle 128. The manual override mode enables direct user control of the conveyor unit 151 activation independent of the organic-waste removal decision generated by the controller 172.

    [0075] In one embodiment herein, the direct teleoperation interface presented by the multi-platform application comprises a dynamic touch-based steering control element superimposed on the live video feed. The interface further includes at least one dedicated virtual actuator for conveyor unit 151 control and an interactive map element allowing the user to define a navigation waypoint by selecting a location on the displayed video feed, whereby the controller 172 commands the navigation unit 168 to maneuver the vehicle to the user-selected location.

    [0076] In one embodiment herein, the controller 172 is configured to generate a priority alert signal upon detecting a predefined operational condition, such as an obstacle detection event or a sensor fault indication. The IoT communication modules 174 transmit this alert signal to the user device 125, causing the multi-platform application to present a prominent visual and haptic notification and prompt the user to switch from the autonomous monitoring mode to the manual override mode.

    [0077] In one embodiment herein, the computing device 170 is configured to log a sequential record of all operational events, including periods of autonomous operation and instances of manual override intervention. This record chronologically associates sensor data, AI-generated decisions, user-initiated commands, and vehicle actuator responses, thereby creating an auditable trail for operational review and system performance analysis.
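    A minimal shape for the chronological operational record might look like the following append-only log; the class and field names, and the source-filtering helper, are illustrative assumptions rather than the disclosed implementation.

```python
import time

class OperationLog:
    """Append-only chronological record associating sensor data,
    AI-generated decisions, user commands, and actuator responses."""

    def __init__(self):
        self._events = []

    def record(self, source, kind, detail, timestamp=None):
        """Append one event; `source` might be 'sensor', 'controller',
        'user', or 'actuator'."""
        ts = time.time() if timestamp is None else timestamp
        self._events.append({"t": ts, "source": source,
                             "kind": kind, "detail": detail})

    def events(self, source=None):
        """Return events in chronological order, optionally filtered by
        source, for operational review and performance analysis."""
        sel = [e for e in self._events
               if source is None or e["source"] == source]
        return sorted(sel, key=lambda e: e["t"])
```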

    [0078] In one embodiment herein, the multi-platform application comprises an interface configured to allow the user to remotely monitor and control the autonomous water-cleaning vehicle 128. The multi-platform application is executed on various user devices 125, which include smartphones, laptops, tablets, and personal computers, providing flexibility for the user to engage with the AI-based autonomous system 100 from all locations. The application enables visualization of the AI imaging unit feed, system alerts, and the status of organic waste removal operations.

    [0079] The multi-platform application comprises functionality for manual operation of the autonomous water-cleaning vehicle 128, which include start and stop buttons and directional control for forward, backward, left, and right movement. Additionally, the application allows users to adjust operational parameters and configure cleaning schedules. Alerts related to system status, environmental changes, and detected organic waste conditions are displayed on the user interface 126, thereby ensuring efficient control and management of the AI-based autonomous system 100.

    [0080] In one embodiment herein, the network 120 can be, but is not limited to, a Local Area Network (LAN), a Cellular Network, a Wide Area Network (WAN), an Intranet, a Virtual Private Network (VPN), and wireless networks that use radio frequency (RF) and infrared (IR) technology to transmit data without physical cables, thereby providing mobility and flexibility. The versatility of the network 120 ensures that the computing device 170 and the user device 125 seamlessly connect to the server 122 and the database 124, thereby enabling the users to access functionalities of the AI-based autonomous system 100 and resources from a variety of locations and devices. This wireless connectivity enhances the overall accessibility and convenience of the AI-based autonomous system 100 for the users.

    [0081] In one embodiment herein, the controller 172 of the computing device 170 is configured to receive the image data, the water-salinity levels, the fluorescence-decay lifetime characteristics, and the ranging data, and to store the fluorescence-decay patterns in the database 124. The controller 172 of the computing device 170 is configured to normalize and synchronize the received data elements into an input dataset and store it in the database 124 for organic waste type classification, organic waste vitality analysis, and organic waste volume estimation.
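    The normalization and synchronization step can be sketched as below, assuming each sensor stream arrives as (timestamp, value) pairs that are interpolated onto a shared time base; the stream names and sample values are illustrative, not part of the disclosure:

```python
import numpy as np

def synchronize(streams, t_common):
    """Align asynchronous sensor streams onto a shared time base (sketch).

    streams: dict name -> (timestamps, values), each a 1-D array.
    Returns a dict name -> values linearly interpolated at t_common.
    """
    return {name: np.interp(t_common, t, v) for name, (t, v) in streams.items()}

def normalize(x):
    # Min-max scaling to [0, 1]; guards against a constant signal.
    span = x.max() - x.min()
    return (x - x.min()) / span if span > 0 else np.zeros_like(x)

# Hypothetical streams: image-derived algae coverage, salinity, fluorescence lifetime.
t_common = np.linspace(0.0, 10.0, 11)
streams = {
    "coverage": (np.array([0.0, 5.0, 10.0]), np.array([0.1, 0.5, 0.3])),
    "salinity_ppt": (np.array([0.0, 10.0]), np.array([3.0, 5.0])),
    "lifetime_ns": (np.array([0.0, 2.0, 10.0]), np.array([1.2, 1.6, 1.1])),
}
synced = synchronize(streams, t_common)
dataset = np.column_stack([normalize(v) for v in synced.values()])  # unified matrix
```

Each row of `dataset` is one synchronized sample across all three sensors, the reduced-dimensional input contemplated for classification, vitality, and volume analysis.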

    [0082] FIG. 1B refers to a block diagram of the optical fluorescence-sensing unit 164 of the autonomous water-cleaning vehicle 128. In one embodiment herein, the AI-based autonomous system 100 is configured to control and optimize the autonomous water-cleaning vehicle 128 for predicting and removing the floating waste 10 from water bodies. The floating waste 10 comprises the organic waste 12 and the recyclable waste 14. The organic waste 12 comprises algae. The AI-based autonomous system 100 integrates multiple computational, analytical, and decision-making functionalities to enable autonomous, efficient, and real-time water cleaning operations. In one embodiment herein, the optical fluorescence-sensing unit 164 is configured to utilize infrared (IR), multispectral, and hyperspectral imaging for remote algae detection and water composition mapping. The optical fluorescence-sensing unit 164 identifies spectral signatures associated with harmful cyanobacteria blooms and the floating waste 10, differentiating between non-toxic algae and hazardous blooms such as red tides. Additionally, the optical fluorescence-sensing unit 164 detects oil spills, industrial discharge, and chemical runoff from distant locations, providing a broader assessment of water pollution. By integrating with the AI imaging unit 160 and the water-salinity sensor 162, the optical fluorescence-sensing unit 164 enhances the ability of the AI-based autonomous system to predict contamination trends and optimize cleaning strategies.

    [0083] In one embodiment herein, the optical fluorescence-sensing unit 164 comprises a laser source 164A, a beam steering unit 164B, a fluorescence detector 164C, a spectral filter 164D, and a timing and ranging module 164E to improve accuracy and efficiency in detecting contaminants. The laser source 164A emits a controlled laser beam to illuminate water surfaces, enhancing visibility and detection precision. The beam steering unit 164B dynamically adjusts the laser direction to scan a wide area for pollutants. The fluorescence detector 164C captures and analyzes fluorescence signals from chlorophyll-a and other biological markers, allowing precise identification of harmful algae. The spectral filter 164D isolates specific wavelengths for targeted analysis, improving spectral resolution in detecting different types of contaminants. The timing and ranging module 164E is configured to measure the time delay and intensity of reflected signals, enabling precise distance calculations and 3D mapping of water conditions. The integration of these components enhances the ability of the optical fluorescence-sensing unit 164 to detect, classify, and assess contaminants with high accuracy.

    [0084] FIG. 2A refers to an isometric view of the autonomous water-cleaning vehicle 128, in accordance with one embodiment of the invention. FIG. 2B refers to a bottom view of the autonomous water-cleaning vehicle 128, in accordance with one embodiment of the invention. The autonomous water-cleaning vehicle 128 is configured to automatically maneuver across a water surface to perform a cleaning operation to collect the floating waste 10 comprising the organic waste 12 and the recyclable waste 14. The organic waste 12 comprises the algae. In one embodiment herein, the autonomous water-cleaning vehicle 128 comprises a floating body 130, a plurality of propelling wheels (132A, 132B, 132C, 132D), one or more motors (134A, 134B, 134C, 134D), a primary supporter 136A, a secondary supporter 136B, a first extended column 138A, a second extended column 138B, a supporting frame 140, the driving unit 142, the conveyor unit 151, a counterweight 156, a container 158, the AI imaging unit 160, the water-salinity sensor 162, the optical fluorescence-sensing unit 164, a primary power source 166A, a secondary power source 166B, the navigation unit 168, and the controller 172.

    [0085] In one embodiment herein, the floating body 130 is configured as a rectangular-shaped hollow structure with a buoyant nature. The floating body 130 enables the autonomous water-cleaning vehicle 128 to move and float on the water surface. The corners of the floating body 130 are configured with a curved shape, which assists the vehicle's propelling action on the water surface. In one embodiment herein, the plurality of propelling wheels (132A, 132B, 132C, 132D) is operatively connected to each corner of the floating body 130, and includes a first wheel 132A, a second wheel 132B, a third wheel 132C, and a fourth wheel 132D, as shown in FIG. 2B. The one or more motors (134A, 134B, 134C, 134D) include a first motor 134A, a second motor 134B, a third motor 134C, and a fourth motor 134D, as shown in FIG. 2B.

    [0086] In one embodiment herein, the first wheel 132A is operatively connected to the first motor 134A within the floating body 130. The second wheel 132B is operatively connected to the second motor 134B within the floating body 130. The third wheel 132C is operatively connected to the third motor 134C within the floating body 130. The fourth wheel 132D is operatively connected to the fourth motor 134D within the floating body 130. In one embodiment herein, the primary supporter 136A and the secondary supporter 136B are mounted on the floating body 130. The container 158 is inclinedly positioned between the primary supporter 136A and the secondary supporter 136B.

    [0087] In one embodiment herein, Table 1 depicts specifications of the one or more motors (134A, 134B, 134C, 134D).

    TABLE-US-00001 TABLE 1

    Parameter                    | Preferred Spec                                     | Acceptable Range            | Notes
    Motor type                   | 24 V BLDC + planetary gearbox (marine-sealed)      | 24 V brushed DC + planetary | BLDC = higher efficiency/longer life
    Continuous power (per motor) | 80-120 W                                           | 60-150 W                    | Sized for steady cruise with margin
    Peak power (3 s, per motor)  | 200-300 W                                          | 150-400 W                   | For quick turns, weed escape
    Gear ratio                   | 20:1-30:1                                          | 15:1-40:1                   | Puts output in 100-160 rpm zone
    Output speed (post-gear)     | 100-160 rpm                                        | 80-200 rpm                  | For paddle tip speed 0.8-1.2 m/s (R 100 mm)
    Output torque (continuous)   | 4-6 N·m                                            | 3-8 N·m                     | Gives usable thrust with slip losses
    Peak torque (3 s)            | 10-14 N·m                                          | 8-18 N·m                    | Short surge for maneuvering
    Supply/current (per motor)   | 24 V, 4-6 A cont.; 12-18 A peak                    | —                           | Size driver 25-30 A peak
    Encoder                      | Incremental 512-1024 CPR (A/B/Z)                   | 256-2048 CPR                | Closed-loop speed/heading hold
    Brake (optional)             | 24 V holding brake, 1 N·m                          | 0.5-2 N·m                   | Holds position against wind/current
    Shaft/mount                  | 10 mm keyed, face-mount 63 mm class                | 8-12 mm                     | Stainless shaft & hardware
    Bearings/seals               | Double-sealed bearings, shaft seal + slinger       | —                           | Keep spray out; grease ports if possible
    Protection (motor + gearbox) | IP67-IP68                                          | IP65-IP68                   | Splash/brief immersion ready
    Corrosion resistance         | Epoxy-coated housing, SS fasteners                 | —                           | Fresh/brackish duty
    Operating temp               | -10 C. to +60 C.                                   | -20 C. to +70 C.            | Outdoor range
    Motor controller             | Dual 24 V BLDC FOC, 30 A peak/channel, regen clamp | 20-40 A peak                | Current limit + stall detect
    Cable/connector              | Shielded 4-core + M12 (IP67)                       | —                           | Keep encoder lines twisted/shielded

    [0088] In one embodiment herein, the driving unit 142 is operatively positioned on the primary supporter 136A. The first roller 148 is rotatably positioned between the primary supporter 136A and the secondary supporter 136B. In one embodiment herein, one end of the first roller 148 is rotatably connected to the driving unit 142 to rotate in at least one direction upon activating the driving unit 142. In one embodiment herein, the front end of the floating body 130 is configured with the first extended column 138A and the second extended column 138B.

    [0089] In one embodiment herein, the conveyor unit 151 is mounted between the primary supporter 136A and the secondary supporter 136B at one end, and the other end of the conveyor unit 151 is mounted between the first extended column 138A and the second extended column 138B. In one embodiment herein, the conveyor unit 151 comprises the driving unit 142, a first roller 148, a second roller 150, and the inclined conveyor belt 152. In one embodiment herein, the second roller 150 is rotatably positioned between the first extended column 138A and the second extended column 138B. The first roller 148 is an active roller. The second roller 150 is a passive roller.

    [0090] In one embodiment herein, the inclined conveyor belt 152 is movably positioned around the first roller 148 and the second roller 150. The first roller 148 and the second roller 150 rotate in at least one direction upon activation of the driving unit 142 to drive the inclined conveyor belt 152 over the water surface for collecting the organic waste 12 and the recyclable waste 14. In one embodiment herein, the counterweight 156 is positioned at the rear end of the floating body 130 to counterbalance the weight of the inclined conveyor belt 152. The inclined conveyor belt 152 includes drainage perforations 154 that reduce the hydraulic load on the conveyor drive mechanism by draining water during waste collection, thereby decreasing energy consumption by approximately 15-30% compared to a non-perforated belt while maintaining collection efficiency of organic waste 12 and recyclable waste 14. In one embodiment herein, the container 158 is inclinedly positioned underneath the inclined conveyor belt 152 to receive and store the organic waste 12 and the recyclable waste 14. In one embodiment herein, Table 2 depicts specifications of the driving unit 142.

    TABLE-US-00002 TABLE 2

    Parameter    | Driving unit
    Type         | Geared DC Motor
    Voltage      | 12 V DC
    Power Rating | 100-150 W (low-speed torque)
    Torque       | Medium (for conveyor load)
    Speed        | 60-100 RPM

    [0091] In one embodiment herein, the supporting frame 140 is perpendicularly positioned on the floating body 130 and beneath the container 158. The AI imaging unit 160 is mounted on the supporting frame 140. The AI imaging unit 160 is a 360-degree rotatable camera designed to detect the floating waste 10 from all sides of the water surface. The AI imaging unit 160 is pre-programmed with a dataset of images representing the organic waste 12, classified into ecologically beneficial and ecologically harmful types, enabling precise identification and classification of the organic waste on the water surface. In one embodiment, the dataset of images comprises images of green algae (Chlorophyta), diatoms (Bacillariophyta), red algae (Rhodophyta), blue-green algae (Cyanobacteria), dinoflagellates (Dinophyta), and golden algae (Chrysophyta). The AI imaging unit 160 comprises a waterproof and corrosion-resistant enclosure configured for outdoor aquatic operation.
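    The dataset-driven classification can be illustrated with a minimal nearest-prototype sketch; the feature vectors, class prototypes, and the harmful/beneficial grouping below are hypothetical stand-ins for the trained imaging model, not values from the disclosure:

```python
import numpy as np

# Hypothetical prototype feature vectors (e.g., mean RGB signatures) per algae class.
PROTOTYPES = {
    "green_algae": np.array([60.0, 160.0, 70.0]),
    "blue_green_algae": np.array([40.0, 120.0, 130.0]),
    "red_algae": np.array([150.0, 60.0, 70.0]),
}
HARMFUL = {"blue_green_algae", "red_algae"}  # illustrative grouping only

def classify(feature):
    # Nearest-prototype classification as a stand-in for the trained model:
    # pick the class whose prototype is closest in feature space.
    name = min(PROTOTYPES, key=lambda k: np.linalg.norm(feature - PROTOTYPES[k]))
    return name, ("harmful" if name in HARMFUL else "beneficial")

label, eco = classify(np.array([45.0, 125.0, 125.0]))
```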

    [0092] In one embodiment herein, the water-salinity sensor 162 is integrated at the front end of the floating body 130 to continuously monitor the salinity levels of the water. The water-salinity sensor 162 is a toroidal conductivity salinity meter partially immersed in water. The water-salinity sensor 162 is affixed to the front-bottom region of the floating body 130 to enable submerged environmental sampling.

    [0093] In one embodiment herein, the optical fluorescence-sensing unit 164 is integrated on the supporting frame 140. The optical fluorescence-sensing unit 164 is a fluorescence LiDAR (Light Detection and Ranging) unit. The optical fluorescence-sensing unit 164 is configured to detect the algae volume and vitality on the water surface. In one embodiment herein, the optical fluorescence-sensing unit 164 comprises a laser source 164A, a beam steering unit 164B, a fluorescence detector 164C, a spectral filter 164D, and a timing and ranging module 164E. In one embodiment herein, the laser source 164A emits light at a specific wavelength (typically 450-532 nm) to excite the algae pigments (such as chlorophyll-a). The intensity and spread of the laser help in measuring algae volume over the water surface.

    [0094] In one embodiment herein, the beam steering unit 164B controls the angle and direction of the laser beam for accurate scanning over the water surface and facilitates non-contact measurement of algae distribution and depth, essential for comprehensive volume estimation.

    [0095] In one embodiment herein, the fluorescence detector 164C detects the fluorescent light emitted by algae after excitation. The fluorescence intensity directly correlates with algae concentration (volume). It differentiates live and dead algae by analyzing fluorescence patterns, as live algae exhibit stronger fluorescence due to active pigments.

    [0096] In one embodiment herein, the spectral filter 164D filters out background light (like sunlight) and reflected laser signals, allowing only fluorescent signals from algae to reach the detector to accurately detect the algae without interference. In one embodiment herein, the timing and ranging module 164E measures the time delay between the emitted laser pulse and the received fluorescent signal. Based on this data, it calculates the distance and depth profile of the algae layers and estimates the algae volume on the water surface. In one embodiment herein, Table 3 depicts specifications of the AI imaging unit 160, the water-salinity sensor 162, and the optical fluorescence-sensing unit 164.

    TABLE-US-00003 TABLE 3

    Salinity meter (interface & power: RS-485 (Modbus-RTU), 12-24 V DC, <60 mA)
    Parameter             | Target/Recommended spec                                     | Why/Notes
    Measurement principle | Toroidal/inductive conductivity with integrated temperature | Resists fouling; stable in debris-rich water
    Salinity/EC range     | 0-60 mS/cm (0-40 ppt salinity)                              | Covers freshwater to marine
    Accuracy/resolution   | 0.2 ppt (or 1% rdg), 0.01 ppt                               | For thresholding and trend logging
    Temp compensation     | Auto, 0-50 C. (NTC-10k or PT1000)                           | Corrects EC-to-ppt conversion
    Response time         | T90 2 s                                                     | Real-time control friendly
    Materials/rating      | PVDF/PEEK body, IP68, 0-3 bar                               | Durable, sub-surface mounting
    Calibration           | 1-3-point (field), cell constant stored                     | Easy maintenance
    Mounting depth        | 150-250 mm below surface with guard                         | Avoids surface film/bubbles
    Cable/EMC             | 5 m shielded, IP68 gland                                    | Noise immunity
    Algorithm             | PSS-78 (Practical Salinity Scale) output                    | Standardized ppt

    AI imaging unit (interface & power: MIPI CSI-2 (2-4 lanes) ribbon to SBC, 5 V, 1 W; SBC 5 V rail, 5 A total)
    Parameter       | Target/Recommended spec                                    | Why/Notes
    Sensor & format | 8-12 MP, 1/2.3-1/1.8 in. CMOS                              | Balance FOV + detail
    Shutter         | Global preferred (rolling acceptable with short exposure)  | Reduces motion artifacts on a moving craft
    Frame rate      | 30 fps @ full res; 60 fps binned                           | Smooth tracking
    Optics/FOV      | M12 or CS-mount, f/2.0-2.8, low distortion, HFOV 78-120 deg | Wide coverage
    HDR/low-light   | WDR 90 dB, SNR1 0.1 lux                                    | Dawn/dusk performance
    ISP features    | AE/AGC/AWB; de-noising; lens shading                       | Stable inference inputs
    Sync & time     | PPS/GPIO or PTP/NTP (preferred)                            | Aligns with LiDAR timestamps
    Protection      | Splash shroud, hydrophobic lens coating, IP65 front        | Outdoor reliability
    Edge inference  | MobileNetV3-Large INT8: 25-35 FPS on Jetson Orin Nano      | Meets real-time needs

    Fluorescence LiDAR (interface & power: GbE (UDP/PTP) to SBC, 24 V DC from 48-to-24 V DC-DC, 10-15 W)
    Parameter             | Target/Recommended spec                                     | Why/Notes
    Excitation wavelength | 520 nm green (alt: 450-532 nm)                              | Excites chlorophyll-a
    Laser safety          | IEC 60825-1 Class 1                                         | Eye-safe deck operation
    Pulse width/timing    | 5-20 ns pulses; 10 ns timing resolution; jitter 1 ns        | Lifetime/decay features
    Receiver band         | 650-750 nm (chlorophyll-a emission) with narrow band-pass   | Ambient light immunity
    Range/footprint       | 0.5-6 m effective over water; spot 20-50 mm at 3 m          | Bow-mounted scanning
    Scan/point rate       | FOV 30 x 15 deg, 5-20 Hz scan, 10k pts/s                    | Dense surface mapping
    Outputs               | Range, intensity, fluorescence decay/lifetime per point     | Volume + vitality (live/dead)
    Sunlight rejection    | Works to 100 klux with optical band-pass + temporal gating  | Midday operation
    IP/environment        | IP67, -10 to +50 C., <1.5 kg                                | Marine-ready
    Time sync             | IEEE-1588 PTP or PPS input                                  | Align with camera frames
    Compliance            | EMC/ESD                                                     | Reduces marine/industrial interference
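    The distance calculation performed by the timing and ranging module 164E follows directly from the round-trip time of flight; a minimal sketch (the `layer_volume` helper is an illustrative extension for thickness-based volume, not a disclosed algorithm):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_delay(delay_ns):
    """One-way distance from the round-trip delay between laser pulse and return."""
    return C * (delay_ns * 1e-9) / 2.0

def layer_volume(top_delay_ns, bottom_delay_ns, footprint_area_m2):
    # Illustrative helper: algae-layer thickness from the top and bottom
    # returns, multiplied by the illuminated footprint area.
    thickness = range_from_delay(bottom_delay_ns) - range_from_delay(top_delay_ns)
    return thickness * footprint_area_m2

d = range_from_delay(20.0)  # a 20 ns round trip corresponds to roughly 3 m
```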

    [0097] In one embodiment herein, the controller 172 is securely housed within the hollow section of the floating body 130. The controller 172 is operatively connected to the one or more motors (134A, 134B, 134C, 134D) to control the operation of the plurality of propelling wheels (132A, 132B, 132C, 132D). The controller 172 is operatively connected to the driving unit 142 to control the operation of the inclined conveyor belt 152 for collecting the organic waste 12 and the recyclable waste 14. The controller 172 is configured to communicate with the AI imaging unit 160, the water-salinity sensor 162, the optical fluorescence-sensing unit 164 and the navigation unit 168 for controlling operation of the AI-based autonomous system 100.

    [0098] In one embodiment herein, the controller 172 utilizes Regression Analysis, Point Cloud Processing, and Kalman Filter models to estimate algae volume by analyzing fluorescence intensity data collected by the LiDAR. Additionally, the controller 172 incorporates advanced models such as Support Vector Machine (SVM), Clustering models (e.g., K-Means, DBSCAN), and Time-Series Analysis (LSTM Networks) to accurately differentiate between live and dead algae by examining fluorescence decay patterns. The controller 172 is pre-programmed with fluorescence decay pattern data to accurately determine algae vitality. In one embodiment herein, the controller 172 uses Fuzzy Logic and Random Forest Classifier models to predict the need for algae removal by comparing detected algae parameters such as algae type, volume, water salinity, and vitality.
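    The removal-need prediction can be illustrated with a fuzzy-style scoring sketch; the membership ramps, weights, and 0.5 decision threshold are illustrative assumptions, not values from the disclosure:

```python
def removal_score(volume_m3, vitality, salinity_ppt, harmful):
    """Fuzzy-style removal score in [0, 1]; weights and ramps are illustrative.

    vitality: estimated live fraction in [0, 1].
    harmful: boolean from the image-based ecological classification.
    """
    def ramp(x, lo, hi):
        # Piecewise-linear membership: 0 at/below lo, 1 at/above hi.
        return min(1.0, max(0.0, (x - lo) / (hi - lo)))

    excessive = ramp(volume_m3, 0.5, 5.0)   # excessive-volume membership
    stress = ramp(salinity_ppt, 0.0, 35.0)  # salinity as an ecological stress proxy
    base = 0.5 * excessive + 0.2 * stress + 0.3 * vitality
    return min(1.0, base + (0.3 if harmful else 0.0))

def decide(score, threshold=0.5):
    return "remove" if score >= threshold else "retain"

s = removal_score(volume_m3=4.0, vitality=0.8, salinity_ppt=10.0, harmful=True)
```

A trained Random Forest classifier would replace this hand-tuned scoring with learned decision boundaries over the same four parameters.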

    [0099] In one embodiment herein, Table 4 depicts the pin-to-pin configuration of the controller 172, which can be, but is not limited to, an NVIDIA Jetson Orin Nano (8 GB).

    TABLE-US-00004 TABLE 4

    Pin ID | Controller pin (AF)            | Signal             | To device/port                      | Interface | Direction | Voltage | Notes (timing/termination)
    1      | MCU PA9 (AF7)                  | USART1_TX          | SBC UART RX (header pin 10)         | UART      | ->        | 3.3 V   | SBC link; 460.8-921.6 kbps
    2      | MCU PA10 (AF7)                 | USART1_RX          | SBC UART TX (header pin 8)          | UART      | <-        | 3.3 V   | Cross TX/RX; common GND
    3      | SBC Pin 6                      | GND                | MCU GND                             | —         | <->       | 0 V     | Single-point (star) ground
    4      | MCU PD8 (AF7)                  | USART3_TX/DI       | RS-485 transceiver DI -> Probe A/B  | RS-485    | ->        | 3.3 V   | MODBUS-RTU 115200 bps
    5      | MCU PD9 (AF7)                  | USART3_RX/RO       | RS-485 transceiver RO <- Probe A/B  | RS-485    | <-        | 3.3 V   | 120 Ω termination at far end
    6      | MCU PD2 (GPIO)                 | DE/RE              | RS-485 transceiver ctrl             | RS-485    | ->        | 3.3 V   | High = TX, Low = RX enable
    7      | MCU PD1 (AF9)                  | FDCAN1_TX          | CAN transceiver -> CAN bus          | CAN-FD    | ->        | 3.3 V   | Optional; 120 Ω end termination
    8      | MCU PD0 (AF9)                  | FDCAN1_RX          | CAN transceiver <- CAN bus          | CAN-FD    | <-        | 3.3 V   | Optional
    9      | MCU PC6 (AF2/TIM3_CH1)         | PWM                | Thruster-L ESC signal               | PWM       | ->        | 3.3 V   | 400 Hz; 1.0-2.0 ms
    10     | MCU PC7 (AF2/TIM3_CH2)         | PWM                | Thruster-R ESC signal               | PWM       | ->        | 3.3 V   | 400 Hz; 1.0-2.0 ms
    11     | MCU PA15 (AF1/TIM2_CH1)        | PWM                | Conveyor ESC signal                 | PWM       | ->        | 3.3 V   | 400 Hz; 1.0-2.0 ms
    12     | MCU PB0 (AF2/TIM3_CH3)         | PWM                | Pan servo                           | PWM       | ->        | 3.3 V   | 50 Hz; 1.0-2.0 ms
    13     | MCU PB1 (AF2/TIM3_CH4)         | PWM                | Tilt servo                          | PWM       | ->        | 3.3 V   | 50 Hz; 1.0-2.0 ms
    14     | MCU PB8 (AF4)                  | I2C1_SCL           | 9-axis IMU SCL                      | I2C       | <->       | 3.3 V   | 400 kHz; 4.7 k pull-up
    15     | MCU PB9 (AF4)                  | I2C1_SDA           | 9-axis IMU SDA                      | I2C       | <->       | 3.3 V   | 400 kHz; 4.7 k pull-up
    16     | MCU PA0 (ADC1_IN0)             | Batt-V sense       | Divider from 48 V                   | ADC       | <-        | 0-3.3 V | 100k/3.3k + 10 nF RC
    19     | MCU PA1 (ADC1_IN1)             | Batt current       | Hall sensor output                  | ADC       | <-        | 0-3.3 V | Isolated shunt module
    17     | MCU PC13 (GPIO, PU)            | E-Stop input       | NC loop (via opto)                  | Digital   | <-        | 3.3 V   | Fail-safe HIGH; INT on edge
    18     | MCU PC1 (GPIO)                 | Bin-full           | IR/reflective sensor                | Digital   | <-        | 3.3 V   | Debounce (SW)
    19     | MCU PB12 (GPIO)                | Conveyor limit/jam | Limit switch                        | Digital   | <-        | 3.3 V   | Interrupt, pull-up
    20     | SBC CSI-2 port (15/22-pin FFC) | MIPI CSI-2         | AI imaging unit                     | CSI-2     | <->       | —       | —
    21     | Field Pwr                      | +24 V/0 V          | Salinity probe                      | Power     | ->        | 24 V    | From 48->24 V DC-DC
    22     | SBC Pwr                        | +5 V/0 V           | SBC main rail                       | Power     | ->        | 5 V     | 24->5 V, 5 A
    23     | MCU Pwr                        | +3.3 V/0 V         | MCU & IMU                           | Power     | ->        | 3.3 V   | LDO from 5 V, star AGND

    [0100] In one embodiment herein, the primary power source 166A is securely housed within the hollow section of the floating body 130 to supply electrical power to the controller 172 and the one or more motors (134A, 134B, 134C, 134D). In one embodiment herein, the secondary power source 166B is positioned on the supporting frame 140 to supply the electrical power to the AI imaging unit 160, the water-salinity sensor 162, the optical fluorescence-sensing unit 164, and the navigation unit 168. In one embodiment herein, the navigation unit 168 is securely housed within the hollow section of the floating body 130 to navigate the autonomous water-cleaning vehicle 128. In one embodiment herein, Table 5 depicts specifications of the primary power source 166A and the secondary power source 166B.

    TABLE-US-00005 TABLE 5

    Parameter         | Primary power source (Wheels & Controller) | Secondary power source (AI imaging unit & LiDAR)
    Type              | Lithium-ion (Li-ion)                       | Lithium Polymer (Li-Po)
    Voltage           | 48 V                                       | 24 V
    Capacity          | 100 Ah (for 6-8 hours runtime)             | 50 Ah (continuous LiDAR and sensors)
    Energy Density    | High (>250 Wh/kg)                          | Very High (>300 Wh/kg)
    Protection        | BMS with thermal management                | BMS with overcharge protection
    Waterproof Rating | IP67                                       | IP65

    [0101] In one embodiment herein, the controller 172 includes the IoT-based communication modules 174 to enable real-time monitoring, control, and data transmission. The IoT-based communication modules 174 enable wireless communication between the user device 125 and the controller 172. In one embodiment herein, the IoT-based communication modules 174 enable wireless communication of the controller 172 with the AI imaging unit 160, the water-salinity sensor 162, the optical fluorescence-sensing unit 164, and the navigation unit 168.

    [0102] In one embodiment herein, Table 6 depicts specifications of the IoT-based communication modules 174.

    TABLE-US-00006 TABLE 6

    Requirement                           | Communication Module                 | Reason
    Short-range communication (10 m)      | Bluetooth (HC-05 / HM-10)            | Low power consumption, suitable for app-based direct control.
    Mid-range (100 m-500 m)               | Wi-Fi (ESP32 / ESP8266)              | Supports cloud connectivity, OTA updates, and real-time data streaming.
    Long-range communication (several km) | LoRa (RAK3172 / SX1276)              | Suitable for remote areas without internet, low power consumption.
    Cellular connectivity                 | NB-IoT (SIM7000) / 4G LTE (SIM7600)  | Works in areas without Wi-Fi, reliable for continuous data transmission.

    [0103] In one embodiment herein, Table 7 depicts the communication interfaces of the IoT-based communication modules 174 that enable communication of the controller 172 with the AI imaging unit 160.

    TABLE-US-00007 TABLE 7

    Communication Module                               | Type               | Data Transfer Rate | Range                                  | Latency         | Power Consumption | Key Advantages
    Wi-Fi (802.11ac/ax, Wi-Fi 6)                       | Wireless           | Up to 9.6 Gbps     | 30-100 m                               | Low to Moderate | Moderate          | High-speed, supports video streaming & AI processing
    Bluetooth 5.0/5.2                                  | Wireless           | Up to 2 Mbps       | 10-100 m                               | Low             | Low               | Energy-efficient, suitable for low-bandwidth AI tasks
    Zigbee (IEEE 802.15.4)                             | Wireless           | 250 Kbps           | 10-100 m                               | Low             | Very Low          | Low-power consumption, ideal for IoT applications
    LoRa (Long Range)                                  | Wireless           | 0.3-50 Kbps        | 2-10 km                                | High            | Very Low          | Suitable for long-range communication in remote water bodies
    5G (NR mmWave & Sub-6 GHz)                         | Wireless           | Up to 10 Gbps      | Several km (depends on infrastructure) | Very Low        | Moderate          | Best for real-time AI-based water analysis & cloud integration
    Li-Fi (Light Fidelity)                             | Wireless (Optical) | Up to 1 Gbps       | 10 m (Line of Sight)                   | Very Low        | Low               | Ultra-fast communication using light signals, immune to RF interference
    RF Transceiver (433 MHz/915 MHz)                   | Wireless           | 1-100 Kbps         | 100 m-1 km                             | Low             | Low               | Reliable, operates in bands not congested by Wi-Fi
    Ethernet (LAN, RJ45)                               | Wired              | Up to 1 Gbps       | N/A (Physical cable)                   | Very Low        | Moderate          | Reliable, stable, ideal for fixed monitoring stations
    CAN Bus (Controller Area Network)                  | Wired              | 1 Mbps             | Up to 40 m                             | Low             | Low               | Efficient, used in industrial automation
    I2C (Inter-Integrated Circuit)                     | Wired              | Up to 3.4 Mbps     | Short-range (1 m)                      | Very Low        | Low               | Suitable for on-board AI imaging unit-controller communication
    UART (Universal Asynchronous Receiver-Transmitter) | Wired              | 115 Kbps-1 Mbps    | Short-range (1-2 m)                    | Low             | Low               | Simple, widely used in embedded systems
    SPI (Serial Peripheral Interface)                  | Wired              | Up to 10 Mbps      | Short-range (PCB level)                | Very Low        | Low               | High-speed short-distance communication

    [0104] In one embodiment herein, Table 8 depicts the communication interfaces of the IoT-based communication modules 174 that enable communication of the controller 172 with the optical fluorescence-sensing unit 164.

    TABLE-US-00008 TABLE 8

    Communication Module                 | Type               | Data Transfer Rate | Range                | Latency         | Power Consumption | Key Advantages
    Ethernet (LAN, RJ45 - Gigabit)       | Wired              | Up to 1 Gbps       | N/A (Physical cable) | Very Low        | Moderate          | High-speed, reliable, ideal for real-time data processing
    Fiber Optic (Optical Communication)  | Wired              | Up to 100 Gbps     | Long-range (10+ km)  | Very Low        | Low               | Ultra-fast, low interference, ideal for high-precision LiDAR data
    CAN Bus (Controller Area Network)    | Wired              | 1 Mbps             | Up to 40 m           | Low             | Low               | Reliable for industrial automation, suitable for short-range LiDAR control
    RS-485/RS-232 (Serial Communication) | Wired              | Up to 10 Mbps      | 1.2 km (RS-485)      | Low             | Low               | Simple and effective for industrial applications
    Wi-Fi (802.11ac/ax, Wi-Fi 6)         | Wireless           | Up to 9.6 Gbps     | 30-100 m             | Low to Moderate | Moderate          | High-speed, good for wireless LiDAR applications
    5G (NR mmWave & Sub-6 GHz)           | Wireless           | Up to 10 Gbps      | Several km           | Very Low        | Moderate          | Best for real-time LiDAR data transmission & cloud integration
    LoRa (Long Range)                    | Wireless           | 0.3-50 Kbps        | 2-10 km              | High            | Very Low          | Suitable for remote environmental monitoring applications
    Zigbee (IEEE 802.15.4)               | Wireless           | 250 Kbps           | 10-100 m             | Low             | Very Low          | Low-power, suitable for networked sensor communication
    Li-Fi (Light Fidelity)               | Wireless (Optical) | Up to 1 Gbps       | 10 m (Line of Sight) | Very Low        | Low               | Ultra-fast communication using light, immune to RF interference
    SPI (Serial Peripheral Interface)    | Wired              | Up to 10 Mbps      | Short-range (PCB level) | Very Low     | Low               | High-speed, suitable for embedded LiDAR processing
    I2C (Inter-Integrated Circuit)       | Wired              | Up to 3.4 Mbps     | Short-range (1 m)    | Very Low        | Low               | Efficient for internal sensor-controller communication

    [0105] In one embodiment herein, Table 10 depicts the communication interface between the controller 172 and the one or more motors (134A, 134B, 134C, 134D).

    TABLE-US-00009 TABLE 9

    Motor Type           | Recommended Module                        | Protocol Used                       | Reason
    DC Propelling Motors | Motor Driver (L298N / BTS7960 / VNH2SP30) | PWM (Pulse Width Modulation) + UART | Controls motor speed and direction.

    [0106] In one embodiment herein, Table 10 depicts the communication interface between the controller 172 and the navigation unit 168.

    TABLE-US-00010 TABLE 10

    Requirement                 | Recommended Module         | Protocol Used    | Reason
    GPS Tracking                | NEO-6M / SIM808 GPS Module | UART / I2C / SPI | Provides real-time location tracking for navigation.
    Data Transfer to Controller | ESP32 / Raspberry Pi       | UART / I2C       | Sends GPS coordinates for path optimization.

    [0107] FIG. 3 refers to a schematic view of the autonomous water-cleaning vehicle 128 while predicting and removing recyclable waste 14 in water bodies. In one embodiment herein, the autonomous water-cleaning vehicle 128 is designed to autonomously detect, navigate toward, and collect recyclable waste 14 while avoiding obstacles in the water. The AI-based autonomous system 100 operates through a sequence of controlled steps to ensure efficient waste removal. In one embodiment herein, the operation of the autonomous water-cleaning vehicle 128 begins with the user linking the vehicle to the user device 125 via the network 120.

    [0108] Once connected, the user places the autonomous water-cleaning vehicle 128 on the water surface requiring cleaning. In this position, the front end of the inclined conveyor belt 152 is submerged to facilitate waste collection, while the propelling wheels (132A, 132B, 132C, 132D) remain partially immersed to enable movement. The user then activates the autonomous water-cleaning vehicle 128 by pressing the start button on the user interface 126 of the user device 125, initiating the controller 172, which powers the propelling wheel motors (134A, 134B, 134C, 134D) and the driving unit 142.

    [0109] In one embodiment herein, once activated, the autonomous water-cleaning vehicle 128 moves across the water surface, driven by the rotation of the plurality of propelling wheels (132A, 132B, 132C, 132D). Simultaneously, the driving unit 142 rotates, which in turn rotates the first roller 148. This rotation drives the inclined conveyor belt 152, while the second roller 150 rotates passively to support the belt's movement. The inclined conveyor belt 152 continuously collects recyclable waste 14 from the water surface and transfers it toward the container 158.

    [0110] In one embodiment herein, the autonomous water-cleaning vehicle 128 employs an AI-powered waste direction monitoring system. The AI imaging unit 160 scans the water surface in all directions using its 360-degree rotating capability. As it monitors the surroundings, the AI imaging unit 160 detects recyclable waste 14 and compares the captured image data with pre-programmed waste patterns stored in its system. If the detected waste matches the pre-defined data, the AI imaging unit 160 transmits this information to the controller 172 for processing.

    [0111] In one embodiment herein, upon receiving data from the AI imaging unit 160, the controller 172 determines the most efficient movement path for recyclable waste 14 collection. For instance, if the AI imaging unit 160 detects recyclable waste 14 on the right side of the autonomous water-cleaning vehicle 128, it sends a signal to the controller 172. In response, the controller 172 adjusts the speed of the plurality of propelling wheels (132A, 132B, 132C, 132D) by accelerating the second wheel 132B and the fourth wheel 132D while decelerating the first wheel 132A and the third wheel 132C, thereby causing the autonomous water-cleaning vehicle 128 to steer toward the right to collect the recyclable waste 14 effectively. The AI-based autonomous system 100 ensures that the autonomous water-cleaning vehicle 128 automatically adapts its direction based on the distribution of recyclable waste 14 in the water. Additionally, if the AI imaging unit 160 detects obstacles, such as a wall surface, the controller 172 processes this information and adjusts the vehicle's movement accordingly. The controller 172 directs the vehicle to navigate away from the obstacle, preventing potential collisions and ensuring operational longevity and functionality by modifying its direction and adjusting the propelling speed.
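    The differential steering described above can be sketched as follows, assuming the wheel pair opposite the detected waste (132B/132D for a rightward target, per this paragraph) is accelerated while the near-side pair (132A/132C) is decelerated; the gain and saturation values are illustrative:

```python
def differential_speeds(base_rpm, bearing_deg, gain=0.5):
    """Per-pair wheel speeds to steer toward a detected target (sketch).

    bearing_deg: bearing of the waste relative to the heading, positive = right.
    A positive bearing speeds up the pair opposite the target and slows the
    near-side pair, yawing the vehicle toward the waste.
    """
    # Saturate the normalized bearing so extreme detections do not over-command.
    delta = gain * base_rpm * max(-1.0, min(1.0, bearing_deg / 90.0))
    far_pair = base_rpm + delta    # e.g., wheels 132B/132D for a rightward target
    near_pair = base_rpm - delta   # e.g., wheels 132A/132C
    return far_pair, near_pair

far, near = differential_speeds(base_rpm=120.0, bearing_deg=45.0)
# far > near, so the vehicle turns toward the right-side target
```

Obstacle avoidance can reuse the same primitive with the bearing sign inverted, steering away from the detected obstacle rather than toward it.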

    [0112] In one embodiment herein, the AI-based autonomous system 100 provides an advanced and autonomous solution for predicting, detecting, and removing recyclable waste 14 from water bodies. By integrating AI-powered navigation, automated waste collection, and real-time obstacle avoidance, the autonomous water-cleaning vehicle 128 enhances the efficiency of water body maintenance while minimizing manual intervention.

    [0113] FIG. 4 refers to a schematic view of the autonomous water-cleaning vehicle 128 while predicting and removing organic waste 12 in water bodies. The autonomous water-cleaning vehicle 128 is designed to detect, classify, and determine the appropriate removal of organic waste 12 using advanced AI-powered image processing, fluorescence LiDAR, and salinity-based analysis.

    [0114] In one embodiment herein, the AI imaging unit 160, positioned above the container 158, captures images of organic waste 12 present on the water surface. The AI imaging unit 160 compares the captured images with a dataset of images representing green algae (Chlorophyta), diatoms (Bacillariophyta), red algae (Rhodophyta), blue-green algae (Cyanobacteria), dinoflagellates (Dinophyta), and golden algae (Chrysophyta) to classify the organic waste 12 into ecologically beneficial and ecologically harmful types, enabling precise identification and classification. Once classified, the AI imaging unit 160 transmits the organic waste type data to the controller 172 for further analysis. Upon receiving the data, the controller 172 activates the optical fluorescence-sensing unit 164 and the water-salinity sensor 162 to assess organic waste removal feasibility.

    [0115] In one embodiment herein, the AI-based autonomous system 100 employs a fluorescence LiDAR-based vitality detection mechanism. The laser source 164A of the optical fluorescence-sensing unit 164 emits a laser beam onto the detected organic waste 12 to excite its pigments. In response, the organic waste 12 emits fluorescence signals, which are captured by the fluorescence detector 164C. The fluorescence intensity and decay pattern data are transferred to the controller 172 for analysis. The controller 172 utilizes a machine learning model, comprising a support vector machine (SVM), k-means clustering, DBSCAN clustering, or long short-term memory (LSTM) networks, to analyze the fluorescence decay patterns and differentiate between live and dead organic waste. The AI-based autonomous system 100 determines that organic waste 12 is alive if the fluorescence decay pattern exhibits longer decay times and higher intensity, whereas shorter decay times and lower intensity indicate dead organic waste.
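    The live/dead discrimination described above can be sketched as a simple lookup against the per-type decay-time bands of Table 14. This is a minimal illustrative example only, not the patented implementation: the function name, dictionary keys, and the use of a crisp band test (rather than a trained SVM or LSTM) are assumptions for clarity.

```python
# Per-type live-algae decay-time bands in nanoseconds, transcribed from
# Table 14; dictionary keys are illustrative shorthand, not claim terms.
LIVE_DECAY_NS = {
    "green":          (650, 800),   # Green Algae (Chlorophyta)
    "diatom":         (700, 850),   # Diatoms (Bacillariophyta)
    "red":            (800, 950),   # Red Algae (Rhodophyta)
    "blue_green":     (850, 1000),  # Blue-Green Algae (Cyanobacteria)
    "dinoflagellate": (900, 1100),  # Dinoflagellates
    "golden":         (700, 900),   # Golden Algae (Chrysophyta)
}

def classify_vitality(algae_type: str, decay_ns: float) -> str:
    """Label a fluorescence return 'alive' if its measured decay time
    falls within the live band for that algae type, else 'dead'."""
    lo, hi = LIVE_DECAY_NS[algae_type]
    return "alive" if lo <= decay_ns <= hi else "dead"
```

    In practice the disclosure pairs these decay features with intensity and temporal-attenuation profiles and feeds them to a trained classifier; the band test above only illustrates the decision boundary that such a classifier would learn.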

    [0116] In one embodiment herein, the controller 172 computes the volume of the organic waste 12 by processing fluorescence intensity data along with the time delay between the emitted laser pulse and the detected fluorescence signal, which is provided by the timing and ranging module 164E. The controller 172 employs point-cloud reconstruction techniques, which can include, but are not limited to, regression analysis, point cloud processing, and Kalman filter models, to estimate the total organic waste volume. Additionally, the controller 172 integrates data from the water-salinity sensor 162 and compares the detected organic waste volume with water salinity levels to assess whether the organic waste concentration exceeds salinity-dependent ecological threshold values stored in the non-transitory memory 176.
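    A simplified stand-in for the point-cloud volumetric estimation can help illustrate the idea: ranging returns are binned into a horizontal grid, and per-cell mean patch thickness times cell area is summed into a total volume. The function name, the `(x, y, thickness)` return format, and the default cell size are illustrative assumptions, not the disclosed algorithm.

```python
from collections import defaultdict

def estimate_volume(points, cell=0.1):
    """Grid-based volumetric estimate (illustrative).

    points: iterable of (x_m, y_m, thickness_m) tuples, one per
    fluorescence/ranging return. Returns are binned into cell x cell (m)
    columns; each column contributes cell_area * mean_thickness, and the
    column volumes are summed into a total biomass volume in m^3.
    """
    cells = defaultdict(list)
    for x, y, thickness in points:
        cells[(int(x // cell), int(y // cell))].append(thickness)
    area = cell * cell
    return sum(area * (sum(ts) / len(ts)) for ts in cells.values())
```

    A real implementation would additionally weight returns by fluorescence intensity and smooth the per-cell estimates (e.g., with a Kalman filter, as the paragraph above contemplates) before comparing the result against the salinity-dependent thresholds.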

    [0117] In one embodiment herein, the controller 172 determines the appropriate action for organic waste removal based on multiple parameters, including organic waste type, volume, vitality, and water salinity levels. The decision-making process is governed by a decision model comprising fuzzy logic and random forest classifier models. If the organic waste 12 is beneficial, alive, and the detected volume is below the salinity-dependent threshold, the controller 172 refrains from removal and commands the navigation unit 168 to redirect the autonomous water-cleaning vehicle 128 away from the organic waste region by adjusting the propelling wheels. However, if the detected organic waste volume exceeds the threshold, or if the organic waste is identified as harmful or dead, the controller 172 initiates the removal process by activating the conveyor unit 151.
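    The removal policy of paragraph [0117] reduces, in its crisp form, to a short rule: retain only algae that is beneficial, alive, and under the salinity-dependent threshold. The sketch below is a deliberately simplified, illustrative version of that policy (the disclosed decision model uses fuzzy logic or random forest classifiers rather than hard rules).

```python
def removal_decision(algae_class: str, alive: bool,
                     volume: float, threshold: float) -> str:
    """Crisp restatement of the decision policy in [0117]:
    retain only beneficial, live algae whose estimated volume does not
    exceed the salinity-dependent ecological threshold; otherwise remove.
    """
    if algae_class == "beneficial" and alive and volume <= threshold:
        return "retain"   # navigate away; conveyor stays inactive
    return "remove"       # activate conveyor unit
```

    A fuzzy-logic implementation would replace the hard comparisons with membership functions (e.g., "volume slightly above threshold") and defuzzify to a removal score, which better matches the graded ecological risk described in Table 15.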

    [0118] In one embodiment herein, the AI-based autonomous system 100 provides an intelligent, data-driven approach for detecting, analyzing, and selectively removing harmful organic waste 12 from water bodies. By integrating AI-powered image recognition, fluorescence LiDAR-based vitality detection, salinity-based volume estimation, and a machine-learning-based decision model, the autonomous water-cleaning vehicle 128 ensures efficient and environmentally responsible organic waste management.

    [0119] Table 11 depicts dimensions of the components of the autonomous water-cleaning vehicle 128.

    TABLE 11
    Component                  Dimensions (L x W x H, cm)   Description
    Overall dimensions         80 x 30 x 25                 Total size of the device, ensuring stability and optimal waste collection.
    Floating body              40 x 25 x 10                 Rectangular, hollow structure with curved corners for stability and movement on water.
    Propelling wheels          Diameter: 5                  Waterproof, high-torque wheels for smooth navigation.
    Waste collection conveyor  100 x 40 x 20                Inclined conveyor for efficient floating waste collection.
    First roller               Length: 30, diameter: 4      Supports inclined conveyor belt movement.
    Second roller              Length: 30, diameter: 4      Guides belt for waste transport.
    Inclined conveyor belt     70 x 30                      Perforated design for water drainage during waste collection.
    Container                  25 x 20 x 20                 Stores collected floating waste.
    Supporting frame           15 x 15 x 12                 Provides structural support for the AI imaging unit and fluorescence LiDAR.

    [0120] In one embodiment herein, Table 12 depicts training data of the AI imaging unit 160.

    TABLE 12
    Category                 Data to Train                           Purpose
    Floating Plastic Waste   Plastic bottles, bags, wrappers         Detect and direct device toward plastic waste.
    Floating Paper Waste     Cardboard, newspapers, cups             Identify and direct toward paper-based waste.
    Organic Floating Waste   Leaves, wood debris, algae patches      Recognize organic waste presence.
    Water Surface            Clean water, ripples, disturbed water   Distinguish waste from normal water patterns.
    Obstacle Identification  Walls, rocks, buoys, boats              Detect and avoid collisions.
    Floating Animals         Fish, birds, amphibians                 Prevent harm to aquatic life.

    [0121] In one embodiment herein, Table 13 depicts data related to the algae 12.

    TABLE 13
    Useful Algae:
    Green Algae (Chlorophyta), e.g., Chlorella, Spirogyra: bright green; filamentous; smooth, uniform appearance. Ecological impact: oxygen production; supports aquatic food chain.
    Diatoms (Bacillariophyta), e.g., Navicula, Cyclotella: brownish-green; intricate silica-based patterns. Ecological impact: primary producers; water quality indicators.
    Red Algae (Rhodophyta), e.g., Polysiphonia: deep red to purple; branching structures. Ecological impact: habitat stabilization; ecosystem enhancement.
    Harmful Algae:
    Blue-Green Algae (Cyanobacteria), e.g., Microcystis, Anabaena: blue-green; scum-like; sometimes mucilaginous. Ecological impact: toxin production; causes harmful algal blooms.
    Dinoflagellates (Dinophyta), e.g., Alexandrium: irregular shapes; may exhibit bioluminescence; red tides. Ecological impact: disrupt marine life; produce toxins.
    Golden Algae (Chrysophyta), e.g., Prymnesium parvum: golden-yellow; chain-forming; dense bloom formations. Ecological impact: produces fish-killing toxins; detrimental effects.

    [0122] In one embodiment herein, Table 14 depicts fluorescence decay patterns, intensity and decay time data (stored in database 124).

    TABLE 14
    Green Algae (Chlorophyta), alive: high intensity; decay time 650-800 ns; strong peak intensity, gradual decay curve. Interpretation: active photosynthesis; healthy.
    Green Algae (Chlorophyta), dead: low or no intensity; decay time <400 ns; weak or absent peaks, rapid decay. Interpretation: chlorophyll degradation; inactive.
    Diatoms (Bacillariophyta), alive: moderate to high intensity; decay time 700-850 ns; sustained fluorescence, steady decline. Interpretation: functioning primary producer.
    Diatoms (Bacillariophyta), dead: low intensity; decay time <450 ns; minimal intensity, fast decay. Interpretation: photosynthetic inactivity.
    Red Algae (Rhodophyta), alive: high intensity; decay time 800-950 ns; prolonged fluorescence decay, consistent signal. Interpretation: ecosystem-stable; healthy.
    Red Algae (Rhodophyta), dead: low or no intensity; decay time <500 ns; rapid decay, no sustained fluorescence. Interpretation: inactive; loss of pigmentation.
    Blue-Green Algae (Cyanobacteria), alive: very high intensity; decay time 850-1000 ns; intense peak, slow decay (due to gas vesicles). Interpretation: active toxin producers (harmful).
    Blue-Green Algae (Cyanobacteria), dead: very low intensity; decay time <400 ns; weak or no signal, rapid decay. Interpretation: non-toxic; non-photosynthetic.
    Dinoflagellates, alive: high intensity; decay time 900-1100 ns; long decay time, high-intensity bursts (bioluminescence in some cases). Interpretation: active toxin production (harmful).
    Dinoflagellates, dead: minimal intensity; decay time <500 ns; abrupt decay, low baseline intensity. Interpretation: inactive; no toxin production.
    Golden Algae (Chrysophyta), alive: moderate intensity; decay time 700-900 ns; moderate peaks, steady fluorescence drop. Interpretation: active; potential toxin production.
    Golden Algae (Chrysophyta), dead: low intensity; decay time <450 ns; quick decay, diminished fluorescence. Interpretation: inactive; no ecological threat.

    [0123] In one embodiment herein, Table 15 depicts algae volume and water salinity threshold.

    TABLE 15
    <0.5 ppt, Freshwater (Rivers, Lakes): typical algae volume threshold 10-30 g/m^3. Within threshold: balanced oxygen levels; supports aquatic biodiversity. If exceeded: high risk of eutrophication and oxygen depletion. Recommended action: remove excess algae; monitor water quality.
    0.5-5 ppt, Brackish (Estuaries): threshold 30-50 g/m^3. Within threshold: supports diverse ecosystems; moderate algae presence beneficial. If exceeded: moderate risk of harmful algal blooms (HABs). Recommended action: remove algae if near upper threshold; monitor vitality.
    5-18 ppt, Slightly Saline: threshold 50-100 g/m^3. Within threshold: algae supports aquatic life; oxygen production optimal. If exceeded: potential disruption in food chains; harmful blooms. Recommended action: remove if the volume exceeds 80 g/m^3; check algae type.
    18-30 ppt, Moderately Saline: threshold 100-200 g/m^3. Within threshold: stable ecosystems with robust algae populations. If exceeded: high risk of toxic blooms (e.g., cyanobacteria). Recommended action: remove algae exceeding 150 g/m^3; assess toxicity.
    30-35 ppt, Marine (Ocean, Seas): threshold 200-500 g/m^3. Within threshold: algae essential for marine productivity; supports fisheries. If exceeded: significant risk of red tides and marine life toxicity. Recommended action: immediate removal if >400 g/m^3; activate emergency protocols.
    >35 ppt, Hypersaline (Salt Lakes): threshold 500+ g/m^3. Within threshold: specialized algae thrive; balanced micro-ecosystems. If exceeded: potential for dense harmful blooms; oxygen depletion. Recommended action: remove algae exceeding 600 g/m^3; intensive monitoring required.

    [0124] In one embodiment herein, Table 16 depicts wheel rotations for direction control.

    TABLE 16
    Floating Waste Position   Left-Side Wheels (Motor Speed, RPM)   Right-Side Wheels (Motor Speed, RPM)   Device Movement Direction
    Directly Ahead            50 (Forward)                          50 (Forward)                           Moves Straight
    Slightly Right            50 (Forward)                          30 (Forward)                           Turns Slightly Right
    Far Right                 70 (Forward)                          0 (Stopped) / 30 (Reverse)             Turns Sharply Right
    Slightly Left             30 (Forward)                          50 (Forward)                           Turns Slightly Left
    Far Left                  0 (Stopped) / 30 (Reverse)            70 (Forward)                           Turns Sharply Left
    Obstacle Detected         40 (Reverse)                          40 (Reverse)                           Moves Backward
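    The differential-drive behavior of Table 16 amounts to a direct table lookup from detected waste position to a (left, right) wheel speed pair. The sketch below is illustrative only: the position labels, the tuple encoding, and the sign convention (negative RPM for reverse) are assumptions, and the "far" turns use the reverse-rotation variant that Table 16 lists as an alternative to a stopped wheel.

```python
# Mapping of detected floating-waste position to (left_rpm, right_rpm),
# transcribed from Table 16. Negative values denote reverse rotation.
WHEEL_SPEEDS = {
    "ahead":        (50, 50),     # moves straight
    "slight_right": (50, 30),     # turns slightly right
    "far_right":    (70, -30),    # sharp right (or right wheels stopped)
    "slight_left":  (30, 50),     # turns slightly left
    "far_left":     (-30, 70),    # sharp left (or left wheels stopped)
    "obstacle":     (-40, -40),   # backs away from obstacle
}

def wheel_command(waste_position: str) -> tuple:
    """Return the (left_rpm, right_rpm) command for a detected position."""
    return WHEEL_SPEEDS[waste_position]
```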

    [0125] In one embodiment herein, Table 17 depicts multi-environment performance data of the AI-based autonomous system 100.

    TABLE 17
    Exp 1, Pond-A (fresh): salinity 0.3 ppt; 20 klux, calm; 156 patches; V_thr 25 g/m^3; harmful precision/recall 0.93/0.92; useful false-removal 0.06; volume RMSE/bias 10.1%/+2.8%; vitality F1 0.92; LiDAR noon valid 100% of dawn; DO after harvest +0.2 mg/L; energy 255 Wh/h.
    Exp 2, Reservoir-B (fresh): salinity 0.4 ppt; 85 klux, light chop; 156 patches; V_thr 25 g/m^3; harmful precision/recall 0.91/0.90; useful false-removal 0.08; volume RMSE/bias 11.0%/+3.6%; vitality F1 0.90; LiDAR noon valid 91% of dawn; DO after harvest +0.2 mg/L; energy 270 Wh/h.
    Exp 3, Creek-C (brackish, flood): salinity 1.2 ppt; 60 klux, calm; 140 patches; V_thr 40 g/m^3; harmful precision/recall 0.90/0.89; useful false-removal 0.08; volume RMSE/bias 11.3%/+2.5%; vitality F1 0.90; LiDAR noon valid 93% of dawn; DO after harvest +0.3 mg/L; energy 275 Wh/h.
    Exp 4, Estuary-D (brackish, ebb): salinity 3.2 ppt; 70 klux, breeze; 147 patches; V_thr 40 g/m^3; harmful precision/recall 0.89/0.88; useful false-removal 0.09; volume RMSE/bias 11.8%/+2.1%; vitality F1 0.90; LiDAR noon valid 88% of dawn; DO after harvest +0.3 mg/L; energy 285 Wh/h.
    Exp 5, Marina-E (marine, calm): salinity 32.1 ppt; 95 klux, calm; 122 patches; V_thr 350 g/m^3; harmful precision/recall 0.94/0.91; useful false-removal 0.07; volume RMSE/bias 11.0%/+1.9%; vitality F1 0.93; LiDAR noon valid 90% of dawn; DO after harvest +0.4 mg/L; energy 298 Wh/h.
    Exp 6, Harbor-F (marine, chop): salinity 33.5 ppt; 100 klux, chop; 123 patches; V_thr 350 g/m^3; harmful precision/recall 0.92/0.89; useful false-removal 0.09; volume RMSE/bias 11.6%/+2.0%; vitality F1 0.91; LiDAR noon valid 86% of dawn; DO after harvest +0.3 mg/L; energy 310 Wh/h.
    Exp 7, Urban Canal-G (fresh, turbid): salinity 0.7 ppt; 95 klux, turbid; 132 patches; V_thr 25 g/m^3; harmful precision/recall 0.90/0.87; useful false-removal 0.10; volume RMSE/bias 12.3%/+4.2%; vitality F1 0.89; LiDAR noon valid 84% of dawn; DO after harvest +0.2 mg/L; energy 290 Wh/h.

    [0126] Here, Salinity (ppt) refers to the salt level present in the water. Conditions (klux/sea state) refers to the sunlight intensity measured in thousands of lux and the surface-water state and chop level. Patches (n) refers to the count of distinct surface clusters evaluated by the AI-based autonomous system. V_thr (g/m^3) refers to the salinity-adapted biomass threshold used for determining whether algae should be harvested.

    [0127] Here, Harmful Precision/Recall refers to the accuracy metrics for harmful-algae removal. The precision represents the proportion of correctly removed harmful algae, and the recall represents the proportion of total harmful algae successfully removed. Useful false-removal represents the fraction of beneficial and live algae that are incorrectly removed.

    [0128] Here, Volume RMSE/Bias represents the error percentage of the fluorescence-LiDAR biomass estimate when compared to laboratory measurements, including the average over-estimation and under-estimation; an RMSE of approximately 12% or lower and a near-zero bias indicate an accurate and unbiased volume measurement.

    [0129] Here, Vitality F1 represents the classifier performance score for distinguishing between live and dead algae, computed as the harmonic mean of vitality-classification precision and recall. LiDAR noon valid (% of dawn) represents the percentage of valid LiDAR returns during mid-day operation relative to early-morning baseline conditions, where values above approximately 85-90% indicate effective glare resistance.

    [0130] Here, DO after harvest (mg/L) refers to the change in dissolved oxygen concentration near the water surface following a harvest pass of the autonomous water-cleaning vehicle.

    [0131] In one embodiment herein, the AI-based autonomous system 100 is considered as an Aqua-Sense Harvester and is evaluated across seven different field sites, including, but not limited to, freshwater ponds, reservoirs, urban canals, brackish estuaries, tidal creeks, and marine environments, with salinity ranging from 0.3 to 33.5 parts per thousand (ppt). A total of 976 surface patches are analyzed under varying illumination from 20 to 100 kilolux and surface conditions ranging from calm to choppy.

    [0132] In one embodiment herein, the harmful-algae removal capability demonstrated a mean precision of approximately 0.91 and a mean recall of approximately 0.90 across the listed locations. The AI-based autonomous system 100 exhibited low false-removal rates of beneficial and live algae, averaging approximately 0.08 across all water bodies, thereby confirming its selective harvesting capability.

    [0133] In one embodiment herein, the fluorescence-LiDAR-based algae-volume estimation produced a root-mean-square error of approximately 11.3% with an average bias of approximately +2.7% when compared with laboratory-validated biomass measurements. This demonstrates the high accuracy and low drift characteristics of the LiDAR-based estimation method implemented in the optical fluorescence-sensing unit.

    [0134] In one embodiment herein, the vitality-detection subsystem achieved an average F1-score of approximately 0.91 for distinguishing between live and dead algae, indicating robust classification performance across freshwater, brackish, and marine salinity ranges.

    [0135] In one embodiment herein, the optical fluorescence-sensing unit 164 demonstrated strong resistance to mid-day glare and wave-induced signal degradation. The mid-day LiDAR validity remained at approximately 90% of the dawn baseline for most field sites, with the lowest value (84%) observed under turbid and choppy conditions in the urban canal test. This performance confirms the suitability of the AI-based autonomous system for daytime algae-removal operations.

    [0136] In one embodiment herein, the ecological impact assessment indicated a positive increase in dissolved oxygen levels following algae-harvest passes, with an average ΔDO (change in dissolved oxygen) of +0.27 mg/L across test sites and a maximum of +0.4 mg/L observed in marine environments.

    [0137] This demonstrates the AI-based autonomous system's capability to mitigate hypoxia and improve near-surface water quality.

    [0138] In one embodiment herein, the energy consumption of the autonomous water-cleaning vehicle averaged approximately 282 watt-hours per hour (Wh/h), with observed values ranging between 255 Wh/h and 310 Wh/h. Marine and choppy environments exhibited slightly higher energy consumption due to increased hydrodynamic resistance and navigation corrections.

    [0139] In one embodiment herein, the experimental results confirm that the Aqua-Sense Harvester consistently removes harmful algae with high precision and recall, avoids unnecessary removal of beneficial biomass, maintains accurate biomass estimation through fluorescence-LiDAR analysis, operates robustly under varying illumination and water-surface conditions, and improves dissolved-oxygen levels in treated areas. These findings collectively demonstrate significant technical effects and performance advantages over conventional algae-removal systems.
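    The aggregate figures quoted in paragraphs [0131] through [0138] can be cross-checked against the per-site values of Table 17. The short script below (illustrative only; values transcribed from Table 17) recomputes the means.

```python
# Per-site metrics transcribed from Table 17, in experiment order 1-7.
precision = [0.93, 0.91, 0.90, 0.89, 0.94, 0.92, 0.90]
recall    = [0.92, 0.90, 0.89, 0.88, 0.91, 0.89, 0.87]
false_rm  = [0.06, 0.08, 0.08, 0.09, 0.07, 0.09, 0.10]
rmse_pct  = [10.1, 11.0, 11.3, 11.8, 11.0, 11.6, 12.3]

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean(precision), 2))  # 0.91 (matches [0132])
print(round(mean(recall), 2))     # 0.89 (quoted as approximately 0.90)
print(round(mean(false_rm), 2))   # 0.08 (matches [0132])
print(round(mean(rmse_pct), 1))   # 11.3 (matches [0133])
```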

    [0140] In one embodiment herein, Table 18 depicts advantages of the AI-based autonomous system 100.

    TABLE 18
    Selectivity. Existing: indiscriminate skimming of all biomass. Aqua-Sense: removal policy based on type + volume + vitality + salinity. Benefit: wrong (useful) removals reduced by 20-40 pp.
    Harmful-algae removal. Existing: manual/heuristic targeting rules. Aqua-Sense: fuzzy/ensemble decision. Benefit: correct harmful removals +10-20 pp.
    Vitality detection. Existing: none (color/reflectance only). Aqua-Sense: time-resolved fluorescence (decay tau, A1/A2). Benefit: prioritizes dead biomass; odor/scum reduction; DO gain +0.3-0.4 mg/L post-harvest.
    Volume estimation. Existing: visual area heuristics; coarse. Aqua-Sense: fluorescence LiDAR intensity-range model. Benefit: volume RMSE ~12%, bias <5%.
    Sun/glitter robustness. Existing: camera glare; mid-day dropouts. Aqua-Sense: band-pass 650-750 nm + temporal gating. Benefit: mid-day valid returns 85-90% of dawn.
    Salinity adaptation. Existing: fixed thresholds. Aqua-Sense: V_thr = f(ppt) per water type. Benefit: decision correctness 88% across sites.
    Multi-sensor fusion. Existing: single sensor (camera or EC). Aqua-Sense: AI camera + LiDAR + EC salinity. Benefit: harmful P/R 0.90/0.90; useful false-removal 0.10.
    Path planning. Existing: straight line; always harvest. Aqua-Sense: avoid-route around live/beneficial patches. Benefit: small path cost (deviation 0.7 m); ecosystem preserved.
    Energy per hectare. Existing: wasted collection passes. Aqua-Sense: targeted harvest only when warranted. Benefit: energy reduced ~10% vs baseline with better outcomes.
    Calibration & error budget. Existing: not specified. Aqua-Sense: LiDAR timing 10 ns; PSS-78 EC-to-ppt conversion; camera HDR. Benefit: repeatable, auditable performance.
    Data & audit. Existing: sparse logs. Aqua-Sense: CSV logs of ppt, estimated volume, decay features, class, decision. Benefit: FER (Federal Regulation)/inspection-ready traceability.
    Maintenance (salinity). Existing: contacting EC cell fouls. Aqua-Sense: toroidal/inductive EC probe. Benefit: lower drift (0.03 ppt/week); less cleaning.
    Safety & failsafes. Existing: manual stop only. Aqua-Sense: e-stop, bin-full and jam interlocks. Benefit: fewer incidents; graceful degradation (default-safe avoid).
    Marine readiness. Existing: mixed IP ratings. Aqua-Sense: IP67 LiDAR; IP68 probe; IP65-67 enclosures. Benefit: higher uptime in spray/chop.
    Upgradability. Existing: fixed. Aqua-Sense: GbE, RS-485, CAN-FD; OTA updates. Benefit: easy sensor/model swaps.

    [0141] In one embodiment herein, Table 19 depicts the components of the AI-based autonomous system 100, the suggested materials for each component, and the reasons supporting the selection of those materials.

    TABLE 19
    Floating body: High-Density Polyethylene (HDPE) / Marine-Grade Aluminum. Lightweight, corrosion-resistant, buoyant, and durable.
    Propelling wheels: Rubber-Coated Stainless Steel. Provides grip on water surfaces, corrosion-resistant, and durable.
    First & second roller: Stainless Steel (SS 304). Corrosion-resistant, strong, and ensures smooth belt movement.
    Inclined conveyor belt: Polyurethane (PU) / PVC. Water-resistant, durable, and flexible for efficient waste collection.
    Container: Fiberglass-Reinforced Plastic (FRP). Lightweight, corrosion-resistant, and durable for prolonged use.
    Supporting frame: Aluminum Alloy / Stainless Steel. Strong, lightweight, and resistant to corrosion for holding the AI imaging unit and sensors.
    Water salinity sensor: Titanium / PVC Housing. Corrosion-resistant, non-reactive to water, and durable.
    Controller housing: Aluminum Enclosure (IP67-rated). Waterproof, heat-resistant, and protects electronic components.

    [0142] FIG. 5 refers to a flow chart 500 of a method for predicting and selectively removing floating waste 10 from water bodies using the AI-based autonomous system 100. The method comprises a floating waste collection phase and an organic waste analysis and removal phase. At step 502, the AI-based autonomous system 100 is activated to initiate the floating waste collection process. At step 504, the autonomous water-cleaning vehicle 128 collects the floating waste 10, such as the recyclable waste 14, which can include, but is not limited to, paper and plastic, from the water surface. At step 506, the AI imaging unit 160 detects and monitors the direction of the floating waste 10 for efficient collection. At step 508, the computing device 170 directs the autonomous water-cleaning vehicle 128 toward the detected floating waste 10 based on input from the AI imaging unit 160. At step 510, the floating waste 10 is collected and deposited into the container 158.

    [0143] At step 512, the AI imaging unit 160 detects the presence of organic waste 12 on the water surface. At step 514, the controller 172 classifies the organic waste 12 into ecologically beneficial or harmful types using the AI imaging unit 160 based on the dataset of images related to the algae. At step 516, the controller 172 activates the optical fluorescence-sensing unit 164 and the water-salinity sensor 162 to analyze the organic waste 12. At step 518, the optical fluorescence-sensing unit 164 detects the vitality of the organic waste 12 based on fluorescence decay lifetime characteristics and temporal-attenuation profiles. At step 520, the water-salinity sensor 162 measures water salinity to assess environmental conditions. At step 522, the optical fluorescence-sensing unit 164, in conjunction with the controller 172, computes the volume of the organic waste 12 using fluorescence intensity, ranging data, and point-cloud reconstruction techniques.

    [0144] At step 524, the controller 172 analyzes the detected organic waste data, including type, vitality, volume, and water salinity, to determine the appropriate removal action. At step 526, the controller 172 generates an organic-waste removal decision by comparing the type, vitality, and volume of the organic waste 12 with salinity-dependent ecological threshold values stored in the non-transitory memory 176. If the organic waste 12 is determined to be beneficial, alive, and below the threshold volume, the controller 172 inhibits activation of the conveyor unit 151, leaving the organic waste 12 in place. If the organic waste 12 is determined to be harmful, dead, or in excess of the threshold volume, the controller 172 activates the conveyor unit 151 to remove the organic waste 12 from the water surface.

    [0145] FIG. 6 refers to a flow chart 600 of a method for predicting and selectively removing floating waste from water bodies using the artificial intelligence (AI)-based autonomous system 100. At step 602, the controller 172 receives the image data of the organic waste 12 generated by the artificial intelligence (AI) imaging unit 160, the water-salinity levels obtained from the water-salinity sensor 162, the fluorescence-decay lifetime characteristics, the temporal-attenuation profiles, and the ranging data of the organic waste 12 produced by the optical fluorescence-sensing unit 164. The organic waste 12 is algae.

    [0146] At step 604, the controller 172 generates a unified, time-synchronized, multi-parameter dataset by performing normalization and synchronization of the image data, the water-salinity levels, and the fluorescence-decay lifetime characteristics and ranging data. At step 606, the controller 172 classifies organic waste 12 into ecologically beneficial and ecologically harmful types using the AI imaging unit 160. The AI imaging unit 160 is pre-programmed with a dataset of images representing green algae (Chlorophyta), diatoms (Bacillariophyta), red algae (Rhodophyta), blue-green algae (Cyanobacteria), dinoflagellates (Dinophyta), and golden algae (Chrysophyta).

    [0147] At step 608, the controller 172 determines a vitality level of the organic waste 12 based on the fluorescence-decay lifetime characteristics and the temporal-attenuation profiles using a machine learning model. The controller 172 is pre-programmed with fluorescence decay pattern data for determining algae vitality. The machine learning model can include, but is not limited to, a support vector machine (SVM), k-means clustering, DBSCAN clustering, or long short-term memory (LSTM) networks. The temporal-attenuation profiles are fluorescence decay patterns with decay times between 650-1100 nanoseconds.

    [0148] At step 610, the controller 172 computes the volume of the organic waste 12 using the fluorescence intensity, the ranging data, and point-cloud reconstruction techniques to generate a three-dimensional biomass model, thereby enabling structured volumetric estimation and optimized memory usage during point-cloud reconstruction. The point-cloud reconstruction techniques can include, but are not limited to, regression analysis, point cloud processing, or Kalman filter models.

    [0149] At step 612, the controller 172 compares the types, the vitality level, and the volume of the organic waste 12 with salinity-dependent ecological threshold values stored in the non-transitory memory 176 to generate an organic-waste removal decision using a decision model.

    [0150] The dynamic ecological thresholding reduces unnecessary removal of beneficial organic waste. The decision model can include, but is not limited to, fuzzy logic and random forest classifier models. The organic-waste removal decision is generated by comparing the computed volume against salinity-dependent threshold values stored in the non-transitory memory 176.

    [0151] The salinity-dependent ecological threshold values define maximum beneficial organic waste volumes of 10-30 g/m^3 for salinity below 0.5 ppt, 30-50 g/m^3 for salinity of 0.5-5 ppt, 50-100 g/m^3 for salinity of 5-18 ppt, 100-200 g/m^3 for salinity of 18-30 ppt, 200-500 g/m^3 for salinity of 30-35 ppt, and over 500 g/m^3 for salinity above 35 ppt.
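    The salinity-to-threshold mapping recited above can be sketched as a piecewise lookup. This is an illustrative helper only (the function name and the handling of band boundaries are assumptions; the disclosure leaves edge behavior at exact band limits unspecified).

```python
def salinity_threshold(ppt: float) -> tuple:
    """Return the (low, high) beneficial-algae volume band in g/m^3
    for a measured salinity, following the bands recited in [0151]."""
    bands = [
        (0.5,  (10, 30)),    # freshwater
        (5.0,  (30, 50)),    # brackish
        (18.0, (50, 100)),   # slightly saline
        (30.0, (100, 200)),  # moderately saline
        (35.0, (200, 500)),  # marine
    ]
    for upper, band in bands:
        if ppt < upper:
            return band
    return (500, float("inf"))  # hypersaline, above 35 ppt
```

    The controller would compare the volume computed at step 610 against the upper limit of the returned band when generating the removal decision at step 612.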

    [0152] At step 614, the controller 172 activates the conveyor unit 151 of the autonomous water-cleaning vehicle 128 when the removal decision indicates harmful, dead, or excessive organic waste by the decision model, and inhibits activation when the organic waste 12 is determined to be beneficial and below threshold levels, thereby reducing mechanical load and decreasing energy consumption.

    [0153] At step 616, the controller 172 commands a navigation unit 168 to maneuver the autonomous water-cleaning vehicle 128 toward or away from detected organic-waste regions based on the removal decision, thereby increasing propulsion efficiency and reducing energy consumption through optimized path-planning. At step 618, the computing device 170 enables selective activation of the conveyor unit 151 to remove the harmful or excessive organic waste 12 as determined by the organic-waste removal decision from the water surface, thereby producing a tangible modification to the water body.

    [0154] In some embodiments herein, the machine learning models employed in the AI-based autonomous system 100, including but not limited to Support Vector Machine (SVM), K-Means and DBSCAN clustering, Long Short-Term Memory (LSTM) networks, Fuzzy Inference Systems, and Random Forest classifiers, are implemented using established computational frameworks (for example, TensorFlow, PyTorch, scikit-learn) executed on the disclosed controller (for example, NVIDIA Jetson Orin Nano). Training of these machine learning models utilizes labeled datasets comprising the paired multi-parameter dataset and ground-truth ecological outcomes. For classification tasks (for example, beneficial vs. harmful algae), the AI imaging unit dataset (shown in Table 12) is expanded via public ecological image repositories such as the NOAA Harmful Algal Bloom (HAB) archive and the AlgaeBase taxonomic database, applying standard augmentation techniques (rotation, scaling, lighting variation) to improve model robustness. For vitality determination, fluorescence decay patterns (Table 14) are used as time-series inputs to LSTM networks or as feature vectors for SVM classifiers, where the models are trained to correlate specific decay lifetimes (for example, 650-1100 ns for live algae) with vitality states confirmed via laboratory microscopy. The fuzzy logic and Random Forest decision models integrate these classification outputs with continuous variables (salinity, estimated volume) by applying predefined rule sets (for example, IF algae_type=harmful AND volume>threshold AND vitality=low THEN remove=YES) and trained ensemble trees on historical water-quality data.
    The resulting models are optimized for edge deployment, employing quantization and pruning to operate within the non-transitory memory 176 and processing constraints of the embedded controller while maintaining inference accuracy above predetermined thresholds (for example, >90% precision for harmful algae identification). Any of the listed machine-learning models is suitable as long as it achieves classification performance above the predetermined accuracy thresholds, as described in the examples.

    [0155] In the foregoing description various embodiments of the present disclosure have been presented for the purpose of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise form disclosed. Obvious modifications or variations are possible in light of the above teachings. The various embodiments were chosen and described to provide the best illustration of the principles of the disclosure and their practical application, and to enable one of ordinary skill in the art to utilize the various embodiments with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the present disclosure as determined by the appended claims when interpreted in accordance with the breadth they are fairly, legally, and equitably entitled.

    [0156] It will readily be apparent that numerous modifications and alterations can be made to the processes described in the foregoing examples without departing from the principles underlying the invention, and all such modifications and alterations are intended to be embraced by this application.