AIoT Advances in Aquaculture: A Review
Abstract
The integration of artificial intelligence (AI) and the internet of things (IoT), known as AIoT, is driving significant advancements in the aquaculture industry, offering solutions to longstanding challenges related to operational efficiency, sustainability, and productivity. This review explores the latest research studies in AIoT within the aquaculture industry, focusing on real-time environmental monitoring, data-driven decision-making, and automation. IoT sensors deployed across aquaculture systems continuously track critical parameters such as temperature, pH, dissolved oxygen, salinity, and fish behavior. AI algorithms process these data streams to provide predictive insights into water quality management, disease detection, species identification, biomass estimation, and optimized feeding strategies, among others. Much as AIoT adoption in aquaculture is advantageous on various fronts, there are still numerous challenges, including high implementation costs, data privacy concerns, and the need for scalable and adaptable AI models across diverse aquaculture environments. This review also highlights future directions for AIoT in aquaculture, emphasizing the potential for hybrid AI models, improved scalability for large-scale operations, and sustainable resource management.

1. Introduction

Aquaculture, often referred to as fish farming, is the cultivation of aquatic organisms under controlled conditions [1]. It plays a critical role in meeting the growing global demand for seafood while reducing pressure on wild fish populations [2]. This practice extends beyond fish to include a diverse array of aquatic species cultivated for food, environmental restoration, and ornamental purposes [3,4].
Fish are the most prominent group in aquaculture, with species such as salmon, tilapia, catfish, trout, carp, and seabass commonly farmed due to their commercial value and adaptability to aquaculture environments [4,5]. In addition to fish, crustaceans like shrimp and crabs are significant contributors, especially in regions where their demand is high. Shrimp farming, in particular, has become a cornerstone of global aquaculture exports [6,7]. Mollusks, including scallops, oysters, and mussels, are another essential category. These species are valued for their economic and nutritional contributions and are often cultivated in marine environments. Pearls, produced by certain species of oysters, are an example of aquaculture’s intersection with luxury markets. Coral farming, though niche, is increasingly vital for ornamental trade and ecological restoration, particularly in efforts to rehabilitate damaged coral reefs [8,9]. Other species cultivated in aquaculture include jellyfish, which are farmed for both culinary purposes and ecosystem management, and aquatic macroinvertebrates, which serve as indicators of environmental health and are used in ecological assessments [10,11]. Phytoplankton, integral to the aquatic food chain, are also cultivated for their ecological and nutritional roles [12,13]. Aquatic plants such as seaweeds and algae further diversify aquaculture, with applications ranging from human consumption to cosmetics and biofuel production [1,14]. This broad spectrum of species reflects the adaptability and significance of aquaculture in addressing global challenges, including food security, environmental sustainability, and economic development.
Aquaculture continues to evolve as a vital component of modern agriculture and environmental management, facilitated by AIoT, which integrates AI and the IoT to enhance fish farming practices through improved productivity, sustainability, and operational efficiency [15,16,17,18,19]. The development of IoT devices, including sensors, cameras, and monitoring systems, has revolutionized data collection from remote and dynamic aquaculture environments [20,21]. These devices provide continuous streams of data on critical parameters such as water quality, fish behavior, feeding patterns, and environmental conditions, and AI algorithms analyze these data to generate actionable insights that drive real-time decision-making and autonomous operations, thereby reducing human intervention and enabling smarter, more sustainable practices [22,23].
The foundational IoT infrastructure in aquaculture consists of data-gathering devices, connectivity gateways, and cloud platforms for data storage and processing [20]. This infrastructure supports real-time monitoring and analysis of environmental parameters like temperature, salinity, pH, and dissolved oxygen, factors critical for maintaining optimal aquatic conditions [24]. Cloud computing enables the parallel processing and scalability needed to handle large-scale datasets efficiently, providing a robust platform for integrating AI capabilities into aquaculture systems. AIoT applications in aquaculture span a broad spectrum of innovations, including smart feeding systems, water quality management, disease detection, fish biomass estimation, fish behavior monitoring, organism counting, species segmentation and classification, breeding and growth estimation, individual fish tracking, and automation and robotics. For instance, IoT-enabled smart feeding systems utilize data from sensors and underwater cameras to monitor fish behavior and optimize feeding schedules and quantities [25]. AI models then analyze these data to ensure fish are fed adequately while minimizing waste, improving growth rates, and reducing environmental impact.
In water quality management, IoT sensors monitor variables like temperature, pH, dissolved oxygen, and ammonia levels, while AI-based predictive models use this data to forecast trends and recommend interventions [26,27]. Deep learning techniques such as Long Short-Term Memory networks (LSTMs) analyze time-series data to predict oxygen depletion [26,28], enabling proactive aeration adjustments. Similarly, reinforcement learning (RL) algorithms dynamically optimize water management strategies to balance energy efficiency with environmental stability. Early disease detection and prevention are critical to reducing losses in aquaculture, and AIoT systems offer significant advancements in this area. IoT-enabled cameras capture high-resolution images of fish, which are analyzed by deep learning models like YOLO and U-Net to detect early signs of disease, such as lesions or abnormal swimming patterns [21,29,30,31]. These systems provide real-time alerts, allowing for rapid interventions to contain outbreaks. AIoT also supports fish biomass estimation through sonar and camera-based data, where machine learning models calculate population density and size distribution, aiding in resource allocation and harvesting strategies.
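To make this concrete, the sketch below shows a minimal LSTM forecaster that predicts the next dissolved-oxygen reading from a sliding window of past sensor values. It is an illustrative simplification, with the window length, layer sizes, and synthetic data chosen by us as assumptions; it does not reproduce the models reported in [26,28].

```python
import torch
import torch.nn as nn
import numpy as np

class DOForecaster(nn.Module):
    """Minimal LSTM that maps a window of past DO readings to the next value."""
    def __init__(self, n_features=1, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):             # x: (batch, window, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # predict from the last time step

def make_windows(series, window=24):
    """Turn a 1-D sensor series into (window -> next value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return (torch.tensor(X, dtype=torch.float32).unsqueeze(-1),
            torch.tensor(y, dtype=torch.float32).unsqueeze(-1))

# Synthetic hourly DO values (mg/L) standing in for real sensor data.
do_series = 7 + 0.8 * np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * np.random.randn(2000)
X, y = make_windows(do_series)

model = DOForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):               # short demonstration training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
```

In a deployed system, a forecast falling below a safety threshold would trigger the proactive aeration adjustments described above.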
Applications such as species segmentation, organism counting, and breeding optimization further demonstrate AIoT’s versatility. CNN-based models like ResNet and VGG facilitate species identification and population monitoring, ensuring compliance with regulations and maintaining biodiversity [32,33]. Selective breeding programs benefit from AI models that analyze genetic and environmental data to predict traits like disease resistance and growth efficiency, enhancing productivity and resilience. Robotics and autonomous systems integrated with AIoT technologies are redefining aquaculture operations. Autonomous underwater vehicles (AUVs) equipped with cameras and sensors perform tasks such as tank cleaning, fish inspections, and feed delivery with precision and minimal human oversight [34]. Reinforcement learning algorithms enhance these systems, enabling them to adapt to complex environments and learn optimal strategies through trial-and-error interactions [35].
These advancements are pivotal for the aquaculture industry, which has seen exponential growth over recent decades. Since the 1950s, aquaculture’s share of global fisheries output has risen from 4% to nearly half of total fish production in 2020, with an estimated market value of US$265 billion [2,36]. By 2030, aquaculture is projected to account for more than half of global fish production. Though the sector still faces some challenges such as environmental fluctuations, disease outbreaks, and feeding deficiencies [37], AIoT-enabled systems are uniquely positioned to address these challenges by integrating predictive analytics, real-time monitoring, and autonomous decision-making. Moreover, emerging technologies such as edge computing, blockchain, and hybrid AI models hold further promise for enhancing AIoT’s role in aquaculture [38]. Edge computing enables faster, decentralized data processing, reducing latency in decision-making, while blockchain improves data security and traceability [39].
This review comprehensively analyzes recent advancements in AIoT within aquaculture. We analyze 215 research papers (2012–present), categorizing them into ten application areas: smart feeding systems, water quality management, disease detection and prevention, fish biomass estimation, fish behavior monitoring, organism counting, species segmentation and classification, breeding and growth estimation, individual fish tracking, and automation and robotics. Further, we delve into AIoT advantages and adoption challenges, including high implementation costs, data privacy concerns, and the need for scalable and adaptable AI models across various environments. Finally, we discuss future AIoT trends, emphasizing potential developments in hybrid models, scalability solutions, and AIoT’s role in promoting sustainable aquaculture practices. By offering insights into the long-term potential of AIoT, this paper sheds light on its transformative impact on global aquaculture.
This review paper makes significant contributions to the literature on AIoT integration in aquaculture, setting itself apart through its unparalleled breadth and depth of analysis:
(1) The paper systematically categorizes 215 research works (2012–present) into ten application areas, offering a holistic perspective on AIoT advancements, challenges, and opportunities in aquaculture. This organization provides a comprehensive understanding unmatched by prior reviews.
(2) To the best of our knowledge, this work represents the most extensive review of AIoT applications in aquaculture, integrating diverse insights to highlight the transformative potential of these technologies and serving as a foundational resource for multidisciplinary research and innovation.
(3) The review addresses critical adoption barriers such as high costs, data privacy concerns, and scalability issues. It also explores future directions, including hybrid AI models, blockchain for secure data management, and edge computing for real-time, remote operations.

2. AIoT Components in Aquaculture

The integration of AIoT in aquaculture relies on two core components: IoT sensors for real-time data collection and AI algorithms for data processing and decision-making. IoT sensors continuously monitor key environmental and biological factors, such as water quality (pH, temperature, dissolved oxygen) and fish behavior. The deployment of sensors is supported by advanced communication protocols, such as LoRa, NB-IoT, and 5G, ensuring reliable data transmission from remote farms to central management systems. These components, as illustrated in Figure 1, work together to enhance efficiency, reduce waste, and improve the sustainability of aquaculture operations by enabling automated, data-driven management practices.

2.1. IoT Sensors

IoT sensors are central to transforming aquaculture through real-time data collection and monitoring, enabling precise management of both environmental and biological parameters crucial for maintaining healthy aquatic ecosystems [20]. For example, dissolved oxygen levels, which naturally fluctuate throughout the day, can be monitored in real-time to alert farmers to potentially harmful drops, enabling them to activate aeration systems as needed [40]. Temperature sensors likewise enable quick detection of sudden temperature shifts, allowing farmers to maintain optimal conditions that reduce fish stress and prevent thermal shock [41]. Moreover, IoT sensors facilitate early warning systems for issues such as disease outbreaks, feeding inefficiencies, and water pollution [42].

2.1.1. Water Quality Sensors

Water quality sensors are essential in maintaining the health and productivity of aquaculture systems by continuously monitoring critical parameters like pH, temperature, dissolved oxygen, salinity, and ammonia. Studies have demonstrated that these factors significantly influence aquatic species’ physiological well-being and growth. Slight deviations can lead to stress, vulnerability to disease, and, in severe cases, mortality [26]. Recent advancements have aimed to improve sensor precision, durability, and cost-effectiveness, ensuring that water quality monitoring meets the high standards required for aquaculture environments. For pH measurement, glass electrode sensors and ion-sensitive field-effect transistor (ISFET) sensors are commonly utilized due to their reliability in detecting subtle changes in water acidity, a critical aspect of fish health [43]. Glass electrodes, although widely used, require regular calibration and can be affected by biofouling, which limits their long-term applicability in aquaculture [44]. In contrast, ISFET sensors offer better resistance to drift and durability, making them more suitable for harsh aquatic environments.
Temperature sensors, such as thermocouples and resistance temperature detectors (RTDs), are crucial for regulating water conditions that directly affect fish metabolism and growth rates. Dissolved oxygen (DO) levels are another critical factor in fish welfare, with levels needing to remain within specific ranges to prevent hypoxia, which can impair fish health [16]. Electrochemical DO sensors, such as the Clark electrode, operate through amperometric techniques but require regular maintenance to address membrane fouling, a recurring issue in high-organic-content waters [27]. Optical DO sensors, based on fluorescence quenching by molecular oxygen, provide a non-invasive and accurate alternative, and recent studies underscore their effectiveness in aquaculture. However, these sensors are more costly and can degrade over time, particularly when exposed to certain chemicals in water.
Salinity, which influences osmotic balance in fish, is often monitored with conductivity-based sensors, specifically through inductive methods. Inductive sensors have become popular due to their ability to isolate sensing components from direct contact with water, thus minimizing biofouling and corrosion [45]. Studies have shown that while conductive sensors are effective for consistent salinity readings, inductive sensors are more durable for long-term deployment in aquaculture, where biofouling is a significant challenge [46].
Ammonia, a byproduct of metabolic waste, is toxic to fish at high levels, and its monitoring is therefore critical for maintaining safe water conditions. Ion-selective electrodes (ISEs) and colorimetric sensors are the two main approaches to ammonia measurement in aquaculture. ISEs provide accurate, continuous measurements but are prone to sensor drift, particularly under variable water conditions, necessitating frequent recalibration [47]. Colorimetric methods, while effective in detecting ammonia, are limited by slower response times and are generally less suitable for real-time applications in high-density aquaculture systems [48]. Collectively, these sensor technologies support an integrated approach to aquaculture water quality monitoring. However, literature highlights several challenges, including sensor drift, biofouling, and maintenance requirements, which can impact data reliability and sensor lifespan [49]. Consequently, recent research has focused on enhancing sensor materials, developing anti-biofouling surfaces, and integrating multi-sensor systems to improve data accuracy, minimize maintenance, and facilitate rapid detection of water quality fluctuations.

2.1.2. Optical Sensors

Optical sensors have been widely adopted in water quality monitoring for aquaculture, offering valuable insights into turbidity and water clarity, both essential indicators of environmental health in aquatic systems. Turbidity can arise from sediment particles, organic matter, or plankton, which may compromise fish health by altering oxygen levels, light penetration, and other ecological conditions [42]. Increased turbidity is often linked with the presence of pollutants or phytoplankton blooms, both of which can elevate stress and disease risk among fish [44]. Optical turbidity sensors operate based on light interactions with particles, generally measuring either scattering or absorption.
Cameras, typically combined with computer vision algorithms, are widely used for behavioral analysis, allowing for the real-time monitoring of movement and spatial distribution within tanks. Studies have demonstrated that analyzing fish swimming patterns and spatial dispersion can be effective in identifying early stress signals and detecting disease [50]. For instance, abnormal changes in swimming speed, turning rates, or erratic movement patterns often indicate environmental stress or health issues, prompting timely interventions [51]. Additionally, computer vision-based tracking can optimize feeding routines by correlating feeding response with behavior, thereby reducing feed waste and promoting sustainable practices [8]. Despite their benefits, optical sensors have limitations, such as biofouling, which can obstruct optical pathways and lead to inaccurate measurements over time. Given these considerations, optical turbidity sensors remain a practical, cost-effective solution for aquaculture water quality management, particularly when accurate, low-energy measurements are required. Their ability to detect subtle changes in water clarity allows for proactive responses to environmental shifts, potentially preventing adverse effects on fish health and productivity in aquaculture facilities.

2.1.3. Motion Sensors

Motion sensors, including accelerometers, are crucial in monitoring movement patterns, feeding behavior, and stress indicators. These sensors enable a non-invasive approach to tracking fish activity and well-being, which is essential for maintaining health and productivity. Accelerometers, typically embedded in tags or attached externally, offer high-resolution data on individual fish movements, such as tail-beat frequency, burst swimming, or rest periods. This data provides critical insights into metabolic rates, activity levels, and feeding behavior. Studies have found that accelerometer-based monitoring is effective in detecting subtle changes that may not be apparent through visual observation alone, such as increased activity due to suboptimal water quality or decreased activity due to illness [52]. Integrating accelerometers with telemetry systems has further allowed for remote, real-time behavioral monitoring, improving the ability to respond quickly to changes in fish health.

2.1.4. Deployment Strategies

Effective deployment strategies for IoT sensors in aquaculture are essential to maximize data accuracy and reliability, as well as to meet specific environmental and operational demands. Selecting the appropriate wireless communication protocol is a primary consideration, with LoRa (Long Range), NB-IoT (Narrowband IoT), and 5G emerging as popular options due to their distinct advantages. LoRa, for instance, is widely favored for large-scale, remote aquaculture applications because of its long-range transmission and low power requirements, allowing for effective monitoring in vast or hard-to-reach areas [42]. In contrast, 5G technology is gaining traction in aquaculture settings that require high data throughput and low latency, particularly for high-frequency video feeds used in real-time behavior monitoring. NB-IoT also offers a viable solution in scenarios where power efficiency and long-range capabilities are needed without the bandwidth demands of 5G, proving effective for transmitting data from lower-frequency sensors like dissolved oxygen or pH meters. Sensor placement is another crucial aspect of deployment. Strategic distribution across various depths and locations within ponds, tanks, or cages allows for a comprehensive assessment of environmental variables, from temperature gradients to oxygen levels, which can vary significantly within the same body of water.
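As a concrete illustration of the transmission step, the sketch below shows a gateway publishing periodic water-quality readings to a central broker over MQTT, a lightweight messaging protocol often layered on top of such links. The protocol choice, broker address, topic layout, and payload fields are illustrative assumptions and are not drawn from the cited deployments.

```python
import json
import time
import paho.mqtt.publish as publish

BROKER_HOST = "aquafarm-gateway.local"   # hypothetical broker address
TOPIC = "farm/pond1/water_quality"       # hypothetical topic layout

def read_sensors():
    """Placeholder for driver code that polls the actual probes."""
    return {
        "timestamp": time.time(),
        "temperature_c": 26.4,
        "ph": 7.8,
        "dissolved_oxygen_mg_l": 6.9,
        "salinity_ppt": 30.1,
    }

# Gateway loop: one reading per minute; QoS 1 gives at-least-once delivery
# over intermittent links such as those found at remote pond sites.
while True:
    payload = json.dumps(read_sensors())
    publish.single(TOPIC, payload, qos=1, hostname=BROKER_HOST, port=1883)
    time.sleep(60)
```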

2.2. AI Algorithms

AI algorithms are critical in converting extensive data collected by IoT sensors into actionable insights, transforming farm management practices through predictive modeling, automation, and optimizing processes like feeding and water management. The integration of machine learning (ML), deep learning (DL), reinforcement learning (RL), and computer vision in aquaculture enables efficient handling and analysis of diverse, complex datasets, each bringing unique strengths that address specific challenges within the field. The overall pipeline of AIoT for automation of aquaculture operations is summarized in Figure 2.
This pipeline begins with data acquisition. IoT sensors, underwater cameras, and aerial surveillance systems collect real-time data on critical parameters such as water quality, fish behavior, feeding activity, and environmental conditions. This stage ensures continuous and accurate data collection, forming the foundation for all subsequent analyses. Once the data is collected, it is transmitted wirelessly to centralized or cloud-based servers for processing. Wireless communication modules facilitate secure and efficient transmission, ensuring minimal delays and data loss, and enabling seamless integration across devices and systems. Before being processed by AI algorithms, the raw data undergoes preprocessing. Techniques such as filtering, augmentation, deblurring, normalization, and data integration are applied to clean and format the data, ensuring its quality and consistency for reliable analysis. At the pipeline's core is the AI algorithm processing stage, where advanced machine learning and AI models analyze the preprocessed data to extract actionable insights. This step encompasses a wide range of tasks tailored to specific aquaculture needs. For instance, smart feeding systems optimize feeding schedules based on fish behavior and environmental factors, while water quality management algorithms predict and maintain optimal water parameters. AI also facilitates behavior monitoring to detect stressors or abnormal patterns in fish activity, organism counting for accurate stock estimation, and segmentation and classification of aquaculture species to monitor biodiversity and species health [53].
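A minimal sketch of the preprocessing stage is given below; the specific filtering rules, smoothing window, interpolation limit, and min-max normalization are illustrative assumptions rather than a prescribed pipeline.

```python
import numpy as np
import pandas as pd

def preprocess(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and normalize raw sensor readings before they reach the AI models."""
    df = raw.copy()
    # 1. Filtering: drop physically impossible readings and smooth short spikes.
    df = df[(df["ph"].between(0, 14)) & (df["do_mg_l"] >= 0)]
    df = df.rolling(window=5, min_periods=1).median()
    # 2. Gap handling: interpolate short dropouts from the wireless link.
    df = df.interpolate(limit=3)
    # 3. Normalization: scale each parameter to [0, 1] for the downstream models.
    return (df - df.min()) / (df.max() - df.min() + 1e-9)

# Synthetic example standing in for a minute of pond telemetry.
raw = pd.DataFrame({
    "temp_c": np.random.normal(26, 0.3, 60),
    "ph": np.random.normal(7.8, 0.05, 60),
    "do_mg_l": np.random.normal(6.5, 0.2, 60),
})
clean = preprocess(raw)
```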

3. AIoT Applications in Aquaculture

Integrating AIoT into aquaculture is revolutionizing farm management by merging real-time IoT sensor data collection with advanced AI-driven analytics. This synergy is enhancing efficiency, sustainability, and productivity in aquaculture operations. We summarize the existing AIoT applications in Figure 3.

3.1. Smart Feeding Systems

Smart feeding systems represent a transformative application area in aquaculture. These applications monitor fish feeding activity and environmental parameters to automate feeding, ensuring aquatic organisms receive the appropriate amount of food at optimal times. Research to date has provided a solid foundation for smart feeding systems, exploring various methods to enhance feeding strategies and efficiency [8,23,25,54,55,56,57,58,59,60,61,62,63,64].

3.1.1. Feeding Frequency and Nutrient Requirements

Efficient smart feeding systems that optimize feed utilization and growth metrics across species are essential in aquaculture. Extensive research is being conducted to design systems that refine feeding schedules based on fish activity and improve nutrient distribution. For instance, the impact of feeding frequency on fish growth is investigated in [54]. The authors employ a gradient boosting machine (GBM) model to optimize the feed conversion ratio (FCR) and average daily weight gain (ADG), achieving high predictive accuracy, with R² values of 93.03% and 72.49%, respectively. The study establishes that increased feeding frequency generally reduces FCR and that feeding to satiation maximizes ADG. Similarly, the authors in [55] use a bioenergetic and protein flux model to optimize feeding in gilthead seabream, achieving a 46.85% improvement in final fish weight by adjusting feed composition and frequency. These studies emphasize the significance of feeding frequency and nutrient balance in achieving optimal growth and minimizing waste, highlighting a need for tailored feeding schedules across species.
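The following sketch illustrates how such a gradient boosting regressor can be set up for FCR prediction; the synthetic features and hyperparameters are illustrative assumptions and do not reproduce the model or data of [54].

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500

# Synthetic records: feedings per day, ration (% body weight), water temperature.
X = np.column_stack([
    rng.integers(1, 7, n),          # feeding frequency
    rng.uniform(1.0, 5.0, n),       # daily ration
    rng.uniform(22.0, 30.0, n),     # temperature (deg C)
])
# Synthetic FCR target: more frequent, moderate feeding tends to lower FCR.
y = 2.2 - 0.08 * X[:, 0] + 0.05 * (X[:, 1] - 3.0) ** 2 + 0.1 * rng.standard_normal(n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```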

3.1.2. Automated Detection of Feeding Behavior

Automatic fish-feeding behavior detection is critical in determining whether fish are satiated or require more food. Multiple studies have utilized computer vision and acoustic sensing to monitor and dynamically adjust feeding protocols. For instance, in [23], a computer vision system with an R(2+1)D deep learning model is proposed to detect feeding activities by analyzing water surface disturbances (waves) during and after feeding, achieving a classification accuracy of 93.2%. This model shows potential for turbid outdoor environments, although species-specific adaptation remains an area for further validation. An enhancement of ResNet34 with CBAM attention is proposed in [56] for feeding intensity classification in tilapia, achieving a remarkable accuracy of 99.72%. Similarly, [59] advances computer vision techniques for high-density fish tanks, using an SVM classifier with particle filtering to enhance real-time feed detection. These studies demonstrate the versatility and effectiveness of AIoT in diverse aquaculture environments, though they also reveal challenges in adapting to species-specific and environmental variations.
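As a simplified illustration of the vision-based route, the sketch below sets up an ImageNet-pretrained ResNet34 (torchvision ≥ 0.13 API) for frame-level feeding-intensity classification; the class labels are assumptions, the training loop is omitted, and the code does not correspond to the R(2+1)D, CBAM, or particle-filtering models cited above.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 3  # e.g. "none", "weak", "strong" feeding activity (illustrative labels)

# Start from an ImageNet-pretrained backbone and replace the classification head.
# The new head must be trained on labelled feeding frames before use.
backbone = models.resnet34(weights=models.ResNet34_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_frame(pil_image):
    """Return the predicted feeding-intensity class for one camera frame."""
    backbone.eval()
    with torch.no_grad():
        x = preprocess(pil_image).unsqueeze(0)   # add batch dimension
        logits = backbone(x)
    return int(logits.argmax(dim=1))
```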

3.1.3. Acoustic-Based Monitoring and Behavioral Analysis

Acoustic-based monitoring has emerged as another effective technique for assessing feeding behavior, especially useful for detecting subtle differences in feeding intensity. In [57], a passive acoustic detection system is introduced to monitor shrimp feeding, using sound pattern analysis. The proposed system correlates feeding sound intensity with hunger levels. Despite challenges from ambient noise, this approach holds potential for real-time feed adjustments. Further, an audio spectrum Swin transformer (ASST) model is proposed in [62] to classify fish feeding intensity based on acoustic signals in aquaculture ponds, achieving a high classification accuracy of 96.16%. These acoustic-based systems offer valuable insights into fish behavior but face limitations in environments with background noise or low-intensity feeding signals, particularly in juvenile or smaller species.
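A simplified illustration of the acoustic route is sketched below: each hydrophone clip is summarized by pooled log-mel spectrogram statistics and passed to a conventional classifier. This stands in for, and does not reproduce, the transformer-based ASST model of [62]; the clip files and labels are hypothetical.

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def spectrogram_features(path, sr=22050):
    """Summarize a hydrophone clip as pooled log-mel spectrogram statistics."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)
    log_mel = librosa.power_to_db(mel)
    # Mean and standard deviation per mel band give a fixed-length feature vector.
    return np.concatenate([log_mel.mean(axis=1), log_mel.std(axis=1)])

# Hypothetical labelled clips: 0 = no feeding, 1 = weak, 2 = strong feeding sounds.
clips = ["clip_001.wav", "clip_002.wav", "clip_003.wav"]
labels = [0, 1, 2]

X = np.stack([spectrogram_features(p) for p in clips])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
```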

3.1.4. AI-Enhanced Precision Feeding Systems

Precision feeding systems, often enhanced with AI technologies, represent a leading approach in smart feeding systems. Authors in [25] explore this area using a multilayer perceptron neural network (MLPNN) trained on Fourier descriptors to classify fish behaviors, achieving a perfect accuracy of 100% for real-time feeding adjustments. Similarly, [64] combines image segmentation with fuzzy inference through a feature fusion attention U-Net (FFAUNet) and an adaptive neuro-fuzzy inference system (ANFIS) optimized by particle swarm optimization (PSO). This hybrid model reaches a classification accuracy of 98.57%, providing precise feeding based on water waves representing the feeding intensities (heavy, medium, normal).
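The Fourier-descriptor idea can be illustrated with a short sketch: boundary points of a segmented blob (for example, a surface-wave or fish silhouette) are mapped to translation-, scale-, and rotation-invariant descriptors and fed to a small MLP. The contour extraction step, descriptor count, and synthetic labels are assumptions, and the sketch does not reproduce the pipeline of [25].

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def fourier_descriptors(contour_xy, n_coeffs=16):
    """Invariant shape descriptors from a closed contour of (x, y) boundary points."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex representation of the boundary
    coeffs = np.fft.fft(z)
    coeffs[0] = 0                                  # drop DC term -> translation invariance
    mags = np.abs(coeffs)                          # magnitudes -> rotation invariance
    mags = mags / (mags[1] + 1e-12)                # normalize by first harmonic -> scale invariance
    return mags[1:n_coeffs + 1]

# Hypothetical training set of contours labelled by behaviour class (0 = calm, 1 = feeding).
rng = np.random.default_rng(0)
contours = [rng.random((100, 2)) for _ in range(40)]
labels = rng.integers(0, 2, 40)

X = np.stack([fourier_descriptors(c) for c in contours])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, labels)
```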

3.1.5. Integrating Multimodal Data for Enhanced Monitoring

Existing studies have suggested integrating multimodal data can improve feeding system robustness and accuracy. CNN-based grading is combined with fuzzy control in [60] to assess feeding intensity more precisely, while [8] integrates ECG, acceleration, and environmental data to monitor heart rate and feeding in coral reef fish, using Bayesian change-point detection and Kullback–Leibler divergence for reliable detection of feeding events. Such multimodal approaches provide a holistic view of feeding behavior, allowing for comprehensive monitoring, though real-time data fusion and application in field settings present ongoing challenges.

3.2. Water Quality Management

Maintaining optimal water quality is fundamental for sustainable and productive aquaculture, as aquatic species’ health, growth, and welfare are directly influenced by environmental conditions. Deviations in parameters like dissolved oxygen (DO), pH, temperature, and ammonia levels can significantly affect fish health, impacting growth rates and disease susceptibility. Integrating IoT sensors and AI-based predictive analytics has proven essential in modern aquaculture, enabling continuous, real-time monitoring and early intervention to prevent adverse conditions. Studies have shown that proactive monitoring systems can effectively forecast critical events, such as oxygen depletion and algal blooms, promoting stable, healthier environments and supporting sustainable aquaculture practices.

3.2.1. Water Temperature

Water temperature directly influences metabolic rates, growth, and health of aquaculture organisms, making its management crucial. IoT-enabled sensors and AI-driven predictive models have enabled continuous temperature monitoring, allowing prompt responses to fluctuations. For example, an improved unscented Kalman filter (UKF) with sequential analysis and optimized extreme learning machine (ELM) is proposed in [65] to enhance temperature monitoring accuracy in aquaculture ponds. This approach reduces data redundancy and interference, providing robust real-time monitoring. Authors in [26] demonstrate the use of a hybrid CNN-LSTM-GRU model to capture time dependencies in temperature data. Moreover, in [28], a temporal dependence-based LSTM (TD-LSTM) model is introduced to capture temporal dependencies in marine temperature prediction, effectively improving accuracy across different ocean depths and regions.

3.2.2. pH

pH levels influence various biochemical processes in aquaculture, affecting nutrient availability, metabolic rates, and fish well-being. Various studies have highlighted the effectiveness of AI models for pH prediction, which helps maintain optimal pH ranges and prevent stress-induced health issues. For example, a random forest (RF) model is proposed in [16] for pH prediction. In [19], a simple recurrent unit (SRU) model was applied to predict pH changes based on data from outdoor ponds, achieving stable predictions over extended monitoring periods. This approach, supported by moving average smoothing, allows the model to capture both short-term variations and long-term pH trends. Such predictive models contribute to effective pH management, enabling aquaculture operators to make timely interventions and support optimal growth conditions.

3.2.3. Salinity

Salinity is a critical environmental factor that affects osmotic balance and physiological functions in aquatic species. Maintaining stable salinity levels is essential to prevent stress and optimize growth. A low-cost salinity sensor that uses calibration circuits to ensure accuracy in varying conditions is introduced in [46]. Meanwhile, [45] enhances salinity monitoring by integrating a novel transducer design that automatically adjusts sensitivity, achieving minimal linear error in salinity measurements. Similarly, the authors in [66] explore salinity management by developing a durable conductivity meter suitable for real-time monitoring in field conditions. These studies demonstrate the effectiveness of IoT-enabled salinity sensors, providing aquaculture operators with reliable data to maintain optimal salinity conditions and prevent adverse impacts on fish health.

3.2.4. Dissolved Oxygen (DO)

DO is vital for aquatic organisms, directly influencing respiration and overall health. Effective DO management in aquaculture helps prevent hypoxia-related stress and mortality. A predictive model that combines LightGBM, BiSRU, and an attention mechanism to predict DO levels accurately is developed in [67], allowing timely adjustments in high-density tanks. In [68], the ECA-Adam-RBFNN model leverages clustering and optimization techniques to handle non-linear DO fluctuations, demonstrating robust predictive accuracy under varying environmental conditions. In [27], an ensemble model that combines genetic algorithm (GA) for feature selection with XGBoost, CatBoost, and extra trees (XT) regressors (GA-XGCBXT) for DO predictions is proposed. The study utilizes a dataset from a fish farm in Jeju, South Korea, that includes measurements of DO, pH, temperature, electrical conductivity (EC), and oxidation-reduction potential (ORP), to achieve a prediction RMSE of 0.31. While effective, the model requires recalibration for different environmental conditions. Authors in [69] proposed an enhanced Naive Bayes model that leverages differential sequences and a sliding window method to handle continuous values, combined with Laplacian correction for improved predictive performance on DO values. Similarly, [70] introduces a hybrid PCA-LSTM model, which combines principal component analysis for feature selection with LSTM for time-series prediction, improving DO prediction accuracy. Moreover, [71] uses a sparse auto-encoder (SAE) with LSTM for DO prediction, capturing deep latent features from water quality data for enhanced accuracy.

3.2.5. Turbidity

Turbidity affects water clarity, impacting light penetration, photosynthesis, and fish feeding. High turbidity can affect oxygen distribution in the water and lead to stressful conditions for fish, necessitating effective monitoring. Authors in [72] utilized a multilayer perceptron neural network embedded in IoT-enabled sensor nodes to classify pollutants based on pH, turbidity, and temperature measurements. The proposed approach achieved reliable pollutant detection in controlled environments. Similarly, a fault diagnosis model is proposed in [73] that uses rule-based decision trees (RBDT) and support vector machines (SVM) to identify anomalies in turbidity data, thereby enhancing operational reliability. Such IoT-integrated models enable proactive turbidity management, helping maintain optimal water conditions in aquaculture systems.

3.2.6. Chlorophyll

Chlorophyll-a levels serve as indicators of phytoplankton biomass and water quality, but excessive growth can lead to harmful algal blooms, reducing oxygen and threatening aquatic health. Authors in [74,75] explore chlorophyll prediction through SVM and LSTM models, respectively, supporting early intervention in algal bloom events. Deep learning techniques using CNNs and LSTMs are applied in [76] to analyze chlorophyll data from satellite and IoT sources, demonstrating the potential for real-time chlorophyll monitoring across diverse aquatic environments. Similarly, in [77], an LSTM with batch normalization and dropout layers is utilized to improve short- and long-term chlorophyll-a predictions, while [78] uses LSTM for weekly chlorophyll-a concentration predictions to detect algal blooms in South Korean rivers.

3.2.7. Ammonia

Ammonia, a toxic byproduct of fish metabolism, poses significant health risks in aquaculture when present at elevated levels, leading to stress and compromised fish health. Monitoring ammonia levels is essential to prevent harmful accumulations. The concentration of ammonia is highly pH-dependent; when pH levels exceed 9.5, ammonia shifts from NH4+ to NH3, increasing its toxicity. Other factors, such as temperature, dissolved oxygen, total dissolved solids, and algal growth, also influence ammonia levels [13]. In aquaculture ponds, ammonia can build up from organic matter, uneaten feed, algal blooms, wastes of aquatic organisms, decomposing aquatic organisms, and nitrogen-rich compounds [79,80]. Ammonia toxicity is also a significant issue in landfills, where it accumulates and may seep into groundwater [43]. Effective ammonia monitoring in aquaculture ponds is crucial for assessing organism survival rates and overall water pollution levels [81].
The existing studies reflect a trend toward integrating predictive models and bioremediation techniques to improve water quality, which is essential for sustainable aquaculture practices. For example, a general regression neural network (GRNN) and long short-term memory (LSTM) networks are proposed in [47] for real-time prediction of ammonia nitrogen and nitrite levels in high-density shrimp aquaculture systems. However, while ammonia predictions were stable, nitrite concentration predictions were less consistent, particularly under high biofilter loads. Building on the use of hybrid AI models, Wang et al. [48] present an advanced approach with a combined XAdaBoost and LSTM model for ammonia nitrogen prediction in aquaculture, which achieves lower RMSE and MAPE values compared to existing methods. Meanwhile, Collos et al. [13] provide a biological perspective by examining ammonium toxicity in unicellular algae. Shifting from prediction to treatment, John et al. [79] explore the application of bioremediation through a bacterial consortium aimed at treating ammonia and nitrite in tilapia aquaculture wastewater. The bacterial treatment achieved an impressive reduction in ammonia, significantly improving the survival rate of tilapia. However, this lab-based experiment prompts questions regarding the efficacy of the consortium on a larger scale, particularly in varied aquaculture environments where external conditions could influence bacterial performance. In another predictive modeling approach, Yu et al. [80] propose a hybrid model that combines empirical mode decomposition (EMD), improved particle swarm optimization (IPSO), and extreme learning machine (ELM) for accurate ammonia prediction. This EMD-IPSO-ELM model demonstrated strong predictive capabilities with enhanced accuracy compared to other methods. Addressing ammonia inhibition in anaerobic digestion (AD), the authors in [43] explore how high ammonia levels affect biogas production during the digestion of municipal solid waste. The study finds that biogas production is inhibited at higher ammonia concentrations, reducing AD efficiency. A review by Karri et al. [81] further investigates ammonia management by comparing conventional and advanced removal methods in wastewater treatment, identifying the advantages and limitations of techniques such as ion exchange, membrane filtration, and biological treatments. Lastly, Nagaraju et al. [82] revisit predictive modeling with an emphasis on noise reduction in ammonia prediction within shrimp aquaculture: the pelican optimization algorithm (POA) is combined with discrete wavelet transform (DWT), which outperforms standalone POA in accuracy. Together, these studies provide a multifaceted perspective on ammonia management, from predictive AI models and biological treatments to analytical reviews of conventional methods.

3.3. Disease Detection and Classification

Early disease detection and classification are vital for effective control and management of disease outbreaks in aquaculture farms. This ultimately contributes to the health and productivity of fish and shrimp stocks. Disease outbreaks not only affect the welfare of cultured species but also lead to significant economic losses in fish farming practices.

3.3.1. Computer Vision-Based Disease Detection

Computer vision has been widely applied in aquaculture to detect visual symptoms of fish and shrimp diseases. For instance, in [7], an early detection mechanism for white spot syndrome virus (WSSV) in shrimp based on image processing techniques is proposed. The study applies an artificial neural network (ANN), with features extracted through Canny edge detection and gray-level co-occurrence matrix (GLCM) statistics. On a primary dataset of shrimp images featuring both healthy and WSSV-infected specimens, the model achieves 94.71% accuracy, 94.86% sensitivity, and 93.32% specificity, demonstrating its efficacy in distinguishing healthy from infected shrimp. However, limitations include potential image quality issues due to environmental variations and the model’s focus on WSSV, which could reduce its applicability to other shrimp diseases. Future directions involve expanding the dataset to cover additional diseases, enhancing feature extraction methods, and developing a real-time monitoring application for practical use in aquaculture. In [84,85], SVM and random forest classifiers combined with K-means clustering for image segmentation are utilized, achieving classification accuracies of 94.12% and 88.87%, respectively, in classifying Epizootic Ulcerative Syndrome (EUS) and Tail and Fin Rot diseases.
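For illustration, the sketch below extracts GLCM texture statistics with scikit-image (the functions are named graycomatrix/graycoprops in recent versions, greycomatrix/greycoprops in older ones) and trains a simple SVM in place of the ANN used in [7]; the images and labels are synthetic placeholders.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # greycomatrix/greycoprops in older scikit-image
from sklearn.svm import SVC

def glcm_features(gray_img):
    """Texture statistics from a gray-level co-occurrence matrix of an 8-bit image."""
    glcm = graycomatrix(gray_img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Hypothetical 8-bit grayscale crops of shrimp, labelled 0 = healthy, 1 = infected.
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(30)]
labels = rng.integers(0, 2, 30)

X = np.stack([glcm_features(im) for im in images])
clf = SVC(kernel="rbf").fit(X, labels)
```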
Attention mechanisms in deep learning models [83] have shown promise for disease classification in aquaculture by allowing networks to focus on disease-specific features. In [86], the convolutional block attention module (CBAM) is integrated with ResNet50, achieving an accuracy of 89.9%. Similarly, [2] proposes a background elimination approach and improves upon traditional CNN models by integrating an online sequential extreme learning machine (OSELM) and CBAM to enhance feature extraction, achieving 94.28% accuracy even in challenging underwater conditions. Related studies include [87], which classifies epizootic ulcerative syndrome (EUS), Ichthyophthirius (Ich), and Columnaris with an AlexNet CNN, and [30], which utilizes InceptionV3 and VGG16, achieving accuracy scores of 93.87% and 91.60%, respectively, on the SB-Fish disease dataset from Kaggle. In [88], multiple machine-learning models, including decision tree, logistic regression, Naive Bayes, multi-layer perceptron, and support vector machine (SVM) with different kernels, are reviewed, and SVM with a polynomial kernel achieves the highest accuracy in identifying EUS in fish. In [89], the authors aim to detect EUS in fish using image processing for improved diagnostic accuracy. The proposed model employs FAST and HOG for feature extraction, with PCA for dimensionality reduction and a neural network for classification. The results show that the FAST-PCA-neural network model achieves higher accuracy (86%) than HOG-PCA (65.8%), highlighting FAST’s effectiveness for EUS detection. A limitation is the model’s exclusive focus on EUS, which restricts broader application.

3.3.2. Water Quality-Linked Disease Prediction

Predictive models that monitor water parameters can detect deviations associated with disease risk. Authors in [90] employ a gradient boosting model (GBM) to classify disease risk based on water quality data. The model achieves a 92% accuracy rate in predicting disease likelihood by analyzing parameters such as pH, DO, and nitrate levels. Additionally, [91] integrates water quality monitoring with disease detection through IoT-enabled sensors and the MobileNetV2 CNN model, achieving 91.5% accuracy.

3.3.3. Cross-Modal and Zero-Shot Learning for Disease Identification

In cases where labeled image data for certain diseases are limited, cross-modal and zero-shot learning methods offer innovative solutions. A few studies have explored this area. One approach applies zero-shot learning by leveraging textual descriptions from scientific literature and mapping them onto image features to classify shrimp diseases, achieving effective disease recognition even without direct image data for certain diseases [92]. This cross-modal approach enables knowledge transfer from textual data to visual data, offering a promising solution for identifying diseases beyond the training set. Similarly, [93] further explores transfer learning for fish disease detection, using VGG-16 and AlexNet architectures with data augmentation to achieve a classification accuracy of 91%.

3.3.4. Ensemble and Hybrid Models for Disease Classification

Combining multiple models in ensemble or hybrid architectures has proven effective in enhancing classification accuracy. Authors in [94] introduce the performance metric-infused weighted ensemble (PMIWE) model, combining models such as ResNet-50, DenseNet-121, and EfficientNetB3 to classify freshwater fish diseases with an accuracy of 97.53%. PMIWE achieves high reliability and interpretability by fusing the strengths of various transfer learning models, making it particularly suitable for real-time disease diagnosis. Similarly, [95] employs a CNN with histogram of oriented gradients (HOG) and thresholding techniques for disease classification. This model exemplifies the utility of combining feature extraction techniques with CNN architectures to improve classification robustness.
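The underlying idea of performance-weighted fusion can be sketched in a few lines: each base model's softmax output is weighted by a validation metric, and the weighted average decides the class. This is a generic illustration, not the PMIWE formulation of [94].

```python
import numpy as np

def weighted_ensemble(prob_maps, val_scores):
    """Fuse per-model class probabilities, weighting each model by a validation metric.

    prob_maps:  list of (n_samples, n_classes) softmax outputs, one per base model.
    val_scores: list of validation accuracies (or F1 scores) for the same models.
    """
    weights = np.asarray(val_scores, dtype=float)
    weights = weights / weights.sum()                       # normalize weights to sum to 1
    fused = sum(w * p for w, p in zip(weights, prob_maps))  # weighted average of probabilities
    return fused.argmax(axis=1)                             # final class per sample

# Hypothetical outputs of three fine-tuned CNNs on five test images, four disease classes.
rng = np.random.default_rng(0)
outputs = [rng.dirichlet(np.ones(4), size=5) for _ in range(3)]
predictions = weighted_ensemble(outputs, val_scores=[0.95, 0.93, 0.97])
```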

3.3.5. Adaptive Neural Fuzzy Systems

These systems are crucial in enhancing diagnostic precision and management in aquaculture disease detection. For instance, [29] introduces an improved U-Net segmentation model with multi-head attention for accurately identifying disease-affected regions in fish images. Similarly, [96] targets fish health management by developing a fuzzy neural network-based expert system for diagnosing diseases in grass carp. Results show high diagnostic accuracy for common grass carp diseases, though the model currently focuses only on this species. Future improvements aim to extend the system’s applicability to other species, increase adaptability, and enable real-time monitoring.

3.3.6. Mobile and IoT-Enabled Disease Monitoring Systems

These solutions offer practical and scalable disease monitoring options for aquaculture. The studies [97] and [21] both focus on mobile and IoT-enabled disease detection systems. In [97], the authors utilize a CNN optimized for mobile applications to provide disease alerts and treatment recommendations via a cloud-based interface, making disease detection accessible to small and micro-scale fish farmers. The work in [21] employs the YOLOv5 model for disease detection integrated with water quality monitoring sensors, achieving 97.03% accuracy.
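For a sense of how such a detector is wired into an alerting workflow, the sketch below runs the publicly available YOLOv5 model through the torch.hub interface and applies a simple confidence-based alert rule. The image path, threshold, and alert logic are illustrative; a production deployment would load custom weights trained on disease imagery, which are not reproduced here.

```python
import torch

# Load the public YOLOv5-small model from the ultralytics/yolov5 hub repository.
# A real system would instead load disease-specific weights, e.g.:
#   torch.hub.load("ultralytics/yolov5", "custom", path="fish_disease.pt")  # hypothetical weights file
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

results = model("pond_camera_frame.jpg")   # hypothetical frame captured by an IoT camera
detections = results.pandas().xyxy[0]      # bounding boxes, confidences, class names
print(detections[["name", "confidence"]])

# A simple alerting rule: flag the frame if any detection exceeds a confidence threshold.
if (detections["confidence"] > 0.6).any():
    print("Alert: possible abnormality detected; notify the operator.")
```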

3.3.7. Biosensors for Pathogen Detection

Advancements in biosensor technology contribute to disease prevention by providing precise pathogen detection capabilities. Authors in [98] explore optimizing gold (Au) electrodes for pathogen detection in aquaculture, utilizing electrochemical impedance spectroscopy (EIS) to measure responses related to electrode surface properties. The study’s findings suggest that low roughness levels in Au electrodes enhance the quality of impedance responses, facilitating pathogen detection in aquaculture environments.

3.4. Fish Biomass Estimation

Fish biomass estimation is critical in optimizing aquaculture operations by enabling accurate assessments of fish health, growth, and population density. Reliable biomass estimation facilitates efficient feeding routines, reduces waste accumulation, and promotes sustainable fish farming practices. Modern advancements in machine learning, computer vision, sonar, and smart scale technologies have enabled non-invasive, real-time biomass estimation in aquaculture, overcoming traditional challenges associated with manual measurements and stress-induced inaccuracies.

3.4.1. Computer Vision and Deep Learning for Biomass Estimation

Computer vision and deep learning techniques offer powerful tools for estimating fish biomass in aquaculture environments. Various studies have explored biomass estimation in aquaculture using AI approaches. For example, a custom version of YOLOv5 (DL-YOLO) combined with stereo vision for real-time fish detection is employed in [99], achieving a mean relative error (MRE) of 2.87% and an R² of 0.98 in estimating fish length, height, and weight. Similarly, [100] utilizes Mask R-CNN with pyramid squeeze attention for instance segmentation and 3D reconstruction, achieving an error rate below 5.6% in biomass estimation. Moreover, in [101], YOLOv7 is applied to the large-scale fish dataset to estimate fish length and biomass, achieving a high mean average precision (mAP) of 0.988 and demonstrating the feasibility of using real-time object detection for efficient, automated biomass monitoring. However, the approach can be affected by lighting and visibility in underwater conditions.
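Once lengths are measured, weight is commonly inferred through the allometric length-weight relation W = aL^b fitted in log-log space; the sketch below shows this step with synthetic measurements and illustrative coefficients (the cited systems may use different regressors).

```python
import numpy as np

# Synthetic (length cm, weight g) pairs standing in for stereo-vision measurements.
rng = np.random.default_rng(0)
lengths = rng.uniform(10, 40, 200)
true_a, true_b = 0.012, 3.05                     # illustrative species-specific constants
weights = true_a * lengths**true_b * np.exp(0.05 * rng.standard_normal(200))

# Fit W = a * L^b by linear regression on  log W = log a + b * log L.
b, log_a = np.polyfit(np.log(lengths), np.log(weights), 1)
a = np.exp(log_a)

def estimate_weight(length_cm):
    """Estimate individual fish weight (g) from a vision-derived length measurement."""
    return a * length_cm**b

# Cage biomass is then the sum of per-fish estimates (or mean weight times count).
print("Estimated weight of a 25 cm fish:", round(estimate_weight(25.0), 1), "g")
```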

3.4.2. Smart Scales for Real-Time Biomass Monitoring

Smart dynamic scales represent a practical and portable solution for biomass estimation in both offshore and in-land aquaculture environments. A Bluetooth-enabled dynamic scale that measures juvenile seabream biomass with a mean relative error of less than 1.4% is introduced in [102]. Designed with portability and cost-effectiveness in mind, this scale provides aquaculture operators with an accessible tool for monitoring fish growth and optimizing feeding practices. Authors in [103] further validate the dynamic scale’s accuracy in both offshore and in-land settings, demonstrating stable readings with minimal deviation from laboratory scale results. Moreover, a portable scale optimized for offshore use is presented in [104], achieving a relative error below 2% in oscillating conditions. Future research could enhance these devices by improving mechanical stability and developing dedicated apps to expand usability in commercial settings.

3.4.3. Sonar and Acoustic Methods for Biomass Estimation

These methods have proven effective for biomass estimation in environments where optical imaging is challenging, as evidenced in various existing works. For example, authors in [105] leverage sonar technology in combination with machine learning models, including a VGG network, to estimate biomass in high-density tanks under varying water conditions. Similarly, in [106] a standalone echosounder device designed for closed aquaculture settings is proposed. The device estimates fish counts and biomass by calculating target strength and backscattering coefficients, achieving an error margin of ±7% in field tests. Moreover, acoustic methods such as those used in [107] rely on ultrasound transducers and cross-correlation processing to detect and monitor biomass in fish migration channels near hydropower plants. This approach is instrumental in assessing ecological impacts and supporting biodiversity in impacted aquatic environments.

3.4.4. Structure from Motion (SfM) and 3D Modeling for Biomass Estimation

SfM technology enables the reconstruction of 3D models of fish for non-intrusive biomass estimation. In [108], SfM is combined with multi-view stereo (MVS) techniques to generate dense point clouds and 3D meshes from underwater images. The proposed approach achieves a high correlation (r = 0.98) between estimated volumes and actual biomass while reducing error margins by over 30% compared to traditional techniques. Further, the method allows for precise volumetric measurements in real time. However, environmental factors like lighting variability and water turbidity can affect performance, necessitating further refinement to ensure accuracy across diverse aquaculture settings.

3.4.5. Acoustic Signal Processing for Non-Invasive Biomass Estimation

Non-invasive acoustic signal processing techniques have shown promise in biomass estimation for certain fish species. Authors in [109] introduce an acoustic signal processing method that estimates biomass based on chirp signals emitted by damselfish. By using cross-correlation of signals captured by multiple acoustic sensors, this method achieves an error rate within 2% for simulated biomass estimations, providing a reliable non-invasive alternative for biomass monitoring in natural habitats. Although practical deployment remains challenging due to factors like signal propagation and multipath interference, this approach offers a promising direction for monitoring soniferous fish species.
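The core signal-processing step in such systems, estimating the arrival-time difference of the same call at two sensors via cross-correlation, can be sketched as follows with synthetic chirps; the sampling rate, chirp parameters, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 48_000                                   # hydrophone sampling rate (Hz)
t = np.arange(0, 0.05, 1 / fs)
call = chirp(t, f0=300, f1=1200, t1=t[-1])    # synthetic damselfish-like chirp

# Simulate the same call arriving at two hydrophones with a 2 ms offset plus noise.
delay_samples = int(0.002 * fs)
rng = np.random.default_rng(0)
n_total = len(call) + delay_samples
h1 = np.concatenate([call, np.zeros(delay_samples)]) + 0.05 * rng.standard_normal(n_total)
h2 = np.concatenate([np.zeros(delay_samples), call]) + 0.05 * rng.standard_normal(n_total)

# Cross-correlate and find the lag with maximum correlation.
xcorr = correlate(h2, h1, mode="full")
lag = np.argmax(xcorr) - (len(h1) - 1)
print("Estimated delay: %.2f ms" % (1000 * lag / fs))
```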

3.5. Fish Behavior Detection

Monitoring fish behavior is critical for maintaining fish health, welfare, and achieving optimal growth in aquaculture systems. Behavioral changes often serve as early indicators of stress, disease, satiety or hunger, and adverse environmental conditions, providing valuable insights for timely interventions. Recent advancements in AI, IoT, acoustic monitoring, and computer vision have revolutionized behavior detection, enabling non-invasive, real-time monitoring even in complex aquaculture environments. These technologies have demonstrated significant improvements in accuracy and reliability, as evidenced by various studies.

3.5.1. Abnormal Behavior Detection

Detecting abnormal fish behaviors, such as erratic swimming or lethargy, is essential for identifying potential health issues in aquaculture. A deep learning-based system using a ResNeXt 3 × 1D convolutional network is introduced in [110], achieving 95.3% accuracy in distinguishing abnormal behaviors. This model improves monitoring efficiency in aquaculture by reducing computational complexity, though further work could expand behavior definitions and refine the model for edge computing. In [50], a combination of Faster R-CNN for object detection, directed cycle graphs (DCG) for posture classification, and dynamic time warping (DTW) for behavior analysis is proposed. The model achieves 92.8% accuracy in identifying unusual behaviors in fish farms. This multi-model approach allows for accurate real-time behavior detection, but its effectiveness can be influenced by environmental factors like lighting and background noise. Similarly, [111] tackles abnormal behavior recognition in high-density fish swarms, using a dual-flow deep network with a multi-instance learning framework to isolate abnormal behaviors. Moreover, an integration of a modified motion influence map with recurrent neural networks (RNNs) is proposed in [112], providing a robust framework for identifying patterns in sequential data.
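Dynamic time warping, used in [50] for behavior analysis, aligns sequences of different lengths before comparing them; the compact implementation below illustrates the standard recurrence on synthetic swimming-speed traces and is not the authors' exact pipeline.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic dynamic time warping distance between two 1-D behaviour sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Synthetic swimming-speed traces (cm/s): a normal pattern vs. an erratic one.
t = np.linspace(0, 2 * np.pi, 80)
normal = 10 + 2 * np.sin(t)
erratic = 10 + 6 * np.sin(3 * t[:60]) + np.random.default_rng(0).normal(0, 1, 60)

score = dtw_distance(normal, erratic)
print("DTW dissimilarity vs. normal template:", round(score, 1))
# A threshold on this score (tuned on labelled data) could flag abnormal behaviour.
```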

3.5.2. Swimming Speed and Locomotion Analysis

Swimming speed and locomotion are important indicators of fish welfare and overall activity levels in aquaculture environments. Authors in [113] present a Doppler-based method that uses acoustic telemetry to measure swimming speed in marine cages, achieving an RMS error of 7.85 cm/s across a range of relevant speeds. This technique, while non-invasive, may be affected by signal reflections and environmental noise, especially in complex, high-density settings. In [114], swimming behavior is analyzed using acoustic video and deformable model techniques that capture fish body deformation to identify species-specific swimming patterns. Moreover, in the context of cold-water coral ecosystem monitoring, [115] introduces a workflow combining manual annotation, gold-standard creation, and CNNs to estimate coral polyp activity levels. With an accuracy of 96% for polyp activity classification, the study identifies significant temporal patterns linked to environmental events, emphasizing its value in ecological research.

3.5.3. Disease Detection Through Behavior Analysis

Behavioral cues can often signal early disease onset in aquaculture species, allowing for timely interventions. EchoBERT, a transformer-based model that analyzes echograms to detect early indicators of pancreas disease in Atlantic salmon, is introduced in [116], achieving an MCC score of 0.694. The model outperforms traditional LSTM-based approaches, enabling disease detection more than a month earlier. Future work aims to refine this model across varied environments, expanding its applicability to other behavior-linked disease indicators.

3.5.4. Shoaling and Social Behavior Modeling

Shoaling behavior provides valuable insights into group health and social dynamics within aquaculture systems. In [114], a CNN model is proposed to detect different states of fish shoal behavior, such as feeding and stress responses, achieving 82.5% accuracy through spatiotemporal fusion images. This method combines spatial and optical flow information, though its performance may be limited by lighting inconsistencies and other environmental factors. Similarly, [51] explores a simulated environment for schooling behavior, using Deep Q-Networks (DQN) to model fish interactions without predefined rule-based models. This model successfully reproduces ordered group formations, demonstrating the potential for DQN-based approaches in understanding collective behaviors. Although currently limited to simulations, future work aims to refine this model for real-world schooling scenarios.

3.5.5. Acoustic and Echogram-Based Behavior Detection

Acoustic and echogram-based methods provide non-intrusive options for behavior detection, particularly in turbid or challenging environments. The EchoBERT transformer model, which leverages echograms to detect disease-related behaviors, is proposed in [116], while [111] applies acoustic imaging to capture behaviors within fish swarms, achieving high recognition accuracy in abnormal behavior detection. These approaches highlight the potential of acoustic monitoring for continuous welfare assessments in aquaculture, though further developments are needed to improve robustness across diverse environmental conditions.

3.6. Counting Aquaculture Organisms

Accurate counting of organisms in aquaculture is vital for managing stock, monitoring health, and optimizing feeding. Traditionally, this task has been labor-intensive, invasive, and prone to inaccuracies, especially in high-density environments. Recent developments in AI, computer vision, and sensor technologies have made automated, accurate counting feasible for a variety of aquaculture species, including fish, shrimp, and holothurians. Existing studies, such as [6,117,118,119,120,121,122,123], have reported considerable success in counting aquaculture organisms.

3.6.1. Shrimp Larvae and Seed Counting

Shrimp counting is a critical process in hatcheries and farms. Authors in [117] present ShrimpCountNet, a deep-learning model using density estimation for automated counting of shrimp larvae, achieving an accuracy of 98.72%. However, its accuracy diminishes with extremely dense populations, indicating the need for further optimization for larger samples. For counting shrimp seeds, [6] introduces ShrimpseedNet, a lightweight model optimized for smartphone deployment, with an accuracy of 95.53%. This model offers a practical, portable solution for aquaculture workers, though its performance may vary under extreme lighting and high-density conditions. Future work will therefore focus on enhancing accuracy for more challenging scenarios and adapting the model for additional species.
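The core idea behind counting-by-density-estimation, as used by models of this kind, is that a network regresses a per-pixel density map whose integral equals the object count. The following minimal sketch (with a hand-made density map, not output from any cited model) illustrates the final counting step:

```python
import numpy as np

def count_from_density_map(density: np.ndarray) -> float:
    """In counting-by-density-estimation, a CNN regresses a per-pixel density
    map; summing (integrating) the map yields the estimated object count."""
    return float(density.sum())

# Hypothetical density map for a 4x4 image patch containing ~3 shrimp larvae.
density_map = np.array([
    [0.00, 0.10, 0.10, 0.00],
    [0.10, 0.55, 0.55, 0.10],
    [0.10, 0.55, 0.55, 0.10],
    [0.00, 0.10, 0.10, 0.00],
])
print(count_from_density_map(density_map))  # -> 3.0
```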

3.6.2. Fish Counting in Controlled and Natural Environments

Fish counting in controlled environments often requires identifying individual fish in crowded ponds, tanks, or nets. In [118], YOLOv5 is used for fish detection and counting, combined with optical flow to analyze movement for stress detection, achieving an F-measure of 81%. A multi-scale context-enhanced CNN that estimates fish density while addressing occlusion and perspective distortion is introduced in [121]. This approach improves accuracy in crowded environments, with lower mean square and absolute errors than previous models, though its accuracy decreases at lower fish densities. Future directions include dataset expansion and semi-supervised labeling for improved low-density performance. A similar study by [122] targets counting young fish (fry) in high-density environments, using a super-resolution GAN density estimate attention network (SGDAN). This model achieved a 97.57% accuracy rate, outperforming existing methods like MCNN and CSRNet, though it is currently limited to still images rather than video streams.
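As a generic illustration of the optical-flow component used for motion analysis in such pipelines (this is not the pipeline of [118]; the frame file names and the activity index are hypothetical), dense Farneback optical flow can be computed with OpenCV as follows:

```python
import cv2
import numpy as np

# Hypothetical consecutive frames from a tank camera (converted to grayscale).
prev = cv2.cvtColor(cv2.imread("frame_000.png"), cv2.COLOR_BGR2GRAY)
curr = cv2.cvtColor(cv2.imread("frame_001.png"), cv2.COLOR_BGR2GRAY)

# Dense Farneback optical flow: one (dx, dy) displacement vector per pixel.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)
magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

# A simple activity index: mean motion magnitude over the frame.
# Sudden spikes may indicate startle or stress responses.
print("mean motion magnitude:", float(np.mean(magnitude)))
```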
Moreover, [34] and [124] leverage deep learning to advance automated monitoring and estimation systems for marine ecology and fisheries management. In [34], YOLOv2 and a denoising auto-encoder were used to detect scallops in low-contrast underwater images. Similarly, in [124], autoencoders and GANs were combined to generate synthetic jellyfish images, which were then used to train an FCN for jellyfish swarm density estimation, achieving 83.8% accuracy. Despite their contributions, both models face limitations in adapting to diverse environmental conditions. Future directions focus on enhancing data diversity, expanding applications to other marine species, and integrating additional modalities for robust, scalable monitoring solutions.

3.6.3. Echosounder and Acoustic Counting Methods

Echosounder technology offers a non-intrusive way to estimate fish populations, especially in farming nets where visual counting is difficult. In [119], an acoustic signal processing approach is employed to estimate fish populations in nets with less than 10% error. The system compensates for reflections from the net, but performance decreases in sparse populations, highlighting the need for enhanced signal processing techniques. Future work will focus on improving accuracy in low-density populations and expanding applications to diverse aquaculture setups.

3.6.4. Counting Holothurians (Sea Cucumbers)

Holothurian populations are increasingly managed using non-contact counting to minimize disturbance. A system utilizing YOLO-V3 and Faster R-CNN for detection, combined with tracking algorithms such as SORT and IOU, is presented in [123], achieving robust counting even in turbid water. However, accuracy is affected by underwater visibility and occlusion. The researchers plan to expand the underwater dataset and test more advanced tracking algorithms to handle diverse environmental conditions.
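The association step in IOU-style trackers rests on the intersection-over-union overlap between detections in consecutive frames. The following minimal sketch (with invented bounding boxes, not data from [123]) shows the underlying computation:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

# Detections of the same holothurian in consecutive frames (hypothetical):
print(iou((10, 10, 50, 40), (15, 12, 55, 42)))  # high overlap -> same track
```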

3.6.5. Automated Scale Counting and Phenotypic Feature Detection

Counting lateral line scales is essential for breeding programs and species identification. In [120], the TRH-YOLOv5 model, which uses transformer mechanisms and a small-target detection module to improve precision in counting these scales, is introduced, achieving 98.8% precision. This method supports phenotypic analysis in sustainable fisheries, though underwater lighting and motion blur remain challenges. Future improvements will address these environmental factors and broaden phenotypic feature detection to more species.

3.7. Segmentation, Detection, and Classification of Aquaculture Species

The segmentation, detection, and classification of aquaculture species are critical tasks for effective aquaculture monitoring and biodiversity conservation. Advances in AI and computer vision have introduced robust solutions for these tasks, even in challenging underwater conditions. In this section, we discuss the recent research studies that address the complexity of recognizing aquaculture species and marine structures in a variety of environments.

3.7.1. Fish Species Classification

Identifying fish species is fundamental to ecological monitoring and fisheries management, and numerous studies have addressed fish species identification using AIoT. In [125], the authors introduced an ANN-based model to classify nine fish species, achieving 100% classification accuracy. This study demonstrates the model’s potential but remains limited by its focus on only nine species. An ensemble model (Ens_MobV3+NasNet) is proposed in [33] for species classification in aquaculture, achieving 98.6% accuracy. Despite the model’s robustness, misclassifications occur with occluded or poorly oriented images. Similarly, [126] developed WildFishNet, a neural network with a fusion activation mechanism combining softmax and openmax functions and a structure inspired by EfficientNetV2 to enhance feature extraction for fine-grained recognition; the model achieved 86.3% accuracy in distinguishing known and unknown fish species. In [127], an unsupervised learning framework for underwater fish recognition is presented and evaluated on the Fish4Knowledge and NOAA Fisheries datasets. Conversely, a modified U-Net CNN is proposed in [128] to classify herring, salmon, and bubbles in multi-frequency echograms, achieving high F1 scores (93.0% for herring, 87.3% for salmon, and 86.5% for bubbles). However, the method is limited by class imbalance and a small test set; the authors therefore propose expanding the annotated data and incorporating additional environmental factors to improve classification in future work. Moreover, [129] introduces a compact CNN model with under 20,000 parameters for classifying six pelagic fish species in underwater images, achieving roughly 49% accuracy. Similarly, authors in [130] utilize a vision transformer (ViT) model for classifying estuarine fish species, achieving 99.04% accuracy (and 100% with augmentation) on a 12-species dataset, outperforming traditional models such as VGG16 and ResNet50. Related studies include [131], which develops a customized CNN based on AlexNet (termed Deep-ShrimpNet) to distinguish soft-shell (defective) shrimp from sound shrimp through image-based classification, and [132], which combines stacked auto-encoders for feature extraction with logistic regression (SAEs–LR) to evaluate shrimp freshness during cold storage. Moreover, in [10,14,133,134,135], the overarching aim is to develop advanced deep learning-based systems for automated classification in diverse domains, leveraging the strengths of convolutional neural networks (CNNs) and related architectures. These studies address specific applications such as pearl quality assessment [133], coral species classification [134], jellyfish monitoring [10], aquatic macroinvertebrate classification [135], and diatom identification for water quality assessment [14].
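Many of the classifiers above are built by fine-tuning a pretrained CNN backbone on a species-labeled image set. The following hedged PyTorch sketch illustrates this common pattern; it does not reproduce any cited model, and the dataset path, class count, and hyperparameters are assumptions:

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_SPECIES = 9  # hypothetical number of species classes

# Standard ImageNet preprocessing for a pretrained backbone.
tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/fish_species/train", transform=tf)  # hypothetical path
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Replace the classifier head of a pretrained ResNet-50 and fine-tune it.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```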

3.7.2. Underwater Species Detection in Complex Environments

Low-quality underwater images pose unique challenges in species detection. Although numerous research efforts have addressed species detection in complex environments, real-time species detection remains a challenge. The study in [32] developed Composited FishNet, a two-stage cascade R-CNN model for fish detection and classification in low-resolution, complex underwater environments. This model showed high robustness with an AP of 75.2% and an AP50 of 92.8%, although challenges remain with small fish targets and occlusions. In [136], YOLOv5-Fish, an enhanced YOLOv5 model incorporating advanced feature extraction layers, is proposed to improve detection in blurred, low-contrast scenes. Similarly, [11] develops an improved Faster R-CNN model for real-time jellyfish detection in marine environments, achieving over 93% mAP and enhanced training speed (1.85 bit/s to 7.35 bit/s). In autonomous underwater robotic applications, [137] introduced YOLO8-FASG, a YOLOv8-based model enhanced for small fish detection. Moreover, authors in [31] develop an enhanced YOLOv5-nano model (YOLOv5n-GAM-CoordConv) for real-time jellyfish detection in underwater images, achieving a high mAP@0.5 of 89.1%.

3.7.3. Segmentation in Underwater and SAR Images

Precise segmentation underpins aquaculture monitoring and resource management by enabling accurate analysis of underwater and SAR images. Several research initiatives have focused on addressing the challenges associated with these tasks. Among these is [138], which used the STFF model, a self-supervised transformer model with feature fusion, to enhance semantic segmentation in SAR images. The model achieved superior accuracy and continuity in segmenting marine aquaculture zones but struggled with small targets and adverse sea conditions. Future directions involve improving the detection of small targets and adapting the model to a wider range of marine environments. For segmenting aquaculture rafts, [139] introduced MDOAU-Net, a lightweight U-Net model tailored for noisy SAR images. Similarly, authors in [140] developed an optimized segmentation model for offshore farms using SAR images, achieving an FWIoU of 0.9876 with a fast inference time. This model’s speed and accuracy make it suitable for large-scale applications, though sensitivity to SAR image noise is a limitation. Expanding the model to different aquaculture structures is a promising future direction. The IDUDL model is introduced in [141] for semantic segmentation of SAR images in marine aquaculture, achieving high accuracy in OA, MIoU, and Kappa scores.
For underwater segmentation, [142] utilized a combination of manual bounding boxes and automated mask generation for training Mask R-CNN models, incorporating U-Net++ and DeepLabv3+ for mask creation. Finally, [143] introduced U_EFF_NET, a U-Net-based model with EfficientNet-b0 and CBAM for segmenting offshore aquaculture farms in SAR images. The model achieved a FWIoU of 98.12% and demonstrated fast inference speeds, ranking highly in the 2021 Gaofen Challenge. Despite its impressive performance, it remains sensitive to extreme SAR image noise.
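Since several of these studies [140,143] report results as frequency-weighted IoU (FWIoU), the following minimal sketch shows how the metric is derived from a segmentation confusion matrix; the two-class matrix is invented for illustration and is not from the cited works:

```python
import numpy as np

def frequency_weighted_iou(conf: np.ndarray) -> float:
    """Frequency-weighted IoU from a confusion matrix
    (rows = ground-truth classes, columns = predicted classes)."""
    gt_per_class   = conf.sum(axis=1)
    pred_per_class = conf.sum(axis=0)
    tp             = np.diag(conf)
    iou_per_class  = tp / (gt_per_class + pred_per_class - tp)
    freq           = gt_per_class / conf.sum()
    return float((freq * iou_per_class).sum())

# Hypothetical 2-class example: background vs. aquaculture-raft pixels.
conf = np.array([[9500,  80],
                 [  60, 360]])
print(frequency_weighted_iou(conf))
```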

3.7.4. Detection and Classification of Aquaculture Structures

The detection of aquaculture structures, such as floating rafts, is crucial for environmental and resource management. Fan et al. [144] introduced a novel clustering algorithm, CMKFCM, for detecting aquaculture rafts in SAR images. The model integrates neurodynamic optimization with particle swarm optimization for enhanced clustering accuracy. Despite its robustness, detection performance is affected by sea surface roughness and noise. Future improvements will focus on adapting the model to varied marine environments. Similarly, an RSC-APMN model for SAR image segmentation in marine aquaculture is developed in [145], incorporating modules to handle fluctuating sea conditions.

3.7.5. Image Processing for Fish Quality Assessment

For fish quality and freshness assessment, [146] developed an adaptive thresholding method to segment fish gills, achieving a high correlation with expert annotations. This method provides a reliable approach for real-time quality validation. However, careful handling is needed to avoid interference from blood spills. Future work will explore enhancing segmentation precision for broader applications in quality assessment. Authors in [147] focused on fish segmentation in retail environments, achieving an 88.69% segmentation accuracy with SegNet. This model is promising for automated seafood quality assessment but faces challenges with noisy backgrounds and species variability. Future directions include expanding the dataset to include more species and refining segmentation techniques for realistic retail settings.
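As a generic illustration of local (adaptive) thresholding of the kind used for gill segmentation (this is not the exact method of [146]; the image path and parameter values are hypothetical), OpenCV can be used as follows:

```python
import cv2

# Hypothetical close-up image of a fish gill region.
gray = cv2.cvtColor(cv2.imread("gill_roi.png"), cv2.COLOR_BGR2GRAY)
gray = cv2.GaussianBlur(gray, (5, 5), 0)  # suppress sensor noise

# Adaptive thresholding: each pixel is compared against a local (Gaussian-weighted)
# mean, which tolerates uneven illumination better than one global threshold.
mask = cv2.adaptiveThreshold(gray, 255,
                             cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                             cv2.THRESH_BINARY, blockSize=31, C=5)

# Area of the segmented region (pixel count) as a simple descriptor.
print("segmented pixels:", int((mask > 0).sum()))
```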

3.8. Breeding and Growth Estimation

Breeding and growth estimation in aquaculture are essential for optimizing fish feeding, fish health evaluation, reproductive success, and environmental safety. Recent studies have focused on developing automated and accurate techniques to estimate growth parameters and monitor breeding maturity, utilizing advanced AI models, computer vision, and stereo vision technology.

3.8.1. Automated Breeding and Spawning Detection

Automating breeding and spawning detection in aquaculture can significantly improve reproductive success and breeding schedules. In this direction, [148] introduced a YOLOR-based system for assessing cobia maturity, achieving an accuracy of 74.3% on test datasets. Although promising, the system's performance is limited by environmental factors such as water clarity. For oyster farming, [149] developed a fault detection-based algorithm for automatic spawning detection using valve movement data, achieving high accuracy with no false alarms. However, parameter tuning by trial and error limits its adaptability.

3.8.2. Fish Population Estimation Using Echosounders and Digital Twins

Accurate estimation of fish populations is essential in determining feed quantities and stocking densities. Advances in technologies such as echosounders and digital twins (DT) have opened new possibilities for monitoring and decision-making in aquaculture, though significant challenges remain in their development and deployment. Existing studies that have explored this area include [150] which employed echosounder technology to estimate fish populations within farming nets, achieving an error margin of less than 10% in densely populated areas. Despite its success, the algorithm encounters challenges in sparse populations and with interference from net signals. Future work aims to enhance net signal removal techniques and explore advanced echosounders to further improve estimation accuracy. In parallel, [151] reviewed the potential of DT in aquaculture, emphasizing their promise for real-time monitoring and informed decision-making. DT applications in aquaculture are still in their early stages and face obstacles such as interoperability issues and data security concerns.

3.8.3. Growth Estimation

Accurately estimating fish growth parameters, including body size, weight, length, and growth trajectories, is crucial for optimizing feed management, health monitoring, and ensuring sustainable aquaculture practices. Several studies have explored innovative approaches for measuring fish growth metrics. For body size estimation, [152] developed an underwater stereo vision system capable of measuring fish length and height, achieving a low error rate of 0.85% for length estimation. Similarly, [153] introduced YOLO-FL, a YOLO-based model that integrates segmentation and key point detection to measure fish body length with a MAPE of 3.7%. A possible future research direction is to pilot the model under practical aquaculture settings and refine image processing techniques for broader applicability.
For weight prediction and growth trajectory estimation, [154] compared empirical methods with LSTM-based models, identifying a combined weighted-sum approach that reduced errors by 61.3% compared to empirical models alone. Although effective, the model’s generalizability across species and environments is limited, prompting future exploration of environmental data integration and testing in diverse conditions. Non-invasive techniques for fish length estimation reduce stress on fish and optimize labor in aquaculture. For example, [155] presented a CNN-based system using stereo vision to estimate fish length, with relative errors of 3.15% for seabream and 7.4% for seabass. Environmental factors such as fish movement and turbidity affect accuracy. Real-time fish size measurement optimizes feeding practices by allowing precise size tracking. In [156], the authors developed a stereo vision-based size measurement system achieving an error margin of 4%. However, fish rotation and distance from the cameras can impact measurements.
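The geometric principle behind these stereo-vision systems is the standard disparity-to-depth relation. The following sketch (an illustrative back-projection only, not the calibrated pipelines of [152,155,156]; all camera parameters and keypoints are hypothetical) shows how a body length can be estimated from matched snout and tail keypoints:

```python
import numpy as np

FOCAL_PX = 1400.0   # focal length in pixels (hypothetical calibration)
BASELINE = 0.12     # stereo baseline in metres (hypothetical)

def to_3d(u, v, disparity, cx=960.0, cy=540.0):
    """Back-project a pixel (u, v) with a given disparity to camera coordinates."""
    z = FOCAL_PX * BASELINE / disparity        # depth from disparity
    x = (u - cx) * z / FOCAL_PX
    y = (v - cy) * z / FOCAL_PX
    return np.array([x, y, z])

# Hypothetical matched keypoints on the snout and tail of one fish.
snout = to_3d(900.0, 520.0, disparity=120.0)
tail  = to_3d(1100.0, 530.0, disparity=118.0)
print("estimated body length (m):", np.linalg.norm(tail - snout))
```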

3.8.4. Environmental Monitoring for Growth and Health

Environmental monitoring is essential for sustainable aquaculture practices, particularly in detecting and mitigating risks from natural phenomena. A related study in [157] addressed jellyfish bloom detection using sonar images enhanced with GAN-generated data, achieving improved classification accuracy and reduced false positives. The study by [12] leverages high-throughput imaging flow cytometry and a fine-tuned ResNetv2 CNN for automated phytoplankton species and life-cycle stage classification, achieving 97% overall accuracy.

3.9. Fish Tracking and Individuality

Fish tracking and individual identification are critical for monitoring fish behavior, health, feeding patterns, breeding, and population dynamics in both controlled and natural environments. Recent advancements in AIoT have enabled more accurate and scalable solutions for individual and group tracking and behavior analysis in complex aquatic environments.

3.9.1. Individual Fish Identification

Biometric approaches, such as iris recognition, have emerged as promising solutions for achieving reliable and non-invasive identification. These methods leverage unique physical traits, enabling accurate distinction between individuals without needing external tagging or invasive procedures. A study by [158] utilized iris patterns as biometric identifiers for Atlantic salmon, achieving over 95% accuracy in short-term identification. The approach combined convolutional neural networks (CNNs) with Log-Gabor filters to extract iris features. However, the reliability of this method decreases over the long term due to changes in iris patterns as the fish grow. Future research directions include exploring near-infrared imaging to improve robustness and adapting the system for underwater environments to enhance practicality.

3.9.2. Multi-Fish Tracking in Controlled and Unconstrained Environments

Various studies have explored innovative approaches to tackle challenges such as high density, occlusions, and behavioral variability, highlighting the progress in multi-fish tracking technologies while addressing key limitations. A study by [159] developed a 3D video tracking system for monitoring fish behavior in controlled water tank environments, achieving over 95% precision. In unconstrained marine settings, [160] introduced DFTNet, a tracking framework combining a Siamese network for appearance similarity with an attention-based LSTM for motion analysis. This approach reduced ID switches by 60.9%, improving tracking accuracy. However, challenges persist in differentiating highly similar species and handling complex behaviors in crowded scenes. Moreover, [161] applied a tracking-by-detection approach to monitoring Sillago sihama in aquaculture, using Gaussian mixture models (GMM) to identify behavioral patterns.
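To illustrate how a GMM can group per-track motion descriptors into behavioral patterns (this sketch is only in the spirit of the GMM step in [161]; the features and cluster interpretation are invented), scikit-learn can be used as follows:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical per-track features: [mean speed (m/s), turning rate (rad/s)].
features = np.array([
    [0.10, 0.20], [0.12, 0.25], [0.11, 0.22],   # slow, calm swimming
    [0.45, 1.10], [0.50, 1.30], [0.48, 1.20],   # fast, erratic swimming
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(features)

print("behavior cluster per track:", labels)
print("cluster means:\n", gmm.means_)
```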

3.9.3. Activity Segmentation and Tracking in Sonar and Echogram Data

Using sonar and echogram data to track fish activity is essential for behavior monitoring in turbid or low-visibility aquatic environments. A study by [162] developed a sonar-based tracking method incorporating Kalman filtering and Gaussian modeling, achieving improved multi-object tracking accuracy and reducing false detections. While effective, the approach would benefit from integrating deep learning to enhance segmentation precision and adaptability. In echogram analysis, [163] introduced U-MSAA-Net for identifying and segmenting finfish and krill, demonstrating superior accuracy despite challenges with noisy data. Similarly, [164] addressed fish detection and tracking in fishways using YOLO-CTS and Deep-SORT, achieving high precision even in complex environments. However, the system struggles in high-density scenarios, prompting future efforts to integrate sonar data and refine tracking algorithms for greater scalability and accuracy.
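For readers unfamiliar with the filtering step, the following minimal constant-velocity Kalman filter smooths a 2-D target position between sonar pings; it is an illustrative sketch only (not the filter of [162]), and the time step, noise covariances, and measurements are assumed values:

```python
import numpy as np

dt = 0.1  # time between sonar pings (s), hypothetical

# Constant-velocity model: state = [x, y, vx, vy].
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only position is observed
Q = np.eye(4) * 1e-3                        # process noise (assumed)
R = np.eye(2) * 5e-2                        # measurement noise (assumed)

x = np.zeros(4)        # initial state
P = np.eye(4)          # initial covariance

def kalman_step(z):
    """One predict/update cycle for a new position measurement z = [x, y]."""
    global x, P
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    innovation = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ innovation
    P = (np.eye(4) - K @ H) @ P
    return x[:2]                            # filtered position

for z in [np.array([1.0, 2.0]), np.array([1.1, 2.1]), np.array([1.3, 2.2])]:
    print(kalman_step(z))
```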

3.9.4. Fish Group Activity and Behavior Recognition

The existing studies emphasize the potential of advanced computational models for recognizing fish group activities while highlighting the need for improved efficiency and deeper integration with environmental data to enhance real-world applicability. The study in [165] utilized a graph convolution network (GCN) with feature calibration for group activity recognition, achieving an accuracy of 93.33%. A similar study in [166] focused on tracking fish behavior to assess water quality, combining an Otsu adaptive segmentation algorithm with a Kalman filter to achieve over 98% detection accuracy across various water conditions.

3.9.5. Tracking Fish Around Marine Structures and Moving Cameras

Tracking fish around marine structures and from moving cameras is critical for understanding interactions with man-made installations and improving environmental monitoring. These tasks present unique challenges due to turbulence, occlusion, and dynamic environments. To monitor fish and seabird behavior around tidal turbines, [167] employed a modified nearest neighbor algorithm, tracking effectively in turbulent conditions. For tracking with moving cameras, [168] introduced the deformable multiple kernel (DMK) tracking algorithm, achieving high accuracy under variable camera motions. While effective, the method faces challenges in tracking larger schools and maintaining performance over longer durations.

3.9.6. Individuality and Behavior Tracking for Aquaculture Monitoring

Tracking individual fish and understanding their behaviors is vital for optimizing feeding practices, monitoring health, and reducing stress. A study by [161] developed a system combining fish tracking with behavior clustering for Sillago sihama, enabling detailed behavior analysis to support aquaculture management. While effective, future research will enhance the robustness of this system to adapt to diverse aquaculture conditions and environments. Similarly, [169] introduced an online survival prediction system for monitoring stress during waterless transport, achieving high accuracy in predicting survival rates.

3.10. Automation and Robotics in Aquaculture

Automation and robotics are critical for addressing the challenges of sustainable fish farming, resource management, and environmental conservation. Below, contributions from notable studies are categorized under various thematic areas.

3.10.1. Remote Sensing and Monitoring Systems

Remote sensing and monitoring systems are essential for efficient aquaculture management. A smart aquaculture assistant system [170] integrated IoT and big data for real-time pond monitoring and decision-making, showing effective environmental analysis and expert advice capabilities. Similarly, a segmentation model [171] mapped cage and raft aquaculture (CRA) structures with high accuracy using Sentinel-2 imagery, but future research aims to expand datasets and validate with multi-source imagery. A low-cost conductivity meter [66] accurately measures water quality but will benefit from IoT integration for remote sensing and improved calibration for diverse environments. An SRUNet model is developed in [172] to monitor offshore raft aquaculture in coastal areas, achieving high accuracy in aquaculture area delineation using SAR and MSI satellite imagery. A video-based system for real-time monitoring of Arctic Char in indoor tanks is introduced in [173]. The system tracks swim trajectories as health indicators, but it faces challenges with water turbidity and multi-layer tracking. Future improvements include integrating multi-camera setups and real-time machine learning capabilities. Authors in [174] focus on coastal aquaculture monitoring, aiming to accurately delineate aquaculture areas for environmental management using high-resolution GF-2 satellite images. Authors in [175] address offshore aquaculture resilience by developing an autonomous submersible fish cage system designed to descend during adverse weather and resurface in calm conditions, protecting fish stocks and infrastructure. Moreover, [176] implemented an IoT-based smart aquarium system using decision tree regression to predict and regulate water conditions autonomously, achieving high accuracy in temperature prediction.
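To illustrate the decision-tree-regression idea in the spirit of the smart aquarium in [176] (this is not that system; the features, readings, and control rule are hypothetical), a minimal sketch with scikit-learn could look as follows:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical history: [ambient temp (C), heater duty cycle (0-1), hour of day].
X = np.array([
    [24.0, 0.2,  9], [23.5, 0.3, 12], [22.0, 0.5, 18],
    [21.0, 0.6, 22], [25.0, 0.1, 14], [20.5, 0.7,  2],
])
y = np.array([26.1, 26.4, 26.0, 25.8, 26.6, 25.5])  # measured tank temperature (C)

model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Predict the tank temperature under the current conditions and apply a
# simple illustrative control rule.
predicted = model.predict([[22.5, 0.4, 20]])[0]
print("predicted tank temperature:", predicted)
if predicted < 26.0:
    print("increase heater duty cycle")
```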

3.10.2. Robotic and Autonomous Systems for Underwater Monitoring

Robotics and autonomous systems have advanced underwater aquaculture operations. An ROV-based system [177] inspected fish nets using monocular odometry, with plans to incorporate SLAM systems for better robustness. A hybrid energy system [178] powered shrimp pond aeration using wind, solar, and electrolyzers, reducing CO2 emissions but requiring strategies to lower initial investment costs. Additionally, a robotic crayfish monitoring system [179] automated feeding and water quality management but faces challenges with connectivity and scalability. For eel larvae tracking, a vision-based system [180] employed kernel density estimation but requires stable environmental conditions for improved accuracy.

3.10.3. UAV and AUV-Based Aquaculture Inspections

UAVs and AUVs play a pivotal role in aquaculture inspections. A UAV-based system [181] assessed post-typhoon damage to oyster racks using edge detection but struggled with weather-related image quality issues, prompting research into UAV swarms for faster assessments. For underwater obstacle avoidance, an AUV system [182] utilized a custom CNN for transmission map estimation, demonstrating effective navigation but requiring real-time deployment and testing in varied underwater conditions. A trajectory tracking system [183] for UUV inspections achieved high precision but needs stability improvements under diverse environmental conditions. A motion planning framework for AUV navigation in aquaculture, termed ResiVis, is proposed in [184]. The framework demonstrated robust navigation but faces limitations in highly dynamic settings, prompting future work on advanced object detection and state estimation modules for safe, autonomous underwater navigation.

3.10.4. Hybrid and Smart Monitoring Systems

Hybrid systems combine diverse technologies for aquaculture monitoring. An IoT-enabled robotic system to automate crayfish pond management is proposed in [179]. The HAUCS system in [185] achieved high accuracy in dissolved oxygen prediction and aims to enhance reliability for broader applications. A hybrid ultrasonic imaging system [186] for shrimp counting and positioning demonstrated accuracy but requires integration with underwater drones for expanded functionality. In aquaponics, a digital twin framework [187] enhanced real-time monitoring and factory management, though challenges in data integration and real-time interaction remain. Moreover, [188] focuses on developing an automated system for identifying blood defects in cod fillets and enabling precise robotic processing in aquaculture. Using a multimodal RGB-D imaging approach, the study combines SVM for pixel-wise segmentation and CNN for defect detection, achieving 99% and 100% accuracy, respectively. Despite its high performance, challenges include spectral similarity between blood spots and normal tissue and environmental variations such as lighting and texture. Additionally, [189] focuses on marine aquaculture zone (MAZ) extraction for environmental management using dual-polarimetric Sentinel-1 SAR imagery. The study introduced MAZ-Net, a modified U-Net architecture enhanced with GLCM-based texture maps and hybrid-scale convolution blocks, achieving a high F1-score of 94.77% and outperforming other models in identifying aquaculture areas across varying tidal and structural conditions.

3.10.5. Advanced Imaging and Sensor Technologies

Advanced imaging and sensor technologies support precision aquaculture monitoring. BI²Net [190] achieved high accuracy in extracting aquaculture areas despite sediment challenges, with plans to expand the dataset. A rule-based method [191] identified oyster aquaculture areas in turbid waters but requires refinement to address seasonal changes. A jellyfish monitoring system [10] using CNNs demonstrated high detection accuracy. Similarly, a fish catch forecasting system [192] achieved accurate predictions using hybrid models, with plans to integrate additional environmental features. A precision monitoring system [193] applied YOLO to multibeam echosounder images, aiming to improve object differentiation with 3D representations.

3.10.6. Digital Twin and Predictive Modeling

Digital Twin frameworks and predictive modeling optimize aquaculture operations. A DT system [15] combined AIoT for feeding and environmental management but requires improvements in connectivity and energy availability. Federated learning [194] on an edge-cloud platform supported privacy-preserving data processing for prawn farming, though scalability challenges persist. A smart Biofloc system [195] achieved high mortality prediction accuracy, with future work focusing on expanding species support. Additionally, a dynamic mapping system [196] for inland aquaculture utilized U-Net and recursive feature elimination, achieving high accuracy but requiring better segmentation for complex regions.

3.10.7. Biologically-Inspired Robotics

Biologically-inspired designs bring innovation to aquaculture robotics. A biomimetic AUV [197] inspired by Atlantic Salmon demonstrated efficient propulsion but requires real-world testing. MARL-based robotic fish [35] enhanced stability and reduced oscillations but need calibration for dynamic environments. A VR-based robotic grasping system [198] achieved high success rates in simulated environments, with future work focusing on real-world adaptation. Additionally, a cost-effective tracking strategy [199] for robotic fish demonstrated accuracy in trials, with plans to improve adaptability using ultrasonic sensors.

3.10.8. Specialized Aquaculture Applications

Specialized applications expand the scope of robotics in aquaculture. A hybrid energy system [178] optimized shrimp pond aeration using renewable resources, while a robust submersible cage system [200] maintained integrity under extreme sea conditions. A numerical model [201] analyzed mooring failures in net cages, prompting plans for experimental validation. Additionally, a sonar-based tracking system [202] achieved high accuracy in underwater target recognition, with future work focusing on lightweight models for real-time application.

4. Advantages of AIoT in Aquaculture

Compared to traditional approaches, AIoT brings transformational advantages. By combining real-time data collection with intelligent analysis, AIoT reduces the manual effort required for environmental monitoring, feeding, and disease detection. Traditional methods rely heavily on manpower and periodic sampling, making them inefficient and prone to error. AIoT performs these tasks automatically with greater accuracy, uniformity, and timeliness, enhancing both productivity and sustainability.
Among the most important benefits of AIoT is automatic and continuous environmental monitoring. IoT sensors installed in aquaculture systems track critical parameters and allow proactive adjustment. This is especially important in sensitive aquaculture environments, where minute changes can have large implications for the health of organisms. Secondly, predictive analytics for effective resource management stands out as another key benefit of AIoT in aquaculture. Advanced algorithms, such as LSTM networks, can predict oxygen depletion in advance, enabling farmers to take precautionary aeration or feeding measures. Thirdly, AIoT automates feeding and behavioral analysis: by applying AI algorithms to real-time video analysis of fish behavior, feeding schedules are adjusted dynamically, minimizing feed waste and supporting healthier growth rates. Fourthly, AIoT systems make it far more convenient for aquaculture operators to manage and monitor conditions remotely. AI has also enabled digital twin technologies, virtual representations of the aquaculture environment, through which farmers can observe, simulate, and adjust parameters from afar. This reduces labor demands, allows farmers to operate several facilities efficiently, and improves the quality of decisions through quicker response times. Lastly, the AIoT ecosystem supports sustainability and reduced environmental impact. Powering IoT sensors and devices with integrated renewable energy sources, such as solar or wind power, reduces greenhouse gas emissions and cuts costs. Hybrid energy systems around shrimp ponds have recorded a significant reduction in carbon footprint while maintaining optimal aeration, demonstrating dual benefits for environmental and economic sustainability.
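To make the predictive-analytics point concrete, the following minimal PyTorch sketch trains a small LSTM to forecast the next hourly dissolved-oxygen reading from a 24-hour window; it is a hypothetical illustration with invented readings, not a system from the reviewed studies:

```python
import torch
import torch.nn as nn

class DOForecaster(nn.Module):
    """Minimal LSTM mapping a window of past dissolved-oxygen readings
    to a prediction of the next reading."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict from the last hidden state

# Hypothetical training pair: 24 hourly DO readings (mg/L) -> next-hour value.
window = torch.tensor([[[6.8], [6.7], [6.5], [6.4], [6.2], [6.1],
                        [6.0], [5.9], [5.8], [5.8], [5.9], [6.1],
                        [6.3], [6.5], [6.6], [6.7], [6.8], [6.8],
                        [6.7], [6.6], [6.4], [6.3], [6.1], [6.0]]])
target = torch.tensor([[5.9]])

model = DOForecaster()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(200):                       # tiny illustrative training loop
    optim.zero_grad()
    loss = loss_fn(model(window), target)
    loss.backward()
    optim.step()
print("next-hour DO forecast:", model(window).item())
```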

5. Challenges and Limitations of AIoT in Aquaculture

While AIoT integration is revolutionizing aquaculture, its continued adoption and scalability are still hindered by several challenges and inherent limitations, including high costs and infrastructure requirements, data privacy concerns, environmental variability, and difficulties in scaling.

5.1. High Initial Costs and Infrastructure Requirements

Implementing an AIoT system is costly. Sensor networks, data infrastructure, and the computational resources needed for holistic AIoT integration often involve high upfront costs that can discourage investment by small- and medium-scale aquaculture farms. Even for large-scale operations, ongoing costs such as sensor maintenance, software updates, and data storage add complexity and require both capital investment and technical expertise.

5.2. Environmental Variability and Model Adaptability

Aquaculture operates in environments subject to considerable variability, such as changes in water quality, temperature, and salinity. This dynamic setting is a major reason AI models often fail to generalize across aquatic environments or across species, requiring frequent retraining and adaptation that are resource-intensive and time-consuming, particularly for smaller operations.

5.3. Data Privacy and Security Concerns

The interconnected nature of AIoT systems also exposes aquaculture operations to risks regarding data privacy and security. IoT devices continuously gather data on sensitive operational practices, fish health, and environmental conditions, creating the potential for hacking and unauthorized access. These risks call for stringent cybersecurity measures, including encryption, secure network architectures, and access controls.

5.4. Scalability Challenges

Scaling AIoT across such diverse settings is particularly problematic for aquaculture operators who have neither the infrastructure nor the resources to deploy devices by the thousands. Small-scale farms, in particular, may find the costs of extensive sensor networks and the associated data management systems prohibitively high and beyond their investment budgets.

5.5. Complexity and Technical Expertise Requirements

Deploying and operating AIoT systems requires specialized skills in data science and machine learning, in addition to managing IoT devices. Most aquaculture operators, especially small and medium-sized enterprises, may lack the technical expertise needed to implement and sustain such technologies effectively.

5.6. Limitations in Model Generalization Across Species and Applications

Most AI models in aquaculture have been developed for particular species or particular environments, limiting their general applicability to diverse aquaculture operations. For example, an AI system tuned for Atlantic salmon will most likely perform poorly if applied directly to shrimp or tilapia owing to differences in behavioral traits and environmental requirements. Such models lack adaptability and require frequent retraining or customization for each species or condition, which reduces the applicability of AIoT solutions across a broader range of aquaculture settings.

5.7. Environmental and Ecological Impact

Energy-intensive data processing and monitoring equipment can cause AIoT systems to contribute to environmental degradation, which runs contrary to the goals of sustainable aquaculture. The energy demands of large-scale AIoT deployments and the use of electronics within aquatic environments raise concerns about their sustainability and possible disturbance of the ecosystem.

6. Future Directions of AIoT in Aquaculture

Despite state-of-the-art AIoT applications in smart feeding systems, water quality management, disease detection, fish behavior monitoring, biomass estimation, and automation, several areas of work remain open. Future directions for AIoT in aquaculture concern real-time adaptability, resilience to environmental factors, extensibility, and cost-effectiveness across diverse aquaculture environments.

6.1. Real-time Adaptability and Decision-Making

An important direction for future work is improving the real-time adaptability of AIoT in aquaculture, enabling facilities to respond immediately to dynamic environmental changes [23,108]. This would allow systems operating continuously, even in the most remote environments, to optimize parameters such as feeding intensity and water quality for fish health and environmental sustainability.

6.2. Species-specific and Environmental Adaptability

Making aquaculture AIoT systems adaptable to a wide range of species and environmental conditions remains an ongoing challenge. Many models that perform well under one condition generalize poorly when other species or environments are introduced, owing to variables such as lighting, water turbidity, or behavior. Future work should be directed toward adaptive algorithms and multimodal systems that handle variability across species and ecological contexts [64,111]. This can be accomplished either by training on datasets spanning a wide variety of species and environments or by applying transfer learning so that models generalize well across aquaculture sites.

6.3. Scalability and Cost-effectiveness for Broader Adoption

Scalability remains a major concern for AIoT adoption, primarily among small aquaculture operators who may not have the resources to invest in high-end solutions. Cost-effective sensors integrated with mobile-compatible systems would make AIoT solutions accessible to small- and medium-sized farms. The reviewed studies also suggest that cloud-based data processing can minimize on-site hardware requirements for applications such as smart feeding and water quality management.

6.4. Integrated Multimodal Systems for Comprehensive Monitoring

Future AIoT systems should integrate multiple data sources, visual, acoustic, and environmental, into a single monitoring platform to enhance overall fish welfare. Research on fish biomass and behavior estimation has shown clear evidence that multimodal approaches using both visual and acoustic data capture fish health and population dynamics more comprehensively. Integrated multimodal systems give aquaculture operators a holistic view of the conditions in their culture environments. Such systems would also foster resilience by avoiding dependence on any single data source that could be affected by environmental noise or species-specific behaviors related to feeding, water quality, and fish welfare.

6.5. Enhanced Disease Detection and Prevention Through Real-Time Biosensing

AIoT systems are not yet efficient at the early detection and prevention of disease outbreaks, which is a major area for future development. Current disease detection approaches apply AI algorithms to data from IoT biosensors that monitor health indicators; however, these approaches are limited by dataset constraints and the need to generalize across species. Future research should be directed toward embedding real-time biosensors that monitor pathogen levels, allowing operators to recognize and intervene against emerging disease early. In addition, mobile and offline-compatible solutions would extend AIoT-enabled disease detection to even the most remote and resource-poor aquaculture facilities.

6.6. Advanced Automation and Robotics for Efficient Operations

Aquaculture robotics is a developing field, with recent research demonstrating systems capable of performing increasingly complex underwater tasks such as fish-net inspection, structural monitoring, and organism counting. Future robotics development should therefore emphasize adaptability, efficiency, and reliability in varied and demanding environments. More robust sensors and algorithms will enable continuous real-time data collection and processing, while improved battery life and energy efficiency will extend deployment times. Multi-sensor fusion combined with AI could also yield more autonomous, precise robotic systems capable of handling routine operations, saving labor, and improving operational efficiency.

6.7. Sustainable Energy Solutions for Remote Aquaculture

For aquaculture farms in remote or offshore areas, energy independence is vital. Hybrid energy platforms based on renewable solar and wind power, together with AIoT-enabled energy optimization, can reduce dependence on conventional power sources. Developing aquaculture-specific sustainable energy solutions can substantially cut operational expenditure while addressing the environmental concerns of the aquaculture industry as a whole. Future research in this field should focus on coupling these platforms with AIoT infrastructure so that systems receive a regular power supply for continuous monitoring and automated operation, particularly in offshore conditions.

6.8. Digital Twins and Predictive Analytics for Proactive Management

Digital twin technology involves building a virtual model of an operating aquaculture system and has promising potential for real-time simulation and optimization of operations [215]. Predictive analytics built on digital twins would enable operators to model potential scenarios and take proactive steps to mitigate risks, optimize feeding, and boost overall system performance. Further development of digital twins in aquaculture should deepen data integration from IoT sensors, enhance real-time monitoring, and support predictive decision-making. Future studies should therefore address better data handling and greater interoperability in digital twin frameworks while developing flexible models that can simulate different aquaculture ecosystems.

7. Conclusion and Future Work

This review systematically explores and analyzes the extent to which AIoT technologies are advancing the aquaculture industry. Integrating AI and IoT has contributed to the advancement of aquaculture by effectively addressing critical challenges such as operational inefficiencies, environmental sustainability, disease management, and resource optimization. The study categorizes AIoT applications into ten core areas: smart feeding systems, water quality management, disease detection, fish biomass estimation, fish behavior detection, counting aquaculture organisms, species segmentation and classification, breeding and growth estimation, fish tracking and individuality, and automation and robotics. Each area highlights unique methodologies, key achievements, and potential for future advancements. Among these, smart feeding systems exemplify how AIoT optimizes feeding schedules, minimizes waste, and enhances feed efficiency, significantly improving growth rates. Future research in this area is expected to develop adaptive systems and leverage cloud-based monitoring to ensure scalability and robustness. Similarly, water quality management has progressed with real-time monitoring and predictive adjustments enabled by AIoT, helping to maintain optimal conditions for fish welfare. However, challenges such as scaling across diverse environments and incorporating edge computing solutions for real-time processing remain priorities for future development.
In disease detection and prevention, AI-driven biosensors and IoT tools enable early detection of health issues, reducing mortality rates and improving overall fish health. Nonetheless, the lack of robust datasets and generalized models limits cross-species applicability, requiring further research. Fish biomass estimation, a critical component for yield forecasting, has been improved through computer vision and acoustic analysis techniques. Advancing this area will require refining multi-modal systems for varied environments and enhancing real-time capabilities. AIoT also plays a vital role in fish behavior detection, where models analyze patterns indicative of stress, disease, or social dynamics. Despite notable progress, challenges such as environmental variability, noise interference, and species-specific behaviors necessitate the development of more adaptable and species-diverse models. Similarly, automated counting methods for aquaculture organisms streamline management but face issues such as occlusion and environmental variability. These methods need further refinement to ensure broad applicability and versatility. Species segmentation and classification showcase the potential of advanced AI techniques in distinguishing species under complex conditions. Yet, ensuring adaptability to diverse environments and integrating low-power, real-time deployment solutions remains a challenge. In the context of breeding and growth estimation, models assessing fish maturity and growth rates are advancing breeding efficiency. Expanding these models to accommodate various species and integrating continuous real-time data are critical steps for more robust monitoring systems. Fish tracking and individuality monitoring provide valuable insights into behavior and the effects of water quality, but challenges persist in high-density and diverse aquaculture environments. Integrating multiple data sources could improve comprehensiveness and precision. Finally, automation and robotics are revolutionizing labor-intensive tasks, especially in offshore or remote locations, by enabling real-time monitoring and execution. Future advancements should focus on multi-modal sensors, enhanced connectivity, and efficient power management to improve resilience and operational efficiency.
While the potential of AIoT in aquaculture is transformative, several limitations remain. High initial investment costs, the complexity of data infrastructure, and the demand for technical expertise create barriers, particularly for smaller operators. Concerns about data privacy and cybersecurity in interconnected systems pose significant challenges. Many AI models trained for specific species or environments lack the flexibility for broader applications, underscoring the need for adaptive AI solutions. Observations from this review suggest that AIoT adoption thrives in environments that prioritize real-time monitoring, adaptive response mechanisms, and scalable infrastructure. Large-scale operations benefit from economies of scale, while smaller operators often face financial and technical hurdles. Bridging this gap will require cost-effective solutions, robust edge computing systems, and accessible training resources for personnel.

Author Contributions

Conceptualization, Y.-P.H. and S.P.K.; formal analysis, Y.-P.H.; investigation, Y.-P.H. and S.P.K.; writing—original draft preparation, S.P.K.; writing—review and editing, Y.-P.H.; supervision and funding acquisition, Y.-P.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially funded by the National Science and Technology Council, Taiwan, under grants MOST111-2221-E-346-002-MY3 and NSTC113-2221-E-346-002-.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing does not apply to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. The State of World Fisheries and Aquaculture 2022; Food and Agriculture Organization of the United Nations (FAO): Rome, Italy, 2022.
  2. Huang, Y.-P.; Khabusi, S.P. A CNN-OSELM Multi-Layer Fusion Network With Attention Mechanism for Fish Disease Recognition in Aquaculture. IEEE Access 2023, 11, 58729–58744. [Google Scholar] [CrossRef]
  3. Araujo, G.S.; da Silva, J.W.A.; Cotas, J.; Pereira, L. Fish Farming Techniques: Current Situation and Trends. J. Mar. Sci. Eng. 2022, 10, 1598. [Google Scholar] [CrossRef]
  4. M. Sun, X. Yang, and Y. Xie, “Deep learning in aquaculture: A review,” Journal of Computers, vol. 31, no. 1, pp. 294–319, Jan. 2020.
  5. Yue, G.H.; Tay, Y.X.; Wong, J.; Shen, Y.; Xia, J. Aquaculture species diversification in China. Aquac. Fish. 2023, 9, 206–217. [Google Scholar] [CrossRef]
  6. Liu, D.; Xu, B.; Cheng, Y.; Chen, H.; Dou, Y.; Bi, H.; Zhao, Y. Shrimpseed_Net: Counting of Shrimp Seed Using Deep Learning on Smartphones for Aquaculture. IEEE Access 2023, 11, 85441–85450. [Google Scholar] [CrossRef]
  7. Vembarasi, K.; Thotakura, V.P.; Senthilkumar, S.; Ramachandran, L.; Praba, V.L.; Vetriselvi, S.; Chinnadurai, M. White Spot Syndrome Detection in Shrimp using Neural Network Model. In Proceedings of the 2024 11th International Conference on Computing for Sustainable Global Development (INDIACom), India, 2024; pp. 212–217.
  8. Shen, Y.; Arablouei, R.; de Hoog, F.; Xing, H.; Malan, J.; Sharp, J.; Shouri, S.; Clark, T.D.; Lefevre, C.; Kroon, F.; et al. In-Situ Fish Heart-Rate Estimation and Feeding Event Detection Using an Implantable Biologger. IEEE Trans. Mob. Comput. 2021, 22, 968–982. [Google Scholar] [CrossRef]
  9. Tricas, T.; Boyle, K. Acoustic behaviors in Hawaiian coral reef fish communities. Mar. Ecol. Prog. Ser. 2014, 511, 1–16. [Google Scholar] [CrossRef]
  10. Kim, H.; Koo, J.; Kim, D.; Jung, S.; Shin, J.-U.; Lee, S.; Myung, H. Image-Based Monitoring of Jellyfish Using Deep Learning Architecture. IEEE Sensors J. 2016, 16, 2215–2216. [Google Scholar] [CrossRef]
  11. Weihong, B.; Yun, J.; Jiaxin, L.; Lingling, S.; Guangwei, F.; Wa, J. In-Situ Detection Method of Jellyfish Based on Improved Faster R-CNN and FP16. IEEE Access 2023, 11, 81803–81814. [Google Scholar] [CrossRef]
  12. Dunker, S.; Boho, D.; Wäldchen, J.; Mäder, P. Combining high-throughput imaging flow cytometry and deep learning for efficient species and life-cycle stage identification of phytoplankton. BMC Ecol. 2018, 18, 1–15. [Google Scholar] [CrossRef] [PubMed]
  13. Collos, Y.; Harrison, P.J. Acclimation and toxicity of high ammonium concentrations to unicellular algae. Mar. Pollut. Bull. 2014, 80, 8–23. [Google Scholar] [CrossRef] [PubMed]
  14. Pedraza, A.; Bueno, G.; Deniz, O.; Cristóbal, G.; Blanco, S.; Borrego-Ramos, M. Automated Diatom Classification (Part B): A Deep Learning Approach. Appl. Sci. 2017, 7, 460. [Google Scholar] [CrossRef]
  15. Ubina, N.A.; Lan, H.-Y.; Cheng, S.-C.; Chang, C.-C.; Lin, S.-S.; Zhang, K.-X.; Lu, H.-Y.; Cheng, C.-Y.; Hsieh, Y.-Z. Digital twin-based intelligent fish farming with Artificial Intelligence Internet of Things (AIoT). Smart Agric. Technol. 2023, 5. [Google Scholar] [CrossRef]
  16. Singh, M.; Sahoo, K.S.; Nayyar, A. Sustainable IoT Solution for Freshwater Aquaculture Management. IEEE Sensors J. 2022, 22, 16563–16572. [Google Scholar] [CrossRef]
  17. A. Petkovski, J. Ajdari, and X. Zenuni, “IoT-based solutions in aquaculture: A systematic literature review,” in Proc. of 44th Int. Convention on Information, Communication and Electronic Technology (MIPRO), Opatija, Croatia, pp. 1358–1363, Sep. 2021. [Google Scholar]
  18. W.-C. Hu, L.-B. Chen, B.-H. Wang, G.-W. Li, and X.-R. Huang, “An AIoT-based water quality inspection system for intelligent aquaculture,” in Proc. of IEEE 11th Global Conf. on Consumer Electronics (GCCE), Osaka, Japan, pp. 551–552, Oct. 2022.
  19. Hu, W.-C.; Chen, L.-B.; Wang, B.-H.; Li, G.-W.; Huang, X.-R. Design and Implementation of a Full-Time Artificial Intelligence of Things-Based Water Quality Inspection and Prediction System for Intelligent Aquaculture. IEEE Sensors J. 2023, 24, 3811–3821. [Google Scholar] [CrossRef]
  20. Rastegari, H.; Nadi, F.; Lam, S.S.; Ikhwanuddin, M.; Kasan, N.A.; Rahmat, R.F.; Mahari, W.A.W. Internet of Things in aquaculture: A review of the challenges and potential solutions based on current and future trends. Smart Agric. Technol. 2023, 4. [Google Scholar] [CrossRef]
  21. B. D. T. Bodaragama, E. H. A. D. M. Miyurangana, Y. T. W. S. L. Jayakod, D. M. H. D. Vipulasiri, U. U. S. Rajapaksha, and J. Krishara, “IoT-Based Solution for Fish Disease Detection and Controlling a Fish Tank Through a Mobile Application,” in Proc. of IEEE 9th Int. Conf. for Convergence in Technology (I2CT), Pune, India, pp. 1–6, Apr. 2024.
  22. Li, T.; Rong, S.; Chen, L.; Zhou, H.; He, B. Underwater Motion Deblurring Based on Cascaded Attention Mechanism. IEEE J. Ocean. Eng. 2022, 49, 262–278. [Google Scholar] [CrossRef]
  23. Hu, W.-C.; Chen, L.-B.; Huang, B.-K.; Lin, H.-M. A Computer Vision-Based Intelligent Fish Feeding System Using Deep Learning Techniques for Aquaculture. IEEE Sensors J. 2022, 22, 7185–7194. [Google Scholar] [CrossRef]
  24. Nagothu, S.K.; Sri, P.B.; Anitha, G.; Vincent, S.; Kumar, O.P. Advancing aquaculture: fuzzy logic-based water quality monitoring and maintenance system for precision aquaculture. Aquac. Int. 2024, 33. [Google Scholar] [CrossRef]
  25. Adegboye, M.A.; Aibinu, A.M.; Kolo, J.G.; Aliyu, I.; Folorunso, T.A.; Lee, S.-H. Incorporating Intelligence in Fish Feeding System for Dispensing Feed Based on Fish Feeding Intensity. IEEE Access 2020, 8, 91948–91960. [Google Scholar] [CrossRef]
  26. Haq, K.P.R.A.; Harigovindan, V.P. Water Quality Prediction for Smart Aquaculture Using Hybrid Deep Learning Models. IEEE Access 2022, 10, 60078–60098. [Google Scholar] [CrossRef]
  27. Khan, P.W.; Byun, Y.C. Optimized Dissolved Oxygen Prediction Using Genetic Algorithm and Bagging Ensemble Learning for Smart Fish Farm. IEEE Sensors J. 2023, 23, 15153–15164. [Google Scholar] [CrossRef]
  28. Liu, J.; Zhang, T.; Han, G.; Gou, Y. TD-LSTM: Temporal Dependence-Based LSTM Networks for Marine Temperature Prediction. Sensors 2018, 18, 3797. [Google Scholar] [CrossRef] [PubMed]
  29. Khabusi, S.P.; Huang, Y.-P.; Lee, M.-F.; Tsai, M.-C. Enhanced U-Net and PSO-Optimized ANFIS for Classifying Fish Diseases in Underwater Images. Int. J. Fuzzy Syst. 2024, 1–18. [Google Scholar] [CrossRef]
  30. Kumaar, A.S.; Vignesh, A.V.; Deepak, K. FishNet Freshwater Fish Disease Detection using Deep Learning Techniques. In Proceedings of the 2024 2nd International Conference on Advancement in Computation & Computer Technologies (InCACCT), India; pp. 368–373.
  31. Pham, T.-N.; Nguyen, V.-H.; Kwon, K.-R.; Kim, J.-H.; Huh, J.-H. Improved YOLOv5 based Deep Learning System for Jellyfish Detection. IEEE Access 2024, PP, 1–1. [Google Scholar] [CrossRef]
  32. Zhao, Z.; Liu, Y.; Sun, X.; Liu, J.; Yang, X.; Zhou, C. Composited FishNet: Fish Detection and Species Recognition From Low-Quality Underwater Videos. IEEE Trans. Image Process. 2021, 30, 4719–4734. [Google Scholar] [CrossRef] [PubMed]
  33. Anjum, S.S.; M, S.K.; S, N.F.H.; N, A.M. Ensemble Neural Network Based Fish Species Identification for Emerging Aquaculture Application. In Proceedings of the 2023 International Conference on Recent Advances in Science and Engineering Technology (ICRASET), India; pp. 1–6.
  34. Rasmussen, C.; Zhao, J.; Ferraro, D.; Trembanis, A. Deep Census: AUV-Based Scallop Population Monitoring. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017; pp. 2865–2873. [Google Scholar] [CrossRef]
  35. Qiu, C.; Wu, Z.; Wang, J.; Tan, M.; Yu, J. Multiagent-Reinforcement-Learning-Based Stable Path Tracking Control for a Bionic Robotic Fish With Reaction Wheel. IEEE Trans. Ind. Electron. 2023, 70, 12670–12679. [Google Scholar] [CrossRef]
  36. Persson, D.; Nødtvedt, A.; Aunsmo, A.; Stormoen, M. Analysing mortality patterns in salmon farming using daily cage registrations. J. Fish Dis. 2021, 45, 335–347. [Google Scholar] [CrossRef] [PubMed]
  37. Tavares-Dias, M.; Martins, M.L. An overall estimation of losses caused by diseases in the Brazilian fish farms. J. Parasit. Dis. 2017, 41, 913–918. [Google Scholar] [CrossRef] [PubMed]
  38. D. Muhammed, E. Ahvar, S. Ahvar, M. Trocan, M.-J. Montpetit, and R. Ehsani, “Artificial intelligence of things (AIoT) for smart agriculture: A review of architectures, technologies and solutions,” Journal of Network and Computer Applications, vol. 228, no. 1, pp.1-27, Jun. 2024.
  39. T. Nguyen, H. Nguyen, and T. Nguyen Gia, “Exploring the integration of edge computing and blockchain IoT: Principles, architectures, security, and applications,” Journal of Network and Computer Applications, vol. 226, pp.1-24, Apr. 2024.
  40. Pasika, S.; Gandla, S.T. Smart water quality monitoring system with cost-effective using IoT. Heliyon 2020, 6, e04096. [Google Scholar] [CrossRef]
  41. Liu, S.; Yin, B.; Sang, G.; Lv, Y.; Wang, M.; Xiao, S.; Yan, R.; Wu, S. Underwater Temperature and Salinity Fiber Sensor Based on Semi-Open Cavity Structure of Asymmetric MZI. IEEE Sensors J. 2023, 23, 18219–18233. [Google Scholar] [CrossRef]
  42. Parra, L.; Lloret, G.; Lloret, J.; Rodilla, M. Physical Sensors for Precision Aquaculture: A Review. IEEE Sensors J. 2018, 18, 3915–3923. [Google Scholar] [CrossRef]
  43. Akindele, A.A.; Sartaj, M. The toxicity effects of ammonia on anaerobic digestion of organic fraction of municipal solid waste. Waste Manag. 2018, 71, 757–766. [Google Scholar] [CrossRef]
  44. Matos, T.; Pinto, V.; Sousa, P.; Martins, M.; Fernández, E.; Henriques, R.; Gonçalves, L.M. Design and In Situ Validation of Low-Cost and Easy to Apply Anti-Biofouling Techniques for Oceanographic Continuous Monitoring with Optical Instruments. Sensors 2023, 23, 605. [Google Scholar] [CrossRef] [PubMed]
  45. Chiang, C.-T.; Chen, T.-Y.; Wu, Y.-T. Design of a Water Salinity Difference Detector for Monitoring Instantaneous Salinity Changes in Aquaculture. IEEE Sensors J. 2020, 20, 3242–3248. [Google Scholar] [CrossRef]
  46. Chiang, C.-T.; Chang, C.-W. Design of a Calibrated Salinity Sensor Transducer for Monitoring Salinity of Ocean Environment and Aquaculture. IEEE Sensors J. 2015, 15, 5151–5157. [Google Scholar] [CrossRef]
  47. Chen, F.; Qiu, T.; Xu, J.; Zhang, J.; Du, Y.; Duan, Y.; Zeng, Y.; Zhou, L.; Sun, J.; Sun, M. Rapid Real-Time Prediction Techniques for Ammonia and Nitrite in High-Density Shrimp Farming in Recirculating Aquaculture Systems. Fishes 2024, 9, 386. [Google Scholar] [CrossRef]
  48. Wang, Y.; Xu, D.; Li, X.; Wang, W. Prediction Model of Ammonia Nitrogen Concentration in Aquaculture Based on Improved AdaBoost and LSTM. Mathematics 2024, 12, 627. [Google Scholar] [CrossRef]
  49. Delgado, A.; Briciu-Burghina, C.; Regan, F. Antifouling Strategies for Sensors Used in Water Monitoring: Review and Future Perspectives. Sensors 2021, 21, 389. [Google Scholar] [CrossRef] [PubMed]
  50. Wang, J.-H.; Lee, S.-K.; Lai, Y.-C.; Lin, C.-C.; Wang, T.-Y.; Lin, Y.-R.; Hsu, T.-H.; Huang, C.-W.; Chiang, C.-P. Anomalous Behaviors Detection for Underwater Fish Using AI Techniques. IEEE Access 2020, 8, 224372–224382. [Google Scholar] [CrossRef]
  51. Chen, P.; Wang, F.; Liu, S.; Yu, Y.; Yue, S.; Song, Y.; Lin, Y. Modeling Collective Behavior for Fish School With Deep Q-Networks. IEEE Access 2023, 11, 36630–36641. [Google Scholar] [CrossRef]
  52. Rosell-Moll, E.; Piazzon, M.; Sosa, J.; Ferrer, M.; Cabruja, E.; Vega, A.; Calduch-Giner, J.; Sitjà-Bobadilla, A.; Lozano, M.; Montiel-Nelson, J.; et al. Use of accelerometer technology for individual tracking of activity patterns, metabolic rates and welfare in farmed gilthead sea bream (Sparus aurata) facing a wide range of stressors. Aquaculture 2021, 539, 736609. [Google Scholar] [CrossRef]
  53. Chai, Y.; Yu, H.; Xu, L.; Li, D.; Chen, Y. Deep Learning Algorithms for Sonar Imagery Analysis and Its Application in Aquaculture: A Review. IEEE Sensors J. 2023, 23, 28549–28563. [Google Scholar] [CrossRef]
  54. Huang, M.; Zhou, Y.-G.; Yang, X.-G.; Gao, Q.-F.; Chen, Y.-N.; Ren, Y.-C.; Dong, S.-L. Optimizing feeding frequencies in fish: A meta-analysis and machine learning approach. Aquaculture 2024, 595. [Google Scholar] [CrossRef]
  55. Li, H.; Chatzifotis, S.; Lian, G.; Duan, Y.; Li, D.; Chen, T. Mechanistic model based optimization of feeding practices in aquaculture. Aquac. Eng. 2022, 97. [Google Scholar] [CrossRef]
  56. Cao, Y.; Liu, S.; Wang, M.; Liu, W.; Liu, T.; Cao, L.; Guo, J.; Feng, D.; Zhang, H.; Hassan, S.G.; et al. A Hybrid Method for Identifying the Feeding Behavior of Tilapia. IEEE Access 2023, 12, 76022–76037. [Google Scholar] [CrossRef]
  57. Wei, M.; Lin, Y.; Chen, K.; Su, W.; Cheng, E. Study on Feeding Activity of Litopenaeus Vannamei Based on Passive Acoustic Detection. IEEE Access 2020, 8, 156654–156662. [Google Scholar] [CrossRef]
  58. Catarino, M.M.R.S.; Gomes, M.R.S.; Ferreira, S.M.F.; Gonçalves, S.C. Optimization of feeding quantity and frequency to rear the cyprinid fish Garra rufa (Heckel, 1843). Aquac. Res. 2019, 50, 876–881. [Google Scholar] [CrossRef]
  59. Atoum, Y.; Srivastava, S.; Liu, X. Automatic Feeding Control for Dense Aquaculture Fish Tanks. IEEE Signal Process. Lett. 2014, 22, 1089–1093. [Google Scholar] [CrossRef]
  60. Zhou, C.; Xu, D.; Chen, L.; Zhang, S.; Sun, C.; Yang, X.; Wang, Y. Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision. Aquaculture 2019, 507. [CrossRef]
  61. T. Noda, Y. Kawabata, N. Arai, H. Mitamura, and S. Watanabe, “Monitoring escape and feeding behaviors of cruiser fish by inertial and magnetic sensors,” PLoS One, vol. 8, no. 11, pp.1-13, Nov. 2013.
  62. Zeng, Y.; Yang, X.; Pan, L.; Zhu, W.; Wang, D.; Zhao, Z.; Liu, J.; Sun, C.; Zhou, C. Fish school feeding behavior quantification using acoustic signal and improved Swin Transformer. Comput. Electron. Agric. 2022, 204. [Google Scholar] [CrossRef]
  63. M. A. Adegboye, A. M. Aibinu, J. G. Kolo, T. A. Folorunso, I. Aliyu, and L. S. Ho, “Intelligent Fish feeding regime system using vibration analysis,” World Journal of Wireless Devices and Engineering, vol. 3, no. 1, pp.1-8, Oct. 2019.
  64. Huang, Y.-P.; Vadloori, S. Optimizing Fish Feeding with FFAUNet Segmentation and Adaptive Fuzzy Inference System. Processes 2024, 12, 1580. [Google Scholar] [CrossRef]
  65. Chen, H.; Nan, X.; Xia, S. Data Fusion Based on Temperature Monitoring of Aquaculture Ponds With Wireless Sensor Networks. IEEE Sensors J. 2022, 23, 6–20. [Google Scholar] [CrossRef]
  66. Adhikary, A.; Roy, J.; Kumar, A.G.; Banerjee, S.; Biswas, K. An Impedimetric Cu-Polymer Sensor-Based Conductivity Meter for Precision Agriculture and Aquaculture Applications. IEEE Sensors J. 2019, 19, 12087–12095. [Google Scholar] [CrossRef]
  67. Liu, W.; Liu, S.; Hassan, S.G.; Cao, Y.; Xu, L.; Feng, D.; Cao, L.; Chen, W.; Chen, Y.; Guo, J.; et al. A Novel Hybrid Model to Predict Dissolved Oxygen for Efficient Water Quality in Intensive Aquaculture. IEEE Access 2023, 11, 29162–29174. [Google Scholar] [CrossRef]
  68. Li, D.; Wang, X.; Sun, J.; Feng, Y. Radial Basis Function Neural Network Model for Dissolved Oxygen Concentration Prediction Based on an Enhanced Clustering Algorithm and Adam. IEEE Access 2021, 9, 44521–44533. [Google Scholar] [CrossRef]
  69. Li, D.; Sun, J.; Yang, H.; Wang, X. An Enhanced Naive Bayes Model for Dissolved Oxygen Forecasting in Shellfish Aquaculture. IEEE Access 2020, 8, 217917–217927. [Google Scholar] [CrossRef]
  70. C. Yingyi, C. Qianqian, F. Xiaomin, Y. Huihui, and L. Daoliang, “Principal component analysis and long short-term memory neural network for predicting dissolved oxygen in water for aquaculture,” Trans. of the Chinese Society of Agricultural Engineering (CSAE), vol. 34, no. 17, pp.183-191, Aug. 2018.
  71. Li, Z.; Peng, F.; Niu, B.; Li, G.; Wu, J.; Miao, Z. Water Quality Prediction Model Combining Sparse Auto-encoder and LSTM Network. IFAC-PapersOnLine 2018, 51, 831–836. [Google Scholar] [CrossRef]
  72. Da Silva, Y.F.; Freire, R.C.S.; Neto, J.V.D.F. Conception and Design of WSN Sensor Nodes based on Machine Learning, Embedded Systems and IoT approaches for Pollutant Detection in Aquatic Environments. IEEE Access 2023, PP, 1–1. [Google Scholar] [CrossRef]
  73. Liu, S.; Xu, L.; Li, Q.; Zhao, X.; Li, D. Fault Diagnosis of Water Quality Monitoring Devices Based on Multiclass Support Vector Machines and Rule-Based Decision Trees. IEEE Access 2018, 6, 22184–22195. [Google Scholar] [CrossRef]
  74. Park, Y.; Cho, K.H.; Park, J.; Cha, S.M.; Kim, J.H. Development of early-warning protocol for predicting chlorophyll-a concentration using machine learning models in freshwater and estuarine reservoirs, Korea. Sci. Total. Environ. 2015, 502, 31–41. [Google Scholar] [CrossRef]
  75. Lee, G.; Bae, J.; Lee, S.; Jang, M.; Park, H. Monthly chlorophyll-a prediction using neuro-genetic algorithm for water quality management in Lakes. Desalination Water Treat. 2016, 57, 26783–26791. [Google Scholar] [CrossRef]
  76. Gambin, A.F.; Angelats, E.; Gonzalez, J.S.; Miozzo, M.; Dini, P. Sustainable Marine Ecosystems: Deep Learning for Water Quality Assessment and Forecasting. IEEE Access 2021, 9, 121344–121365. [Google Scholar] [CrossRef]
  77. Cho, H.; Choi, U.J.; Park, H. Deep Learning Application to Time Series Prediction of Daily Chlorophyll-a Concentration. WIT Trans. Ecol. Environ. 2018, 215, 157–163. [Google Scholar] [CrossRef]
  78. S. Lee and D. Lee, “Improved prediction of harmful algal blooms in four major South Korea's rivers using deep learning models,” Int. Journal of Environmental Research and Public Health, vol. 15, no. 7, pp.1-15, Jun. 2018.
  79. John, E.M.; Krishnapriya, K.; Sankar, T. Treatment of ammonia and nitrite in aquaculture wastewater by an assembled bacterial consortium. Aquaculture 2020, 526, 735390. [Google Scholar] [CrossRef]
  80. Yu, H.; Yang, L.; Li, D.; Chen, Y. A hybrid intelligent soft computing method for ammonia nitrogen prediction in aquaculture. Inf. Process. Agric. 2020, 8, 64–74. [Google Scholar] [CrossRef]
  81. Karri, R.R.; Sahu, J.N.; Chimmiri, V. Critical review of abatement of ammonia from wastewater. J. Mol. Liq. 2018, 261, 21–31. [CrossRef]
  82. Nagaraju, T.V.; B. M., S.; Chaudhary, B.; Prasad, C.D.; R, G. Prediction of ammonia contaminants in the aquaculture ponds using soft computing coupled with wavelet analysis. Environ. Pollut. 2023, 331, 121924. [Google Scholar] [CrossRef]
  83. A. Vaswani, N. Shazeer, N. Parmar, et al., “Attention is all you need,” in Proc. of 31st Conf. on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, pp.1-11, Dec. 2017.
  84. M. S. Ahmed, T. T. Aurpa, and M. A. K. Azad, “Fish disease detection using image based machine learning technique in aquaculture,” Journal of King Saud University - Computer and Information Sciences, vol. 34, no. 8, pp.5170-5182, May 2022.
  85. M. J. Mia, R. B. Mahmud, M. S. Sadad, H. A. Asad, and R. Hossain, “An in-depth automated approach for fish disease recognition,” Journal of King Saud University - Computer and Information Sciences, vol. 34, no. 9, pp.7174-7183, Feb. 2022.
  86. S. P. Khabusi, Y.-P. Huang, and M.-F. Lee, “Attention-based mechanism for fish disease classification in aquaculture,” in Proc. of Int. Conf. on System Science and Engineering (ICSSE), Ho Chi Minh, Vietnam, pp.95-100, Jul. 2023.
  87. Waleed, A.; Medhat, H.; Esmail, M.; Osama, K.; Samy, R.; Ghanim, T.M. Automatic Recognition of Fish Diseases in Fish Farms. In Proceedings of the 2019 14th International Conference on Computer Engineering and Systems (ICCES), Egypt; pp. 201–206.
  88. Sujatha, K.; Mounika, P. Evaluation of ML Models for Detection and Prediction of Fish Diseases: A Case Study on Epizootic Ulcerative Syndrome. In Proceedings of the 2023 Second International Conference on Electrical, Electronics, Information and Communication Technologies (ICEEICT), India; pp. 1–7.
  89. S. Malik, T. Kumar, and A. K. Sahoo, “Fish disease detection using HOG and FAST feature descriptor,” Int. Journal of Computer Science and Information Security (IJCSIS), vol. 15, no. 5, pp.216-221, May 2017.
  90. Nayan, A.-A.; Saha, J.; Mozumder, A.N.; Mahmud, K.R.; Al Azad, A.K.; Kibria, M.G. A Machine Learning Approach for Early Detection of Fish Diseases by Analyzing Water Quality. Trends Sci. 2021, 18, 351–351. [Google Scholar] [CrossRef]
  91. Moni, J.; Jacob, P.M.; Sudeesh, S.; Nair, M.J.; George, M.S.; Thomas, M.S. A Smart Aquaculture Monitoring System with Automated Fish Disease Identification. In Proceedings of the 2024 1st International Conference on Trends in Engineering Systems and Technologies (ICTEST), India; pp. 1–6.
  92. Mendieta, M.; Romero, D. A cross-modal transfer approach for histological images: A case study in aquaculture for disease identification using zero-shot learning. In Proceedings of the 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), Salinas, Ecuador; pp. 1–6.
  93. Vijayalakshmi, M.; Sasithradevi, A.; Prakash, P. Transfer Learning Approach for Epizootic Ulcerative Syndrome and Ichthyophthirius Disease Classification in Fish Species. In Proceedings of the 2023 International Conference on Bio Signals, Images, and Instrumentation (ICBSII), India; pp. 1–5.
  94. Al Maruf, A.; Fahim, S.H.; Bashar, R.; Rumy, R.A.; Chowdhury, S.I.; Aung, Z. Classification of Freshwater Fish Diseases in Bangladesh Using a Novel Ensemble Deep Learning Model: Enhancing Accuracy and Interpretability. IEEE Access 2024, 12, 96411–96435. [Google Scholar] [CrossRef]
  95. A. Vasumathi, R. P. Singh, E. S. Bharathi, N. Vignesh, and S. Harsith, “Fish disease detection using machine learning,” in Proc. of Int. Conf. on Science Technology Engineering and Management (ICSTEM), Coimbatore, India, pp.1-4, Apr. 2024.
  96. Gu, J.; Deng, C.; Lin, X.; Yu, D. Expert system for fish disease diagnosis based on fuzzy neural network. In Proceedings of the 2012 Third International Conference on Intelligent Control and Information Processing (ICICIP), China; pp. 146–149.
  97. Darapaneni, N.; Sreekanth, S.; Paduri, A.R.; Roche, A.S.; Murugappan, V.; Singha, K.K.; Shenwai, A.V. AI Based Farm Fish Disease Detection System to Help Micro and Small Fish Farmers. In Proceedings of the 2022 Interdisciplinary Research in Technology and Management (IRTM), India; pp. 1–5.
  98. Prządka, M.P.; Wojcieszak, D.; Pala, K. Optimization of Au Electrode Parameters for Pathogen Detection in Aquaculture. IEEE Sensors J. 2024, 24, 5785–5796. [Google Scholar] [CrossRef]
  99. Zhang, T.; Yang, Y.; Liu, Y.; Liu, C.; Zhao, R.; Li, D.; Shi, C. Fully automatic system for fish biomass estimation based on deep neural network. Ecol. Informatics 2023, 79. [Google Scholar] [CrossRef]
  100. Gao, Z.; Jiang, W.; Man, X.; Zheng, R.; Ma, X. A Method for Estimating Fish Biomass Based on Underwater Binocular Vision. In Proceedings of the 2024 36th Chinese Control and Decision Conference (CCDC), China; pp. 2656–2660.
  101. S, S.; E, S.; Nandini, T.S.; T, S. Fish Biomass Estimation Based on Object Detection Using YOLOv7. In Proceedings of the 2023 4th International Conference for Emerging Technology (INCET), India; pp. 1–6.
  102. Rossi, L.; Bibbiani, C.; Fronte, B.; Damiano, E.; Di Lieto, A. Application of a smart dynamic scale for measuring live-fish biomass in aquaculture. In Proceedings of the 2021 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Italy; pp. 248–252.
  103. Rossi, L.; Bibbiani, C.; Fronte, B.; Damiano, E.; Di Lieto, A. Validation campaign of a smart dynamic scale for measuring live-fish biomass in aquaculture. In Proceedings of the 2022 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Italy; pp. 111–115.
  104. Damiano, E.; Bibbiani, C.; Fronte, B.; Di Lieto, A. Smart and cheap scale for estimating live-fish biomass in offshore aquaculture. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Italy; pp. 160–164.
  105. Pargi, M.K.; Bagheri, E.; F, R.S.; Huat, K.E.; Shishehchian, F.; Nathalie, N. Improving Aquaculture Systems using AI: Employing predictive models for Biomass Estimation on Sonar Images. In Proceedings of the 2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA), Bahamas; pp. 1629–1636.
  106. Sthapit, P.; Kim, M.; Kang, D.; Kim, K. Development of Scientific Fishery Biomass Estimator: System Design and Prototyping. Sensors 2020, 20, 6095. [Google Scholar] [CrossRef]
  107. Vasile, G.; Petrut, T.; D'Urso, G.; De Oliveira, E. IoT Acoustic Antenna Development for Fish Biomass Long-Term Monitoring. In Proceedings of OCEANS 2018 MTS/IEEE Charleston, Charleston, SC, USA; pp. 1–4.
  108. Tang, N.T.; Lim, K.G.; Yoong, H.P.; Ching, F.F.; Wang, T.; Teo, K.T.K. Non-Intrusive Biomass Estimation in Aquaculture Using Structure from Motion Within Decision Support Systems. In Proceedings of the 2024 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), Malaysia; pp. 682–686.
  109. Hossain, S.A.; Hossen, M. Biomass Estimation of a Popular Aquarium Fish Using an Acoustic Signal Processing Technique with Three Acoustic Sensors. In Proceedings of the 2018 International Conference on Advancement in Electrical and Electronic Engineering (ICAEEE), Bangladesh; pp. 1–4.
  110. Hu, W.-C.; Chen, L.-B.; Lin, H.-M. A Method for Abnormal Behavior Recognition in Aquaculture Fields Using Deep Learning. IEEE Can. J. Electr. Comput. Eng. 2024, 47, 118–126. [Google Scholar] [CrossRef]
  111. Chen, L.; Yin, X. Recognition Method of Abnormal Behavior of Marine Fish Swarm Based on In-Depth Learning Network Model. J. Web Eng. 2021, 20, 575–596. [Google Scholar] [CrossRef]
  112. J. Zhao, W. Bao, F. Zhang, et al., “Modified motion influence map and recurrent neural network-based monitoring of the local unusual behaviors for fish school in intensive aquaculture,” Aquaculture, vol. 493, pp.165-175, May 2018.
  113. Hassan, W.; Fore, M.; Pedersen, M.O.; Alfredsen, J.A. A New Method for Measuring Free-Ranging Fish Swimming Speed in Commercial Marine Farms Using Doppler Principle. IEEE Sensors J. 2020, 20, 10220–10227. [Google Scholar] [CrossRef]
  114. Le Quinio, A.; Martignac, F.; Girard, A.; Guillard, J.; Roussel, J.-M.; de Oliveira, E. Fish as a Deformable Solid: An Innovative Method to Characterize Fish Swimming Behavior on Acoustic Videos. IEEE Access 2024, 12, 134486–134497. [Google Scholar] [CrossRef]
  115. Osterloff, J.; Nilssen, I.; Jarnegren, J.; Buhl-Mortensen, P.; Nattkemper, T.W. Polyp Activity Estimation and Monitoring for Cold Water Corals with a Deep Learning Approach. In Proceedings of the 2016 ICPR 2nd Workshop on Computer Vision for Analysis of Underwater Imagery (CVAUI), Mexico; pp. 1–6.
  116. Måløy, H. EchoBERT: A Transformer-Based Approach for Behavior Detection in Echograms. IEEE Access 2020, 8, 218372–218385. [Google Scholar] [CrossRef]
  117. Hu, W.-C.; Chen, L.-B.; Hsieh, M.-H.; Ting, Y.-K. A Deep-Learning-Based Fast Counting Methodology Using Density Estimation for Counting Shrimp Larvae. IEEE Sensors J. 2022, 23, 527–535. [Google Scholar] [CrossRef]
  118. Pai, K.M.; Shenoy, K.B.A.; Pai, M.M.M. A Computer Vision Based Behavioral Study and Fish Counting in a Controlled Environment. IEEE Access 2022, 10, 87778–87786. [Google Scholar] [CrossRef]
  119. Sthapit, P.; Teekaraman, Y.; MinSeok, K.; Kim, K. Algorithm to Estimation Fish Population using Echosounder in Fish Farming Net. In Proceedings of the 2019 International Conference on Information and Communication Technology Convergence (ICTC), Korea; pp. 587–590.
  120. Yu, H.; Wang, Z.; Qin, H.; Chen, Y. An Automatic Detection and Counting Method for Fish Lateral Line Scales of Underwater Fish Based on Improved YOLOv5. IEEE Access 2023, 11, 143616–143627. [Google Scholar] [CrossRef]
  121. Y. Zhou, H. Yu, J. Wu, Z. Cui, H. Pang, and F. Zhang, “Fish density estimation with multi-scale context enhanced convolutional neural network,” Journal of Communications and Information Networks, vol. 4, no. 3, pp.80-89, Sep. 2019.
  122. H. Chen, Y. Cheng, Y. Dou, et al., “Fry counting method in high-density culture based on image enhancement algorithm and attention mechanism,” IEEE Access, vol. 12, pp.41734-41749, Mar. 2024.
  123. Zhang, X.; Zeng, H.; Liu, X.; Yu, Z.; Zheng, H.; Zheng, B. In Situ Holothurian Noncontact Counting System: A General Framework for Holothurian Counting. IEEE Access 2020, 8, 210041–210053. [Google Scholar] [CrossRef]
  124. Kim, K.; Myung, H. Autoencoder-Combined Generative Adversarial Networks for Synthetic Image Data Generation and Detection of Jellyfish Swarm. IEEE Access 2018, 6, 54207–54214. [Google Scholar] [CrossRef]
  125. N. P. Desai, M. F. Baluch, A. Makrariya, and R. Musheer Aziz, “Image processing model with deep learning approach for fish species classification,” Turkish Journal of Computer and Mathematics Education, vol. 13, no. 1, pp.85-99, Jan. 2022.
  126. Zhang, X.; Huang, B.; Chen, G.; Radenkovic, M.; Hou, G. WildFishNet: Open Set Wild Fish Recognition Deep Neural Network With Fusion Activation Pattern. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2023, 16, 7303–7314. [Google Scholar] [CrossRef]
  127. Chuang, M.-C.; Hwang, J.-N.; Williams, K. A Feature Learning and Object Recognition Framework for Underwater Fish Images. IEEE Trans. Image Process. 2016, 25, 1–1. [Google Scholar] [CrossRef] [PubMed]
  128. Slonimer, A.L.; Dosso, S.E.; Albu, A.B.; Cote, M.; Marques, T.P.; Rezvanifar, A.; Ersahin, K.; Mudge, T.; Gauthier, S. Classification of Herring, Salmon, and Bubbles in Multifrequency Echograms Using U-Net Neural Networks. IEEE J. Ocean. Eng. 2023, 48, 1236–1254. [Google Scholar] [CrossRef]
  129. Paraschiv, M.; Padrino, R.; Casari, P.; Bigal, E.; Scheinin, A.; Tchernov, D.; Anta, A.F. Classification of Underwater Fish Images and Videos via Very Small Convolutional Neural Networks. J. Mar. Sci. Eng. 2022, 10, 736. [Google Scholar] [CrossRef]
  130. Tejaswini, H.; Pai, M.M.M.; Pai, R.M. Automatic Estuarine Fish Species Classification System Based on Deep Learning Techniques. IEEE Access 2024, 12, 140412–140438. [Google Scholar] [CrossRef]
  131. Liu, Z. Soft-shell Shrimp Recognition Based on an Improved AlexNet for Quality Evaluations. J. Food Eng. 2019, 266, 109698. [Google Scholar] [CrossRef]
  132. Yu, X.; Tang, L.; Wu, X.; Lu, H. Nondestructive Freshness Discriminating of Shrimp Using Visible/Near-Infrared Hyperspectral Imaging Technique and Deep Learning Algorithm. Food Anal. Methods 2017, 11, 768–780. [Google Scholar] [CrossRef]
  133. Xuan, Q.; Fang, B.; Liu, Y.; Wang, J.; Zhang, J.; Zheng, Y.; Bao, G. Automatic Pearl Classification Machine Based on a Multistream Convolutional Neural Network. IEEE Trans. Ind. Electron. 2017, 65, 6538–6547. [Google Scholar] [CrossRef]
  134. M. E. Elawady, “Sparse coral classification using deep convolutional neural network,” University of Burgundy Thesis, pp.1-51, Jun. 2014.
  135. Riabchenko, E.; Meissner, K.; Ahmad, I.; Iosifidis, A.; Tirronen, V.; Gabbouj, M.; Kiranyaz, S. Learned vs. engineered features for fine-grained classification of aquatic macroinvertebrates. In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Mexico; pp. 2276–2281.
  136. Wu, F.; Cai, Z.; Fan, S.; Song, R.; Wang, L.; Cai, W. Fish Target Detection in Underwater Blurred Scenes Based on Improved YOLOv5. IEEE Access 2023, 11, 122911–122925. [Google Scholar] [CrossRef]
  137. Qin, X.; Yu, C.; Liu, B.; Zhang, Z. YOLO8-FASG: A High-Accuracy Fish Identification Method for Underwater Robotic System. IEEE Access 2024, PP, 1–1. [Google Scholar] [CrossRef]
  138. Fan, J.; Zhou, J.; Wang, X.; Wang, J. A Self-Supervised Transformer With Feature Fusion for SAR Image Semantic Segmentation in Marine Aquaculture Monitoring. IEEE Trans. Geosci. Remote. Sens. 2023, 61, 1–15. [Google Scholar] [CrossRef]
  139. J. Wang, J. Fan, and J. Wang, “MDOAU-Net: A lightweight and robust deep learning model for SAR image segmentation in aquaculture raft monitoring,” IEEE Geoscience and Remote Sensing Letters, vol. 19, pp.1-5, Feb. 2022.
  140. Yu, C.; Liu, Y.; Xia, X.; Lan, D.; Liu, X.; Wu, S. Precise and Fast Segmentation of Offshore Farms in High-Resolution SAR Images Based on Model Fusion and Half-Precision Parallel Inference. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2022, 15, 4861–4872. [Google Scholar] [CrossRef]
  141. Wang, X.; Zhou, J.; Fan, J. IDUDL: Incremental Double Unsupervised Deep Learning Model for Marine Aquaculture SAR Images Segmentation. IEEE Trans. Geosci. Remote. Sens. 2022, 60, 1–12. [Google Scholar] [CrossRef]
  142. Sánchez, J.S.; Lisani, J.-L.; Catalán, I.A.; Álvarez-Ellacuría, A. Leveraging Bounding Box Annotations for Fish Segmentation in Underwater Images. IEEE Access 2023, 11, 125984–125994. [Google Scholar] [CrossRef]
  143. Qin, G.; Wang, S.; Wang, F.; Zhou, Y.; Wang, Z.; Zou, W. U_EFF_NET: High-Precision Segmentation of Offshore Farms From High-Resolution SAR Remote Sensing Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2022, 15, 8519–8528. [Google Scholar] [CrossRef]
  144. Fan, J.; Zhao, J.; An, W.; Hu, Y. Marine Floating Raft Aquaculture Detection of GF-3 PolSAR Images Based on Collective Multikernel Fuzzy Clustering. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2019, 12, 2741–2754. [Google Scholar] [CrossRef]
  145. Fan, J.; Deng, Q. RSC-APMN: Random Sea Condition Adaptive Perception Modulating Network for SAR-Derived Marine Aquaculture Segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2024, PP, 1–17. [Google Scholar] [CrossRef]
  146. Issac, A.; Dutta, M.K.; Sarkar, B.; Burget, R. An efficient image processing based method for gills segmentation from a digital fish image. In Proceedings of the 2016 3rd International Conference on Signal Processing and Integrated Networks (SPIN), India; pp. 645–649.
  147. O. Ulucan, D. Karakaya, and M. Turkan, “A large-scale dataset for fish segmentation and classification,” in Proc. of Innovations in Intelligent Systems and Applications Conference (ASYU), Istanbul, Turkey, pp.1-5, Oct. 2020. [Google Scholar]
  148. Hsieh, Y.-Z.; Meng, Y.-H. A Video Surveillance System for Determining the Sexual Maturity of Cobia. IEEE Trans. Consum. Electron. 2023, 70, 484–495. [Google Scholar] [CrossRef]
  149. Ahmed, H.; Ushirobira, R.; Efimov, D.; Tran, D.; Sow, M.; Payton, L.; Massabuau, J.-C. A Fault Detection Method for Automatic Detection of Spawning in Oysters. IEEE Trans. Control. Syst. Technol. 2015, 24, 1140–1147. [Google Scholar] [CrossRef]
  150. Sthapit, P.; Teekaraman, Y.; MinSeok, K.; Kim, K. Algorithm to Estimation Fish Population using Echosounder in Fish Farming Net. In Proceedings of the 2019 International Conference on Information and Communication Technology Convergence (ICTC), Korea; pp. 587–590.
  151. Le, N.-B.; Woo, H.; Lee, D.; Huh, J.-H. AgTech: A Survey on Digital Twins based Aquaculture Systems. IEEE Access 2024, PP, 1–1. [Google Scholar] [CrossRef]
  152. Hsieh, Y.-Z.; Lee, P.-Y. Analysis of Oplegnathus Punctatus Body Parameters Using Underwater Stereo Vision. IEEE Trans. Emerg. Top. Comput. Intell. 2023, 8, 879–891. [Google Scholar] [CrossRef]
  153. Sun, N.; Li, Z.; Luan, Y.; Du, L. Enhanced YOLO-Based Multi-Task Network for Accurate Fish Body Length Measurement. In Proceedings of the 2024 5th International Conference on Artificial Intelligence and Electromechanical Automation (AIEA), China; pp. 334–339.
  154. X. Chen, I. N’Doye, F. Aljehani, and T.-M. Laleg-Kirati, “Fish weight prediction using empirical and data-driven models in aquaculture systems,” in Proc. of IEEE Conf. on Control Technology and Applications (CCTA), Newcastle upon Tyne, United Kingdom, pp.369-374, Aug. 2024.
  155. Voskakis, D.; Makris, A.; Papandroulakis, N. Deep learning based fish length estimation: An application for the Mediterranean aquaculture. In Proceedings of OCEANS 2021: San Diego – Porto, United States.
  156. D. Pérez, F. J. Ferrero, I. Alvarez, M. Valledor, and J. C. Campo, “Automatic measurement of fish size using stereo vision,” in Proc. of IEEE Int. Instrumentation and Measurement Technology Conference (I2MTC), Houston, TX, USA, pp.1-6, May 2018.
  157. Gorpincenko, A.; French, G.; Knight, P.; Challiss, M.; Mackiewicz, M. Improving Automated Sonar Video Analysis to Notify About Jellyfish Blooms. IEEE Sensors J. 2020, 21, 4981–4988. [Google Scholar] [CrossRef]
  158. Schraml, R.; Hofbauer, H.; Jalilian, E.; Bekkozhayeva, D.; Saberioon, M.; Cisar, P.; Uhl, A. Towards Fish Individuality-Based Aquaculture. IEEE Trans. Ind. Informatics 2020, 17, 4356–4366. [Google Scholar] [CrossRef]
  159. Liu, X.; Yue, Y.; Shi, M.; Qian, Z.-M. 3-D Video Tracking of Multiple Fish in a Water Tank. IEEE Access 2019, 7, 145049–145059. [Google Scholar] [CrossRef]
  160. Gupta, S.; Mukherjee, P.; Chaudhury, S.; Lall, B.; Sanisetty, H. DFTNet: Deep Fish Tracker With Attention Mechanism in Unconstrained Marine Environments. IEEE Trans. Instrum. Meas. 2021, 70, 1–13. [Google Scholar] [CrossRef]
  161. Shreesha, S.; Pai, M.M.M.; Verma, U.; Pai, R.M. Fish Tracking and Continual Behavioral Pattern Clustering Using Novel Sillago Sihama Vid (SSVid). IEEE Access 2023, 11, 29400–29416. [Google Scholar] [CrossRef]
  162. Winkler, J.; Badri-Hoeher, S.; Barkouch, F. Activity Segmentation and Fish Tracking From Sonar Videos by Combining Artifacts Filtering and a Kalman Approach. IEEE Access 2023, 11, 96522–96529. [Google Scholar] [CrossRef]
  163. Marques, T.P.; Cote, M.; Rezvanifar, A.; Slonimer, A.; Albu, A.B.; Ersahin, K.; Gauthier, S. U-MSAA-Net: A Multiscale Additive Attention-Based Network for Pixel-Level Identification of Finfish and Krill in Echograms. IEEE J. Ocean. Eng. 2023, 48, 853–873. [Google Scholar] [CrossRef]
  164. Xu, X.; Hu, J.; Yang, J.; Ran, Y.; Tan, Z. A Fish Detection and Tracking Method based on Improved Inter-Frame Difference and YOLO-CTS. IEEE Trans. Instrum. Meas. 2024, PP, 1–1. [Google Scholar] [CrossRef]
  165. Zhao, Z.; Yang, X.; Liu, J.; Zhou, C.; Zhao, C. GCVC: Graph Convolution Vector Distribution Calibration for Fish Group Activity Recognition. IEEE Trans. Multimedia 2023, 26, 1776–1789. [Google Scholar] [CrossRef]
  166. Zhao, X.; Yan, S.; Gao, Q. An Algorithm for Tracking Multiple Fish Based on Biological Water Quality Monitoring. IEEE Access 2019, 7, 15018–15026. [Google Scholar] [CrossRef]
  167. Williamson, B.J.; Fraser, S.; Blondel, P.; Bell, P.S.; Waggitt, J.J.; Scott, B.E. Multisensor Acoustic Tracking of Fish and Seabird Behavior Around Tidal Turbine Structures in Scotland. IEEE J. Ocean. Eng. 2017, 42, 948–965. [Google Scholar] [CrossRef]
  168. Chuang, M.-C.; Hwang, J.-N.; Ye, J.-H.; Huang, S.-C.; Williams, K. Underwater Fish Tracking for Moving Cameras Based on Deformable Multiple Kernels. IEEE Trans. Syst. Man, Cybern. Syst. 2016, 47, 1–11. [Google Scholar] [CrossRef]
  169. Zhang, Y.; Ning, Y.; Zhang, X.; Glamuzina, B.; Xing, S. Multi-Sensors-Based Physiological Stress Monitoring and Online Survival Prediction System for Live Fish Waterless Transportation. IEEE Access 2020, 8, 40955–40965. [Google Scholar] [CrossRef]
  170. Le, N.-B.; Huh, J.-H. AgTech: Building Smart Aquaculture Assistant System Integrated IoT and Big Data Analysis. IEEE Trans. AgriFood Electron. 2024, 2, 471–482. [Google Scholar] [CrossRef]
  171. Xu, Y.; Lu, L. An Attention-Fused Deep Learning Model for Accurately Monitoring Cage and Raft Aquaculture at Large-Scale Using Sentinel-2 Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2024, 17, 9099–9109. [Google Scholar] [CrossRef]
  172. Liu, J.; Lu, Y.; Guo, X.; Ke, W. A Deep Learning Method for Offshore Raft Aquaculture Extraction Based on Medium-Resolution Remote Sensing Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2023, 16, 6296–6309. [Google Scholar] [CrossRef]
  173. Soltanzadeh, R.; Hardy, B.; Mcleod, R.D.; Friesen, M.R. A Prototype System for Real-Time Monitoring of Arctic Char in Indoor Aquaculture Operations: Possibilities & Challenges. IEEE Access 2019, 8, 180815–180824. [Google Scholar] [CrossRef]
  174. Ai, B.; Xiao, H.; Xu, H.; Yuan, F.; Ling, M. Coastal Aquaculture Area Extraction Based on Self-Attention Mechanism and Auxiliary Loss. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2022, 16, 2250–2261. [Google Scholar] [CrossRef]
  175. Kim, T.; Hwang, K.-S.; Oh, M.-H.; Jang, D.-J. Development of an Autonomous Submersible Fish Cage System. IEEE J. Ocean. Eng. 2013, 39, 702–712. [Google Scholar] [CrossRef]
  176. Abdurohman, M.; Putrada, A.G.; Deris, M.M. A Robust Internet of Things-Based Aquarium Control System Using Decision Tree Regression Algorithm. IEEE Access 2022, 10, 56937–56951. [Google Scholar] [CrossRef]
  177. Xia, J.; Ma, T.; Li, Y.; Xu, S.; Qi, H. A Scale-Aware Monocular Odometry for Fishnet Inspection With Both Repeated and Weak Features. IEEE Trans. Instrum. Meas. 2023, 73, 1–11. [Google Scholar] [CrossRef]
  178. Nguyen, N.T.; Matsuhashi, R. An Optimal Design on Sustainable Energy Systems for Shrimp Farms. IEEE Access 2019, 7, 165543–165558. [Google Scholar] [CrossRef]
  179. Luna, F.D.V.B.; Aguilar, E.d.l.R.; Naranjo, J.S.; Jaguey, J.G. Robotic System for Automation of Water Quality Monitoring and Feeding in Aquaculture Shadehouse. IEEE Trans. Syst. Man, Cybern. Syst. 2016, 47, 1575–1589. [Google Scholar] [CrossRef]
  180. Kim, J.; Song, S.; Kim, T.; Song, Y.-W.; Kim, S.-K.; Lee, B.-I.; Ryu, Y.; Yu, S.-C. Collaborative Vision-Based Precision Monitoring of Tiny Eel Larvae in a Water Tank. IEEE Access 2021, 9, 100801–100813. [Google Scholar] [CrossRef]
  181. M.-D. Yang, K.-S. Huang, J. Wan, H. P. Tsai, and L.-M. Lin, “Timely and quantitative damage assessment of oyster racks using UAV images,” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 11, no. 8, pp.2862-2868, Aug. 2018.
  182. J. O. Gaya, L. T. Goncalves, A. C. Duarte, B. Zanchetta, P. Drews, and S. S. C. Botelho, “Vision-based obstacle avoidance using deep learning,” in Proc. of XIII Latin American Robotics Symposium and IV Brazilian Robotics Symposium (LARS/SBR), Recife, Brazil, pp.7-12, Oct. 2016.
  183. Tun, T.T.; Huang, L.; Preece, M.A. Development and High-Fidelity Simulation of Trajectory Tracking Control Schemes of a UUV for Fish Net-Pen Visual Inspection in Offshore Aquaculture. IEEE Access 2023, 11, 135764–135787. [Google Scholar] [CrossRef]
  184. Xanthidis, M.; Skaldebø, M.; Haugaløkken, B.; Evjemo, L.; Alexis, K.; Kelasidi, E. ResiVis: A Holistic Underwater Motion Planning Approach for Robust Active Perception Under Uncertainties. IEEE Robot. Autom. Lett. 2024, 9, 9391–9398. [Google Scholar] [CrossRef]
  185. Ouyang, B.; Wills, P.S.; Tang, Y.; Hallstrom, J.O.; Su, T.-C.; Namuduri, K.; Mukherjee, S.; Rodriguez-Labra, J.I.; Li, Y.; Ouden, C.J.D. Initial Development of the Hybrid Aerial Underwater Robotic System (HAUCS): Internet of Things (IoT) for Aquaculture Farms. IEEE Internet Things J. 2021, 8, 14013–14027. [Google Scholar] [CrossRef]
  186. Lin, F.-S.; Yang, P.-W.; Tai, S.-K.; Wu, C.-H.; Lin, J.-L.; Huang, C.-H. A Machine-Learning-Based Ultrasonic System for Monitoring White Shrimps. IEEE Sensors J. 2023, 23, 23846–23855. [Google Scholar] [CrossRef]
  187. L. Xu, H. Yu, H. Qin, et al., “Digital twin for aquaponics factory: Analysis, opportunities, and research challenges,” IEEE Trans. on Industrial Informatics, vol. 20, no. 4, pp.5060-5073, Apr. 2024.
  188. Misimi, E.; Øye, E.R.; Sture, Ø.; Mathiassen, J.R. Robust classification approach for segmentation of blood defects in cod fillets based on deep convolutional neural networks and support vector machines and calculation of gripper vectors for robotic processing. Comput. Electron. Agric. 2017, 139, 138–152. [Google Scholar] [CrossRef]
  189. Chen, W.; Li, X. Deep-learning-based marine aquaculture zone extractions from dual-polarimetric SAR imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2024, PP, 1–16. [Google Scholar] [CrossRef]
  190. Lu, Y.; Zhao, Y.; Yang, M.; Zhao, Y.; Huang, L.; Cui, B. BI²Net: Graph-Based Boundary–Interior Interaction Network for Raft Aquaculture Area Extraction From Remote Sensing Images. IEEE Geosci. Remote. Sens. Lett. 2024, 21, 1–5. [Google Scholar] [CrossRef]
  191. Zhang, X.; Ma, S.; Su, C.; Shang, Y.; Wang, T.; Yin, J. Coastal Oyster Aquaculture Area Extraction and Nutrient Loading Estimation Using a GF-2 Satellite Image. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2020, 13, 4934–4946. [Google Scholar] [CrossRef]
  192. Zhang, Y.; Yamamoto, M.; Suzuki, G.; Shioya, H. Collaborative Forecasting and Analysis of Fish Catch in Hokkaido From Multiple Scales by Using Neural Network and ARIMA Model. IEEE Access 2022, 10, 7823–7833. [Google Scholar] [CrossRef]
  193. Kristmundsson, J.; Patursson, Ø.; Potter, J.; Xin, Q. Fish Monitoring in Aquaculture Using Multibeam Echosounders and Machine Learning. IEEE Access 2023, 11, 108306–108316. [Google Scholar] [CrossRef]
  194. Cheng, W.K.; Khor, J.C.; Liew, W.Z.; Bea, K.T.; Chen, Y.L. Integration of Federated Learning and Edge-Cloud Platform for Precision Aquaculture. IEEE Access 2024, PP, 1–1. [Google Scholar] [CrossRef]
  195. M. A. Abid, M. Amjad, K. Munir, H. U. R. Siddique, and A. D. Jurcut, “IoT-based smart Biofloc monitoring system for fish farming using machine learning,” IEEE Access, vol. 12, pp.86333-86345, Jun. 2024.
  196. Y. Han, J. Huang, F. Ling, et al., “Dynamic mapping of inland freshwater aquaculture areas in Jianghan plain, China,” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 16, pp.4349-4361, Apr. 2023.
  197. Singh, S.; Ahmad, S.; Amrr, S.M.; Khan, S.A.; Islam, N.; Gari, A.A.; Algethami, A.A. Modeling and Control Design for an Autonomous Underwater Vehicle Based on Atlantic Salmon Fish. IEEE Access 2022, 10, 97586–97599. [Google Scholar] [CrossRef]
  198. Dyrstad, J.S.; Mathiassen, J.R. Grasping virtual fish: A step towards robotic deep learning from demonstration in virtual reality. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO); pp. 1181–1187.
  199. Zhong, Y.; Chen, Y.; Wang, C.; Wang, Q.; Yang, J. Research on Target Tracking for Robotic Fish Based on Low-Cost Scarce Sensing Information Fusion. IEEE Robot. Autom. Lett. 2022, 7, 6044–6051. [Google Scholar] [CrossRef]
  200. Milich, M.; Drimer, N. Design and Analysis of an Innovative Concept for Submerging Open-Sea Aquaculture System. IEEE J. Ocean. Eng. 2018, 44, 707–718. [Google Scholar] [CrossRef]
  201. Yang, R.-Y.; Tang, H.-J.; Huang, C.-C. Numerical Modeling of the Mooring System Failure of an Aquaculture Net Cage System Under Waves and Currents. IEEE J. Ocean. Eng. 2020, 45, 1396–1410. [Google Scholar] [CrossRef]
  202. Liu, X.; Zhu, H.; Song, W.; Wang, J.; Yan, L.; Wang, K. Research on Improved VGG-16 Model Based on Transfer Learning for Acoustic Image Recognition of Underwater Search and Rescue Targets. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2024, 17, 18112–18128. [Google Scholar] [CrossRef]
  203. Kuang, L.; Shi, P.; Hua, C.; Chen, B.; Zhu, H. An Enhanced Extreme Learning Machine for Dissolved Oxygen Prediction in Wireless Sensor Networks. IEEE Access 2020, 8, 198730–198739. [Google Scholar] [CrossRef]
  204. M. Singh, K. S. Sahoo, and A. H. Gandomi, “An intelligent-IoT-based data analytics for freshwater recirculating aquaculture system,” IEEE Internet of Things Journal, vol. 11, no. 3, pp.4206-4217, Feb. 2024.
  205. Cao, S.; Zhou, L.; Zhang, Z. Corrections to “Prediction of Dissolved Oxygen Content in Aquaculture Based on Clustering and Improved ELM”. IEEE Access 2021, 9, 135508–135512. [Google Scholar] [CrossRef]
  206. M. Mendieta and D. Romero, “A cross-modal transfer approach for histological images: A case study in aquaculture for disease identification using Zero-Shot learning,” in Proc. of IEEE Second Ecuador Technical Chapters Meeting (ETCM), Salinas, Ecuador, pp.1-6, Oct. 2017.
  207. Waleed, A.; Medhat, H.; Esmail, M.; Osama, K.; Samy, R.; Ghanim, T.M. Automatic Recognition of Fish Diseases in Fish Farms. In Proceedings of the 2019 14th International Conference on Computer Engineering and Systems (ICCES), Egypt; pp. 201–206.
  208. Wambura, S.; Li, H. Deep and Confident Image Analysis for Disease Detection. In Proceedings of VSIP '20: 2020 2nd International Conference on Video, Signal and Image Processing, Indonesia; pp. 91–99.
  209. S. Shephard, D. G. Reid, H. D. Gerritsen, and K. D. Farnsworth, “Estimating biomass, fishing mortality, and “total allowable discards” for surveyed non-target fish,” ICES Journal of Marine Science, vol. 72, no. 2, pp.458-466, Aug. 2014.
  210. Han, F.; Zhu, J.; Liu, B.; Zhang, B.; Xie, F. Fish Shoals Behavior Detection Based on Convolutional Neural Network and Spatiotemporal Information. IEEE Access 2020, 8, 126907–126926. [Google Scholar] [CrossRef]
  211. Chicchon, M.; Bedon, H.; Del-Blanco, C.R.; Sipiran, I. Semantic Segmentation of Fish and Underwater Environments Using Deep Convolutional Neural Networks and Learned Active Contours. IEEE Access 2023, 11, 33652–33665. [Google Scholar] [CrossRef]
  212. Pravin, S.C.; Rohith, G.; Kiruthika, V.; Manikandan, E.; Methelesh, S.; Manoj, A. Underwater Animal Identification and Classification Using a Hybrid Classical-Quantum Algorithm. IEEE Access 2023, 11, 141902–141914. [Google Scholar] [CrossRef]
  213. Huang, T.-W.; Hwang, J.-N.; Romain, S.; Wallace, F. Fish Tracking and Segmentation From Stereo Videos on the Wild Sea Surface for Electronic Monitoring of Rail Fishing. IEEE Trans. Circuits Syst. Video Technol. 2019, 29, 3146–3158. [Google Scholar] [CrossRef]
  214. Ren, W.; Wang, X.; Tian, J.; Tang, Y.; Chan, A.B. Tracking-by-Counting: Using Network Flows on Crowd Density Maps for Tracking Multiple Targets. IEEE Trans. Image Process. 2020, 30, 1439–1452. [Google Scholar] [CrossRef]
  215. Ubina, N.A.; Lan, H.-Y.; Cheng, S.-C.; Chang, C.-C.; Lin, S.-S.; Zhang, K.-X.; Lu, H.-Y.; Cheng, C.-Y.; Hsieh, Y.-Z. Digital twin-based intelligent fish farming with Artificial Intelligence Internet of Things (AIoT). Smart Agric. Technol. 2023, 5. [Google Scholar] [CrossRef]
Figure 1. Conceptual framework of AIoT in aquaculture.
Figure 2. AIoT pipeline in aquaculture.
Figure 3. Applications of AIoT in aquaculture.
Table 1. Summary of related studies.
Application | Dataset | AI Algorithms | Studies | Results | Limitations | Future Directions
Smart Feeding Systems | Fish activity data, water quality parameters, and environmental factors from various sources | CNN, ResNet, OSELM, SVM, RNN, Gradient Boosting, Naive Bayes | [8,23,25,54,55,56,57,58,59,60,61,62,63,64,188] | Improved feeding efficiency, reduced waste, enhanced fish growth rates | Specificity to controlled environments and limited species diversity | Integrating more species, real-time IoT-enabled feeding systems, edge computing
Water quality management | Water quality metrics (DO, pH, temperature, turbidity), historical data, environmental sensors, and WSNs collected across various aquaculture and reservoir environments | LSTM, ANN, SVM, CNN, LightGBM, PSO, Gradient Boosting, PCA-LSTM, SAE-LSTM | [13,16,18,26,27,28,40,41,43,45,46,47,48,65,66,67,68,69,70,71,72,73,74,75,76,77,79,80,81,82,176,203,204,205] | High accuracy in water quality prediction, improved sustainability measures | Variability in generalizability across aquaculture environments | Real-time monitoring, scaling for larger environments, incorporating additional water quality factors
Disease detection, classification, and prevention | Fish images, histological images, water quality data, and environmental factors collected across aquaculture sites | CNN, ResNet, SVM, AlexNet, Random Forest, Transfer Learning, Decision Trees | [2,7,21,29,30,84,85,86,88,89,90,91,92,93,94,95,96,97,98,206,207,208] | High classification accuracy, effective early disease intervention | Limited disease types covered in studies and reliance on controlled images | Expanding datasets for varied diseases, development of mobile applications for real-time diagnosis
Fish Biomass Estimation | High-resolution images, stereo vision, sonar imaging, and underwater acoustic signals collected in various fish farming environments | YOLOv5, Mask R-CNN, DL-YOLO, 3D reconstruction, LAR, SfM | [99,100,101,102,103,104,105,106,107,108,109,209] | High biomass estimation accuracy, aiding in resource management and sustainable production | Susceptible to environmental variables (e.g., water clarity, lighting) | Real-time monitoring, broader species adaptability, integrating IoT for continuous tracking
Fish Behavior Detection | Video data, echogram data, and telemetry data from controlled and natural aquaculture environments | ResNeXt3×1D, LeNet-5, EchoBERT, FR-CNN, deformable models, DQN, CNN, RNN | [50,51,110,111,112,113,114,115,116,210] | Effective identification of abnormal behaviors and collective behavior patterns | Dependent on environmental conditions like lighting and noise | Real-time monitoring, expanding behavior categories, edge computing deployment
Counting Aquaculture Organisms | Images, videos, and acoustics for various fish, shrimp larvae, fry, and holothurian populations collected across aquaculture sites | YOLOv2, YOLOv5, Faster R-CNN, Multi-Scale CNN, MCNN, SGDAN, ShrimpCountNet, GAN | [6,34,117,118,119,120,121,122,123,124] | High counting accuracy, fast execution time, practical for commercial applications | Limited accuracy in extremely high-density populations and dependence on specific environmental conditions | Optimizing for diverse aquaculture species, reducing hardware requirements for mobile deployment
Segmentation, detection, and classification of aquaculture species | SAR images, echograms, and underwater images and videos from secondary sources like GaoFen-3, NOAA, and Fish4Knowledge, as well as primary datasets | YOLOv5, ANN, Mask R-CNN, Cascade R-CNN, SegNet, SVM, EfficientNet, U-Net, CNN, Transformer-based models, SAEs–LR | [10,11,14,31,32,33,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,211,212] | High segmentation and classification accuracy, effective monitoring and species recognition | Limited generalization to broader species and varied underwater conditions | Expanding datasets and models for new species and environmental settings, optimizing models for mobile use
Breeding and Growth Estimation | Diverse datasets including cobia images, oyster valve activity, fish length and weight data, and simulated growth data | YOLOv8, OpenPose, FHRCNN, CNN, LSTM, stereo vision, ResNetv2 | [12,32,33,146,147,148,149,150,151,152,153,154,155,156,157] [128,136,138,141] | High accuracy in growth and breeding stage predictions, non-invasive measurement approaches | Limited adaptability to varying environmental conditions, accuracy affected by underwater factors | Expanding to broader species, integrating real-time sensors, and refining models for environmental robustness
Fish Tracking and Individuality | Diverse datasets including Atlantic salmon irises, sonar recordings, Fish4Knowledge, and in situ videos from tidal and aquaculture environments | CNN, Kalman Filter, YOLO-CTS, Deep-SORT, DFTNet, GMM, GCVC, DMK, ARIMA | [158,159,160,161,162,163,164,165,166,167,168,169,213,214] | High tracking and identification accuracy, effective behavior analysis in complex environments | Limited long-term identification stability for iris-based systems, impacted by environmental noise | Enhancing biometric stability, integrating sensory inputs (e.g., sonar), refining detection for complex scenes
Automation and Robotics | Datasets from primary and secondary sources, including satellite images (Sentinel, GF-1, GF-2), underwater video, sensor data from aquaculture sites, and simulation data for autonomous systems | YOLOv5, Mask R-CNN, LSTM, Digital Twins, RNN, CNN, Transformer-based models, multi-agent RL, image processing, federated learning, genetic optimization, 3DCNN | [10,15,35,66,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,189,190,191,192,193,194,195,196,197,198,199,200,201,202] | Improved monitoring, management, and operational efficiency in aquaculture through autonomous systems and robotic solutions | Limited robustness in complex or variable environmental conditions, reliance on robust connectivity and energy | Enhancing environmental adaptability, expanding capabilities to new species and settings, integrating multi-sourced data and IoT
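To make the water-quality prediction workflow summarized in Table 1 concrete (a window of recent sensor readings in, the next reading out), the sketch below trains a one-step-ahead dissolved-oxygen (DO) forecaster with a small LSTM. This is a minimal illustrative sketch only: it assumes PyTorch is available, and the model name (DOForecaster), window length, feature set, and synthetic data are hypothetical placeholders rather than the configuration of any study cited above.

```python
# Minimal sketch of an LSTM-based dissolved-oxygen forecaster in the spirit of the
# water-quality models summarized in Table 1. All names, sizes, and data below are
# illustrative placeholders, not the setup of any reviewed study.
import torch
import torch.nn as nn

class DOForecaster(nn.Module):
    def __init__(self, n_features: int = 3, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict the next DO value

# Synthetic stand-in for IoT sensor windows: temperature, pH, DO over 24 time steps.
x = torch.randn(64, 24, 3)
y = torch.randn(64, 1)

model = DOForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                     # tiny training loop for illustration only
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```

In a deployed AIoT pipeline, the synthetic tensors would be replaced by windows of readings streamed from pond or cage sensors, and the trained model would run on an edge gateway or cloud service to trigger aeration or alerting decisions.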
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.