
Application of Convolutional Neural Networks in Weed Detection and Identification: A Systematic Review.

Submitted: 15 January 2024
Posted: 16 January 2024

Abstract
Weeds are unwanted and invasive plants that proliferate and compete for resources such as space, water, nutrients, and sunlight, affecting the quality and productivity of the desired crops. Weed detection is crucial for the application of precision agriculture methods, and for this purpose machine learning techniques can be used, specifically convolutional neural networks (CNNs). This study focuses on the CNN architectures and technologies used to detect and identify weeds in different crops; 61 articles applying CNN architectures, published in the last five years (2019-2023), were analyzed. The results show the use of different devices to acquire the images for training, such as digital cameras, smartphones, and drone cameras. Additionally, the YOLO family of algorithms is the most widely adopted architecture, followed by VGG, ResNet, Faster R-CNN, AlexNet, and MobileNet, respectively. This study provides an update on CNNs that will serve as a starting point for researchers wishing to implement these weed detection and identification techniques.
Keywords: 
Subject: Engineering - Bioengineering

1. Introduction

According to the United Nations, the world population is estimated to reach 9.7 billion inhabitants by 2050 [1]. Against this backdrop, facing the challenge of feeding this growing population with quality and sustainable products becomes an imperative task. Increasing crop productivity emerges as a measure to address this need. Thus, a strategy that contributes to improving productivity is properly managing weeds, given their direct impact on crop yields. Integrated weed management is essential to preserve agricultural productivity [2]. Plants considered weeds are fast-growing and actively compete for vital resources such as space, water, nutrients, and sunlight. This competition not only affects resource availability but also has a negative impact on crop yield and quality [3]. According to [4], damage due to weeds can represent up to 42% of agricultural production.
Currently, diverse weeding techniques are used. Pre- and post-emergence herbicides not only generate environmental impacts but also affect the health of the workers who apply them [5]. Mechanical weeding, whether mechanized or manual, is not always as effective as desired, depending on the weeds' stage of development [5]. Other weeding alternatives are still under development, or their feasibility has not been fully demonstrated; examples include physical weeding using plastic covers [5] and microbiological weeding involving microorganisms [6]. Traditional weeding methods thus present environmental challenges or economic disadvantages, creating the need to explore innovative solutions based on new technologies to increase treatment efficiency. For instance, precision weeding uses image sensors and computational algorithms to apply herbicides only where weeds are identified [7].
Computational algorithms based on Deep Learning (DL) techniques [8] are used to improve the accuracy of weed detection. DL is an advanced branch of machine learning that uses multi-layered artificial neural networks to model and process complex data, such as digital images. Within pattern recognition from digital images, the application of neural networks has evolved, giving rise to architectures specific to computer vision tasks, such as Convolutional Neural Networks (CNNs). CNNs efficiently detect spatial patterns in digital images by using convolution layers that apply filters to local regions of the input image [9]. These convolution layers allow the network to automatically learn hierarchical and complex features, such as edges, textures, and shapes, instead of relying on predefined features. The basic structure of a CNN model consists of three layers (Figure 1): a convolutional layer, a pooling layer, and a fully connected layer [10], as illustrated in the sketch after this list.
  • The convolutional layer extracts features from the image using mathematical filters; the features can be edges, corners, or alignment patterns, producing as output a feature map that serves as input to the next layer.
  • The pooling layer reduces the resolution by shrinking the dimensions of the feature map in order to minimize the computational cost.
  • In the fully connected layer, the features obtained from the previous layers are passed to a fully connected neural network, whose activation function is used to recognize the final image.
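As a concrete illustration of this three-layer structure, the following is a minimal sketch written in PyTorch; the layer sizes, input resolution, and number of classes are illustrative assumptions, not taken from any reviewed study.

```python
# Minimal sketch of the basic CNN structure: convolution -> pooling ->
# fully connected layer. All sizes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleWeedCNN(nn.Module):
    def __init__(self, num_classes: int = 2):  # e.g., crop vs. weed
        super().__init__()
        # Convolutional layer: learns filters that produce feature maps
        self.conv = nn.Conv2d(in_channels=3, out_channels=16,
                              kernel_size=3, padding=1)
        # Pooling layer: halves the spatial resolution of the feature map
        self.pool = nn.MaxPool2d(kernel_size=2)
        # Fully connected layer: maps pooled features to class scores
        self.fc = nn.Linear(16 * 32 * 32, num_classes)  # for 64x64 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.pool(F.relu(self.conv(x)))  # convolution + ReLU + pooling
        x = torch.flatten(x, start_dim=1)    # flatten feature maps per image
        return self.fc(x)                    # raw class scores (logits)

# A batch of four 64x64 RGB images yields a (4, num_classes) score tensor.
scores = SimpleWeedCNN()(torch.randn(4, 3, 64, 64))
print(scores.shape)  # torch.Size([4, 2])
```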
The use of DL-based algorithms in agriculture has increased in recent years, and several works compile the applications of these algorithms [11]. One of the most interesting applications is weed detection using CNNs, which allow rapid weed detection, localization, and recognition [12, 8]. These methods, supported by large-scale datasets, have demonstrated high robustness against biological variability and diverse imaging conditions [13], reaching highly accurate classification or detection [14, 15], which makes it possible to automate weeding processes accurately and efficiently [7]. However, weed detection faces several problems in practice, such as similarities in colors, textures, and shapes, occlusion effects, and variations in illumination; to overcome these limitations, both traditional and CNN-based machine vision offer effective solutions [16]. Training a new CNN model from scratch, however, requires a large amount of data and computing equipment with high processing capabilities. To reduce these limitations, the transfer learning (TL) technique is adopted: a pre-trained model is reused, with some modifications, to solve the specific problem [17, 13] (a minimal code sketch of this approach is shown after the architecture list below). TL aims to transfer knowledge from a source domain to a target application, improving its learning performance [18]. One example of a CNN often used for TL is AlexNet, which was trained on the ImageNet dataset [19]. In agriculture, the TL approach has been implemented for weed detection and classification, helping to minimize the need for large-scale image data collection and reducing the computational cost of training a new CNN model [20, 21, 22]. In addition to TL, generative adversarial network (GAN) techniques have been applied to generate artificial images that augment the training set used with TL [23]. The evolution of CNNs has been marked by significant advances in architectures, training techniques, efficiency, and applications, driven by the need for fast solutions in image analysis; some of the CNN architectures most used in weed detection are described below:
  • AlexNet: Developed by [19] in 2012, it won the ImageNet Large Scale Visual Recognition Challenge (ILSVRC), demonstrating the impact of CNNs in computer vision.
  • GoogLeNet (Inception): Developed in 2014 by the Google Research team, it introduced the concept of Inception modules, which apply multiple filter sizes in parallel [24].
  • VGG: Developed in 2014 by the Visual Graphics Group (VGG) at Oxford University, it is known for its simplicity and depth [25]. Its structure of pure convolutional layers and deep subsampling influenced the design of later architectures, and it has had several improved versions.
  • ResNet: Developed by [26] in 2015, this architecture introduced residual blocks, which allow the training of very deep networks by facilitating information flow and mitigating the vanishing gradient problem.
  • Fast R-CNN: Presented in 2015 as a significant improvement over its predecessor, the R-CNN (Region-Based Convolutional Neural Network) model. It was developed by the Microsoft Research group to address the speed and computational efficiency limitations of R-CNN, providing a faster and more practical solution for object detection in images [27].
  • DenseNet: Proposed in 2017 by [28], it is notable for its densely connected structure, where each layer is directly connected to all subsequent layers. This dense connectivity can improve information flow and mitigate the vanishing gradient problem. It has influenced the design of subsequent architectures and continues to be a popular choice in research and practical computer vision implementations.
  • MobileNet: Proposed in 2017 by [29], it is specially designed for implementations on mobile devices and uses lightweight and efficient operations to balance performance and resource consumption.
  • YOLO (You Only Look Once): Developed in 2016 by [30], it is a fast and efficient object detection architecture because it treats detection as a single regression problem rather than running a separate classification for each region. This efficiency has led to several versions, from YOLOv1 to YOLOv8 in 2023. The fifth version, YOLOv5, released in 2020, was built on PyTorch [31] and maintains the original YOLO approach of dividing the image into a grid and predicting bounding boxes with class probabilities for each cell; its overall architecture includes convolutional layers, attention layers, and other modern techniques. It is important to note that this version was developed by the Ultralytics team, not by the original authors. In 2022, the YOLOv6 and YOLOv7 versions introduced improvements in architecture and training scheme, increasing object detection accuracy without raising the cost of inference through a concept known as a "trainable bag of freebies" [32]. Finally, YOLOv8, presented in 2023, adds new features and improvements in performance, flexibility, and efficiency, including support for detection, segmentation, pose estimation, tracking, and classification [33].
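As a concrete illustration of the TL approach discussed above, the following is a minimal sketch assuming PyTorch and torchvision: an ImageNet-pretrained ResNet-50 is frozen and only a newly added classification head is trained. The number of weed classes and the dummy training batch are hypothetical placeholders, not taken from any reviewed study.

```python
# Minimal transfer-learning sketch: reuse an ImageNet-pretrained CNN and
# retrain only a new classification head for a (hypothetical) weed task.
import torch
import torch.nn as nn
from torchvision import models

num_weed_classes = 5  # illustrative assumption

# Load a ResNet-50 pre-trained on ImageNet (the source domain)
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)

# Freeze the pre-trained feature extractor so its weights are reused as-is
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer for the target weed task
model.fc = nn.Linear(model.fc.in_features, num_weed_classes)

# Only the new head's parameters are optimized
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB images
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_weed_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Because only the small head is trained, this approach needs far fewer labelled images and training hours than training a full CNN from scratch, which is the advantage reported for TL in the weed-detection literature [17, 20].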
This study reviews the latest research on the detection and identification of weeds using CNN techniques, with the aim of providing a starting point for researchers interested in implementing them. The review therefore focuses on analysing the existing CNN architectures and the technologies used for weed detection.

2. Methods

In this study, a systematic review was carried out to identify and analyze scientific literature published on weed detection using CNN. The guidelines of the PRISMA statement [34] were followed in this review.

2.1. Research Question and Review Objectives

  • Research question: What are the Deep Learning techniques based on Convolutional Neural Networks used for weed detection in agriculture?
  • Main Objective: To identify the Deep Learning techniques that are employed in Convolutional Neural Networks for weed detection.
  • Specific Objectives:
  • To investigate the architectures of Convolutional Neural Networks employed in weed detection.
  • To identify the technology used for weed identification in different forms of production.

2.2. Sources of Information

The systematic search was carried out using the scientific resources platform of the Spanish Foundation for Science and Technology (FECYT), which provides access to the Web of Science and Scopus databases.

2.3. Search for Keywords

A preliminary search was carried out to establish the relevant terms for the systematic search. For this, the Scopus database was queried with the words "weeds detection deep learning" using "search within all fields" (ALL(weeds AND detection AND deep AND learning)), obtaining 6096 results. Using the "Filter by keyword" option, the five most used keywords and their number of matches were: "Deep Learning" (1800), "Machine Learning" (988), "Remote Sensing" (704), "Crops" (647), and "Convolutional Neural Networks" (638). This initial search covered the topics of importance for this systematic review, and "Weed detection," "Deep Learning," and "Convolutional Neural Networks" were taken as keywords for the search.

2.4. Inclusion and Exclusion Criteria

The following inclusion and exclusion criteria were used and implemented through the filters of each database:
  1. The search field was selected to direct the search through titles, abstracts, and keywords; this is specific to each database:
  • In Scopus, "search within Article title, Abstract, Keywords" was established.
  • In Web of Science, the search was established in "Topic," which includes title, abstract, author keywords, and Keywords Plus.
  2. The date range of the search is the last five years, from 2019 to 2023.
  3. Document type: "Article."
  4. Reviews, book chapters, narrative articles, conference or congress papers, unofficial notes or communications, and studies from other areas (social, human, biological, chemical, legislative, and economic) are excluded.
  5. Language: English.

2.5. Search Equations in Bibliographic Databases

The search equation restrictively connects all the results containing the keywords "weed detection" AND "deep learning" AND "Convolutional Neural Networks." With this, the search equation is established for each platform:
  • Scopus: TITLE-ABS-KEY ("weed detection" AND "deep learning" AND "Convolutional Neural Networks")
  • WOS: TS = ("weed detection" AND "deep learning" AND "Convolutional Neural Networks")

2.6. Initial Search Results

  • Initial records: 104 from Scopus and 40 from WOS, for a total of 144.
  • Records eliminated by the exclusion criteria: 65 results were removed, leaving 79 (61 from Scopus and 18 from WOS).

2.7. Duplicates and Screening

The free-access tool Zotero 6.0.30 was used to eliminate duplicate results; 14 duplicates were removed, leaving a total of 65 results.

2.8. Additional Records

Sixteen articles identified while reading book chapters and reviews were added to the results, bringing the total to 81.

2.9. Records Excluded

A total of 20 records were excluded because they did not meet the objective of this review or could not be accessed through the scientific resources platform of the Spanish Foundation for Science and Technology (FECYT).
Finally, 61 articles meeting the established criteria were gathered and analyzed. Figure 2 illustrates the process in a flow chart following the PRISMA methodology.

3. Results

3.1. Literature Analysis

A detailed analysis of the 61 bibliographic articles was conducted. The analysis indicated that in the last two years there has been massive growth in the number of articles published on weed detection (Figure 3). The increasing amount of research in this area is mainly due to the development of new and more efficient CNN architectures, the increase in the processing capacity of computers, and the reduction in the price of cameras and graphics processing units (GPUs).

3.2. Technology for image acquisition

In the review of the selected articles, it was found that the authors used different types of technology to acquire the images for training and validating the CNNs: digital cameras (professional reflex-type, industrial high-speed, and low-cost cameras such as those based on Raspberry Pi boards), UAVs carrying various types of cameras, both RGB and multispectral, and smartphones with high-resolution cameras. In addition, some researchers did not acquire images and instead used free databases or images from previous works. Figure 4 shows the number of publications according to the technology used: 49.2% used digital cameras, 29.5% used UAVs as the means of acquisition, 11.5% used smartphones, and 9.8% used already-built datasets.

3.3. CNN architecture used

Table 1 summarizes relevant information extracted from the reviewed studies: author, publication year, CNN architecture used, technology, and species.
Figure 5 illustrates the frequency of use of the different CNNs for segmentation, detection, and classification of weeds. The YOLO family of algorithms, with its multiple versions, is the most applied, followed by the VGG, ResNet, and Faster R-CNN architectures, each with several versions. AlexNet and MobileNet are also commonly used.

4. Discussion

The selected papers analysed in this review show the use of a range of technologies to capture images, from multispectral cameras on UAVs to low-cost systems and smartphones available to everyone. Several articles integrated different technologies; for example, [60] used a Nikon 7000 camera to build the image dataset for training with YOLOv5 and additionally built a spraying system using Raspberry Pi cameras to distinguish dicotyledonous from monocotyledonous weeds. In [68], a multi-copter UAV (Hylio Inc., Houston, TX, United States) equipped with a Fujifilm GFX100 (100 MP) camera was used with the YOLOv4 and Faster R-CNN architectures. In [37], two cameras, a Sony Cyber-Shot and a Canon EOS Rebel T6, were used for image acquisition, and the AlexNet, GoogLeNet, and VGGNet architectures were applied to detect perennial ryegrass, dandelion (Taraxacum officinale), ground ivy (Glechoma hederacea L.), and spotted spurge (Euphorbia maculata L.).
Since digital cameras are the most used technology among the reviewed studies, it can be concluded that they are preferred for their higher image quality and speed compared to a smartphone or UAV camera. For example, [59] evaluated three cameras, a Canon T6 DSLR, an LG G6 smartphone, and a Logitech c920 webcam, and obtained the best detection results with the Canon T6. In contrast, the Logitech c920 was not suitable for weed detection, showing that SLR-type cameras are preferred for the development of mobile platforms, field carts, or robots because of their image quality and adjustment options. In the work of [65], a semi-professional Nikon P250 camera was used to develop a prototype autonomous sprayer; similarly, in [36], a field robot was designed to detect weeds in high-density crops using a 20-megapixel JAI high-speed industrial camera.
In this review, the use of UAV cameras is largely based on the integrated cameras manufactured by DJI, in the Phantom 3 and 4, Matrice 600, Spark, and Mavic versions; they are therefore limited to the performance and sensor configuration chosen by the brand. [72] acquired Phantom 3 Professional imagery at three flight altitudes (10 m, 15 m, and 30 m) and used the VGG, ResNet, and DenseNet architectures, along with the smaller ShuffleNet, MobileNet, EfficientNet, and MNASNet models, to detect Rumex obtusifolius. In [51], a Mavic Pro UAV integrated with a Parrot Sequoia camera was used, and the VGG-16 CNN was modified to detect weeds in sugar beet (Beta vulgaris subsp.).
The use of smartphones as imaging technology has increased in recent years, as can be seen in the work of [62], who used a Huawei Y7 Prime smartphone to take images in pea (Pisum sativum) and work with the Faster R-CNN ResNet 50 model. Similarly, [70] used a Xiaomi Mi 11 smartphone in bell pepper (Capsicum annuum L.) to apply AlexNet, GoogLeNet, InceptionV3, and Xception. Researchers in [67] took advantage of existing databases and image repositories, using the DeepWeeds dataset to train SSD-MobileNet, SSD-InceptionV2, Faster R-CNN, CenterNet, EfficientDet, RetinaNet, and YOLOv4 models. [77] used the agri_data dataset available on Kaggle, on Falsethistle grass and walnut (Carya illinoinensis), to train VGGNet, VGG16, VGG19, and SVM models. In summary, Table 2 presents an analysis of the articles and groups them according to the image acquisition technology and the most frequently used CNN architectures.
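As an illustration of how the YOLO-family detectors that dominate the reviewed studies are typically applied in practice, the following is a minimal inference sketch using the Ultralytics API [33]; the weights file and image path are hypothetical placeholders, and a real system would first fine-tune the model on a labelled weed dataset.

```python
# Minimal detection sketch with the Ultralytics YOLO API [33].
# "weeds_yolov8n.pt" stands for a hypothetical model fine-tuned on a
# labelled weed dataset; "field_image.jpg" is a placeholder path.
from ultralytics import YOLO

model = YOLO("weeds_yolov8n.pt")                       # load fine-tuned weights
results = model.predict("field_image.jpg", conf=0.25)  # confidence threshold

# Each result holds the predicted bounding boxes, classes, and scores
for box in results[0].boxes:
    species = model.names[int(box.cls)]  # e.g., a weed species label
    print(species, float(box.conf), box.xyxy.tolist())
```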

5. Conclusions

In this systematic review, following the guidelines of the PRISMA statement, 61 scientific articles on the detection of weeds using CNN were analysed, using the scientific resources platform of the Spanish Foundation for Science and Technology (FECYT), which has access to the following databases: Web of Science and Scopus. The review covers the last five years (2019-2023).
The YOLO family of algorithms, with its different versions, was identified as the most applied for weed segmentation, detection, and classification. The VGG architecture and its versions are the next most used, followed by ResNet and Faster R-CNN. AlexNet and MobileNet were the least used architectures.
The technologies used to acquire the images for training and validating the CNNs were also identified. Fixed digital cameras, whether reflex-type or low-cost, which allow wide configuration options and higher image quality, are the main technology applied. Cameras integrated in UAVs, either RGB or multispectral, are also used, despite their speed limitations and more restricted configurations compared to an SLR camera. Smartphones with high-resolution cameras have also been used, with the drawback of low processing speed.
In this review, it was noted that some authors used free databases or databases from previous studies to avoid image acquisition and the difficulties associated with it.
As future work, it is expected that the review will be extended to focus on the search for the most appropriate CNN architectures for weed detection and classification. Finally, it is hoped that this review article will help researchers to create new technological developments that will improve weed detection.

Author Contributions

Conceptualization, O.L.G.-N., A.C-G. and L.M.N.-G.; methodology, O.L.G.-N., A.C-G. and L.M.N.-G.; software, O.L.G.-N.; validation, O.L.G.-N. and L.M.N.-G.; formal analysis, O.L.G.-N., A.C-G. and L.M.N.-G.; investigation, O.L.G.-N. and L.M.N.-G.; resources, L.M.N.-G.; writing—original draft preparation, O.L.G.-N. and L.M.N.-G.; writing—review and editing, O.L.G.-N., A.C-G. and L.M.N.-G.; supervision, O.L.G.-N., A.C-G. and L.M.N.-G.; project administration, A.C-G. and L.M.N.-G.; funding acquisition, A.C-G. and L.M.N.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union through the Horizon Europe Program (HORIZON-CL6-2022-FARM2FORK-01) under project ‘Agro-ecological strategies for resilient farming in West Africa (CIRAWA)’. Oscar Leonardo García-Navarrete has been financed under the call for University of Valladolid predoctoral contracts, co-financed by Banco Santander.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors thank Research Group TADRUS of the Department of Agricultural and Forestry Engineering, University of Valladolid.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Population Prospects 2022 Data Sources; 2022;
  2. Rajcan, I.; Swanton, C.J. Understanding Maize–Weed Competition: Resource Competition, Light Quality and the Whole Plant. Field Crops Res 2001, 71, 139–150. [CrossRef]
  3. Iqbal, N.; Manalil, S.; Chauhan, B.S.; Adkins, S.W. Investigation of Alternate Herbicides for Effective Weed Management in Glyphosate-Tolerant Cotton. Arch Agron Soil Sci 2019, 65, 1885–1899. [CrossRef]
  4. Vilà, M.; Williamson, M.; Lonsdale, M. Competition Experiments on Alien Weeds with Crops: Lessons for Measuring Plant Invasion Impact?; 2004; Vol. 6; [CrossRef]
  5. Holt, J.S. Principles of Weed Management in Agroecosystems and Wildlands. Weed Technology 2004, 18, 1559 – 1562. [CrossRef]
  6. Liu, W.; Xu, L.; Xing, J.; Shi, L.; Gao, Z.; Yuan, Q. Research Status of Mechanical Intra-Row Weed Control in Row Crops. Journal of Agricultural Mechanization Research 2017, 33, 243–250.
  7. Liu, B.; Bruch, R. Weed Detection for Selective Spraying: A Review. Current Robotics Reports 2020, 1, 19–26. [CrossRef]
  8. Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A Survey of Deep Learning Techniques for Weed Detection from Images. Comput Electron Agric 2021, 184. [CrossRef]
  9. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of Weed Using Machine Learning Techniques: A Review—Challenges, Current and Future Potential Techniques. Journal of Plant Diseases and Protection 2022, 129, 745–768. [CrossRef]
  10. Rai, N.; Zhang, Y.; Ram, B.G.; Schumacher, L.; Yellavajjala, R.K.; Bajwa, S.; Sun, X. Applications of Deep Learning in Precision Weed Management: A Review. Comput Electron Agric 2023, 206. [CrossRef]
  11. Mahmudul Hasan, A.S.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. Weed Recognition Using Deep Learning Techniques on Class-Imbalanced Imagery. Crop Pasture Sci 2022. [CrossRef]
  12. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A Compilation of UAV Applications for Precision Agriculture. Computer Networks 2020, 172, 107148. [CrossRef]
  13. Chen, D.; Lu, Y.; Li, Z.; Young, S. Performance Evaluation of Deep Transfer Learning on Multi-Class Identification of Common Weed Species in Cotton Production Systems. Comput Electron Agric 2022, 198, 107091. [CrossRef]
  14. Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al. DeepWeeds: A Multiclass Weed Species Image Dataset for Deep Learning. Sci Rep 2019, 9, 2058. [CrossRef]
  15. Suh, H.K.; IJsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer Learning for the Classification of Sugar Beet and Volunteer Potato under Field Conditions. Biosyst Eng 2018, 174, 50–65. [CrossRef]
  16. Wu, H.; Wang, Y.; Zhao, P.; Qian, M. Small-Target Weed-Detection Model Based on YOLO-V4 with Improved Backbone and Neck Structures. Precis Agric 2023, 24, 2149–2170. [CrossRef]
  17. Subeesh, A.; Bhole, S.; Singh, K.; Chandel, N.S.; Rajwade, Y.A.; Rao, K.V.R.; Kumar, S.P.; Jat, D. Deep Convolutional Neural Network Models for Weed Detection in Polyhouse Grown Bell Peppers. Artificial Intelligence in Agriculture 2022, 6, 47–54. [CrossRef]
  18. Zhuang, F.; Qi, Z.; Duan, K.; Xi, D.; Zhu, Y.; Zhu, H.; Xiong, H.; He, Q. A Comprehensive Survey on Transfer Learning. 2019. [CrossRef]
  19. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks; 2012; [CrossRef]
  20. Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards Weeds Identification Assistance through Transfer Learning. Comput Electron Agric 2020, 171. [CrossRef]
  21. Tan, C.; Sun, F.; Kong, T.; Zhang, W.; Yang, C.; Liu, C. A Survey on Deep Transfer Learning. 2018. [CrossRef]
  22. Yan, X.; Deng, X.; Jin, J. Classification of Weed Species in the Paddy Field with DCNN-Learned Features. In Proceedings of the 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC); 2020; pp. 336–340. [CrossRef]
  23. Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Vali, E.; Fountas, S. Combining Generative Adversarial Networks and Agricultural Transfer Learning for Weeds Identification. Biosyst Eng 2021, 204, 79–89. [CrossRef]
  24. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. 2014. [CrossRef]
  25. Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. 2014. [CrossRef]
  26. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition; 2015; [CrossRef]
  27. Girshick, R. Fast R-CNN. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV); 2015; pp. 1440–1448. [CrossRef]
  28. Huang, G.; Liu, Z.; van der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. 2016. [CrossRef]
  29. Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. 2017. [CrossRef]
  30. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. 2015. [CrossRef]
  31. Jocher, G.; Chaurasia, A.; Stoken, A.; Borovec, J.; NanoCode012; Kwon, Y.; Michael, K.; TaoXie; Fang, J.; imyhxy; et al. Ultralytics/Yolov5: V7.0 - YOLOv5 SOTA Realtime Instance Segmentation 2022. [CrossRef]
  32. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. 2022. [CrossRef]
  33. Jocher, G.; Chaurasia, A.; Qiu, J. YOLO by Ultralytics 2023.
  34. Urrútia, G.; Bonfill, X. PRISMA Declaration: A Proposal to Improve the Publication of Systematic Reviews and Meta-Analyses. Med Clin (Barc) 2010, 135, 507–511. [CrossRef]
  35. Elnemr, H.A. Convolutional Neural Network Architecture for Plant Seedling Classification. International Journal of Advanced Computer Science and Applications 2019, 10, 319–325. [CrossRef]
  36. Rasti, P.; Ahmad, A.; Samiei, S.; Belin, E.; Rousseau, D. Supervised Image Classification by Scattering Transform with Application Toweed Detection in Culture Crops of High Density. Remote Sens (Basel) 2019, 11. [CrossRef]
  37. Yu, J.; Schumann, A.W.; Cao, Z.; Sharpe, S.M.; Boyd, N.S. Weed Detection in Perennial Ryegrass With Deep Learning Convolutional Neural Network. Front Plant Sci 2019, 10. [CrossRef]
  38. Asad, M.H.; Bais, A. Weed Detection in Canola Fields Using Maximum Likelihood Classification and Deep Convolutional Neural Network. Information Processing in Agriculture 2020, 7, 535–545. [CrossRef]
  39. Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep Network for Crop Row Detection in UAV Images. IEEE Access 2020, 8, 5189–5200. [CrossRef]
  40. Gao, J.; French, A.P.; Pound, M.P.; He, Y.; Pridmore, T.P.; Pieters, J.G. Deep Convolutional Neural Networks for Image-Based Convolvulus Sepium Detection in Sugar Beet Fields. Plant Methods 2020, 16. [CrossRef]
  41. Gupta, K.; Rani, R.; Bahia, N.K. Plant-Seedling Classification Using Transfer Learning-Based Deep Convolutional Neural Networks. INTERNATIONAL JOURNAL OF AGRICULTURAL AND ENVIRONMENTAL INFORMATION SYSTEMS 2020, 11, 25–40. [CrossRef]
  42. Mora-Fallas, A.; Goeau, H.; Joly, A.; Bonnet, P.; Mata-Montero, E. Instance Segmentation for Automated Weeds and Crops Detection in Farmlands. TECNOLOGIA EN MARCHA 2020, 33, 13–17. [CrossRef]
  43. Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. AgriEngineering 2020, 2, 471–488. [CrossRef]
  44. Parico, A.I.B.; Ahamed, T. An Aerial Weed Detection System for Green Onion Crops Using the You Only Look Once (YOLOv3) Deep Learning Algorithm. Engineering in Agriculture, Environment and Food 2020, 13, 42–48. [CrossRef]
  45. Sivakumar, A.N. V; Li, J.; Scott, S.; Psota, E.; Jhala, A.J.; Luck, J.D.; Shi, Y. Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid-to Late-Season Weed Detection in UAV Imagery. Remote Sens (Basel) 2020, 12. [CrossRef]
  46. Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Computer Systems Science and Engineering 2021, 42, 837–849. [CrossRef]
  47. Hennessy, P.J.; Esau, T.J.; Farooque, A.A.; Schumann, A.W.; Zaman, Q.U.; Corscadden, K.W. Hair Fescue and Sheep Sorrel Identification Using Deep Learning in Wild Blueberry Production. Remote Sens (Basel) 2021, 13, 1–18. [CrossRef]
  48. Hu, C.; Thomasson, J.A.; Bagavathiannan, M. V A Powerful Image Synthesis and Semi-Supervised Learning Pipeline for Site-Specific Weed Detection. Comput Electron Agric 2021, 190. [CrossRef]
  49. Jabir, B.; Falih, N.; Rahmani, K. Accuracy and Efficiency Comparison of Object Detection Open-Source Models. International journal of online and biomedical engineering 2021, 17, 165–184. [CrossRef]
  50. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep Learning-Based Identification System of Weeds and Crops in Strawberry and Pea Fields for a Precision Agriculture Sprayer. Precis Agric 2021, 22, 1711–1727. [CrossRef]
  51. Moazzam, S.I.; Khan, U.S.; Qureshi, W.S.; Tiwana, M.I.; Rashid, N.; Alasmary, W.S.; Iqbal, J.; Hamza, A. A Patch-Image Based Classification Approach for Detection of Weeds in Sugar Beet Crop. IEEE Access 2021, 9, 121698–121715. [CrossRef]
  52. Urmashev, B.; Buribayev, Z.; Amirgaliyeva, Z.; Ataniyazova, A.; Zhassuzak, M.; Turegali, A. Development of a Weed Detection System Using Machine Learning and Neural Network Algorithms. Eastern-European Journal of Enterprise Technologies 2021, 6. [CrossRef]
  53. Xie, S.; Hu, C.; Bagavathiannan, M.; Song, D. Toward Robotic Weed Control: Detection of Nutsedge Weed in Bermudagrass Turf Using Inaccurate and Insufficient Training Data. IEEE Robot Autom Lett 2021, 6, 7365–7372. [CrossRef]
  54. Xu, K.; Zhu, Y.; Cao, W.; Jiang, X.; Jiang, Z.; Li, S.; Ni, J. Multi-Modal Deep Learning for Weeds Detection in Wheat Field Based on RGB-D Images. Front Plant Sci 2021, 12. [CrossRef]
  55. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Rehman, A.; Abunadi, I.; Bahaj, S.A. Hybrid CNN Model for Classification of Rumex Obtusifolius in Grassland. IEEE ACCESS 2022, 10, 90940–90957. [CrossRef]
  56. Babu, V.S.; Ram, N. V Deep Residual CNN with Contrast Limited Adaptive Histogram Equalization for Weed Detection in Soybean Crops. Traitement du Signal 2022, 39, 717–722. [CrossRef]
  57. Chen, J.; Wang, H.; Zhang, H.; Luo, T.; Wei, D.; Long, T.; Wang, Z. Weed Detection in Sesame Fields Using a YOLO Model with an Enhanced Attention Mechanism and Feature Fusion. Comput Electron Agric 2022, 202. [CrossRef]
  58. G C, S.; Koparan, C.; Ahmed, M.R.; Zhang, Y.; Howatt, K.; Sun, X. A Study on Deep Learning Algorithm Performance on Weed and Crop Species Identification under Different Image Background. Artificial Intelligence in Agriculture 2022, 6, 242–256. [CrossRef]
  59. Hennessy, P.J.; Esau, T.J.; Schumann, A.W.; Zaman, Q.U.; Corscadden, K.W.; Farooque, A.A. Evaluation of Cameras and Image Distance for CNN-Based Weed Detection in Wild Blueberry. Smart Agricultural Technology 2022, 2. [CrossRef]
  60. Jabir, B.; Falih, N. Deep Learning-Based Decision Support System for Weeds Detection in Wheat Fields. International Journal of Electrical and Computer Engineering 2022, 12, 816–825. [CrossRef]
  61. Liu, S.; Jin, Y.; Ruan, Z.; Ma, Z.; Gao, R.; Su, Z. Real-Time Detection of Seedling Maize Weeds in Sustainable Agriculture. Sustainability (Switzerland) 2022, 14. [CrossRef]
  62. Mohammed, H.; Tannouche, A.; Ounejjar, Y. Weed Detection in Pea Cultivation with the Faster RCNN ResNet 50 Convolutional Neural Network. Revue d’Intelligence Artificielle 2022, 36, 13–18. [CrossRef]
  63. Nasiri, A.; Omid, M.; Taheri-Garavand, A.; Jafari, A. Deep Learning-Based Precision Agriculture through Weed Recognition in Sugar Beet Fields. Sustainable Computing: Informatics and Systems 2022, 35. [CrossRef]
  64. Razfar, N.; True, J.; Bassiouny, R.; Venkatesh, V.; Kashef, R. Weed Detection in Soybean Crops Using Custom Lightweight Deep Learning Models. J Agric Food Res 2022, 8. [CrossRef]
  65. Saboia, H. de S.; Mion, R.L.; Silveira, A. de O.; Mamiya, A.A. REAL-TIME SELECTIVE SPRAYING FOR VIOLA ROPE CONTROL IN SOYBEAN AND COTTON CROPS USING DEEP LEARNING. ENGENHARIA AGRICOLA 2022, 42. [CrossRef]
  66. Saleem, M.H.; Potgieter, J.; Arif, K.M. Weed Detection by Faster RCNN Model: An Enhanced Anchor Box Approach. Agronomy 2022, 12, 1580. [CrossRef]
  67. Saleem, M.H.; Velayudhan, K.K.; Potgieter, J.; Arif, K.M. Weed Identification by Single-Stage and Two-Stage Neural Networks: A Study on the Impact of Image Resizers and Weights Optimization Algorithms. Front Plant Sci 2022, 13. [CrossRef]
  68. Sapkota, B.B.; Hu, C.; Bagavathiannan, M. V Evaluating Cross-Applicability of Weed Detection Models Across Different Crops in Similar Production Environments. Front Plant Sci 2022, 13. [CrossRef]
  69. Sapkota, B.B.; Popescu, S.; Rajan, N.; Leon, R.G.; Reberg-Horton, C.; Mirsky, S.; Bagavathiannan, M. V Use of Synthetic Images for Training a Deep Learning Model for Weed Detection and Biomass Estimation in Cotton. Sci Rep 2022, 12. [CrossRef]
  70. Subeesh, A.; Bhole, S.; Singh, K.; Chandel, N.S.; Rajwade, Y.A.; Rao, K.V.R.; Kumar, S.P.; Jat, D. Deep Convolutional Neural Network Models for Weed Detection in Polyhouse Grown Bell Peppers. Artificial Intelligence in Agriculture 2022, 6, 47–54. [CrossRef]
  71. Tannouche, A.; Gaga, A.; Boutalline, M.; Belhouideg, S. Weeds Detection Efficiency through Different Convolutional Neural Networks Technology. International Journal of Electrical and Computer Engineering 2022, 12, 1048–1055. [CrossRef]
  72. Valente, J.; Hiremath, S.; Ariza-Sentís, M.; Doldersum, M.; Kooistra, L. Mapping of Rumex Obtusifolius in Nature Conservation Areas Using Very High Resolution UAV Imagery and Deep Learning. International Journal of Applied Earth Observation and Geoinformation 2022, 112. [CrossRef]
  73. Yang, J.; Bagavathiannan, M.; Wang, Y.; Chen, Y.; Yu, J. A Comparative Evaluation of Convolutional Neural Networks, Training Image Sizes, and Deep Learning Optimizers for Weed Detection in Alfalfa. WEED TECHNOLOGY 2022, 36, 512–522. [CrossRef]
  74. Ajayi, O.G.; Ashi, J. Effect of Varying Training Epochs of a Faster Region-Based Convolutional Neural Network on the Accuracy of an Automatic Weed Classification Scheme. Smart Agricultural Technology 2023, 3, 100128. [CrossRef]
  75. Almalky, A.M.; Ahmed, K.R. Deep Learning for Detecting and Classifying the Growth Stages of Consolida Regalis Weeds on Fields. Agronomy 2023, 13. [CrossRef]
  76. Arif, S.; Hussain, R.; Ansari, N.M.; Rauf, W. A Novel Hybrid Feature Method for Weeds Identification in the Agriculture Sector. Research in Agricultural Engineering 2023, 69, 132–142. [CrossRef]
  77. Bidve, V.; Mane, S.; Tamkhade, P.; Pakle, G. Weed Detection by Using Image Processing. Indonesian Journal of Electrical Engineering and Computer Science 2023, 30, 341–349. [CrossRef]
  78. Devi, B.S.; Sandhya, N.; Chatrapati, K.S. WeedFocusNet: A Revolutionary Approach Using the Attention-Driven ResNet152V2 Transfer Learning. International Journal on Recent and Innovation Trends in Computing and Communication 2023, 11, 334–341. [CrossRef]
  79. Fan, X.; Chai, X.; Zhou, J.; Sun, T. Deep Learning Based Weed Detection and Target Spraying Robot System at Seedling Stage of Cotton Field. Comput Electron Agric 2023, 214. [CrossRef]
  80. Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images. Remote Sens (Basel) 2023, 15, 539. [CrossRef]
  81. Husham Al-Badri, A.; Azman Ismail, N.; Al-Dulaimi, K.; Ahmed Salman, G.; Sah Hj Salam, M. Adaptive Non-Maximum Suppression for Improving Performance of Rumex Detection. Expert Syst Appl 2023, 219. [CrossRef]
  82. Janneh, L.L.; Zhang, Y.; Cui, Z.; Yang, Y. Multi-Level Feature Re-Weighted Fusion for the Semantic Segmentation of Crops and Weeds. Journal of King Saud University - Computer and Information Sciences 2023, 35, 101545. [CrossRef]
  83. Jin, X.; Liu, T.; McCullough, P.E.; Chen, Y.; Yu, J. Evaluation of Convolutional Neural Networks for Herbicide Susceptibility-Based Weed Detection in Turf. Front Plant Sci 2023, 14. [CrossRef]
  84. Kansal, I.; Khullar, V.; Verma, J.; Popli, R.; Kumar, R. IoT-Fog-Enabled Robotics-Based Robust Classification of Hazy and Normal Season Agricultural Images for Weed Detection. Paladyn 2023, 14. [CrossRef]
  85. Modi, R.U.; Kancheti, M.; Subeesh, A.; Raj, C.; Singh, A.K.; Chandel, N.S.; Dhimate, A.S.; Singh, M.K.; Singh, S. An Automated Weed Identification Framework for Sugarcane Crop: A Deep Learning Approach. CROP PROTECTION 2023, 173. [CrossRef]
  86. Moreno, H.; Gómez, A.; Altares-López, S.; Ribeiro, A.; Andújar, D. Analysis of Stable Diffusion-Derived Fake Weeds Performance for Training Convolutional Neural Networks. Comput Electron Agric 2023, 214. [CrossRef]
  87. Ong, P.; Teo, K.S.; Sia, C.K. UAV-Based Weed Detection in Chinese Cabbage Using Deep Learning. Smart Agricultural Technology 2023, 4. [CrossRef]
  88. Patel, J.; Ruparelia, A.; Tanwar, S.; Alqahtani, F.; Tolba, A.; Sharma, R.; Raboaca, M.S.; Neagu, B.C. Deep Learning-Based Model for Detection of Brinjal Weed in the Era of Precision Agriculture. Computers, Materials and Continua 2023, 77, 1281–1301. [CrossRef]
  89. Rajeena, P.P.F.; Ismail, W.N.; Ali, M.A.S. A Metaheuristic Harris Hawks Optimization Algorithm for Weed Detection Using Drone Images. APPLIED SCIENCES-BASEL 2023, 13. [CrossRef]
  90. Reddy, B.S.; Neeraja, S. An Optimal Superpixel Segmentation Based Transfer Learning Using AlexNet–SVM Model for Weed Detection. International Journal of System Assurance Engineering and Management 2023. [CrossRef]
  91. Saqib, M.A.; Aqib, M.; Tahir, M.N.; Hafeez, Y. Towards Deep Learning Based Smart Farming for Intelligent Weeds Management in Crops. Front Plant Sci 2023, 14. [CrossRef]
  92. Shahi, T.B.; Dahal, S.; Sitaula, C.; Neupane, A.; Guo, W. Deep Learning-Based Weed Detection Using UAV Images: A Comparative Study. Drones 2023, 7. [CrossRef]
  93. Singh, V.; Gourisaria, M.K.; Harshvardhan, G.M.; Choudhury, T. Weed Detection in Soybean Crop Using Deep Neural Network. Pertanika J Sci Technol 2023, 31, 401–423. [CrossRef]
  94. Yu, H.; Che, M.; Yu, H.; Ma, Y. Research on Weed Identification in Soybean Fields Based on the Lightweight Segmentation Model DCSAnet. Front Plant Sci 2023, 14. [CrossRef]
  95. Zhuang, J.; Jin, X.; Chen, Y.; Meng, W.; Wang, Y.; Yu, J.; Muthukumar, B. Drought Stress Impact on the Performance of Deep Convolutional Neural Networks for Weed Detection in Bahiagrass. Grass and Forage Science 2023, 78, 214–223. [CrossRef]
Figure 1. The basic structure of a CNN model.
Figure 2. Flow chart illustrating the number of articles included in the systematic review according to the PRISMA process.
Figure 3. Number of publications per year.
Figure 4. Number of publications per acquisition technology.
Figure 5. The frequency of use of the different CNN architectures found in this study.
Table 1. Summary of articles describing CNN architecture, technology, and species.
No. Year Reference CNN architecture Technology Species
1 2019 [35] Specifically designed CNN Camera Canon 600D uses a dataset group from Aarhus University in collaboration with Southern Denmark University. Black grass (Alopecurus myosuroides), Charlock (Sinapis arvensis), Cleavers (Galium aparine), Chickweed (Stellaria media), Common wheat (Triticum aestivum), Fat hen (Chenopodium album), Loose silky-bent (Apera spica-venti) and Maize (Zea mays), Scentless mayweed (Tripleurospermum perforatum), Shepherd's purse (Capsella bursapastoris), Small-flowered Cranesbill (Geranium pusillum), Sugar beet (Beta vulgaris)
2 2019 [36] Design-specific CNN with SVM 20 MP JAI camera with a spatial resolution of 5120 × 3840 pixels, mounted with a 35 mm lens. Mache lettuce (Valerianella locusta f.)
3 2019 [37] AlexNet, GoogLeNet and VGGNet Sony Cyber-Shot camera and a Canon EOS Rebel T6 digital camera Perennial ryegrass, dandelion (Taraxacum officinale Web.), ground ivy (Glechoma hederacea L.), and spotted spurge (Euphorbia maculata L.)
4 2020 [38] SegNet, UNet, VGG16 and ResNet-50. Nikon D610 Quad Camera Canola (Brassica napus)
5 2020 [39] CRowNet is based on SegNet and the Hough transform Parrot RedEdge -M multispectral camera Beet (Beta vulgaris) and Corn (Zea mays)
6 2020 [40] YOLOv3 and YOLOv3-Tiny Camera Nikon D7200 Sugar beet (Beta vulgaris subsp) and C. sepium
7 2020 [41] ResNet50, VGG16, VGG19, Xception and MobileNetV2. Camera Canon 600D uses the dataset group of the Aarhus University Sugar Beet, Black grass, Charlock, Cleavers, Common Chickweed, Common Wheat, Fat Hen, Loosy Silky-bent, Maize, Scentless Mayweed, Shepherd's purse and Small-flowered cranesbill
8 2020 [42] Mask R-CNN Unspecified digital cameras and smartphones Zea mays (corn), Phaseolus vulgaris (green bean), Brassica nigra, Matricaria chamomilla, Lolium perenne, and Chenopodium album
9 2020 [43] YOLOv3, Mask R-CNN, and CNN with SVM-HOG (histograms of oriented gradients) Mavic Pro with the Parrot Sequoia multispectral camera Lettuce (Lactuca sativa)
10 2020 [44] YOLO-WEED, based on YOLOv3 DJI Phantom 3 Pro Green Onion (Allium fistulosum)
11 2020 [45] Faster R-CNN and Single Shot Detector (SSD) DJI Matrice 600 pro with Zenmuse X5R camera soybean leaves (Glycine max), water hemp (Amaranthus tuberculatus), Palmer amaranthus (Amaranthus palmeri), common lamb's quarters (Chenopodium album), velvetleaf (Abutilon theophrasti) and foxtail species.
12 2021 [46] CNN-LVQ-specific design based on Learning Vector Quantization (LVQ) DJI Phantom 3 Professional Soybean (Glycine max), grass, and broadleaf weeds
13 2021 [47] YOLOv3, YOLOv3-Tiny and YOLOv3-Tiny-PRN Digital cameras with resolutions between 4,000 × 3,000 and 6,000 × 4,000 pixels, not specified Hair Fescue, Sheep Sorrel, and Blueberry (Vaccinium sect. Cyanococcus)
14 2021 [48] Faster R-CNN and ResNet50 Unspecified top digital camera Cotton (Gossypium hirsutum), morning glory (Ipomoea spp.), Palmer amaranth (Amaranthus palmeri), prostrate spurge (Euphorbia maculata) and soybean (Glycine max)
15 2021 [49] Detectron2, EfficientDet, YOLOv5, and Faster R-CNN. Camera Nikon 7000 professional Phalaris Paradoxa and Convolvulus
16 2021 [50] Faster R-CNN, ResNet-101, VGG16 and YOLOv3 DJI Spark with an onboard camera with a 1/2.3" CMOS sensor, 12 megapixels, and an 81.9° FOV 25 mm f/2.6 lens Peas (Pisum sativum), strawberries (Fragaria ananassa), and prickly grass (Eleusine indica)
17 2021 [51] VGG- Beet based on VGG16 DJI Phantom 4 Sugar beet (Beta vulgaris subsp)
18 2021 [52] YOLOv5 and classic K-Nearest Neighbors, Random Forest, and Decision Tree algorithms Uses non-specific dataset Ambrosia, Amaranthus, Bindweed, Bromus and Quinoa
19 2021 [53] Faster R-CNN and Mask R-CNN Cameras NikonTM D3300 and Canon EOS Rebel T7 Nutsedge weed and Bermudagrass Turf
20 2021 [54] Faster R-CNN and VGG16 RealSense D415 Depth Camera (99mm × 20mm × 23mm) Wheat (Triticum aestivum L.), Alopecurus aequalis, Poa annua, Bromus japonicus, E. crusgalli, Amaranthus retroflexus, and C. bursa-pastoris
21 2022 [55] VGG16, ResNet-50 and Inception-V3 Camera SONY Cybershot DSC-60 Rumex obtusifolius
22 2022 [56] AlexNet vs VGG-16 Unspecified drone camera Soybean (Glycine max), broadleaf weeds and grasses.
23 2022 [57] YOLOv4 and YOLO-sesame Daheng MER-132-43U3C-l camera, 1/3" CCD sensor, with a resolution of 1292 × 964 Sesame (Sesamum indicum)
24 2022 [58] VGG16 and ResNet16 Canon EOS T7 digital cameras Horseweed, Palmer Amaranth, Redroot Pigweed, Ragweed, Waterhemp, Canola, Kochia and Sugar beets
25 2022 [59] YOLOv3-Tiny Canon T6 DSLR 7 camera, LG G6 smartphone, and Logitech c920 camera Blueberries (Vaccinium corymbosum)
26 2022 [60] YOLOv5 Nikon D7200 Digital Single Lens Reflex (DSLR) Camera Wheat (Triticum), Monocotylodone weed (Convolvulus), and Dicotyledonous weed (Phalaris)
27 2022 [61] Faster R-CNN, SSD, YOLOv3, YOLOv3-tiny and YOLOv4-tiny phone with a 12-megapixel resolution Corn (Zea mays)
28 2022 [62] Faster R-CNN and ResNet Huawei Y7 prime digital camera phone Pea (Pisum sativum)
29 2022 [63] UNet based on ResNet50 FotoClip 2164 digital camera 480 × 640-pixel resolution Beta vulgaris subsp and weed
30 2022 [64] MobileNetV2, ResNet50 DJI Phantom 3 Professional - Uses dataset soybean (Glycine max), grass, and broadleaf weeds
31 2022 [65] YOLO-v3 and faster R-CNN Nikon P250 semi-professional camera Soybean (Glycine max), cotton (Gossypium), and Viola Rope
32 2022 [66] Faster R-CNN, ResNet-101, YOLOv4, SSD-Inception-v2, MobileNet, ResNet-50, EfficientDet and CenterNet Uses Datasets DeepWeeds Chinee Apple, Lantana, Prickly Acacia, Parthenium, Parkinsonia, Rubber vine, Siam Weed, Snake Weed
33 2022 [67] SSD- MobileNet, SSD-InceptionV2, Faster R-CNN, CenterNet, EfficientDet, RetinaNet and YOLOv4 Uses Datasets DeepWeeds Chinee Apple, Lantana, Negative, Prickly Acacia, Parthenium, Parkinsonia, Rubber vine, Siam Weed and Snake Weed
34 2022 [68] YOLOv4 and Faster R-CNN FUJIFILM GFX100 100-megapixel camera with a multi-copter drone Hylio AG-110 Cotton (Gossypium), soybean (Glycine max), morning glories (Ipomoea spp.) comprising tall morningglory (Ipomoea purpurea) and ivyleaf morningglory (Ipomoea hederacea), Texas millet (Urochloa texana), johnsongrass (Sorghum halepense), Palmer amaranth (Amaranthus palmeri), prostrate spurge (Euphorbia humistrata) and browntop panicum (Panicum fasciculatum)
35 2022 [69] Mask R-CNN and GAN FUJIFILM GFX100 100-megapixel camera with a multi-copter drone Hylio AG-110 Cotton (Gossypium) and grass weeds
36 2022 [70] Alexnet, GoogLeNet, InceptionV3, Xception Xiaomi Mi 11x mobile device camera, 48 MP, f/1.8, 26 mm (wide angle) Peppers (Capsicum annuum)
37 2022 [71] VGGNet (16 and 19), GoogLeNet (Inception V3 and V4) and MobileNet (V1 and V2) Unspecified tractor-mounted digital cameras Bean (Phaseolus vulgaris) and Beet (Beta vulgaris)
38 2022 [72] VGG, Resnet, DenseNet, ShuffleNet, MobileNet, EfficientNet and MNASNet DJI Phantom 3 Professional Rumex obtusifolius
39 2022 [73] AlexNet, GoogLeNet, VGGNet and ResNet Panasonic DMC-ZS110 Digital Camera Alfalfa (Medicago sativa), broadleaf weeds and grasses
40 2023 [74] Faster R-CNN DJI Phantom 4 equipped with a 12-megapixel RGB camera Saccharum officinarum, Spinacia oleracea, Capsicum annuum, Musa paradisiaca and Weeds
41 2023 [75] YOLOv5, RetinaNet, and Faster R-CNN UAV digital camera, 1" CMOS sensor, effective pixels: 20 million, still image size 5472 × 3648 Consolida regalis
42 2023 [76] Hybrid CNN, AlexNet, GoogLeNet, VGG-Net, ResNet, and GAN Raspberry Pi 3 with a Pi Camera version 2.1 Beetroot, Rice, Siam weed, Parkinsonia, and Chinese snakeweed Manzana
43 2023 [77] VGGNet, VGG16, VGG19 and SVM Uses Datasets agri_data available on Kaggle Falsethistle grass, Walnut (Carya illinoinensis)
44 2023 [78] WeedFocusNet based on ResNet152v2 Uses Non-specific Dataset Soybean (Glycine max), broadleaf weed
45 2023 [79] Faster R-CNN and VGG16 SONY IMX386 camera and an Honor Play mobile phone Cotton thistle, purslane, solanumnigrum, sclerochloa dura, Sonchus oleraceus, salsola hill pall, chenopodium album, and convolvulus
46 2023 [80] YOLOv7, YOLOv7-m, YOLOv7-x, YOLOv7-w6, YOLOv7-d6s, YOLOv5, YOLOv4 and Faster R-CNN UAV camera not specified Sugar beet (Beta vulgaris subsp.) and weeds (Mercurialis annua)
47 2023 [81] Inception-V3, VGG-16 and ResNet-50 Monochrome Camera (PointGrey GS3-U3-23S6M-C) Uses database. Rumex obtusifolius
48 2023 [82] YOLOv3, YOLOv3-tiny, YOLOv4 and YOLOv4-tiny Camera JAI AD-130GE camera with a resolution of 1296 × 966 pixels Carrots (Daucus carota), Sugar beet (Beta vulgaris subsp), and rice seedlings (Oryza sativa)
49 2023 [83] DenseNet, EfficientNet and ResNet SONY DSC-HX1 digital camera Crabgrass, Dollargrass, Goosegrass, Old World Diamondflower, Purple Nutsedge, Tropical signalgrass, Virginia Buttongrass, and Bermuda Grass
50 2023 [84] 2D-CNN of specific design UAV not specified Soybean (Glycine max), grass, broadleaf weed
51 2023 [85] Alexnet, DarkNet53, GoogLeNet, InceptionV3, ResNet50 and Xception One Plus Nord AC2001 Phone, High Image Quality 3:4 Camera Frame in 48 MP Sony IMX586 with OIS+8 MP Sugar cane (Saccharum officinarum), male goat (Ageratum conyzoides L.), purple nutsedge (Cyperus rotundus L.), scarlet pimpernel (Anagallis arvensis L.), Lepidium didymum (Coronopus didymus L.), field creeper (Convolvulus arvensis L.), ragweed parthenium (Parthenium hysterophorus L.), prickly thistle (Sonchus asper L.), cornspur (Spergula arvensis L.) and Asian escalistem (Elytraria acaulis L.).
52 2023 [86] Yolov8l, RetinaNet, and GAN Canon PowerShot SX540 HS integrating a 20.3-megapixel CMOS Solanum lycopersicum, Solanum nigrum L.; Portulaca oleracea L. and Setaria Verticillata L.
53 2023 [87] AlexNet and a specifically designed CNN-RF X10 drone equipped with a 20-megapixel resolution camera Chinese cabbage (Brassica rapa)
54 2023 [88] ResNet-18, YOLOv3, CenterNet, and Faster R-CNN phone 48 M.P. resolution Eggplant (Solanum melongena)
55 2023 [89] DenseHHO is based on Harris Hawk (HHO), DenseNet-121, and DenseNet-201 optimization algorithms. Unspecified drone camera Wheat (Triticum aestivum L.), Rumex crispus and Rumex obtusifolius
56 2023 [90] AlexNet and AlexNet -SVM Uses Datasets " crop and weed detection data with bounding boxes," available on Kaggle Sesame (Sesamum indicum)
57 2023 [91] YOLOv3, YOLOv3-tiny, YOLOv4 and YOLOv4-tiny Logitech HD 920c professional webcam with a resolution of 1 MP and dimensions of 1280 × 720 Creeping thistle, bindweed, and California poppy
58 2023 [92] CoFly-WeedDB based on SegNet, VGG16, ResNet50, DenseNet121, EfficientNetB0 and MobileNetV2 DJI Phantom Pro 4 Cotton (Gossypium), Johnson grass (Sorghum halepense), bindweed (Convolvulus arvensis) and purslane (Portulaca oleracea)
59 2023 [93] Xception, VGG (16, 19), ResNet (50, 101, 152, 101v2, 152v2), InceptionV3, InceptionResNetV2, MobileNet, MobileNetV2, DenseNet (121, 169, 201), NASNetMobile, NASNetLarge DJI Phantom 3 Professional Soybean leaves (Glycine max)
60 2023 [94] MobileNetV3 and ShuffleNet Huawei mate30 cell phone Soybean leaves (Glycine max), Digitaria sanguinalis L, Scop and Setaria viridis L, Beauv and broadleaf weeds such as Chenopodium glaucum L, Acalypha australis L, and Amaranthus retroflexus L.
61 2023 [95] YOLOv3, Faster R-CNN, AlexNet, GoogLeNet and VGGNet SONY DSC-HX1 digital camera at a 16:9 ratio with a resolution of 1920 × 1080 pixels Florida pusley (Richardia scabra L.) and bahiagrass (Paspalum notatum Flugge)
Table 2. Summary of CNN architecture and technology used for weed detection over the last five years.
Technology CNN architectures References
Cameras YOLO [40, 47, 49, 57, 59, 60, 65, 68, 82, 86, 91, 95]
VGG [37, 38, 41, 54, 55, 58, 71, 73, 76, 79, 81, 95]
ResNet [38, 41, 48, 55, 58, 63, 73, 76, 81, 83]
Faster R-CNN [48, 49, 53, 54, 65, 68, 79, 95]
Alexnet [37, 73, 76, 95]
MobileNet [41, 71]
Cameras UAV YOLO [43, 44, 50, 75, 80]
VGG [50, 51, 56, 72, 92, 93]
ResNet [50, 64, 72, 92, 93]
Faster R-CNN [45, 50, 74, 75, 80]
Alexnet [56, 87]
MobileNet [64, 72, 92, 93]
Smartphone YOLO [61, 88]
VGG -
ResNet [62, 85, 88]
Faster R-CNN [61, 62, 88]
Alexnet [70, 85]
MobileNet [94]
Database YOLO [52, 66, 67]
VGG [77]
ResNet [66, 78]
Faster R-CNN [66, 67]
Alexnet [90]
MobileNet [66, 67]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.