Preprint
Article

UAV-LiDAR-based 3D Mapping of Apple Orchards and Automatic Georeferencing of Segmented Apple Tree Locations


This version is not peer-reviewed

Submitted:

28 March 2023

Posted:

29 March 2023

Abstract
In this paper, we propose a system that creates high-precision maps using UAV-LiDAR and determines the location of individual fruit trees (apple trees) on those maps. A UAV was flown over an actual orchard, recording the point cloud of the onboard LiDAR together with the position information of RTK-GNSS. Automated software processes the point cloud data offline and automatically segments each apple tree in the map. The positional information obtained from RTK-GNSS was then georeferenced to the segmented trees without using ground control points. The positional accuracy of the trees was evaluated using reference positions obtained with the Quasi-Zenith Satellite System (QZSS) MICHIBIKI. The results show that the alignment accuracy is sufficient to identify individual fruit trees.
Keywords: 
Subject: Engineering - Other

1. Introduction

Precision agriculture is a technique that utilizes technologies such as Global Navigation Satellite Systems (GNSS) and Geographic Information Systems (GIS) to gather, process, and analyze spatial, temporal, and individual data on agricultural ecosystems. This information is then used to determine appropriate management strategies [1,2]. Precision agriculture has been extensively utilized in field crops such as corn, soybeans, and grains [3]. Unmanned Aerial Vehicles (UAVs) are particularly useful for time-series observations and can acquire data with high spatial resolution, which makes them highly responsive and advantageous for precision agriculture applications [4]. The analysis of data acquired from UAVs enables pest and disease management, yield estimation, weed detection, and nutritional status evaluation on a field-by-field basis [5]. Crop management using a UAV equipped with a hyperspectral camera to generate a field map divided into a grid pattern has been proposed as an example of agricultural remote sensing with UAVs [6]. This approach allows for precise identification and mapping of crop stress [7] and diseases [8] on a grid-by-grid basis, ultimately resulting in improved crop management. Duan et al. showed that by linking geographic information with wheat fields and dividing the fields into grids, the variation in NDVI over the growing season can be captured, enabling yield prediction in each divided area [6]. Although precision agriculture has been extensively used in field crops such as corn, soybeans, and grains, it has not been widely adopted in horticultural crops such as berries, field vegetables, and orchards [3]. In horticulture, quality analysis is more significant than in other crops. In addition, individual treatments may be applied to individual plants according to spatial or temporal patterns [9].
When fertilizers and pesticides are applied in variable amounts to field crops such as wheat and rice, it is appropriate to manage them on a map delimited by a grid. An example of the urgency of identifying individual plants is Panama disease of bananas. Panama disease can be detected by hyperspectral cameras, and the diseased trees should be immediately sprayed with pesticides directly or cut down [10]. To achieve this concept, regular crop disease monitoring and identification of individual trees are crucial, and an accurate mapping system that incorporates geographic location information for each tree is essential. A proposed method to improve the accuracy of variable fertilizer application in horticultural agriculture involves generating a highly accurate prescription map of nitrogen fertilizer application by superimposing an NDVI map with aerial images that identify individual trees in an olive grove and account for differences in nitrogen content per square meter [11]. It has been observed that horticultural agriculture necessitates linking geographical data to individual crops with an accuracy of approximately 1 m for diverse applications, such as crop yield prediction, quality mapping, and product sampling [9].
This paper aims to produce maps of plantation crops, such as palm trees, bananas, and rubber, and to provide location information for each crop automatically with a precision of approximately 1 m. The proposed method uses UAV-LiDAR to create a three-dimensional map of a farm field, extracts trees from the automatically generated map, and georeferences the extracted trees. To validate the effectiveness of the proposed method, experiments were conducted in an apple orchard, which also belongs to the field of horticultural agriculture. Furthermore, the paper aims to construct a system that can be easily adjusted for large plantations by tweaking its parameters.
Photogrammetry combined with SfM (Structure-from-Motion) processing is the most economical way to generate maps for UAV remote sensing, and georeferencing, which relates the local coordinate system of the SfM-generated map to the geographic coordinate system, is mostly done manually using ground control points (GCPs) [12,13,14]. However, establishing ground control points can be burdensome and unsuitable for the rapid and efficient mapping of large areas such as plantations. Thus, research is being conducted on direct georeferencing, a georeferencing method that relies solely on acquired data without ground control points, to overcome this burden and inefficiency [15,16]. It has been shown that SfM with RTK-GNSS-equipped UAVs can produce maps that are horizontally accurate to a few centimeters without ground control points [16]. However, due to the standard deviation of camera projection points, a bias in the vertical component can occur, and at least one GCP is required to cancel this bias [17]. The acquired image data may also be constrained, because appropriate flight paths must be set to improve the quality of the map produced by SfM [18,19]. On the other hand, LiDAR, which can directly acquire accurate 3D information, has attracted attention in the field of agricultural remote sensing, and the quality of data acquired from UAV-LiDAR has been studied [20]. Štroner et al. aligned point clouds obtained from the inexpensive DJI ZENMUSE L1 scanner mounted on the UAV DJI Matrice 300 with maps created using DJI Terra and ground control points, and found that the point clouds in the created maps had errors within 3.5 cm in all directions [21]. Jóźków et al. evaluated the accuracy of maps created with the Velodyne HDL-32E laser scanner mounted on a UAV and showed that the quality factor of the trajectory reconstruction had the greatest impact on accuracy. The results showed that a UAS equipped with the Velodyne HDL-32E laser scanner can provide point clouds with an absolute positional accuracy of less than 10 cm [22]. Since UAV-LiDAR has been shown to generate maps with errors of a few centimeters, we adopt it for this study. Research has also been conducted on methods to georeference point clouds obtained from LiDAR to determine the location of trees and canopy structures in orchards. Yuan et al. proposed a method for aligning a specific apple tree as a reference point using conventional triangulation techniques [23], while another method automatically performs the coordinate transformation during the map generation phase. However, the latter method is limited to cases where the UAV is moving straight ahead [24].
This paper proposes a method to obtain a coordinate transformation matrix by time-synchronizing the self-position estimates obtained from LiDAR SLAM with the geographic coordinates obtained from the UAV's RTK-GNSS, and registering the two trajectories with the Iterative Closest Point (ICP) algorithm. Because the trajectories are matched to each other, the proposed method is independent of the flight path and does not require ground control points.

2. Materials and Methods

2.1. Data Collection

The point clouds used in this study were acquired at the Azumai Orchard located in Iwamizawa City, Hokkaido, at coordinates (43.15141285267103, 141.96582345266313), as shown in Figure 1. The study area covered a 45 x 100 m region with trees spaced approximately 3 m apart and rows spaced approximately 5 m apart. Although the rows were sufficiently far apart, the canopies of adjacent rows partially overlapped. As shown in Figure 1, the ground between the rows was weeded, but the areas between trees within a row and around the trunks were unweeded.

2.2. Instruments

2.2.1. UAV DJI Matrice 300 RTK

The DJI Matrice 300 RTK utilized in this investigation is a quadcopter outfitted with a GNSS RTK receiver. The fundamental specifications of this UAV are displayed in Table 1.

2.2.2. Laser Scanner OS1-64

An OUSTER OS1-64 was used as the LiDAR; it was mounted vertically on the fuselage as shown in Figure 2. The technical details and specifications of the OUSTER OS1-64 are shown in Table 2.
Because the LiDAR was mounted perpendicular to the UAV, part of each scan was obstructed by the airframe. This led to fewer points being acquired per scan than the specified maximum, with a total of 3,145,728 data points acquired per scan.

2.3. Acquisition of Standing Tree Location

In this study, ground-truth positions were obtained using the Quasi-Zenith Satellite System (QZSS) MICHIBIKI. MICHIBIKI provides CLAS (Centimeter Level Augmentation Service), which enables high-precision positioning. The AQLOC, a high-precision GNSS positioning terminal that supports CLAS, provides a horizontal positioning accuracy of 12 cm and a vertical positioning accuracy of 24 cm.
The aim of this paper is to develop a georeferencing system that can locate trees with an accuracy of better than 1 m. To achieve this, the antenna was positioned to overlap with the canopy of each tree as shown in Figure 3, and rough geographic coordinates of the tree were obtained as reference data.

2.4. Processing Strategy

Initially, high-precision 3D maps were generated using the open-source Autoware NDT-mapping method [25], a Simultaneous Localization and Mapping (SLAM) approach based on the Normal Distributions Transform (NDT). The recorded point cloud data were processed offline to create the map. The MATLAB Lidar Toolbox (R2022a) was used to process the point clouds after mapping, and the LiDAR point clouds were normalized in elevation to remove terrain effects. Figure 4 shows the processing procedure, which is described in detail in the following subsections.

2.4.1. Point Cloud Normalization

Normalization of the point cloud is performed by separating the ground points from the non-ground points as a preprocessing step. The SMRF algorithm [26] was used to separate the ground point cloud. The SMRF algorithm divides the point cloud data into a grid, calculates the minimum elevation of each cell, and composes a raster image called the minimum elevation surface map from these minima. The minimum elevation surface map is then subjected to a morphological opening using a disc-shaped structuring element. The algorithm calculates the gradient between the original surface and the opened minimum elevation surface, and points where the difference exceeds a threshold are considered non-ground. As the orchard used in this study has gently sloping ground, the grid was set to 1 m × 1 m and the slope threshold to 0.1 m. The elevation of the point cloud was then normalized by subtracting the separated ground surface from the original point cloud map.
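As a concrete illustration, the grid-minimum-plus-opening step above can be sketched in Python with numpy and scipy. This is a simplified stand-in for SMRF, not the paper's implementation (which used the MATLAB Lidar Toolbox); the function name and opening size are assumptions, while the 1 m cell and 0.1 m threshold follow the text.

```python
import numpy as np
from scipy import ndimage

def separate_ground(points, cell=1.0, threshold=0.1, opening_size=5):
    """Rough SMRF-style split of an (N, 3) point cloud into ground and
    non-ground points. A minimum-elevation raster on a `cell` x `cell` grid
    is smoothed by a morphological opening; points rising more than
    `threshold` metres above that surface are labelled non-ground."""
    xy_min = points[:, :2].min(axis=0)
    ij = np.floor((points[:, :2] - xy_min) / cell).astype(int)
    shape = ij.max(axis=0) + 1

    # Minimum elevation surface map: lowest z in each occupied grid cell.
    surface = np.full(shape, np.inf)
    np.minimum.at(surface, (ij[:, 0], ij[:, 1]), points[:, 2])
    surface[np.isinf(surface)] = surface[np.isfinite(surface)].min()

    # Morphological opening approximating a disc-shaped structuring element.
    opened = ndimage.grey_opening(surface, size=(opening_size, opening_size))

    # Points close to the opened surface are ground; the rest are non-ground.
    is_ground = points[:, 2] - opened[ij[:, 0], ij[:, 1]] <= threshold
    return points[is_ground], points[~is_ground]
```

The opening removes small raised blobs (tree crowns) from the minimum-elevation raster, so the comparison surface follows the terrain rather than the canopy.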

2.4.2. Remove Grass Point Cloud

Because the SMRF algorithm distinguishes ground from non-ground points based on gradient magnitude, tall weeds are categorized as non-ground points. Therefore, the weed points are automatically extracted from the non-ground point cloud using the PCA algorithm [27].
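One plausible reading of this PCA step can be sketched as follows; since the paper does not give the exact criterion, the cluster heuristic, function names, and the 0.3 m spread threshold are all hypothetical. The idea: for a flat weed patch, the principal axis of least variance is near-vertical and its spread is small, whereas a tree crown extends substantially in all directions.

```python
import numpy as np

def pca_extents(points):
    """PCA of an (N, 3) point cluster: principal axes (columns, sorted by
    ascending variance) and the standard deviation along each axis."""
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    return eigvecs, np.sqrt(eigvals)

def looks_like_weeds(points, height_spread=0.3):
    """Hypothetical heuristic: a cluster is a weed patch if its smallest
    principal spread is tiny and that axis is near-vertical."""
    axes, spread = pca_extents(points)
    vertical_alignment = abs(axes[2, 0])  # z-component of least-variance axis
    return bool(spread[0] < height_spread and vertical_alignment > 0.7)
```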

2.4.3. Canopy Height Model (CHM) Creation and Individual Tree Detection

A CHM was generated from the point cloud with the ground and weeds removed, at a resolution of 0.1 m per pixel. Apple trees have an irregular canopy shape, and without filtering, many spurious treetops would be detected. A 10 × 10 pixel (1.0 × 1.0 m) filter was therefore applied when detecting treetops with the local maxima algorithm [28]. Points with a normalized elevation greater than 1 m were selected as treetops.
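A minimal sketch of local-maxima treetop detection on a CHM raster, assuming numpy/scipy; the 10 × 10 pixel window and 1 m height cut-off follow the text, while the function name is illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_treetops(chm, window=10, min_height=1.0):
    """Local-maxima treetop detection on a CHM raster (0.1 m/pixel assumed).

    A pixel is a treetop if it equals the maximum of its `window` x `window`
    neighbourhood and its height exceeds `min_height` metres."""
    local_max = ndimage.maximum_filter(chm, size=window)
    peaks = (chm == local_max) & (chm > min_height)
    return np.argwhere(peaks)  # (row, col) pixel indices of detected tops
```

Enlarging `window` merges nearby maxima into one detection, which is how the 1 × 1 m filter suppresses multiple responses within a single irregular crown.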

2.4.4. Segment Individual Trees

The individual tree point clouds were clustered using marker-controlled watershed segmentation [29], in which treetops served as markers.
For accuracy evaluation, the number of trees in the orchard was recorded and manually compared with the reference data and segmentation results. True positives (TP), false positives (FP), and false negatives (FN) were counted, and recall (r), precision (p), and F-score (F) were computed using the formulas based on [30,31].
r = TP / (TP + FN)
p = TP / (TP + FP)
F = 2 × r × p / (r + p)
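These scores follow directly from the matched counts; a small helper (the counts in the test are illustrative, not the paper's values):

```python
def detection_scores(tp, fp, fn):
    """Recall, precision, and F-score from counted tree matches."""
    r = tp / (tp + fn)          # recall: fraction of real trees found
    p = tp / (tp + fp)          # precision: fraction of detections that are real
    f = 2 * r * p / (r + p)     # harmonic mean of recall and precision
    return r, p, f
```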

2.4.5. Alignment of GNSS Coordinate System and Map Coordinate System

The local coordinate system of the created map is matched to the Universal Transverse Mercator (UTM) projection of the geographic coordinate system by a coordinate transformation. The UAV's position is augmented by RTK-GNSS, giving a highly accurate self-position of 1 cm + 1 ppm horizontally and 1.5 cm + 1 ppm vertically. The RTK-GNSS positions and the LiDAR-SLAM self-positions are time-synchronized offline, and the Iterative Closest Point (ICP) algorithm [32] is used for their alignment. Given the source point set and the target point set as input, the ICP algorithm estimates the rigid-body rotation and translation required to align the source point set to the target point set.
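Because the correspondences between the two trajectories are fixed by the timestamps, the core of each ICP iteration reduces to the closed-form SVD (Kabsch) rigid alignment; a sketch under that assumption (a full ICP would re-estimate correspondences on every iteration, which time synchronization makes unnecessary here):

```python
import numpy as np

def rigid_transform(source, target):
    """Least-squares rigid alignment of two (N, 3) trajectories with known
    point-to-point correspondences: find R, t minimizing
    sum_i || R @ source_i + t - target_i ||^2 (Kabsch / SVD solution)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Applying `R @ p + t` to every point of the segmented tree clouds then carries them from the local map frame into the UTM frame.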

2.5. Statistical Analysis

The seven points obtained as a sample and the representative points of the trees transformed into the geographic coordinate system were compared using the root mean square coordinate errors.
RMSE_x = √( Σ_{i=1}^{n} Δx_i² / n )
RMSE_y = √( Σ_{i=1}^{n} Δy_i² / n )
RMSE_xy = √( RMSE_x² + RMSE_y² )
where Δx_i and Δy_i are the differences between the geographic coordinates and the transformed local coordinates of the map. The map generated by Autoware NDT-mapping is shown in Figure 5, and the same map with the ground point cloud removed is shown in Figure 6. Since weeds can be confirmed to remain in this map in addition to trees, the point cloud corresponding to the smallest z-value among the principal components obtained by PCA is removed, as shown in Figure 7. The weed point cloud was automatically removed using this value as the threshold.
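The RMSE metrics defined in Section 2.5 can be computed as follows (a minimal numpy sketch; the function name is an assumption):

```python
import numpy as np

def rmse_xy(reference, estimated):
    """Planimetric RMSE between reference and estimated positions, both given
    as (n, 2) arrays of (x, y) coordinates in the same frame."""
    d = estimated - reference
    rmse_x = np.sqrt(np.mean(d[:, 0] ** 2))
    rmse_y = np.sqrt(np.mean(d[:, 1] ** 2))
    return rmse_x, rmse_y, np.hypot(rmse_x, rmse_y)
```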
Figure 8 shows the CHM generated by filtering the normalized point cloud with a 10 × 10 pixel kernel. Figure 9 shows the segmentation of trees from the original 3D map using the treetop information obtained from the CHM. The total number of trees detected was 345, and the segmentation accuracy was r = 0.974 for recall, p = 0.952 for precision, and F = 0.9627 for the F-score.
Figure 10 shows the trees with GNSS information overlaid on Google Maps. The root mean square coordinate errors with respect to the measured tree locations are RMSE_x = 1.19 m, RMSE_y = 1.40 m, and RMSE_xy = 1.84 m. Figure 11 shows plots of the root mean square coordinate errors for each of the tree point groups transformed into the geographic coordinate system.

3. Discussion

In this paper, we created a high-precision 3D map of an actual orchard and performed tree segmentation by processing the point clouds of the map. We also proposed an automatic georeferencing system that links the trees on the map to geographic information. The location information of all the trees obtained as references was included in the point clouds of the trees transformed by georeferencing, indicating that the positional accuracy of the system is sufficient to determine the rough location of individual trees. In this study, the georeferencing accuracy was targeted at being able to identify specific trees, and the tree locations measured as reference data were located within the canopy of each tree. GNSS positioning accuracy was reduced in areas of dense foliage, leading to the failure to obtain a FIX solution; data were therefore acquired at a location away from the center of the leafy canopy. For more rigorous accuracy validation, it would be necessary to establish ground check points made of a reflective material, such as metal, and compare their positions with the GNSS-derived coordinates. Individual tree detection has been studied mainly in the forestry field, and the most well-known method is to create a CHM and extract single trees using the local maxima (LM) algorithm. The LM algorithm assumes that high local intensity maxima in the imagery correspond to treetops [33]. Consequently, the LM method performs well in coniferous forests but may detect multiple branches of a single tree in mixed and broadleaf forests [34]. In horticultural agriculture, a method has been proposed for clustering and labeling crops with irregular canopy shapes, such as oil palm and coconut, by adapting an optimal filter to the LM method [30].
However, these methods use data acquired from airborne laser scanners at altitudes of 1000 m or higher, and appropriate filters for the relatively inexpensive LiDAR used with small UAVs at altitudes of 10-150 m have not been identified. In this study, we designed a filter of appropriate size for apple trees and constructed a system that extracts and saves only tree point clouds by clustering and labeling trees. Mohan et al. proposed single-tree detection of coconut using a CHM obtained from airborne laser scanning, with scores of r = 0.87, p = 0.94, and F = 0.90 [30]. Wu et al. proposed single-tree detection using the deep-learning-based Faster R-CNN, with scores of r = 0.94, p = 0.91, and F = 0.93 [35]. The results of this study show that individual trees can be detected with accuracy comparable to these studies, and although segmentation of point clouds with closely overlapping crowns is not perfect, each apple tree can be segmented even when crowns overlap. The method can also be applied to plantation trees such as palm and banana by adjusting the filter size. The georeferencing methodology proposed in this study synchronizes the position obtained through SLAM on the local map with the position obtained through GNSS in the geographic coordinate system, registering the positions in the two coordinate systems as point clouds with the ICP algorithm. The accuracy of georeferencing is therefore directly linked to the precision of time synchronization and localization in each coordinate system. As the map grows, drift may occur, degrading the accuracy of the 3D map. Hence, SLAM-based mapping can include supplementary functionalities such as an improved initial estimate for registration using IMUs, or loop closing.
Loop closing involves observing the same point again by flying a closed loop and adding that observation as an additional constraint, significantly reducing cumulative errors and ensuring accurate localization.

4. Conclusions

In this paper, we proposed a system that uses point clouds obtained from UAV-LiDAR to create high-precision 3D maps of apple orchards, segment individual fruit trees on the maps, and determine the geographic coordinates of the trees. The system performs tree segmentation on the map and transforms the map coordinates of the segmented trees into the geographic coordinate system. To validate the system, point clouds and geographic coordinates were obtained from UAV-LiDAR and RTK-GNSS in an actual apple orchard. Autoware NDT-mapping was used to create the maps, and single-tree detection and segmentation of fruit trees were accomplished by applying appropriate filtering during CHM creation. The extraction accuracy of trees was more than 95%. For tree georeferencing, the geographic coordinates obtained from RTK-GNSS and the self-positions obtained from NDT-Matching were aligned using the ICP algorithm to estimate the transformation between them. As a result, tree identification with an accuracy of about 1 m was achieved.

References

  1. Yousefi, M.R.; Razdari, A.M. Application of GIS and GPS in precision agriculture (a review). International Journal of Advanced Biological and Biomedical Research 2015, 3, 7–9. [Google Scholar]
  2. Hall, A.; Lamb, D.H.B.L.J. Optical remote sensing applications in viticulture - a review. Australian Journal of Grape and Wine Research 2002, 8, 36–47. [Google Scholar] [CrossRef]
  3. Longchamps, L.; Tisseyre, B.; Taylor, J.; Sagoo, L.; Momin, M.; Fountas, S.; Manfrini, L.; Ampatzidis, Y.; Schueller, J.; Khosla, R. Yield sensing technologies for perennial and annual horticultural crops: A review. Precision Agriculture 2022, 23, 1–42. [Google Scholar] [CrossRef]
  4. Hassler, S.C.; Baysal-Gurel, F. Unmanned Aircraft System (UAS) Technology and Applications in Agriculture. Agronomy 2019, 9. [Google Scholar] [CrossRef]
  5. Amarasingam, N.; Ashan Salgadoe, A.S.; Powell, K.; Gonzalez, L.F.; Natarajan, S. A review of UAV platforms, sensors, and applications for monitoring of sugarcane crops. Remote Sensing Applications: Society and Environment 2022, 26, 100712. [Google Scholar] [CrossRef]
  6. Duan, T.; Chapman, S.; Guo, Y.; Zheng, B. Dynamic monitoring of NDVI in wheat agronomy and breeding trials using an unmanned aerial vehicle. Field Crops Research 2017, 210, 71–80. [Google Scholar] [CrossRef]
  7. Delalieux, S.; van Aardt, J.; Keulemans, W.; Schrevens, E.; Coppin, P. Detection of biotic stress (Venturia inaequalis) in apple trees using hyperspectral data: Non-parametric statistical approaches and physiological implications. European Journal of Agronomy 2007, 27, 130–143. [Google Scholar] [CrossRef]
  8. Huang, W.; Guan, Q.; Luo, J.; Zhang, J.; Zhao, J.; Liang, D.; Huang, L.; Zhang, D. New Optimized Spectral Indices for Identifying and Monitoring Winter Wheat Diseases. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 2014, 7, 2516–2524. [Google Scholar] [CrossRef]
  9. Zude-Sasse, M.; Fountas, S.; Gemtos, T.; Abu-Khalaf, N. Applications of precision agriculture in horticultural crops. European Journal of Horticultural Science 2016, 81, 78–90. [Google Scholar] [CrossRef]
  10. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing. Remote Sensing 2020, 12. [Google Scholar] [CrossRef]
  11. Roma, E.; Laudicina, V.A.; Vallone, M.; Catania, P. Application of Precision Agriculture for the Sustainable Management of Fertilization in Olive Groves. Agronomy 2023, 13. [Google Scholar] [CrossRef]
  12. Pothou, A.; Toth, C.; Spyros, K.; Georgopoulos, A. An approach to optimize reference ground control requirements for estimating LiDAR/IMU boresight misalignment. ISPRS Journal of Photogrammetry and Remote Sensing 2008. [Google Scholar]
  13. Turner, D.; Lucieer, A.; Watson, C.S. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote. Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef]
  14. Sun, H.; Li, L.; Ding, X.; Guo, B. The precise multimode GNSS positioning for UAV and its application in large scale photogrammetry. Geo-spatial Information Science 2016, 19, 188–194. [Google Scholar] [CrossRef]
  15. Turner, D.; Lucieer, A.; Wallace, L. Direct Georeferencing of Ultrahigh-Resolution UAV Imagery. IEEE Transactions on Geoscience and Remote Sensing 2014, 52, 2738–2745. [Google Scholar] [CrossRef]
  16. Benassi, F.; Dall’Asta, E.; Diotri, F.; Forlani, G.; Morra di Cella, U.; Roncella, R.; Santise, M. Testing Accuracy and Repeatability of UAV Blocks Oriented with GNSS-Supported Aerial Triangulation. Remote Sensing 2017, 9. [Google Scholar] [CrossRef]
  17. Forlani, G.; Diotri, F.; Cella, U.M.d.; Roncella, R. Indirect UAV strip georeferencing by on-board GNSS data under poor satellite coverage. Remote sensing 2019, 11, 1765. [Google Scholar] [CrossRef]
  18. McMahon, C.; Mora, O.E.; Starek, M.J. Evaluating the performance of sUAS photogrammetry with PPK positioning for infrastructure mapping. Drones 2021, 5, 50. [Google Scholar] [CrossRef]
  19. Sanz-Ablanedo, E.; Chandler, J.H.; Rodríguez-Pérez, J.R.; Ordóñez, C. Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sensing 2018, 10, 1606. [Google Scholar] [CrossRef]
  20. Lin, Y.C.; Habib, A. Quality control and crop characterization framework for multi-temporal UAV LiDAR data over mechanized agricultural fields. Remote Sensing of Environment 2021, 256, 112299. [Google Scholar] [CrossRef]
  21. Štroner, M.; Urban, R.; Línková, L. A New Method for UAV Lidar Precision Testing Used for the Evaluation of an Affordable DJI ZENMUSE L1 Scanner. Remote Sensing 2021, 13. [Google Scholar] [CrossRef]
  22. Jozkow, G.; Wieczorek, P.; Karpina, M.; Walicka, A.; Borkowski, A. Performance Evaluation of sUAS Equipped with Velodyne HDL-32E LiDAR Sensor. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2017, XLII-2/W6, 171–177. [Google Scholar] [CrossRef]
  23. Yuan, W.; Choi, D.; Bolkas, D. GNSS-IMU-assisted colored ICP for UAV-LiDAR point cloud registration of peach trees. Computers and Electronics in Agriculture 2022, 197, 106966. [Google Scholar] [CrossRef]
  24. Hadas, E.; Jozkow, G.; Walicka, A.; Borkowski, A. Apple orchard inventory with a LiDAR equipped unmanned aerial system. International Journal of Applied Earth Observation and Geoinformation 2019, 82, 101911. [Google Scholar] [CrossRef]
  25. Biber, P.; Straßer, W. The Normal Distributions Transform: A New Approach to Laser Scan Matching. 2003, 3, 2743–2748. [Google Scholar] [CrossRef]
  26. Pingel, T.J.; Clarke, K.C.; McBride, W.A. An improved simple morphological filter for the terrain classification of airborne LIDAR data. ISPRS Journal of Photogrammetry and Remote Sensing 2013, 77, 21–30. [Google Scholar] [CrossRef]
  27. Abdi, H.; Williams, L.J. Principal component analysis. Wiley interdisciplinary reviews: Computational statistics 2010, 2, 433–459. [Google Scholar] [CrossRef]
  28. Korpela, I.; Anttila, P.; Pitkänen, J. The performance of a local maxima method for detecting individual tree tops in aerial photographs. International Journal of Remote Sensing 2006, 27, 1159–1175. [Google Scholar] [CrossRef]
  29. Chen, Q.; Baldocchi, D.; Gong, P.; Kelly, M. Isolating individual trees in a savanna woodland using small footprint lidar data. Photogrammetric Engineering & Remote Sensing 2006, 72, 923–932. [Google Scholar] [CrossRef]
  30. Mohan, M.; de Mendonça, B.A.F.; Silva, C.A.; Klauberg, C.; de Saboya Ribeiro, A.S.; de Araújo, E.J.G.; Monte, M.A.; Cardil, A. Optimizing individual tree detection accuracy and measuring forest uniformity in coconut (Cocos nucifera L.) plantations using airborne laser scanning. Ecological Modelling 2019, 409, 108736. [Google Scholar] [CrossRef]
  31. Maillard, P.; Gomes, M.F. Detection and counting of orchard trees from VHR images using a geometrical-optical model and marked template matching. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2016, 75–82. [Google Scholar] [CrossRef]
  32. Besl, P.; McKay, N.D. A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence 1992, 14, 239–256. [Google Scholar] [CrossRef]
  33. Wulder, M.; Niemann, K.O.; Goodenough, D.G. Local maximum filtering for the extraction of tree locations and basal area from high spatial resolution imagery. Remote Sensing of environment 2000, 73, 103–114. [Google Scholar] [CrossRef]
  34. Jing, L.; Hu, B.; Noland, T.; Li, J. An individual tree crown delineation method based on multi-scale segmentation of imagery. ISPRS Journal of Photogrammetry and Remote Sensing 2012, 70, 88–98. [Google Scholar] [CrossRef]
  35. Wu, J.; Yang, G.; Yang, H.; Zhu, Y.; Li, Z.; Lei, L.; Zhao, C. Extracting apple tree crown information from remote imagery using deep learning. Computers and Electronics in Agriculture 2020, 174, 105504. [Google Scholar] [CrossRef]
Figure 1. Exemplary pictures of the orchard.
Figure 2. UAV Configuration.
Figure 3. Installation of antennas for the acquisition of standing tree locations.
Figure 4. System diagram.
Figure 5. 3D orchard map.
Figure 6. Normalized 3D orchard map.
Figure 7. Nonground 3D orchard map.
Figure 8. CHM with detected tree tops.
Figure 9. Top view of segmented 3D orchard map.
Figure 10. Location of trees shown on Google Maps, © https://www.google.com/maps/ (accessed on 14 March 2023).
Figure 11. Example of reference points and representative points obtained for trees.
Table 1. Basic specifications of the DJI Matrice 300 RTK.
Dimensions 810 × 670 × 430 mm (L×W×H)
Max Takeoff Weight 9 kg
Max Flight Time 55 min
Max Speed 23 m/s
GNSS GPS+GLONASS+BeiDou+Galileo
RTK Positioning Accuracy Vertical : 1.5 cm + 1 ppm; Horizontal : 1 cm+1 ppm
Table 2. Basic specifications of the OUSTER OS-1.
Horizontal channels 64
Points Per Second 5,242,880
Range 120 m
Range Resolution 0.1 cm
Field of View Vertical : 45°(+22.5°to -22.5°); Horizontal : 360°
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.