Drone Approach for Remote Sensing of Intercropping on Durian Plantations Using the YOLOv5 Model
Abstract
This paper proposes a drone-based solution for monitoring intercropped durian plantations using RGB and spectral cameras. Currently, farmers rely mainly on visual inspection to judge whether the density of papaya trees around a durian tree is suitable. Such eye estimation is time-consuming and often inaccurate, especially once trees grow above head height. In the proposed method, drones first create an ortho-mosaic map of the monitored area, and a YOLOv5 model is then used to detect and locate the durian and papaya trees. These detection results are used to evaluate durian growth conditions. The trained model achieved over 95% accuracy in detecting and locating trees, which is reliable enough for practical use. Furthermore, during validation, durian growth conditions were evaluated correctly and regions where the papaya density must be adjusted were detected.
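The density-evaluation step described in the abstract can be sketched as follows: once the detector has returned tree locations on the ortho-mosaic map, count the papaya trees within a fixed radius of each durian and flag durians whose intercrop density falls outside an acceptable range. The function names, radius, and thresholds below are illustrative assumptions, not values from the paper.

```python
# Hypothetical post-detection step: evaluate papaya density around each
# durian tree, given (x, y) centres extracted from YOLOv5 bounding boxes.
# Radius and density thresholds are illustrative assumptions.
from math import hypot

def papaya_density(durians, papayas, radius=5.0):
    """For each durian centre (x, y), count papayas within `radius` metres."""
    counts = []
    for dx, dy in durians:
        n = sum(1 for px, py in papayas if hypot(px - dx, py - dy) <= radius)
        counts.append(n)
    return counts

def flag_adjustments(counts, min_ok=2, max_ok=6):
    """Return indices of durians whose surrounding papaya density
    must be adjusted (too sparse or too crowded)."""
    return [i for i, n in enumerate(counts) if n < min_ok or n > max_ok]

# Toy example: two durians, four papayas (coordinates in metres).
durians = [(0.0, 0.0), (20.0, 0.0)]
papayas = [(1.0, 1.0), (2.0, -1.5), (-3.0, 0.5), (21.0, 2.0)]
counts = papaya_density(durians, papayas)  # -> [3, 1]
flags = flag_adjustments(counts)           # -> [1]: second durian is under-planted
```

In practice the radius and thresholds would be set per plantation, and the centres would come from the detector's output on the geo-referenced ortho-mosaic rather than hand-entered coordinates.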
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
References
B. Horwith, “A role for intercropping in modern agriculture,” BioScience, vol. 35, no. 5, pp. 286-291, May 1985.
M. S. Hossain, K. M. Abdul, and N. Islam, “Multiple cropping for sustainable and exaggerated agricultural production system,” J. Biosci. Agric. Res., vol. 14, no. 2, pp. 1202-1209, Aug. 2017.
S. Wolfert, L. Ge, C. Verdouw, and M. J. Bogaardt, “Big data in smart farming - a review,” Agric. Syst., vol. 153, pp. 69-80, May 2017.
A. Walter, R. Finger, R. Huber, and N. Buchmann, “Opinion: Smart farming is key to developing sustainable agriculture,” in Proc. Natl. Acad. Sci. USA, 2017, pp. 6148-6150.
C. Zhang and J. M. Kovacs, “The application of small unmanned aerial systems for precision agriculture: a review,” Precis. Agric., vol. 13, pp. 693-712, Jul. 2012.
F. Waldner, G. S. Canto, and P. Defourny, “Automated annual cropland mapping using knowledge-based temporal features,” ISPRS J. Photogramm. Remote Sens., vol. 110, pp. 1-13, Dec. 2015.
A. Begue et al., “Remote sensing and cropping practices: A review,” Remote. Sens., vol. 10, no. 1, Jan. 2018.
D. S. Culvenor, “TIDA: An algorithm for the delineation of tree crowns in high spatial resolution remotely sensed imagery,” Comput. Geosci., vol. 28, no. 1, pp. 33-44, Feb. 2002.
P. K. Mayossa, C. D’Eeckenbrugge, F. Borne, S. Gadal, and G. Viennois, “Developing a method to map coconut agrosystems from high-resolution satellite images,” in Proc. Int. Cartographic Conf., Rio de Janeiro, Brazil, 2015.
P. Liu and X. Chen, “Intercropping classification from GF-1 and GF-2 satellite imagery using a rotation forest based on an SVM,” ISPRS Int. J. Geo-Inf., vol. 8, no. 2, Feb. 2019.
N. Jamil, G. Kootstra, and L. Kooistra, “Evaluation of individual plant growth estimation in an intercropping field with UAV imagery,” Agric., vol. 12, no. 1, Jan. 2022.
S. Huang, W. Han, H. Chen, G. Li, and J. Tang, “Recognizing zucchinis intercropped with sunflowers in UAV visible images using an improved method based on OCRNet,” Remote. Sens., vol. 13, no. 14, Jul. 2021.
L. Parra, D. Mostaza-Colado, J. F. Marin, P. V. Mauri, and J. Lloret, “Methodology to differentiate legume species in intercropping agroecosystems based on UAV with RGB camera,” Electronics, vol. 11, no. 4, Feb. 2022.
X. Xu et al., “Detection and counting of maize leaves based on two-stage deep learning with UAV-based RGB image,” Remote. Sens., vol. 14, no. 21, Oct. 2022.
C. Y. Song et al., “Detection of maize tassels for UAV remote sensing image with an improved YOLOX model,” J. Integr. Agric., vol. 22, no. 6, pp. 1671-1683, Jun. 2022.
L. Lac, J. P. Da Costa, M. Donias, B. Keresztes, and A. Bardet, “Crop stem detection and tracking for precision hoeing using deep learning,” Comput. Electron. Agric., vol. 192, Jan. 2022.
S. Liu et al., “Estimating maize seedling number with UAV RGB images and advanced image processing methods,” Precis. Agric., vol. 23, pp. 1604-1632, Apr. 2022.
Y. H. Kim and K. R. Park, “MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds,” Comput. Electron. Agric., vol. 199, Aug. 2022.
P. Wang, Y. Zhang, B. Jiang, and J. Hou, “A maize leaf segmentation algorithm based on image repairing technology,” Comput. Electron. Agric., vol. 172, May 2020.
A. Barreto et al., “Automatic UAV-based counting of seedlings in sugar-beet field and extension to maize and strawberry,” Comput. Electron. Agric., vol. 191, Dec. 2021.
X. Yu et al., “Maize tassel area dynamic monitoring based on near-ground and UAV RGB images by U-Net model,” Comput. Electron. Agric., vol. 203, Dec. 2022.
Y. L. Tu, W. Y. Lin, and Y. C. Lin, “Automatic leaf counting using improved YOLOv3,” in Proc. Int. Symp. Comput. Consum. Control (IS3C), Taichung, Taiwan, 2020, pp. 197-200.
S. Lu et al., “Counting dense leaves under natural environments via an improved deep-learning-based object detection algorithm,” Agric., vol. 11, no. 10, Nov. 2021.
X. Li et al., “Detecting plant leaves based on vision transformer enhanced YOLOv5,” in Proc. Int. Conf. Pattern Recog. Mach. Learn., Chengdu, China, 2022, pp. 32-37.
X. Xu et al., “Maize seedling leave counting based on semi-supervised learning and UAV RGB images,” Sustainability, vol. 15, no. 12, Jun. 2023.
C. Zhang, H. Ding, Q. Shi, and Y. Wang, “Grape cluster real-time detection in complex natural scenes based on YOLOv5s deep learning network,” Agric., vol. 12, no. 8, Aug. 2022.
L. Wang et al., “Fast and precise detection of litchi fruits for yield estimation based on the improved YOLOv5 model,” Front. Plant Sci., vol. 13, Aug. 2022.
O. Oliwaseyi, M. Irhebhude, and A. Evwiekpaefe, “A comparative study of YOLOv5 and YOLOv7 object detection algorithms,” Jour. of Inform. Comp., vol. 2, Feb. 2023.
B. Selcuk and T. Serif, “A comparison of YOLOv5 and YOLOv8 in the context of mobile UI detection,” in Mobile Web and Intelligent Information Systems (MobiWIS 2023), Lecture Notes in Computer Science, vol. 13977, Springer, Cham, 2023.
G. Jocher, “Ultralytics YOLOv5 v7.0,” 2022.
R. O. Duda and P. E. Hart, “Use of the Hough transformation to detect lines and curves in pictures,” Commun. ACM, vol. 15, no. 1, pp. 11-15, Jan. 1972.
T. H. Luu et al., “Evaluation of land roughness and weather effects on paddy field using cameras mounted on drone: A comprehensive analysis from early to mid-growth stages,” J. King Saud Univ. - Comput. Inf. Sci., vol. 35, Dec. 2023.