Using the Convolution Neural Network (YOLOv8) with Cartesian Robot for Automatic Disease Detection and Watering


Dumrongsak Kijdech
Weerapun Duangthongsuk

Abstract

Currently, smart farming has a growing influence on agriculture because it reduces production costs and increases productivity. However, studies have found that detecting plant diseases still requires human observation. This research proposes a convolutional neural network (YOLOv8) combined with a Cartesian robot for automatic disease detection and watering, using cactus species as the experimental model. When the soil moisture falls below the target level, the robot moves to the target location to water the plants and collect images for disease detection using artificial intelligence. If a disease is detected, the robot sends an alert through the LINE application and moves the injector to the target position to spray a disease-eradication agent. During the experiments, the robot moved to each target location, collected images of the cacti, performed disease detection with the artificial intelligence model, and measured the processing time. The detection results from the before and after images were then validated. The results indicated that the average prediction time was approximately 0.57 seconds and the accuracy was 90 percent.
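The cycle described in the abstract (check moisture, water if dry, photograph, detect disease, alert and spray) can be sketched as a simple control loop. This is a minimal illustration only: the threshold value, the `CartesianRobot` interface, and the `detect_disease` stub are all assumptions standing in for the authors' actual hardware interface and YOLOv8 inference (the paper uses the Ultralytics YOLOv8 model and the LINE notification service).

```python
MOISTURE_THRESHOLD = 40.0  # percent; illustrative target soil-moisture level

class CartesianRobot:
    """Toy stand-in for the robot's motion, watering, camera, and sprayer."""
    def __init__(self):
        self.log = []                       # record of actions, for inspection
    def move_to(self, pos):
        self.log.append(("move", pos))
    def water(self):
        self.log.append(("water",))
    def capture_image(self):
        self.log.append(("capture",))
        return {"lesion": True}             # pretend the camera saw a lesion

    def spray(self):
        self.log.append(("spray",))

def detect_disease(image):
    # Stand-in for YOLOv8 inference; returns a list of detected disease labels.
    return ["rot"] if image.get("lesion") else []

def tend_plant(plant, robot, notify):
    """One cycle for a single plant position on the Cartesian robot."""
    robot.move_to(plant["pos"])             # travel to the plant
    if plant["moisture"] < MOISTURE_THRESHOLD:
        robot.water()                       # water when the soil is too dry
    image = robot.capture_image()           # collect an image at the position
    diseases = detect_disease(image)
    if diseases:
        notify(f"Disease at {plant['pos']}: {', '.join(diseases)}")  # LINE alert
        robot.spray()                       # spray disease-eradication agent
    return diseases
```

In the actual system, `detect_disease` would wrap a trained YOLOv8 model and `notify` would post to the LINE messaging API; here they are replaced by stubs so the control flow is visible on its own.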

Article Details

How to Cite
[1] D. Kijdech and W. Duangthongsuk, “Using the Convolution Neural Network (YOLOv8) with Cartesian Robot for Automatic Disease Detection and Watering,” RMUTP Sci J, vol. 19, no. 2, pp. 101–118, Dec. 2025.
Section
Research Articles
