Mobile Application for Breeding Bird Classification using Deep Learning Technique
Abstract
Currently, several bird parks have open aviaries where visitors can see various bird species up close. However, visitors are sometimes unable to identify the birds they are watching, because the species kept in open aviaries are rarely seen in everyday life and there are no identification signs inside the aviaries. This article therefore proposes the design and development of a mobile application built with the Flutter framework. The application detects and classifies 10 bird species using an EfficientDet Lite deep learning model trained with the TensorFlow Lite Model Maker library. Users point the smartphone camera at a bird, and when a bird is detected the application displays information about that species. From the experiments, we found that EfficientDet-Lite0 gave the most suitable results for the application, with an inference time of 62.3 ms and precision, recall, accuracy, and F1-score of 0.94, 0.94, 0.99, and 0.94, respectively.
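As a minimal sketch of the training pipeline summarized above, the snippet below shows how an EfficientDet-Lite0 detector can be trained and exported with the TensorFlow Lite Model Maker object-detection API. The dataset paths, placeholder label names, and hyperparameter values are illustrative assumptions, not the exact configuration used in this work.

```python
# Sketch: training and exporting an EfficientDet-Lite0 bird detector with
# TensorFlow Lite Model Maker (paths, labels, and hyperparameters are assumed).
from tflite_model_maker import model_spec, object_detector

# EfficientDet-Lite0 backbone, the smallest of the EfficientDet-Lite family.
spec = model_spec.get('efficientdet_lite0')

# Hypothetical Pascal VOC-style dataset: images plus XML annotations for
# the 10 bird species (label names here are placeholders).
labels = ['hornbill', 'peafowl', 'flamingo']  # ... 10 classes in total
train_data = object_detector.DataLoader.from_pascal_voc(
    images_dir='birds/train/images',
    annotations_dir='birds/train/annotations',
    label_map=labels)
validation_data = object_detector.DataLoader.from_pascal_voc(
    images_dir='birds/val/images',
    annotations_dir='birds/val/annotations',
    label_map=labels)

# Fine-tune the whole model (not only the detection head).
model = object_detector.create(
    train_data,
    model_spec=spec,
    epochs=50,
    batch_size=8,
    train_whole_model=True,
    validation_data=validation_data)

# Evaluate, then export a .tflite file for on-device inference.
print(model.evaluate(validation_data))
model.export(export_dir='export',
             tflite_filename='birds_efficientdet_lite0.tflite')
```

The exported .tflite file can then be bundled as an asset in the Flutter application and executed on-device with a TensorFlow Lite interpreter, which is how the detection results and bird information would be surfaced to the user.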
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.