A Case Study on Detecting and Mapping Individual Coconut Trees Using YOLOv3 in Conjunction with UAV Remote Sensing for Smart Plantation Management
Abstract
Data on the location and number of individual coconut trees are important for surveying
planting areas, predicting coconut yield, and managing and planning coconut
plantations. These data are usually obtained through manual field investigation and
counting, which is time-consuming and tedious. Deep learning object detection
models, widely used in computer vision, offer an opportunity to identify individual
coconut trees accurately, enabling rapid data acquisition and reducing human error.
This study proposes an approach to identify
individual coconut trees and map their spatial distribution by combining deep
learning with unmanned aerial vehicle (UAV) remote sensing. High-resolution true-colour images of coconut trees at the Mahayaya Coconut Model Plantation in Sri
Lanka were collected by UAV remote sensing, and an image dataset of individual
coconut trees (ICTs) for deep learning was constructed through visual
interpretation of the UAV images together with field surveys. YOLOv3 was selected
and then trained, validated, and tested on this coconut-tree image dataset.
The results show that the average accuracy of the YOLOv3 model on the
validation set reaches 91.7%. The number of ICTs in the study area was counted using
YOLOv3, and their spatial distribution map was created using the non-maximum
suppression method and ArcGIS software. This study will provide basic data and
technical support for smart coconut plantation management at the Mahayaya Coconut
Model Plantation and in other coconut-producing areas.
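As an illustration of the detect-count-map pipeline summarised above, the sketch below runs a trained YOLOv3 detector over one georeferenced UAV image tile, applies non-maximum suppression to remove duplicate boxes, counts the remaining detections, and converts box centres to map coordinates that can be loaded into ArcGIS. This is a minimal sketch, not the authors' implementation: the configuration, weights, and tile file names, the confidence and NMS thresholds, and the use of OpenCV's DNN module with rasterio are all assumptions, and a full orthomosaic would normally be processed in tiles.

```python
# Minimal sketch (not the authors' code): detect, count, and geolocate coconut
# trees in one georeferenced UAV image tile with a Darknet-format YOLOv3 model.
# File names, thresholds, and the OpenCV/rasterio stack are illustrative.
import cv2
import numpy as np
import rasterio
from rasterio.transform import xy

CFG = "yolov3-coconut.cfg"            # hypothetical trained-model config
WEIGHTS = "yolov3-coconut.weights"    # hypothetical trained weights
TILE = "mahayaya_tile.tif"            # hypothetical georeferenced UAV tile
CONF_THRESH, NMS_THRESH, INPUT_SIZE = 0.5, 0.45, 416

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_names = net.getUnconnectedOutLayersNames()

with rasterio.open(TILE) as src:
    rgb = np.ascontiguousarray(np.moveaxis(src.read([1, 2, 3]), 0, -1))  # (H, W, 3)
    transform = src.transform          # affine map: pixel (col, row) -> map coords

h, w = rgb.shape[:2]
blob = cv2.dnn.blobFromImage(rgb, 1 / 255.0, (INPUT_SIZE, INPUT_SIZE),
                             swapRB=False, crop=False)  # bands already read as RGB
net.setInput(blob)

boxes, scores = [], []
for output in net.forward(out_names):
    for det in output:                 # [cx, cy, bw, bh, objectness, class scores...]
        score = float(det[4]) * float(det[5:].max())
        if score < CONF_THRESH:
            continue
        cx, cy = det[0] * w, det[1] * h          # rescale from relative to pixel units
        bw, bh = det[2] * w, det[3] * h
        boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
        scores.append(score)

# Non-maximum suppression keeps one box per tree so each ICT is counted once.
keep = cv2.dnn.NMSBoxes(boxes, scores, CONF_THRESH, NMS_THRESH)

tree_points = []
for i in np.array(keep).flatten():
    x, y, bw, bh = boxes[i]
    east, north = xy(transform, y + bh / 2, x + bw / 2)   # pixel centre -> map coords
    tree_points.append((east, north))

print(f"Individual coconut trees detected in this tile: {len(tree_points)}")
# tree_points (easting, northing pairs) can be exported to CSV/shapefile and
# symbolised in ArcGIS to produce the spatial distribution map.
```

In practice the per-tile point lists would be merged, with an additional NMS pass across tile overlaps, before the plantation-wide count and distribution map are produced.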