Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops

dc.bibliographicCitation.firstPage: 1704
dc.bibliographicCitation.issue: 9
dc.bibliographicCitation.journalTitle: Remote Sensing
dc.bibliographicCitation.volume: 13
dc.contributor.author: de Camargo, Tibor
dc.contributor.author: Schirrmann, Michael
dc.contributor.author: Landwehr, Niels
dc.contributor.author: Dammer, Karl-Heinz
dc.contributor.author: Pflanz, Michael
dc.date.accessioned: 2022-03-31T11:27:38Z
dc.date.available: 2022-03-31T11:27:38Z
dc.date.issued: 2021
dc.description.abstract: Weed maps should be available quickly, reliably, and with high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system while maintaining the same features of the ResNet-18 model as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flight operations. Optimization was achieved mainly by avoiding redundant computations that arise when a classification model is applied on overlapping tiles in a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at the species level for the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to an area output of about 1.78 ha h⁻¹ for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but speed performance declined strongly, especially when a higher number of filters was used in the ResNet-18 model. Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields.
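The core optimization described in the abstract, avoiding the redundant computation that arises when a tile classifier is applied to overlapping windows of a large image, can be illustrated by evaluating the network fully convolutionally. The following is a minimal sketch of that idea, not the authors' pipeline: it assumes PyTorch/torchvision, a hypothetical class count of six (crop, soil, and four weed species), and a placeholder image size. The classification head of a tile-trained ResNet-18 is converted into a 1x1 convolution so that a single forward pass over the whole image yields a dense class map, and 16-bit precision is used on the GPU in line with the speed/accuracy observation reported above.

    # Minimal sketch (not the authors' released code): turning a tile-trained
    # ResNet-18 classifier into a fully convolutional model so that one forward
    # pass over a large UAV image replaces many overlapping tile classifications.
    # Framework (PyTorch/torchvision), class count, and image size are assumptions.
    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    NUM_CLASSES = 6  # e.g. crop, soil, and four weed species (assumed)

    # Tile classifier: ResNet-18 with a NUM_CLASSES-way classification head.
    backbone = resnet18(num_classes=NUM_CLASSES)

    # Replace the global-average-pool + fully connected head with an equivalent
    # 1x1 convolution, so the network accepts arbitrarily large inputs and emits
    # one class score vector per output location instead of one per image tile.
    fc = backbone.fc
    head = nn.Conv2d(fc.in_features, NUM_CLASSES, kernel_size=1)
    head.weight.data = fc.weight.data.view(NUM_CLASSES, fc.in_features, 1, 1)
    head.bias.data = fc.bias.data
    dense_model = nn.Sequential(*list(backbone.children())[:-2], head)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    dense_model = dense_model.to(device).eval()

    # On an embedded GPU (e.g. Jetson AGX Xavier), 16-bit precision is much
    # faster and, per the abstract, did not reduce classification accuracy.
    if device == "cuda":
        dense_model = dense_model.half()
    dtype = torch.float16 if device == "cuda" else torch.float32

    with torch.no_grad():
        # Placeholder input; a real full-resolution UAV frame would be larger.
        image = torch.rand(1, 3, 1024, 1024, device=device, dtype=dtype)
        logits = dense_model(image)      # shape (1, NUM_CLASSES, H/32, W/32)
        weed_map = logits.argmax(dim=1)  # coarse per-location class map

As a plausibility check on the reported figures: 2.2 frames s⁻¹ corresponds to roughly 7900 frames per hour, so the stated 1.78 ha h⁻¹ implies about 17,800 m² / 7900 ≈ 2.2 m² of newly mapped ground per processed frame.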
dc.description.fonds: Leibniz_Fonds
dc.description.version: publishedVersion
dc.identifier.uri: https://oa.tib.eu/renate/handle/123456789/8502
dc.identifier.uri: https://doi.org/10.34657/7540
dc.language.iso: eng
dc.publisher: Basel : MDPI AG
dc.relation.doi: https://doi.org/10.3390/rs13091704
dc.relation.essn: 2072-4292
dc.rights.license: CC BY 4.0
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject.ddc: 620
dc.subject.other: ResNet
dc.subject.other: deep residual networks
dc.subject.other: UAV imagery
dc.subject.other: embedded systems
dc.subject.other: crop monitoring
dc.subject.other: image classification
dc.subject.other: site-specific weed management
dc.subject.other: real-time mapping
dc.title: Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops
dc.type: Article
dc.type: Text
tib.accessRights: openAccess
wgl.contributor: ATB
wgl.subject: Ingenieurwissenschaften (Engineering Sciences)
wgl.type: Zeitschriftenartikel (journal article)
Files
Original bundle
Name: remotesensing-13-01704.pdf
Size: 3.42 MB
Format: Adobe Portable Document Format