Search Results

  • Item
    Deep Learning Object Detection for Image Analysis of Cherry Fruit Fly (Rhagoletis cerasi L.) on Yellow Sticky Traps
    (Berlin ; Heidelberg : Springer, 2022) Salamut, Christian; Kohnert, Iris; Landwehr, Niels; Pflanz, Michael; Schirrmann, Michael; Zare, Mohammad
    Insect populations appear with a high spatial, temporal and species-specific diversity in orchards. One of the many monitoring tools for pest management is the manual assessment of sticky traps. However, this type of assessment is laborious and time-consuming, so only a few locations in an orchard can be monitored. The aim of this study is to test state-of-the-art object detection algorithms from deep learning to automatically detect cherry fruit flies (Rhagoletis cerasi), a common insect pest in cherry plantations, in images from yellow sticky traps. An image annotation database was built from yellow sticky trap images containing more than 1600 annotated cherry fruit flies. For better handling in the computational algorithms, the images were reduced to smaller ones and augmented with the standard image preparation methods “flipping” and “cropping” before the deep learning was performed. Five deep learning image recognition models were tested, including Faster Region-based Convolutional Neural Network (R-CNN) with two different pretraining methods, Single Shot Detector (SSD), RetinaNet, and You Only Look Once version 5 (YOLOv5). The Faster R-CNN and RetinaNet models outperformed the others, with a detection average precision of 0.9. The results indicate that deep learning can act as an integral component of an automated system for high-throughput assessment of pest insects in orchards. Such a system could not only reduce the time spent on repetitive and laborious trap assessment but also increase the number of sticky traps that can be observed.
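    To make the detection step concrete, here is a minimal sketch of running a YOLOv5 detector, one of the tested model families, on a trap photo. The weights file, image name and confidence threshold are illustrative assumptions, not artifacts from the study.

    ```python
    # Minimal YOLOv5 inference sketch (assumed custom weights "flies.pt",
    # trained on an annotated fruit fly database; names are illustrative).
    import torch

    # Official ultralytics/yolov5 hub entry point for custom weights
    model = torch.hub.load("ultralytics/yolov5", "custom", path="flies.pt")
    model.conf = 0.5  # discard low-confidence detections

    results = model("sticky_trap.jpg")       # one trap image
    detections = results.pandas().xyxy[0]    # bounding boxes as a DataFrame
    print(f"{len(detections)} cherry fruit flies detected")
    ```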
  • Item
    Spectral shift as advanced index for fruit chlorophyll breakdown
    (Heidelberg : Springer, 2013) Seifert, Birgit; Pflanz, Michael; Zude, Manuela
    The decline of fruit chlorophyll is a valuable indicator of fruit ripeness. Fruit chlorophyll content can be nondestructively estimated by UV/VIS spectroscopy at fixed wavelengths. However, this approach cannot explain the complex changes in chlorophyll catabolism during fruit ripening. We introduce the apparent peak position of the red-band chlorophyll absorption as a new qualitative spectral indicator. Climacteric fruits (apple: n = 24, mango: n = 38, tomato: n = 48) were analysed at different ripeness stages. The peak position and corresponding intensity values were determined between 650 and 690 nm in nondestructively measured fruit spectra as well as in corresponding spectra of fruit extracts. In the extracts, the individual contents of chlorophyll a, chlorophyll b, pheophytin a and carotenoids were analysed photometrically, using an established iterative multiple linear regression approach. Nondestructively measured peak positions shifted unimodally in all three fruit species, with significant shifts between fruit ripeness classes of at most 2.00 ± 0.27 nm (mean ± standard error) in tomato and 0.57 ± 0.11 nm in apple. Peak positions in extract spectra were related to varying pigment ratios (Rmax = −0.91), considering individual pigments in the pool. The peak intensities in both spectral readings, nondestructive and fruit extracts, were correlated with absolute chlorophyll contents with Rmax = −0.84 and Rmax = 1.00, respectively. The introduced spectral marker of the apparent peak position of chlorophyll absorbance offers the potential for enhanced information gain from nondestructive spectra in the determination of fruit ripeness.
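    As a worked illustration of the indicator itself, the sketch below locates the apparent peak position in the 650–690 nm window of a measured spectrum. Parabolic interpolation around the discrete maximum is one plausible way to resolve the sub-nanometre shifts reported; the abstract does not specify the numerical procedure, so treat this as an assumption.

    ```python
    import numpy as np

    def red_band_peak(wavelengths, absorbance, lo=650.0, hi=690.0):
        """Apparent peak position (nm) of the red chlorophyll absorption band.

        Assumes uniformly spaced wavelengths; the parabolic fit around the
        discrete maximum resolves shifts well below the sampling interval.
        """
        m = (wavelengths >= lo) & (wavelengths <= hi)
        w, a = wavelengths[m], absorbance[m]
        i = int(np.argmax(a))
        if i == 0 or i == len(a) - 1:
            return w[i]  # peak sits at the window edge; no interpolation
        y0, y1, y2 = a[i - 1], a[i], a[i + 1]
        delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # parabola vertex offset
        return w[i] + delta * (w[i + 1] - w[i])
    ```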
  • Item
    Rapid and low-cost insect detection for analysing species trapped on yellow sticky traps
    (London : Nature Publishing Group, 2021) Böckmann, Elias; Pfaff, Alexander; Schirrmann, Michael; Pflanz, Michael
    While insect monitoring is a prerequisite for precise decision-making in integrated pest management (IPM), it is time- and cost-intensive. Low-cost, time-saving and easy-to-operate tools for automated monitoring will therefore play a key role in increasing the acceptance and application of IPM in practice. In this study, we tested the differentiation of two whitefly species and their natural enemies trapped on yellow sticky traps (YSTs) via image processing approaches under practical conditions. Using the bag of visual words (BoVW) algorithm, accurate differentiation between the natural enemies and each of the whitefly species Trialeurodes vaporariorum and Bemisia tabaci was possible, whereas the procedure could not differentiate B. tabaci from T. vaporariorum. The decay of the trapped specimens was accounted for by using fresh and aged catches of all species on the YSTs, and different pooling scenarios were applied to enhance model performance. The best performance was reached when fresh and aged individuals were used together and the whitefly species were pooled into one category for model training. With an independent dataset consisting of photos from YSTs that were placed in greenhouses, and consequently with a naturally occurring species mixture as the background, a differentiation rate of more than 85% was reached for natural enemies and whiteflies.
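    A BoVW pipeline of the kind described is typically built from local feature extraction, a visual dictionary, and a histogram classifier. The sketch below shows one common realisation (SIFT features, k-means dictionary, SVM); the feature type, dictionary size and the `train_patches`/`train_labels` inputs are assumptions, not details taken from the paper.

    ```python
    import cv2
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    sift = cv2.SIFT_create()

    def descriptors(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        _, desc = sift.detectAndCompute(gray, None)
        return desc if desc is not None else np.empty((0, 128), np.float32)

    # 1. Visual dictionary from all training patches (assumed inputs)
    all_desc = np.vstack([descriptors(p) for p in train_patches])
    dictionary = KMeans(n_clusters=200, n_init=3).fit(all_desc)

    # 2. Encode each patch as a normalised histogram of visual words
    def bovw_histogram(img):
        words = dictionary.predict(descriptors(img))
        hist, _ = np.histogram(words, bins=200, range=(0, 200))
        return hist / max(hist.sum(), 1)

    # 3. Train an SVM on the histograms (labels: whitefly vs natural enemy)
    X = np.array([bovw_histogram(p) for p in train_patches])
    clf = SVC(kernel="rbf").fit(X, train_labels)
    ```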
  • Item
    An automated field phenotyping pipeline for application in grapevine research
    (Basel : MDPI, 2015) Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard
    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precise phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency of plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data for two important fruit traits, berry size and color, within a large set of plants. The PHENObot thus represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
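    BIVcolor itself is not described in detail here; as a rough illustration of image-based berry trait extraction, the sketch below segments berry-like blobs by color and reports their sizes and positions. All thresholds and the file name are illustrative assumptions, not BIVcolor's actual parameters.

    ```python
    import cv2

    # Illustrative color-threshold segmentation for berries in a vine image;
    # HSV bounds, minimum area and file name are placeholder assumptions.
    img = cv2.imread("vine_row.jpg")
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))  # greenish berries

    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    for label in range(1, n):                  # label 0 is the background
        area = stats[label, cv2.CC_STAT_AREA]
        if area > 50:                          # skip noise blobs
            x, y = centroids[label]
            print(f"berry candidate at ({x:.0f}, {y:.0f}), area = {area} px")
    ```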
  • Item
    Publisher Correction: Rapid and low-cost insect detection for analysing species trapped on yellow sticky traps
    (London : Nature Publishing Group, 2021) Böckmann, Elias; Pfaff, Alexander; Schirrmann, Michael; Pflanz, Michael
    Correction to: Scientific Reports https://doi.org/10.1038/s41598-021-89930-w, published online 17 May 2021
  • Item
    Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops
    (Basel : MDPI AG, 2021) de Camargo, Tibor; Schirrmann, Michael; Landwehr, Niels; Dammer, Karl-Heinz; Pflanz, Michael
    Weed maps should be available quickly, reliably, and in high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system while keeping the same features of the ResNet-18 model as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flight operation. Optimization was achieved mainly by avoiding the redundant computations that arise when a classification model is applied to overlapping tiles of a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at species level for the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to an area output of about 1.78 ha·h⁻¹ for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but speed performance declined strongly, especially when a higher number of filters was used in the ResNet-18 model. Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields.
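    One standard way to remove tile redundancy, sketched below, is to make the classifier fully convolutional: replacing ResNet-18's pooled fully connected head with an equivalent 1×1 convolution lets one forward pass over the full image produce a dense class map. This is a generic illustration of the idea, not the authors' exact pipeline; the class count is an assumption.

    ```python
    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    n_classes = 6  # e.g. crop, soil, four weed species (illustrative)
    net = resnet18(num_classes=n_classes)

    # Fold the fully connected head into an equivalent 1x1 convolution
    head = nn.Conv2d(net.fc.in_features, n_classes, kernel_size=1)
    head.weight.data = net.fc.weight.data[:, :, None, None].clone()
    head.bias.data = net.fc.bias.data.clone()

    # Drop avgpool + fc, keep the convolutional feature extractor
    dense = nn.Sequential(*list(net.children())[:-2], head)

    with torch.no_grad():
        logits = dense(torch.rand(1, 3, 1024, 1024))  # whole image, one pass
    print(logits.shape)  # one prediction cell per 32x32 input region
    ```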
  • Item
    Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery
    (Basel : MDPI, 2016) Schirrmann, Michael; Giebel, Antje; Gleiniger, Franziska; Pflanz, Michael; Lentschke, Jan; Dammer, Karl-Heinz
    Monitoring the dynamics in wheat crops requires near-term observations with high spatial resolution due to the complex factors influencing wheat growth variability. We studied the prospects for monitoring the biophysical parameters and nitrogen status of wheat crops with low-cost imagery acquired from unmanned aerial vehicles (UAVs) over an 11 ha field. Flight missions were conducted at approximately 50 m altitude with a commercial copter and camera system; three missions were performed between booting and maturing of the wheat plants and one mission after tillage. Ultra-high-resolution orthoimages of 1.2 cm·px⁻¹ and surface models were generated for each mission from the standard red, green and blue (RGB) aerial images. Image variables, e.g., RGB ratios, crop coverage and plant height, were extracted from image tone and the surface models. During each mission, 20 plots with 1 × 1 m² sample support were selected within the wheat canopy, and the leaf area index, plant height, fresh and dry biomass and nitrogen concentrations were measured. From the generated UAV imagery, we were able to follow the changes in early senescence at the individual plant level in the wheat crops. The pattern of the wheat canopy varied drastically from one mission to the next, which supports the need for instantaneous observations, as delivered by UAV imagery. The correlations between the biophysical parameters and image variables were highly significant during each mission, and regression models calculated with the principal components of the image variables yielded R² values between 0.70 and 0.97. In contrast, the models of the nitrogen concentrations yielded low R² values, with the best model obtained at flowering (R² = 0.65). The nitrogen nutrition index was calculated with an accuracy of 0.10 to 0.11 NNI for each mission. For all models, information from the surface models and image tone was important. We conclude that low-cost RGB UAV imagery will strongly aid farmers in observing biophysical characteristics, but it is limited for observing the nitrogen status within wheat crops.
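    The regression-on-principal-components step can be expressed compactly as a pipeline. Below is a minimal sketch with scikit-learn; the file names, number of components and the choice of dry biomass as the target are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    # X: per-plot image variables (RGB ratios, crop coverage, plant height
    # from the surface model); y: a measured biophysical parameter.
    X = np.loadtxt("image_variables.csv", delimiter=",")  # shape (20, n_vars)
    y = np.loadtxt("dry_biomass.csv")                     # shape (20,)

    model = make_pipeline(PCA(n_components=3), LinearRegression())
    model.fit(X, y)
    print("R^2 =", model.score(X, y))
    ```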
  • Item
    Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry
    (Basel : MDPI, 2020) Hobart, Marius; Pflanz, Michael; Weltzien, Cornelia; Schirrmann, Michael
    In apple cultivation, spatial information about the phenotypic characteristics of tree walls would be beneficial for precise orchard management. Unmanned aerial vehicles (UAVs) can collect 3D structural information on ground surface objects at high resolution in a cost-effective and versatile way by using photogrammetry. The aim of this study is to delineate tree wall height information in an apple orchard by applying a low-altitude flight pattern specifically designed for UAVs. This flight pattern implies small distances between the camera sensor and the tree walls, with the camera positioned in an oblique view toward the trees. In this way, it is ensured that the depicted tree crown wall area is largely covered at a finer ground sampling distance than would be recorded from a nadir perspective, especially for the lower crown sections. Overlapping oblique-view images were used to estimate 3D point cloud models with structure-from-motion (SfM) methods, from which tree wall heights were calculated. The resulting height models were compared with ground-based light detection and ranging (LiDAR) data as a reference. The tree wall profiles from the UAV point clouds were strongly correlated with those from the LiDAR point clouds in both years (2018: R² = 0.83; 2019: R² = 0.88). However, an underestimation of tree wall heights was detected, with mean deviations of −0.11 m and −0.18 m for 2018 and 2019, respectively. This is attributed to the weakness of the UAV point clouds in resolving the very fine shoots of apple trees. The approach shown is therefore suitable for precise orchard management, but its underestimation of vertical tree wall extents and widening of tree gaps need to be accounted for.
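    To make the height delineation step tangible, the sketch below derives a tree wall height profile from a 3D point cloud by taking the highest point in each along-row bin. The array layout (x along the row, z as height above ground), the file name and the 0.25 m bin width are assumptions for illustration; the study's own extraction procedure may differ.

    ```python
    import numpy as np

    # Point cloud of one tree row (from SfM or LiDAR), columns x, y, z;
    # x runs along the row and z is height above ground (assumed layout).
    points = np.load("row_points.npy")        # shape (n, 3)

    bin_width = 0.25                          # metres along the row
    bins = np.floor(points[:, 0] / bin_width).astype(int)

    # Highest point per bin approximates the local tree wall height
    profile = {}
    for b, z in zip(bins, points[:, 2]):
        profile[b] = max(profile.get(b, -np.inf), z)

    for b in sorted(profile):
        print(f"{b * bin_width:6.2f} m along row: height {profile[b]:.2f} m")
    ```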
  • Item
    Regression kriging for improving crop height models fusing ultra-sonic sensing with UAV imagery
    (Basel : MDPI, 2017) Schirrmann, Michael; Hamdorf, André; Giebel, Antje; Gleiniger, Franziska; Pflanz, Michael; Dammer, Karl-Heinz
    A crop height model (CHM) can be an important element of the decision-making process in agriculture, because it relates well to many agronomic parameters, e.g., crop height, plant biomass or crop yield. Today, CHMs can be obtained inexpensively from overlapping imagery captured from unmanned aerial vehicle (UAV) platforms or from proximal sensors attached to the ground-based vehicles used for regular management. Both approaches have their limitations, and combining them through data fusion may overcome some of them. The objective of this study was therefore to investigate whether regression kriging, as a geostatistical data fusion approach, can improve the interpolation of ground-based ultrasonic measurements with UAV imagery as a covariate. Regression kriging might be suitable here because we have a sparse data set (ultrasound) and an exhaustive data set (UAV), and both data sets have favorable properties for geostatistical analysis. To confirm this, we conducted four missions in two different fields, collecting UAV imagery and ultrasonic data side by side. From the overlapping UAV images, surface models and ortho-images were generated by photogrammetric processing. The maps generated by regression kriging were of much higher detail than the smooth maps generated by ordinary kriging, because regression kriging ensures that, at each prediction point, information from the UAV imagery is incorporated. The relationships with crop height, fresh biomass and, to a lesser extent, crop yield were stronger for CHMs generated by regression kriging than by ordinary kriging. The use of UAV data from a prior mission was also beneficial and could improve map accuracy and quality. Thus, regression kriging is a flexible approach for integrating UAV imagery with ground-based sensor data, with benefits for precision-agriculture-oriented farmers and agricultural service providers.
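    Regression kriging splits the prediction into a regression trend on the covariate plus ordinary kriging of the residuals. The sketch below spells this out with scikit-learn and PyKrige; all array names are assumed inputs standing in for the study's ultrasonic points, UAV covariate and prediction grid.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from pykrige.ok import OrdinaryKriging

    # Assumed inputs:
    #   us_x, us_y  - coordinates of the sparse ultrasonic measurements
    #   us_h        - ultrasonic crop heights at those points
    #   uav_at_us   - UAV covariate (surface model height) at those points
    #   grid_x/y    - 1D prediction grid axes; uav_grid - covariate on grid

    # 1. Regress the sparse target on the exhaustive covariate (the trend)
    trend = LinearRegression().fit(uav_at_us.reshape(-1, 1), us_h)
    residuals = us_h - trend.predict(uav_at_us.reshape(-1, 1))

    # 2. Ordinary kriging of the regression residuals
    ok = OrdinaryKriging(us_x, us_y, residuals, variogram_model="spherical")
    res_grid, _ = ok.execute("grid", grid_x, grid_y)

    # 3. Final crop height model = trend from UAV imagery + kriged residuals
    chm = trend.predict(uav_grid.reshape(-1, 1)).reshape(res_grid.shape) + res_grid
    ```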
  • Item
    Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier
    (Basel : MDPI, 2018) Pflanz, Michael; Nordmeyer, Henning; Schirrmann, Michael
    Weed detection in aerial images is a great challenge for generating field maps for site-specific plant protection. The requirements might be met by low-altitude flights of unmanned aerial vehicles (UAVs), which provide ground resolutions adequate for accurately differentiating even single weeds. The present study proposed and tested an image classifier based on a Bag of Visual Words (BoVW) framework for mapping weed species, using a small unmanned aircraft system (UAS) with a commercial camera on board at low flying altitudes. The image classifier was trained with support vector machines after building a visual dictionary of local features from many collected UAS images. Window-based processing of the models was used for mapping the weed occurrences in the UAS imagery, as sketched below. The UAS flight campaign was carried out over a weed-infested wheat field, and images were acquired at flight altitudes between 1 and 6 m. From the UAS images, 25,452 weed plants were annotated at species level, along with wheat and soil as background classes, for training and validation of the models. The results showed that the BoVW model allowed the discrimination of single plants with high accuracy for Matricaria recutita L. (88.60%), Papaver rhoeas L. (89.08%), Viola arvensis M. (87.93%), and winter wheat (94.09%) within the generated maps. Regarding site-specific weed control, the classified UAS images would enable the selection of the right herbicide based on the distribution of the predicted weed species.
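    The window-based mapping step referenced above can be sketched as a sliding window that lets a trained classifier label each cell of the map. `classify(patch)` is a hypothetical stand-in for the trained BoVW+SVM model (see the sketch under the whitefly study above); window size and step are illustrative.

    ```python
    import numpy as np

    def weed_map(image, classify, win=128, step=64):
        """Slide a window over a UAS image and classify each cell.

        `classify` maps an image patch to a label such as "M. recutita",
        "wheat" or "soil" (hypothetical trained model, not from the paper).
        """
        h, w = image.shape[:2]
        rows = (h - win) // step + 1
        cols = (w - win) // step + 1
        labels = np.empty((rows, cols), dtype=object)
        for i in range(rows):
            for j in range(cols):
                patch = image[i * step:i * step + win,
                              j * step:j * step + win]
                labels[i, j] = classify(patch)
        return labels
    ```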