Search Results

Now showing 1 - 6 of 6
  • Item
    A Pixel-wise Segmentation Model to Identify Bur Chervil (Anthriscus caucalis M. Bieb.) Within Images from a Cereal Cropping Field
    (Berlin ; Heidelberg : Springer, 2022) Karimi, Hadi; Navid, Hossein; Dammer, Karl-Heinz
    Because herbicide applications in autumn are often insufficiently effective, bur chervil (Anthriscus caucalis M. Bieb.) is frequently present in cereal fields in spring. A second reason for its spread is the warmer winters in Europe due to climate change: this weed continues to germinate from autumn to spring. To prevent further spreading, site-specific control in spring is reasonable. Color imagery would offer cheap and complete monitoring of entire fields. In this study, an end-to-end fully convolutional network approach is presented to detect bur chervil within color images. The dataset consisted of images taken at three sampling dates in spring 2018 in winter wheat and at one date in 2019 in winter rye from the same field. Pixels representing bur chervil were manually annotated in all images. After random image augmentation, a U-Net-based convolutional neural network model was trained on 560 (80%) of the sub-images from 2018 (training images). The performance of the trained model at the three sampling dates in 2018 was evaluated on 141 (20%) of the manually annotated sub-images from 2018 and on all (100%) sub-images from 2019 (test images). Comparing the estimated and the manually annotated weed plants in the test images, the Intersection over Union (Jaccard index) showed mean values in the range of 0.9628 to 0.9909 for the three sampling dates in 2018 and a value of 0.9292 for the one date in 2019. The Dice coefficients yielded mean values in the range of 0.9801 to 0.9954 for 2018 and a value of 0.9605 in 2019.
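The two overlap metrics reported above (Jaccard index and Dice coefficient) can be illustrated with a minimal sketch, assuming binary NumPy masks for prediction and annotation; the function name and mask shapes are illustrative, not taken from the paper:

```python
import numpy as np

def iou_and_dice(pred: np.ndarray, truth: np.ndarray):
    """Jaccard index (IoU) and Dice coefficient for two binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    total = pred.sum() + truth.sum()
    # Define both metrics as 1.0 when both masks are empty (perfect agreement).
    iou = intersection / union if union else 1.0
    dice = 2 * intersection / total if total else 1.0
    return float(iou), float(dice)
```

Note that Dice = 2·IoU/(1+IoU), so the paper's higher Dice values are consistent with its IoU values for the same masks.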
  • Item
    Methods for Recognition of Colorado Beetle (Leptinotarsa decemlineata (Say)) with Multispectral and Color Camera-sensors
    (Berlin ; Heidelberg : Springer, 2022) Dammer, Karl-Heinz
    At the beginning of an epidemic, the Colorado beetle occurs sparsely on a few potato plants in the field. Target-oriented crop protection applies insecticides only on infested plants. This requires complete monitoring of the whole field, which can be done by camera sensors attached to tractors or unmanned aerial vehicles (UAVs). The gathered images have to be analyzed with appropriate classification methods, preferably in real time, to recognize the different stages of the beetle with high precision. This paper presents the methodology of applying one multispectral and three commercially available color (RGB) cameras, together with results from field tests on recognizing the development stages of the beetle over the vegetation period of the potato crop. Compared with multispectral cameras, color cameras are low-cost. The use of artificial neural networks for classifying the larvae within the RGB images is discussed. The eggs are deposited on the underside of the potato leaves, so sensor-based monitoring from above the crop canopy cannot detect the eggs or the hatching first instar. The ATB developed a camera-equipped vertical sensor for scanning the underside of the leaves. This provides a time advantage for the farmer's spray decision (e.g. planning of machine employment, purchase of insecticides). Example images and a possible future use of the presented monitoring methods above and below the crop surface are presented and discussed.
  • Item
    Early Detection of Stripe Rust in Winter Wheat Using Deep Residual Neural Networks
    (Lausanne : Frontiers Media, 2021) Schirrmann, Michael; Landwehr, Niels; Giebel, Antje; Garz, Andreas; Dammer, Karl-Heinz
    Stripe rust (Pst) is a major disease of wheat crops that, if untreated, leads to severe yield losses. The use of fungicides is often essential to control Pst when sudden outbreaks are imminent. Sensors capable of detecting Pst in wheat crops could optimize the use of fungicides and improve disease monitoring in high-throughput field phenotyping. Deep learning provides new tools for image recognition and may pave the way for new camera-based sensors that can identify symptoms in the early stages of a disease outbreak within the field. The aim of this study was to train an image classifier to detect Pst symptoms in winter wheat canopies based on a deep residual neural network (ResNet). For this purpose, a large annotation database was created from images taken by a standard RGB camera mounted on a platform at a height of 2 m. Images were acquired while the platform was moved over a randomized field experiment with Pst-inoculated and Pst-free plots of winter wheat. The image classifier was trained with 224 × 224 px patches tiled from the original, unprocessed camera images and was tested on different stages of the disease outbreak. At patch level, the classifier reached a total accuracy of 90%. At image level, it was evaluated with a sliding window using a large stride of 224 px, allowing for fast test performance, and reached a total accuracy of 77%. Even at a stage with very low disease spread (0.5%) at the very beginning of the Pst outbreak, a detection accuracy of 57% was obtained; in the initial phase of the outbreak, with 2 to 4% disease spread, a detection accuracy of 76% was attained. With further optimizations, the image classifier could be implemented in embedded systems and deployed on drones, vehicles, or scanning systems for fast mapping of Pst outbreaks.
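The image-level evaluation described above slides a 224 px window with a stride equal to the patch size, i.e. non-overlapping tiles. A minimal sketch of such a tiling grid (function and parameter names are illustrative, not from the paper):

```python
def tile_positions(img_w: int, img_h: int, patch: int = 224, stride: int = 224):
    """Top-left corners of sliding-window patches over an image.

    With stride == patch the windows do not overlap, which is what makes
    the image-level pass fast: each pixel is classified exactly once.
    """
    return [(x, y)
            for y in range(0, img_h - patch + 1, stride)
            for x in range(0, img_w - patch + 1, stride)]
```

For a 448 × 448 px image this yields a 2 × 2 grid of patches; a smaller stride would trade speed for denser, overlapping coverage.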
  • Item
    Preliminary tests of an optoelectronic in-canopy sensor for evaluation of vertical disease infection in cereals
    (New York, NY : Wiley, 2022) Dammer, Karl-Heinz; Schirrmann, Michael
    BACKGROUND: Health scouting of crops by satellite, airplane, unmanned aerial vehicle (UAV), and ground vehicle can only evaluate the crop from above. The visible leaves may show no disease symptoms, while lower, older leaves not visible from above may do so. A tractor-carried mobile in-canopy sensor was developed to detect diseases in cereal crops. Photodiodes measure the laterally reflected light in the red and infrared wavelength ranges at 10 different vertical heights. RESULTS: Significant differences in the vegetation index NDVI occurred between infected (stripe rust: 2018, 2019 / leaf rust: 2020) and control plots at sensor levels operated inside and near the winter wheat canopy. The differences were not significant at sensor levels operated far above the canopy. CONCLUSIONS: Lateral reflectance measurements inside the crop canopy can distinguish between disease-infected and healthy crops. In the future, mobile in-canopy scouting could extend the common above-canopy scouting practice for making spraying decisions by farmers or decision support systems. © 2021 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
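The NDVI compared between infected and control plots is computed from the red and near-infrared reflectance the photodiodes measure. A standard formulation (variable names are illustrative):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Ranges from -1 to 1; dense healthy vegetation reflects strongly in NIR
    and absorbs red light, pushing NDVI towards 1.
    """
    return (nir - red) / (nir + red)
```

A disease-stressed leaf layer typically lowers NIR reflectance and raises red reflectance, which is why per-level NDVI can separate infected from healthy canopies at the heights where symptoms occur.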
  • Item
    Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops
    (Basel : MDPI AG, 2021) de Camargo, Tibor; Schirrmann, Michael; Landwehr, Niels; Dammer, Karl-Heinz; Pflanz, Michael
    Weed maps should be available quickly, reliably, and with high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system while maintaining the same features of the ResNet-18 model as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flying operation. Optimization was achieved mainly by avoiding the redundant computations that arise when a classification model is applied to overlapping tiles in a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at species level with the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to about 1.78 ha h−1 area output for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but there was a strong decline in speed performance, especially when a higher number of filters was used in the ResNet-18 model. Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields.
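The relation between the reported 2.2 frames per second and the ~1.78 ha h−1 area output is a unit conversion; a minimal sketch, where the per-frame effective ground footprint is an assumption back-calculated from those two figures rather than stated in the abstract:

```python
def area_output_ha_per_hour(fps: float, footprint_m2: float) -> float:
    """Continuous mapping throughput from inference speed.

    frames/s * m^2 of new ground per frame * 3600 s/h, converted to
    hectares (1 ha = 10,000 m^2).
    """
    return fps * footprint_m2 * 3600 / 10_000
```

At 2.2 fps, an effective footprint of about 2.25 m² of newly covered ground per frame reproduces the reported ~1.78 ha h−1; the footprint folds in flight altitude, camera field of view, and frame overlap.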