Search Results

Now showing 1 - 4 of 4
  • Item
    Methods for Recognition of Colorado Beetle (Leptinotarsa decemlineata (Say)) with Multispectral and Color Camera-sensors
    (Berlin ; Heidelberg : Springer, 2022) Dammer, Karl-Heinz
    At the beginning of an epidemic, the Colorado beetle occurs sparsely on only a few potato plants in the field. Target-orientated crop protection applies insecticides only on infested plants. This requires complete monitoring of the whole field, which can be done by camera sensors attached to tractors or unmanned aerial vehicles (UAVs). The gathered images have to be analyzed with appropriate classification methods, preferably in real time, to recognize the different stages of the beetle with high precision. This paper presents the methodology for applying one multispectral camera and three commercially available color (RGB) cameras, and the results of field tests on recognizing the development stages of the beetle over the vegetation period of the potato crop. Compared to multispectral cameras, color cameras are low-cost. The use of artificial neural networks for classifying the larvae within the RGB images is discussed. The eggs are deposited on the underside of the potato leaves, so sensor-based monitoring from above the crop canopy cannot detect the eggs or the hatching first instar. The ATB therefore developed a camera-equipped vertical sensor for scanning the underside of the leaves. This provides a time advantage for the farmer's spray decision (e.g. planning of machine deployment, purchase of insecticides). Example images and a possible future use of the presented monitoring methods above and below the crop surface are presented and discussed.
  • Item
    A Pixel-wise Segmentation Model to Identify Bur Chervil (Anthriscus caucalis M. Bieb.) Within Images from a Cereal Cropping Field
    (Berlin ; Heidelberg : Springer, 2022) Karimi, Hadi; Navid, Hossein; Dammer, Karl-Heinz
    Because herbicide applications in autumn are often insufficiently effective, bur chervil (Anthriscus caucalis M. Bieb.) is often present in cereal fields in spring. A second reason for its spread is the warmer winters in Europe due to climate change; this weed continues to germinate from autumn to spring. To prevent further spreading, site-specific control in spring is reasonable. Color imagery would offer cheap and complete monitoring of entire fields. In this study, an end-to-end fully convolutional network approach is presented to detect bur chervil within color images. The dataset consisted of images taken at three sampling dates in spring 2018 in winter wheat and at one date in 2019 in winter rye from the same field. Pixels representing bur chervil were manually annotated in all images. After random image augmentation, a Unet-based convolutional neural network model was trained on 560 (80%) of the sub-images from 2018 (training images). The performance of the trained model at the three sampling dates in 2018 was evaluated on 141 (20%) of the manually annotated sub-images from 2018 and on all (100%) sub-images from 2019 (test images). Comparing the estimated and the manually annotated weed plants in the test images, the Intersection over Union (Jaccard index) showed mean values in the range of 0.9628 to 0.9909 for the three sampling dates in 2018, and a value of 0.9292 for the one date in 2019. The Dice coefficients yielded mean values in the range of 0.9801 to 0.9954 for 2018 and a value of 0.9605 in 2019.
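The IoU (Jaccard index) and Dice coefficient reported in this abstract can be sketched for binary segmentation masks as follows; this is a minimal NumPy illustration of the metric definitions, not code from the paper, and the array names are illustrative:

```python
import numpy as np

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over Union (Jaccard index) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice coefficient for binary masks; equals 2*IoU / (1 + IoU)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2 * inter / total if total else 1.0
```

Both metrics range from 0 (no overlap) to 1 (perfect overlap), which is why the Dice values above are consistently higher than the corresponding IoU values.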
  • Item
    Improving Deep Learning-based Plant Disease Classification with Attention Mechanism
    (Berlin ; Heidelberg : Springer, 2022) Alirezazadeh, Pendar; Schirrmann, Michael; Stolzenburg, Frieder
    In recent years, deep learning-based plant disease classification has been widely developed. However, it is challenging to collect sufficient annotated image data to effectively train deep learning models for plant disease recognition. The attention mechanism in deep learning helps the model focus on informative data segments and extract discriminative features of the inputs to enhance training performance. This paper investigates the Convolutional Block Attention Module (CBAM), a lightweight attention module that can be plugged into any CNN architecture with negligible overhead, to improve classification with CNNs. Specifically, CBAM is applied to the output feature map of CNNs to highlight important local regions and extract more discriminative features. Well-known CNN models (i.e. EfficientNetB0, MobileNetV2, ResNet50, InceptionV3, and VGG19) were applied for transfer learning in plant disease classification and then fine-tuned on a publicly available plant disease dataset of foliar diseases in pear trees called DiaMOS Plant. Amongst others, this dataset contains 3006 images of leaves affected by different stress symptoms. Among the tested CNNs, EfficientNetB0 showed the best performance. EfficientNetB0+CBAM outperformed EfficientNetB0 and obtained 86.89% classification accuracy. Experimental results show the effectiveness of the attention mechanism in improving the recognition accuracy of pre-trained CNNs when training data are scarce.
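The CBAM mechanism described above (channel attention followed by spatial attention on a feature map) can be sketched in plain NumPy; this is a minimal illustration with random weights, assuming a (C, H, W) feature map, and is not the authors' implementation (in a trained network the MLP and convolution weights are learned):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(feat, reduction=4, kernel=7):
    """Apply CBAM (channel attention, then spatial attention) to a (C, H, W) map."""
    C, H, W = feat.shape
    # Channel attention: shared MLP over avg- and max-pooled channel descriptors.
    w1 = rng.standard_normal((C // reduction, C)) * 0.1   # C -> C/r
    w2 = rng.standard_normal((C, C // reduction)) * 0.1   # C/r -> C
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)          # ReLU in between
    avg_c = feat.mean(axis=(1, 2))                        # (C,)
    max_c = feat.max(axis=(1, 2))                         # (C,)
    m_c = sigmoid(mlp(avg_c) + mlp(max_c))                # (C,) channel weights
    feat = feat * m_c[:, None, None]
    # Spatial attention: kxk convolution over channel-wise avg and max maps.
    stacked = np.stack([feat.mean(axis=0), feat.max(axis=0)])   # (2, H, W)
    w_conv = rng.standard_normal((2, kernel, kernel)) * 0.1
    pad = kernel // 2
    xp = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    m_s = np.empty((H, W))
    for i in range(H):                                    # naive 'same' convolution
        for j in range(W):
            m_s[i, j] = np.sum(xp[:, i:i + kernel, j:j + kernel] * w_conv)
    return feat * sigmoid(m_s)[None]                      # (H, W) spatial weights

out = cbam(rng.standard_normal((8, 16, 16)))
```

Because the output has the same shape as the input, the module can be inserted after any convolutional block, which is what makes the overhead negligible.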
  • Item
    Deep Learning Object Detection for Image Analysis of Cherry Fruit Fly (Rhagoletis cerasi L.) on Yellow Sticky Traps
    (Berlin ; Heidelberg : Springer, 2022) Salamut, Christian; Kohnert, Iris; Landwehr, Niels; Pflanz, Michael; Schirrmann, Michael; Zare, Mohammad
    Insect populations appear with a high spatial, temporal and type-specific diversity in orchards. One of the many monitoring tools for pest management is the manual assessment of sticky traps. However, this type of assessment is laborious and time-consuming, so that only a few locations in an orchard can be controlled. The aim of this study is to test state-of-the-art object detection algorithms from deep learning to automatically detect cherry fruit flies (Rhagoletis cerasi), a common insect pest in cherry plantations, within images from yellow sticky traps. An image annotation database was built from images of yellow sticky traps with more than 1600 annotated cherry fruit flies. For better handling in the computational algorithms, the images were augmented and reduced to smaller ones using the standard image preparation methods “flipping” and “cropping” before the deep learning was performed. Five deep learning image recognition models were tested, including Faster Region-based Convolutional Neural Network (R-CNN) with two different pretraining methods, Single Shot Detector (SSD), RetinaNet, and You Only Look Once version 5 (YOLOv5). The R‑CNN and RetinaNet models outperformed the others with a detection average precision of 0.9. The results indicate that deep learning can act as an integral component of an automated system for high-throughput assessment of pest insects in orchards. Such a system can not only reduce the time spent on repetitive and laborious trap assessment but also increase the number of sticky traps that can be observed.
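The detection average precision (AP) used to compare the models above can be sketched as follows; this is a simplified single-class illustration (greedy IoU matching, all-point precision-recall integration without the precision envelope), not the paper's evaluation code, and the box format (x1, y1, x2, y2) is an assumption:

```python
import numpy as np

def box_iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def average_precision(preds, truths, iou_thr=0.5):
    """preds: list of (score, box); truths: list of ground-truth boxes.
    Match predictions to ground truth greedily by descending score, then
    integrate precision over recall."""
    preds = sorted(preds, key=lambda p: -p[0])
    matched = set()
    tp, fp = np.zeros(len(preds)), np.zeros(len(preds))
    for i, (_, box) in enumerate(preds):
        best, best_j = 0.0, -1
        for j, gt in enumerate(truths):
            if j in matched:
                continue
            ov = box_iou(box, gt)
            if ov > best:
                best, best_j = ov, j
        if best >= iou_thr:
            tp[i] = 1; matched.add(best_j)   # true positive: matched a new GT
        else:
            fp[i] = 1                        # false positive: no GT match
    cum_tp, cum_fp = np.cumsum(tp), np.cumsum(fp)
    recall = cum_tp / len(truths)
    precision = cum_tp / (cum_tp + cum_fp)
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precision, recall):      # area under the P-R curve
        ap += p * (r - prev_r); prev_r = r
    return ap
```

An AP of 0.9, as reported for R-CNN and RetinaNet, thus summarizes both how many flies are found (recall) and how many detections are genuine (precision) in a single number.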