Search Results

Now showing 1 - 7 of 7

Climate change impacts on European arable crop yields: Sensitivity to assumptions about rotations and residue management

2022, Faye, Babacar, Webber, Heidi, Gaiser, Thomas, Müller, Christoph, Zhang, Yinan, Stella, Tommaso, Latka, Catharina, Reckling, Moritz, Heckelei, Thomas, Helming, Katharina, Ewert, Frank

Most large-scale studies assessing climate change impacts on crops are performed with simulations of single crops and with annual re-initialization of the soil conditions. This contrasts with the reality that crops are grown in rotations, often with a sizable proportion of the preceding crop's residue left in the field, and with soil initial conditions varying from year to year. In this study, the sensitivity of climate change impacts on crop yield and soil organic carbon to assumptions about annual model re-initialization, the specification of crop rotations and the amount of residue retained in the field was assessed for seven main crops across Europe. Simulations were conducted for a scenario period 2040–2065 relative to a baseline from 1980 to 2005 using the SIMPLACE framework. Results indicated positive climate change impacts on yield for C3 crops and negative impacts for maize across Europe. Simulating rotations did not affect yield variability, but it did affect the relative yield change in response to climate change, which slightly increased for C3 crops and decreased for C4 crops when rotations were considered. Soil organic carbon decreased under climate change in simulations assuming both a continuous monocrop and plausible rotations, by between 1% and 2% depending on the residue management strategy.
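
As a minimal illustration of how such a comparison can be quantified, the following Python sketch computes the relative change of a scenario-period mean against a baseline-period mean; the function name and the yield values are hypothetical and not taken from the study.

```python
import numpy as np

def relative_change(baseline, scenario):
    """Relative change (%) of the scenario-period mean vs. the baseline-period mean."""
    return 100.0 * (np.mean(scenario) - np.mean(baseline)) / np.mean(baseline)

# Hypothetical simulated annual yields (t/ha), e.g. baseline 1980-2005 vs. scenario 2040-2065
baseline_yield = np.array([6.1, 5.8, 6.4, 6.0, 6.2])
scenario_yield = np.array([6.5, 6.3, 6.6, 6.4, 6.7])

print(f"Relative yield change: {relative_change(baseline_yield, scenario_yield):+.1f}%")
```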

Improving Deep Learning-based Plant Disease Classification with Attention Mechanism

2022, Alirezazadeh, Pendar, Schirrmann, Michael, Stolzenburg, Frieder

In recent years, deep learning-based plant disease classification has been widely developed. However, it is challenging to collect sufficient annotated image data to effectively train deep learning models for plant disease recognition. The attention mechanism in deep learning helps a model focus on the informative parts of the data and extract discriminative features from its inputs to enhance training performance. This paper investigates the Convolutional Block Attention Module (CBAM), a lightweight attention module that can be plugged into any CNN architecture with negligible overhead, to improve classification with CNNs. Specifically, CBAM is applied to the output feature map of a CNN to highlight important local regions and extract more discriminative features. Well-known CNN models (EfficientNetB0, MobileNetV2, ResNet50, InceptionV3, and VGG19) were used for transfer learning for plant disease classification and then fine-tuned on DiaMOS Plant, a publicly available dataset of foliar diseases in pear trees. Among other data, this dataset contains 3006 images of leaves affected by different stress symptoms. Among the tested CNNs, EfficientNetB0 showed the best performance, and EfficientNetB0+CBAM outperformed EfficientNetB0, reaching 86.89% classification accuracy. The experimental results show the effectiveness of the attention mechanism in improving the recognition accuracy of pre-trained CNNs when training data are scarce.
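
A minimal Keras sketch of the general idea, assuming an ImageNet-pretrained EfficientNetB0 backbone with CBAM attached to its output feature map; the reduction ratio, kernel size and number of classes are placeholders, not the configuration used in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def cbam_block(x, reduction=8, kernel_size=7):
    """Minimal CBAM: channel attention followed by spatial attention."""
    channels = x.shape[-1]

    # Channel attention: shared MLP over global average- and max-pooled descriptors
    dense_1 = layers.Dense(channels // reduction, activation="relu")
    dense_2 = layers.Dense(channels)
    avg_desc = dense_2(dense_1(layers.GlobalAveragePooling2D()(x)))
    max_desc = dense_2(dense_1(layers.GlobalMaxPooling2D()(x)))
    channel_att = layers.Activation("sigmoid")(layers.Add()([avg_desc, max_desc]))
    x = layers.Multiply()([x, layers.Reshape((1, 1, channels))(channel_att)])

    # Spatial attention: convolution over concatenated channel-wise mean and max maps
    avg_map = tf.reduce_mean(x, axis=-1, keepdims=True)
    max_map = tf.reduce_max(x, axis=-1, keepdims=True)
    spatial_att = layers.Conv2D(1, kernel_size, padding="same", activation="sigmoid")(
        layers.Concatenate()([avg_map, max_map]))
    return layers.Multiply()([x, spatial_att])

num_classes = 4  # placeholder: set to the number of disease classes in the dataset used
backbone = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
features = cbam_block(backbone.output)             # CBAM on the output feature map
pooled = layers.GlobalAveragePooling2D()(features)
model = models.Model(backbone.input, layers.Dense(num_classes, activation="softmax")(pooled))
```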

Establishment of a Laboratory Scale Set-Up with Controlled Temperature and High Humidity to Investigate Dry Matter Losses of Wood Chips from Poplar during Storage

2022, Hernandez-Estrada, Albert, Pecenka, Ralf, Dumfort, Sabrina, Ascher-Jenull, Judith, Lenz, Hannes, Idler, Christine, Hoffmann, Thomas

The aim of this work was to improve the understanding of the dry matter losses (DML) that occur in wood chips during the initial phase of storage in outdoor piles. For this purpose, a laboratory-scale storage chamber was developed and investigated with regard to its ability to recreate the conditions that chips undergo during the initial phase of outdoor storage. Three trials with poplar Max-4 (Populus maximowiczii Henry × Populus nigra L.) chips were performed for 6–10 weeks in the storage chamber under controlled temperature and high humidity. Two different setups were investigated to maintain a high relative humidity (RH) inside the storage chamber: one using water containers and one assisted by a humidifier. The moisture content (MC) and DML of the chips were measured at different storage times to evaluate their storage behaviour in the chamber. Additionally, in two trials, microbiological analyses of the culturable fraction of the saproxylic microbiota were performed, focusing on mesophilic fungi but also discriminating xerophilic fungi and mesophilic bacteria, with a focus on actinobacteria, to characterize the poplar wood chip-inhabiting microorganisms as a function of the storage conditions (moisture, temperature) and time. The results show that DML of 8.8–13.7% occurred in the chips within 6–10 weeks of storage. The maximum DML was reached in the trial using the humidifier, which appears to be a suitable technique for maintaining a high RH in the test chamber and thus for analysing wood chips under conditions comparable to those in outdoor piles during the initial storage phase.
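
The abstract does not state how DML was calculated; a common wet-basis formulation is sketched below, with purely hypothetical sample masses and moisture contents.

```python
def dry_matter_loss(fresh_mass_in, mc_in, fresh_mass_out, mc_out):
    """Dry matter loss (%) from fresh mass (kg) and moisture content (wet basis, 0-1)."""
    dm_in = fresh_mass_in * (1.0 - mc_in)    # dry matter before storage
    dm_out = fresh_mass_out * (1.0 - mc_out)  # dry matter after storage
    return 100.0 * (dm_in - dm_out) / dm_in

# Hypothetical sample: 10.0 kg of chips at 45% MC stored, retrieved at 9.2 kg and 48% MC
print(f"DML: {dry_matter_loss(10.0, 0.45, 9.2, 0.48):.1f} %")
```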

Deep Learning Object Detection for Image Analysis of Cherry Fruit Fly (Rhagoletis cerasi L.) on Yellow Sticky Traps

2022, Salamut, Christian, Kohnert, Iris, Landwehr, Niels, Pflanz, Michael, Schirrmann, Michael, Zare, Mohammad

Insect populations appear with high spatial, temporal and species-specific diversity in orchards. One of the many monitoring tools for pest management is the manual assessment of sticky traps. However, this type of assessment is laborious and time-consuming, so that only a few locations in an orchard can be monitored. The aim of this study is to test state-of-the-art object detection algorithms from deep learning to automatically detect cherry fruit flies (Rhagoletis cerasi), a common insect pest in cherry plantations, within images from yellow sticky traps. An image annotation database was built from images of yellow sticky traps with more than 1600 annotated cherry fruit flies. For better handling by the computational algorithms, the images were split into smaller ones and augmented with the standard image preparation methods flipping and cropping before the deep learning was performed. Five deep learning image recognition models were tested, including Faster Region-based Convolutional Neural Network (Faster R-CNN) with two different pretraining methods, Single Shot Detector (SSD), RetinaNet, and You Only Look Once version 5 (YOLOv5). The Faster R-CNN and RetinaNet models outperformed the others, with a detection average precision of 0.9. The results indicate that deep learning can act as an integral component of an automated system for high-throughput assessment of pest insects in orchards. This can not only reduce the time spent on repetitive and laborious trap assessment but also increase the number of sticky traps that can be observed.
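
A minimal sketch of this kind of detector, based on the standard torchvision recipe of replacing the head of a COCO-pretrained Faster R-CNN with a two-class head (background plus cherry fruit fly); the score threshold and image size are assumptions, and this is not the study's actual training or evaluation code.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# COCO-pretrained Faster R-CNN; swap the box predictor for 2 classes
# (background + cherry fruit fly). The new head must be fine-tuned on
# annotated trap images before it yields useful detections.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

# Inference on one (hypothetical) trap image tensor in [0, 1], shape (3, H, W)
model.eval()
image = torch.rand(3, 800, 800)           # placeholder for a cropped trap image
with torch.no_grad():
    prediction = model([image])[0]        # dict with 'boxes', 'labels', 'scores'
flies = prediction["boxes"][prediction["scores"] > 0.5]
print(f"Detected {len(flies)} candidate flies")
```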

A Pixel-wise Segmentation Model to Identify Bur Chervil (Anthriscus caucalis M. Bieb.) Within Images from a Cereal Cropping Field

2022, Karimi, Hadi, Navid, Hossein, Dammer, Karl-Heinz

Because of the insufficient effectiveness of herbicide applications in autumn, bur chervil (Anthriscus caucalis M. Bieb.) is often present in cereal fields in spring. A second reason for its spread is the warmer winters in Europe due to climate change. This weed continues to germinate from autumn to spring. To prevent further spread, site-specific control in spring is reasonable. Color imagery would offer cheap and complete monitoring of entire fields. In this study, an end-to-end fully convolutional network approach is presented to detect bur chervil within color images. The dataset consisted of images taken at three sampling dates in spring 2018 in winter wheat and at one date in 2019 in winter rye from the same field. Pixels representing bur chervil were manually annotated in all images. After random image augmentation, a U-Net-based convolutional neural network model was trained using 560 (80%) of the sub-images from 2018 (training images). The performance of the trained model at the three sampling dates in 2018 was evaluated on 141 (20%) of the manually annotated sub-images from 2018 and on all (100%) sub-images from 2019 (test images). Comparing the estimated and the manually annotated weed plants in the test images, the Intersection over Union (Jaccard index) showed mean values in the range of 0.9628 to 0.9909 for the three sampling dates in 2018 and a value of 0.9292 for the one date in 2019. The Dice coefficients yielded mean values in the range of 0.9801 to 0.9954 for 2018 and a value of 0.9605 for 2019.
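
The reported Intersection over Union and Dice values can be computed from predicted and annotated binary masks as in the following sketch; the example masks are hypothetical, not taken from the study.

```python
import numpy as np

def iou_and_dice(pred, truth):
    """Jaccard index (IoU) and Dice coefficient for two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    iou = intersection / union
    dice = 2 * intersection / (pred.sum() + truth.sum())
    return iou, dice

# Hypothetical 4x4 masks: 1 = bur chervil pixel, 0 = background
pred  = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 1, 1]])
truth = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 1, 1]])
print("IoU = %.3f, Dice = %.3f" % iou_and_dice(pred, truth))
```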

Workshop on "Sensor-supported detection of pests in outdoor cultivation" at the Leibniz Institute for Agricultural Engineering and Bioeconomy in Potsdam-Bornim (ATB), May 11 and 12, 2022

2023, Dammer, Karl-Heinz

Methods for Recognition of Colorado Beetle (Leptinotarsa decemlineata (Say)) with Multispectral and Color Camera-sensors

2022, Dammer, Karl-Heinz

At the beginning of an epidemic, Colorado beetles occur sparsely on a few potato plants in the field. Target-orientated crop protection applies insecticides only to infested plants. For this, complete monitoring of the whole field is required, which can be done with camera sensors attached to tractors or unmanned aerial vehicles (UAVs). The gathered images have to be analyzed with appropriate classification methods, preferably in real time, to recognize the different stages of the beetle with high precision. In this paper, the methodology for applying one multispectral camera and three commercially available color (RGB) cameras is presented, together with the results of field tests on recognizing the development stages of the beetle over the vegetation period of the potato crop. Compared to multispectral cameras, color cameras are low-cost. The use of artificial neural networks for classifying the larvae within the RGB images is discussed. The eggs are deposited on the underside of the potato leaves, so sensor-based monitoring from above the crop canopy cannot detect the eggs or the hatching first instars. The ATB therefore developed a camera-equipped vertical sensor for scanning the underside of the leaves. This provides a time advantage for the farmer's spraying decision (e.g. planning machine deployment, purchasing insecticides). Example images and a possible future use of the presented monitoring methods above and below the crop surface are presented and discussed.
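
As an illustration of what a neural-network classifier for larvae in RGB images could look like, the sketch below freezes an ImageNet-pretrained MobileNetV2 backbone and adds a binary "larva / no larva" head; the backbone choice, input size and head are assumptions, not the setup used at the ATB.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical binary classifier for RGB image patches: "larva" vs. "no larva".
backbone = tf.keras.applications.MobileNetV2(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3), pooling="avg")
backbone.trainable = False                    # transfer learning: freeze the backbone

model = models.Sequential([
    backbone,
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # probability that a patch shows a larva
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```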