Browsing by Author "Schirrmann, Michael"
Now showing 1 - 20 of 20
- Item: Biochar research activities and their relation to development and environmental quality. A meta-analysis (Berlin ; Heidelberg : Springer, 2017-6-6). Mehmood, Khalid; Chávez Garcia, Elizabeth; Schirrmann, Michael; Ladd, Brenton; Kammann, Claudia; Wrage-Mönnig, Nicole; Siebe, Christina; Estavillo, Jose M.; Fuertes-Mendizabal, Teresa; Cayuela, Mariluz; Sigua, Gilbert; Spokas, Kurt; Cowie, Annette L.; Novak, Jeff; Ippolito, James A.; Borchard, Nils. Biochar is the solid product that results from pyrolysis of organic materials. Its addition to highly weathered soils changes physico-chemical soil properties, improves soil functions and enhances crop yields. Highly weathered soils are typical of the humid tropics, where agricultural productivity is low and needs to be raised to reduce human hunger and poverty. However, the impact of biochar research on scientists, politicians and end-users in poor tropical countries remains unknown; assessing needs and interests in biochar is essential for developing reliable knowledge transfer/translation mechanisms. The aim of this publication is to present the results of a meta-analysis conducted to (1) survey global biochar research published between 2010 and 2014 to assess its relation to human development and environmental quality, and (2) deduce, based on the results of this analysis, the priorities required to assess and promote the role of biochar in the development of adapted and sustainable agronomic methods.
Our main findings reveal for the very first time that: (1) biochar research associated with less developed countries focused on biochar production technologies (26.5 ± 0.7%), then on biochars’ impact on chemical soil properties (18.7 ± 1.2%), and on plant productivity (17.1 ± 2.6%); (2) China dominated biochar research activities among the medium developed countries focusing on biochar production technologies (26.8 ± 0.5%) and on use of biochar as sorbent for organic and inorganic compounds (29.1 ± 0.4%); and (3) the majority of biochar research (69.0 ± 2.9%) was associated with highly developed countries that are able to address a higher diversity of questions. Evidently, less developed countries are eager to improve soil fertility and agricultural productivity, which requires transfer and/or translation of biochar knowledge acquired in highly developed countries. Yet, improving local research capacities and encouraging synergies across scientific disciplines and countries are crucial to foster development of sustainable agronomy in less developed countries. © 2017, The Author(s).
- Item: Crop Monitoring Using Sentinel-2 and UAV Multispectral Imagery: A Comparison Case Study in Northeastern Germany (Basel : MDPI, 2022). Li, Minhui; Shamshiri, Redmond R.; Weltzien, Cornelia; Schirrmann, Michael. Monitoring within-field crop variability at fine spatial and temporal resolution can assist farmers in making reliable decisions during their agricultural management; however, it traditionally involves a labor-intensive and time-consuming pointwise manual process. To the best of our knowledge, few studies have compared Sentinel-2 with UAV data for crop monitoring in the context of precision agriculture. Therefore, the prospects of crop monitoring for characterizing biophysical plant parameters and leaf nitrogen of wheat and barley crops were evaluated from a practical viewpoint close to agricultural routines. Multispectral UAV and Sentinel-2 imagery was collected on three dates in the season and compared with reference data collected at 20 sample points for plant leaf nitrogen (N), maximum plant height, mean plant height, leaf area index (LAI), and fresh biomass. On average, UAV data correlated more strongly with the agronomic parameters than Sentinel-2 data, with a percentage increase of 6.3% for wheat and 22.2% for barley. In this regard, VIs calculated from spectral bands in the visible part of the spectrum performed worse for Sentinel-2 than for the UAV data. In addition, large-scale patterns, formed by the influence of an old riverbed on plant growth, were recognizable even in the Sentinel-2 imagery despite its much lower spatial resolution. Interestingly, smaller features, such as the tramlines from controlled traffic farming (CTF), also influenced the Sentinel-2 data and showed a systematic pattern that even affected semivariogram calculation.
In conclusion, Sentinel-2 imagery is able to capture the same large-scale patterns as the more detailed UAV imagery; however, it is at the same time influenced by management-driven features such as tramlines, which cannot be accurately georeferenced. In consequence, agronomic parameters were better correlated with UAV than with Sentinel-2 data. Crop growers as well as providers of remote sensing services may take advantage of this knowledge, and we recommend the use of UAV data as it gives additional information about management-driven features. As a future perspective, we would advise fusing UAV imagery taken early in the season with Sentinel-2 imagery, as this can integrate the effect of agricultural management in the subsequent absence of high-spatial-resolution data, helping to improve crop monitoring for the farmer and to reduce costs.
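The semivariogram effect of the tramlines mentioned above can be illustrated with a small numpy sketch: a 1-D empirical semivariogram over a synthetic transect with a periodic dip (the 9 px spacing is an arbitrary stand-in for the tramline distance, not a value from the study) shows characteristic dips at multiples of that spacing:

```python
import numpy as np

def empirical_semivariogram(values, max_lag):
    """Empirical semivariogram of a 1-D transect:
    gamma(h) = 0.5 * mean((z(x+h) - z(x))^2) for each lag h."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = values[h:] - values[:-h]
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

# A transect with a periodic dip every 9 pixels, mimicking tramlines:
x = np.arange(200)
z = 1.0 - 0.3 * (x % 9 == 0)
gamma = empirical_semivariogram(z, max_lag=20)
# The semivariogram oscillates: it drops to zero at lags that are
# multiples of the tramline spacing (9, 18), exposing the systematic
# pattern that a smooth crop signal alone would not produce.
```

Such periodic structure in the semivariogram is exactly the kind of management-driven artifact the abstract describes.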
- Item: Deep Learning Object Detection for Image Analysis of Cherry Fruit Fly (Rhagoletis cerasi L.) on Yellow Sticky Traps (Berlin ; Heidelberg : Springer, 2022). Salamut, Christian; Kohnert, Iris; Landwehr, Niels; Pflanz, Michael; Schirrmann, Michael; Zare, Mohammad. Insect populations appear with a high spatial, temporal and type-specific diversity in orchards. One of the many monitoring tools for pest management is the manual assessment of sticky traps. However, this type of assessment is laborious and time-consuming, so that only a few locations in an orchard can be controlled. The aim of this study is to test state-of-the-art object detection algorithms from deep learning to automatically detect cherry fruit flies (Rhagoletis cerasi), a common insect pest in cherry plantations, in images of yellow sticky traps. An image annotation database was built from images of yellow sticky traps with more than 1600 annotated cherry fruit flies. For better handling in the computational algorithms, the images were augmented into smaller ones by the standard image preparation methods “flipping” and “cropping” before performing the deep learning. Five deep learning image recognition models were tested, including Faster Region-based Convolutional Neural Network (R-CNN) with two different methods of pretraining, Single Shot Detector (SSD), RetinaNet, and You Only Look Once version 5 (YOLOv5). The R-CNN and RetinaNet models outperformed the others with a detection average precision of 0.9. The results indicate that deep learning can act as an integral component of an automated system for high-throughput assessment of pest insects in orchards. This can not only reduce the time spent on repetitive and laborious trap assessment but also increase the number of sticky traps that can be observed.
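The detection average precision reported above can in principle be computed from a confidence-ranked list of detections matched to ground truth (the matching itself, e.g. at IoU >= 0.5, is assumed to have happened already). A minimal all-point-interpolation sketch with toy data:

```python
import numpy as np

def average_precision(is_tp, n_gt):
    """Average precision from detections sorted by descending confidence.
    is_tp[i] is True if the i-th ranked detection matched a ground-truth
    box (e.g. IoU >= 0.5); n_gt is the number of ground-truth objects."""
    is_tp = np.asarray(is_tp, dtype=float)
    tp_cum = np.cumsum(is_tp)
    fp_cum = np.cumsum(1.0 - is_tp)
    recall = tp_cum / n_gt
    precision = tp_cum / (tp_cum + fp_cum)
    # All-point interpolation: make precision monotonically decreasing
    # from the end, then integrate precision over recall.
    for i in range(len(precision) - 2, -1, -1):
        precision[i] = max(precision[i], precision[i + 1])
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precision, recall):
        ap += p * (r - prev_r)
        prev_r = r
    return ap

# Toy example: 5 ranked detections, 4 ground-truth flies on the trap.
ap = average_precision([True, True, False, True, True], n_gt=4)
# ap == 0.9 for this toy ranking
```

The toy numbers are made up; they merely illustrate how an AP of 0.9 arises from a precision-recall curve.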
- Item: Early Detection of Stripe Rust in Winter Wheat Using Deep Residual Neural Networks (Lausanne : Frontiers Media, 2021). Schirrmann, Michael; Landwehr, Niels; Giebel, Antje; Garz, Andreas; Dammer, Karl-Heinz. Stripe rust (Pst) is a major disease of wheat crops that, if untreated, leads to severe yield losses. The use of fungicides is often essential to control Pst when sudden outbreaks are imminent. Sensors capable of detecting Pst in wheat crops could optimize the use of fungicides and improve disease monitoring in high-throughput field phenotyping. Deep learning now provides new tools for image recognition and may pave the way for new camera-based sensors that can identify symptoms in the early stages of a disease outbreak within the field. The aim of this study was to teach an image classifier to detect Pst symptoms in winter wheat canopies based on a deep residual neural network (ResNet). For this purpose, a large annotation database was created from images taken by a standard RGB camera mounted on a platform at a height of 2 m. Images were acquired while the platform was moved over a randomized field experiment with Pst-inoculated and Pst-free plots of winter wheat. The image classifier was trained with 224 × 224 px patches tiled from the original, unprocessed camera images, and was tested on different stages of the disease outbreak. At patch level, the image classifier reached a total accuracy of 90%. To test performance at image level, the classifier was evaluated with a sliding window using a large stride of 224 px, allowing for fast inference. At image level, the classifier reached a total accuracy of 77%. Even at a stage with very low disease spreading (0.5%) at the very beginning of the Pst outbreak, a detection accuracy of 57% was obtained. In the initial phase of the Pst outbreak, with 2 to 4% of Pst disease spreading, a detection accuracy of 76% could still be attained.
With further optimizations, the image classifier could be implemented in embedded systems and deployed on drones, vehicles or scanning systems for fast mapping of Pst outbreaks.
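The image-level evaluation described above (224 px patches, 224 px stride) can be sketched as follows; the dummy classifier and the aggregation into a symptomatic-patch fraction are illustrative stand-ins for the trained ResNet and the study's actual decision rule:

```python
import numpy as np

PATCH = 224  # patch edge length and stride, as in the study

def classify_image(image, patch_classifier, stride=PATCH):
    """Slide a patch classifier over a full camera image and return the
    fraction of patches flagged as symptomatic. `patch_classifier` maps
    a (224, 224, 3) patch to 1 (Pst symptoms) or 0 (healthy)."""
    h, w = image.shape[:2]
    flags = []
    for y in range(0, h - PATCH + 1, stride):
        for x in range(0, w - PATCH + 1, stride):
            flags.append(patch_classifier(image[y:y + PATCH, x:x + PATCH]))
    return float(np.mean(flags))

# Dummy classifier that flags a patch when its mean red channel is high,
# applied to a synthetic image with one bright quadrant:
img = np.zeros((448, 448, 3))
img[:224, :224, 0] = 1.0
frac = classify_image(img, lambda p: int(p[..., 0].mean() > 0.5))
# frac == 0.25: one of the four non-overlapping patches is flagged
```

Using the stride equal to the patch size keeps the patches non-overlapping, which is what makes the image-level pass fast.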
- Item: Global data on earthworm abundance, biomass, diversity and corresponding environmental properties (London : Nature Publ. Group, 2021). Phillips, Helen R. P.; Bach, Elizabeth M.; Bartz, Marie L. C.; Bennett, Joanne M.; Beugnon, Rémy; Briones, Maria J. I.; Brown, George G.; Ferlian, Olga; Gongalsky, Konstantin B.; Guerra, Carlos A.; König-Ries, Birgitta; López-Hernández, Danilo; Loss, Scott R.; Marichal, Raphael; Matula, Radim; Minamiya, Yukio; Moos, Jan Hendrik; Moreno, Gerardo; Morón-Ríos, Alejandro; Motohiro, Hasegawa; Muys, Bart; Krebs, Julia J.; Neirynck, Johan; Norgrove, Lindsey; Novo, Marta; Nuutinen, Visa; Nuzzo, Victoria; Mujeeb Rahman, P.; Pansu, Johan; Paudel, Shishir; Pérès, Guénola; Pérez-Camacho, Lorenzo; Orgiazzi, Alberto; Ponge, Jean-François; Prietzel, Jörg; Rapoport, Irina B.; Rashid, Muhammad Imtiaz; Rebollo, Salvador; Rodríguez, Miguel Á.; Roth, Alexander M.; Rousseau, Guillaume X.; Rozen, Anna; Sayad, Ehsan; Ramirez, Kelly S.; van Schaik, Loes; Scharenbroch, Bryant; Schirrmann, Michael; Schmidt, Olaf; Schröder, Boris; Seeber, Julia; Shashkov, Maxim P.; Singh, Jaswinder; Smith, Sandy M.; Steinwandter, Michael; Russell, David J.; Szlavecz, Katalin; Talavera, José Antonio; Trigo, Dolores; Tsukamoto, Jiro; Uribe-López, Sheila; de Valença, Anne W.; Virto, Iñigo; Wackett, Adrian A.; Warren, Matthew W.; Webster, Emily R.; Schwarz, Benjamin; Wehr, Nathaniel H.; Whalen, Joann K.; Wironen, Michael B.; Wolters, Volkmar; Wu, Pengfei; Zenkova, Irina V.; Zhang, Weixin; Cameron, Erin K.; Eisenhauer, Nico; Wall, Diana H.; Brose, Ulrich; Decaëns, Thibaud; Lavelle, Patrick; Loreau, Michel; Mathieu, Jérôme; Mulder, Christian; van der Putten, Wim H.; Rillig, Matthias C.; Thakur, Madhav P.; de Vries, Franciska T.; Wardle, David A.; Ammer, Christian; Ammer, Sabine; Arai, Miwa; Ayuke, Fredrick O.; Baker, Geoff H.; Baretta, Dilmar; Barkusky, Dietmar; Beauséjour, Robin; Bedano, Jose C.; Birkhofer, Klaus; Blanchart, Eric; Blossey, Bernd; Bolger, Thomas; Bradley, Robert L.; Brossard, Michel; Burtis, James C.; Capowiez, Yvan; Cavagnaro, Timothy R.; Choi, Amy; Clause, Julia; Cluzeau, Daniel; Coors, Anja; Crotty, Felicity V.; Crumsey, Jasmine M.; Dávalos, Andrea; Cosín, Darío J. Díaz; Dobson, Annise M.; Domínguez, Anahí; Duhour, Andrés Esteban; van Eekeren, Nick; Emmerling, Christoph; Falco, Liliana B.; Fernández, Rosa; Fonte, Steven J.; Fragoso, Carlos; Franco, André L. C.; Fusilero, Abegail; Geraskina, Anna P.; Gholami, Shaieste; González, Grizelle; Gundale, Michael J.; López, Mónica Gutiérrez; Hackenberger, Branimir K.; Hackenberger, Davorka K.; Hernández, Luis M.; Hirth, Jeff R.; Hishi, Takuo; Holdsworth, Andrew R.; Holmstrup, Martin; Hopfensperger, Kristine N.; Lwanga, Esperanza Huerta; Huhta, Veikko; Hurisso, Tunsisa T.; Iannone, Basil V.; Iordache, Madalina; Irmler, Ulrich; Ivask, Mari; Jesús, Juan B.; Johnson-Maynard, Jodi L.; Joschko, Monika; Kaneko, Nobuhiro; Kanianska, Radoslava; Keith, Aidan M.; Kernecker, Maria L.; Koné, Armand W.; Kooch, Yahya; Kukkonen, Sanna T.; Lalthanzara, H.; Lammel, Daniel R.; Lebedev, Iurii M.; Le Cadre, Edith; Lincoln, Noa K. Earthworms are an important soil taxon as ecosystem engineers, providing a variety of crucial ecosystem functions and services. Little is known about their diversity and distribution at large spatial scales, despite the availability of considerable amounts of local-scale data. Earthworm diversity data, obtained from the primary literature or provided directly by authors, were collated with information on site locations, including coordinates, habitat cover, and soil properties. Datasets were required, at a minimum, to include abundance or biomass of earthworms at a site. Where possible, site-level species lists were included, as well as the abundance and biomass of individual species and ecological groups. This global dataset contains 10,840 sites, with 184 species, from 60 countries and all continents except Antarctica.
The data were obtained from 182 published articles, published between 1973 and 2017, and 17 unpublished datasets. Amalgamating data into a single global database will assist researchers in investigating and answering a wide variety of pressing questions, for example, jointly assessing aboveground and belowground biodiversity distributions and drivers of biodiversity change.
- Item: Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry (Basel : MDPI, 2020). Hobart, Marius; Pflanz, Michael; Weltzien, Cornelia; Schirrmann, Michael. In apple cultivation, spatial information about phenotypic characteristics of tree walls would be beneficial for precise orchard management. Unmanned aerial vehicles (UAVs) can collect 3D structural information of ground surface objects at high resolution in a cost-effective and versatile way by using photogrammetry. The aim of this study is to delineate tree wall height information in an apple orchard by applying a low-altitude flight pattern specifically designed for UAVs. This flight pattern implies small distances between the camera sensor and the tree walls when the camera is positioned in an oblique view toward the trees. In this way, it is ensured that the depicted tree crown wall area is largely covered at a higher spatial resolution than recorded from a nadir perspective, especially regarding the lower crown sections. Overlapping oblique-view images were used to estimate 3D point cloud models by applying structure-from-motion (SfM) methods, from which tree wall heights were calculated. The resulting height models were compared with ground-based light detection and ranging (LiDAR) data as a reference. The tree wall profiles from the UAV point clouds were strongly correlated with the LiDAR point clouds in both years (2018: R2 = 0.83; 2019: R2 = 0.88). However, tree wall heights were underestimated, with mean deviations of −0.11 m and −0.18 m for 2018 and 2019, respectively. This is attributed to the weaknesses of the UAV point clouds in resolving the very fine shoots of apple trees. The approach is therefore suitable for precise orchard management, but the underestimation of vertical tree wall expanses and the widened tree gaps need to be accounted for.
- Item: Impact of Camera Viewing Angle for Estimating Leaf Parameters of Wheat Plants from 3D Point Clouds (Basel : MDPI, 2021). Li, Minhui; Shamshiri, Redmond R.; Schirrmann, Michael; Weltzien, Cornelia. Estimation of the plant canopy using low-altitude imagery can help monitor the normal growth status of crops and is highly beneficial for various digital farming applications such as precision crop protection. However, extracting 3D canopy information from raw images requires studying the effect of the sensor viewing angle, taking into account the limitations of the mobile platform routes inside the field. The main objective of this research was to estimate wheat (Triticum aestivum L.) leaf parameters, including leaf length and width, from the 3D model representation of the plants. For this purpose, experiments with different camera viewing angles were conducted to find the optimum setup of a mono-camera system that would result in the best 3D point clouds. The angle-control analytical study was conducted on a four-row wheat plot with a row spacing of 0.17 m, with two seeding densities and growth stages as factors. Nadir and six oblique-view image datasets were acquired from the plot with 88% overlap and were then reconstructed into point clouds using Structure from Motion (SfM) and Multi-View Stereo (MVS) methods. Point clouds were first categorized into three classes: wheat canopy, soil background, and experimental plot. The wheat canopy class was then used to extract leaf parameters, which were compared with values from manual measurements. The comparison showed that (i) the multiple-view dataset provided the best estimation of leaf length and leaf width, (ii) among the single-view datasets, canopy and leaf parameters were best modeled with angles of −45° vertically and 0° horizontally (VA −45, HA 0), while (iii) in the nadir view, fewer underlying 3D points were obtained, with a missing leaf rate of 70%.
It was concluded that oblique imagery is a promising approach to effectively estimate wheat canopy 3D representation with SfM-MVS using a single camera platform for crop monitoring. This study contributes to the improvement of the proximal sensing platform for crop health assessment. © 2021 by the authors. Licensee MDPI, Basel, Switzerland.
- Item: Improving Deep Learning-based Plant Disease Classification with Attention Mechanism (Berlin ; Heidelberg : Springer, 2022). Alirezazadeh, Pendar; Schirrmann, Michael; Stolzenburg, Frieder. In recent years, deep learning-based plant disease classification has been widely developed. However, it is challenging to collect sufficient annotated image data to effectively train deep learning models for plant disease recognition. The attention mechanism in deep learning assists the model in focusing on the informative data segments and extracting the discriminative features of inputs to enhance training performance. This paper investigates the Convolutional Block Attention Module (CBAM), a lightweight attention module that can be plugged into any CNN architecture with negligible overhead, to improve classification with CNNs. Specifically, CBAM is applied to the output feature map of CNNs to highlight important local regions and extract more discriminative features. Well-known CNN models (i.e. EfficientNetB0, MobileNetV2, ResNet50, InceptionV3, and VGG19) were used for transfer learning for plant disease classification and then fine-tuned on a publicly available plant disease dataset of foliar diseases in pear trees called DiaMOS Plant. Amongst others, this dataset contains 3006 images of leaves affected by different stress symptoms. Among the tested CNNs, EfficientNetB0 showed the best performance. EfficientNetB0+CBAM outperformed EfficientNetB0 and obtained a classification accuracy of 86.89%. The experimental results show the effectiveness of the attention mechanism in improving the recognition accuracy of pre-trained CNNs when training data are scarce.
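How CBAM refines a feature map can be sketched in a few lines of numpy. This is an illustrative reimplementation with random weights, not the authors' code, and it substitutes a 1x1 convolution for the paper's 7x7 spatial convolution to stay compact:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cbam(feat, w1, w2, w_sp):
    """Minimal CBAM sketch on a (C, H, W) feature map.
    Channel attention: shared MLP (w1, w2) applied to avg- and
    max-pooled channel descriptors. Spatial attention: here a 1x1
    conv (w_sp) over the channel-wise avg/max maps, standing in for
    the 7x7 conv of the original module."""
    avg = feat.mean(axis=(1, 2))                    # (C,)
    mx = feat.max(axis=(1, 2))                      # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)    # shared 2-layer MLP
    ch_att = sigmoid(mlp(avg) + mlp(mx))            # (C,) in (0, 1)
    feat = feat * ch_att[:, None, None]             # refine channels
    sp_desc = np.stack([feat.mean(axis=0), feat.max(axis=0)])  # (2, H, W)
    sp_att = sigmoid(np.tensordot(w_sp, sp_desc, axes=1))      # (H, W)
    return feat * sp_att[None, :, :]                # refine locations

C, H, W = 8, 4, 4
feat = rng.standard_normal((C, H, W))
out = cbam(feat,
           w1=rng.standard_normal((C // 2, C)),    # reduction ratio 2
           w2=rng.standard_normal((C, C // 2)),
           w_sp=rng.standard_normal(2))
```

Because both attention maps lie in (0, 1), CBAM can only rescale the feature map; plugged after a CNN block, the rescaling emphasizes informative channels and regions.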
- Item: Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery (Basel : MDPI, 2016). Schirrmann, Michael; Giebel, Antje; Gleiniger, Franziska; Pflanz, Michael; Lentschke, Jan; Dammer, Karl-Heinz. Monitoring the dynamics in wheat crops requires near-term observations with high spatial resolution due to the complex factors influencing wheat growth variability. We studied the prospects for monitoring the biophysical parameters and nitrogen status of wheat crops with low-cost imagery acquired from unmanned aerial vehicles (UAV) over an 11 ha field. Flight missions were conducted at approximately 50 m altitude with a commercial copter and camera system; three missions were performed between booting and maturity of the wheat plants and one mission after tillage. Ultra-high-resolution orthoimages of 1.2 cm·px−1 and surface models were generated for each mission from the standard red, green and blue (RGB) aerial images. Image variables were extracted from image tone and the surface models, e.g., RGB ratios, crop coverage and plant height. During each mission, 20 plots within the wheat canopy with 1 × 1 m2 sample support were selected in the field, and the leaf area index, plant height, fresh and dry biomass and nitrogen concentrations were measured. From the generated UAV imagery, we were able to follow the changes in early senescence at the individual plant level in the wheat crops. The pattern of the wheat canopy varied drastically from one mission to the next, which supports the need for instantaneous observations as delivered by UAV imagery. The correlations between the biophysical parameters and the image variables were highly significant during each mission, and the regression models calculated with the principal components of the image variables yielded R2 values between 0.70 and 0.97. In contrast, the models of the nitrogen concentrations yielded low R2 values, with the best model obtained at flowering (R2 = 0.65).
The nitrogen nutrition index was calculated with an accuracy of 0.10 to 0.11 NNI for each mission. For all models, information about the surface models and image tone was important. We conclude that low-cost RGB UAV imagery will strongly aid farmers in observing biophysical characteristics, but it is limited for observing the nitrogen status within wheat crops.
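The nitrogen nutrition index (NNI) behind the 0.10 to 0.11 accuracy figure is conventionally defined as the ratio of the measured N concentration to a critical N concentration from a dilution curve. A sketch under the assumption of the commonly cited winter wheat curve Nc = 5.35 W^(-0.442); the coefficients and the 4.4% cap are illustrative literature values, not taken from this study:

```python
def critical_n(biomass_t_ha, a=5.35, b=0.442):
    """Critical N concentration (% dry matter) from a dilution curve
    Nc = a * W^(-b). The coefficients are commonly cited winter wheat
    values from the literature, used here purely for illustration."""
    if biomass_t_ha <= 1.55:     # below ~1.55 t/ha the curve is capped
        return 4.4
    return a * biomass_t_ha ** (-b)

def nni(measured_n_pct, biomass_t_ha):
    """Nitrogen nutrition index: measured over critical N concentration.
    NNI ~ 1 indicates optimal N supply, < 1 deficiency, > 1 surplus."""
    return measured_n_pct / critical_n(biomass_t_ha)

# Example: 3.0% N measured at 4 t/ha dry biomass gives NNI slightly
# above 1, i.e. adequate nitrogen supply.
val = nni(3.0, 4.0)
```

An accuracy of about 0.1 NNI is thus roughly the margin between "adequate" and "mildly deficient" nutrition status.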
- Item: New Tropical Peatland Gas and Particulate Emissions Factors Indicate 2015 Indonesian Fires Released Far More Particulate Matter (but Less Methane) than Current Inventories Imply (Basel : MDPI, 2018-3-21). Wooster, Martin J.; Gaveau, David L.A.; Salim, Mohammad A.; Zhang, Tianran; Xu, Weidong; Green, David C.; Huijnen, Vincent; Murdiyarso, Daniel; Gunawan, Dodo; Borchard, Nils; Schirrmann, Michael; Main, Bruce; Sepriando, Alpon. Deforestation and draining of the peatlands in equatorial SE Asia has greatly increased their flammability, and in September-October 2015 a strong El Niño-related drought led to further drying and to widespread burning across parts of Indonesia, primarily on Kalimantan and Sumatra. These fires resulted in some of the worst sustained outdoor air pollution ever recorded, with atmospheric particulate matter (PM) concentrations exceeding those considered "extremely hazardous to health" by up to an order of magnitude. Here we report unique in situ air quality data and tropical peatland fire emissions factors (EFs) for key carbonaceous trace gases (CO2, CH4 and CO) and for PM2.5 and black carbon (BC) particulates, based on measurements conducted on Kalimantan at the height of the 2015 fires, both at locations of "pure" sub-surface peat burning and of spreading vegetation fires atop burning peat. PM2.5 is the most significant smoke constituent in terms of human health impacts, and we find in situ PM2.5 emissions factors of 17.8 to 22.3 g·kg-1 for pure peat burning, and of 44 to 61 g·kg-1 for spreading vegetation fires atop burning peat, both far higher than past laboratory burning of tropical peat has suggested. The latter are some of the highest PM2.5 emissions factors measured worldwide.
Using our peatland CO2, CO and CH4 emissions factors (1779 ± 55 g·kg-1, 238 ± 36 g·kg-1, and 7.8 ± 2.3 g·kg-1, respectively) alongside in situ measured peat carbon content (610 ± 47 g-C·kg-1), we provide a new 358 Tg (± 30%) fuel consumption estimate for the 2015 Indonesian fires, which is less than that provided by the GFEDv4.1s and GFASv1.2 global fire emissions inventories by 23% and 34%, respectively, and which, due to our lower CH4 EF, produces far less (~3×) methane. However, our mean in situ derived PM2.5 EF for these extreme tropical peatland fires (28 ± 6 g·kg-1) is far higher than current emissions inventories assume, resulting in our total PM2.5 emissions estimate (9.1 ± 3.5 Tg) being many times higher than those of GFEDv4.1s, GFASv1.2 and FINNv2, despite our lower fuel consumption. We find that two thirds of the emitted PM2.5 came from Kalimantan, one third from Sumatra, and 95% from burning peatlands. Using new geostationary fire radiative power (FRP) data, we map the fire emissions' spatio-temporal variations in far greater detail than ever before (hourly, 0.05°), identifying a tropical peatland fire diurnal cycle twice as wide as in neighboring non-peat areas and peaking much later in the day. Our data show that a combination of greatly elevated PM2.5 emissions factors, large areas of simultaneous, long-duration burning, and very high peat fuel consumption per unit area made these September-October tropical peatland fires the greatest wildfire source of particulate matter globally in 2015, furthering evidence for a regional atmospheric pollution impact whose particulate matter component in particular led to millions of citizens being exposed to extremely poor air quality for substantial periods. © 2018 by the authors.
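The relation behind these estimates is the standard emissions-factor product E_X = M_fuel × EF_X. As a back-of-the-envelope check (not the paper's spatially weighted computation), multiplying the 358 Tg fuel estimate by the mean PM2.5 EF lands near, though slightly above, the abstract's 9.1 ± 3.5 Tg and well within its uncertainty:

```python
def total_emission_tg(fuel_tg, ef_g_per_kg):
    """Total emission of a species from fuel consumed (Tg) and its
    emission factor (g of species per kg of dry fuel burned).
    Since 1 Tg = 1e9 kg, dividing g/kg by 1000 keeps the result in Tg."""
    return fuel_tg * ef_g_per_kg / 1000.0

fuel = 358.0                               # Tg, point estimate above
pm25 = total_emission_tg(fuel, 28.0)       # mean in situ PM2.5 EF
ch4 = total_emission_tg(fuel, 7.8)         # CH4 EF
co2 = total_emission_tg(fuel, 1779.0)      # CO2 EF
# pm25 ~ 10.0 Tg, ch4 ~ 2.8 Tg, co2 ~ 637 Tg
```

The gap between the ~10 Tg product and the reported 9.1 Tg reflects that the paper combines region-specific EFs and fuel loads rather than a single global product.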
- Item: Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops (Basel : MDPI AG, 2021). de Camargo, Tibor; Schirrmann, Michael; Landwehr, Niels; Dammer, Karl-Heinz; Pflanz, Michael. Weed maps should be available quickly, reliably, and with high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system while maintaining the same features of the ResNet-18 model, as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flying operation. Optimization was achieved mainly by avoiding the redundant computations that arise when a classification model is applied to overlapping tiles in a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at species level for the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to an area output of about 1.78 ha h−1 for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but there was a strong decline in speed performance, especially when a higher number of filters was used in the ResNet-18 model.
Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields.
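The tiling redundancy that the optimization targets can be made concrete with numpy: overlapping tiles can be extracted as zero-copy views, and the overlap count shows how often a naive per-tile classifier recomputes the same pixels. This sketch illustrates the problem, not the study's actual optimization:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

# A grayscale "UAV image" and a tiling with 50% overlap:
img = np.arange(64 * 64, dtype=np.float32).reshape(64, 64)
tile, stride = 16, 8
tiles = sliding_window_view(img, (tile, tile))[::stride, ::stride]
# tiles.shape == (7, 7, 16, 16): a 7x7 grid of overlapping tiles,
# all of them views into img, so no pixel data is copied.
assert tiles.base is not None
# With 50% overlap each interior pixel falls into up to 4 tiles, so a
# per-tile classifier recomputes most convolutions up to 4 times;
# running the network once over the whole image computes them once.
```

Sharing these computations across overlapping tiles is what lifts the throughput to the reported 2.2 frames per second on the embedded board.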
- Item: Preliminary tests of an optoelectronic in-canopy sensor for evaluation of vertical disease infection in cereals (New York, NY : Wiley, 2022). Dammer, Karl-Heinz; Schirrmann, Michael. BACKGROUND: Health scouting of crops by satellite, airplane, unmanned aerial vehicle (UAV) and ground vehicle can only evaluate the crop from above. The visible leaves may show no disease symptoms, while lower, older leaves not visible from above may do so. A mobile in-canopy sensor, carried by a tractor, was developed to detect diseases in cereal crops. Photodiodes measure the reflected light in the red and infrared wavelength ranges at 10 different vertical heights in lateral directions. RESULTS: Significant differences in the vegetation index NDVI occurred between infected (stripe rust: 2018, 2019; leaf rust: 2020) and control plots at sensor levels operated inside and near the winter wheat canopy. The differences were not significant at sensor levels operated far above the canopy. CONCLUSIONS: Lateral reflectance measurements inside the crop canopy are able to distinguish between disease-infected and healthy crops. In the future, mobile in-canopy scouting could extend the common above-canopy scouting practice for making spraying decisions by the farmer or by decision support systems. © 2021 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
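The NDVI used for the level-wise comparison is computed per photodiode pair from red and near-infrared reflectance; the reflectance values below are made-up illustrations, not measurements from the study:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from reflectance in the
    near-infrared and red bands, the two bands the in-canopy
    photodiodes measure."""
    return (nir - red) / (nir + red)

# Dense green leaves reflect NIR strongly and absorb red, so a healthy
# canopy yields a higher NDVI than a rust-infected one:
healthy = ndvi(nir=0.50, red=0.05)
infected = ndvi(nir=0.40, red=0.10)
```

A drop in NDVI at the lower sensor levels, where infected leaves sit, is the signal the in-canopy sensor exploits.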
- Item: Proximal Soil Sensing - A Contribution for Species Habitat Distribution Modelling of Earthworms in Agricultural Soils? (San Francisco, California, US : PLOS, 2016). Schirrmann, Michael; Joschko, Monika; Gebbers, Robin; Kramer, Eckart; Zörner, Mirjam; Barkusky, Dietmar; Timmer, Jens. Background: Earthworms are important for maintaining soil ecosystem functioning and serve as indicators of soil fertility. However, detection of earthworms is time-consuming, which hinders the assessment of earthworm abundances with high sampling density over entire fields. Recent developments of mobile terrestrial sensor platforms for proximal soil sensing (PSS) have provided new tools for collecting dense spatial information on soils using various sensing principles. Yet, the potential of PSS for assessing earthworm habitats is largely unexplored. This study investigates whether PSS data contribute to the spatial prediction of earthworm abundances in species distribution models of agricultural soils. Methodology/Principal Findings: Proximal soil sensing data, e.g., soil electrical conductivity (EC), pH, and near-infrared absorbance (NIR), were collected in real time in a field with two management strategies (reduced tillage / conventional tillage) and sandy to loam soils. PSS was related to observations from a long-term (11 years) earthworm observation study conducted at 42 plots. Earthworms were sampled from 0.5 × 0.5 × 0.2 m soil blocks and identified to species level. Sensor data were highly correlated with earthworm abundances observed under reduced tillage but less correlated with those observed under conventional tillage, which may indicate that management influences the sensor-earthworm relationship. Generalized additive models and state-space models showed that modelling based on the fusion of data from the EC, pH, and NIR sensors produced better results than modelling without sensor data or with data from just a single sensor.
Regarding the individual earthworm species, particular sensor combinations were more appropriate than others due to the different habitat requirements of the earthworms. Earthworm species with soil-specific habitat preferences were spatially predicted with higher accuracy by PSS than more ubiquitous species. Conclusions/Significance: Our findings suggest that PSS contributes to the spatial modelling of earthworm abundances at field scale and that it will support species distribution modelling in the attempt to understand the soil-earthworm relationships in agroecosystems.
- Item: Publisher Correction: Rapid and low-cost insect detection for analysing species trapped on yellow sticky traps (London : Nature Publishing Group, 2021). Böckmann, Elias; Pfaff, Alexander; Schirrmann, Michael; Pflanz, Michael. Correction to: Scientific Reports https://doi.org/10.1038/s41598-021-89930-w, published online 17 May 2021.
- Item: Rapid and low-cost insect detection for analysing species trapped on yellow sticky traps (London : Nature Publishing Group, 2021). Böckmann, Elias; Pfaff, Alexander; Schirrmann, Michael; Pflanz, Michael. While insect monitoring is a prerequisite for precise decision-making in integrated pest management (IPM), it is time- and cost-intensive. Low-cost, time-saving and easy-to-operate tools for automated monitoring will therefore play a key role in increasing the acceptance and application of IPM in practice. In this study, we tested the differentiation of two whitefly species and their natural enemies trapped on yellow sticky traps (YSTs) via image processing approaches under practical conditions. Using the bag of visual words (BoVW) algorithm, accurate differentiation between the natural enemies and the Trialeurodes vaporariorum and Bemisia tabaci species was possible, whereas the procedure trained for B. tabaci could not be used to differentiate this species from T. vaporariorum. The decay of species was considered by using fresh and aged catches of all species on the YSTs, and different pooling scenarios were applied to enhance model performance. The best performance was reached when fresh and aged individuals were used together and the whitefly species were pooled into one category for model training. With an independent dataset consisting of photos of YSTs that were placed in greenhouses, and consequently with a naturally occurring species mixture as the background, a differentiation rate of more than 85% was reached for natural enemies and whiteflies.
- ItemRegression kriging for improving crop height models fusing ultra-sonic sensing with UAV imagery(Basel : MDPI, 2017) Schirrmann, Michael; Hamdorf, André; Giebel, Antje; Gleiniger, Franziska; Pflanz, Michael; Dammer, Karl-HeinzA crop height model (CHM) can be an important element of the decision-making process in agriculture, because it relates well with many agronomic parameters, e.g., crop height, plant biomass or crop yield. Today, CHMs can be inexpensively obtained from overlapping imagery captured from unmanned aerial vehicle (UAV) platforms or from proximal sensors attached to ground-based vehicles used for regular management. Both approaches have their limitations, and combining them through data fusion may overcome some of these. Therefore, the objective of this study was to investigate whether regression kriging, as a geostatistical data fusion approach, can improve the interpolation of ground-based ultrasonic measurements with UAV imagery as a covariate. Regression kriging is a natural candidate because we have a sparse data set (ultrasound) and an exhaustive data set (UAV), and both data sets have favorable properties for geostatistical analysis. To confirm this, we conducted a total of four missions in two different fields, collecting UAV imagery and ultrasonic data side by side. From the overlapping UAV images, surface models and ortho-images were generated by photogrammetric processing. The maps generated by regression kriging were of much higher detail than the smooth maps generated by ordinary kriging, because regression kriging ensures that information from the UAV imagery enters every prediction point. The relationship with crop height, fresh biomass and, to a lesser extent, with crop yield was stronger for CHMs generated by regression kriging than by ordinary kriging. Using UAV data from a prior mission was also beneficial and could improve map accuracy and quality.
Thus, regression kriging is a flexible approach for the integration of UAV imagery with ground-based sensor data, with benefits for precision agriculture-oriented farmers and agricultural service providers.
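The two-step scheme the abstract describes (a regression of the sparse ultrasonic heights on the exhaustive UAV covariate, followed by kriging of the residuals) can be sketched as follows. This is a minimal illustration, not the study's implementation: the sample data, the exponential covariance model, and its range parameter are all invented assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist

# toy data: sparse ultrasonic crop-height samples plus an exhaustive UAV covariate
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, (40, 2))            # sample locations (m)
uav = 0.5 + 0.01 * xy[:, 0]                  # hypothetical UAV surface-model values
z = 1.2 * uav + rng.normal(0, 0.02, 40)      # ultrasonic heights correlated with UAV

# step 1, regression part: fit the trend z ~ uav by least squares
A = np.column_stack([np.ones_like(uav), uav])
beta, *_ = np.linalg.lstsq(A, z, rcond=None)
resid = z - A @ beta

# step 2, kriging part: simple kriging of the residuals
def cov(h, sill=resid.var(), range_=30.0):
    """Assumed exponential covariance model for the residual field."""
    return sill * np.exp(-h / range_)

C = cov(cdist(xy, xy)) + 1e-9 * np.eye(len(xy))   # jitter for numerical stability

def predict(p_xy, p_uav):
    """Regression-kriging prediction: trend from the covariate + kriged residual."""
    w = np.linalg.solve(C, cov(cdist(xy, p_xy)))  # kriging weights per target point
    trend = np.column_stack([np.ones_like(p_uav), p_uav]) @ beta
    return trend + w.T @ resid

# predict at one grid node where only the UAV covariate is known
est = predict(np.array([[50.0, 50.0]]), np.array([0.5 + 0.01 * 50.0]))
```

In practice the covariance model would be fitted to an empirical variogram of the residuals rather than assumed, but the structure (trend plus kriged residual) is the same.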
- ItemSoil pH mapping with an on-the-go sensor(Basel : MDPI, 2011) Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, JanSoil pH is a key parameter for crop productivity; therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has become commercially available in the US. While driving over the field, soil pH is measured on the go directly within the soil by ion-selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under three scenarios: (a) controlled tests in the lab, (b) semi-controlled tests on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measures of fit (r2) for the regressions between the on-the-go measurements and the reference data were 0.71, 0.63, and 0.84. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher than that of pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany.
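The field-specific linear calibration mentioned in the abstract amounts to regressing laboratory reference pH on the raw sensor readings and applying the fitted line. A toy sketch, with invented paired readings standing in for real field data:

```python
import numpy as np

# hypothetical paired samples: on-the-go sensor pH vs. laboratory reference pH
sensor = np.array([5.8, 6.1, 6.4, 6.9, 7.2, 7.5])
lab    = np.array([5.6, 6.0, 6.5, 6.8, 7.3, 7.4])

# field-specific calibration: ordinary least squares fit of lab ~ sensor
slope, intercept = np.polyfit(sensor, lab, 1)
calibrated = slope * sensor + intercept   # corrected sensor readings

# goodness of fit (r^2) of the calibration line
r = np.corrcoef(sensor, lab)[0, 1]
r2 = r ** 2
```

A slope near 1 with a nonzero intercept would indicate the kind of systematic offset that field-specific calibration removes.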
- ItemUAV Oblique Imagery with an Adaptive Micro-Terrain Model for Estimation of Leaf Area Index and Height of Maize Canopy from 3D Point Clouds(Basel : MDPI, 2022) Li, Minhui; Shamshiri, Redmond R.; Schirrmann, Michael; Weltzien, Cornelia; Shafian, Sanaz; Laursen, Morten StigaardLeaf area index (LAI) and height are two critical measures of maize crops that are used in ecophysiological and morphological studies for growth evaluation, health assessment, and yield prediction. However, mapping spatial and temporal variability of LAI in fields using handheld tools and traditional techniques is a tedious and costly pointwise operation that provides information only within limited areas. The objective of this study was to evaluate the reliability of mapping LAI and height of maize canopy from 3D point clouds generated from UAV oblique imagery with the adaptive micro-terrain model. The experiment was carried out in a field planted with three cultivars having different canopy shapes and four replicates covering a total area of 48 × 36 m. RGB images in nadir and oblique view were acquired from the maize field at six different time slots during the growing season. Images were processed by Agisoft Metashape to generate 3D point clouds using the structure from motion method and were later processed by MATLAB to obtain clean canopy structure, including height and density. The LAI was estimated by a multivariate linear regression model using crop canopy descriptors derived from the 3D point cloud, which account for height and leaf density distribution along the canopy height. A simulation analysis based on a sine function demonstrated the effectiveness of the micro-terrain model derived from the point clouds. For the ground truth data, a randomized block design with 24 sample areas was used to manually measure LAI, height, N-pen data, and yield during the growing season.
It was found that canopy height data from the 3D point clouds had a relatively strong correlation (R2 = 0.89, 0.86, 0.78) with the manual measurements for the three cultivars using CH90. The proposed methodology allows cost-effective, high-resolution in-field LAI mapping from UAV 3D data to be used as an alternative to conventional LAI assessments, even in inaccessible regions.
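The multivariate linear regression step described above can be sketched with synthetic data. The descriptor names (CH90 as the 90th-percentile canopy height, a point-density descriptor) and all numbers below are illustrative assumptions, not the study's actual descriptors or coefficients:

```python
import numpy as np

# synthetic per-plot descriptors standing in for 3D-point-cloud statistics
rng = np.random.default_rng(1)
n = 24                                        # 24 sample areas, as in the study design
ch90    = rng.uniform(0.5, 2.5, n)            # assumed 90th-percentile canopy height (m)
density = rng.uniform(0.2, 0.9, n)            # assumed leaf-density descriptor
lai     = 1.5 * ch90 + 2.0 * density + rng.normal(0, 0.1, n)  # synthetic "measured" LAI

# multivariate linear regression: LAI ~ intercept + CH90 + density
X = np.column_stack([np.ones(n), ch90, density])
coef, *_ = np.linalg.lstsq(X, lai, rcond=None)
pred = X @ coef

# coefficient of determination of the fit
r2 = 1 - np.sum((lai - pred) ** 2) / np.sum((lai - lai.mean()) ** 2)
```

With real data the descriptors would come from the cleaned point cloud, and the fit would be validated against the manually measured LAI of the 24 sample areas.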
- ItemWeed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier(Basel : MDPI, 2018-9-24) Pflanz, Michael; Nordmeyer, Henning; Schirrmann, MichaelWeed detection with aerial images is a great challenge to generate field maps for site-specific plant protection application. The requirements might be met with low altitude flights of unmanned aerial vehicles (UAV), to provide adequate ground resolutions for differentiating even single weeds accurately. The following study proposed and tested an image classifier based on a Bag of Visual Words (BoVW) framework for mapping weed species, using a small unmanned aircraft system (UAS) with a commercial camera on board, at low flying altitudes. The image classifier was trained with support vector machines after building a visual dictionary of local features from many collected UAS images. A window-based processing of the models was used for mapping the weed occurrences in the UAS imagery. The UAS flight campaign was carried out over a weed infested wheat field, and images were acquired between a 1 and 6 m flight altitude. From the UAS images, 25,452 weed plants were annotated on species level, along with wheat and soil as background classes for training and validation of the models. The results showed that the BoVW model allowed the discrimination of single plants with high accuracy for Matricaria recutita L. (88.60%), Papaver rhoeas L. (89.08%), Viola arvensis M. (87.93%), and winter wheat (94.09%), within the generated maps. Regarding site specific weed control, the classified UAS images would enable the selection of the right herbicide based on the distribution of the predicted weed species. © 2018 by the authors.
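The BoVW pipeline used in the study (local features, a clustered visual dictionary, per-image word histograms, then an SVM) can be sketched as follows. The random "descriptors" below are stand-ins for real SIFT-like local features extracted from UAS image patches; class names and dimensions are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fake_descriptors(center, n=50):
    """Toy 32-D local descriptors for one image patch (stand-in for SIFT-like features)."""
    return center + rng.normal(0, 0.3, (n, 32))

# three toy classes, analogous to weed / crop / soil patches
classes = {"weed": 0.0, "wheat": 2.0, "soil": 4.0}
images = [(label, fake_descriptors(c)) for label, c in classes.items() for _ in range(10)]

# 1) visual dictionary: cluster all local descriptors into k visual words
k = 16
all_desc = np.vstack([d for _, d in images])
codebook = KMeans(n_clusters=k, n_init=5, random_state=0).fit(all_desc)

# 2) encode each image as a normalized histogram of visual-word occurrences
def bovw_hist(desc):
    words = codebook.predict(desc)
    return np.bincount(words, minlength=k) / len(words)

X = np.array([bovw_hist(d) for _, d in images])
y = [label for label, _ in images]

# 3) train a support vector machine on the BoVW histograms
clf = SVC(kernel="rbf").fit(X, y)
```

For mapping, the same encode-and-classify step would then be slid window-wise over the UAS imagery, as the abstract describes.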
- ItemZielflächenorientierte, präzise Echtzeit-Fungizidapplikation in Getreide(Darmstadt : KTBL, 2015) Dammer, Karl-Heinz; Hamdorf, André; Ustyuzhanin, Anton; Schirrmann, Michael; Leithold, Peer; Leithold, Hermann; Volk, Thomas; Tackenberg, MariaWithin a joint project, real-time application technologies with contactless sensors were developed for precise fungicide spraying in cereals. The decision-support system proPlant expert.classic and its internet version proPlant expert.com (proPlant GmbH) recommend suitable fungicides and dosages for a given infection scenario of the eight most important leaf and ear diseases of winter wheat. The precision-farming module "Fungizid", which runs on the terminal in the tractor cab, controls the precise spraying process. The module determines the local target application rate during spraying by using the local ultrasonic sensor value as an input parameter. In 2013 and 2014, field trials were conducted in winter wheat to analyse the relationship between the sensor values (ultrasonic and camera sensors) and the plant parameters leaf area index (LAI) and biomass, which are important for locally adapted, variable-rate fungicide dosing. The measurements were taken several times during the growing season at visually selected sampling points reflecting different stand densities. After modifications to the sensor technology, significant linear regression models describing the relationship between the sensor values and the two plant parameters LAI and biomass were found for 2014.
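The core of the described variable-rate logic, deriving a local spray rate from a local ultrasonic sensor value, can be sketched as a simple linear scaling. All numbers, the clamping range, and the function itself are illustrative assumptions, not values from the proPlant project:

```python
def target_rate(ultrasonic_cm, base_rate=200.0, full_canopy_cm=80.0, min_frac=0.5):
    """Scale a base spray rate (L/ha) by the sensed canopy height.

    The sensed height is converted to a fraction of an assumed full-canopy
    height and clamped to [min_frac, 1.0], so sparse stands receive a
    reduced, but never zero, dose.
    """
    frac = min(max(ultrasonic_cm / full_canopy_cm, min_frac), 1.0)
    return base_rate * frac

print(target_rate(40.0))   # sparse stand: half of full canopy -> 100.0 L/ha
```

A production system would replace this linear mapping with the regression models between sensor values and LAI/biomass reported in the trials.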