Search Results

Now showing 1 - 6 of 6
  • Item
    Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery
    (Basel : MDPI, 2016) Schirrmann, Michael; Giebel, Antje; Gleiniger, Franziska; Pflanz, Michael; Lentschke, Jan; Dammer, Karl-Heinz
    Monitoring the dynamics in wheat crops requires near-term observations with high spatial resolution due to the complex factors influencing wheat growth variability. We studied the prospects for monitoring the biophysical parameters and nitrogen status of wheat crops with low-cost imagery acquired from unmanned aerial vehicles (UAVs) over an 11 ha field. Flight missions were conducted at approximately 50 m altitude with a commercial copter and camera system; three missions were performed between booting and maturity of the wheat plants and one mission after tillage. Ultra-high-resolution orthoimages of 1.2 cm·px⁻¹ and surface models were generated for each mission from the standard red, green and blue (RGB) aerial images. Image variables were extracted from image tone and the surface models, e.g., RGB ratios, crop coverage and plant height. During each mission, 20 plots within the wheat canopy with 1 × 1 m² sample support were selected in the field, and the leaf area index, plant height, fresh and dry biomass and nitrogen concentrations were measured. From the generated UAV imagery, we were able to follow the changes in early senescence at the individual plant level in the wheat crops. Changes in the pattern of the wheat canopy varied drastically from one mission to the next, which supported the need for instantaneous observations, as delivered by UAV imagery. The correlations between the biophysical parameters and image variables were highly significant during each mission, and the regression models calculated with the principal components of the image variables yielded R² values between 0.70 and 0.97. In contrast, the models of the nitrogen concentrations yielded low R² values, with the best model obtained at flowering (R² = 0.65). The nitrogen nutrition index was calculated with an accuracy of 0.10 to 0.11 NNI for each mission. For all models, information about the surface models and image tone was important. We conclude that low-cost RGB UAV imagery will strongly aid farmers in observing biophysical characteristics, but it is limited for observing the nitrogen status within wheat crops.
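The nitrogen nutrition index (NNI) reported in this abstract is conventionally the ratio of the measured nitrogen concentration to a critical concentration taken from a dilution curve. A minimal sketch in Python, assuming the widely used Justes et al. winter wheat coefficients (a = 5.35, b = 0.442); the abstract does not state which curve the authors used, so these values and the 1.55 t/ha cap are illustrative assumptions, not details from the paper:

```python
def critical_n(biomass_t_ha: float, a: float = 5.35, b: float = 0.442) -> float:
    """Critical N concentration (%) from a dilution curve Nc = a * W^(-b).

    Defaults are the Justes et al. winter wheat coefficients; below about
    1.55 t/ha the curve is conventionally capped at a constant value.
    """
    w = max(biomass_t_ha, 1.55)  # dilution only applies above ~1.55 t/ha
    return a * w ** -b


def nni(measured_n_pct: float, biomass_t_ha: float) -> float:
    """Nitrogen nutrition index: measured N% over critical N%."""
    return measured_n_pct / critical_n(biomass_t_ha)


# Example: 3.0 t/ha dry biomass at 3.2 % N gives an NNI just under 1,
# i.e. near-adequate nitrogen nutrition.
print(round(nni(3.2, 3.0), 2))  # -> 0.97
```

An NNI of 1 marks adequate nutrition, below 1 deficiency; the 0.10 to 0.11 accuracy quoted above is on this scale.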
  • Item
    Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry
    (Basel : MDPI, 2020) Hobart, Marius; Pflanz, Michael; Weltzien, Cornelia; Schirrmann, Michael
    In apple cultivation, spatial information about phenotypic characteristics of tree walls would be beneficial for precise orchard management. Unmanned aerial vehicles (UAVs) can collect 3D structural information of ground surface objects at high resolution in a cost-effective and versatile way by using photogrammetry. The aim of this study is to delineate tree wall height information in an apple orchard applying a low-altitude flight pattern specifically designed for UAVs. This flight pattern implies small distances between the camera sensor and the tree walls, with the camera positioned in an oblique view toward the trees. In this way, it is assured that the depicted tree crown wall area is largely covered at a finer ground sampling distance than that recorded from a nadir perspective, especially regarding the lower crown sections. Overlapping oblique-view images were used to estimate 3D point cloud models by applying structure-from-motion (SfM) methods, and tree wall heights were calculated from them. The resulting height models were compared with ground-based light detection and ranging (LiDAR) data as reference. It was shown that the tree wall profiles from the UAV point clouds were strongly correlated with the LiDAR point clouds in both years (2018: R² = 0.83; 2019: R² = 0.88). However, an underestimation of tree wall heights was detected, with mean deviations of −0.11 m and −0.18 m for 2018 and 2019, respectively. This is attributed to the weakness of the UAV point clouds in resolving the very fine shoots of apple trees. Therefore, the approach shown is suitable for precise orchard management, but the underestimated vertical tree wall extent and the widened tree gaps need to be accounted for.
  • Item
    Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier
    (Basel : MDPI, 2018-09-24) Pflanz, Michael; Nordmeyer, Henning; Schirrmann, Michael
    Weed detection with aerial images is a great challenge for generating field maps for site-specific plant protection applications. The requirements might be met with low-altitude flights of unmanned aerial vehicles (UAVs), which provide ground resolutions adequate for differentiating even single weeds accurately. This study proposed and tested an image classifier based on a Bag of Visual Words (BoVW) framework for mapping weed species, using a small unmanned aircraft system (UAS) with a commercial camera on board at low flying altitudes. The image classifier was trained with support vector machines after building a visual dictionary of local features from many collected UAS images. Window-based processing of the models was used for mapping the weed occurrences in the UAS imagery. The UAS flight campaign was carried out over a weed-infested wheat field, and images were acquired at flight altitudes between 1 and 6 m. From the UAS images, 25,452 weed plants were annotated at species level, along with wheat and soil as background classes for training and validation of the models. The results showed that the BoVW model allowed the discrimination of single plants with high accuracy for Matricaria recutita L. (88.60%), Papaver rhoeas L. (89.08%), Viola arvensis M. (87.93%), and winter wheat (94.09%) within the generated maps. Regarding site-specific weed control, the classified UAS images would enable the selection of the right herbicide based on the distribution of the predicted weed species. © 2018 by the authors.
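The encoding step of a BoVW classifier, mapping an image's local descriptors onto a histogram over a learned visual dictionary, can be sketched as follows. The toy codebook and descriptors are invented for illustration; in the study, the dictionary came from clustering local features of real UAS images, and the resulting histograms were fed to support vector machines.

```python
from math import dist  # Euclidean distance, Python 3.8+


def bovw_histogram(descriptors, codebook):
    """Encode local feature descriptors as a normalised Bag-of-Visual-Words
    histogram: each descriptor votes for its nearest codeword."""
    counts = [0] * len(codebook)
    for d in descriptors:
        nearest = min(range(len(codebook)), key=lambda i: dist(d, codebook[i]))
        counts[nearest] += 1
    total = sum(counts) or 1  # avoid division by zero for empty images
    return [c / total for c in counts]


# Toy 2-word codebook and three 2-D descriptors: one descriptor falls near
# codeword 0, two near codeword 1, so the histogram is [1/3, 2/3].
print(bovw_histogram([(0.1, 0.0), (0.9, 1.0), (1.1, 1.2)],
                     [(0.0, 0.0), (1.0, 1.0)]))
```

In the window-based mapping described above, one such histogram would be computed per image window and classified to produce the weed map.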
  • Item
    Crop Monitoring Using Sentinel-2 and UAV Multispectral Imagery: A Comparison Case Study in Northeastern Germany
    (Basel : MDPI, 2022) Li, Minhui; Shamshiri, Redmond R.; Weltzien, Cornelia; Schirrmann, Michael
    Monitoring within-field crop variability at fine spatial and temporal resolution can assist farmers in making reliable decisions during their agricultural management; however, it traditionally involves a labor-intensive and time-consuming pointwise manual process. To the best of our knowledge, few studies have compared Sentinel-2 with UAV data for crop monitoring in the context of precision agriculture. Therefore, the prospects of crop monitoring for characterizing biophysical plant parameters and leaf nitrogen of wheat and barley crops were evaluated from a practical viewpoint close to agricultural routines. Multispectral UAV and Sentinel-2 imagery was collected on three dates during the season and compared with reference data collected at 20 sample points for plant leaf nitrogen (N), maximum plant height, mean plant height, leaf area index (LAI), and fresh biomass. On average, higher correlations with the agronomic parameters were found for the UAV data than for the Sentinel-2 data, with a percentage increase of 6.3% for wheat and 22.2% for barley. In this regard, VIs calculated from spectral bands in the visible part of the spectrum performed worse for Sentinel-2 than for the UAV data. In addition, large-scale patterns, formed by the influence of an old riverbed on plant growth, were recognizable even in the Sentinel-2 imagery despite its much lower spatial resolution. Interestingly, smaller features, such as the tramlines from controlled traffic farming (CTF), also influenced the Sentinel-2 data and showed a systematic pattern that affected even semivariogram calculation. In conclusion, Sentinel-2 imagery is able to capture the same large-scale patterns as can be derived from the more detailed UAV imagery; however, it is at the same time influenced by management-driven features such as tramlines, which cannot be accurately georeferenced. In consequence, agronomic parameters were better correlated with UAV than with Sentinel-2 data. Crop growers as well as providers of remote sensing services may take advantage of this knowledge, and we recommend the use of UAV data as it gives additional information about management-driven features. For a future perspective, we would advise fusing UAV imagery taken early in the season with Sentinel-2 imagery, as it can integrate the effect of agricultural management in the subsequent absence of high-spatial-resolution data, to help improve crop monitoring for the farmer and to reduce costs.
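The tramline effect on the semivariogram mentioned in this abstract can be illustrated with a minimal empirical semivariogram over a one-dimensional transect. This is a generic sketch, not the authors' code; the transect with a tramline-like strip recurring every six pixels is invented for illustration.

```python
def semivariogram(values, max_lag):
    """Empirical semivariance gamma(h) = sum((z_i - z_(i+h))^2) / (2 * N_h)
    for a regularly spaced 1-D transect, for lags h = 1..max_lag."""
    gamma = {}
    for h in range(1, max_lag + 1):
        sq = [(values[i] - values[i + h]) ** 2 for i in range(len(values) - h)]
        gamma[h] = sum(sq) / (2 * len(sq))
    return gamma


# A periodic tramline-like strip every 6 pixels makes the semivariance dip
# back towards zero at lags that are multiples of the period (hole effect).
transect = [1.0 if i % 6 == 0 else 0.0 for i in range(60)]
g = semivariogram(transect, 8)
print(g[6] < g[3])  # -> True: the lag-6 dip reveals the periodic pattern
```

Such systematic dips are exactly the kind of management-driven structure that, per the abstract, leaks into the Sentinel-2 semivariograms.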
  • Item
    New Tropical Peatland Gas and Particulate Emissions Factors Indicate 2015 Indonesian Fires Released Far More Particulate Matter (but Less Methane) than Current Inventories Imply
    (Basel : MDPI, 2018-03-21) Wooster, Martin J.; Gaveau, David L.A.; Salim, Mohammad A.; Zhang, Tianran; Xu, Weidong; Green, David C.; Huijnen, Vincent; Murdiyarso, Daniel; Gunawan, Dodo; Borchard, Nils; Schirrmann, Michael; Main, Bruce; Sepriando, Alpon
    Deforestation and draining of the peatlands in equatorial SE Asia has greatly increased their flammability, and in September–October 2015 a strong El Niño-related drought led to further drying and to widespread burning across parts of Indonesia, primarily on Kalimantan and Sumatra. These fires resulted in some of the worst sustained outdoor air pollution ever recorded, with atmospheric particulate matter (PM) concentrations exceeding those considered "extremely hazardous to health" by up to an order of magnitude. Here we report unique in situ air quality data and tropical peatland fire emission factors (EFs) for key carbonaceous trace gases (CO₂, CH₄ and CO) and for PM₂.₅ and black carbon (BC) particulates, based on measurements conducted on Kalimantan at the height of the 2015 fires, both at locations of "pure" sub-surface peat burning and of spreading vegetation fires atop burning peat. PM₂.₅ is the most significant smoke constituent in terms of human health impacts, and we find in situ PM₂.₅ emission factors of 17.8 to 22.3 g·kg⁻¹ for pure peat burning and of 44 to 61 g·kg⁻¹ for spreading vegetation fires atop burning peat, both far higher than past laboratory burning of tropical peat has suggested. The latter are some of the highest PM₂.₅ emission factors measured worldwide. Using our peatland CO₂, CO and CH₄ emission factors (1779 ± 55 g·kg⁻¹, 238 ± 36 g·kg⁻¹, and 7.8 ± 2.3 g·kg⁻¹, respectively) alongside in situ measured peat carbon content (610 ± 47 g C·kg⁻¹), we provide a new 358 Tg (± 30%) fuel consumption estimate for the 2015 Indonesian fires, which is lower than those of the GFEDv4.1s and GFASv1.2 global fire emission inventories by 23% and 34%, respectively, and which, due to our lower CH₄ EF, produces far less (~3×) methane. However, our mean in situ derived PM₂.₅ EF for these extreme tropical peatland fires (28 ± 6 g·kg⁻¹) is far higher than current emission inventories assume, resulting in our total PM₂.₅ emission estimate (9.1 ± 3.5 Tg) being many times higher than those of GFEDv4.1s, GFASv1.2 and FINNv2, despite our lower fuel consumption. We find that two thirds of the emitted PM₂.₅ came from Kalimantan, one third from Sumatra, and 95% from burning peatlands. Using new geostationary fire radiative power (FRP) data, we map the spatio-temporal variations of the fire emissions in far greater detail than ever before (hourly, 0.05°), identifying a tropical peatland fire diurnal cycle twice as wide as in neighboring non-peat areas and peaking much later in the day. Our data show that a combination of greatly elevated PM₂.₅ emission factors, large areas of simultaneous, long-duration burning, and very high peat fuel consumption per unit area made these September to October tropical peatland fires the greatest wildfire source of particulate matter globally in 2015, furthering evidence for a regional atmospheric pollution impact whose particulate matter component in particular led to millions of citizens being exposed to extremely poor air quality for substantial periods. © 2018 by the authors.
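The bottom-up arithmetic behind such estimates is simply fuel consumed times emission factor, plus a unit conversion. A sketch using the study's headline numbers: note that a single-EF product for PM2.5 (about 10 Tg) sits within but does not exactly equal the reported 9.1 ± 3.5 Tg, which the study builds up from regions and fire types with distinct EFs.

```python
def total_emission_tg(fuel_tg: float, ef_g_per_kg: float) -> float:
    """Species emission (Tg) from fuel consumed (Tg) and an emission
    factor (g species per kg fuel). Since 1 Tg = 1e9 kg, the unit
    conversion reduces to dividing by 1000."""
    return fuel_tg * ef_g_per_kg / 1000.0


FUEL_TG = 358.0  # the study's fuel consumption estimate for the 2015 fires

print(total_emission_tg(FUEL_TG, 7.8))   # CH4 at 7.8 g/kg:   ~2.8 Tg
print(total_emission_tg(FUEL_TG, 28.0))  # PM2.5 at 28 g/kg:  ~10 Tg
```

The same one-liner with an inventory's higher CH4 EF reproduces the roughly threefold methane gap the abstract describes.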
  • Item
    UAV Oblique Imagery with an Adaptive Micro-Terrain Model for Estimation of Leaf Area Index and Height of Maize Canopy from 3D Point Clouds
    (Basel : MDPI, 2022) Li, Minhui; Shamshiri, Redmond R.; Schirrmann, Michael; Weltzien, Cornelia; Shafian, Sanaz; Laursen, Morten Stigaard
    Leaf area index (LAI) and height are two critical measures of maize crops that are used in ecophysiological and morphological studies for growth evaluation, health assessment, and yield prediction. However, mapping the spatial and temporal variability of LAI in fields using handheld tools and traditional techniques is a tedious and costly pointwise operation that provides information only within limited areas. The objective of this study was to evaluate the reliability of mapping LAI and height of a maize canopy from 3D point clouds generated from UAV oblique imagery with an adaptive micro-terrain model. The experiment was carried out in a field planted with three cultivars having different canopy shapes, with four replicates covering a total area of 48 × 36 m. RGB images in nadir and oblique view were acquired from the maize field at six different time slots during the growing season. Images were processed with Agisoft Metashape to generate 3D point clouds using the structure-from-motion method and were later processed with MATLAB to obtain clean canopy structure, including height and density. LAI was estimated by a multivariate linear regression model using crop canopy descriptors derived from the 3D point cloud, which account for height and leaf density distribution along the canopy height. A simulation analysis based on a sine function effectively demonstrated the micro-terrain model from point clouds. For the ground truth data, a randomized block design with 24 sample areas was used to manually measure LAI, height, N-pen data, and yield during the growing season. It was found that the CH90 canopy height metric from the 3D point clouds correlated relatively strongly (R² = 0.89, 0.86, 0.78) with the manual measurements for the three cultivars. The proposed methodology allows cost-effective, high-resolution in-field mapping of LAI through UAV 3D data, to be used as an alternative to conventional LAI assessments even in inaccessible regions.
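A CH90-style canopy height metric, i.e. a high percentile of the point-cloud height distribution, can be computed as below. The linear-interpolation percentile definition is an assumption for illustration; the abstract does not spell out the exact estimator used.

```python
def canopy_height_percentile(heights, pct=0.90):
    """CH90-style canopy metric: the height below which `pct` of the
    point-cloud heights fall, linearly interpolated between the two
    neighbouring order statistics."""
    zs = sorted(heights)
    if not zs:
        raise ValueError("empty point cloud")
    rank = pct * (len(zs) - 1)
    lo = int(rank)
    hi = min(lo + 1, len(zs) - 1)
    frac = rank - lo
    return zs[lo] * (1 - frac) + zs[hi] * frac


# 101 evenly spaced heights from 0 to 2 m: the 90th percentile lands at 1.8 m.
print(canopy_height_percentile([i * 0.02 for i in range(101)]))
```

Using a high percentile rather than the maximum makes the metric robust to stray points above the canopy, which is one plausible reason to prefer CH90 over raw maximum height.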