Downy mildew disease poses a significant threat to kimchi cabbage, a key agricultural product in Korea. Early detection is crucial for preventing the disease from spreading and mitigating its impact. Hyperspectral imaging captures data across a broad spectrum, enabling detection before visible symptoms manifest. Combining a hyperspectral camera with an unmanned aerial vehicle (UAV) can establish a non-destructive, field-scale disease detection system. This thesis presents two studies conducted in different kimchi cabbage fields, each analyzed with a distinct approach. The study conducted in an autumn kimchi cabbage field employed the SLIC algorithm to segment aerial hyperspectral images. This segmentation isolated hyperspectral patches of diseased kimchi cabbage leaves, which were then used to train a three-dimensional residual network (3D-ResNet) to detect early signs of disease and their locations in the field. The model achieved an overall accuracy of 87.6%. These studies successfully detected downy mildew disease using different methodologies; future research should focus on combining the advantages of each approach to develop a more robust disease detection model.
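The segmentation step can be sketched as follows. This is a minimal, illustrative SLIC-style clustering written in pure NumPy (grid-initialised k-means in joint spectral-spatial space), not the thesis implementation; the cube dimensions, band count, and parameters are synthetic stand-ins.

```python
import numpy as np

def slic_like(cube, n_side=6, n_iter=5, compactness=0.1):
    """Minimal SLIC-style superpixel clustering for a hyperspectral
    cube (H x W x B): k-means over joint (spectrum, position) features
    with grid-initialised cluster centers. Illustrative only."""
    H, W, B = cube.shape
    ys = np.linspace(0, H - 1, n_side).astype(int)
    xs = np.linspace(0, W - 1, n_side).astype(int)
    cy, cx = np.meshgrid(ys, xs, indexing="ij")
    cy, cx = cy.ravel(), cx.ravel()
    spec = cube[cy, cx].astype(float)               # (K, B) spectral centers
    pos = np.stack([cy, cx], axis=1).astype(float)  # (K, 2) spatial centers

    yy, xx = np.mgrid[0:H, 0:W]
    pix_pos = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)
    pix_spec = cube.reshape(-1, B).astype(float)

    for _ in range(n_iter):
        d_spec = ((pix_spec[:, None, :] - spec[None]) ** 2).sum(-1)
        d_pos = ((pix_pos[:, None, :] - pos[None]) ** 2).sum(-1)
        labels = (d_spec + compactness * d_pos).argmin(axis=1)
        for k in range(len(spec)):
            m = labels == k
            if m.any():
                spec[k] = pix_spec[m].mean(axis=0)
                pos[k] = pix_pos[m].mean(axis=0)
    return labels.reshape(H, W)

rng = np.random.default_rng(0)
cube = rng.random((60, 80, 20))           # stand-in hyperspectral cube
labels = slic_like(cube)
# One mean spectrum ("patch signature") per superpixel, the kind of
# patch that could then be fed to a classifier:
signatures = np.stack([cube.reshape(-1, 20)[labels.ravel() == k].mean(axis=0)
                       for k in np.unique(labels)])
```

In the real pipeline each superpixel would be a spatial patch of the full cube rather than a mean spectrum, but the grouping logic is the same.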
As one of the primary agricultural commodities in Korea, Chinese cabbage is susceptible to disease infection. Plants exposed to high moisture are easily infected by downy mildew disease, which is identified by irregular yellow-tan spots on the upper leaf surface that damage cells and degrade product quality. An early detection system to identify and treat the disease is essential to prevent disease occurrence and reduce the plant damage it causes. Hyperspectral imaging, a non-destructive evaluation method, has recently become more popular due to its capability to capture a wide range of the light spectrum; it is sensitive enough to detect slight chemical differences within the plant. A UAV-based hyperspectral system offers high-throughput plant phenotyping with abundant data. A preliminary experiment has shown spectral differences between diseased and healthy cabbage leaves. Based on hyperspectral image data, the detection system employs a convolutional neural network (CNN) that extracts spectral and spatial features to detect the disease and its location. A 3D CNN architecture will be used in this study to further exploit the spectral variance and accurately detect the disease.
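As a concrete illustration of the core operation, the sketch below applies a single 3-D convolution kernel to a hyperspectral patch (bands × height × width); convolving across the band axis as well as the spatial axes is what lets a 3D CNN learn joint spectral-spatial features. This is pure NumPy with synthetic sizes, not the proposed network.

```python
import numpy as np

def conv3d_valid(patch, kernel):
    """Single-channel 3-D 'valid' convolution (really cross-correlation,
    as in CNNs) over a (bands, height, width) hyperspectral patch."""
    B, H, W = patch.shape
    kb, kh, kw = kernel.shape
    out = np.zeros((B - kb + 1, H - kh + 1, W - kw + 1))
    for b in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[b, i, j] = np.sum(patch[b:b+kb, i:i+kh, j:j+kw] * kernel)
    return out

rng = np.random.default_rng(1)
patch = rng.random((16, 9, 9))    # 16 bands, 9x9 pixels (synthetic)
kernel = rng.random((7, 3, 3))    # spans 7 bands and a 3x3 spatial window
features = conv3d_valid(patch, kernel)
print(features.shape)  # (10, 7, 7)
```

A 2-D CNN would collapse the band axis immediately; keeping it, as here, preserves local spectral structure for later layers.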
Unmanned aerial vehicle (UAV)-based remote sensing techniques have significant potential in agriculture and smart farming applications for the efficient monitoring of plant growth, irrigation, disease detection, and more. Most research on field phenotyping with remote sensing has been accomplished with a typical UAV equipped with an RGB or multispectral camera flown over a large farm field. Because wind disturbances affect point-cloud generation from single-camera images captured from a UAV, precise field phenotyping measurement for crop breeding and agricultural production requires the simultaneous collection of images by multiple cameras that are far enough apart to support structure-from-motion calculations. To improve digital surface models by minimizing measurement errors caused by the motion of the UAV and plants during a flight mission, a cooperative operation system of multiple UAVs was proposed to enable the simultaneous collection of images from different perspectives. A coordinated navigation system based on the Robot Operating System (ROS) was constructed to compute control commands that stabilize the pose and location of the UAVs. Using a leader-follower formation control algorithm over a wireless network, a follower UAV coordinated with a leader UAV to maintain the desired constant speed, direction, and percentage of image overlap in synchronized motion, ultimately enabling task completion in a short time and improvement of target models based on 3D reconstruction. To validate the performance of the proposed method, field phenotyping measurement errors obtained from synchronized multi-UAV image collection were compared with those from single-UAV image collection in simulation and field tests.
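The leader-follower idea can be sketched in a few lines: the follower regulates its position toward the leader's position plus a fixed formation offset, with the leader's velocity fed forward. The gains, offset, and 2-D point-mass dynamics below are illustrative assumptions, not the paper's controller.

```python
import numpy as np

def follower_cmd(leader_pos, follower_pos, offset, kp=0.8):
    """Proportional velocity correction driving the follower toward
    leader_pos + offset (2-D world frame)."""
    return kp * (leader_pos + offset - follower_pos)

offset = np.array([0.0, -5.0])      # desired formation: 5 m abeam
leader = np.array([0.0, 0.0])
follower = np.array([-3.0, -8.0])   # starts out of formation
leader_v = np.array([2.0, 0.0])     # leader flies straight at 2 m/s
dt = 0.1
for _ in range(200):
    # feed-forward leader velocity + proportional correction
    cmd = leader_v + follower_cmd(leader, follower, offset)
    leader = leader + leader_v * dt
    follower = follower + cmd * dt

# distance from the desired formation slot after 20 s
formation_error = float(np.linalg.norm(follower - (leader + offset)))
```

With the velocity feed-forward, the formation error decays geometrically by a factor of (1 − kp·dt) each step, so the follower settles into the slot while matching the leader's speed and direction.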
Cotton root rot (CRR) is a persistent soil-borne fungal disease that is devastating to cotton crops in certain fields, predominantly in Texas. Research has shown that CRR can be prevented or mitigated by applying fungicide during planting, but fungicide application is expensive. The potentially infected area within a field has been shown to be consistent, so it is possible to apply the fungicide only at locations where CRR exists, thus minimizing the amount of fungicide applied across the field. Previous studies have shown that remote sensing from manned aircraft is an effective means of delineating CRR-infected field areas. In 2015, an unmanned aerial vehicle was used to collect high-resolution remote-sensing images in a field known to be infected with CRR. A method was developed to produce a prescription map (PM) from these data, and in 2017, fungicide was applied based on a PM derived from the 2015 image data. The results showed that the PM reduced the fungicide applied by 88.3%, with a reduction in CRR area of 90% compared to 2015. A simple economic model suggested that it is generally better to treat an entire CRR-infested field rather than leaving it untreated, and application based on a PM becomes preferable as the size of the farm and the yield increase while the CRR-infestation level and the number of fields on the farm decrease.
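The qualitative economic finding can be illustrated with a toy per-field profit model; all prices, yields, costs, and infestation figures below are made-up placeholders, not the paper's data, and the model assumes treated hectares are targeted at the infested zone first.

```python
def profit(area_ha, yield_kg_ha, price_kg, infest_frac, loss_frac,
           treat_cost_ha, treated_ha):
    """Toy model: untreated infested hectares lose `loss_frac` of their
    yield; treatment costs `treat_cost_ha` per treated hectare and is
    assumed to be applied to the infested zone first."""
    untreated_infested = max(infest_frac * area_ha - treated_ha, 0.0)
    revenue = (area_ha * yield_kg_ha
               - untreated_infested * yield_kg_ha * loss_frac) * price_kg
    return revenue - treat_cost_ha * treated_ha

# 100 ha field, 20% infested; placeholder prices and losses
args = (100, 1000, 0.5, 0.2, 0.6, 50)
untreated   = profit(*args, treated_ha=0)
whole_field = profit(*args, treated_ha=100)
pm_based    = profit(*args, treated_ha=22)  # PM: infested zone + small buffer
```

Under these placeholder numbers the ordering matches the abstract's conclusion: treating the whole field beats leaving it untreated, and a prescription map beats both by cutting fungicide cost while still covering the infested area.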
The future of phenotyping for crop breeding and agricultural production will likely involve cooperation between autonomous aerial and ground-based vehicles. Consideration of this advance in autonomous systems is relatively new in the academic literature, particularly in relation to agriculture. Areas of study to date have included the broadly applied concepts of environment perception and modeling, autonomous cooperation, collaborative position control, and path planning. Multiple opportunities are emerging for the technology to be advantageous in agriculture. An example includes using an unmanned aerial vehicle (UAV) for remote sensing in cooperation with an unmanned ground vehicle (UGV) that will perform ground-based activities in accord with analysis of the remote-sensing data. This case could be applied in mapping and mitigation of insects and weeds as well as harvesting according to variations in crop yield and maturity. Another example includes using a UGV to serve as an autonomous ground-control point in order to maximize the accuracy of UAV remote-sensing data. Ongoing research in this area has shown major improvements in the accuracy of measurements of plant reflectance, height, and temperature, not to mention improvements in georectification.
Ground control points (GCPs) are critical for agricultural remote sensing tasks that require georeferencing and calibration of images collected from an unmanned aerial vehicle (UAV) at different times. However, conventional stationary GCPs are time-consuming and labor-intensive to measure, distribute, and collect in a large field setup. An autonomous mobile GCP and a cooperation strategy for communicating with the UAV were developed to improve the efficiency and accuracy of the UAV-based data collection process. Prior to actual field testing, preliminary tests demonstrated the system's capability for automatic path tracking, reducing the root mean square error (RMSE) of lateral deviation from 34.3 cm to 15.6 cm with the proposed look-ahead tracking method. The tests also indicated the feasibility of moving the reflectance reference panels between every two successive flight paths without detrimental effects on pixel values in the mosaicked images, with percentage errors in digital number values ranging from -1.1% to 0.1%. In the actual field testing, the autonomous mobile GCP successfully cooperated with the UAV in real time without interruption, showing superior performance in georeferencing, radiometric calibration, height calibration, and temperature calibration compared to the conventional method with stationary GCPs.
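One common look-ahead tracking scheme is pure pursuit: steer toward a path point a fixed distance ahead of the vehicle. The sketch below is a generic illustration of that idea with made-up geometry, gains, and a kinematic bicycle model; it is not the paper's controller.

```python
import numpy as np

def lookahead_steer(pose, path, lookahead=1.5, wheelbase=1.0):
    """Pure-pursuit style steering: aim at the first path point at
    least `lookahead` metres away, searching forward from the nearest
    path point. pose = (x, y, heading in rad)."""
    x, y, th = pose
    d = np.hypot(path[:, 0] - x, path[:, 1] - y)
    near = int(np.argmin(d))
    ahead = d[near:] >= lookahead
    idx = near + (int(np.argmax(ahead)) if ahead.any() else len(ahead) - 1)
    dx, dy = path[idx, 0] - x, path[idx, 1] - y
    # goal point expressed in the vehicle frame
    lx = np.cos(th) * dx + np.sin(th) * dy
    ly = -np.sin(th) * dx + np.cos(th) * dy
    curvature = 2.0 * ly / max(lx ** 2 + ly ** 2, 1e-9)
    return np.arctan(wheelbase * curvature)

# Track a straight path starting 1 m off to the side (kinematic bicycle).
path = np.stack([np.linspace(0, 30, 301), np.zeros(301)], axis=1)
wheelbase = 1.0
x, y, th, v, dt = 0.0, 1.0, 0.0, 1.0, 0.1
for _ in range(200):
    steer = lookahead_steer((x, y, th), path, wheelbase=wheelbase)
    x += v * np.cos(th) * dt
    y += v * np.sin(th) * dt
    th += v / wheelbase * np.tan(steer) * dt
lateral_deviation = abs(y)   # shrinks as the vehicle rejoins the path
```

A shorter look-ahead distance tracks the path more tightly but reacts more aggressively; tuning that trade-off is what drives the kind of RMSE reduction reported above.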
Commercial off-the-shelf UAV and sensor systems are touted as being able to collect remote-sensing data on crops, including spectral reflectance and plant height. Historically, a great deal of effort has gone into quantifying and reducing error levels in the geometry of UAV-based orthomosaics, but little effort has gone into quantifying and reducing error in reflectance and plant-height data. We have been developing systems and protocols involving multifunctional ground-control points (GCPs) in order to produce crop phenotypic data that are as repeatable as possible. These multifunctional GCPs aid not only geometric correction but also image calibration for reflectance and plant height. The GCPs have known spectral-reflectance characteristics that enable reference-based digital-number-to-reflectance calibration of multispectral images. They also have known platform heights that enable reference-based calibration of digital surface models to height maps. Results show that using these GCPs for reflectance and plant-height calibration significantly reduces error levels in reflectance (ca. 50% reduction) and plant-height (ca. 20% reduction) measurements.
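The digital-number-to-reflectance step is commonly done with an empirical line fit: regress the known reflectances of the reference panels against their observed digital numbers, then apply the fitted line to the whole image. The panel values below are illustrative, not measurements from this work.

```python
import numpy as np

# Known panel reflectances and their observed digital numbers (one band;
# synthetic values chosen to be roughly linear, as panels are designed to be)
panel_refl = np.array([0.05, 0.20, 0.40, 0.60])
panel_dn = np.array([410.0, 1580.0, 3140.0, 4710.0])

# Least-squares empirical line: reflectance = gain * DN + offset
gain, offset = np.polyfit(panel_dn, panel_refl, 1)

# Apply the calibration to raw image digital numbers
dn_image = np.array([[800.0, 2400.0],
                     [3900.0, 1200.0]])
refl_image = gain * dn_image + offset
```

In practice this fit is done per band and per flight, since illumination and sensor response drift between missions, which is exactly why the panels must appear in every image set.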
Ground control points (GCPs) are critical for agricultural applications that require geographic registration and radiometric and height calibration of images when overlaying images collected at different times. However, conventional GCPs are time-consuming and labor-intensive to measure, distribute, and collect around a large field. An automatic mobile GCP is a promising replacement: it can collaborate with the UAV during flight so that the mobile GCP is captured in the images, based on a proposed cooperation strategy. To investigate plant phenotyping across a field using multiple types of imaging sensors (RGB, multispectral, and thermal), two radiometric calibration references and two temperature calibration references were installed on top of the mobile GCP. The mobile GCP used for auto-guidance driving is a four-wheel platform with differential-speed steering control of the front wheels, equipped with two RTK-GPS units for position determination, a navigation computer for path planning and tracking, and an integrated driving controller for steering and traveling. The traveling angles and velocities of the wheels at each location are generated by a path-tracking algorithm to follow a predefined driving map based on the UAV's flight plan. Notably, all positions of the mobile GCP can be recorded on the UAV during the flight for future image mosaicking. The automatic mobile GCP enables reliable recognition and prediction of UAV behavior and activities in agricultural remote sensing and has the potential to improve the efficiency of field data collection.
Field-based high-throughput phenotyping is a bottleneck to future breeding advances. The use of remote sensing with unmanned aerial vehicles (UAVs) can change the way agricultural research operates by increasing the spatiotemporal resolution of data collection for monitoring plant growth status. A fixed-wing UAV (Tuffwing) was operated to collect images of a sorghum breeding research field with 70% overlap at an altitude of 120 m. The study site was located at Texas A&M AgriLife Research's Brazos Bottom research farm near College Station, Texas, USA. Relatively high-resolution (>2.7 cm/pixel) images were collected from May to July 2017 over 880 sorghum plots (including six treatments with four replications). The collected images were mosaicked and processed with structure from motion (SfM), which involves construction of a digital surface model (DSM) by interpolation of 3D point clouds. Maximum plant height for each genotype (plot) was estimated from the DSM, and height calibration was implemented with aerially measured values of ground-control points of known height. Correlations and RMSE values between actual and estimated height were observed across all sorghum genotypes and flight dates. Results indicate that the proposed height calibration method has potential for future application to improve the accuracy of plant height estimation from UAVs.
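A minimal sketch of the calibration idea: estimate plot height from the DSM as an upper quantile of canopy height, then linearly correct it using reference objects of known height. All array sizes, noise levels, and GCP readings below are synthetic placeholders, not data from this study.

```python
import numpy as np

def plot_height(dsm, ground, plot_mask, q=0.99):
    """Plot height as an upper quantile of (DSM - ground elevation)
    within the plot mask; a quantile is more robust to single noisy
    tall pixels than the raw maximum."""
    chm = dsm - ground                 # canopy height model
    return float(np.quantile(chm[plot_mask], q))

# Synthetic one-plot example: flat ground, ~1.2 m canopy with noise.
rng = np.random.default_rng(2)
ground = np.zeros((20, 20))
dsm = ground + 1.2 + rng.normal(0.0, 0.05, ground.shape)
mask = np.ones_like(dsm, dtype=bool)
raw = plot_height(dsm, ground, mask)

# Height calibration: regress true GCP heights against their
# DSM-derived readings (synthetic; DSMs often slightly underestimate).
gcp_true = np.array([0.50, 1.00, 1.50])
gcp_dsm = np.array([0.44, 0.93, 1.41])
a, b = np.polyfit(gcp_dsm, gcp_true, 1)
calibrated = a * raw + b
```

The same linear correction, fitted per flight date, is what allows height maps from different missions to be compared on a common scale.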