Autonomous detection of plant disease symptoms directly from aerial imagery

H Wu, T Wiesner‐Hanks, EL Stewart, C DeChant, N Kaczmar, MA Gore, RJ Nelson…
The Plant Phenome Journal, 2019 — Wiley Online Library
    Core Ideas
  • A deep learning model identified plant disease in UAV images with 95% accuracy.
  • Transfer learning allowed for faster model optimization.
  • This method detected plant disease symptoms at a very fine spatial scale.
The detection, diagnosis, and quantification of plant diseases using digital technologies is an important research frontier. New and accurate methods would be an asset to growers, for whom early disease detection can mean the difference between successful intervention and massive losses, and to plant breeders, who often must rely on time‐consuming phenotyping by eye. We have developed such a method for detecting an important maize (Zea mays L.) disease. Northern leaf blight [NLB; causal agent Setosphaeria turcica (Luttrell) Leonard & Suggs] is a foliar disease of maize that causes significant yield losses. Accurately measuring NLB infection is necessary both for breeding more resistant maize lines and for guiding crop management decisions. Visual disease scoring over a large area is time‐consuming, and human evaluations are subjective and prone to error. In this work, we demonstrate an automated, high‐throughput system for the detection of NLB in field images of maize plants. Using an unmanned aerial vehicle (UAV) to acquire high‐resolution images, we trained a convolutional neural network (CNN) model on lower‐resolution sub‐images, achieving 95.1% accuracy on a separate test set of sub‐images. The CNN model was used to create interpretable heat maps of the original images, indicating the locations of putative lesions. Detecting lesions at this fine spatial scale opens the potential for unprecedented high‐resolution disease detection in plant breeding and crop management strategies.
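To make the workflow described above concrete, the following is a minimal sketch of the two pieces the abstract mentions: transfer learning from a pretrained CNN fine‐tuned to classify fixed‐size sub‐images (lesion vs. no lesion), and a sliding‐window pass over a full UAV image to produce a heat map of putative lesion locations. It is written in Python with PyTorch/torchvision (>= 0.13 assumed); the backbone choice (ResNet‐18), sub‐image size, and stride are illustrative assumptions, not the authors' exact architecture or settings.

```python
# Hypothetical sketch of the abstract's pipeline: fine-tune a pretrained CNN
# on sub-images, then score overlapping windows of a full image to build a
# lesion heat map. Backbone, window size, and stride are assumptions.
import torch
import torch.nn as nn
import torchvision


def build_model(num_classes: int = 2) -> nn.Module:
    """Transfer learning: start from ImageNet weights and replace the
    classification head with a lesion / no-lesion output layer."""
    model = torchvision.models.resnet18(
        weights=torchvision.models.ResNet18_Weights.DEFAULT
    )
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model


@torch.no_grad()
def lesion_heat_map(model: nn.Module, image: torch.Tensor,
                    window: int = 224, stride: int = 112) -> torch.Tensor:
    """Slide a window over a (3, H, W) image tensor and return a coarse grid
    of lesion probabilities that can be upsampled and overlaid as a heat map."""
    model.eval()
    _, h, w = image.shape
    rows = (h - window) // stride + 1
    cols = (w - window) // stride + 1
    heat = torch.zeros(rows, cols)
    for i in range(rows):
        for j in range(cols):
            patch = image[:, i * stride:i * stride + window,
                             j * stride:j * stride + window]
            logits = model(patch.unsqueeze(0))
            heat[i, j] = torch.softmax(logits, dim=1)[0, 1]  # P(lesion)
    return heat
```

In practice only the replaced head (and optionally the later backbone layers) would be fine‐tuned on the labeled sub‐images, which is what allows the faster model optimization the Core Ideas refer to; the heat map grid can then be thresholded or upsampled to localize lesions within the original high‐resolution image.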