Publications
Open Access Issue
Wheat Lodging Ratio Detection Based on UAS Imagery Coupled with Different Machine Learning and Deep Learning Algorithms
Smart Agriculture 2021, 3 (2): 23-34
Published: 30 June 2021

Wheat lodging is a negative factor affecting wheat yield, so obtaining timely and accurate wheat lodging information is critical. Using unmanned aerial system (UAS) images for wheat lodging detection is a relatively new approach, in which researchers usually generate datasets of plot images manually. Because the manual method is inefficient, inaccurate, and subjective, this study developed a new image processing-based approach for automatically generating individual field plot datasets. Images from wheat field trials at three flight heights (15, 46, and 91 m) were collected and analyzed using machine learning (support vector machine, random forest, and K nearest neighbors) and deep learning (ResNet101, GoogLeNet, and VGG16) algorithms to test their performance in detecting three levels of wheat lodging: non- (0%), light (< 50%), and severe (> 50%) lodging. The results indicated that images collected at the 91 m (2.5 cm/pixel) flight height yielded similar, or even slightly higher, detection accuracy than images collected at the 46 m (1.2 cm/pixel) and 15 m (0.4 cm/pixel) UAS mission heights. A comparison of the random forest and ResNet101 models showed that ResNet101 achieved higher accuracy (75%) than random forest (71%), making ResNet101 a suitable model for wheat lodging ratio detection. This study recommends UAS images collected at a height of about 91 m (2.5 cm/pixel resolution), coupled with the ResNet101 model, as a useful and efficient approach for wheat lodging ratio detection.
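The abstract does not include implementation details; the sketch below shows, under assumptions, how a pretrained ResNet101 could be fine-tuned in PyTorch/torchvision to classify plot images into the three lodging levels (non-, light, severe). The folder layout plots/train/<class>/, the input size, and all training hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code): fine-tuning a pretrained ResNet101
# on three wheat lodging classes (non- / light / severe) with PyTorch.
# The directory layout "plots/train/<class>/" and all hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Resize plot images to the network's expected input and normalize with ImageNet stats.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical dataset folder: one subfolder per lodging class.
train_set = datasets.ImageFolder("plots/train", transform=tfm)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

# Load an ImageNet-pretrained ResNet101 and replace the classifier head with 3 outputs.
model = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 3)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

model.train()
for epoch in range(10):  # number of epochs is an assumption
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```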

Open Access Issue
Distinguishing Volunteer Corn from Soybean at Seedling Stage Using Images and Machine Learning
Smart Agriculture 2020, 2 (3): 61-74
Published: 30 September 2020

Volunteer corn in soybean fields is harmful because it disrupts the benefits of corn-soybean rotation. Volunteer corn not only reduces soybean yield by competing for water, nutrients, and sunlight, but also interferes with pest control (e.g., corn rootworm). It is therefore critical to monitor volunteer corn in soybean at the crop seedling stage for better management. The current visual monitoring method is subjective and inefficient. Advances in sensing and automation provide a potential solution for automatically detecting volunteer corn in soybean. In this study, corn and soybean were planted in pots in a greenhouse to mimic field conditions. Color images were collected with a low-cost Intel RealSense camera for five successive days after germination. Individual plants were manually cropped from the images and segmented using color thresholding coupled with noise removal to create a dataset. Shape (i.e., area, aspect ratio, rectangularity, circularity, and eccentricity), color (i.e., R, G, B, H, S, V, L, a, b, Y, Cb, and Cr), and texture (i.e., coarseness, contrast, linelikeness, and directionality) features of individual plants were extracted. Feature weights were ranked, and the 12 most relevant features were selected for this study. The 12 features were fed into three feature-based machine learning algorithms, support vector machine (SVM), neural network (NN), and random forest (RF), for model training. Prediction precision values on the test dataset for SVM, NN, and RF were 85.3%, 81.6%, and 82.0%, respectively. The image dataset (without feature extraction) was fed into two deep learning algorithms, GoogLeNet and VGG-16, resulting in accuracies of 96.0% and 96.2%, respectively. The best-performing models from feature-based machine learning and deep learning were then compared, and VGG-16 was recommended for distinguishing volunteer corn from soybean because of its higher detection accuracy and smaller standard deviation (STD). This research demonstrated that RGB images, coupled with the VGG-16 algorithm, can be used as a novel, reliable (accuracy > 96%), and simple tool to detect volunteer corn in soybean. The research outcome provides critical information for farmers, agronomists, and plant scientists to monitor volunteer corn infestation in soybean for better decision making and management.
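As a rough illustration of the feature-based pipeline described above, the sketch below extracts a small subset of the listed shape and color features with OpenCV and trains an SVM with scikit-learn. The folder layout plants/<class>/, the reduced feature subset, and the train/test split are assumptions for illustration only; the paper's full 12-feature set and texture features are not reproduced.

```python
# Minimal sketch (not the authors' code): a few of the shape and color features
# described in the abstract, extracted with OpenCV and fed to an SVM classifier.
# File paths, labels, and the feature subset are illustrative assumptions.
from pathlib import Path

import cv2
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def extract_features(bgr_image):
    """Return a small shape + color feature vector for one segmented plant image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    c = max(contours, key=cv2.contourArea)

    # Shape features: area, aspect ratio, rectangularity, circularity.
    area = cv2.contourArea(c)
    perimeter = cv2.arcLength(c, True)
    x, y, w, h = cv2.boundingRect(c)
    aspect_ratio = w / h
    rectangularity = area / (w * h)
    circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-9)

    # Color features: mean RGB and HSV values inside the plant mask.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    b_mean, g_mean, r_mean = cv2.mean(bgr_image, mask=mask)[:3]
    h_mean, s_mean, v_mean = cv2.mean(hsv, mask=mask)[:3]

    return [area, aspect_ratio, rectangularity, circularity,
            r_mean, g_mean, b_mean, h_mean, s_mean, v_mean]

def load_dataset(root="plants"):
    # Assumed layout: plants/soybean/*.png and plants/corn/*.png (segmented crops).
    images, labels = [], []
    for label, cls in enumerate(["soybean", "corn"]):
        for path in Path(root, cls).glob("*.png"):
            images.append(cv2.imread(str(path)))
            labels.append(label)
    return images, np.array(labels)

images, labels = load_dataset()
features = np.array([extract_features(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.3)

clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```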
