Open Access

Distinguishing Volunteer Corn from Soybean at Seedling Stage Using Images and Machine Learning

Paulo Flores¹, Zhao Zhang¹ (corresponding author), Jithin Mathew², Nusrat Jahan¹, John Stenger¹
¹ Department of Agricultural and Biosystems Engineering, North Dakota State University, Fargo, ND 58102, USA
² Department of Plant Sciences, North Dakota State University, Fargo, ND 58108, USA

Abstract

Volunteer corn in soybean fields is harmful because it disrupts the benefits of corn-soybean rotation. Volunteer corn not only reduces soybean yield by competing for water, nutrients, and sunlight, but also interferes with pest control (e.g., corn rootworm). It is therefore critical to monitor volunteer corn in soybean at the crop seedling stage for better management. The current visual monitoring method is subjective and inefficient. Advances in sensing and automation provide a potential path toward automatic detection of volunteer corn in soybean. In this study, corn and soybean were planted in pots in a greenhouse to mimic field conditions. Color images were collected with a low-cost Intel RealSense camera for five successive days after germination. Individual plants were manually cropped from the images and segmented using a color threshold coupled with noise removal to create a dataset. Shape (area, aspect ratio, rectangularity, circularity, and eccentricity), color (R, G, B, H, S, V, L, a, b, Y, Cb, and Cr), and texture (coarseness, contrast, linelikeness, and directionality) features were extracted for each plant. Feature weights were ranked, and the 12 most relevant features were selected for this study. These 12 features were fed into three feature-based machine learning algorithms for model training: support vector machine (SVM), neural network (NN), and random forest (RF). Prediction precision values on the test dataset for SVM, NN, and RF were 85.3%, 81.6%, and 82.0%, respectively. The dataset (without feature extraction) was also fed into two deep learning algorithms, GoogLeNet and VGG-16, resulting in accuracies of 96.0% and 96.2%, respectively. The best-performing feature-based and deep learning models were then compared, and VGG-16 was recommended for distinguishing volunteer corn from soybean because of its higher detection accuracy and smaller standard deviation (STD). This research demonstrates that RGB images, coupled with the VGG-16 algorithm, can serve as a novel, reliable (accuracy > 96%), and simple tool for detecting volunteer corn in soybean. The outcome provides critical information for farmers, agronomists, and plant scientists to monitor volunteer corn infestation in soybean for better decision making and management.
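As a concrete illustration of the feature-based pipeline summarized above, the sketch below segments a cropped seedling image with a green HSV threshold, computes a few of the listed shape and color features, and trains an RBF-kernel SVM. This is a minimal Python sketch using OpenCV and scikit-learn, not the authors' implementation: the threshold values, the extract_features and train_svm helpers, and the image_paths/labels inputs are illustrative assumptions, and only a subset of the 17 extracted features is shown.

import cv2
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def extract_features(bgr_image):
    """Return [area, aspect ratio, circularity, mean H, mean S, mean V] for one cropped plant image."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Assumed green range for seedlings; tune for actual lighting conditions.
    mask = cv2.inRange(hsv, np.array([25, 40, 40]), np.array([95, 255, 255]))
    # Morphological opening as a simple stand-in for the noise-removal step.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    plant = max(contours, key=cv2.contourArea)          # largest blob assumed to be the plant
    area = cv2.contourArea(plant)
    x, y, w, h = cv2.boundingRect(plant)
    aspect_ratio = w / h
    perimeter = cv2.arcLength(plant, True)
    circularity = 4 * np.pi * area / (perimeter ** 2)
    mean_h, mean_s, mean_v, _ = cv2.mean(hsv, mask=mask)  # mean color inside the plant mask
    return [area, aspect_ratio, circularity, mean_h, mean_s, mean_v]

def train_svm(image_paths, labels):
    """Train an RBF-kernel SVM from cropped-image paths and 0/1 labels (0 = soybean, 1 = corn)."""
    X = np.array([extract_features(cv2.imread(p)) for p in image_paths])
    y = np.array(labels)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
    print("Test accuracy: %.3f" % clf.score(X_test, y_test))
    return clf

The deep learning route described in the abstract skips this hand-crafted feature step entirely and instead feeds the cropped RGB images into a pretrained GoogLeNet or VGG-16 network fine-tuned for the two-class problem, which is the approach the authors ultimately recommend.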

Smart Agriculture
Pages 61-74
Cite this article:
Flores P, Zhang Z, Mathew J, et al. Distinguishing Volunteer Corn from Soybean at Seedling Stage Using Images and Machine Learning. Smart Agriculture, 2020, 2(3): 61-74. https://doi.org/10.12133/j.smartag.2020.2.3.202007-SA002


Received: 01 July 2020
Accepted: 01 August 2020
Published: 30 September 2020
© The Author(s) 2020.

This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
