Research Article | Open Access

Deciphering the contributions of spectral and structural data to wheat yield estimation from proximal sensing

Qing Li (a), Shichao Jin (a,b) (✉), Jingrong Zang (a), Xiao Wang (a), Zhuangzhuang Sun (a), Ziyu Li (a), Shan Xu (a), Qin Ma (c), Yanjun Su (d), Qinghua Guo (e), Dong Jiang (a) (✉)

a Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Regional Technique Innovation Center for Wheat Production, Ministry of Agriculture, Key Laboratory of Crop Physiology and Ecology in Southern China, Ministry of Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, Jiangsu, China
b Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, International Institute for Earth System Sciences, Nanjing University, Nanjing 210023, Jiangsu, China
c Department of Forestry, Mississippi State University, Mississippi State, MS 39759, USA
d State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
e Department of Ecology, College of Environmental Sciences, and Key Laboratory of Earth Surface Processes of the Ministry of Education, Peking University, Beijing 100871, China

Abstract

Accurate, efficient, and timely yield estimation is critical for crop variety breeding and management optimization. However, the contributions of proximal sensing data characteristics (spectral, temporal, and spatial) to yield estimation have not been systematically evaluated. We collected long-term, hyper-temporal, and large-volume light detection and ranging (LiDAR) and multispectral data to (i) identify the best machine learning method and prediction stage for wheat yield estimation, (ii) characterize the contribution of multisource data fusion and the dynamic importance of structural and spectral traits to yield estimation, and (iii) elucidate the contribution of time-series data fusion and 3D spatial information to yield estimation. Wheat yield could be estimated accurately (R² = 0.891) and in a timely manner (approximately two months before harvest) from fused LiDAR and multispectral data. The artificial neural network model and the flowering stage were consistently the best method and prediction stage, respectively. Spectral traits (such as CIgreen) dominated yield estimation, especially in the early stages, whereas the contribution of structural traits (such as height) was more stable in the late stages. Fusing spectral and structural traits increased estimation accuracy at all growth stages. Yield was better estimated from traits derived from complete 3D points than from canopy surface points, and from integrated multi-stage data (especially from jointing to heading and flowering stages) than from single-stage data. This study offers a novel perspective on the contributions of spectral, structural, and time-series information to wheat yield estimation and can guide accurate, efficient, and timely estimation of wheat yield.
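To make the fusion-then-regression idea in the abstract concrete, the sketch below trains the kind of artificial neural network regressor the study identifies as best on fused spectral and structural plot traits, and compares it with single-source models. This is a minimal illustration, not the authors' pipeline: the data are synthetic, and the trait counts, network architecture, train/test split, and scikit-learn API choices are all assumptions.

# Illustrative sketch of fused-trait wheat yield regression (NOT the
# authors' code): synthetic plot data, hypothetical trait sets, and an
# assumed MLP architecture, for demonstration only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_plots = 300

# Spectral traits (e.g., CIgreen from multispectral imagery) and
# structural traits (e.g., canopy height from LiDAR), one row per plot.
spectral = rng.normal(size=(n_plots, 4))    # hypothetical vegetation indices
structural = rng.normal(size=(n_plots, 3))  # hypothetical height/cover metrics
yield_t_ha = (1.5 * spectral[:, 0] + 1.0 * structural[:, 0]
              + rng.normal(scale=0.5, size=n_plots) + 6.0)

def fit_and_score(X, y):
    """Train an ANN regressor and return test-set R^2."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0),
    )
    model.fit(X_tr, y_tr)
    return r2_score(y_te, model.predict(X_te))

# Compare each single-source trait set against the fused feature set.
print("spectral only  :", fit_and_score(spectral, yield_t_ha))
print("structural only:", fit_and_score(structural, yield_t_ha))
fused = np.hstack([spectral, structural])
print("fused          :", fit_and_score(fused, yield_t_ha))

On real plot data one would expect, in line with the paper's finding, the fused feature set to match or exceed either single-source model; extending the feature matrix with traits from multiple growth stages would sketch the time-series fusion the abstract also evaluates.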

The Crop Journal
Pages 1334-1345
Cite this article:
Li Q, Jin S, Zang J, et al. Deciphering the contributions of spectral and structural data to wheat yield estimation from proximal sensing. The Crop Journal, 2022, 10(5): 1334-1345. https://doi.org/10.1016/j.cj.2022.06.005

Received: 19 December 2021
Revised: 19 January 2022
Accepted: 05 July 2022
Published: 19 July 2022
© 2022 Crop Science Society of China and Institute of Crop Science, CAAS.

This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
