
Research Article | Open Access

Volume 2022 | Article ID 9851096 | https://doi.org/10.34133/2022/9851096

Chu Zhang, Lei Zhou, Qinlin Xiao, Xiulin Bai, Baohua Wu, Na Wu, Yiying Zhao, Junmin Wang, Lei Feng, "End-to-End Fusion of Hyperspectral and Chlorophyll Fluorescence Imaging to Identify Rice Stresses", Plant Phenomics, vol. 2022, Article ID 9851096, 14 pages, 2022. https://doi.org/10.34133/2022/9851096

End-to-End Fusion of Hyperspectral and Chlorophyll Fluorescence Imaging to Identify Rice Stresses

Received: 19 Mar 2022
Accepted: 03 Jul 2022
Published: 02 Aug 2022

Abstract

Herbicides and heavy metals are hazardous environmental pollutants that stress plants and harm humans and animals. Identifying the type of stress can help trace stress sources, manage plant growth, and improve stress-resistant breeding. In this research, hyperspectral imaging (HSI) and chlorophyll fluorescence imaging (Chl-FI) were adopted to identify rice plants under two types of herbicide stress (butachlor (DCA) and quinclorac (ELK)) and two types of heavy metal stress (cadmium (Cd) and copper (Cu)). Visible/near-infrared spectra of leaves (L-VIS/NIR) and stems (S-VIS/NIR) extracted from HSI and chlorophyll fluorescence kinetic curves of leaves (L-Chl-FKC) and stems (S-Chl-FKC) extracted from Chl-FI were fused to establish models to detect stress from these hazardous substances. Novel end-to-end deep fusion models were proposed for low-level, middle-level, and high-level information fusion to improve identification accuracy. Results showed that the high-level fusion-based convolutional neural network (CNN) models reached the highest detection accuracy (97.7%), outperforming the models using a single data source (<94.7%). Furthermore, the proposed end-to-end deep fusion models required a much simpler training procedure than the conventional two-stage deep learning fusion. This research provides an efficient alternative for plant stress phenotyping, including the identification of plant stresses caused by hazardous environmental pollutants.

1. Introduction

Rice (Oryza sativa L.) is one of the main staple foods in the world. With the development of breeding techniques, high-yield and stress-resistant varieties have been developed and promoted to increase rice yield. However, yield will plateau once the planted area no longer increases. Meanwhile, climate instability and biotic and abiotic stresses pose great threats to rice production. Although stress-resistant varieties have been bred, each can cope with only a few stresses, and these varieties still suffer in complex growth environments, resulting in yield and quality losses.

Herbicides and heavy metals are major hazardous substances causing environmental pollution and abiotic stress in plants. Exposure to herbicides or heavy metals might not kill rice plants, and plants under relatively low concentrations of herbicides and heavy metals might not show obvious differences in symptoms. In this situation, it is hard to identify the stress type. Moreover, stresses affect plant growth, and different plant organs may respond differently; acquiring phenotyping traits from different organs can therefore provide complementary information, improving the precision and stability of plant growth monitoring. Accurately and automatically identifying abiotic stress types yields plant growth information and helps find the optimal way to deal with the stresses. Knowing what happens to the plants and how to treat them makes it possible to minimize the influence of abiotic stresses and stabilize yield and quality.

High-throughput phenotyping of plants can help obtain a large number of phenotyping traits in a digital and automatic manner. The development of modern analytical techniques has made great contributions to high-throughput plant phenotyping [1–5]. The large volumes of data acquired by analytical instruments contain diverse information related to plant growth status, and the derived phenotyping traits can be used to evaluate it. High-throughput phenotyping also helps study the relationship between phenomics and genomics [6, 7]. Hyperspectral imaging (HSI) [8–11] and chlorophyll fluorescence imaging (Chl-FI) [11–13] are two widely used techniques for high-throughput plant phenotyping, providing different information on plant growth. HSI and Chl-FI have been studied for heavy metal and herbicide stresses [14–17].

HSI integrates spectroscopy and imaging, capturing spectral information for each pixel within the image. The spectral information is related to chemical compositions and physiological and biochemical reactions, whereas the image information relates to external traits such as plant structure, color, and morphological features. Full-range spectra, feature wavelengths, and spectral indices derived from full-range spectra are most commonly used for analysis. Chl-FI has also been used for high-throughput plant phenotyping. It captures chlorophyll fluorescence signals from the samples, which relate to photosynthesis. Chlorophyll fluorescence kinetic parameters are most commonly used for analysis, and some studies have used chlorophyll fluorescence spectra [18–20].

Previous studies have proved the effectiveness of HSI and Chl-FI individually for plant stress phenotyping. The combination of HSI and Chl-FI has also been studied [21–23], although most of these studies analyzed HSI and Chl-FI separately [21–24]. Since HSI and Chl-FI acquire different phenotyping traits based on different principles, fusing HSI and Chl-FI features for plant stress type identification can exploit the complementary information of different phenotyping traits. Information fusion has been widely used to integrate multimodal or multisensor data, improving analysis performance for different purposes with higher precision and reliability. The reviews in [25] and [26] discussed the potential of information fusion of HSI and Chl-FI and of information fusion for plant stress phenotyping, respectively. Various studies have shown good performance of information fusion for plant phenotyping, including the fusion of data acquired by different types of sensors (representing different techniques) [27–35].

Deep learning has been one of the most active topics in machine learning and artificial intelligence. With the ability to learn deep, representative features from big data, deep learning has been used in various fields, including high-throughput plant phenotyping [26, 36–40]. According to previous studies, shallow convolutional neural network (CNN) models can work well on one-dimensional (1D) spectral data [14, 41]. To our knowledge, no previous studies have used Chl-FKC as inputs to a CNN. The 1D Chl-FKC is similar in structure to VIS/NIR spectra. Owing to its strong feature learning ability, deep learning can fuse features from VIS/NIR spectra and Chl-FKC to more fully reveal the phenotyping information of plants. Researchers have already conducted deep learning-based information fusion for plant phenotyping [38, 42, 43].

In general, there are three levels of information fusion: low-level fusion (fusion of original data), middle-level fusion (fusion of features extracted by a feature extractor), and high-level fusion (decision fusion) [44–46]. However, most existing information fusion models are built on a two-stage training procedure requiring separate feature extractors and classifiers. The first stage trains a feature extractor (or directly uses a manually defined one) to produce the features for fusion; the second stage trains another model for discrimination based on the fused features [27, 28, 33]. Such information fusion models are complex and need manual intervention. Recent studies have developed end-to-end deep fusion models that combine feature extraction and modeling in one model [47, 48]. The features are automatically learned, fused, and fed into the classifier. End-to-end deep fusion models have simpler training procedures and are more applicable to real-world applications.

This study focused on identifying different types of abiotic stresses using HSI and Chl-FI techniques. The specific objectives were to (1) identify two types of herbicide stresses (butachlor (DCA) and quinclorac (ELK)) and two types of heavy metal stresses (cadmium (Cd) and copper (Cu)), (2) explore the performances of the stress type identification using different techniques (HSI and Chl-FI) and different organs (leaves and stems), and (3) explore the stress type identification using the three levels of end-to-end deep fusion to fuse the information acquired by different techniques (HSI and Chl-FI) and different organs (leaves and stems).

2. Materials and Methods

2.1. Sample Preparation

The rice variety used in this study was Zhongheyou 4, provided by the Institute of Crop Science and Nuclear Technology Utilization, Zhejiang Academy of Agricultural Sciences. The rice seeds were sown onto a seedbed. One month later, the rice seedlings were moved to the laboratory for stress treatments and transplanted into plug trays with nutrient soil. Regular water and fertilizer management was conducted. One week after transplantation, the seedlings were treated with the different stresses.

Two herbicides were used, butachlor (DCA) and quinclorac (ELK), and two heavy metals were used, copper (Cu) and cadmium (Cd). For butachlor stress, butachlor solutions with a 50% active constituent were purchased from a local pesticide shop, and the applied doses were 0 (control (CK)), 1.5, 3, and 6 mL/L. For quinclorac stress, quinclorac powder with a 99% active constituent was purchased, and the applied doses were 0 (CK), 0.56, 1.12, and 2.24 g/L. For Cu and Cd stresses, the concentrations of Cu and Cd were 0 (CK), 10, 30, and 60 μmol/L. Different concentrations of herbicides and heavy metals were used to include more sample variation within each type of stress. In all, the numbers of samples under the CK, Cd, Cu, DCA, and ELK treatments were 240, 360, 360, 268, and 358, respectively.

After the initial stress treatment, regular water and fertilizer management was conducted. The throughput of hyperspectral and chlorophyll fluorescence image acquisition limited the number of samples that could be imaged in a single day, so the four stress treatments were applied on four successive days. One week later, image acquisition for the corresponding stresses was conducted on another four successive days for the second batch of samples. For both hyperspectral and chlorophyll fluorescence image acquisition, leaves and stems were cut from the seedlings and imaged separately.

2.2. Hyperspectral Image Acquisition

An assembled hyperspectral imaging system (as described previously in [49]) was used to acquire hyperspectral images of rice leaves and stems. The system covers the spectral range from 380 to 1030 nm and integrates a spectrograph, a camera with a lens, and a tungsten halogen light source. The samples were placed on a conveyor belt driven by a stepper motor. The hyperspectral images were collected in line-scan mode and calibrated by black-white calibration:

$$I_c = \frac{I_r - I_b}{I_w - I_b}$$

where $I_c$ denotes the calibrated spectral image, and $I_r$, $I_b$, and $I_w$ denote the raw image, black reference image, and white reference image, respectively.
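For illustration, this calibration is simple per-pixel array arithmetic; the following is a minimal NumPy sketch, assuming the raw, dark-reference, and white-reference images are already loaded as same-shaped arrays (all variable names and shapes are illustrative).

```python
import numpy as np

def black_white_calibration(raw, black, white, eps=1e-8):
    # Per-pixel, per-band relative reflectance: (raw - black) / (white - black).
    # eps guards against division by zero in saturated or dead pixels.
    return (raw - black) / (white - black + eps)

# Synthetic example; real images would have shape (lines, samples, bands).
raw = np.random.rand(100, 200, 396)
black = np.full_like(raw, 0.01)   # dark current reference
white = np.ones_like(raw)         # white board reference
calibrated = black_white_calibration(raw, black, white)
```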

The system parameters were adjusted to acquire clear, undistorted images. The camera exposure time was set to 0.027 s, the speed of the conveyor belt was set to 3.4 mm/s, and the distance between the lens and the belt was adjusted to 26 cm. To cover as much of the tested plant as possible, each plant was cut into segments and placed symmetrically under the camera (on the conveyor belt) for line-scan image collection.

2.3. Chlorophyll Fluorescence Image Acquisition

A pulse-amplitude-modulated chlorophyll fluorescence imaging system (FluorCam FC800, Photon Systems Instruments, Brno, Czechia) (as described previously in [50]) was used to acquire chlorophyll fluorescence images. The system consists of a CCD camera with an industrial lens SV-H1.4/6 (VS Technology, Tokyo, Japan). The light source is formed by five light-emitting diodes (LEDs), and an elevating table (HTVS120, SPL, Hangzhou, China) is used to adjust the distance between the samples and the lens. The image acquisition procedure was the same as in [51], while the system parameters differed. When acquiring the chlorophyll fluorescence images, the actinic light, saturating flashes, exposure time, and sensitivity were set to 90%, 75%, 33.3 μs, and 33.3%, respectively. The segmented plants were placed inside a dark room (within the area the camera could cover) for chlorophyll fluorescence image acquisition.

2.4. Spectral Data Extraction and Chlorophyll Fluorescence Kinetic Curve Extraction
2.4.1. Spectral Curve Extraction

For the hyperspectral images, the leaves of a sample were identified as one region of interest (ROI) and the stem as another. Because the two ends of the spectral range contained obvious instrument noise, only the visible/near-infrared (VIS/NIR) spectra in the range of 454-957 nm (396 wavebands) were used for analysis. The pixel-wise spectra within each ROI were preprocessed by wavelet transform (wavelet function Daubechies 8 with decomposition level 3), and all pixel-wise spectra within each ROI were averaged as the spectrum of the sample. In this way, the VIS/NIR spectra of leaves (L-VIS/NIR) and stems (S-VIS/NIR) were obtained for each rice plant. For the chlorophyll fluorescence images, the leaves and the stem were likewise identified as ROIs, and the pixel-wise Chl-FKC within each ROI were averaged as the chlorophyll fluorescence kinetic curve of the sample, yielding the Chl-FKC of leaves (L-Chl-FKC) and stems (S-Chl-FKC) for each rice plant. Maximum normalization was applied to the VIS/NIR spectra and the Chl-FKC before further modeling.
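A minimal sketch of this extraction pipeline is shown below, using PyWavelets for the db8, level-3 wavelet step; the soft-thresholding rule is a common denoising choice assumed here, since the text specifies only the wavelet function and decomposition level.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(spectrum, wavelet='db8', level=3):
    # Decompose with Daubechies-8 at level 3 (as specified in the text),
    # soft-threshold the detail coefficients (assumed rule), and reconstruct.
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
    coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

def roi_spectrum(pixel_spectra):
    # Average the denoised pixel-wise spectra within an ROI, then apply
    # maximum normalization as described in the text.
    denoised = np.array([wavelet_denoise(s) for s in pixel_spectra])
    mean_spectrum = denoised.mean(axis=0)
    return mean_spectrum / mean_spectrum.max()
```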

2.4.2. Dataset Preparation

To establish the models, the class labels of the samples in the control group (CK) and of the samples under Cd, Cu, DCA, and ELK stress were assigned as 0, 1, 2, 3, and 4, respectively. A typical dataset split approach from the deep learning area was adopted: for each group, the samples were randomly split into training, validation, and testing sets [52–54] at a ratio of 4 : 1 : 1. The sample assignments to the training, validation, and testing sets were the same for every single data source. The details of the dataset are listed in Table 1.


Table 1: Details of the training, validation, and testing sets for each data source.

Data source   Number of features   Number of samples
                                   Training   Validation   Testing
L-VIS/NIR     396                  1060       265          261
S-VIS/NIR     396                  1060       265          261
L-Chl-FKC     286                  1060       265          261
S-Chl-FKC     286                  1060       265          261
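The 4 : 1 : 1 split can be reproduced with two successive stratified splits; the sketch below uses scikit-learn with synthetic labels (the paper does not specify the splitting tool, so this is an illustrative implementation).

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic labels standing in for the five classes (CK=0, Cd=1, Cu=2,
# DCA=3, ELK=4); in the study, the split was done per class group.
n_samples = 1586            # 1060 + 265 + 261 in Table 1
labels = np.random.randint(0, 5, n_samples)
indices = np.arange(n_samples)

# Peel off the training set (4/6), then split the rest 1:1 into
# validation and testing, giving 4:1:1. The same index split is reused
# for all four data sources so each sample stays in the same subset.
train_idx, rest_idx = train_test_split(indices, test_size=2/6,
                                       stratify=labels, random_state=0)
val_idx, test_idx = train_test_split(rest_idx, test_size=0.5,
                                     stratify=labels[rest_idx], random_state=0)
```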

2.5. Classification Models
2.5.1. Conventional Machine Learning Methods

The support vector machine (SVM) is a widely used pattern recognition method [55]. For linearly separable problems, a linear classifier is developed; for nonlinear classification, SVM maps the original data into a high-dimensional space via kernel functions and establishes hyperplanes that maximally separate the closest training samples of different classes. In this study, SVM served as a baseline for comparison with the deep learning approaches.
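Such a baseline can be set up in a few lines with scikit-learn, which the study used (Section 2.7); the grid values below are illustrative assumptions, as are the cross-validation setting and variable names.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# RBF-kernel SVM with a joint grid search over C and gamma. The
# logarithmic grid is an illustrative assumption; the paper optimized
# both parameters over a common range.
param_grid = {'C': [10.0**k for k in range(-4, 5)],
              'gamma': [10.0**k for k in range(-4, 5)]}
svm = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
# svm.fit(X_train, y_train)   # X_train: (n_samples, n_features) spectra
```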

2.5.2. Deep Learning Method

The convolutional neural network (CNN) is a widely used deep learning algorithm. In this study, the extracted VIS/NIR spectra and Chl-FKC were used as inputs to CNN models to identify the stress types.

A 1D CNN architecture was designed as the base classifier for processing each single data source; this model was denoted CNN-S. Figure 1 shows the shallow CNN architectures used in this study. The first part was an attention layer, which operated as

$$\mathbf{a} = f(W_2 f(W_1 \mathbf{x} + b_1) + b_2), \qquad \mathbf{x}' = \mathbf{a} \odot \mathbf{x},$$

where $W_1$, $b_1$, $W_2$, and $b_2$ are the trainable parameters (weights and biases) in the attention layer, $f$ is the activation function, $\mathbf{x}$ denotes the input, and $\odot$ denotes element-wise multiplication. The second part was a 1D convolution block consisting of three 1D convolution layers (kernel size of 3, stride of 1, ReLU activation). To reduce the feature dimension, a max-pooling layer (pool size of 2, stride of 2) was added after the first convolution layer. The third part was a fully connected network of three dense layers with 512, 128, and 5 neurons, respectively, using ReLU activation. Furthermore, a batch normalization layer was added before each convolution and dense layer, and softmax cross-entropy was used as the loss function.
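This description maps onto a compact network; below is a minimal MXNet Gluon sketch (the framework named in Section 2.7). The convolution channel counts are not stated in the text, so the values here (16/32/32) are assumptions, and the attention layer is rendered as a simple dense gating block consistent with the formula above.

```python
from mxnet.gluon import nn
from mxnet.gluon.loss import SoftmaxCrossEntropyLoss

class CNNS(nn.HybridBlock):
    """Sketch of the CNN-S base classifier (channel counts are assumed)."""
    def __init__(self, n_features, n_classes=5, **kwargs):
        super().__init__(**kwargs)
        self.att = nn.Dense(n_features)   # attention scores over input features
        self.body = nn.HybridSequential()
        self.body.add(
            nn.BatchNorm(), nn.Conv1D(16, kernel_size=3, strides=1, activation='relu'),
            nn.MaxPool1D(pool_size=2, strides=2),
            nn.BatchNorm(), nn.Conv1D(32, kernel_size=3, strides=1, activation='relu'),
            nn.BatchNorm(), nn.Conv1D(32, kernel_size=3, strides=1, activation='relu'),
            nn.Flatten(),
            nn.BatchNorm(), nn.Dense(512, activation='relu'),
            nn.BatchNorm(), nn.Dense(128, activation='relu'),
            nn.BatchNorm(), nn.Dense(n_classes))

    def hybrid_forward(self, F, x):
        # x: (batch, 1, n_features). Gate the input element-wise, then classify.
        a = F.sigmoid(self.att(F.flatten(x)))
        x = F.broadcast_mul(x, a.expand_dims(axis=1))
        return self.body(x)

loss_fn = SoftmaxCrossEntropyLoss()  # softmax cross-entropy, as in the text
```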

2.6. Fusion Strategies

Leaves and stems showed different physicochemical characteristics under stress. During the seedling stage of rice, the leaves and stems are all green, and the VIS/NIR spectra and Chl-FKC of leaves and stems are quite similar (Figures 2 and 3). Since HSI and Chl-FI acquire different phenotyping traits based on different principles, each technique alone provides limited plant phenotyping information. Thus, whether the fusion of HSI and Chl-FI can help identify rice stress types is worth investigating. With two organs (leaves and stems) and two techniques (HSI and Chl-FI), five different fusion strategies were examined, as listed in Table 2. It should be noted that the L-VIS/NIR, S-VIS/NIR, L-Chl-FKC, and S-Chl-FKC of the same sample were used for fusion.


Table 2: Fusion strategies of the four types of features.‡

Fusion strategy   L-VIS/NIR   S-VIS/NIR   L-Chl-FKC   S-Chl-FKC
Fusion 1          ✓           ✓           ×           ×
Fusion 2          ×           ×           ✓           ✓
Fusion 3          ✓           ×           ✓           ×
Fusion 4          ×           ✓           ×           ✓
Fusion 5          ✓           ✓           ✓           ✓

‡The symbol ✓ means that the corresponding features were used for fusion, and the symbol × means that they were not.

This study explored three levels of information fusion (low level, middle level, and high level), and end-to-end deep fusion models were developed for each. For low-level fusion, the original data of each feature type were concatenated directly into a long vector, and the classification task was performed by the CNN-S model according to the fusion strategies in Table 2.
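As a concrete example, low-level fusion under the Fusion 5 strategy simply stacks the four raw vectors of a sample end to end (the array contents below are synthetic stand-ins):

```python
import numpy as np

# One sample's four raw feature vectors (synthetic stand-ins).
l_vis = np.random.rand(396)   # L-VIS/NIR
s_vis = np.random.rand(396)   # S-VIS/NIR
l_fkc = np.random.rand(286)   # L-Chl-FKC
s_fkc = np.random.rand(286)   # S-Chl-FKC

fused = np.concatenate([l_vis, s_vis, l_fkc, s_fkc])
assert fused.shape == (1364,)  # 396 + 396 + 286 + 286 features for CNN-S
```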

For middle-level fusion, an end-to-end CNN architecture was proposed to fuse the different types of original spectra for plant stress type identification. Each type of original feature was fed into its corresponding feature extractor, and the model required only one-stage training. The fusion strategies for the end-to-end CNN followed Table 2. The middle-level fusion model fused the deep features for further classification. Figure 4 presents the end-to-end CNN model for middle-level fusion using all four types of features. Four CNN-S models were employed to process the different data sources, each with its last dense layer removed; the 128-dimensional output of the second dense layer of each CNN-S submodel was taken as its deep features. These four vectors were weighted and concatenated for fusion.

The middle-level fusion could be described as

$$F_{\mathrm{fused}} = \mathrm{Concat}(F_1, F_2, F_3, F_4),$$

where $\mathrm{Concat}(\cdot)$ concatenates all inputs to generate one vector, and $F_1$, $F_2$, $F_3$, and $F_4$ are the deep features from L-VIS/NIR, S-VIS/NIR, L-Chl-FKC, and S-Chl-FKC, respectively.
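A minimal Gluon sketch of this end-to-end middle-level fusion is given below; each branch is assumed to be a CNN-S trunk ending at the 128-neuron dense layer (last dense layer removed), and the per-branch weighting from the text is folded into the trainable layers for brevity.

```python
from mxnet.gluon import nn

class MiddleFusion(nn.HybridBlock):
    """End-to-end middle-level fusion sketch: four truncated CNN-S branches
    produce 128-dim deep features that are concatenated and classified."""
    def __init__(self, branches, n_classes=5, **kwargs):
        super().__init__(**kwargs)
        # `branches` is a list of four CNN-S trunks (last dense layer removed).
        for i, branch in enumerate(branches):
            setattr(self, 'branch%d' % i, branch)   # registers each child block
        self.head = nn.Dense(n_classes)             # classifier on fused features

    def hybrid_forward(self, F, x1, x2, x3, x4):
        feats = [getattr(self, 'branch%d' % i)(x)
                 for i, x in enumerate((x1, x2, x3, x4))]
        fused = F.concat(*feats, dim=1)             # (batch, 4 * 128)
        return self.head(fused)
```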

For high-level fusion, another end-to-end CNN model was designed (shown in Figure 5). In this paper, high-level fusion can also be understood as decision fusion, which produces the final classification decision from the outputs of different classifiers. Unlike the middle-level fusion model, the high-level fusion model used four complete CNN-S models to process the four data sources. The outputs of the four submodels (four vectors, each with five elements) were then concatenated and fed into another dense layer with five neurons. It should be pointed out that a new loss function was proposed especially for this deep decision fusion model:

$$Loss = L_{CE}(\hat{y}, y) + \sum_{i=1}^{4} L_{CE}(\hat{y}_i, y),$$

where $\hat{y}$ denotes the output of the whole decision fusion-based deep learning model; $\hat{y}_1$, $\hat{y}_2$, $\hat{y}_3$, and $\hat{y}_4$ are the outputs of the four submodels; $y$ denotes the class label; and $L_{CE}$ denotes the cross-entropy loss. This loss function simultaneously drives the final classification loss and the losses of the submodels to be as small as possible. Moreover, because the loss function constrains the outputs of the submodels, the fused $\hat{y}_1$, $\hat{y}_2$, $\hat{y}_3$, and $\hat{y}_4$ were forced to approach the expected probability distribution, so their combination constitutes a typical decision fusion.
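The combined objective is straightforward to express in code; a minimal Gluon sketch, assuming `y_final` holds the fused output and `sub_outputs` the four submodel outputs:

```python
from mxnet.gluon.loss import SoftmaxCrossEntropyLoss

ce = SoftmaxCrossEntropyLoss()

def decision_fusion_loss(y_final, sub_outputs, y_true):
    # Cross-entropy on the fused output plus cross-entropy on each of the
    # four submodel outputs, so both the final decision and the individual
    # decisions are pushed toward the true labels.
    loss = ce(y_final, y_true)
    for y_sub in sub_outputs:          # four 5-element output vectors
        loss = loss + ce(y_sub, y_true)
    return loss
```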

To evaluate the proposed methods, two-stage middle- and high-level fusion approaches were used for comparison; only the two-stage fusion of all four feature types was implemented. For two-stage middle-level fusion, feature extraction, feature fusion, and modeling were conducted separately: the features were learned and extracted by the CNN-S models trained on each single feature type, all extracted features were concatenated, and the fused features were fed into another CNN-S model. For two-stage high-level fusion, the predicted probability distribution vectors of the CNN-S models for the four data sources were first extracted, and the averaged probability distribution vectors were used to make the final decision.
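For reference, the two-stage high-level decision step reduces to averaging probability vectors and taking the argmax (the arrays below are synthetic stand-ins for the four CNN-S softmax outputs):

```python
import numpy as np

# Four predicted probability distributions per sample (n_samples, 5 classes).
p1, p2, p3, p4 = (np.random.dirichlet(np.ones(5), size=261) for _ in range(4))
avg_probs = np.mean([p1, p2, p3, p4], axis=0)   # average the four vectors
final_pred = avg_probs.argmax(axis=1)           # final decision per sample
```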

2.7. Model Performance Evaluation and Software

Classification accuracy was evaluated as the ratio of the number of correctly classified samples to the total number of samples. SVM was implemented with scikit-learn (version 1.0.1) in Python 3.7, and the CNN models were implemented with the MXNet framework (Amazon, Seattle, WA, United States) in Python 3.7.
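This metric is plain sample-level accuracy:

```python
import numpy as np

def accuracy(y_true, y_pred):
    # Ratio of correctly classified samples to the total number of samples.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float((y_true == y_pred).mean())
```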

3. Results

3.1. Profiles of VIS/NIR Spectra and Chl-FKC

Figure 2 shows the average VIS/NIR spectra of the leaves (Figure 2(a)) and stems (Figure 2(b)) of the healthy rice plants and of the plants under Cd, Cu, DCA, and ELK stress. Figure 3 shows the average Chl-FKC of the leaves (Figure 3(a)) and stems (Figure 3(b)). As shown in Figures 2 and 3, typical hyperspectral profiles of plants can be observed for the leaves and stems, as can typical Chl-FKC. Differences can be found between the leaves and stems under different stresses, but no consistent pattern (for example, which stress yields higher reflectance or fluorescence intensity) could be identified. There were also variations during measurement caused by the samples, instruments, and measurement conditions. Thus, it was difficult to identify the stress types by visually comparing the spectral curves or Chl-FKC, and further investigation was required for stress type identification.

3.2. Model Establishment Using a Single Type of Feature

The full VIS/NIR spectra and Chl-FKC of leaves and stems were used as inputs of the SVM and CNN-S models; Table 3 shows the statistical results of the classification models. To construct the SVM and CNN-S models, the training and validation sets were used, and the optimal models were selected according to the classification performance on the validation sets. For SVM, the kernel function was chosen as "rbf," and the penalty parameter C and kernel parameter γ were optimized over the same search range. For all deep learning models in this study, the number of epochs was set as 500. A scheduled learning rate was used, starting at 0.005 for the first 100 epochs and reduced to one-tenth every 100 epochs thereafter. The batch size was set as 128.
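The epoch-based schedule can be expressed with an MXNet step scheduler; the sketch below converts the 100-epoch steps into iteration counts (the optimizer choice and iteration bookkeeping are assumptions, as the text specifies only the learning rate schedule, epochs, and batch size).

```python
import mxnet as mx

# 500 epochs, batch size 128, training set of 1060 samples.
iters_per_epoch = (1060 + 127) // 128          # ceil(1060 / 128) = 9
steps = [100 * iters_per_epoch * k for k in range(1, 5)]

# Start at 0.005 and multiply by 0.1 (i.e., reduce to one-tenth)
# every 100 epochs.
scheduler = mx.lr_scheduler.MultiFactorScheduler(step=steps, factor=0.1)
scheduler.base_lr = 0.005

# trainer = mx.gluon.Trainer(net.collect_params(), 'adam',
#     {'learning_rate': 0.005, 'lr_scheduler': scheduler})  # optimizer assumed
```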


Table 3: Classification accuracy (%) of the SVM and CNN-S models using a single type of feature.

Dataset type   Model   Training   Validation   Testing
L-VIS/NIR      SVM     100        91.7         92.0
               CNN     98.8       95.1         92.7
S-VIS/NIR      SVM     96.2       87.9         94.7
               CNN     98.4       91.3         92.0
L-Chl-FKC      SVM     100        74.3         75.5
               CNN     100        77.4         75.5
S-Chl-FKC      SVM     92.9       78.1         78.9
               CNN     99.9       75.8         76.2

Both the CNN-S and SVM models performed well for L-VIS/NIR and S-VIS/NIR. The CNN-S models obtained slightly better results than the corresponding SVM models, with classification accuracies of the training, validation, and testing sets all over 90%. For L-Chl-FKC and S-Chl-FKC, neither the CNN-S models nor the SVM models obtained satisfactory results, with validation and testing accuracies below 80%; the CNN-S and SVM models obtained close results. It should be noted that overfitting occurred for both the SVM and CNN models using the Chl-FKC. For VIS/NIR spectra, the classification models of leaves performed better than the corresponding models of stems, indicating that leaves were more suitable for rice stress type identification when using HSI. For Chl-FKC, the classification models of leaves obtained results close to those of stems.

Figure 6 shows the confusion matrices of the CNN models using the four data types. No consistent pattern could be found in the misclassification of samples among classes: a sample under one type of stress could be misclassified as any other type. The confusion matrices nevertheless demonstrated the feasibility of identifying the Cd, Cu, DCA, and ELK stress types using HSI and Chl-FI.

3.3. Model Establishment Using Fused Datasets

For low-level fusion, the fused datasets were constructed according to the fusion strategies in Table 2 and used as inputs to CNN-S models with the same architecture as in Figure 1. The parameter settings of these CNN-S models were the same as those of the CNN-S models for a single feature type. The results of the CNN models using the fused datasets are presented in Table 4. The CNN-S model fusing L-VIS/NIR and S-VIS/NIR obtained slightly better results than those using the VIS/NIR spectra of leaves and stems individually, and the CNN-S model fusing L-Chl-FKC and S-Chl-FKC obtained significantly better results than those using the Chl-FKC of leaves and stems individually. For the fusion of VIS/NIR spectra and Chl-FKC of leaves, the CNN model performed worse than the CNN-S model using L-VIS/NIR but significantly better than the CNN-S model using L-Chl-FKC; a similar pattern held for stems. Models fusing L-VIS/NIR and L-Chl-FKC outperformed the corresponding models fusing S-VIS/NIR and S-Chl-FKC. The CNN-S model fusing the VIS/NIR spectra and Chl-FKC of both leaves and stems performed well, close to the CNN-S model using L-VIS/NIR.


Table 4: Classification accuracy (%) of the models using the fused datasets.

Fusion level   Fusion strategy       Training   Validation   Testing
Low level      Fusion 1              99.9       97.4         94.6
               Fusion 2              100        79.2         80.5
               Fusion 3              100        91.7         89.3
               Fusion 4              100        85.3         85.8
               Fusion 5              100        95.1         93.1
               Fusion 5-TwoStage§    /          /            /
Middle level   Fusion 1              97.6       95.1         92.3
               Fusion 2              99.9       84.5         82.8
               Fusion 3              100.0      95.5         91.6
               Fusion 4              99.6       84.9         85.4
               Fusion 5              100        95.1         96.6
               Fusion 5-TwoStage||   100        97.4         97.3
High level     Fusion 1              99.6       95.5         95.4
               Fusion 2              100        94.3         93.5
               Fusion 3              99.5       89.1         87.0
               Fusion 4              100        86.4         85.1
               Fusion 5              100        97.4         97.7
               Fusion 5-TwoStage¶    100        97.7         97.7

§Low-level fusion directly fuses the raw data as the input of the model, so no two-stage counterpart exists. ||The two-stage middle-level fusion model fuses the deep features extracted by the single models (CNN-S); another CNN classifier is trained to process the fused features. ¶The two-stage high-level fusion model fuses the four predicted probability distribution vectors (each with five elements); the averaged probability distribution vectors are used to make the final decision.

For the end-to-end middle-level fusion approach, the original VIS/NIR spectra and Chl-FKC of leaves and stems were fed into the end-to-end deep learning fusion model (shown in Figure 4) according to the fusion strategies in Table 2. Fusing L-VIS/NIR and S-VIS/NIR, the CNN model obtained good results, with classification accuracies of the training, validation, and testing sets over 90%. The CNN model fusing L-Chl-FKC and S-Chl-FKC obtained worse results, with accuracies of the three sets over 80%. When L-VIS/NIR and L-Chl-FKC were fused, the corresponding CNN model performed well, with accuracies of all three sets over 90%, while the model fusing S-VIS/NIR and S-Chl-FKC achieved accuracies over 80%. When the VIS/NIR spectra and Chl-FKC of both leaves and stems were all fused, the corresponding CNN model obtained the best performance, with accuracies of all three sets over 95%. The CNN model fusing L-VIS/NIR and S-VIS/NIR obtained slightly better results than the models using L-VIS/NIR and S-VIS/NIR individually, and the CNN model fusing L-Chl-FKC and S-Chl-FKC obtained equivalent or better results than the models using L-Chl-FKC and S-Chl-FKC individually. For the two-stage middle-level fusion approach, the deep features extracted from the VIS/NIR spectra and Chl-FKC of leaves and stems by the corresponding CNN-S models were all fused, yielding better performance, with classification accuracies of the training, validation, and testing sets all over 97%.

For high-level fusion, the end-to-end CNN model for decision fusion achieved the best classification results, with accuracies of the training, validation, and testing sets of 100%, 97.4%, and 97.7%, respectively. The two-stage high-level fusion approach used for comparison obtained close results: 100%, 97.7%, and 97.7%, respectively.

High-level fusion showed the best rice stress type identification performance compared with low-level and middle-level fusion. High-level fusion depends on the outputs of each single model, and the results of the classification models using each single feature type illustrated the potential of the different feature types for rice stress type identification, although modeling performance varied across feature types. High-level fusion could reduce the interference caused by the weaknesses of the individual decision models built on different features, producing excellent performance. The overall results illustrated the effectiveness of high-level fusion for rice stress type identification.

Compared with the end-to-end fusion models, the two-stage information fusion approaches were more complex. For two-stage middle-level fusion, the features were extracted by separately trained and optimized models, and one more classifier had to be trained on the combined features; the whole procedure required more computation and manual intervention (here, five models had to be trained and optimized). For two-stage high-level fusion, the predicted probability distribution vectors of the CNN-S model for each feature type were first extracted, and a decision-making procedure was then implemented to make the final decision. The end-to-end fusion models offered simpler operation, with only one-stage training.

As shown in Table 4, the models fusing L-Chl-FKC and S-Chl-FKC obtained better results than those using L-Chl-FKC and S-Chl-FKC individually, indicating that combining Chl-FKC features could improve classification performance. Since the CNN-S models using L-VIS/NIR and S-VIS/NIR individually performed well, all fusion models containing L-VIS/NIR and S-VIS/NIR features also performed well, whereas the CNN-S models built on L-Chl-FKC or S-Chl-FKC individually did not. Fusing Chl-FKC features with the corresponding VIS/NIR features produced better results than the models using the corresponding Chl-FKC features alone, and lower than or close to the results of the models using the corresponding VIS/NIR features alone. Owing to the gap between the performances of the models using VIS/NIR spectra and the corresponding Chl-FKC individually, the fusion of the two fell between the performances of the models using each feature type individually.

The fusion of all the features showed relatively better performance: by fusing all feature types, the complementary information hidden in the obtained features could be revealed for rice stress type identification.

Furthermore, the confusion matrices of the three levels of information fusion using the Fusion 5 strategy were explored (shown in Figure 7). Although Cd and Cu were both heavy metal stresses and DCA and ELK were both herbicide stresses, no particular pattern could be found in the misclassification of samples among classes. The good classification performance for rice plants under different stress types showed that information fusion of the phenotyping traits of leaves and stems acquired by HSI and Chl-FI has great potential for rice stress type identification.

4. Discussion

The effectiveness of HSI and Chl-FI for high-throughput plant stress phenotyping has been widely verified [10, 13, 56–60]. In this study, the VIS/NIR spectra acquired by HSI and the Chl-FKC acquired by Chl-FI for the leaves and stems of rice plants were used to identify the stress types, and the results were promising.

In previous studies, SVM has been widely used as a conventional machine learning baseline against deep learning methods for 1D spectral analysis, and the results of SVM and deep learning methods were quite close [61–64]. This study found similar trends for plant stress type identification using the VIS/NIR spectra and Chl-FKC of leaves and stems. The 1D VIS/NIR spectra and Chl-FKC have simple data structures, so the potential of CNN models for deep feature learning could not be fully exploited. Moreover, the number of samples was not large, a regime that suits SVM well, and the potential of CNN models on big data could not be fully revealed.

Moreover, differences can be found between VIS/NIR spectra and Chl-FKC for stress type identification. For HSI, the CNN models classified slightly better than the SVM models; for Chl-FI, the CNN models using Chl-FKC performed close to the SVM models. Models using VIS/NIR spectra performed better than those using Chl-FKC, and in other studies, plant phenotyping with HSI features has likewise outperformed Chl-FI features [18, 22, 65]. While spectral indices calculated from VIS/NIR spectra are widely used for analysis, full-range spectra are also common; in Chl-FI, by contrast, chlorophyll fluorescence parameters calculated from the Chl-FKC, rather than the full kinetic curves, are more widely used. The results illustrated the potential of stress type identification using Chl-FKC, and more effort is needed to improve the performance. VIS/NIR spectra can reflect the physiological and biochemical changes under different stresses, and Chl-FI is an efficient technique to assess the status of plant photosynthesis; VIS/NIR spectra may simply provide more information than Chl-FKC.

Both leaves and stems showed promising results for plant stress type identification, with leaves performing better than stems. In general, leaves are more commonly used for plant phenotyping, and fewer studies have addressed high-throughput phenotyping of stems [66–73]. The overall results showed that stems can be an efficient alternative to leaves for plant phenotyping. Stems and leaves exhibited different phenotyping traits under different stresses, and the good performance indicated that combining leaves and stems has great potential to provide more information for plant stress phenotyping.

HSI and Chl-FI acquired different phenotyping traits of leaves and stems, and the fusion of VIS/NIR spectral features and Chl-FKC features combined different aspects of the plants. The results showed that fusing VIS/NIR spectral features and Chl-FKC features has the potential to improve plant stress type identification. Other studies have likewise fused data from different sensors to improve plant phenotyping performance [30, 74, 75].

Leveraging the strength of deep learning in feature learning and classification, end-to-end CNN models were designed to fuse the phenotyping traits acquired from leaves and stems by HSI and Chl-FI. The most widely used two-stage information fusion consists of training or manually selecting features, concatenating the features manually, and training an independent classifier on the fused features. The end-to-end deep fusion models combined all these steps into one model trained in a single stage; they were much simpler and needed no manual intervention to extract or select appropriate features. Once the phenotyping traits were obtained, they could be fed directly into the end-to-end deep fusion models, which holds great potential for real-world application.

With the development of advanced phenotyping techniques, many kinds of phenotyping traits can be obtained, and how to fully reveal the information they carry about plant growth status is important. Information fusion provides an effective way to combine different aspects of phenotyping traits, and its effectiveness for rice stress type identification shows that it is promising for plant growth status evaluation in general. Furthermore, different organs have different phenotyping traits, even when measured by the same techniques, so different organs can be analyzed jointly to evaluate plant growth status. Combining the phenotyping traits of different organs provides more information about the plants, yet how to fuse such traits effectively remains a challenging issue. This study provides an efficient approach for fusing the phenotyping traits of different organs, which could enhance high-throughput phenotyping performance, including for abiotic stresses caused by environmental factors.

Data Availability

The data used in this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this article.

Authors’ Contributions

The contributions of the authors involved in this study are as follows: Chu Zhang: conceptualization, data curation, formal analysis, funding acquisition, methodology, supervision, investigation, writing—original draft, and writing—review and editing; Lei Zhou: data curation, formal analysis, methodology, software, validation, visualization, writing—original draft, and writing—review and editing; Qinlin Xiao: methodology and investigation; Xiulin Bai: methodology and investigation; Baohua Wu: formal analysis and methodology; Na Wu: validation; Yiying Zhao: writing—original draft; Junmin Wang: resources; and Lei Feng: conceptualization, funding acquisition, project administration, software, supervision, and writing—review and editing. Chu Zhang and Lei Zhou contributed equally to this work.

Acknowledgments

This study was supported by the Shenzhen Science and Technology Projects (CJGJZD20210408092401004) and the National Natural Science Foundation of China (61705195).

References

1. Z. C. Campbell, L. M. Acosta-Gamboa, N. Nepal, and A. Lorence, "Engineering plants for tomorrow: how high-throughput phenotyping is contributing to the development of better crops," Phytochemistry Reviews, vol. 17, no. 6, pp. 1329–1343, 2018.
2. N. Shakoor, S. Lee, and T. C. Mockler, "High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field," Current Opinion in Plant Biology, vol. 38, pp. 184–192, 2017.
3. A. Singh, S. Jones, B. Ganapathysubramanian et al., "Challenges and opportunities in machine-augmented plant stress phenotyping," Trends in Plant Science, vol. 26, no. 1, pp. 53–69, 2021.
4. P. Song, J. Wang, X. Guo, W. Yang, and C. Zhao, "High-throughput phenotyping: breaking through the bottleneck in future crop breeding," Crop Journal, vol. 9, no. 3, pp. 633–645, 2021.
5. W. Yang, H. Feng, X. Zhang et al., "Crop phenomics and high-throughput phenotyping: past decades, current challenges, and future perspectives," Molecular Plant, vol. 13, no. 2, pp. 187–214, 2020.
6. R. R. Mir, M. Reynolds, F. Pinto, M. A. Khan, and M. A. Bhat, "High-throughput phenotyping for crop improvement in the genomics era," Plant Science, vol. 282, pp. 60–72, 2019.
7. Q. Xiao, X. Bai, C. Zhang, and Y. He, "Advanced high-throughput plant phenotyping techniques for genome-wide association studies: a review," Journal of Advanced Research, vol. 35, pp. 215–230, 2022.
8. H. Liu, B. Bruning, T. Garnett, and B. Berger, "Hyperspectral imaging and 3D technologies for plant phenotyping: from satellite to close-range sensing," Computers and Electronics in Agriculture, vol. 175, p. 105621, 2020.
9. P. Mishra, M. S. M. Asaari, A. Herrero-Langreo, S. Lohumi, B. Diezma, and P. Scheunders, "Close range hyperspectral imaging of plants: a review," Biosystems Engineering, vol. 164, pp. 49–67, 2017.
10. R. Saric, V. D. Nguyen, T. Burge et al., "Applications of hyperspectral imaging in plant phenotyping," Trends in Plant Science, vol. 27, no. 3, pp. 301–315, 2022.
11. Y. Zhang and N. Zhang, "Imaging technologies for plant high-throughput phenotyping: a review," Frontiers of Agricultural Science and Engineering, vol. 5, no. 4, pp. 406–419, 2018.
12. E. Gorbe and A. Calatayud, "Applications of chlorophyll fluorescence imaging technique in horticultural research: a review," Scientia Horticulturae, vol. 138, pp. 24–35, 2012.
13. M. Luisa Perez-Bueno, M. Pineda, and M. Baron, "Phenotyping plant responses to biotic stress by chlorophyll fluorescence imaging," Frontiers in Plant Science, vol. 10, p. 1135, 2019.
14. H. Chu, C. Zhang, M. Wang et al., "Hyperspectral imaging with shallow convolutional neural networks (SCNN) predicts the early herbicide stress in wheat cultivars," Journal of Hazardous Materials, vol. 421, p. 126706, 2022.
15. H. Li, P. Wang, J. F. Weber, and R. Gerhards, "Early identification of herbicide stress in soybean (Glycine max (L.) Merr.) using chlorophyll fluorescence imaging technology," Sensors, vol. 18, no. 1, p. 21, 2017.
16. J. Wang, C. Zhang, Y. Shi et al., "Evaluation of quinclorac toxicity and alleviation by salicylic acid in rice seedlings using ground-based visible/near-infrared hyperspectral imaging," Plant Methods, vol. 16, no. 1, p. 30, 2020.
17. J. F. Weber, C. Kunz, G. G. Peteinatos, H.-J. Santel, and R. Gerhards, "Utilization of chlorophyll fluorescence imaging technology to detect plant injury by herbicides in sugar beet and soybean," Weed Technology, vol. 31, no. 4, pp. 523–535, 2017.
18. D. S. Kasampalis, P. Tsouvaltzis, K. Ntouros, A. Gertsis, I. Gitas, and A. S. Siomos, "The use of digital imaging, chlorophyll fluorescence and Vis/NIR spectroscopy in assessing the ripening stage and freshness status of bell pepper fruit," Computers and Electronics in Agriculture, vol. 187, p. 106265, 2021.
19. J. Marques da Silva, A. Figueiredo, J. Cunha et al., "Using rapid chlorophyll fluorescence transients to classify Vitis genotypes," Plants-Basel, vol. 9, no. 2, p. 174, 2020.
20. A. Mishra, K. Matous, K. B. Mishra, and L. Nedbal, "Towards discrimination of plant species by machine vision: advanced statistical analysis of chlorophyll fluorescence transients," Journal of Fluorescence, vol. 19, no. 5, pp. 905–913, 2009.
21. X. Feng, C. Yu, Y. Chen et al., "Non-destructive determination of shikimic acid concentration in transgenic maize exhibiting glyphosate tolerance using chlorophyll fluorescence and hyperspectral imaging," Frontiers in Plant Science, vol. 9, p. 468, 2018.
22. P. Kumar, R. L. Eriksen, I. Simko, and B. Mou, "Molecular mapping of water-stress responsive genomic loci in lettuce (Lactuca spp.) using kinetics chlorophyll fluorescence, hyperspectral imaging and machine learning," Frontiers in Genetics, vol. 12, p. 634554, 2021.
23. G. Li, S. Wan, J. Zhou, Z. Yang, and P. Qin, "Leaf chlorophyll fluorescence, hyperspectral reflectance, pigments content, malondialdehyde and proline accumulation responses of castor bean Ricinus communis L. seedlings to salt stress levels," Industrial Crops and Products, vol. 31, no. 1, pp. 13–19, 2010.
24. J. C. Naumann, D. R. Young, and J. E. Anderson, "Leaf chlorophyll fluorescence, reflectance, and physiological response to freshwater and saltwater flooding in the evergreen shrub, Myrica cerifera," Environmental and Experimental Botany, vol. 63, no. 1-3, pp. 402–409, 2008.
25. E. Bauriegel and W. Herppich, "Hyperspectral and chlorophyll fluorescence imaging for early detection of plant diseases, with special reference to Fusarium spec. infections on wheat," Agriculture, vol. 4, no. 1, pp. 32–57, 2014.
26. A. K. Singh, B. Ganapathysubramanian, S. Sarkar, and A. Singh, "Deep learning for plant stress phenotyping: trends and future perspectives," Trends in Plant Science, vol. 23, no. 10, pp. 883–898, 2018.
27. C. A. Berdugo, R. Zito, S. Paulus, and A. K. Mahlein, "Fusion of sensor data for the detection and differentiation of plant diseases in cucumber," Plant Pathology, vol. 63, no. 6, pp. 1344–1356, 2014.
28. L. Feng, B. Wu, S. Zhu et al., "Investigation on data fusion of multisource spectral data for rice leaf diseases identification using machine learning methods," Frontiers in Plant Science, vol. 11, p. 577063, 2020.
29. Z. Feng, L. Song, J. Duan et al., "Monitoring wheat powdery mildew based on hyperspectral, thermal infrared, and RGB image data fusion," Sensors, vol. 22, no. 1, p. 31, 2021.
30. D. Moshou, C. Bravo, R. Oberti et al., "Plant disease detection based on data fusion of hyper-spectral and multi-spectral fluorescence imaging using Kohonen maps," Real-Time Imaging, vol. 11, no. 2, pp. 75–83, 2005.
31. D. Moshou, X.-E. Pantazi, D. Kateris, and I. Gravalos, "Water stress detection based on optical multisensor fusion with a least squares support vector machine classifier," Biosystems Engineering, vol. 117, pp. 15–22, 2014.
32. P. Rischbeck, S. Elsayed, B. Mistele, G. Barmeier, K. Heil, and U. Schmidhalter, "Data fusion of spectral, thermal and canopy height parameters for improved yield prediction of drought stressed spring barley," European Journal of Agronomy, vol. 78, pp. 44–59, 2016.
33. P. Salve, P. Yannawar, and M. Sardesai, "Multimodal plant recognition through hybrid feature fusion technique using imaging and non-imaging hyper-spectral data," Journal of King Saud University-Computer and Information Sciences, vol. 34, no. 1, pp. 1361–1369, 2022.
34. X. Xu, L. Fan, Z. Li et al., "Estimating leaf nitrogen content in corn based on information fusion of multiple-sensor imagery from UAV," Remote Sensing, vol. 13, no. 3, p. 340, 2021.
35. L. Zhou, C. Zhang, M. F. Taha, Z. Qiu, and Y. He, "Determination of leaf water content with a portable NIRS system based on deep learning and information fusion analysis," Transactions of the ASABE, vol. 64, no. 1, pp. 127–135, 2021.
36. Z. Gao, Z. Luo, W. Zhang, Z. Lv, and Y. Xu, "Deep learning application in plant stress imaging: a review," AgriEngineering, vol. 2, no. 3, pp. 430–446, 2020.
37. Y. Jiang and C. Li, "Convolutional neural networks for image-based high-throughput plant phenotyping: a review," Plant Phenomics, vol. 2020, article 4152816, 22 pages, 2020.
38. M. H. Kamarudin, Z. H. Ismail, and N. B. Saidi, "Deep learning sensor fusion in plant water stress assessment: a comprehensive review," Applied Sciences-Basel, vol. 11, no. 4, p. 1403, 2021.
39. S. K. Noon, M. Amjad, M. A. Qureshi, and A. Mannan, "Use of deep learning techniques for identification of plant leaf stresses: a review," Sustainable Computing-Informatics & Systems, vol. 28, p. 100443, 2020.
40. M. H. Saleem, J. Potgieter, and K. M. Arif, "Plant disease detection and classification by deep learning," Plants-Basel, vol. 8, no. 11, p. 468, 2019.
41. X. Zhang, T. Lin, J. Xu, X. Luo, and Y. Ying, "DeepSpectra: an end-to-end deep learning approach for quantitative spectral analysis," Analytica Chimica Acta, vol. 1058, pp. 48–57, 2019.
42. V. M. Scholl, J. McGlinchy, T. Price-Broncucia, J. K. Balch, and M. B. Joseph, "Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data," PeerJ, vol. 9, p. e11790, 2021.
43. S. Weng, P. Tang, H. Yuan et al., "Hyperspectral imaging for accurate determination of rice variety using a deep learning network with multi-feature fusion," Spectroscopy, vol. 234, p. 118237, 2020.
44. A. Alofi, A. Alghamdi, R. Alahmadi, N. Aljuaid, and M. Hemalatha, "A review of data fusion techniques," International Journal of Computer Applications, vol. 167, no. 7, pp. 37–41, 2017.
45. S. M. Azcarate, R. Rios-Reina, J. M. Amigo, and E. C. Goicoechea, "Data handling in data fusion: methodologies and applications," TrAC Trends in Analytical Chemistry, vol. 143, p. 116355, 2021.
46. B. Khaleghi, A. Khamis, F. O. Karray, and S. N. Razavi, "Multisensor data fusion: a review of the state-of-the-art," Information Fusion, vol. 14, no. 1, pp. 28–44, 2013.
47. R. M. Jomaa, H. Mathkour, Y. Bazi, and M. S. Islam, "End-to-end deep learning fusion of fingerprint and electrocardiogram signals for presentation attack detection," Sensors, vol. 20, no. 7, p. 2085, 2020.
48. S. R. Stahlschmidt, B. Ulfenborg, and J. Synnergren, "Multimodal deep learning for biomedical data fusion: a review," Briefings in Bioinformatics, vol. 23, no. 2, 2022.
49. W. Kong, C. Zhang, F. Cao et al., "Detection of Sclerotinia stem rot on oilseed rape (Brassica napus L.) leaves using hyperspectral imaging," Sensors, vol. 18, no. 6, p. 1764, 2018.
50. H. Cen, H. Weng, J. Yao et al., "Chlorophyll fluorescence imaging uncovers photosynthetic fingerprint of citrus Huanglongbing," Frontiers in Plant Science, vol. 8, p. 1509, 2017.
51. J. Yao, D. Sun, H. Cen et al., "Phenotyping of Arabidopsis drought stress response using kinetic chlorophyll fluorescence and multicolor fluorescence imaging," Frontiers in Plant Science, vol. 9, p. 603, 2018.
52. S. Kuutti, R. Bowden, Y. Jin, P. Barber, and S. Fallah, "A survey of deep learning applications to autonomous vehicle control," IEEE Transactions on Intelligent Transportation Systems, vol. 22, no. 2, pp. 712–733, 2021.
53. K. Lim, K. Pan, Z. Yu, and R. H. Xiao, "Pattern recognition based on machine learning identifies oil adulteration and edible oil mixtures," Nature Communications, vol. 11, no. 1, pp. 1–10, 2020.
54. Z. Sun, Q. Li, S. Jin et al., "Simultaneous prediction of wheat yield and grain protein content using multitask deep learning from time-series proximal sensing," Plant Phenomics, vol. 2022, article 9757948, 13 pages, 2022.
55. C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.
56. C. A. F. de Sousa, D. S. de Paiva, R. A. D. C. N. Casari et al., "A procedure for maize genotypes discrimination to drought by chlorophyll fluorescence imaging rapid light curves," Plant Methods, vol. 13, no. 1, p. 61, 2017.
57. H. M. Kalaji, A. Rastogi, M. Zivcak et al., "Prompt chlorophyll fluorescence as a tool for crop phenotyping: an example of barley landraces exposed to various abiotic stress factors," Photosynthetica, vol. 56, no. 3, pp. 953–961, 2018.
58. A. Lowe, N. Harrison, and A. P. French, "Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress," Plant Methods, vol. 13, no. 1, p. 80, 2017.
59. A. K. Mahlein, M. T. Kuska, J. Behmann, G. Polder, and A. Walter, "Hyperspectral sensors and imaging technologies in phytopathology: state of the art," in Annual Review of Phytopathology, J. E. Leach and S. E. Lindow, Eds., vol. 56, pp. 535–558, Annual Reviews Inc., 2018.
60. M. Moustakas, A. Calatayud, and L. Guidi, "Editorial: chlorophyll fluorescence imaging analysis in biotic and abiotic stress," Frontiers in Plant Science, vol. 12, p. 658500, 2021.
61. A. M. Fernandes, A. B. Utkin, J. Eiras-Dias, J. Cunha, J. Silvestre, and P. Melo-Pinto, "Grapevine variety identification using "big data" collected with miniaturized spectrometer combined with support vector machines and convolutional neural networks," Computers and Electronics in Agriculture, vol. 163, p. 104855, 2019.
62. Z. Qiu, J. Chen, Y. Zhao, S. Zhu, Y. He, and C. Zhang, "Variety identification of single rice seed using hyperspectral imaging combined with convolutional neural network," Applied Sciences-Basel, vol. 8, no. 2, p. 212, 2018.
63. S. Tarandeep, N. Mittal Garg, and S. R. S. Iyengar, "Nondestructive identification of barley seeds variety using near-infrared hyperspectral imaging coupled with convolutional neural network," Journal of Food Process Engineering, vol. 44, no. 10, p. e13821, 2021.
64. T. Yan, L. Duan, X. Chen, P. Gao, and W. Xu, "Application and interpretation of deep learning methods for the geographical origin identification of Radix Glycyrrhizae using hyperspectral imaging," RSC Advances, vol. 10, no. 68, pp. 41936–41945, 2020.
65. A.-K. Mahlein, E. Alisaac, A. Al Masri, J. Behmann, H.-W. Dehne, and E.-C. Oerke, "Comparison and combination of thermal, fluorescence, and hyperspectral imaging for monitoring Fusarium head blight of wheat on spikelet scale," Sensors, vol. 19, no. 10, p. 2281, 2019.
66. Y. Fan, C. Zhang, Z. Liu, Z. Qiu, and Y. He, "Cost-sensitive stacked sparse auto-encoder models to detect striped stem borer infestation on rice based on hyperspectral imaging," Knowledge-Based Systems, vol. 168, pp. 49–58, 2019.
67. J. F. Garcia-Martin, A. T. Badaro, D. F. Barbin, and P. Alvarez-Mateos, "Identification of copper in stems and roots of Jatropha curcas L. by hyperspectral imaging," Processes, vol. 8, no. 7, p. 823, 2020.
68. X. Jin, S. Madec, D. Dutartre, B. de Solan, A. Comar, and F. Baret, "High-throughput measurements of stem characteristics to estimate ear density and above-ground biomass," Plant Phenomics, vol. 2019, article 4820305, 10 pages, 2019.
69. W. Kong, C. Zhang, W. Huang, F. Liu, and Y. He, "Application of hyperspectral imaging to detect Sclerotinia sclerotiorum on oilseed rape stems," Sensors, vol. 18, no. 1, p. 123, 2018.
70. K. Nagasubramanian, S. Jones, S. Sarkar, A. K. Singh, A. Singh, and B. Ganapathysubramanian, "Hyperspectral band selection using genetic algorithm and support vector machines for early identification of charcoal rot disease in soybean stems," Plant Methods, vol. 14, no. 1, p. 86, 2018.
71. Z. Wang, X. Liu, R. Li, X. Chang, and R. Jing, "Development of near-infrared reflectance spectroscopy models for quantitative determination of water-soluble carbohydrate content in wheat stem and glume," Analytical Letters, vol. 44, no. 15, pp. 2478–2490, 2011.
72. L. Xiang, L. Tang, J. Gai, and L. Wang, "Measuring stem diameter of sorghum plants in the field using a high-throughput stereo vision system," Transactions of the ASABE, vol. 64, no. 6, pp. 1999–2010, 2021.
73. Y. Zhang, J. Wang, J. Du et al., "Dissecting the phenotypic components and genetic architecture of maize stem vascular bundles using high-throughput phenotypic analysis," Plant Biotechnology Journal, vol. 19, no. 1, pp. 35–50, 2021.
74. P. Huang, X. Luo, J. Jin et al., "Improving high-throughput phenotyping using fusion of close-range hyperspectral camera and low-cost depth sensor," Sensors, vol. 18, no. 8, p. 2711, 2018.
75. M. Maimaitijiang, A. Ghulam, P. Sidike et al., "Unmanned aerial system (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 134, pp. 43–58, 2017.

Copyright © 2022 Chu Zhang et al. Exclusive Licensee Nanjing Agricultural University. Distributed under a Creative Commons Attribution License (CC BY 4.0).
