
Research Article | Open Access

Volume 2022 | Article ID 9761567 | https://doi.org/10.34133/2022/9761567

Jun Chen, Weifu Li, Shuxin Li, Hong Chen, Xuebin Zhao, Jiangtao Peng, Yanhong Chen, Hao Deng, "Two-Stage Solar Flare Forecasting Based on Convolutional Neural Networks", Space: Science & Technology, vol. 2022, Article ID 9761567, 10 pages, 2022. https://doi.org/10.34133/2022/9761567

Two-Stage Solar Flare Forecasting Based on Convolutional Neural Networks

Received: 03 Jan 2022
Accepted: 10 Jun 2022
Published: 06 Jul 2022

Abstract

Solar flares are solar storm events driven by the magnetic field in solar active regions. A solar flare, often associated with a solar proton event or a CME, has a negative impact on radio communication, aviation, and aerospace. Therefore, its forecasting has attracted much attention from the academic community. Owing to the unbalanced distribution of the observational data, most techniques fail to effectively learn complex magnetic field characteristics, leading to poor forecasting performance. Through a statistical analysis of solar flare magnetogram data observed by SDO/HMI from 2010 to 2019, we find that unsupervised clustering algorithms can accurately identify the sunspot groups that contain the majority of the positive samples. Furthermore, for these identified sunspot groups, an ensemble model that integrates boosting and convolutional neural networks (CNNs) achieves high-precision prediction of whether solar flares will occur in the next 48 hours. Based on the above findings, a two-stage solar flare early warning system is established in this paper. The F1 score of our method is 0.5639, which shows that it is superior to traditional methods such as logistic regression and support vector machine (SVM).

1. Introduction

Solar flares are among the most violent outbursts occurring in localized regions of the solar atmosphere, manifesting as an enhancement of electromagnetic radiation over a wide wavelength range and the emission of various particle streams with energies ranging from $10^3$ to $10^{11}$ electron volts [1]. When the flare radiation reaches the Earth's vicinity, photoionization increases the electron density in the D-layer of the ionosphere, causing absorption of high-frequency radio communication and enhanced background noise that interferes with radar.

The intensity scale of a solar flare reflects the overall level of the solar eruption. Statistics and experience [2–6] show that the larger the flare, the more likely it is to be accompanied by other solar outbursts such as solar proton events or CMEs, and the more severe the effects on the Earth, affecting spaceflight, communication, navigation, power transmission, and other technological systems. In a sense, the solar flare is an important indicator of solar storms. Providing forecast information on the likelihood and intensity of flare outbreaks is an important element at the start of operational space weather forecasting. The modeling study of solar flare forecasting is a necessary part of accurate flare forecasting and has important application value.

Presently, solar flare forecasting is mainly short-term, forecasting the probability of flare occurrence and its level in the next 1-3 days. Meanwhile, the forecasts are mainly based on physical quantities of the active region. Li et al. [7] established a short-term flare prediction system using SVM and obtained good prediction accuracy. Wang et al. [8] established a short-term flare prediction system using a multilayer perceptron based on characteristic parameters extracted from photospheric magnetograms. Huang et al. [9, 10] established flare prediction models using several machine learning methods; the results show the effectiveness of the evolution information of magnetic field time series for flare prediction. In terms of sunspot group category features, the outbreak of solar flares is closely related to the appearance morphology, magnetic field polarity, and other features of the sunspot group. Many different classifications of sunspot groups have been proposed, such as the Mount Wilson magnetic classification [11], the Zurich classification [12–14], and the McIntosh sunspot group classification [15], whose categories can be used as important features for flare forecasting.

In terms of the magnetic parametric characteristics of the active region, since most of the Sun consists of high-temperature plasma, the Sun's motion and the evolution of its activity are related to the nature of the magnetic field. Accordingly, several magnetic parameters can be used to predict flares [16–24]. For example, Schrijver [25] found that neutral lines are related to large flares; Song et al. [26] chose parameters such as the total unsigned magnetic flux, strong-gradient neutral line length, and total magnetic field dissipation to predict solar flares.

Machine learning has also been applied to flare forecasting due to its significant advantages in data analysis and processing [8–10, 27–36]. For example, Song et al. [26] used an ordered logistic regression method to build a prediction model that forecasts the probability of an active region producing X-, M-, or C-class solar flares during the next day; Qahwaji and Colak [27] compared the performance of machine learning algorithms such as cascaded neural networks, SVM, and radial basis function networks for flare forecasting.

In recent years, deep learning has been favored by researchers because, compared with traditional machine learning algorithms, it can automatically extract features from large datasets. Deep learning has been applied to flare forecasting with good results [37–42]. For example, Huang et al. [40] collected most of the SOHO/MDI and SDO/HMI active region magnetogram data and used a CNN to train a deep learning model for solar flare forecasting, achieving automatic extraction of forecast patterns from active region magnetograms; Nishizuka et al. [41] built a solar flare prediction model based on deep neural networks, called Deep Flare Net (DeFN), which calculates the probability of a flare occurring in the next 24 hours in each active region.

In existing solar flare forecasting models, some deep learning networks use only images as input, but it is difficult for them to extract accurate physical quantities from the images; as a result, they ignore the importance of physical parameters for flare forecasting. Moreover, these networks may stretch the images or use other methods to meet the required input size, so that the information contained in the images, such as area and magnetic field, is distorted and cannot be used effectively. In addition, although deep learning can automatically extract features from large datasets, its ability to extract valid information is limited, and prior knowledge can improve the prediction performance of the model. Therefore, how to apply the prior knowledge of flare forecasting to the design of deep learning networks, so that training becomes more efficient and accurate, is one of the urgent problems to be solved.

In this paper, we combine the k-means clustering algorithm and several CNN models to build a warning system that can predict whether a solar flare will occur in the next 48 hours. The rest of this paper is organized as follows. Section 2 introduces the data we use and analyzes them from a statistical point of view to provide a basis for the design of the solar flare warning system. Section 3 builds the whole pipeline, and Section 4 conducts experiments and discusses the results. Finally, Section 5 draws the conclusion.

2. Data and Analysis

In this paper, SHARP (http://jsoc.stanford.edu/) magnetogram data collected by SDO/HMI from May 2010 to December 2018 are used [40]. The 10 magnetic characteristic parameters [43, 44] are shown in Table 1. The data are sampled every 96 minutes to ensure variability between consecutive samples. To reduce projection effects, we select active regions whose centers are located within ±30° of the solar disk center. After that, we label the data according to the solar event reports provided by the NOAA Space Weather Prediction Center (SWPC), which are based on GOES satellite observations and solar observations. The reports give information on solar flares and their source active regions, including the start and end times of the flares, the active region number, and the flare magnitude. Solar flare events without a corresponding active region were excluded. We take a sample every 96 minutes, with the 10 magnetic characteristic parameters as input. Samples that are followed within the next 48 hours by a flare at or above the chosen threshold class are labeled 1 (positive samples); samples with no such flare in the next 48 hours are labeled 0 (negative samples). The whole dataset includes 1000 solar active regions, each of which exists for several days on the visible disk; therefore, during the existence period of a solar active region, there are multiple consecutive samples. In total, 2837 positive and 59,834 negative samples are obtained. The data from May 2010 to December 2016 are used as the training set and the rest as the test set.
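
To make the labeling step concrete, the following is a minimal sketch (not the authors' code) of how a sample could be labeled from the SWPC event list; the dataframe layout, column names, and pre-filtering of the flare list to the threshold class are assumptions for illustration.

    from datetime import timedelta
    import pandas as pd

    def label_samples(samples: pd.DataFrame, flares: pd.DataFrame,
                      horizon_hours: int = 48) -> pd.Series:
        """Label each sample 1 if its active region produces a flare at or above
        the chosen threshold class within the next `horizon_hours`, else 0.
        `samples` is assumed to have columns ['time', 'noaa_ar']; `flares` is
        assumed to have columns ['start_time', 'noaa_ar'] and to be pre-filtered
        to flares at or above the threshold class."""
        horizon = timedelta(hours=horizon_hours)
        labels = []
        for _, row in samples.iterrows():
            window = flares[(flares["noaa_ar"] == row["noaa_ar"]) &
                            (flares["start_time"] > row["time"]) &
                            (flares["start_time"] <= row["time"] + horizon)]
            labels.append(1 if len(window) > 0 else 0)
        return pd.Series(labels, index=samples.index, name="label")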


Table 1: The 10 magnetic characteristic parameters.

Symbol      Name
TOTUSJH     Total unsigned current helicity
TOTPOT      Total photospheric magnetic energy density
TOTUSJZ     Total unsigned vertical current
ABSNJZH     Absolute value of the net current helicity
SAVNCPP     Sum of the absolute value of the net currents per polarity
USFLUX      Total unsigned flux
AREA_ACR    Active region strong magnetic field area
MEANPOT     Mean photospheric excess magnetic energy density
R_VALUE     Neutral line magnetic flux
SHRGT45     Percentage of pixels with a mean shear angle greater than 45 degrees

There is a serious imbalance between the numbers of positive and negative samples in our dataset: 2837 positive versus 59,834 negative, a ratio of about 1:20. Only 92 of the 1000 active regions contain positive samples. Meanwhile, based on the analysis of several features in some active regions, as shown in Figure 1, we find that the characteristic boundary between positive and negative samples is fuzzy. Therefore, it is quite difficult to distinguish positive and negative samples if we input them directly into our model without any data preprocessing strategy.

To alleviate the imbalance of positive and negative samples, we try to find a criterion that selects as many of the active regions containing positive samples as possible. Firstly, we visualize the probability density distribution of each feature over all negative samples and all positive samples, as shown in Figure 2. We can easily see that the probability density distributions of the negative samples are all negatively skewed and that the feature values of positive samples are generally larger than those of negative samples. Thus, it is possible to filter out active regions with positive samples based on the feature values of each active region. Figure 3 shows, for the active regions (numbered from 1 to 1000), the variation of the corresponding magnetic characteristic parameters. For the first eight features, the parameter values vary significantly across active regions, which makes it possible to divide the active regions into multiple categories according to these feature values.
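
As a concrete illustration of this comparison, the short sketch below contrasts the skewness and means of each parameter for negative versus positive samples, mirroring Figure 2; the array names and feature list are assumptions, not the authors' code.

    import numpy as np
    from scipy.stats import skew

    def compare_feature_distributions(X: np.ndarray, y: np.ndarray, feature_names):
        """Print skewness and mean of each magnetic parameter for the negative
        (y == 0) and positive (y == 1) samples."""
        for j, name in enumerate(feature_names):
            neg, pos = X[y == 0, j], X[y == 1, j]
            print(f"{name:12s}  skew(neg)={skew(neg):+.2f}  "
                  f"mean(neg)={neg.mean():.3g}  mean(pos)={pos.mean():.3g}")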

3. Pipeline

As shown in Figure 4, our method contains the following two steps: data preprocessing and model training.

3.1. Data Preprocessing

Considering the sample class imbalance of the data and the significant difference between the features of the active regions with positive samples and the others, we divide the active regions into several categories to increase the proportion of positive samples. We use k-means, an unsupervised clustering method, to achieve this. For every active region, we calculate the mean of its features as follows:

$$\bar{x}_j = \frac{1}{n_j}\sum_{i=1}^{n_j} x_{i,j},$$

where $x_{i,j}$ represents the parameters of the $i$-th original sample of the $j$-th active region and $n_j$ represents the sample size of the $j$-th active region. We then input these mean vectors into k-means to obtain the clustering results.
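
A minimal sketch of this step, assuming the samples and their active region identifiers are already loaded as arrays (variable names are illustrative, not the authors' code):

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_active_regions(X: np.ndarray, region_ids: np.ndarray, k: int = 3, seed: int = 0):
        """Cluster active regions by the per-region mean of their 10 SHARP parameters.
        X has shape (n_samples, 10); region_ids gives the active region of each sample."""
        regions = np.unique(region_ids)
        # one feature vector per active region: the mean over its samples (equation above)
        region_means = np.stack([X[region_ids == r].mean(axis=0) for r in regions])
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(region_means)
        return dict(zip(regions, km.labels_)), km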

The sum of squared errors (SSE) for numbers of categories from 1 to 9 is shown in Figure 5, and we choose 3 as the final value of k. The clustering result is shown in Table 2 and Figure 6. Obviously, the ratio of positive samples in category C is 0.340633, which is much larger than that of the whole dataset. Therefore, for the neural network models in the next stage of our algorithm, we only use the data in category C as input. For the samples in category A and category B, we believe that a solar flare is almost impossible in the next 48 hours for the former and very unlikely for the latter, so the labels of all samples in both categories are predicted as 0.
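
The elbow analysis behind Figure 5 can be sketched as below, using the KMeans inertia as the SSE; the helper name and the range of k are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import KMeans

    def sse_curve(region_means: np.ndarray, k_max: int = 9, seed: int = 0) -> dict:
        """Return the sum of squared errors (KMeans inertia) for k = 1..k_max,
        computed on the per-region mean feature vectors."""
        return {k: KMeans(n_clusters=k, n_init=10, random_state=seed)
                       .fit(region_means).inertia_
                for k in range(1, k_max + 1)}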


Table 2: Clustering results of the active regions.

                              Category A   Category B   Category C   All data
Number of active regions      582          350          74           1000
Number of samples             32315        22367        5023         59705
Number of positive samples    156          970          1711         2837
Ratio of positive samples     0.004827     0.043367     0.340633     0.047517

3.2. Model Training

The neural networks we use are Resnet18, Resnet34, and Xception, which are commonly used in deep learning. Firstly, Resnet [45] is a pioneering neural network proposed by He et al. in 2016. The network avoids the vanishing gradient problem and makes it possible to optimize very deep neural networks; Resnet18 and Resnet34 denote Resnet with 18 and 34 layers, respectively. Secondly, Xception [46] was proposed by Google in 2017. The model is built entirely on depthwise separable convolutions; it processes cross-channel correlations and spatial correlations separately, which makes the model converge faster and perform better.
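
The paper does not give implementation details for adapting these image networks to the magnetic parameters. One possible sketch, assuming a recent torchvision and treating each input described in the next subsection (16 consecutive samples x 10 parameters) as a single-channel image, is shown below; the layer replacements are illustrative assumptions, not the authors' configuration.

    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    def build_resnet18_classifier() -> nn.Module:
        model = resnet18(weights=None)          # train from scratch
        # accept a single-channel 16x10 input instead of a 3-channel RGB image
        model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        # binary output: flare / no flare within the next 48 hours
        model.fc = nn.Linear(model.fc.in_features, 2)
        return model

    model = build_resnet18_classifier()
    logits = model(torch.randn(8, 1, 16, 10))   # batch of 8 windows of 16 samples x 10 features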

After k-means clustering, all active regions are divided into three categories, namely, category A, category B, and category C. For the samples in category C, we randomly choose three-fourths of the samples in the existence period of each active region as training data for the neural network models and use the rest as validation data during training. To avoid the influence of dimension, we standardize the original data. The standardization method we use is different from those commonly used; the formula is as follows:

$$\tilde{x}_{i,j} = \frac{x_{i,j}-\mu_j}{\sqrt{\sigma_j}}\cdot\sqrt{\sigma} + \mu,$$

where $\tilde{x}_{i,j}$ represents the $i$-th standardized sample of the $j$-th active region, $\mu_j$ and $\sigma_j$ represent the mean and variance of the $j$-th active region, $\mu$ and $\sigma$ represent the mean and variance of all samples, and $N=\sum_j n_j$ represents the total number of samples over which $\mu$ and $\sigma$ are computed. Considering that there is a temporal relationship among adjacent samples, we input 16 consecutive samples (input size $16\times 10$, where 10 is the number of features) into the model at every iteration. If the label of a sample is predicted to be 1 by the neural network, we regard this sample as a signal that a solar flare will occur in the next 48 hours; if it is predicted to be 0, the probability of a solar flare occurring in the next 48 hours is small enough to be ignored.
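
A minimal sketch of the data preparation, under the per-region standardization reading given above (one plausible interpretation, not the authors' exact code); array names and the epsilon guard are assumptions.

    import numpy as np

    def standardize_by_region(X: np.ndarray, region_ids: np.ndarray, eps: float = 1e-8) -> np.ndarray:
        """Standardize each sample with its active region's statistics,
        then rescale to the global statistics of all samples."""
        Xs = np.empty_like(X, dtype=float)
        mu, sigma = X.mean(axis=0), X.std(axis=0)
        for r in np.unique(region_ids):
            mask = region_ids == r
            mu_r, sigma_r = X[mask].mean(axis=0), X[mask].std(axis=0)
            Xs[mask] = (X[mask] - mu_r) / (sigma_r + eps) * sigma + mu
        return Xs

    def make_windows(Xs: np.ndarray, labels: np.ndarray, region_ids: np.ndarray, window: int = 16):
        """Stack 16 consecutive samples of the same active region into one 16x10 input."""
        inputs, targets = [], []
        for r in np.unique(region_ids):
            Xr, yr = Xs[region_ids == r], labels[region_ids == r]
            for t in range(window, len(Xr) + 1):
                inputs.append(Xr[t - window:t])
                targets.append(yr[t - 1])       # label of the most recent sample in the window
        return np.stack(inputs), np.array(targets)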

4. Experiments

In this section, we first introduce our experimental setting and then conduct several ablation experiments and comparisons with different models to show the performance of our method.

We choose three indexes, recall ($R$), precision ($P$), and $F_1$ score, to show the prediction effectiveness of our method. The calculation formulas are as follows:

$$R = \frac{TP}{TP+FN},\qquad P = \frac{TP}{TP+FP},\qquad F_1 = \frac{2\times P\times R}{P+R},$$

where $TP$, $FP$, and $FN$ denote the numbers of true positives, false positives, and false negatives, respectively.

To verify the improvement brought by the k-means clustering algorithm and the boosting strategy, we conduct several ablation experiments, and the results are shown in Table 3. Besides, we also compare our method with 13 other commonly used binary classification algorithms in Table 4. The TSS (true skill statistic) and HSS (Heidke skill score) of our method are 0.5709 and 0.5587, respectively, where the HSS measures the fractional improvement of the prediction over a random prediction [47], and the TSS is the recall minus the false alarm rate [48].
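
For reference, all of these scores can be computed from the confusion-matrix counts as in the sketch below (standard definitions consistent with [47, 48]; the function itself is illustrative, not the authors' code).

    def forecast_scores(tp: int, fp: int, fn: int, tn: int) -> dict:
        """Recall, precision, F1, TSS, and HSS from confusion-matrix counts."""
        recall = tp / (tp + fn)
        precision = tp / (tp + fp)
        f1 = 2 * precision * recall / (precision + recall)
        tss = recall - fp / (fp + tn)                       # recall minus false alarm rate
        hss = 2 * (tp * tn - fn * fp) / (
            (tp + fn) * (fn + tn) + (tp + fp) * (fp + tn))  # improvement over random prediction
        return {"recall": recall, "precision": precision,
                "F1": f1, "TSS": tss, "HSS": hss}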


Table 3: Ablation results of the clustering step for single and ensembled CNNs.

Model                         Clustering   Recall   Precision   F1 score
Xception                      No           0.5827   0.3894      0.4668
Xception                      Yes          0.4570   0.5227      0.4876
Resnet18                      No           0.4702   0.2829      0.3532
Resnet18                      Yes          0.4702   0.5379      0.5018
Resnet34                      No           0.5563   0.3011      0.3907
Resnet34                      Yes          0.4371   0.5946      0.5038
Xception+Resnet18+Resnet34    No           0.5430   0.4271      0.4781
Xception+Resnet18+Resnet34    Yes          0.4967   0.6522      0.5639


Table 4: Comparison with commonly used binary classification algorithms.

Algorithm                          Clustering   Recall   Precision   F1 score
Logistic regression                No           0.4106   0.5849      0.4825
Logistic regression                Yes          0.4238   0.2481      0.3130
Gradient boosting classifier       No           0.3576   0.5806      0.4426
Gradient boosting classifier       Yes          0.5232   0.0149      0.0289
Ada boost classifier               No           0.3245   0.7206      0.4475
Ada boost classifier               Yes          0.5298   0.0317      0.0599
Bagging classifier                 No           0.3179   0.3380      0.3276
Bagging classifier                 Yes          0.4636   0.0198      0.0380
Extra trees classifier             No           0.3642   0.4741      0.4120
Extra trees classifier             Yes          0.4040   0.0205      0.0389
Random forest classifier           No           0.2980   0.4639      0.3629
Random forest classifier           Yes          0.4172   0.0185      0.0355
XGB classifier                     No           0.4834   0.4320      0.4563
XGB classifier                     Yes          0.4702   0.0223      0.0425
K-neighbors classifier             No           0.4238   0.2667      0.3274
K-neighbors classifier             Yes          0.4371   0.0058      0.0115
Decision tree classifier           No           0.3254   0.2450      0.2792
Decision tree classifier           Yes          0.5695   0.0256      0.0490
Gaussian NB                        No           0.7020   0.2103      0.3237
Gaussian NB                        Yes          0.3642   0.0310      0.0571
Linear discriminant analysis       No           0.4967   0.4286      0.4601
Linear discriminant analysis       Yes          0.3642   0.2850      0.3198
Quadratic discriminant analysis    No           0.6821   0.1823      0.2877
Quadratic discriminant analysis    Yes          0.3311   0.0539      0.0927
SVC                                No           0.3510   0.4649      0.4000
SVC                                Yes          0.4305   0.0080      0.0156
Our method                         No           0.5430   0.4271      0.4781
Our method                         Yes          0.4967   0.6522      0.5639

Our experimental results show that the prediction performance of the model that integrates several neural networks is better than that of a single convolutional neural network. Finally, we combine the prediction results of Resnet18, Resnet34, and Xception with a boosting strategy. From Table 3, we can see that k-means greatly improves the performance of both the single networks and the triple-network ensemble. Clearly, the key to the performance improvement lies in precision. For all networks, recall may be unchanged or even greatly reduced after clustering, whereas precision increases significantly. After clustering, although the positive sample rate is greatly improved, from about 5% to 34%, nearly 40% of the positive samples are also lost; we think this is the main reason why recall remains unchanged or even decreases. It also means that the number of positive samples we predict is smaller than without clustering, but the probability that a predicted positive sample is a true positive is higher. From Table 4, without clustering our method is not the best, but it is still close to the best performance; with clustering, its performance is much better than that of the other methods. As for the phenomenon that the prediction performance of the other binary classification methods decreases, sometimes severely, after clustering, we think the main reason is that the number of input samples (less than 10% of the original samples) is not enough for them. In contrast, the performance of our method improves by more than 9% after clustering.
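
The exact boosting-style combination of the three networks is not spelled out in the text; the sketch below shows one simple stand-in, a weighted soft vote over the per-sample flare probabilities, with the weights and threshold treated as tunable assumptions rather than the authors' scheme.

    import numpy as np

    def combine_predictions(probs_xception, probs_resnet18, probs_resnet34,
                            weights=(1.0, 1.0, 1.0), threshold=0.5):
        """Weighted soft vote over the flare probabilities of the three CNNs
        (a stand-in for the paper's boosting strategy, not the authors' code)."""
        stacked = np.stack([probs_xception, probs_resnet18, probs_resnet34])
        w = np.asarray(weights, dtype=float)[:, None]
        ensemble_prob = (w * stacked).sum(axis=0) / w.sum()
        return (ensemble_prob >= threshold).astype(int)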

5. Conclusion

In this paper, a two-stage solar flare early warning system is proposed to predict whether a solar flare will occur in the next 48 hours. The system consists of an unsupervised clustering algorithm (k-means) and several CNN models, where the former increases the positive sample rate and the latter integrates the prediction results of the CNN models to improve the prediction performance. Three CNN models, Xception, Resnet18, and Resnet34, are chosen for the second stage. The F1 score of our method reaches 0.5639, which demonstrates its effectiveness.

Data Availability

The data we use is available on https://tianchi.aliyun.com/competition/entrance/531804/information.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding this work.

Authors’ Contributions

Jun Chen contributed to the conceptualization, literature review, modeling, writing (original draft), and writing (review and editing). Weifu Li, Hong Chen, Xuebin Zhao, Jiangtao Peng, and Yanhong Chen contributed to the conceptualization. Shuxin Li contributed to the conceptualization, literature review, and writing (original draft). Hao Deng contributed to the conceptualization and literature review.

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China under Grant Nos. 12071166 and 42171351; the Hubei Key Laboratory of Applied Mathematics under Grant Nos. HBAM 202004 and 201612; and the Hubei Provincial Natural Science Foundation of China under Grant No. 2021CFA087.

References

1. C. Y. Tu, Q. G. Zong, J. S. He, H. Tian, and L. H. Wang, Solar Terrestrial Space Physics, Science Press, Beijing, Second edition, 2020.
2. R. Miteva and S. W. Samwel, “M-class solar flares in solar cycles 23 and 24: properties and space weather relevance,” Universe, vol. 8, no. 1, p. 39, 2022.
3. P. S. Gour, N. P. Singh, S. Soni, and S. M. Saini, “Observation of coronal mass ejections in association with sun spot number and solar flares,” IOP Conference Series: Materials Science and Engineering, vol. 1120, no. 1, p. 012020, 2021.
4. A. Papaioannou, I. Sandberg, A. Anastasiadis et al., “Solar flares, coronal mass ejections and solar energetic particle event characteristics,” Journal of Space Weather and Space Climate, vol. 6, p. A42, 2016.
5. L. K. Harra, C. J. Schrijver, M. Janvier et al., “The characteristics of solar X-class flares and CMEs: a paradigm for stellar superflares and eruptions?” Solar Physics, vol. 291, no. 6, pp. 1761–1782, 2016.
6. E. A. Kasatkina, O. I. Shumilov, M. J. Rycroft, F. Marcz, and A. V. Frank-Kamenetsky, “Atmospheric electric field anomalies associated with solar flare/coronal mass ejection events and solar energetic charged particle "ground level events",” Atmospheric Chemistry and Physics Discussions, vol. 9, no. 5, pp. 21941–21958, 2009.
7. R. Li, H. N. Wang, Y. M. Cui, and X. Huang, “Solar flare forecasting using learning vector quantity and unsupervised clustering techniques,” Science China Physics, Mechanics & Astronomy, vol. 54, no. 8, pp. 1546–1552, 2011.
8. H. N. Wang, Y. M. Cui, R. Li, L. Y. Zhang, and H. Han, “Solar flare forecasting model supported with artificial neural network techniques,” Advances in Space Research, vol. 42, no. 9, pp. 1464–1468, 2008.
9. X. Huang and H. N. Wang, “Solar flare prediction using highly stressed longitudinal magnetic field parameters,” Research in Astronomy and Astrophysics, vol. 13, no. 3, pp. 351–358, 2013.
10. X. Huang, L. Zhang, H. Wang, and L. Li, “Improving the performance of solar flare prediction using active longitudes information,” Astronomy and Astrophysics, vol. 549, p. A127, 2013.
11. G. E. Hale, F. Ellerman, S. B. Nicholson, and A. H. Joy, “The magnetic polarity of sun-spots,” The Astrophysical Journal, vol. 49, no. 3, p. 153, 1919.
12. M. Waldmeier, “Chromosphärische Eruptionen. II. Mit 5 Abbildungen,” Zeitschrift für Astrophysik, vol. 20, p. 46, 1941.
13. A. L. Cortie, “On the types of sunspot disturbances,” The Astrophysical Journal, vol. 13, no. 4, p. 260, 1901.
14. R. J. Bray, R. E. Loughhead, and B. W. Shore, “Sunspots,” Physics Today, vol. 18, no. 7, p. 68, 1965.
15. P. S. McIntosh, “The classification of sunspot groups,” Solar Physics, vol. 125, no. 2, pp. 251–267, 1990.
16. I. Sammis, F. Tang, and H. Zirin, “The dependence of large flare occurrence on the magnetic structure of sunspots,” The Astrophysical Journal, vol. 540, no. 1, pp. 583–587, 2000.
17. Y. Cui, R. Li, L. Zhang, Y. He, and H. Wang, “Correlation between solar flare productivity and photospheric magnetic field properties,” Solar Physics, vol. 237, no. 1, pp. 45–59, 2006.
18. Y. Cui, R. Li, H. Wang, and H. He, “Correlation between solar flare productivity and photospheric magnetic field properties II. Magnetic gradient and magnetic shear,” Solar Physics, vol. 242, no. 1-2, pp. 1–8, 2007.
19. X. L. Yan, L. H. Deng, Z. Q. Qu, and C. L. Xu, “The phase relation between sunspot numbers and soft X-ray flares,” Astrophysics and Space Science, vol. 333, no. 1, pp. 11–16, 2011.
20. A. E. McCloskey, P. T. Gallagher, and D. S. Bloomfield, “Flare forecasting using the evolution of McIntosh sunspot classifications,” Journal of Space Weather and Space Climate, vol. 8, p. A34, 2018.
21. K. D. Leka and G. Barnes, “Photospheric magnetic field properties of flaring versus flare-quiet active regions. I. Data, general approach, and sample results,” The Astrophysical Journal, vol. 595, no. 2, pp. 1277–1295, 2003.
22. K. D. Leka and G. Barnes, “Photospheric magnetic field properties of flaring versus flare-quiet active regions. II. Discriminant analysis,” The Astrophysical Journal, vol. 595, no. 2, pp. 1296–1306, 2003.
23. G. Barnes and K. D. Leka, “Photospheric magnetic field properties of flaring versus flare-quiet active regions. III. Magnetic charge topology models,” The Astrophysical Journal, vol. 646, no. 2, pp. 1303–1318, 2006.
24. S. Eren, A. Kilcik, T. Atay et al., “Flare-production potential associated with different sunspot groups,” Monthly Notices of the Royal Astronomical Society, vol. 465, no. 1, pp. 68–75, 2017.
25. C. J. Schrijver, “A characteristic magnetic field pattern associated with all major solar flares and its use in flare forecasting,” The Astrophysical Journal, vol. 655, no. 2, pp. L117–L120, 2007.
26. H. Song, C. Tan, J. Jing, H. Wang, V. Yurchyshyn, and V. Abramenko, “Statistical assessment of photospheric magnetic features in imminent solar flare predictions,” Solar Physics, vol. 254, no. 1, pp. 101–125, 2009.
27. R. Qahwaji and T. Colak, “Automatic short-term solar flare prediction using machine learning and sunspot associations,” Solar Physics, vol. 241, no. 1, pp. 195–211, 2007.
28. Y. Yuan, F. Y. Shih, J. Jing, and H. M. Wang, “Automated flare forecasting using a statistical learning technique,” Research in Astronomy and Astrophysics, vol. 10, no. 8, pp. 785–796, 2010.
29. O. W. Ahmed, R. Qahwaji, T. Colak, P. A. Higgins, P. T. Gallagher, and D. S. Bloomfield, “Solar flare prediction using advanced feature extraction, machine learning, and feature selection,” Solar Physics, vol. 283, no. 1, pp. 157–175, 2013.
30. A. Raboonik, H. Safari, N. Alipour, and M. S. Wheatland, “Prediction of solar flares using unique signatures of magnetic field images,” The Astrophysical Journal, vol. 834, no. 1, p. 11, 2017.
31. A. Al-Ghraibah, L. E. Boucheron, and R. T. J. McAteer, “An automated classification approach to ranking photospheric proxies of magnetic energy build-up,” Astronomy & Astrophysics, vol. 579, p. A64, 2015.
32. J. A. Guerra, A. Pulkkinen, and V. M. Uritsky, “Ensemble forecasting of major solar flares: first results,” Space Weather, vol. 13, no. 10, pp. 626–642, 2015.
33. J. F. Liu, F. Li, J. Wan, and D. R. Yu, “Short-term solar flare prediction using multi-model integration method,” Research in Astronomy and Astrophysics, vol. 17, no. 4, p. 034, 2017.
34. F. Benvenuto, M. Piana, C. Campi, and A. M. Massone, “A hybrid supervised/unsupervised machine learning approach to solar flare prediction,” The Astrophysical Journal, vol. 853, no. 1, p. 90, 2018.
35. N. Nishizuka, K. Sugiura, Y. Kubo, M. Den, S. Watari, and M. Ishii, “Solar flare prediction model with three machine-learning algorithms using ultraviolet brightening and vector magnetograms,” The Astrophysical Journal, vol. 835, no. 2, p. 156, 2017.
36. C. Liu, N. Deng, J. T. L. Wang, and H. Wang, “Predicting solar flares using SDO/HMI vector magnetic data products and the random forest algorithm,” The Astrophysical Journal, vol. 843, no. 2, p. 104, 2017.
37. G. E. Hinton and R. R. Salakhutdinov, “Reducing the dimensionality of data with neural networks,” Science, vol. 313, no. 5786, pp. 504–507, 2006.
38. R. Li and X. Huang, “Solar flare forecasting model based on automatic feature extraction of sunspots,” SCIENTIA SINICA Physica, Mechanica & Astronomica, vol. 48, no. 11, p. 119601, 2018.
39. H. Liu, C. Liu, J. T. L. Wang, and H. Wang, “Predicting solar flares using a long short-term memory network,” The Astrophysical Journal, vol. 877, no. 2, pp. 1–14, 2019.
40. X. Huang, H. Wang, L. Xu, J. Liu, R. Li, and X. Dai, “Deep learning based solar flare forecasting model. I. Results for line-of-sight magnetograms,” The Astrophysical Journal, vol. 856, no. 1, p. 7, 2018.
41. N. Nishizuka, K. Sugiura, Y. Kubo, M. Den, and M. Ishii, “Deep Flare Net (DeFN) model for solar flare prediction,” The Astrophysical Journal, vol. 858, no. 2, p. 113, 2018.
42. N. Nishizuka, Y. Kubo, K. Sugiura, M. Den, and M. Ishii, “Reliable probability forecast of solar flares: Deep Flare Net-Reliable (DeFN-R),” The Astrophysical Journal, vol. 899, no. 2, p. 150, 2020.
43. M. G. Bobra and S. Couvidat, “Solar flare prediction using SDO/HMI vector magnetic field data with a machine-learning algorithm,” The Astrophysical Journal, vol. 798, no. 2, 2015.
44. X. Sun and the CGEM Team, “The CGEM Lorentz force data from HMI vector magnetograms,” 2014, http://arxiv.org/abs/1405.7353.
45. K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778, 2016.
46. F. Chollet, “Xception: deep learning with depthwise separable convolutions,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1251–1258, 2017.
47. K. Florios, S. H. Park, J. A. Guerra, F. Benvenuto, D. S. Bloomfield, and M. K. Georgoulis, “Forecasting solar flares using magnetogram-based predictors and machine learning,” Solar Physics, vol. 293, no. 2, 2018.
48. D. S. Bloomfield, P. A. Higgins, R. T. J. McAteer, and P. T. Gallagher, “Toward reliable benchmarking of solar flare forecasting methods,” The Astrophysical Journal Letters, vol. 747, no. 2, p. L41, 2012.

Copyright © 2022 Jun Chen et al. Exclusive Licensee Beijing Institute of Technology Press. Distributed under a Creative Commons Attribution License (CC BY 4.0).
