
Weighted error-output recurrent Xavier echo state network for concept drift handling in water level prediction

Published: 01 November 2024 Publication History

Abstract

Water level is of central importance in maritime domains. Precise water level predictions provide indispensable information for safe navigation, guiding ships and vessels through passages, harbors, and waterways. This paper introduces a novel approach: the Weighted Error-Output Recurrent Xavier Echo State Network with Adaptive Forgetting Factor (WER-XESN-AFF). One contribution of this study is a Xavier weight selection method that replaces the random weight selection of the Echo State Network (ESN); it not only enhances forecasting performance but also reduces uncertainty in predictions. Additionally, two modified concept drift detectors, the Early Drift Detection Method (EDDM) and the Adaptive Forgetting Factor (AFF), are employed to address concept drift. Another notable contribution is a novel weighted error-output recurrent multi-step algorithm, which mitigates the error accumulation problem by using past forecast errors to update the current output weights. Extensive experiments evaluate the effectiveness of the approach for multi-step prediction on synthetic and real datasets, comparing conventional randomization-based models against the ESN with the new weight selection approach, and testing the concept drift detectors and the weighted error-output multi-step algorithm. Empirical findings and statistical analyses demonstrate that the proposed methods achieve the expected effects and that the proposed model predicts more accurately than the baselines. On the Jiujiang water level dataset, WER-XESN-AFF improves Mean Squared Error by 75.39% over the baseline R-ESN across prediction steps 1–5.
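The paper's exact model is not reproduced on this page; as a rough illustration of the core idea, the following is a minimal sketch of a basic ESN whose input and reservoir matrices use Xavier (Glorot) uniform initialization instead of the usual plain random draw. The class and parameter names (`SimpleESN`, `n_reservoir`, `spectral_radius`) are hypothetical, and the ridge-regression readout is the standard textbook ESN training step, not necessarily the authors' formulation.

```python
import numpy as np

def xavier_uniform(n_out, n_in, rng):
    """Xavier/Glorot uniform init: U(-a, a) with a = sqrt(6 / (n_in + n_out))."""
    a = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-a, a, size=(n_out, n_in))

class SimpleESN:
    """Minimal ESN with Xavier-initialized input and reservoir weights."""

    def __init__(self, n_inputs, n_reservoir=100, spectral_radius=0.9, rng=None):
        rng = rng or np.random.default_rng()
        self.W_in = xavier_uniform(n_reservoir, n_inputs, rng)
        W = xavier_uniform(n_reservoir, n_reservoir, rng)
        # Rescale to the desired spectral radius (standard echo-state condition).
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.n_reservoir = n_reservoir

    def states(self, U):
        """Run the reservoir over an input sequence, collecting states."""
        x = np.zeros(self.n_reservoir)
        X = np.empty((len(U), self.n_reservoir))
        for t, u in enumerate(U):
            x = np.tanh(self.W_in @ np.atleast_1d(u) + self.W @ x)
            X[t] = x
        return X

    def fit(self, U, y, ridge=1e-6):
        X = self.states(U)
        # Ridge-regression readout: W_out = (X^T X + lambda I)^(-1) X^T y
        self.W_out = np.linalg.solve(
            X.T @ X + ridge * np.eye(self.n_reservoir), X.T @ y
        )
        return self

    def predict(self, U):
        return self.states(U) @ self.W_out
```

For instance, fitting one-step-ahead prediction of a sine wave with `SimpleESN(1).fit(u[:-1], u[1:])` recovers the signal closely; only the readout `W_out` is trained, which is what makes the choice of the fixed input/reservoir weights (random vs. Xavier) matter.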


Highlights

Novel weight selection method enhances stability and accuracy.
Modified EDDM and AFF tackle concept drift, improving adaptability and training speed.
Unique error-output algorithm mitigates error accumulation in multi-step prediction.
Outperforms baselines in long-term water level prediction, demonstrating practicality.



Published In

Applied Soft Computing, Volume 165, Issue C
November 2024
1386 pages

Publisher

Elsevier Science Publishers B. V.

Netherlands


Author Tags

  1. Echo state network
  2. Concept drift detector
  3. Error-output recurrent algorithm
  4. Error accumulation
  5. Water level prediction

Qualifiers

  • Research-article
