Efficiency of the Combination of Machine Learning Models in the Evaluation of Weather Parameters

Authors

  • Yannick Mubakilayi, University of Kinshasa
  • Simon Ntumba, University of Kinshasa
  • Pierre Kafunda, University of Kinshasa
  • Salem Cimanga, Official University of Mbujimayi
  • Gracias Kabulu, University of Mbujimayi

DOI:

https://doi.org/10.24014/coreit.v9i1.21713

Abstract

In this article we exploit the potential of combining machine learning models (ensemble learning) as one of the essential points of the soft aspect of climate monitoring, i.e. the tools for observing, monitoring, sampling and studying meteorological parameters, in order to provide effective support for, and follow-up of, the measures taken at different levels in the fight against climate change and for the sustainable management of the environment. We build an automatic learning model from measurements of the various meteorological parameters (temperature, rainfall, humidity, wind speed, etc.) and train it with the ensemble learning technique known as boosting on the measurements taken for each indicator, so that it learns continuously from past data and can predict the next weather conditions with high accuracy, or even produce annual or multi-year projections of the evolution of the climatic situation. These results are then presented to the various actors in our environment, enabling them to better anticipate possible extreme situations that could negatively affect our environmental situation.
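For illustration only, the following Python sketch shows how a boosting ensemble of the kind described above could be trained on such measurements; it is not the authors' code, and the file name, column names, and prediction target are assumptions introduced here, not details taken from the article.

    # Minimal boosting sketch on hypothetical weather measurements.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    # Hypothetical dataset: one row per day of measured parameters.
    df = pd.read_csv("weather_measurements.csv")                      # assumed file name
    features = ["temperature", "rainfall", "humidity", "wind_speed"]  # assumed column names
    X = df[features]
    y = df["temperature_next_day"]                                    # assumed prediction target

    # Keep chronological order when splitting past data from the evaluation period.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

    # Boosting: an ensemble of shallow trees, each one fitted to the residual
    # errors of the trees built before it.
    model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
    model.fit(X_train, y_train)

    print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))

The same pattern extends to the multi-year projections mentioned above by changing the target column to a measurement further in the future and retraining the ensemble as new data arrive.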

Author Biographies

Yannick Mubakilayi, University of Kinshasa

Department of Computer Science

Simon Ntumba, University of Kinshasa

Department of Computer Science

Pierre Kafunda, University of Kinshasa

Department of Computer Science

Salem Cimanga, Official University of Mbujimayi

Department of Computer Science

Gracias Kabulu, University of Mbujimayi

Department of Computer Science


Published

2023-07-08

Section

Articles