Hybrid LSTM Self-Attention Mechanism Model for Forecasting the Reform of Scientific Research in Morocco
Published in: | Computational Intelligence and Neuroscience, 2021, Vol. 2021 (1), p. 6689204, Article 6689204 |
---|---|
Main authors: | , , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Education is the cultivation of people to promote and guarantee the development of society. Education reforms can play a vital role in the development of a country. However, it is crucial to continually monitor an educational model’s performance by forecasting the progress of its outcomes. Machine learning-based models are currently a hot topic in the forecasting research area. Forecasting models can help analyse the impact of future outcomes by showing yearly trends. For this study, we developed a hybrid time-series forecasting model that combines a long short-term memory (LSTM) network with a self-attention mechanism (SAM) to monitor Morocco’s educational reform. We analysed the performance of six universities and provided a prediction model to evaluate the best-performing university after implementation of the latest reform, i.e., from 2015 to 2030. We forecasted the six universities’ research outcomes and tested the accuracy of our proposed methodology against other time-series models. Results show that our model performs better at predicting research outcomes. The percentage increase in university performance after nine years is discussed to help predict the best-performing university. The accuracy and performance of our proposed algorithm are better than those of other algorithms such as LSTM and RNN. |
ISSN: | 1687-5265, 1687-5273 |
DOI: | 10.1155/2021/6689204 |
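
The record above carries no implementation details, so the following is only a minimal sketch of the kind of architecture the abstract names: an LSTM encoder whose hidden states are re-weighted by self-attention before a linear head produces the next yearly value, rolled forward recursively for multi-year forecasts. It is written in PyTorch under assumed hyperparameters (hidden size 64, 4 attention heads, a 10-year input window); the class and function names (`LSTMSelfAttentionForecaster`, `forecast_years`) and the synthetic data are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn


class LSTMSelfAttentionForecaster(nn.Module):
    """Hypothetical hybrid LSTM + self-attention forecaster for a yearly
    research-output series; layer sizes are illustrative, not the paper's."""

    def __init__(self, n_features: int = 1, hidden_size: int = 64, n_heads: int = 4):
        super().__init__()
        # The LSTM encodes the window of past yearly observations.
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Self-attention re-weights the LSTM hidden states so the model can
        # focus on the most informative past years.
        self.attn = nn.MultiheadAttention(hidden_size, n_heads, batch_first=True)
        # A linear head maps the attended representation to the next value.
        self.head = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_len, n_features)
        h, _ = self.lstm(x)            # (batch, window_len, hidden_size)
        a, _ = self.attn(h, h, h)      # self-attention over time steps
        return self.head(a[:, -1, :])  # one-step-ahead forecast


def forecast_years(model: nn.Module, history: torch.Tensor, n_years: int) -> torch.Tensor:
    """Roll the one-step model forward recursively to forecast several years ahead."""
    window, preds = history.clone(), []
    model.eval()
    for _ in range(n_years):
        with torch.no_grad():
            nxt = model(window)  # (batch, 1)
        preds.append(nxt)
        # Slide the window: drop the oldest year, append the new prediction.
        window = torch.cat([window[:, 1:, :], nxt.unsqueeze(1)], dim=1)
    return torch.cat(preds, dim=1)  # (batch, n_years)


# Usage with synthetic data: a 10-year history per university, forecast six more years.
history = torch.randn(6, 10, 1)  # 6 universities, 10 past years, 1 output indicator
model = LSTMSelfAttentionForecaster()
future = forecast_years(model, history, n_years=6)
print(future.shape)  # torch.Size([6, 6])
```

Attending over all LSTM hidden states, rather than using only the final state, is what lets a hybrid of this kind weight earlier reform years directly when producing each forecast; in this sketch, multi-year predictions to a horizon such as 2030 are then obtained by feeding each one-step forecast back into the input window.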