Interpretable software estimation with graph neural networks and orthogonal array tuning method
Published in: Information processing & management, 2024-09, Vol. 61 (5), p. 103778, Article 103778
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Software estimation rates are still suboptimal regarding efficiency, runtime, and the accuracy of model predictions. Graph Neural Networks (GNNs) are complex, yet their precise forecasting reduces the gap between expected and actual software development efforts, thereby minimizing associated risks. However, defining optimal hyperparameter configurations remains a challenge. This paper compares state-of-the-art models such as Long Short-Term Memory (LSTM), Graph Gated Neural Networks (GGNN), and Graph Gated Sequence Neural Networks (GGSNN), and conducts experiments with various hyperparameter settings to optimize performance. We also aim to extract the most informative feedback from our models by exploring insights with a post-hoc, model-agnostic method, Shapley Additive Explanations (SHAP). Our findings indicate that the Taguchi orthogonal array optimization method is the most computationally efficient, yielding notably improved performance metrics. This represents a favorable trade-off between computational efficiency and prediction accuracy, achieved with the fewest runs, with an RMSE of 0.9211 and an MAE of 310.4. For the best-performing model, the GGSNN, within the Constructive Cost Model (COCOMO), Function Point Analysis (FPA), and Use Case Points (UCP) frameworks, applying the SHAP method leads to a more accurate determination of relevance, as evidenced by the norm reduction in activation vectors. The SHAP method stands out by exhibiting the smallest area under the curve and faster convergence, indicating its efficiency in pinpointing concept relevance.
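The abstract highlights the Taguchi orthogonal array method as the most computationally efficient way to search hyperparameter settings. The paper does not publish its experimental code; as a minimal, self-contained sketch of the idea, the fragment below sweeps three two-level hyperparameters with a Taguchi L4(2^3) array, covering the design space in 4 runs instead of the 8 a full factorial would need. The hyperparameter names, their levels, and the `toy_loss` scoring function are illustrative assumptions, not the paper's actual setup.

```python
# Hypothetical sketch: Taguchi L4(2^3) orthogonal array for hyperparameter search.
# Each row gives the level (0 or 1) of each of the three factors for one run.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

# Assumed two-level settings for three hyperparameters (illustrative values).
levels = {
    "learning_rate": [1e-3, 1e-2],
    "hidden_units":  [32, 64],
    "dropout":       [0.0, 0.5],
}

def toy_loss(lr, units, drop):
    # Stand-in for a real validation RMSE; any deterministic score works here.
    return abs(lr - 5e-3) * 100 + abs(units - 64) / 64 + drop * 0.1

names = list(levels)
runs = []
for row in L4:
    cfg = {n: levels[n][lvl] for n, lvl in zip(names, row)}
    score = toy_loss(cfg["learning_rate"], cfg["hidden_units"], cfg["dropout"])
    runs.append((score, cfg))

best_loss, best_cfg = min(runs, key=lambda r: r[0])
print(len(runs), "runs instead of", 2 ** len(names))
print("best config:", best_cfg)
```

The orthogonal array's defining property is balance: each pair of factor levels appears together equally often, so main effects can be estimated from far fewer runs than a full factorial, which matches the abstract's claim about requiring the fewest runs.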
Highlights:
• Graph Neural Networks enhance software prediction accuracy, lowering risk and effort gaps.
• The Taguchi hyperparameter optimization method excels in Use Case Points efficiency and accuracy.
• The Graph Gated Sequence Neural Network achieves the best performance for efficient effort and cost estimation.
• The SHAP method outperforms other centrality measures in terms of speed and norm reduction.
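SHAP attributes a prediction to input features via Shapley values: each feature's contribution is its marginal effect averaged over all feature coalitions. As a hedged illustration of that principle (not the paper's SHAP pipeline, which explains GGSNN activations), the sketch below computes exact Shapley values by brute force for an assumed three-feature linear toy model, where the closed-form answer is easy to verify.

```python
from itertools import combinations
from math import factorial

# Assumed toy model: a linear predictor with three features.
weights  = [2.0, -1.0, 0.5]   # illustrative coefficients
x        = [1.0,  3.0, 4.0]   # instance to explain
baseline = [0.0,  0.0, 0.0]   # reference input standing in for "missing" features

def model(inputs):
    return sum(w * v for w, v in zip(weights, inputs))

def value(coalition):
    # Features outside the coalition are replaced by their baseline value.
    inputs = [x[i] if i in coalition else baseline[i] for i in range(len(x))]
    return model(inputs)

def shapley(i, n):
    # Exact Shapley value: weighted marginal contribution over all coalitions.
    phi = 0.0
    others = [j for j in range(n) if j != i]
    for size in range(n):
        for S in combinations(others, size):
            w = factorial(size) * factorial(n - size - 1) / factorial(n)
            phi += w * (value(set(S) | {i}) - value(set(S)))
    return phi

phis = [shapley(i, 3) for i in range(3)]
print("per-feature attributions:", phis)
# Efficiency property: attributions sum to the prediction gap vs. the baseline.
print(sum(phis), "==", model(x) - model(baseline))
```

For a linear model each attribution reduces to `w[i] * (x[i] - baseline[i])`, which makes the brute-force result easy to check; real SHAP implementations approximate this average, since the exact sum over coalitions is exponential in the number of features.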
ISSN: 0306-4573, 1873-5371
DOI: 10.1016/j.ipm.2024.103778