IJITCS Vol. 17, No. 2, 8 Apr. 2025
Keywords: Green AI, Sustainable Environment, Multi-objective Fitness, Machine Learning, Hyperparameter Optimization
Hyperparameter tuning is an essential step in machine learning (ML) model optimization, as it is necessary for improving model performance. However, this enhancement comes at a high cost in computational resources and time: model tuning can significantly raise energy consumption and, consequently, carbon emissions. There is therefore a pressing need for a framework that treats carbon emissions as a vital consideration alongside performance. This paper proposes a novel Sustainable Hyperparameter Optimization (SHPO) framework built on an optimized multi-objective fitness approach. The framework focuses on ensemble classification models (ECMs), namely Random Forest, ExtraTrees, XGBoost, and AdaBoost. These models are optimized using traditional and advanced techniques such as Grid Search, Optuna, and Hyperopt, while the framework tracks the carbon emitted during hyperparameter tuning. The methodology applies the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), a multi-criteria decision-making (MCDM) method, to rank hyperparameter sets on both accuracy and carbon emissions. The goal of the multi-objective fitness approach is to identify the parameter set that combines high accuracy with low carbon emissions. The experimental results show that Optuna-based hyperparameter optimization consistently produced low carbon emissions and achieved high predictive accuracy across the majority of benchmark hyperparameter setups for these models.
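The TOPSIS ranking step described in the abstract can be sketched as follows. This is a minimal illustration of standard TOPSIS (vector normalization, weighted ideal/anti-ideal distances, closeness coefficient), not the paper's actual implementation; the candidate hyperparameter sets, the equal weights, and the function name `topsis_rank` are illustrative assumptions.

```python
import numpy as np

def topsis_rank(scores, weights, benefit):
    """Rank alternatives with TOPSIS.

    scores:  (n_alternatives, n_criteria) decision matrix
    weights: importance of each criterion (here: accuracy, emissions)
    benefit: True for criteria to maximize (accuracy),
             False for criteria to minimize (carbon emissions)
    Returns (ranking indices, closeness coefficients).
    """
    X = np.asarray(scores, dtype=float)
    # 1. vector-normalize each criterion column
    V = (X / np.linalg.norm(X, axis=0)) * np.asarray(weights, dtype=float)
    # 2. ideal and anti-ideal points per criterion direction
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Euclidean distances and relative closeness to the ideal
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    return np.argsort(-closeness), closeness

# Hypothetical tuning trials: [validation accuracy, kg CO2-eq emitted]
candidates = [
    [0.91, 0.040],
    [0.93, 0.120],  # most accurate, but emits the most carbon
    [0.92, 0.050],
]
order, score = topsis_rank(candidates, weights=[0.5, 0.5],
                           benefit=[True, False])
print(order)  # best accuracy/emissions trade-off first
```

With equal weights, the high-accuracy but carbon-heavy trial ranks last, matching the framework's goal of trading a small amount of accuracy for a large emissions saving; the weights would let a practitioner tilt the ranking toward either criterion.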
K. Jegadeeswari, R. Rathipriya, "Green AI Practices in Multi-objective Hyperparameter Optimization for Sustainable Machine Learning", International Journal of Information Technology and Computer Science (IJITCS), Vol. 17, No. 2, pp. 1-9, 2025. DOI: 10.5815/ijitcs.2025.02.01