Abstract
To minimize power consumption while maximizing performance, today's multicore processors rely on fine-grained run-time dynamic power information, both in the time domain (e.g., \(\mu s\) to ms) and in the space domain (e.g., core level). The state of the art for deriving such power information is mainly based on predetermined power models that use linear modeling techniques to determine the core-performance/core-power relationship. However, as multicore processors become ever more complex, linear modeling techniques can no longer capture all possible core-performance-related power states. Although artificial neural networks (ANNs) have been proposed for coarse-grained power modeling of servers with time resolutions in the range of seconds, no work has yet investigated fine-grained ANN-based power modeling. In this paper, we explore feed-forward neural networks (FFNNs) for core-level power modeling with estimation rates in the range of 10 kHz. To achieve a high estimation accuracy, we determine optimized neural network architectures and train FFNNs on performance counter and power data from a complex out-of-order processor architecture. We show that the relative power estimation error decreases on average by 7.5% compared to a state-of-the-art linear power modeling approach and by 5.5% compared to a multivariate polynomial regression model. Furthermore, we propose an implementation for run-time inference of the power-modeling FFNN and show that the area overhead is negligible.
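The following is a minimal, illustrative PyTorch sketch of the kind of model the abstract describes: a small FFNN that maps per-core performance-counter samples to a core power estimate. The counter count, layer sizes, activation choice, and synthetic training data are placeholder assumptions for illustration only, not the optimized architecture or the performance-counter/power dataset used in the paper.

```python
# Hypothetical sketch: FFNN mapping per-core performance counters to core power.
# All sizes and data below are illustrative assumptions, not the paper's setup.
import torch
import torch.nn as nn

N_COUNTERS = 9  # assumed number of per-core performance counters per sample


class PowerFFNN(nn.Module):
    def __init__(self, n_inputs: int = N_COUNTERS, hidden: int = 32):
        super().__init__()
        # Two small hidden layers; the paper searches for an optimized
        # architecture, so these sizes are placeholders.
        self.net = nn.Sequential(
            nn.Linear(n_inputs, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # scalar core-power estimate
        )

    def forward(self, counters: torch.Tensor) -> torch.Tensor:
        return self.net(counters)


# Toy training loop on synthetic data standing in for (counter, power) traces
# sampled per estimation interval from an out-of-order core.
model = PowerFFNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.rand(4096, N_COUNTERS)           # normalized counter values per interval
y = (x.sum(dim=1, keepdim=True) * 0.3      # synthetic "power" target with a
     + 0.1 * torch.sin(x[:, :1] * 6.28))   # mild nonlinearity for illustration

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

Once trained, the weights of such a small network could be fixed and evaluated at each estimation interval, which is the run-time inference scenario the paper targets with a dedicated low-overhead implementation.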
Acknowledgments
This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) - Projektnummer 146371743 - TRR 89 “Invasive Computing”.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Sagi, M., Vu Doan, N.A., Fasfous, N., Wild, T., Herkersdorf, A. (2020). Fine-Grained Power Modeling of Multicore Processors Using FFNNs. In: Orailoglu, A., Jung, M., Reichenbach, M. (eds) Embedded Computer Systems: Architectures, Modeling, and Simulation. SAMOS 2020. Lecture Notes in Computer Science, vol 12471. Springer, Cham. https://doi.org/10.1007/978-3-030-60939-9_13