Abstract
Multi-stage cascade processes are common, especially in the manufacturing industry. Precursors or raw materials are transformed at each stage before being used as the input to the next stage. Setting the right control parameters at each stage is important for achieving high-quality products at low cost, and finding these parameters by trial and error can be time consuming. Bayesian optimization is an efficient way to optimize costly black-box functions. We extend the standard Bayesian optimization approach to the cascade process by formulating a series of optimization problems that are solved sequentially from the final stage to the first stage. Epistemic uncertainties are effectively utilized in the formulation. Further, the cost of the parameters is also included to find cost-efficient solutions. Experiments performed on a simulated testbed of Al-Sc heat treatment through a three-stage process showed considerable efficiency gain over a naïve optimization approach.
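To make the setting concrete, the sketch below applies plain Bayesian optimization jointly over all control parameters of a toy three-stage cascade. This corresponds to the naïve baseline mentioned in the abstract, not the authors' stage-wise formulation; the stage functions, the GP-UCB acquisition, and all parameter choices are illustrative assumptions, and the real Al-Sc heat-treatment simulator is replaced by simple analytic stand-ins.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy three-stage cascade: each stage transforms its input under one
# control parameter in [0, 1]. These are illustrative stand-ins, not
# the Al-Sc heat-treatment model from the paper.
def stage1(x1):
    return np.sin(3.0 * x1)

def stage2(y, x2):
    return y * np.cos(2.0 * x2)

def stage3(y, x3):
    return y + 0.5 * np.sin(5.0 * x3)

def cascade(params):
    """End-to-end black box: chain the three stages."""
    x1, x2, x3 = params
    return stage3(stage2(stage1(x1), x2), x3)

def bayes_opt(f, dim, n_init=5, n_iter=20, beta=2.0):
    """Naive joint Bayesian optimization with a GP surrogate and a
    GP-UCB acquisition, maximized over a random candidate set."""
    X = rng.uniform(0.0, 1.0, size=(n_init, dim))
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
    for _ in range(n_iter):
        gp.fit(X, y)
        cand = rng.uniform(0.0, 1.0, size=(500, dim))
        mu, sd = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(mu + beta * sd)]  # exploit mean + explore std
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    best = np.argmax(y)
    return X[best], y[best]

best_x, best_y = bayes_opt(cascade, dim=3)
```

Because every stage's output feeds the next, the joint search space grows with the number of stages; the paper's contribution is to exploit the cascade structure by optimizing stage by stage from the final stage backwards instead.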
Acknowledgement
This work is partially supported by the Telstra-Deakin Centre of Excellence in Big Data and Machine Learning.
© 2016 Springer International Publishing AG
Cite this paper
Nguyen, T.D. et al. (2016). Cascade Bayesian Optimization. In: Kang, B.H., Bai, Q. (eds) AI 2016: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 9992. Springer, Cham. https://doi.org/10.1007/978-3-319-50127-7_22
Print ISBN: 978-3-319-50126-0
Online ISBN: 978-3-319-50127-7