
Cascade Bayesian Optimization

  • Conference paper

AI 2016: Advances in Artificial Intelligence (AI 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9992)

Abstract

Multi-stage cascade processes are common, especially in the manufacturing industry: precursors or raw materials are transformed at each stage before being used as the input to the next stage. Setting the right control parameters at each stage is important for achieving high-quality products at low cost, and finding those parameters by trial and error can be time-consuming. Bayesian optimization is an efficient way to optimize costly black-box functions. We extend the standard Bayesian optimization approach to the cascade process by formulating a series of optimization problems that are solved sequentially from the final stage to the first stage. Epistemic uncertainties are effectively utilized in the formulation, and the cost of the parameters is also included to find cost-efficient solutions. Experiments performed on a simulated testbed of Al-Sc heat treatment through a three-stage process showed considerable efficiency gains over a naïve optimization approach.
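The backward, stage-by-stage formulation described in the abstract can be illustrated with a toy sketch. The snippet below is not the authors' method or the Al-Sc simulator: it runs a standard Bayesian optimization loop (a hand-rolled Gaussian-process surrogate with an RBF kernel and a UCB acquisition function) on three hypothetical one-dimensional stage functions, optimizing the control parameter of each stage in turn from the final stage back to the first, with each stage's material input fixed at a nominal value for simplicity.

```python
import numpy as np

# Toy three-stage cascade (hypothetical stand-ins, not the paper's Al-Sc model):
# each stage maps (input quality x, control parameter c) -> output quality.
stages = [
    lambda x, c: x + np.sin(3.0 * c),            # stage 1
    lambda x, c: x * np.exp(-((c - 0.5) ** 2)),  # stage 2
    lambda x, c: x - (c - 0.2) ** 2,             # stage 3
]

def gp_posterior(X, y, Xs, ls=0.3, noise=1e-6):
    """GP regression with an RBF kernel; returns posterior mean/std at Xs."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    ym = y.mean()
    mu = Ks.T @ np.linalg.solve(K, y - ym) + ym
    var = 1.0 - np.einsum("ij,ij->j", Ks, np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def bo_stage(f, n_iter=15, kappa=2.0, seed=0):
    """Standard BO loop (GP surrogate + UCB acquisition) for one stage's control."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 200)
    C = rng.uniform(0.0, 1.0, 3)                   # initial random controls
    Y = np.array([f(c) for c in C])
    for _ in range(n_iter):
        mu, sd = gp_posterior(C, Y, grid)
        c_next = grid[np.argmax(mu + kappa * sd)]  # UCB acquisition
        C, Y = np.append(C, c_next), np.append(Y, f(c_next))
    return C[np.argmax(Y)]

# Solve one optimization problem per stage, from the final stage backward;
# each stage's material input is held at a nominal value (1.0) for simplicity.
controls = {}
for i, stage in reversed(list(enumerate(stages))):
    controls[i] = bo_stage(lambda c, s=stage: s(1.0, c))
```

In the paper the stages are coupled (a stage's optimum depends on its input, and parameter costs enter the objective); this sketch only shows the surrogate-plus-acquisition machinery applied stage-wise, not the full cascade formulation.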



Acknowledgement

This work is partially supported by the Telstra-Deakin Centre of Excellence in Big Data and Machine Learning.

Author information

Correspondence to Thanh Dai Nguyen.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Nguyen, T.D. et al. (2016). Cascade Bayesian Optimization. In: Kang, B.H., Bai, Q. (eds.) AI 2016: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 9992. Springer, Cham. https://doi.org/10.1007/978-3-319-50127-7_22

  • DOI: https://doi.org/10.1007/978-3-319-50127-7_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-50126-0

  • Online ISBN: 978-3-319-50127-7
