Machine Learning Enabled Adaptive Optimization of a Transonic Compressor Rotor With Pre-Compression

Author and Article Information
Michael Joly

Thermal & Fluid Sciences, United Technologies Research Center, East Hartford, CT, USA

Soumalya Sarkar

Autonomous and Intelligent Systems, United Technologies Research Center, East Hartford, CT, USA

Dhagash Mehta

Autonomous and Intelligent Systems, United Technologies Research Center, East Hartford, CT, USA

Corresponding author.

ASME doi:10.1115/1.4041808 History: Received September 21, 2018; Revised October 17, 2018


In aerodynamic design, accurate and robust surrogate models are important to accelerate computationally expensive CFD-based optimization. In this paper, a machine learning framework is presented to speed up the design optimization of a highly loaded transonic compressor rotor. The approach is threefold: (1) dynamic selection and self-tuning among several surrogate models; (2) classification to anticipate failure of the performance evaluation; and (3) adaptive selection of new candidates for CFD evaluation to update the surrogate, which facilitates design-space exploration and reduces surrogate uncertainty. The framework is demonstrated on a multi-point optimization of the transonic NASA Rotor 37, yielding increased compressor efficiency in less than 48 hours on 100 CPU cores. The optimized rotor geometry features pre-compression that relocates and attenuates the shock, without the stability penalty or undesired reacceleration usually observed in the literature.
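The three ingredients of the framework — surrogate self-selection, failure classification, and adaptive infill — can be illustrated with a deliberately simplified sketch. This is not the authors' implementation: it uses a 1-D analytic stand-in for the CFD solver, polynomial surrogates selected by leave-one-out error, a nearest-neighbor proxy for the failure classifier, and a distance-based exploration bonus as the infill criterion; every function name and threshold below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_eval(x):
    """Stand-in for one CFD evaluation (hypothetical 1-D test function).
    Returns None to mimic a diverged simulation in part of the design space."""
    if x > 0.9:                       # region where the "solver" fails
        return None
    return np.sin(6 * x) + 0.5 * x    # pseudo efficiency metric to maximize

def loo_error(X, y, degree):
    """Leave-one-out RMS error, used to dynamically pick a surrogate."""
    errs = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        c = np.polyfit(X[mask], y[mask], degree)
        errs.append((np.polyval(c, X[i]) - y[i]) ** 2)
    return np.sqrt(np.mean(errs))

def feasible(x, X_fail):
    """1-NN 'failure classifier': reject candidates near known failures."""
    return len(X_fail) == 0 or np.min(np.abs(np.asarray(X_fail) - x)) > 0.05

# --- initial design of experiments ------------------------------------
X, y, X_fail = [], [], []
for x in np.linspace(0.0, 1.0, 8):
    f = expensive_eval(x)
    if f is None:
        X_fail.append(x)
    else:
        X.append(x)
        y.append(f)

# --- adaptive surrogate-update loop -----------------------------------
for _ in range(10):
    Xa, ya = np.array(X), np.array(y)
    # (1) dynamic surrogate selection among candidate models
    best_deg = min((1, 2, 3), key=lambda d: loo_error(Xa, ya, d))
    coeffs = np.polyfit(Xa, ya, best_deg)
    # (2) screen random candidates with the failure classifier
    cand = np.array([c for c in rng.uniform(0, 1, 200) if feasible(c, X_fail)])
    # (3) infill: predicted performance + exploration bonus
    #     (distance to nearest evaluated sample as an uncertainty proxy)
    score = (np.polyval(coeffs, cand)
             + 0.5 * np.min(np.abs(cand[:, None] - Xa[None, :]), axis=1))
    x_new = cand[np.argmax(score)]
    f = expensive_eval(x_new)
    if f is None:
        X_fail.append(x_new)    # failed run still informs the classifier
    else:
        X.append(x_new)
        y.append(f)

x_best = X[int(np.argmax(y))]
print(f"best design x = {x_best:.3f}, metric = {max(y):.3f}")
```

Note how a failed evaluation is not wasted: it is fed back into the failure set so the classifier steers later infill points away from that region, mirroring step (2) of the framework.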

Copyright (c) 2018 by ASME