Inverse Design of Composite Turbine Blade Circular Coolant Flow Passages

Author and Article Information
T.-L. Chiang, G. S. Dulikravich

Department of Aerospace Engineering and Engineering Mechanics, The University of Texas at Austin, Austin, TX 78712

J. Turbomach 108(2), 275-282 (Oct 01, 1986) (8 pages) doi:10.1115/1.3262048 History: Received February 10, 1986; Online November 09, 2009


An inverse design and optimization method is developed to determine the proper sizes and locations of the circular holes (coolant flow passages) in a composite turbine blade. Temperature distributions on the outer blade surface and on the surfaces of the inner holes can be prescribed a priori. In addition, a heat flux distribution on the outer blade surface can be prescribed and iteratively enforced: the prescribed heat flux distribution is approached by using the Sequential Unconstrained Minimization Technique (SUMT) to adjust the sizes and locations of the initially guessed circular holes. During each optimization iteration, the two-dimensional heat conduction equation is solved with a direct Boundary Element Method (BEM) using a linear temperature singularity distribution. For manufacturing purposes, additional constraints are enforced to ensure a minimal prescribed blade wall thickness and a minimal spacing between the walls of neighboring holes. The method is applicable to both single-material (homogeneous) and coated (composite) turbine blades. Three different test cases demonstrate the feasibility and accuracy of the method.
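The optimization structure the abstract describes can be sketched as a SUMT-style loop: each stage minimizes an unconstrained penalized objective, then the penalty weight grows, driving the design toward feasibility. The sketch below is a minimal toy illustration, not the authors' implementation: the heat-flux objective is a hypothetical quadratic surrogate (in the paper it would come from a BEM solution of the heat conduction equation), the geometry is reduced to two hole centers on a 1-D wall centerline, and all numeric values are invented for illustration. Only the wall-thickness and hole-spacing constraints mirror the constraints named in the abstract.

```python
# Sketch of a SUMT (sequential unconstrained minimization) loop with an
# exterior quadratic penalty, on a hypothetical toy model of two coolant
# holes. Stand-in values throughout; not the paper's actual formulation.

def compass_min(F, x0, step=0.1, tol=1e-5):
    """Minimize F by simple compass (pattern) search -- a stand-in for the
    unconstrained minimizer used inside each SUMT stage."""
    x, fx = list(x0), F(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                fy = F(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5  # no descent direction at this step size; refine
    return x

def sumt(f, constraints, x0, mu=1.0, growth=10.0, stages=6):
    """Solve a sequence of unconstrained problems f + mu * penalty with an
    increasing penalty weight mu, warm-starting each stage from the last."""
    x = list(x0)
    for _ in range(stages):
        def F(z, mu=mu):
            # Exterior quadratic penalty on violated constraints g(z) >= 0.
            pen = sum(max(0.0, -g(z)) ** 2 for g in constraints)
            return f(z) + mu * pen
        x = compass_min(F, x)
        mu *= growth
    return x

# Toy geometry: two holes of fixed radius r on the centerline of a wall of
# half-width W; the design variables are the hole centers c[0], c[1].
W, r, t_min, s_min = 1.0, 0.2, 0.1, 0.1

def flux_mismatch(c):
    # Hypothetical surrogate for the outer-surface heat-flux mismatch; its
    # unconstrained minimum (0.8, -0.8) violates the wall-thickness limit.
    return (c[0] - 0.8) ** 2 + (c[1] + 0.8) ** 2

constraints = [
    lambda c: W - abs(c[0]) - r - t_min,         # wall thickness at hole 1
    lambda c: W - abs(c[1]) - r - t_min,         # wall thickness at hole 2
    lambda c: abs(c[0] - c[1]) - 2 * r - s_min,  # spacing between the holes
]

centers = sumt(flux_mismatch, constraints, [0.0, 0.0])
# The penalty pushes each hole to the thickness limit: |c| -> W - r - t_min = 0.7
```

As the penalty weight grows, the minimizer of each penalized stage approaches the constrained optimum from the infeasible side, which is why the loop warm-starts each stage from the previous solution rather than restarting from the initial guess.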

Copyright © 1986 by ASME