
UK Turbulence Consortium (UKTC)


N. D. Sandham (Southampton) and D. R. Emerson (CCLRC Daresbury Laboratory)


Introduction

The long-term objective of the UKTC is to develop a world lead in turbulence simulation and in the scientific use of the resulting data. Two main aspects are critical to achieving this goal: access to supercomputing resources on a scale comparable to that of our international competitors, and the scientific exploitation of the simulation data through collaboration across institutions.

The first of these requirements is largely out of our hands, since it depends on decisions taken by government as to the appropriate level of supercomputing investment in the UK, although members of the consortium are active on EPSRC committees with some influence over national strategy. The second aspect has been addressed by pursuing research topics, in some cases initiated by contacts within the consortium, through separate grant applications to EPSRC. Examples are the Southampton–Cambridge joint projects, as well as the Imperial College (Morrison) and UMIST (Craft) work that has been funded explicitly to use simulation data arising from consortium activities. Such inter-institution activity is certainly stronger now than before the first consortia were formed, and we hope to build on this in the future.

Examples of our international competitors are the NASA-Stanford Center for Turbulence Research (CTR) and the J. M. Burgers Centre in the Netherlands. The former has benefited from massive investment to become the world leader, but has only made data available via residential summer programmes. The Burgers Centre, by contrast, has established excellent inter-institution links, giving it an international profile with strong industry involvement despite more limited computer facilities. This has only been achieved with co-ordinated investment in staff, of a kind not facilitated by the UK funding system.

However, developments are underway in the UK that we believe can help the consortium in its ambitions. Besides the HPCx procurement, the whole area of e-science and the computational GRID has recently been given major impetus in the UK. Investment is being made in network infrastructure and computational tools to facilitate the movement of large amounts of data between institutions. This is exactly what is needed to make optimal use of the turbulence data being generated. We have already made exploratory studies of database tools (for example http://www.hpcc.ecs.soton.ac.uk/~turbulence/ and http://fluindigo.mech.surrey.ac.uk/indexpage.html, a Europe-wide repository of turbulence data) and are well placed to use advances in database technology to improve the scientific take-up of turbulence data. The consortium is not exclusive: new members have been added to the current consortium, and we endeavour to make data widely available outside the consortium when this is appropriate.

Support from Daresbury

Turbulent flow calculations are efficient only when codes are designed from the outset for specific applications. In this respect the field differs from many other areas of HPC, where only a handful of codes need to be parallelised and optimised. Five main code structures are used within the consortium for large calculations; the principal codes, and the support work planned for each, are described below.

Initially the consortium received support from the Southampton HPCI Centre. During this period, parts of several codes were optimised and a database project was initiated to prototype the exchange of data in common formats. When the Southampton HPCI Centre closed, support was transferred to Daresbury under the direction of Dr. David Emerson. As coordinator of CCP12, Dr. Emerson has been involved with all of the CFD research consortia since their formation in 1994. Support work has involved staff at Daresbury Laboratory, both through CCP12 and the HPCI Centre, in a range of activities that includes parallelisation of key application codes; porting, tuning and optimising codes; and evaluation of codes and algorithms on the UK's flagship computing facilities. With respect to UKTC activities, Daresbury has worked on a range of codes and has contributed at various levels, from general advice to detailed optimisation. Contributions include the parallelisation and optimisation of the SBLI code, the porting and evaluation of NEWT with Dr. Savill at Cambridge, and the initial conversion of the Vectoral code EBL to F90 with G. Coleman at Southampton. So far, two conference papers have been written specifically on this parallelisation and optimisation work (Yao et al. 2000; Ashworth et al. 2001). The following sections describe the main activities proposed at Daresbury in support of the new UKTC project.

EBL: This code was originally written in Vectoral, an efficient computer language based on Fortran that exploited vector systems. Conversion of the code from Vectoral to F90 has started, but further effort will be needed to complete the conversion and get the code fully operational. Additional effort is also needed for the parallelisation to be completed, tested and written up.
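To illustrate the kind of decomposition such a parallelisation involves, the sketch below performs a one-dimensional slab decomposition with halo exchange between neighbouring processes. It is written in Python with mpi4py purely for brevity (EBL itself is a Fortran code using MPI), and the grid sizes and variable names are hypothetical rather than taken from the code.

# Minimal sketch of a 1-D slab decomposition with halo exchange, of the kind
# needed when parallelising a structured-grid DNS code. Illustrative only:
# the grid sizes and names are invented for this example.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nx_global, ny = 64, 32            # assumed global grid size
nx_local = nx_global // size      # interior planes owned by this rank
                                  # (assumes nx_global is divisible by size)

# Local array with one halo (ghost) plane on each side in x.
u = np.zeros((nx_local + 2, ny))
u[1:-1, :] = rank                 # fill the interior with something recognisable

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Send the first interior plane to the left neighbour while receiving the
# right halo plane from the right neighbour, then the reverse direction.
comm.Sendrecv(sendbuf=u[1, :], dest=left, sendtag=0,
              recvbuf=u[-1, :], source=right, recvtag=0)
comm.Sendrecv(sendbuf=u[-2, :], dest=right, sendtag=1,
              recvbuf=u[0, :], source=left, recvtag=1)
# After the exchange, each rank's halo planes hold its neighbours' data and
# finite-difference stencils can be applied to the interior points.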

SBLI: General maintenance of the code, for a much broader user base, and its extension to multiblock will be handled at Daresbury, along with porting and optimising the code for HPCx.
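As a rough indication of the bookkeeping a multiblock extension requires, the sketch below defines a minimal block and face-connectivity structure: each block carries its own structured grid plus a list of the faces along which it must exchange data with neighbouring blocks. The names and layout are assumptions for illustration and are not taken from SBLI.

# Hypothetical multiblock bookkeeping: blocks with their own grid sizes and
# a list of face-to-face connections along which halo data is exchanged.
from dataclasses import dataclass, field

@dataclass
class FaceConnection:
    neighbour_block: int   # index of the adjoining block
    my_face: str           # e.g. "east"
    neighbour_face: str    # matching face on the neighbour, e.g. "west"

@dataclass
class Block:
    nx: int
    ny: int
    nz: int
    connections: list[FaceConnection] = field(default_factory=list)

# Two blocks joined east-to-west, as in a simple two-block channel geometry.
blocks = [Block(64, 64, 32), Block(64, 64, 32)]
blocks[0].connections.append(FaceConnection(1, "east", "west"))
blocks[1].connections.append(FaceConnection(0, "west", "east"))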

NEWT: This code presents difficult issues of partitioning and load balancing. Under the consortium, work on porting and optimisation is planned.
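To make the load-balancing issue concrete, the toy example below assigns mesh cells with unequal work estimates to processors using a simple greedy scheme. A production unstructured-mesh code would instead use a graph partitioner that also minimises inter-processor communication; this sketch is purely illustrative and does not describe how NEWT is partitioned.

# Toy illustration of load balancing: assign cells with unequal work
# estimates to processors so that the heaviest processor is as light as
# possible. Connectivity (and hence communication cost) is ignored here.
import heapq

def greedy_partition(cell_weights, nprocs):
    """Assign each cell, heaviest first, to the currently lightest processor."""
    heap = [(0.0, p) for p in range(nprocs)]        # (current load, processor id)
    heapq.heapify(heap)
    owner = [None] * len(cell_weights)
    order = sorted(range(len(cell_weights)), key=lambda c: -cell_weights[c])
    for c in order:
        load, p = heapq.heappop(heap)
        owner[c] = p
        heapq.heappush(heap, (load + cell_weights[c], p))
    return owner

weights = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]  # invented per-cell work estimates
print(greedy_partition(weights, 3))   # processor loads come out as 11, 10 and 10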

Other codes require specific assistance at different points in their development. We are allowing extra time to deal with the optimisation and porting issues associated with the complex-geometry codes, (i) the spectral element code and (ii) CgLes, together with optimisation of other established codes. Examples include code modifications to use machine-specific FFTs and optimisation of tridiagonal matrix solvers; both are generic across several codes.
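As an indication of what the tridiagonal work targets, the sketch below implements the standard serial Thomas algorithm, the solve performed along every grid line by compact finite-difference and implicit time-stepping schemes. It is a textbook version shown for illustration only and is not taken from any consortium code.

# Thomas algorithm for a tridiagonal system A x = d. Optimisation work on
# real codes concentrates on vectorising and batching many such solves.
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-diagonal (a[0] unused),
    b = main diagonal, c = super-diagonal (c[-1] unused), d = right-hand side."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve for a small 1-D Poisson-like system.
n = 5
a = np.full(n, -1.0)
b = np.full(n, 2.0)
c = np.full(n, -1.0)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))   # True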

An area of increasing importance, in the context of GRID and e-science developments, is the exploitation of data arising from large calculations. Large datasets are currently transferred between Manchester, Southampton and Imperial via remote logins and shared F90 code. Over the next few years we expect the development of more user-friendly front ends and faster access to data. Support from Daresbury has therefore been requested in years 2 and 3 to create (and curate) a database archive of simulation data.
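By way of illustration, the sketch below writes a simulation field together with descriptive metadata to a self-describing file, the kind of record a curated archive could hold so that remote users do not need the original reader code. HDF5 via h5py, and the field names and parameters used here, are assumptions made for the example; the source does not specify which formats or tools the consortium will adopt.

# Sketch of archiving a simulation field in a self-describing format.
# The file name, dataset name and attributes are invented for illustration.
import numpy as np
import h5py

# Hypothetical field: streamwise velocity on a small grid.
u = np.random.default_rng(0).standard_normal((64, 48, 32))

with h5py.File("channel_snapshot.h5", "w") as f:
    dset = f.create_dataset("u", data=u, compression="gzip")
    # Metadata that lets a remote user interpret the data without the
    # original F90 reader: a description, a flow parameter and the grid size.
    dset.attrs["description"] = "streamwise velocity, example snapshot"
    dset.attrs["Re_tau"] = 180.0
    dset.attrs["grid"] = u.shape

# Reading elsewhere needs only the library, not the code that wrote the data.
with h5py.File("channel_snapshot.h5", "r") as f:
    print(f["u"].attrs["Re_tau"], f["u"].shape)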

UKTC Members

The permanent staff involved in the scientific and technical work of the UKTC span eight institutions. The consortium is led by Professor Neil Sandham at Southampton University, and the grant (GR/R64964/01) started on 1 October 2001. The grant provides funding for supercomputing access, particularly on HPCx, and for 18 months of support at Daresbury Laboratory.

Professor Neil Sandham, Southampton University (Principal Investigator)
Dr. T. Craft, UMIST
Dr. Kai Luo, QMW
Dr. J. F. Morrison, Imperial College
Dr. Mark Savill, Cambridge University
Dr. Spencer Sherwin, Imperial College
Dr. J. C. Vassilicos, Cambridge University
Dr. John J. R. Williams, QMW
Professor Peter Voke, University of Surrey
Dr. E. J. Avital, QMW
Dr. Stewart Cant, Cambridge University
Dr. Gary N. Coleman, Southampton University
Dr. F. Nicolleau, University of Sheffield
Dr. David R. Emerson, Daresbury Laboratory
Dr. Roderick Johnstone, Daresbury Laboratory

Scientific Applications

A new code for compressible flow with shock waves has been developed at Southampton, using new techniques of entropy splitting and consistent boundary formulations. With assistance from Daresbury Laboratory, this code has been parallelised and optimised for the Cray T3E and will be ported to HPCx. Problems involving shock/boundary-layer interaction, compressible channel flow, transitional separation bubbles and jet aeroacoustics will be investigated; the work on aeroacoustics is being undertaken by Dr. Zhewei Hu.

Publications relating to the UK Turbulence Consortium

Publications relating to support work:

The Communication Performance of the Cray T3D and its Effect on Iterative Solvers
Y. F. Hu, D. R. Emerson and R. J. Blake
Parallel Computing 22 (1996), pp. 829-844.

An Optimal Migration Algorithm for Dynamic Load Balancing
Y. F. Hu, R. J. Blake and D. R. Emerson
Concurrency: Practice and Experience 10 (1998), pp. 467-483.

An Evaluation of Cost Effective Parallel Computers for CFD
D. R. Emerson, K. Maguire, K. Takeda and D. Nicole
in Proc. Parallel CFD 1998, pp. 223-230.

LES on Parallel Computers
L. Temmerman, M. Leschziner, M. Ashworth and D. R. Emerson
in Proc. Parallel CFD 2000.

Massively Parallel Simulation of Shock/Boundary Layer Interactions
Y. F. Yao, A. Laval, N. D. Sandham, I. Wolton, M. Ashworth and D. R. Emerson
ACFD 2000.

Parallel DNS Using a Turbulent Channel Flow Benchmark
M. Ashworth, D. R. Emerson, Y. F. Yao, N. D. Sandham and Q. Li
in Proc. ECCOMAS 2001.

Contact Details

Dr. David Emerson
Daresbury Laboratory
Daresbury
Warrington WA4 4AD
England
Tel. +44 (0)1925 603221
Fax. +44 (0)1925 603634
Email: D.R.Emerson@dl.ac.uk
Web Page: Computational Engineering Group's Home Page
