High Performance Computing: Where We Are Today and A Look Into The Future

Jack Dongarra, University of Tennessee, Oak Ridge National Laboratory, and University of Manchester


In this talk I examine how high-performance computing has changed over the last 10 years and what the likely future trends will be. These changes have had, and will continue to have, a major impact on our numerical scientific software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide-area) dynamic, distributed, and parallel environments. Some of the software and algorithm challenges have already been met, such as the management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder.


Jack Dongarra


Biographical Sketch:

Jack Dongarra, recipient this year of the prestigious A. M. Turing Award from the Association for Computing Machinery (ACM), holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, Knoxville (UTK), as well as the title of Distinguished Research Staff in the Computer Science and Mathematics Division at Oak Ridge National Laboratory. He is a Turing Fellow at the University of Manchester and an Adjunct Professor in the Computer Science Department at Rice University. He is the director of both UTK’s Innovative Computing Laboratory and UTK’s Center for Information Technology Research, which coordinates and facilitates information technology research efforts at the university. He received a B.S. degree in mathematics from Chicago State University in 1972, an M.S. degree in computer science from the Illinois Institute of Technology in 1973, and a Ph.D. in applied mathematics from the University of New Mexico in 1980. From 1980 to 1989, he worked at Argonne National Laboratory, where he became a senior scientist.

Jack specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, and programming methodology and tools for parallel computers. In addition to the 2022 Turing Award, he has received the Institute of Electrical and Electronics Engineers (IEEE) Sid Fernbach Award (2004), the first IEEE Medal of Excellence in Scalable Computing (2008), the first Society for Industrial and Applied Mathematics (SIAM) Special Interest Group on Supercomputing Award for Career Achievement (2010), the IEEE Charles Babbage Award (2011), the ACM/IEEE Ken Kennedy Award (2013), the ACM/SIAM Computational Science and Engineering Prize (2019), and the IEEE-CS Computer Pioneer Award (2020). He received the ACM A.M. Turing Award “for pioneering contributions to numerical algorithms and software that have driven decades of extraordinary progress in computing performance and applications.” He is a Fellow of the American Association for the Advancement of Science, ACM, IEEE, and SIAM, a foreign member of the British Royal Society, and a member of the National Academy of Engineering.