UTAustinX: LAFF-On Programming for High Performance
Learn to squeeze high performance out of modern CPUs.
About this course
Is my code fast? Can it be faster? Scientific computing, machine learning, and data science are about solving problems that are compute-intensive. Choosing the right algorithm, extracting parallelism at various levels, and amortizing the cost of data movement are vital to achieving scalable speedup and high performance.
In this course, the simple but important example of matrix-matrix multiplication is used to illustrate fundamental techniques for attaining high performance on modern CPUs. A carefully designed and scaffolded sequence of exercises leads the learner from a naive implementation to one that effectively exploits instruction-level parallelism and culminates in a high-performance multithreaded implementation. Along the way, the learner discovers that careful attention to data movement is key to efficient computing.
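For concreteness, below is a minimal sketch of the kind of naive starting point such exercises typically build on: a triple loop computing C := C + A * B over column-major matrices. The function name, macros, and leading-dimension parameters here are illustrative, not the course's actual skeleton code.

    #include <stdio.h>

    /* Naive matrix-matrix multiplication C := C + A * B, where A is m x k,
       B is k x n, and C is m x n, all stored in column-major order with
       leading dimensions ldA, ldB, and ldC. Illustrative sketch only. */
    #define A(i, j) a[(j) * ldA + (i)]
    #define B(i, j) b[(j) * ldB + (i)]
    #define C(i, j) c[(j) * ldC + (i)]

    void gemm_naive(int m, int n, int k,
                    const double *a, int ldA,
                    const double *b, int ldB,
                    double *c, int ldC)
    {
        for (int j = 0; j < n; j++)          /* columns of C        */
            for (int i = 0; i < m; i++)      /* rows of C           */
                for (int p = 0; p < k; p++)  /* inner (dot) product */
                    C(i, j) += A(i, p) * B(p, j);
    }

    int main(void)
    {
        /* 2 x 2 example, stored column by column:
           A = [1 2; 3 4], B = [5 6; 7 8]                */
        double a[] = {1, 3, 2, 4};
        double b[] = {5, 7, 6, 8};
        double c[] = {0, 0, 0, 0};

        gemm_naive(2, 2, 2, a, 2, b, 2, c, 2);
        printf("C = [%g %g; %g %g]\n",
               c[0], c[2], c[1], c[3]);  /* expected: [19 22; 43 50] */
        return 0;
    }

An implementation like this leaves most of a modern CPU's performance on the table; reordering loops, blocking for caches, vectorizing, and threading are the kinds of refinements a course like this explores.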
Prerequisites for this course are a basic understanding of matrix computations (roughly equivalent to Weeks 1-5 of Linear Algebra: Foundations to Frontiers on edX) and some exposure to programming. Hands-on exercises start with skeletal code in the C programming language that is progressively modified, so extensive experience with C is not required. Access to a relatively recent x86 processor (such as Intel Haswell or AMD Ryzen, or newer) running Linux is required.
MATLAB Online licenses will be made available to the participants free of charge for the duration of the course.
Join us to satisfy your need for speed!
At a Glance:
Institution: UTAustinX
Subject: Computer Science
Level: Intermediate
Prerequisites:
Exposure to programming and Linux. Basic understanding of matrix-matrix multiplication.
Language: English
Video Transcript: English
Associated skills: Algorithms, Matrix Multiplication, C (Programming Language), Data Science, Linear Algebra, Machine Learning, Scalability, Scientific Computing, x86 Architecture, Linux