Guest Lectures: Keno Fischer on integrating TPUs with Julia



Title: Using TPUs from Julia
TPUs are the earliest of a new wave of machine-learning
accelerators and have powered many of Google's recent AI
advances, including DeepMind's breakthroughs in reinforcement
learning. Through its XLA compiler interface, Julia code
can run on TPUs and take advantage of this enormous
compute capacity. In this talk, I will introduce TPUs,
discuss how we generate XLA from Julia, and walk through
some of the difficulties and design trade-offs encountered
when targeting this hardware. I will also present some
example programs and show how they map to TPUs.
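
To give a flavor of what "Julia code that maps to TPUs" looks like, here is a minimal sketch of the kind of ordinary, array-oriented Julia function the XLA compilation path targets. This is plain standard Julia (no TPU-specific API is shown, and the exact XLA.jl interface is not assumed here); it runs as-is on the CPU, and code in this style is what can be lowered to XLA.

```julia
# Ordinary Julia code of the sort that can be offloaded to an accelerator:
# a dense layer followed by a ReLU, written with standard broadcasting.
relu(x) = max(x, zero(x))
dense(W, b, x) = relu.(W * x .+ b)

W = randn(Float32, 4, 3)   # weights
b = zeros(Float32, 4)      # bias
x = ones(Float32, 3)       # input vector
y = dense(W, b, x)         # runs on the CPU here; the same code is the
                           # kind of program that can be compiled to XLA
```

The matrix multiply and fused broadcast above correspond directly to XLA operations, which is why idiomatic, generic Julia code is a good fit for this compilation path.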

Keno Fischer

How to get here
The Kiva Conference Room is on the 4th floor of the Dreyfoos Wing in the Stata Center (Building 32). Enter Building 32 at the front entrance and proceed straight ahead; there will be elevators to the right. Take the elevators to the 4th floor; exit to the left and then turn right at the end of the elevator bank. At the end of the short corridor, cross the R&D Dining Room. The Kiva Conference Room is to the right.

If you are visiting from outside MIT, please note that the doors lock promptly at 6pm. We will have someone by the Dreyfoos elevators to help folks find their way up.

About us
C.A.J.U.N. is dedicated to users and developers of the Julia language. Please see our Papercall and submit your talk ideas for future events!