
Details

For this meeting, rounding out 2019, we'll be welcoming back Rachel Lim from the Google Brain team.

Planned Talks:

"TF2.0 performance" - Rachel Lim
This talk will be in the style of "let's flip switches and see if we can train faster", apparently. Rachel is from the Google Brain tf.data team, so she will surely be able to answer questions about that (and TFX) too.
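To give a flavour of the kind of switches involved, here is a minimal tf.data input-pipeline sketch. This is not Rachel's material, just the standard knobs (parallel `map`, `batch`, `prefetch` with `AUTOTUNE`) that a TF 2.0 performance discussion typically starts from:

```python
import tensorflow as tf

# TF 2.0-era location of AUTOTUNE (later moved to tf.data.AUTOTUNE)
AUTOTUNE = tf.data.experimental.AUTOTUNE

ds = (tf.data.Dataset.range(1024)
        .map(lambda x: x * 2, num_parallel_calls=AUTOTUNE)  # parallelise preprocessing
        .batch(32)                                          # batch before the model
        .prefetch(AUTOTUNE))                                # overlap input with training

first_batch = next(iter(ds))
```

Each of these calls is an independent "switch" that can be toggled while watching steps-per-second, which is exactly the experimental style the talk promises.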

"Jax, Trax and stax of new stuff" - Sam Witteveen
While TF 2.0 is now in common usage, there are a number of interesting non-TF projects coming out of Google and DeepMind. This talk will focus on JAX, a high-performance framework for machine learning research, and Trax, a new library created by Google researchers and engineers as the sequel to Tensor2Tensor.
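As a minimal illustration of the JAX style (the loss function here is a made-up example, not taken from the talk): NumPy-like code is composed with function transformations such as `jax.jit` and `jax.grad`.

```python
import jax
import jax.numpy as jnp

@jax.jit
def mse_loss(w, x, y):
    # Plain NumPy-style maths, compiled by XLA via jit
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

# grad differentiates w.r.t. the first argument (w) by default
grad_fn = jax.grad(mse_loss)

x = jnp.ones((8, 3))
y = jnp.zeros(8)
w = jnp.zeros(3)
g = grad_fn(w, x, y)  # gradient has the same shape as w
```

Trax builds higher-level model and training abstractions on top of this same functional core.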

"Porting PyTorch Models to TensorFlow" - Yu Xuan Lee
If you have built a PyTorch model and want to port it to TensorFlow, it's possible to use third-party ONNX interoperability tooling to transfer the model. Following up on his blog posts on TowardsDataScience.com, Yu Xuan Lee will describe the process.

"TensorBoard Lite" - Martin Andrews
TensorBoard is a great tool with ever-expanding scope. But if you just want 'no nonsense' plotting of metrics, a simpler and more direct method can be useful. Martin will talk about the 'tb_lite' utility that he developed recently, which works with raw TensorBoard inputs, and so can display data from both TensorFlow and PyTorch.
