Welcome to our continuing ML knowledge dissemination network event. For the first few weeks, I will continue to explain machine learning using intermediate-level mathematics. Later, we will invite prominent scholars to give talks on their latest work.
The current topic is:
"Gradient Descent Research"
Nearly everyone uses the famous gradient descent algorithm, yet there has been much recent research around this basic method. I'll explain its beautiful mathematics in full over a few sessions.
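As a quick refresher before the sessions, plain gradient descent can be sketched as follows. The toy objective, learning rate, and step count here are my own illustrative choices, not anything specific to the research we'll cover:

```python
# A minimal sketch of vanilla gradient descent on a toy objective,
# f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges near the true minimum at x = 3
```

Much of the recent work we'll discuss builds on exactly this update rule, varying how the step size and the gradient estimate are chosen.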
You will need a solid understanding of linear algebra, calculus, probability, and statistics. But if you just want to get a feel for how some of the recent gradient descent research works, or to meet like-minded people, please come too!
All my ML research notes can be found on:
The Zoom details are: