MLT Workshop: Learning in Deep Networks


Details
Machine Learning Tokyo is organizing a new deep learning workshop, Part II of our Deep Learning series. This time we will look at the learning process in deep networks, with short, interactive implementation blocks.
This workshop is free of charge and limited to 60 seats.
- - SCHEDULE - -
11:00 Doors open
11:30 Welcome note
11:30 ~ 12:20 (45 min + 5 min Q/A)
- Intro to Learning in Deep Nets
- Learning in ML
- Learning in the Brain: from dendrites to axons
- Learning in Deep Networks: from input to output
- Input, weight, bias, and output relations
- Some concepts: prediction, probability, regression, and logistic classification
- Representation learning: supervised, unsupervised
- Meta learning: learning to learn
12:20 ~ 12:50 (30 min, interactive)
- Activation Functions
- Activate what and why?
- Tanh (implement)
- Sigmoid (implement)
- Softmax (implement)
- ReLU (implement)
- Leaky ReLU (implement)
- Design custom activation (time dependent)
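The activation functions listed above can be sketched in a few lines of NumPy. This is a minimal sketch, not the workshop's reference implementation; the function names are our own:

```python
import numpy as np

def tanh(x):
    # Hyperbolic tangent: squashes inputs to (-1, 1)
    return np.tanh(x)

def sigmoid(x):
    # Logistic sigmoid: squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()

def relu(x):
    # Rectified linear unit: zero for negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)
```

A quick sanity check: `softmax` outputs always sum to 1, and `sigmoid(0)` is exactly 0.5.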
12:50 ~ 13:30 (40 min)
- Lunch Break
13:30 ~ 13:50 (15 min + 5 min Q/A)
- Regularization
- Batchnorm (instance norm, …)
- Dropout
- l1/l2
- Weight decay
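Two of the regularizers above, dropout and the L2 penalty, fit in a few lines of NumPy. A minimal sketch under our own naming, not the workshop's reference code:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    # Inverted dropout: zero each unit with probability p at training
    # time and rescale by 1/(1-p) so the expected activation is unchanged.
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def l2_penalty(weights, lam=1e-4):
    # L2 regularization term added to the loss: lam * ||w||^2
    return lam * np.sum(weights ** 2)
```

At inference time dropout is a no-op; the rescaling during training is what makes that work.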
13:50 ~ 14:40 (50 min, interactive)
- Losses (Objective functions)
- Distance metrics analogy
- Why is the loss important?
- MSE (l2) (implement)
- L1 (implement)
- (categorical/binary) CrossEntropy (implement)
- Other losses (application- and data-based loss selection)
- GAN’s loss (learned loss)
- Design custom loss (time dependent)
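The three losses marked "(implement)" above can be sketched in NumPy as follows. A minimal sketch with our own function names, not the workshop's reference code:

```python
import numpy as np

def mse_loss(y_pred, y_true):
    # Mean squared error (L2 loss)
    return np.mean((y_pred - y_true) ** 2)

def l1_loss(y_pred, y_true):
    # Mean absolute error (L1 loss)
    return np.mean(np.abs(y_pred - y_true))

def binary_cross_entropy(y_pred, y_true, eps=1e-12):
    # Binary cross-entropy; clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_pred, y_true, eps=1e-12):
    # Categorical cross-entropy for one-hot targets
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))
```

All of them go to zero as predictions approach the targets, which is the property gradient descent exploits.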
14:40 ~ 14:50 (10 min)
Break
14:50 ~ 15:40 (50 min, interactive)
- Optimization (Backprop)
- What is weight updating?
- Gradient descent
- Global-local minima
- SGD, Adam, AdaGrad, etc.
- Backprop in pooling (implement)
- Is it really learning? (underfitting, overfitting)
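The weight-update rule and the pooling-backprop exercise above might look roughly like this in NumPy. A sketch under our own simplifications (a scalar objective f(w) = (w - 3)^2 and a 1-D max pool), not the workshop's reference code:

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Vanilla gradient descent update: move against the gradient
    return w - lr * grad

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# Repeated updates converge toward the global minimum at w = 3.
w = 0.0
for _ in range(100):
    w = sgd_step(w, 2.0 * (w - 3.0), lr=0.1)

def maxpool_backward(x, grad_out):
    # Backprop through a max pool over one window: the incoming gradient
    # is routed entirely to the position that held the maximum.
    grad_in = np.zeros_like(x)
    grad_in[np.argmax(x)] = grad_out
    return grad_in
```

Adam and AdaGrad replace the fixed learning rate with per-parameter, history-dependent step sizes, but the core idea is the same update loop.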
15:40 ~ 16:15 (30 min + 5 min Q/A, interactive)
- Metrics for evaluation (simple implementations)
- Accuracy
- TP, FP, TN, FN
- Precision & Recall (F1-score)
- Specificity & Sensitivity
- IoU
- mAP
- Design custom metric (time dependent)
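The core metrics above reduce to the four confusion-matrix counts, plus box overlap for IoU. A minimal sketch with our own function names (no zero-division guards, for brevity), not the workshop's reference code:

```python
import numpy as np

def confusion_counts(y_pred, y_true):
    # TP, FP, TN, FN for binary labels (0/1)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return tp, fp, tn, fn

def precision_recall_f1(y_pred, y_true):
    # Precision = TP/(TP+FP), Recall = TP/(TP+FN),
    # F1 = harmonic mean of the two
    tp, fp, tn, fn = confusion_counts(y_pred, y_true)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def iou(box_a, box_b):
    # Intersection over Union for axis-aligned boxes (x1, y1, x2, y2)
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

Specificity is TN/(TN+FP) from the same counts, and mAP averages precision over recall thresholds and classes on top of IoU matching.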
- - INSTRUCTORS - -
Alisher Abdulkhaev, Machine Learning Engineer @PKSHA Technology
Mustafa Yagmur, Machine Learning Engineer and PhD candidate at the University of Tokyo
Dimitris Katsios, Machine Learning Engineer @LPixel
- - INSTRUCTORS - -
- - REQUIREMENTS - -
Intermediate Python and Machine Learning
Basic Deep Learning (intuition, implementation)
- - REQUIREMENTS - -
### THIS MEETUP IS FOR ML ENGINEERS AND RESEARCHERS. NO RECRUITERS, THANK YOU FOR YOUR UNDERSTANDING ###
- - ML ENGINEERS AND RESEARCHERS - -
Find workshop materials on the MLT GitHub: https://github.com/Machine-Learning-Tokyo
[Discourse] http://discuss.mltokyo.ai
[Slack] https://goo.gl/WnbYUP
[FB] https://www.facebook.com/machinelearningtokyo
- - ML ENGINEERS AND RESEARCHERS - -
#####################################################
A big THANK YOU to Cookpad for providing the venue for this workshop! Cookpad is a tech company building a community platform for people to share recipe ideas and cooking tips. https://cookpad.com/
#####################################################
