Update: If you are unable to come to the event, you can view the live broadcast (http://www.youtube.com/watch?v=xkPqnnnjO9s) on Google Hangouts.
Are you bored of tutorials repeating the classic WordCount program when you try to learn Hadoop? You keep hearing Big Data, Hadoop and all the latest buzzwords, but ask anyone and they always explain it using the word count program - and you are sick and tired of it.
People say the world is filled with so many problems at Big Data scale; so why isn't there a single person who can show me some real-world examples that don't suck?
Then I talked to Saravanakumar (https://www.linkedin.com/pub/saravanakumar-karunanithi/10/b8b/a5b), and he offered to teach me and a few of us in the office, from the basics, how to write MapReduce programs - in a radically different manner. He literally started from scratch: opened an empty file and wrote each line, explaining how the logic works, including tips on how to use the IDE effectively. And no, he didn't use the word count example. YAY!!
After his tutorial sessions, I suggested he teach this as a course to others who are also looking for a good practical introduction to Hadoop.
After many months of persuasion, here he is, doing his first mini-course on "Practical Map Reduce Programming". In this course you will learn:
• How to think in the MapReduce paradigm.
• How to write MapReduce programs to analyse retail sales data across multiple dimensions.
• How to write more MapReduce code to analyse insurance claims spread across multiple folders/files.
• How to develop & debug your program in Eclipse with small datasets, without having to worry about HDFS. (Running Hadoop and Eclipse together during development sucks memory like a Death Eater.)
• ... and, if time permits, a live Twitter hashtag analysis.
Through all of this, you will see how the code develops, right from a blank slate.
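To give a taste of what "thinking in the MapReduce paradigm" means, here is a tiny pure-Python sketch of the map/shuffle/reduce pipeline, run on toy retail sales data. To be clear, this is my own illustration, not material from the course: the record fields, numbers, and function names are all made up, and the shuffle is simulated in memory rather than by Hadoop.

```python
from collections import defaultdict

# Toy retail sales records: (store, category, amount).
# All field names and values are invented for illustration.
sales = [
    ("chennai", "electronics", 1200.0),
    ("chennai", "grocery", 300.0),
    ("bangalore", "electronics", 800.0),
    ("bangalore", "grocery", 450.0),
    ("chennai", "electronics", 600.0),
]

def mapper(record):
    """Map phase: emit (key, value) pairs - here, (category, amount)."""
    store, category, amount = record
    yield (category, amount)

def reducer(key, values):
    """Reduce phase: combine all values for one key - here, a sum."""
    return (key, sum(values))

def run_job(records):
    """Simulate the shuffle: group mapper output by key, then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return dict(reducer(k, vs) for k, vs in sorted(groups.items()))

print(run_job(sales))
# {'electronics': 2600.0, 'grocery': 750.0}
```

The point of the sketch is the shape of the computation: the mapper only ever sees one record, the reducer only ever sees one key's values, and the framework owns everything in between. Swapping the key in `mapper` to `store` (or to a `(store, category)` pair) gives you a different dimension of the analysis without touching the reducer.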
If you want, bring along your laptop (with Hadoop installed) and follow along. Even if you don't bring your laptop, don't worry: at the end of the session, you will also get access to the complete source code and the sample datasets.
And here’s the best part: Serendio is sponsoring this entire course, so you get all of this for FREE.
So, what's the catch? None.
Serendio does this to build an awesome community of people using the latest technology. So make sure you grab this opportunity ASAP.
All you need to do now is click the big RSVP button on the right, mark your calendar for Saturday, December 13th, and come to Serendio Software Pvt Ltd at 9:30 AM (first come, first seated).
PS: If you have any other interesting datasets/problems to work on, do bring them along and we can try to solve them together.
PPS: We won't cover why or how to set up Hadoop. If you don't know how to install Hadoop or the basic commands to use it, I would suggest learning that before coming to the mini-course. You've got plenty of days.