Big Data Processing with Apache Spark

Details
Basic Scala concepts that help in learning Apache Spark - 20 Minutes (a short sketch follows this section)
1) Understand Scala and its implementation - 10 Minutes
2) Apply loops, collections, and operations - 10 Minutes
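A minimal sketch of the loop and collection operations covered above, assuming plain Scala 2 run as a standalone object; the object name and sample list are invented for illustration:

object ScalaBasics {
  def main(args: Array[String]): Unit = {
    val numbers = List(1, 2, 3, 4, 5)

    // A for loop over a collection
    for (n <- numbers) println(s"value: $n")

    // Common collection operations: map, filter, reduce
    val doubled = numbers.map(_ * 2)          // List(2, 4, 6, 8, 10)
    val evens   = numbers.filter(_ % 2 == 0)  // List(2, 4)
    val sum     = numbers.reduce(_ + _)       // 15

    println(s"doubled = $doubled, evens = $evens, sum = $sum")
  }
}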
Apache Spark Basics - 65 Minutes
1 - What is Spark? - 10 min
2 - Why we need Spark and why we prefer it over Hadoop MapReduce - 10 min
3 - What is an RDD and lazy evaluation - 15 min (using a word count example; see the sketch after this list)
4 - Actions and transformations (Scala, Python, Java) - 30 min
5 - Persistence of RDDs and in-memory computation - 20 min
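A minimal sketch of the word count example referenced in items 3-5, assuming Spark's Scala RDD API; the SparkConf settings and the input path "input.txt" are placeholders for illustration. Transformations only build the lineage lazily; the work runs when an action is called, and cache() keeps the result in memory for reuse.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Local master and input path are placeholders for illustration only.
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // Transformations only describe the computation; nothing runs yet (lazy evaluation).
    val counts = sc.textFile("input.txt")
      .flatMap(line => line.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .cache()                               // persist in memory so repeated actions reuse it

    // Actions trigger the actual computation.
    counts.take(10).foreach(println)
    println(s"distinct words: ${counts.count()}")

    sc.stop()
  }
}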
Apache Spark Intermediate Concepts - 35 Minutes
6 - Serialization - 5 min
7 - Configuration - 10 min
8 - Broadcast variables and accumulators - 20 min (see the sketch after this list)
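A minimal sketch of broadcast variables and accumulators, assuming the Spark 2.x Scala API (sc.broadcast and sc.longAccumulator); the lookup table and sample records are invented for illustration:

import org.apache.spark.{SparkConf, SparkContext}

object SharedVariables {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SharedVariables").setMaster("local[*]"))

    // Broadcast: ship a read-only lookup table to every executor once.
    val countryCodes = sc.broadcast(Map("IN" -> "India", "US" -> "United States"))

    // Accumulator: a counter updated on executors and read back on the driver.
    val unknown = sc.longAccumulator("unknown country codes")

    val records  = sc.parallelize(Seq("IN", "US", "XX", "IN"))
    val resolved = records.map { code =>
      countryCodes.value.getOrElse(code, { unknown.add(1L); "unknown" })
    }

    resolved.collect().foreach(println)      // the action triggers the accumulator updates
    println(s"unknown codes seen: ${unknown.value}")

    sc.stop()
  }
}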
Apache Spark Advanced Concepts - 45 Minutes
9 - Spark DataFrames - 20 Minutes (Demo; see the sketch after this list)
10 - Spark Streaming - 10 Minutes (Case Study I: Twitter Analysis)
11 - MLlib - 15 Minutes (Case Study II: Latent Semantic Indexing and Searching)
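A minimal sketch of the DataFrames demo, assuming the Spark 2.x SparkSession API; the column names and rows are invented for illustration. The same query is shown both through the DataFrame operators and as SQL over a temporary view.

import org.apache.spark.sql.SparkSession

object DataFramesDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DataFramesDemo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Build a small DataFrame from an in-memory collection.
    val people = Seq(("Alice", 28), ("Bob", 31), ("Asha", 24)).toDF("name", "age")

    // DataFrame operators compile to an optimized execution plan.
    people.filter($"age" > 25).select($"name").show()

    // The same query expressed in SQL against a temporary view.
    people.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people WHERE age > 25").show()

    spark.stop()
  }
}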
The event is sponsored by EasyCommute (http://www.easycommute.co) and Pramati Technologies (http://www.pramati.com/).
Contact Persons: Jatin: 9701403031
Mayank: 8099927902
