HDP Operations: Install and Manage with Apache Ambari


Details
This Meetup is designed for administrators who will be managing the Hortonworks Data Platform (HDP). It covers installation, configuration, maintenance, security and performance topics.
Target Audience
IT administrators and operators responsible for installing, configuring and supporting an Apache Hadoop 2.0 deployment in a Linux environment.
Meetup Objectives
- Describe various tools and frameworks in the Hadoop 2.0 ecosystem
- Describe the Hadoop Distributed File System (HDFS) architecture
- Install and configure an HDP 2.0 cluster
- Use Ambari to monitor and manage a cluster
- Describe how files are written to and stored in HDFS
- Perform a file system check using command line and browser-based tools
- Configure the replication factor of a file
- Mount HDFS to a local filesystem using the NFS Gateway
- Deploy and configure YARN on a cluster
- Configure and troubleshoot MapReduce jobs
- Describe how YARN jobs are scheduled
- Configure the capacity and fair schedulers of the ResourceManager
- Use WebHDFS to access a cluster over HTTP
- Configure a HiveServer
- Describe how Hive tables are created and populated
- Use Sqoop to transfer data between Hadoop and a relational database
- Use Flume to ingest streaming data into HDFS
- Deploy and run an Oozie workflow
- Commission and decommission worker nodes
- Configure a cluster to be rack-aware
- Implement and configure NameNode HA
- Secure a Hadoop cluster
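Several of the objectives above correspond to one-line HDFS operations. As a minimal sketch, assuming a running HDP 2.x cluster, a NameNode reachable at the hypothetical hostname `namenode`, an example file path of our own choosing, and simple (non-Kerberos) authentication:

```shell
# File system check from the command line (prints block/replication health for /)
hdfs fsck /

# Set the replication factor of a single file to 2; -w waits until replication completes
# (/user/ambari-qa/example.txt is a hypothetical path used for illustration)
hdfs dfs -setrep -w 2 /user/ambari-qa/example.txt

# Read the same file over HTTP via WebHDFS; 50070 is the default HDP 2.x NameNode
# web port, and -L follows the redirect to the DataNode that serves the data
curl -L "http://namenode:50070/webhdfs/v1/user/ambari-qa/example.txt?op=OPEN&user.name=ambari-qa"
```

These commands require a live cluster, so they are shown here as illustrations rather than a runnable script; hostnames, paths, and the user name would all be replaced with values from your own deployment.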

Canceled