TOPICS

About The Course
The Big Data and Hadoop training course from LearnChase is designed to enhance your knowledge and skills so that you can become a successful Hadoop developer. The course covers the core concepts in depth, along with hands-on implementation on varied industry use cases.
Course Objectives
By the end of the course, you will:
1. Master the concepts of HDFS and MapReduce framework
2. Understand Hadoop 2.x Architecture
3. Setup Hadoop Cluster and write Complex MapReduce programs
4. Learn data loading techniques using Sqoop and Flume
5. Perform data analytics using Pig, Hive and YARN
6. Implement HBase and MapReduce integration
7. Implement Advanced Usage and Indexing
8. Schedule jobs using Oozie
9. Implement best practices for Hadoop development
10. Work on a real-life project on Big Data Analytics
Who should go for this course?
Today, Hadoop has become a cornerstone skill for every business technology professional. To stay ahead of the game, Hadoop is a must-know technology for the following professionals:
1. Analytics professionals
2. BI /ETL/DW professionals
3. Project managers
4. Testing professionals
5. Mainframe professionals
6. Software developers and architects
7. Graduates aiming to build a successful career around Big Data
Why learn Big Data and Hadoop?
CIOs are making Hadoop their platform of choice in 2015. For better career prospects, bigger job opportunities and financial growth, Hadoop is a must-know.

What are the pre-requisites for this Course?
You can master Hadoop, irrespective of your IT background. While basic knowledge of Core Java and SQL might help, it is not a pre-requisite for learning Hadoop.
In case you wish to brush up your Java skills, LearnChase offers you a complimentary self-paced course: “Java essentials for Hadoop”.
How will I execute the Practicals?
For the practicals, we will help you set up LearnChase’s Virtual Machine on your system with local access. Detailed installation guides for setting up your environment are provided in the LMS. In case your system doesn’t meet the prerequisites (e.g. 4 GB RAM), you will be provided remote access to the LearnChase cluster. In case you experience any issues, our 24x7 support team will be happy to assist you.
Which Case-Studies will be a part of the Course?
Towards the end of the course, you will be working on a live project where you will use Pig, Hive, HBase and MapReduce to perform Big Data analytics.
Here are a few industry-specific Big Data case studies (e.g. Finance, Retail, Media, Aviation) which you can consider for your project work:

Project #1: Analyze social bookmarking sites to find insights
Industry: Social Media
Data: The data comprises information gathered from social bookmarking sites like reddit.com and stumbleupon.com, which allow you to bookmark, review, rate, and search links on any topic. The data is in XML format and contains each link/post URL, the categories defining it, and the ratings linked with it.
Problem Statement: Analyze the data in the Hadoop ecosystem to:
1. Fetch the data into the Hadoop Distributed File System (HDFS) and analyze it with the help of MapReduce, Pig and Hive to find the top-rated links based on the user comments, likes, etc.
2. Using MapReduce, convert the semi-structured XML data into a structured format and categorize the user rating as positive or negative for each of the thousands of links (a minimal mapper sketch follows this list).
3. Push the output to HDFS and then feed it into Pig, which splits the data into two parts: category data and ratings data.
4. Write a Hive query to analyze the data further and push the output into a relational database (RDBMS) using Sqoop.
5. Use a web server running on Grails/Java/Ruby/Python to render the results on a website in real time.
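As an illustration of step 2 above, here is a minimal sketch of a mapper that turns one XML record per input line into a structured (category, rating) pair. The class name and the "category" and "rating" attribute names are hypothetical; the actual dataset's schema may differ, and a production job would use a proper XML input format rather than naive string matching.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class LinkRatingMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            // Naive attribute extraction from one XML record per line.
            String category = extract(line, "category");
            String rating = extract(line, "rating");
            if (category != null && rating != null) {
                // Emit a structured (category, rating) pair; a reducer (not shown)
                // can then label each rating as positive or negative.
                context.write(new Text(category), new Text(rating));
            }
        }

        private String extract(String line, String attr) {
            int start = line.indexOf(attr + "=\"");
            if (start < 0) return null;
            start += attr.length() + 2;
            int end = line.indexOf('"', start);
            return end < 0 ? null : line.substring(start, end);
        }
    }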

Project #2: Customer Complaints Analysis
Industry: Retail
Data: Publicly available dataset containing a few hundred thousand observations with attributes such as CustomerId, Payment Mode, Product Details, Complaint, Location, Status of the complaint, etc.
Problem Statement: Analyze the data in the Hadoop ecosystem to:
1. Get the number of complaints filed under each product
2. Get the total number of complaints filed from a particular location
3. Get the list of complaints, grouped by location, that did not receive a timely response

Project #3: Tourism Data Analysis
Industry: Tourism
Data: The dataset comprises attributes like: city pair (origin and destination combination), number of adults traveling, number of seniors traveling, number of children traveling, air booking price, car booking price, etc.
Problem Statement: Find the following insights from the data:
1. Top 20 destinations people frequently travel to, based on the number of trips booked for each destination
2. Top 20 locations from which most of the trips start, based on the booked trip count
3. Top 20 high air-revenue destinations, i.e. the 20 cities that generate the highest airline revenue, so that discount offers can be given to attract more bookings for these destinations

Project #4: Airline Data Analysis
Industry: Aviation
Data: Publicly available dataset which contains flight and airport details of various airlines, such as: airport ID, name of the airport, main city served by the airport, country or territory where the airport is located, airport code, decimal degrees (latitude/longitude), hours offset from UTC, timezone, etc.
Problem Statement: Analyze the airlines’ data to:
1. Find the list of airports operating in a given country
2. Find the list of airlines having zero stops
3. Find the list of airlines operating with code shares
4. Find which country (or territory) has the highest number of airports
5. Find the list of active airlines in the United States

Project #5: Analyze Loan Dataset
Industry: Banking and Finance
Data: Publicly available dataset which contains complete details of all the loans issued, including the current loan status (Current, Late, Fully Paid, etc.) and latest payment information.
Problem Statement: Find the number of loan cases per location, categorize the count by the reason for taking the loan, and display the average risk score.

Project #6: Analyze Movie Ratings
Industry: Media
Data: Publicly available data from sites like Rotten Tomatoes, IMDb, etc.
Problem Statement: Analyze the movie ratings by different users to:
1. Get the user who has rated the most number of movies
2. Get the user who has rated the least number of movies
3. Get the total number of movies rated by users belonging to a specific occupation
4. Get the number of underage users

Project #7: Analyze YouTube data
Industry: Social Media
Data: The dataset describes YouTube videos and contains attributes such as: VideoID, Uploader, Age, Category, Length, Views, Ratings, Comments, etc.
Problem Statement: Identify the top 5 categories in which the most number of videos are uploaded, the top 10 rated videos, and the top 10 most viewed videos.
Apart from these, there are some twenty more use cases to choose from, including:
Market Data Analysis
Twitter Data Analysis

1. Understanding Big Data and Hadoop
Learning Objectives – In this module, you will understand Big Data, the limitations of existing solutions for the Big Data problem, how Hadoop solves the Big Data problem, the common Hadoop ecosystem components, Hadoop Architecture, HDFS, Anatomy of File Write and Read, and Rack Awareness.

Topics – Big Data, Limitations and Solutions of existing Data Analytics Architecture, Hadoop, Hadoop Features, Hadoop Ecosystem, Hadoop 2.x core components, Hadoop Storage: HDFS, Hadoop Processing: MapReduce Framework, Anatomy of File Write and Read, Rack Awareness.
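As a small illustration of the "Anatomy of File Write and Read" topic, here is a minimal sketch that writes and then reads one HDFS file through the standard org.apache.hadoop.fs API. The file path is a placeholder, and the client picks up the cluster address from the configuration files (core-site.xml / hdfs-site.xml) on the classpath.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();        // reads core-site.xml / hdfs-site.xml
            FileSystem fs = FileSystem.get(conf);            // the client talks to the NameNode
            Path path = new Path("/user/learner/demo.txt");  // placeholder path

            // Write: the client asks the NameNode where to place blocks, then
            // streams the data to DataNodes through a write pipeline.
            try (FSDataOutputStream out = fs.create(path, true)) {
                out.writeBytes("hello hdfs\n");
            }

            // Read: the client fetches block locations from the NameNode and reads
            // the blocks directly from the nearest DataNodes (rack awareness).
            try (BufferedReader in = new BufferedReader(new InputStreamReader(fs.open(path)))) {
                System.out.println(in.readLine());
            }
        }
    }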

2. Hadoop Architecture and HDFS
Learning Objectives – In this module, you will learn the Hadoop Cluster Architecture, Important Configuration files in a Hadoop Cluster, Data Loading Techniques.

Topics – Hadoop 2.x Cluster Architecture – Federation and High Availability, A Typical Production Hadoop Cluster, Hadoop Cluster Modes, Common Hadoop Shell Commands, Hadoop 2.x Configuration Files, Password-Less SSH, MapReduce Job Execution, Data Loading Techniques: Hadoop Copy Commands, Flume, Sqoop.
3. Hadoop MapReduce Framework – I
Learning Objectives – In this module, you will understand Hadoop MapReduce framework and the working of MapReduce on data stored in HDFS. You will learn about YARN concepts in MapReduce.

Topics – MapReduce Use Cases, Traditional way Vs MapReduce way, Why MapReduce, Hadoop 2.x MapReduce Architecture, Hadoop 2.x MapReduce Components, YARN MR Application Execution Flow, YARN Workflow, Anatomy of MapReduce Program, Demo on MapReduce.
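As a minimal illustration of the "Anatomy of MapReduce Program" topic, here is a sketch of the classic word-count job with its mapper, reducer and driver. The class names and input/output paths are placeholders; the job would be packaged into a jar and submitted with the standard hadoop jar command.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {
        // Mapper: emits (word, 1) for every token in the input line.
        public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) { word.set(token); ctx.write(word, ONE); }
                }
            }
        }

        // Reducer: sums the counts for each word.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) sum += v.get();
                ctx.write(key, new IntWritable(sum));
            }
        }

        // Driver: configures and submits the job to YARN.
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenMapper.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
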
4. Hadoop MapReduce Framework – II
Learning Objectives – In this module, you will understand concepts like Input Splits in MapReduce, Combiner & Partitioner and Demos on MapReduce using different data sets.

Topics – Input Splits, Relation between Input Splits and HDFS Blocks, MapReduce Job Submission Flow, Demo of Input Splits, MapReduce: Combiner & Partitioner, Demo on de-identifying Health Care Data set, Demo on Weather Data set.
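As a small illustration of the Combiner & Partitioner topic, here is a minimal sketch of a custom partitioner that routes records to reducers by the first letter of the key; the class name and the routing rule are hypothetical. The driver would register it with job.setPartitionerClass(FirstLetterPartitioner.class) and could additionally set a combiner with job.setCombinerClass(...).

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Partitioner;

    public class FirstLetterPartitioner extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            if (key.getLength() == 0) return 0;
            char first = Character.toLowerCase(key.toString().charAt(0));
            // Keys starting with a-m go to one reducer group, the rest to the other.
            return (first <= 'm' ? 0 : 1) % numPartitions;
        }
    }
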
5. Advanced MapReduce
Learning Objectives – In this module, you will learn Advanced MapReduce concepts such as Counters, Distributed Cache, MRUnit, Reduce Join, Custom Input Format and Sequence Input Format, and how to deal with complex MapReduce programs.

Topics – Counters, Distributed Cache, MRUnit, Reduce Join, Custom Input Format, Sequence Input Format.
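As a small illustration of the Counters topic, here is a minimal mapper sketch that tracks valid and malformed records with a user-defined counter enum instead of failing the job; the record format and counter names are hypothetical.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CountingMapper extends Mapper<LongWritable, Text, Text, Text> {
        enum RecordQuality { VALID, MALFORMED }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length < 2) {
                // Counters are aggregated by the framework and shown in the job report.
                context.getCounter(RecordQuality.MALFORMED).increment(1);
                return;
            }
            context.getCounter(RecordQuality.VALID).increment(1);
            context.write(new Text(fields[0]), new Text(fields[1]));
        }
    }
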
6. Pig
Learning Objectives – In this module, you will learn Pig, the types of use cases where Pig can be used, the tight coupling between Pig and MapReduce, and Pig Latin scripting.

Topics – About Pig, MapReduce Vs Pig, Pig Use Cases, Programming Structure in Pig, Pig Running Modes, Pig components, Pig Execution, Pig Latin Program, Data Models in Pig, Pig Data Types.
Pig Latin: Relational Operators, File Loaders, Group Operator, COGROUP Operator, Joins and COGROUP, Union, Diagnostic Operators, Pig UDF, Pig Demo on Healthcare Data set.
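As a small illustration of the Pig UDF topic, here is a minimal sketch of an eval function written in Java; the class name is hypothetical. After packaging it into a jar and loading it with REGISTER, it can be invoked from a Pig Latin script just like a built-in function.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    public class ToUpperUdf extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            // Return null for empty or null input, as Pig built-ins do.
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            // Assumes the first field is a chararray (String).
            return ((String) input.get(0)).toUpperCase();
        }
    }
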
7. Hive
Learning Objectives – This module will help you in understanding Hive concepts, Loading and Querying Data in Hive and Hive UDF.

Topics – Hive Background, Hive Use Case, About Hive, Hive Vs Pig, Hive Architecture and Components, Metastore in Hive, Limitations of Hive, Comparison with Traditional Database, Hive Data Types and Data Models, Partitions and Buckets, Hive Tables (Managed Tables and External Tables), Importing Data, Querying Data, Managing Outputs, Hive Script, Hive UDF, Hive Demo on Healthcare Data set.
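As a small illustration of the Hive UDF topic, here is a minimal sketch of a simple UDF written in Java in the classic evaluate() style; the class name is hypothetical. After ADD JAR and CREATE TEMPORARY FUNCTION, it can be called from HiveQL like a built-in function.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class ToLowerUdf extends UDF {
        // Hive calls evaluate() once per row; null in gives null out.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().toLowerCase());
        }
    }
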
8. Advanced Hive and HBase
Learning Objectives – In this module, you will understand Advanced Hive concepts such as UDF, Dynamic Partitioning. You will also acquire in-depth knowledge of HBase, HBase Architecture and its components.

Topics – Hive QL: Joining Tables, Dynamic Partitioning, Custom Map/Reduce Scripts, Hive: Thrift Server, User Defined Functions.
HBase: Introduction to NoSQL Databases and HBase, HBase vs RDBMS, HBase Components, HBase Architecture, HBase Cluster Deployment.
9. Advanced HBase
Learning Objectives – This module will cover Advanced HBase concepts. We will see demos on Bulk Loading and Filters. You will also learn what ZooKeeper is all about, how it helps in monitoring a cluster, and why HBase uses ZooKeeper.

Topics – HBase Data Model, HBase Shell, HBase Client API, Data Loading Techniques, ZooKeeper Data Model, ZooKeeper Service, Demos on Bulk Loading, Getting and Inserting Data, Filters in HBase.
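As a small illustration of the HBase Client API topic, here is a minimal sketch that writes and reads a single cell using the HBase Java client. The table name "patients", the column family "info", and the row key are hypothetical, and the sketch assumes the table has already been created.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseClientDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();  // reads hbase-site.xml
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("patients"))) {

                // Insert one row keyed by a patient id.
                Put put = new Put(Bytes.toBytes("patient-001"));
                put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Jane Doe"));
                table.put(put);

                // Read the same row back.
                Result result = table.get(new Get(Bytes.toBytes("patient-001")));
                byte[] name = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
                System.out.println(Bytes.toString(name));
            }
        }
    }
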
10. Oozie and Hadoop Project
Learning Objectives – In this module, you will understand how multiple Hadoop ecosystem components work together in a Hadoop implementation to solve Big Data problems. We will discuss multiple data sets and the specifications of the project. This module will also cover a Flume & Sqoop demo and the Apache Oozie workflow scheduler for Hadoop jobs.

Topics – Flume and Sqoop Demo, Oozie, Oozie Components, Oozie Workflow, Scheduling with Oozie, Demo on Oozie Workflow, Oozie Co-ordinator, Oozie Commands, Oozie Web Console, Hadoop Project Demo.
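As a small illustration of driving Oozie programmatically, here is a minimal sketch that submits a workflow through the Oozie Java client API. The Oozie URL, the HDFS application path, and the property names are placeholders and must match the actual workflow.xml being scheduled.

    import java.util.Properties;
    import org.apache.oozie.client.OozieClient;

    public class OozieSubmit {
        public static void main(String[] args) throws Exception {
            // Placeholder Oozie server URL.
            OozieClient oozie = new OozieClient("http://localhost:11000/oozie");

            Properties conf = oozie.createConfiguration();
            // Placeholder HDFS path to the deployed workflow application.
            conf.setProperty(OozieClient.APP_PATH, "hdfs://localhost:8020/user/learner/my-wf-app");
            conf.setProperty("nameNode", "hdfs://localhost:8020");
            conf.setProperty("jobTracker", "localhost:8032");

            String jobId = oozie.run(conf);  // submit and start the workflow
            System.out.println("Workflow job submitted: " + jobId);
            System.out.println(oozie.getJobInfo(jobId).getStatus());
        }
    }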
