This module has been created to support learning around Big Data and its delivery through Hadoop.
Big Data with Hadoop
PROGRAM OVERVIEW
CURRICULUM
Big Data with Hadoop
Hadoop
HDFS Architecture (HDFS, YARN, MapReduce, NameNode, DataNode) / Cloudera
HDFS Commands
HIVE Architecture
Hive Query Language (DDL, DML, joins, map/dictionary types; sample queries follow this list)
HBase (HBase architecture)
PIG
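To give a flavour of the HDFS and Hive topics above, here is a minimal sketch (not official course material) that reads a file from HDFS and runs Hive-style queries through Spark SQL. The file path, table names and column names are illustrative assumptions.

```python
# Minimal sketch: HDFS input + Hive-style DDL/DML run through Spark SQL.
# Paths, tables and columns are assumptions, not part of the course material.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-hdfs-sketch")
         .enableHiveSupport()        # lets spark.sql() run Hive-style DDL/DML
         .getOrCreate())

# Read a CSV file already copied into HDFS
# (e.g. with: hdfs dfs -put transactions.csv /data/)
df = spark.read.csv("hdfs:///data/transactions.csv", header=True, inferSchema=True)
df.createOrReplaceTempView("transactions")

# Hive-style DDL: create a managed table
spark.sql("CREATE TABLE IF NOT EXISTS customers (id INT, name STRING, city STRING)")

# Hive-style DML: a join plus an aggregation
result = spark.sql("""
    SELECT c.city, COUNT(*) AS txn_count, SUM(t.amount) AS total_amount
    FROM transactions t
    JOIN customers c ON t.customer_id = c.id
    GROUP BY c.city
""")
result.show()
```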
Assessments:
Assessment of Hadoop
Assessment of Hive
Assignments
04 assignments, one for each topic
Test Series
01 Full Test
PROJECT & TRAINING
Our live-project offering prepares you for a range of analytics roles in the data science domain. In this course we work on the following project:
- Building a Data Pipeline (Hadoop | Hive | Spark): ingesting BFSI data from an RDBMS, building an aggregation engine in Hive and Spark SQL, and producing the final visualization of the aggregated data in Tableau; a sketch of this pipeline follows.
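As a rough illustration of the pipeline described above, the sketch below ingests data from an RDBMS over JDBC, aggregates it with Spark SQL (the same aggregation could equally be expressed in Hive), and writes the result out so Tableau can pick it up. The JDBC URL, credentials, table and column names are placeholders, not part of the course material, and the appropriate JDBC driver is assumed to be on the classpath.

```python
# Sketch of an RDBMS -> Spark SQL -> file pipeline for BFSI-style data.
# All connection details, tables and columns are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bfsi-pipeline-sketch").getOrCreate()

# 1. Ingest transaction data from the RDBMS over JDBC
txns = (spark.read.format("jdbc")
        .option("url", "jdbc:mysql://rdbms-host:3306/bank")
        .option("dbtable", "transactions")
        .option("user", "etl_user")
        .option("password", "etl_password")
        .load())

# 2. Aggregation engine in Spark SQL (the same query could be run in Hive)
txns.createOrReplaceTempView("transactions")
daily_summary = spark.sql("""
    SELECT branch_id, txn_date, COUNT(*) AS txn_count, SUM(amount) AS total_amount
    FROM transactions
    GROUP BY branch_id, txn_date
""")

# 3. Persist the aggregated data (e.g. as CSV) for visualization in Tableau
daily_summary.coalesce(1).write.mode("overwrite").csv(
    "hdfs:///output/daily_summary", header=True)
```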
SAMPLE CERTIFICATE
USP OF PROGRAM
Curriculum created by industry experts in collaboration with NASSCOM, keeping industry needs in mind
State-of-the-art infrastructure and fully equipped labs
NASSCOM SSC official Study Material.
Training delivered by certified and experienced trainers and industry experts.
Placement Assistance.
Interview Preparation.
Certificate exam conducted by Emerging India.
FAQ
What is this program about?
This program trains you to handle Big Data using Hadoop, a distributed processing framework.
What is the course duration?
The course duration is approximately 58 hours.
Which topics and tools will be covered in this program?
The program covers key aspects of Hadoop, including an architecture overview, Spark, Hive and related tools.
How does Hadoop help in solving big data business problems?
Hadoop uses the MapReduce concept on a distributed framework and, coupled with platforms like Spark, provides fast computing solutions in the Big Data domain, as illustrated by the word-count sketch below.
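As a rough illustration of the map and reduce steps, here is the classic word count, expressed through Spark's RDD API rather than a raw Hadoop MapReduce job. The HDFS input path is an assumption.

```python
# Word-count sketch: the map and reduce phases expressed with Spark RDDs.
# The input path is an assumption; any text file in HDFS would do.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
lines = spark.sparkContext.textFile("hdfs:///data/sample.txt")

counts = (lines.flatMap(lambda line: line.split())   # map: split lines into words
               .map(lambda word: (word, 1))          # map: emit (word, 1) pairs
               .reduceByKey(lambda a, b: a + b))     # reduce: sum counts per word

for word, count in counts.take(10):
    print(word, count)
```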