Disclosure: when you buy through links on our site, we may earn an affiliate commission.

Master Big Data – Apache Spark/Hadoop/Sqoop/Hive/Flume/Mongo

In-depth course on Big Data – Apache Spark, Hadoop, Sqoop, Flume & Apache Hive, MongoDB & Big Data cluster setup
4.5/5 (890 reviews)
8,261 students
Created by Navdeep

CourseMarks Score®: 9.5
Freshness: 9.7
Feedback: 8.6
Content: 9.6

Platform: Udemy
Video: 11h 16m
Language: English
Next start: On Demand


Description

In this course, you will start by learning what the Hadoop Distributed File System (HDFS) is and the most common Hadoop commands required to work with it.
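As a taste of those commands, here is a minimal sketch of common HDFS shell operations (the paths are illustrative, not taken from the course):

```shell
# Create a directory in HDFS and copy a local file into it
hdfs dfs -mkdir -p /user/demo/input
hdfs dfs -put data.csv /user/demo/input
# List and inspect the uploaded file
hdfs dfs -ls /user/demo/input
hdfs dfs -cat /user/demo/input/data.csv
# Remove the directory recursively when done
hdfs dfs -rm -r /user/demo/input
```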

Then you will be introduced to Sqoop import:
•Understand the lifecycle of a sqoop command.
•Use the sqoop import command to migrate data from MySQL to HDFS.
•Use the sqoop import command to migrate data from MySQL to Hive.
•Use various file formats, compressions, field delimiters, where clauses and queries while importing the data.
•Understand split-by and boundary queries.
•Use incremental mode to migrate the data from MySQL to HDFS.
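A typical Sqoop import invocation looks roughly like the following sketch (the connection string, credentials and table name are illustrative, not from the course):

```shell
# Hypothetical example: import the "orders" table from MySQL into HDFS
# as Snappy-compressed Parquet, split across 4 mappers on order_id
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user -P \
  --table orders \
  --target-dir /user/demo/orders \
  --as-parquetfile \
  --compress --compression-codec snappy \
  --split-by order_id \
  --num-mappers 4
```

Adding `--incremental append --check-column order_id --last-value 0` turns the same command into an incremental import that only picks up rows beyond the last imported value.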

Further, you will learn Sqoop export to migrate data:
•What is sqoop export?
•Using sqoop export, migrate data from HDFS to MySQL.
•Using sqoop export, migrate data from Hive to MySQL.
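A Sqoop export runs in the reverse direction; a sketch under the same illustrative names:

```shell
# Hypothetical example: push comma-delimited files from HDFS into a
# MySQL table (the target table must already exist)
sqoop export \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user -P \
  --table order_summary \
  --export-dir /user/demo/order_summary \
  --input-fields-terminated-by ','
```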

Further, you will learn about Apache Flume:
•Understand the Flume architecture.
•Using Flume, ingest data from Twitter and save it to HDFS.
•Using Flume, ingest data from netcat and save it to HDFS.
•Using Flume, ingest data from exec and show it on the console.
•Describe Flume interceptors and see examples of using them.
•Flume multiple agents.
•Flume consolidation.
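To make the architecture concrete, here is a minimal sketch of a Flume agent configuration with a netcat source, a memory channel and an HDFS sink (agent and component names are illustrative):

```
# netcat-agent.conf
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# netcat source: listens for lines of text on port 44444
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# memory channel: buffers events between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# HDFS sink: writes the events out as plain text
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/flume/netcat-events
a1.sinks.k1.hdfs.fileType = DataStream

# wire the components together
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

Such an agent is typically started with `flume-ng agent --name a1 --conf-file netcat-agent.conf`, after which `nc localhost 44444` can be used to send test events.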

In the next section, we will learn about Apache Hive:
•Hive Intro
•External & Managed Tables
•Working with Different File Formats – Parquet, Avro
•Compressions
•Hive Analysis
•Hive String Functions
•Hive Date Functions
•Partitioning
•Bucketing
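Partitioning and bucketing come together in table DDL; a sketch of an external Parquet table (database, table and column names are illustrative):

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS sales.orders (
  order_id    BIGINT,
  customer_id BIGINT,
  amount      DOUBLE
)
PARTITIONED BY (order_date STRING)        -- one directory per date
CLUSTERED BY (customer_id) INTO 8 BUCKETS -- hash rows into 8 files
STORED AS PARQUET
LOCATION '/user/hive/warehouse/sales/orders';
```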

You will learn about Apache Spark:
•Spark Intro
•Cluster Overview
•RDD
•DAG/Stages/Tasks
•Actions & Transformations
•Transformation & Action Examples
•Spark DataFrames
•Spark DataFrames – working with different file formats & compression
•DataFrame APIs
•Spark SQL
•DataFrame Examples
•Spark with Cassandra Integration
•Running Spark in the IntelliJ IDE
•Running Spark on EMR
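As a flavour of the Spark material, here is a minimal sketch in Spark's native Scala API showing lazy transformations versus actions (the dataset and column names are illustrative):

```scala
import org.apache.spark.sql.SparkSession

object OrdersSketch {
  def main(args: Array[String]): Unit = {
    // Local session for experimenting; on EMR the cluster sets the master
    val spark = SparkSession.builder()
      .appName("orders-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val orders = Seq((1L, "open", 10.0), (2L, "closed", 25.0))
      .toDF("order_id", "status", "amount")

    // filter is a transformation: nothing executes yet
    val open = orders.filter($"status" === "open")

    // count is an action: it builds the DAG, splits it into stages
    // and tasks, and actually runs the job
    println(open.count())

    spark.stop()
  }
}
```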

You will learn

✓ Hadoop Distributed File System and its most common commands.
✓ The lifecycle of a sqoop command.
✓ Sqoop import to migrate data from MySQL to HDFS and to Hive.
✓ Various file formats, compressions, field delimiters, where clauses and queries while importing data.
✓ Split-by and boundary queries, and incremental mode for imports.
✓ Sqoop export to migrate data from HDFS and from Hive to MySQL.
✓ Flume architecture, and ingesting data from Twitter, netcat and exec into HDFS or the console.
✓ Flume interceptors.

Requirements

• None

This course is for

• Anyone who wants to learn Big Data in detail
Premium Instructor | TechnoAvengers.com (Founder)
Navdeep is a Premium Instructor on Udemy with 12 years of industry experience across different technologies and domains. With 9+ courses, 40,000+ students and a 4.5★ rating, she is one of the leading instructors in the field of Big Data & Cloud.
Happy Learning!
