Disclosure: when you buy through links on our site, we may earn an affiliate commission.

Learn Big Data: The Hadoop Ecosystem Masterclass

Master the Hadoop ecosystem using HDFS, MapReduce, Yarn, Pig, Hive, Kafka, HBase, Spark, Knox, Ranger, Ambari, Zookeeper
4.3/5 (5,858 reviews)
28,089 students
CourseMarks Score®: 9.3
Freshness: 9.3
Feedback: 8.9
Content: 9.0
Platform: Udemy
Video: 5h 58m
Language: English
Next start: On Demand

Description

Important update: Effective January 31, 2021, all Cloudera software requires a valid subscription and is accessible only behind a paywall. The sandbox can still be downloaded, but the full install requires a Cloudera subscription for access to the yum repository.
In this course you will learn Big Data using the Hadoop ecosystem. Why Hadoop? It is one of the most sought-after skills in the IT industry. The average salary in the US is $112,000 per year, rising to an average of $160,000 in San Francisco (source: Indeed).
The course is aimed at Software Engineers, Database Administrators, and System Administrators who want to learn about Big Data. Other IT professionals can also take this course, but may have to do some extra research to understand some of the concepts.
You will learn how to use the most popular software in the Big Data industry today, covering batch processing as well as real-time processing. This course will give you enough background to discuss real problems and solutions with experts in the industry. Adding these technologies to your LinkedIn profile will attract recruiters and can help you land interviews at some of the most prestigious companies in the world.
The course is very practical, with more than 6 hours of lectures. You are encouraged to try everything yourself, which adds multiple hours of hands-on learning. If you get stuck with the technology, support is available: I answer messages on the message boards, and there is a Facebook group where you can post questions.

You will learn

✓ Process Big Data using batch processing
✓ Process Big Data using real-time processing
✓ Become familiar with the technologies in the Hadoop stack
✓ Install and configure the Hortonworks Data Platform (HDP)
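To give a flavor of the batch-processing style the course covers, here is a minimal, self-contained sketch of the MapReduce word-count pattern in plain Python. The function names and input are illustrative only; a real Hadoop job would run mappers and reducers across a cluster (for example via the Hadoop Streaming API or Java), not in a single process like this.

```python
# Illustrative sketch of MapReduce word count in plain Python.
# A real Hadoop job distributes these phases across the cluster.
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/sort by key, then sum the counts per word (the reducer step).
    counts = {}
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        counts[key] = sum(count for _, count in group)
    return counts

counts = reduce_phase(map_phase(["big data big wins", "data flows"]))
print(counts)  # e.g. {'big': 2, 'data': 2, 'flows': 1, 'wins': 1}
```

The same map/shuffle/reduce shape underlies the Hadoop batch jobs demonstrated in the course; only the scale and the execution engine differ.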

Requirements

• You will need a background in IT. The course is aimed at Software Engineers, System Administrators, and DBAs who want to learn about Big Data
• Knowing any programming language will enhance your course experience
• The course contains demos you can try on your own machine. Running the Hadoop cluster locally requires a virtual server; 8 GB or more of RAM is recommended.

This course is for

• This course is for anyone who wants to know how Big Data works and which technologies are involved
• The main focus is the Hadoop ecosystem. We don’t cover technologies outside the Hortonworks Data Platform stack
• The course compares MapR, Cloudera, and Hortonworks, but we only use the Hortonworks Data Platform (HDP) in the demos
DevOps, Cloud, Big Data Specialist
I’ve been a System Administrator and full-stack developer for over 10 years, the typical profile for a DevOps engineer. I have worked in multiple organizations and startups, and cofounded a startup that focuses on applying DevOps and the Cloud. I have been training people in newer technologies, like Big Data, including many people working at FTSE 100 and S&P 100 companies. Today I mainly work with companies to improve their software delivery processes, while coaching and teaching on platforms like Udemy.
