
Hadoop Fundamentals


Have you heard about Hadoop, but never really understood what it’s all about?

Do you need to learn about Hadoop, either for your current job or for getting a new job?

If you are new to Hadoop, and you want to have a solid understanding of this framework, then this is for you…


Apache Hadoop is an open-source framework for distributed computing. It enables the distributed processing and querying of large data sets across clusters of commodity computers, using a reliable, scalable architecture and simple programming models. It is designed to scale from a single server to thousands of servers, each offering local computation and storage.

The two core components of this architecture are HDFS (Hadoop Distributed File System) and MapReduce.

HDFS is the storage system used by Hadoop. It is designed to partition files across a cluster of data nodes for better performance, scalability, and reliability. When you load a file into HDFS, it splits the file into blocks and stores them on the various nodes in the Hadoop cluster. It also creates several replicas of each data block and distributes them across the cluster so that the data remains available and can be read quickly. Hadoop handles this internally, so the failure of a single node does not result in data loss.
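
To make this concrete, here is a minimal sketch (not from the original post) of loading a local file into HDFS with Hadoop's Java FileSystem API. The NameNode address hdfs://namenode:9000 and the file paths are placeholder assumptions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPutExample {
    public static void main(String[] args) throws Exception {
        // Assumed NameNode address -- replace with your cluster's fs.defaultFS value.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        FileSystem fs = FileSystem.get(conf);

        // Copy a local file into HDFS. HDFS transparently splits it into
        // fixed-size blocks and replicates each block (three copies by default)
        // across the DataNodes in the cluster.
        fs.copyFromLocalFile(new Path("/tmp/input.txt"),
                             new Path("/user/demo/input.txt"));

        fs.close();
    }
}
```

The same thing can be done from the command line with "hdfs dfs -put /tmp/input.txt /user/demo/input.txt", and "hdfs fsck /user/demo/input.txt -files -blocks" shows how the file was split into blocks and where the replicas are stored.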


MapReduce is a software framework for writing applications that process vast amounts of data in parallel. The framework consists of a single master JobTracker and one slave TaskTracker per cluster node. The master is responsible for scheduling tasks on the cluster nodes, monitoring them, and re-executing any failed tasks; the slaves execute the tasks as directed by the master. There are two types of tasks: Map and Reduce. Map tasks perform filtering and sorting in parallel, and Reduce tasks perform a summary operation on the map outputs. The MapReduce framework orchestrates the processing by marshalling the distributed servers, running the various tasks in parallel, managing all communication and data transfer between the parts of the system, and providing redundancy and fault tolerance.
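
To illustrate how the Map and Reduce tasks fit together, here is the canonical word-count job written against the Hadoop MapReduce Java API. It follows the standard Hadoop example; the class name and input/output paths are assumptions for illustration, not something from the original post.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map task: emit (word, 1) for every word in the input split.
    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reduce task: sum the counts emitted for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Packaged into a jar, it would be submitted with something like "hadoop jar wordcount.jar WordCount /user/demo/input /user/demo/output"; the framework then schedules the Map and Reduce tasks across the cluster and re-runs any that fail.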


Hadoop has generated a great deal of interest in recent years. Large and successful companies are using it to do powerful analysis of the data they collect. There are many use cases for using Hadoop, ranging from risk modeling, through customer churn analysis, to ad targeting. In today’s world of big data, it is essential to be familiar with this platform.

This is why I developed a one-day seminar for Hadoop beginners. The seminar covers Hadoop fundamentals. It begins with an introduction and an overview of the main components, as well as the concepts and terminology around Hadoop. We then dive into the HDFS storage system and the MapReduce software framework. We also cover the rich toolset used to work with Hadoop and to integrate it with other platforms, including Hive, Pig, Sqoop, and YARN. In the last part of the seminar, I provide some useful tips and best practices for working with Hadoop.

I’m going to deliver this seminar as part of the Expert Days 2015 event, on 28/12/2015 between 08:30 and 16:30. You can register here. I hope to see you there…

 

Expert Days 2015
