In this post, we’ll walk through the process of deploying an Apache Hadoop 2 cluster on the EC2 cloud service offered by Amazon Web Services (AWS), using the Hortonworks Data Platform (HDP).
Both EC2 and HDP offer many knobs and buttons to cater to your specific performance, security, cost, data size, data protection, and other requirements. I will not discuss most of these options in this post, as the goal is to walk through one particular deployment path to get started.
- An Amazon Web Services account with the ability to launch seven large EC2 instances.
- A Mac or Linux machine. You can also use Windows, but you will need to install additional software such as SSH and SCP clients.
- Lastly, we assume basic familiarity with EC2: you have launched EC2 instances before and SSH’d into them.
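As a refresher on that last prerequisite, connecting to an EC2 instance uses the key pair you downloaded when launching it. The key file name and hostname below are placeholders, not values from this walkthrough:

```shell
#!/bin/sh
# Hypothetical values: substitute your own key pair file and the public
# DNS name shown for your instance in the EC2 console.
KEY=my-hdp-key.pem
HOST=ec2-54-0-0-1.compute-1.amazonaws.com

# The key file must not be world-readable or SSH will refuse it:
#   chmod 400 "$KEY"
#
# Then connect (the default user is AMI-dependent, e.g. ec2-user or ubuntu):
#   ssh -i "$KEY" ec2-user@"$HOST"

# Print the command we would run, rather than connecting:
echo "ssh -i $KEY ec2-user@$HOST"
```

The same `-i` flag works for `scp` when copying installer files to the instances.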
The post Deploying a Hadoop Cluster on Amazon EC2 with HDP2 appeared first on Hortonworks.