
Esam A. AlWagait عصام بن عبدالله الوقيت

Associate Professor


Computer and Information Sciences
Bldg 31- First Floor
Announcement

Steps to get a Hadoop environment running

1) Download VMWare Fusion (Mac) or VMWare Player (Windows/Linux)

 

For Mac: https://my.vmware.com/web/vmware/info/slug/desktop_end_user_computing/vmware_fusion/6_0
For Windows: https://my.vmware.com/web/vmware/free#desktop_end_user_computing/vmware_player/6_0

 

2) Download the Cloudera Hadoop image, which contains CentOS Linux with the Hadoop libraries and tools pre-installed.

 

http://www.cloudera.com/content/support/en/downloads/download-components/download-products.html?productID=F6mO278Rvo

 

3) Double-click the image and it will open in VMWare.

 

Start experimenting with HDFS:

hadoop fs -put /path/to/file 

hadoop fs -ls 

etc.
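The basic HDFS commands above can be sketched as a short session (file names and paths are placeholders, not from the original post):

```shell
# Create a small local sample file (hypothetical name).
echo "hello hadoop hello hdfs" > sample.txt

# Upload it to your HDFS home directory (/user/<your-user>).
hadoop fs -put sample.txt

# List the contents of your HDFS home directory.
hadoop fs -ls

# Print the file back from HDFS to verify the upload.
hadoop fs -cat sample.txt

# Remove it again when done.
hadoop fs -rm sample.txt
```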

 

Try submitting a MapReduce job. Hadoop ships with a few examples; you can try the following after uploading a simple text file to HDFS.

 

hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount path/to/file/in/hdfs path/to/output/in/hdfs
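A complete run of the bundled example might look like this (the input file and output directory names are illustrative; note that the output directory must not already exist, or the job will fail):

```shell
# Upload a sample input file (hypothetical name) to HDFS.
hadoop fs -put sample.txt input.txt

# Run the bundled WordCount example; Hadoop creates the output directory.
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
    wordcount input.txt wordcount-out

# Inspect the results; the part-r-* files hold the reducer output
# (one "word<TAB>count" line per distinct word).
hadoop fs -cat 'wordcount-out/part-r-*'
```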

 

4) Download IntelliJ IDEA (or Eclipse, if you prefer) and install it in the CentOS guest running in VMWare.

 

http://www.jetbrains.com/idea/download/ 

 

Add the jars in /usr/lib/hadoop-mapreduce/ to the classpath in either IntelliJ or Eclipse, or compile from the command line using a classpath wildcard so the jars (not just the directory) are picked up: javac -cp "/usr/lib/hadoop-mapreduce/*" *.java
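Assuming your job's driver class is in a file named WordCount.java (a hypothetical name, not from the original post), compiling and submitting it from the command line might look like:

```shell
# Compile against the Hadoop MapReduce jars
# (the "*" classpath wildcard expands to every jar in the directory).
javac -cp "/usr/lib/hadoop-mapreduce/*" WordCount.java

# Package the compiled classes into a jar for submission.
jar cf wordcount.jar WordCount*.class

# Submit the job; "WordCount" is the hypothetical main class,
# and the input/output paths are HDFS paths as in step 3.
hadoop jar wordcount.jar WordCount input.txt my-output
```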
