Learn From the Industry’s Best

Join 245,516 registered members and put your career on the right track. Sign up now

Who We Are

A group of new and experienced Hadoop, Big Data, and information management enthusiasts who want to learn, contribute, and network with others who share similar interests. Our community includes open source enthusiasts, academia, professionals, and companies including IBM, Rightscale, and Jaspersoft.

Learn more »

Our Mission

Make Big Data education available to everyone, and start a journey of discovery to change the world! Big Data technologies such as Hadoop and Streams, paired with cloud computing, let even students explore data that can lead to important discoveries in healthcare, the environment, and any other area you can think of!

Big Data use cases »

Our Courses

They are mostly free, developed by experienced professionals and teachers, and well structured. Most courses include hands-on labs that you can perform on the cloud, on VMware images, or by installing the required software locally. Pass the course test to print your certificate of completion.

Course catalog »

What is Hadoop?

Akmal Chaudhri on Hadoop and Big Data

According to our members…

  • The course is a great opportunity to get some quick practical experience with Hadoop and its related subprojects. The provided VM image can also be used for future projects and for deepening your understanding of the Hadoop framework. Waiting for Hadoop Fundamentals II to be ready :)

  • …Content is well organized into different lessons and allowed me to progress smoothly. Flume was something new to me, and it was interesting to read about it. The lesson format, with a transcript, a video built on slides, a demo, and labs, is well structured and useful.

  • Dear all, the course is excellent because it saves the time of reading big books to learn Hadoop. I prefer agile practice: try to achieve small results ASAP. I didn’t know anything about Hadoop two months ago, but those two months were enough for me to create a 7-node Hadoop cluster that computes recommendations for our site every day…