Q:

What is Apache Hadoop?

A:

Quick Answer

Apache Hadoop is an open-source software framework for storing and processing extremely large amounts of data across clusters of servers. Hadoop is maintained by the Apache Software Foundation, was inspired by papers Google published on its internal file system and MapReduce programming model, and is written largely in the Java programming language. It is primarily used in distributed storage and processing applications where large sets of data are manipulated.

Full Answer

Hadoop operates across clusters of individual servers, referred to as nodes. Hadoop uses the combined storage and processing power of these nodes to perform intensive data-manipulation tasks. One advantage of this arrangement is fault tolerance: because data is replicated across nodes, the software continues to operate normally even if a small number of servers in a Hadoop cluster fail.
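
That fault tolerance comes mainly from block replication: HDFS keeps multiple copies of every data block on different nodes. As a minimal sketch, assuming a standard installation, the replication factor is set in the hdfs-site.xml configuration file (3 is the conventional default; exact file locations vary by deployment):

<configuration>
  <property>
    <!-- Number of copies HDFS keeps of each data block; with three
         copies, losing one or two nodes that hold a given block
         does not make the data unavailable. -->
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>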

Hadoop's three main software modules are the Hadoop Distributed File System (HDFS), which stores data across the cluster; YARN, a resource-management platform that schedules work on it; and MapReduce, a programming model for processing large amounts of data in parallel. MapReduce is one of the main reasons companies adopt Hadoop; the model was originally described by Google, which developed it in response to a growing need for distributed big-data processing, and Hadoop supplies an open-source implementation of it.
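
To make the MapReduce model concrete, below is the classic word-count job in Java, closely following the example in the Apache Hadoop documentation; it is a sketch of the pattern rather than production code. The mapper emits a (word, 1) pair for every token in the input, and the reducer sums those counts for each distinct word:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: splits each input line into tokens and emits (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts emitted for each distinct word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Compiled into a JAR, a job like this is submitted to the cluster with a command along the lines of "hadoop jar wordcount.jar WordCount <input dir> <output dir>", where both directories live in HDFS; YARN then schedules the map and reduce tasks across the nodes.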

Hadoop is used by large corporations such as Yahoo and Facebook, and it can be installed in numerous server environments. Hadoop is open-source software, making it free for anyone to use or contribute to. Unlike some other Apache software products, Hadoop is not meant to be installed or operated by end users; it is infrastructure that developers and system administrators deploy on an organization's behalf.
