2. Meet Minimum System Requirements

To run Hadoop, your system must meet the following minimum requirements.

 2.1. Hardware Recommendations

There is no single hardware requirement set for installing Hadoop.

For more information on the parameters that may affect your installation, see Hardware Recommendations For Apache Hadoop.

 2.2. Operating Systems Requirements

The following operating systems are supported:

  • Red Hat Enterprise Linux (RHEL) v5.x or 6.x (64-bit)

  • CentOS v5.x or 6.x (64-bit)

  • Oracle Linux v5.x or 6.x (64-bit)

  • SUSE Linux Enterprise Server (SLES) 11, SP1 or SP3 (64-bit)

    Note

    If you plan to install the HDP Stack on SLES 11 SP3, be sure to refer to Configuring Repositories in the HDP documentation for the HDP repositories specific to SLES 11 SP3. Alternatively, if you plan to perform a Local Repository install, be sure to use the SLES 11 SP3 repositories.

Important

The installer pulls many packages from the base OS repositories. If a complete set of base OS repositories is not available to all of your machines at the time of installation, you may run into issues.

If you encounter problems with base OS repositories being unavailable, please contact your system administrator to arrange for these additional repositories to be proxied or mirrored. For more information, see Optional: Configure the Local Repositories.
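
One way to confirm repository availability on a host before you begin is to count the enabled repositories. A minimal sketch, assuming yum on RHEL/CentOS/Oracle Linux or zypper on SLES (repo_count is a hypothetical helper, not an Ambari tool):

```shell
# Hedged sketch: count the enabled base OS repositories on this host
# before starting the install; zero usually means trouble ahead.
repo_count() {
  if command -v yum >/dev/null 2>&1; then
    # skip the "repo id / repo name / status" header line
    yum -q repolist enabled 2>/dev/null | tail -n +2 | wc -l
  elif command -v zypper >/dev/null 2>&1; then
    # -E / --enabled-only lists only enabled repositories
    zypper -q lr -E 2>/dev/null | wc -l
  else
    echo 0
  fi
}

n=$(repo_count)
echo "enabled repositories: $n"
[ "$n" -gt 0 ] || echo "WARNING: no enabled base OS repositories found"
```

Run this on each machine in the cluster; a warning here is the symptom the Important note above describes.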

 2.3. Browser Requirements

The Ambari Install Wizard runs as a browser-based Web app. You must have a machine capable of running a graphical browser to use this tool. The supported browsers are:

  • Windows (Vista, 7)

    • Internet Explorer 9.0 and higher

    • Firefox latest stable release

    • Safari latest stable release

    • Google Chrome latest stable release

  • Mac OS X (10.6 or later)

    • Firefox latest stable release

    • Safari latest stable release

    • Google Chrome latest stable release

  • Linux (RHEL, CentOS, SLES, Oracle Linux)

    • Firefox latest stable release

    • Google Chrome latest stable release

 2.4. Software Requirements

On each of your hosts:

  • yum and rpm (RHEL/CentOS/Oracle Linux)

  • zypper (SLES)

  • scp, curl, and wget

  • python (2.6 or later)
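
On a given host, the list above can be checked with a short loop. A minimal sketch (the tool names are exactly those listed; how you fan this out to all hosts is up to you):

```shell
# Hedged sketch: confirm each required client tool is on PATH.
missing=0
for tool in scp curl wget python; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
    missing=1
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "all required tools present"
else
  echo "install the missing tools before running the installer"
fi
```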

Important

The Python version shipped with SUSE 11, 2.6.0-8.12.2, has a critical bug that may cause the Ambari Agent to fail within the first 24 hours. If you are installing on SUSE 11, please update all your hosts to Python version 2.6.8-0.15.1.
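
The SUSE 11 check above can be scripted. A minimal sketch, assuming python is installed as an RPM and GNU sort -V is available (version_ok is a hypothetical helper, not an Ambari tool):

```shell
# Hedged sketch: compare the installed Python RPM against the fixed
# SUSE 11 release 2.6.8-0.15.1 named above.
version_ok() {
  # true if $1 sorts at or after $2 in GNU version (-V) order
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

installed=$(rpm -q --qf '%{VERSION}-%{RELEASE}' python 2>/dev/null || echo none)
if [ "$installed" = none ]; then
  echo "python RPM not found on this host"
elif version_ok "$installed" "2.6.8-0.15.1"; then
  echo "Python OK: $installed"
else
  echo "update needed: found $installed, want >= 2.6.8-0.15.1"
fi
```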

 2.5. JDK Requirements

The following Java runtime environments are supported:

  • Oracle JDK 1.7_45 64-bit (default)

  • Oracle JDK 1.6_31 64-bit

    Note

    Deprecated as of Ambari 1.5.1

  • OpenJDK 7 64-bit (not supported on SLES)
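
To see what a host currently offers against the list above, the usual java version check works. A minimal sketch (checking JAVA_HOME before PATH is an assumption about your environment, not documented Ambari behavior):

```shell
# Hedged sketch: report the JDK this host would most likely use.
if [ -n "${JAVA_HOME:-}" ] && [ -x "$JAVA_HOME/bin/java" ]; then
  "$JAVA_HOME/bin/java" -version 2>&1 | head -n1
elif command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n1
else
  echo "no JDK on this host; see the supported JDK list above"
fi
```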

 2.6. Database Requirements

Hive/HCatalog, Oozie, and Ambari all require their own internal databases.

  • Hive/HCatalog: By default uses an Ambari-installed MySQL 5.x instance. With appropriate preparation, you can also use an existing PostgreSQL 9.x, MySQL 5.x, or Oracle 11g r2 instance. See Using Non-Default Databases-Hive for more information on using existing instances.

  • Oozie: By default uses an Ambari-installed Derby instance. With appropriate preparation, you can also use an existing PostgreSQL 9.x, MySQL 5.x, or Oracle 11g r2 instance. See Using Non-Default Databases-Oozie for more information on using existing instances.

  • Ambari: By default uses an Ambari-installed PostgreSQL 8.x instance. With appropriate preparation, you can also use an existing PostgreSQL 9.x, MySQL 5.x, or Oracle 11g r2 instance. See Using Non-Default Databases-Ambari for more information on using existing instances.
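
Before pointing Hive/HCatalog, Oozie, or Ambari at an existing instance, it helps to confirm the server version is one of the supported ones above. A minimal sketch, assuming a psql or mysql client is installed locally (db_major is a hypothetical helper that parses typical --version banners):

```shell
# Hedged sketch: pull the major.minor version out of a database
# client's --version banner.
db_major() {
  # take the last major.minor pair in the banner: mysql prints its
  # client version first and the distribution version last
  echo "$1" | grep -oE '[0-9]+\.[0-9]+' | tail -n1
}

if command -v psql >/dev/null 2>&1; then
  echo "PostgreSQL $(db_major "$(psql --version)") (9.x is supported)"
fi
if command -v mysql >/dev/null 2>&1; then
  echo "MySQL $(db_major "$(mysql --version)") (5.x is supported)"
fi
```

Note this reports the client version; for a definitive answer, query the server itself (for example, SELECT version() on PostgreSQL or MySQL).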

 2.7. File System Partitioning Recommendations

For information on setting up file system partitions on master and slave nodes in a HDP cluster, see File System Partitioning Recommendations.

 2.8. Recommended Maximum Open File Descriptors

The recommended maximum number of open file descriptors is 10000 or more. To check the current value set for the maximum number of open file descriptors, execute the following shell commands:

   ulimit -Sn
   ulimit -Hn
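
These commands can be wrapped in a quick check against the 10000 guideline. A minimal sketch (the limits.conf lines shown are the conventional way to raise the limit on PAM-based Linux systems):

```shell
# Hedged sketch: compare the current soft limit to the 10000 guideline.
soft=$(ulimit -Sn)
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"
if [ "$soft" != "unlimited" ] && [ "$soft" -lt 10000 ]; then
  echo "soft limit is below 10000; consider raising it, e.g. in"
  echo "/etc/security/limits.conf:"
  echo '  *  soft  nofile  10000'
  echo '  *  hard  nofile  10000'
fi
```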
