2. Option I - Central Push Install Using A Deployment Service

Many Windows Data Centers have standard corporate procedures for performing centralized push-install of software packages to hundreds or thousands of computers at the same time. In general, these same procedures also allow a centralized push-install of HDP to a Hadoop cluster.

If your Data Center already has such procedures in place, then follow this simple checklist:

  1. Identify and configure the hosts for the Hadoop cluster nodes.

  2. On the host nodes, complete all the prerequisites described in Preparing the Environment.

    Note

    Before installation, you must set the JAVA_HOME environment variable. Do not install Java in a location whose path name contains spaces.
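    For example, JAVA_HOME can be set machine-wide from an elevated command prompt. The JDK path below is a placeholder; use your actual install location, which must not contain spaces:

    ```
    :: Set JAVA_HOME for all users on this machine; the JDK path is an example only.
    setx /m JAVA_HOME "C:\java\jdk1.7.0_51"
    ```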

  3. Download the HDP for Windows installation package, which includes a sample clusterproperties.txt file.

  4. Create the cluster properties file using your host information; see Define Cluster Properties.

    Important

    Nodes in the cluster communicate with each other using the host name or IP address defined in the cluster properties file. For multi-homed systems and hosts with more than one NIC, ensure that the preferred host name or IP address is specified in the cluster properties file.
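    As an illustration, a minimal clusterproperties.txt might look like the sketch below. The host names are placeholders, and the exact property names for your HDP version are listed in Define Cluster Properties and in the sample file shipped with the installer:

    ```
    #Log and data directories
    HDP_LOG_DIR=c:\hadoop\logs
    HDP_DATA_DIR=c:\hdp\data

    #Hosts (use the preferred host name or IP address for each node)
    NAMENODE_HOST=namenode.example.com
    SECONDARY_NAMENODE_HOST=secondary.example.com
    RESOURCEMANAGER_HOST=resourcemanager.example.com
    SLAVE_HOSTS=slave1.example.com,slave2.example.com
    ```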

  5. Use your standard procedures to push both the HDP Installer MSI and the custom clusterproperties.txt file to each node in the cluster.
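    If a dedicated deployment service is not available, one illustrative approach is to copy the files to each node's administrative share. The host name and folders below are placeholders:

    ```
    :: Copy the installer MSI and cluster properties file to one node via its
    :: administrative share; repeat (or script a loop) for every node.
    robocopy c:\staging \\node1.example.com\c$\HDP-install hdp-2.1.5.0.winpkg.msi clusterproperties.txt
    ```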

  6. Continue to use your standard procedures to remotely execute the installation with the msiexec command documented in Understanding the HDP MSI Installer Properties.

    Note

    The HDP Installer unpacks the MSI contents to %SystemDrive%\HadoopInstallFiles. A detailed installation log is located at %SystemDrive%\HadoopInstallFiles\HadoopSetupTools\hdp-2.1.5.0.winpkg.install. This folder is required to uninstall HDP; do not remove it.
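    For reference, a typical unattended invocation follows the sketch below. The property names and values shown are examples; confirm them against Understanding the HDP MSI Installer Properties, and substitute your own paths:

    ```
    :: Quiet install with verbose logging; HDP_LAYOUT points at the pushed
    :: cluster properties file and HDP_DIR at the desired install location.
    msiexec /qn /lv c:\HDP-install\hdp-install.log /i c:\HDP-install\hdp-2.1.5.0.winpkg.msi ^
        HDP_LAYOUT=c:\HDP-install\clusterproperties.txt ^
        HDP_DIR=c:\hdp ^
        DESTROY_DATA=yes
    ```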

  7. Examine the return codes and logs from your standard procedures to verify that HDP installed successfully on every node.
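    One way to spot failures is to scan the setup logs on each node; the search terms below are illustrative:

    ```
    :: A non-empty result indicates a log line worth investigating.
    findstr /s /i "error failure" %SystemDrive%\HadoopInstallFiles\HadoopSetupTools\*.log
    ```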

After the installation completes, you must configure and start the Hadoop services.
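For example, HDP for Windows ships with helper scripts for managing services; assuming your version includes them, the services on a node can be started from the Hadoop install directory:

```
:: Start all HDP services installed on the local node.
start_local_hdp_services.cmd
```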

