Chapter 3. Quick Start Guide for Single Node HDP Installation

Use the following instructions to deploy HDP on a single node Windows Server machine:

  1. On the host, complete all the prerequisites described in the following sections of Getting Ready to Install:


    Before installation, you must set the JAVA_HOME environment variable. Do not install Java in a location whose path name contains spaces.
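    For example, assuming Java is installed under c:\java\jdk1.7.0 (an illustrative path; substitute your own space-free location), JAVA_HOME can be set machine-wide from an Administrator command prompt:

      setx JAVA_HOME "c:\java\jdk1.7.0" /M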

  2. Prepare the single node machine.

    1. Configure firewall.

      HDP uses multiple ports for communication with clients and between service components.

      If your corporate policies require maintaining a per-server firewall, you must enable the ports listed here. Use the following command to open a port:

      netsh advfirewall firewall add rule name=AllowRPCCommunication dir=in action=allow protocol=TCP localport=$PORT_NUMBER
      • For example, the following command will open up port 80 in the active Windows Firewall:

        netsh advfirewall firewall add rule name=AllowRPCCommunication dir=in action=allow protocol=TCP localport=80
      • For example, the following command will open all ports from 49152 to 65535 in the active Windows Firewall:

        netsh advfirewall firewall add rule name=AllowRPCCommunication dir=in action=allow protocol=TCP localport=49152-65535
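      • To confirm that a rule was added, you can list it by name. For example, using the rule name from the commands above:

        netsh advfirewall firewall show rule name=AllowRPCCommunication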

      If your network's security policies allow you to open all the ports, use the following instructions to disable Windows Firewall:
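      As a sketch (confirm that this complies with your security policy before running it), Windows Firewall can be turned off for all profiles with a single command:

        netsh advfirewall set allprofiles state off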

  3. Install and start HDP.

    1. Download the HDP for Windows MSI file from:

    2. Open a command prompt as Administrator.

    3. Run the MSI installer command. If you are installing on Windows Server 2012, use this method to open the installer:

      runas /user:administrator "cmd /C msiexec /lv c:\hdplog.txt /i $PATH_to_MSI_file MSIUSEREALADMINDETECTION=1"

      where the $PATH_to_MSI_file parameter should be modified to match the location of the downloaded MSI file.

      The following example illustrates the command to launch the installer:

      runas /user:administrator "cmd /C msiexec /lv c:\hdplog.txt /i C:\MSI_INSTALL\hdp- MSIUSEREALADMINDETECTION=1"
    4. The HDP Setup window appears pre-populated with the host name of the server, as well as default installation parameters.

      You must specify the following parameters:

      • Hadoop User Password: Enter the password for the Hadoop super user (the administrative user). This password enables you to log in as the administrative user and perform administrative actions. Password requirements are controlled by Windows, and typically require that the password include a combination of uppercase and lowercase letters, digits, and special characters.

      • Hive and Oozie DB Names, Usernames, and Passwords: Set the DB (database) name, user name, and password for the Hive and Oozie metastores. You can use the boxes at the lower left of the HDP Setup window ("Hive DB Name", "Hive DB Username", etc.) to specify these parameters.

      • DB Flavor: Select DERBY to use an embedded database for the single-node HDP installation.

      You can optionally configure the following parameters (for a detailed description of each option, see Defining Cluster Properties):

      • HDP Directory: The directory in which HDP will be installed. The default installation directory is c:\hdp.

      • Log Directory: The directory for the HDP service logs. The default location is c:\hadoop\logs.

      • Data Directory: The directory for user data for each HDP service. The default location is c:\hdpdata.

      • Delete Existing HDP Data: Selecting this check box removes any existing data from prior HDP installs. This ensures that HDFS starts with a formatted file system. For a single node installation, it is recommended that you select this option to start with a freshly formatted HDFS.

      • Install HDP Additional Components: Select this check box to install ZooKeeper, Flume, Storm, Knox, or HBase as HDP services deployed to the single-node server.


      When deploying HDP with LZO compression enabled, put the following three files in the same directory as the HDP for Windows installer:

      • hadoop-lzo- from the HDP for Windows Installation zip.

      • gplcompression.dll from the HDP for Windows Installation zip.

      • lzo2.dll LZO compression DLL downloaded from here.

    5. When you have finished setting the installation parameters, click Install to install HDP.


      The Export button on the HDP Setup window exports the configuration information for use in a CLI/script-driven deployment. Clicking Export stops the installation and creates a clusterproperties.txt file that contains the configuration information specified in the fields of the HDP Setup window.
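      The exported file uses a simple key=value properties format. The following fragment is illustrative only; the property names shown are assumptions for this sketch, not definitions from this guide:

        #HDP log and data directories (hypothetical keys, for illustration)
        HDP_LOG_DIR=c:\hadoop\logs
        HDP_DATA_DIR=c:\hdpdata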

      The HDP Setup window closes, and a progress indicator displays while the installer is running. The installation may take several minutes. Also, the time remaining estimate may be inaccurate.

      A confirmation message displays when the installation is complete.


      If you did not select the "Delete existing HDP data" check box and you are reinstalling Hadoop, the HDFS file system must be formatted. To format the HDFS file system, open the Hadoop Command Line shortcut on the Windows desktop, then run the following command:

      %HADOOP_HOME%\bin\hadoop namenode -format
    6. Start all HDP services on the single machine.

      In a command prompt, navigate to the HDP install directory. This is the "HDP directory" setting you specified in the HDP Setup window.

      Run the following command from the HDP install directory:

    7. Validate the install by running the full suite of smoke tests:

      1. Create a smoketest user directory in HDFS:

        %HADOOP_HOME%\bin\hadoop dfs -mkdir -p /user/smoketest
        %HADOOP_HOME%\bin\hadoop dfs -chown -R smoketest /user/smoketest
      2. Run the provided smoke tests as the hadoop user or create a smoketest user in HDFS:

        runas /user:hadoop "cmd /K %HADOOP_HOME%\Run-SmokeTests.cmd"
      3. Run as the smoketest user to verify that the HDP services work as expected:

        runas /user:smoketest "cmd /K %HADOOP_HOME%\Run-SmokeTests.cmd"
