Configuring Authentication with Kerberos

Configure HTTP Authentication for HDFS, YARN, MapReduce2, HBase, Oozie, Falcon, and Storm

How to configure HTTP authentication for Hadoop components in a Kerberos environment.

  1. Create a secret key used for signing authentication tokens. This file should contain random data and be placed on every host in the cluster. It should be owned by the hdfs user, with group ownership set to the hadoop group, and permissions set to 440. For example:
    dd if=/dev/urandom of=/etc/security/http_secret bs=1024 count=1
    chown hdfs:hadoop /etc/security/http_secret
    chmod 440 /etc/security/http_secret
  2. In Ambari Web, browse to Services > HDFS > Configs.
  3. Add or modify the following configuration properties to Advanced core-site:

    Property                                             New Value
    hadoop.http.authentication.simple.anonymous.allowed  false
    hadoop.http.authentication.signature.secret.file     /etc/security/http_secret
    hadoop.http.authentication.type                      kerberos
    hadoop.http.authentication.kerberos.keytab           /etc/security/keytabs/spnego.service.keytab
    hadoop.http.authentication.kerberos.principal        HTTP/_HOST@EXAMPLE.COM
    hadoop.http.filter.initializers                      org.apache.hadoop.security.AuthenticationFilterInitializer
    hadoop.http.authentication.cookie.domain             hortonworks.local
    Note

    The Kerberos realm (EXAMPLE.COM) and cookie domain (hortonworks.local) values in the above table are site-specific. The hadoop.http.authentication.cookie.domain property is based on the fully qualified domain names of the servers in the cluster. For example, if the FQDN of your NameNode is host1.hortonworks.local, hadoop.http.authentication.cookie.domain should be set to hortonworks.local.

  4. Save the configuration, then restart the affected services.
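Ambari writes the properties from step 3 into core-site.xml on each host. As a sketch, the resulting fragment would look like the following; the EXAMPLE.COM realm and hortonworks.local domain are the site-specific placeholder values from the table and must match your environment:

```
<property>
  <name>hadoop.http.authentication.simple.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.http.authentication.signature.secret.file</name>
  <value>/etc/security/http_secret</value>
</property>
<property>
  <name>hadoop.http.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/spnego.service.keytab</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.security.AuthenticationFilterInitializer</value>
</property>
<property>
  <name>hadoop.http.authentication.cookie.domain</name>
  <value>hortonworks.local</value>
</property>
```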
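The secret-file commands in step 1 can be sanity-checked before distributing the file to the cluster. The sketch below uses a temporary path instead of /etc/security/http_secret (and skips the chown to hdfs:hadoop, which requires root and those accounts); it only verifies that the file ends up 1024 bytes of random data with mode 440:

```shell
# Sketch: create and verify a signing-secret file as in step 1,
# using a temporary path (real path: /etc/security/http_secret).
secret=$(mktemp)

# 1 KB of random data, as in the dd command from step 1.
dd if=/dev/urandom of="$secret" bs=1024 count=1 2>/dev/null

# Restrict to read-only for owner and group (chmod 440).
chmod 440 "$secret"

# Print octal mode and size; expect: 440 1024
stat -c '%a %s' "$secret"

rm -f "$secret"
```

On a real cluster host, run the step 1 commands as given (as root), then confirm with `stat -c '%U:%G %a' /etc/security/http_secret`, which should report hdfs:hadoop 440.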