Configuring DataNode SASL
Use the following steps to configure DataNode SASL so that a DataNode can run securely as a non-root user.
- Shut down the DataNode using the applicable commands in “Controlling HDP Services Manually”.
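For reference, a manual DataNode shutdown typically uses the hadoop-daemon.sh control script. The exact script path below is an assumption based on a typical HDP layout and varies by version; consult "Controlling HDP Services Manually" for the command that matches your installation:

```shell
# Stop the DataNode daemon (path assumed for a typical HDP layout -- adjust
# to your installation). If the DataNode was started with the legacy
# root/jsvc configuration, run this as root instead.
su -l hdfs -c "/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh stop datanode"
```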
- Enable SASL:
Configure the following properties in the /etc/hadoop/conf/hdfs-site.xml file to enable DataNode SASL.
The dfs.data.transfer.protection property enables DataNode SASL. You can set this property to one of the following values:
authentication -- establishes mutual authentication between the client and the server.
integrity -- in addition to authentication, guarantees that a man-in-the-middle cannot tamper with messages exchanged between the client and the server.
privacy -- in addition to the features offered by authentication and integrity, also fully encrypts the messages exchanged between the client and the server.
In addition to setting a value for the
dfs.data.transfer.protection property, you must set the
dfs.http.policy property to HTTPS_ONLY. You must also specify ports for the DataNode RPC and HTTP servers.
<property>
  <name>dfs.data.transfer.protection</name>
  <value>integrity</value>
</property>
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:10019</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:10022</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
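After editing hdfs-site.xml, you can sanity-check the values that the active client configuration resolves to with the standard `hdfs getconf` subcommand. The expected outputs shown in the comments assume the example values above:

```shell
# Print the resolved value of each property from the active configuration.
hdfs getconf -confKey dfs.data.transfer.protection   # integrity
hdfs getconf -confKey dfs.http.policy                # HTTPS_ONLY
hdfs getconf -confKey dfs.datanode.address           # 0.0.0.0:10019
```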
- Update environment settings. Edit the following setting in the /etc/hadoop/conf/hadoop-env.sh file, as shown below:
#On secure datanodes, user to run the datanode as after dropping privileges
export HADOOP_SECURE_DN_USER=
The export HADOOP_SECURE_DN_USER=hdfs line enables the legacy security configuration, and must be set to an empty value in order for SASL to be enabled.
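One way to blank the legacy setting is with sed. The snippet below is a sketch that demonstrates the edit on a scratch file; to apply it for real, point FILE at /etc/hadoop/conf/hadoop-env.sh after backing it up:

```shell
# Demonstrate blanking HADOOP_SECURE_DN_USER on a scratch file.
# For a real edit, set FILE=/etc/hadoop/conf/hadoop-env.sh (back it up first).
FILE=$(mktemp)
echo 'export HADOOP_SECURE_DN_USER=hdfs' > "$FILE"

# Replace any assigned value with an empty one so SASL takes effect.
sed -i 's/^export HADOOP_SECURE_DN_USER=.*/export HADOOP_SECURE_DN_USER=/' "$FILE"

grep '^export HADOOP_SECURE_DN_USER=' "$FILE"   # prints: export HADOOP_SECURE_DN_USER=
rm -f "$FILE"
```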
- Start the DataNode services.
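With SASL enabled, the DataNode no longer needs root/jsvc and can be started directly as the HDFS service user. The script path below is an assumption based on a typical HDP layout; adjust it to match your version and installation:

```shell
# Start the DataNode as the non-root HDFS user now that SASL is enabled
# (script path assumed for a typical HDP layout).
su -l hdfs -c "/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh start datanode"
```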