Installing HDF Services on a New HDP Cluster

Contents

1. Installing Ambari
Getting Ready for an Ambari Installation
Reviewing System Requirements
Setting Up Password-less SSH
Setting Up Service User Accounts
Enabling NTP on the Cluster and on the Browser Host
Checking DNS and NSCD
Configuring iptables
Disabling SELinux and PackageKit and Checking the umask Value
Downloading the Ambari Repository
RHEL/CentOS/Oracle Linux 6
RHEL/CentOS/Oracle Linux 7
Installing the Ambari Server
RHEL/CentOS/Oracle Linux 6
RHEL/CentOS/Oracle Linux 7
SLES 12
SLES 11
Ubuntu 14
Ubuntu 16
Debian 7
Setting Up the Ambari Server
Setup Options
Starting the Ambari Server
2. Installing Databases
Installing MySQL
Configuring SAM and Schema Registry Metadata Stores in MySQL
Configuring Druid and Superset Metadata Stores in MySQL
Installing Postgres
Configuring Postgres to Allow Remote Connections
Configuring SAM and Schema Registry Metadata Stores in Postgres
Configuring Druid and Superset Metadata Stores in Postgres
Specifying an Oracle Database to Use with SAM and Schema Registry
Switching to an Oracle Database After Installation
3. Deploying an HDP Cluster Using Ambari
Installing an HDP Cluster
Customizing Druid Services
Configuring Superset
Deploying the Cluster Services
Accessing the Stream Insight Superset UI
4. Installing the HDF Management Pack
5. Updating the HDF Base URL
6. Adding HDF Services to an HDP Cluster
7. Configuring HDF Components
Configuring Schema Registry
Configuring SAM
Configuring NiFi
Configuring Kafka
Configuring Storm
Deploying the Cluster Services
Accessing the UI for Deployed Services
8. Configuring Schema Registry and SAM for High Availability
9. Installing the Storm Ambari View
10. Using a Local Repository
Setting Up a Local Repository
Getting Started Setting Up a Local Repository
Setting Up a Local Repository with No Internet Access
Setting Up a Local Repository with Temporary Internet Access
Preparing the Ambari Repository Configuration File
11. Navigating the HDF Library