Installing HDF Services on an Existing HDP Cluster for IBM Power Systems

1. Upgrading Ambari and the HDF Management Pack
Preparing to Upgrade
Prepare Ambari for Upgrade
Get the Ambari Repository
Upgrade Ambari Server
Upgrade the Ambari Agents
Upgrade the HDF Management Pack
Upgrade the Ambari Database Schema
Restart Ambari
Mandatory Post-Upgrade Tasks
Upgrading Ambari Infra
Upgrading Ambari Log Search
Upgrading Ambari Metrics
Upgrading Configurations
Upgrading SmartSense
2. Upgrading to HDP 2.6.5
Before You Begin
Upgrade Options
3. Installing Databases
Installing MySQL
Configuring SAM and Schema Registry Metadata Stores in MySQL
Configuring Druid and Superset Metadata Stores in MySQL
Installing Postgres
Configuring Postgres to Allow Remote Connections
Configuring SAM and Schema Registry Metadata Stores in Postgres
Configuring Druid and Superset Metadata Stores in Postgres
Specifying an Oracle Database to Use with SAM and Schema Registry
Switching to an Oracle Database After Installation
4. Installing the HDF Management Pack
5. Updating the HDF Base URL
6. Adding HDF Services to an HDP Cluster
7. Configuring HDF Components
Configure NiFi
Configure NiFi for Atlas Integration
Configure Kafka
Configure Storm
Configure Log Search
Deploy the Cluster Services
Access the UI for Deployed Services
8. Configuring Schema Registry and SAM for High Availability
9. Installing the Storm Ambari View
10. Using a Local Repository
Obtaining the Repositories
Ambari Repositories
HDP Stack Repositories
Setting Up a Local Repository
Getting Started Setting Up a Local Repository
Preparing the Ambari Repository Configuration File
11. Navigating the HDF Library