Installing HDF Services on an Existing HDP Cluster

Configure SAM

About This Task

When you configure SAM, you need to provide information about the metadata store database, configure the connection to Schema Registry, and set the URL for the Druid Superset dashboard.

Steps

  1. In the Customize Services step, navigate to the STREAMLINE CONFIG section of the Streaming Analytics Manager tab.

  2. Select Jar Storage Type. If you plan to enable HA for SAM on this cluster, you must select HDFS.

  3. If you selected HDFS as the Jar Storage Type, configure Jar Storage HDFS URL. This specifies the HDFS location where you want the jars to be stored. For example, hdfs://<NN_HOST>:8020/hdfs/registry.

  4. Configure jar.storage to the directory where you want to store .jar files for custom processors.

  5. Set streamline.dashboard.url to the Superset URL, which you can access using the Quick Links for Druid.

  6. Configure registry.url to the REST API Endpoint URL for the Registry. The format should be http://$FQDN_REGISTRY_HOST:$REGISTRY_PORT/api/v1, where:

    • $FQDN_REGISTRY_HOST – Specifies the host on which you are running Schema Registry

    • $REGISTRY_PORT – Specifies the Schema Registry port number. You can find the Schema Registry port in the REGISTRY_CONFIG section of the Registry tab.

    For example: http://FQDN_REGISTRY_HOST:7788/api/v1

  7. Configure the STREAMLINE STORAGE configurations based on the database you created to use as a SAM metadata store.

  8. Ensure that the registry storage connector URL contains the fully qualified host name of the node where the database is installed, as well as the connector prefix and default port for the database you selected.
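The registry.url value in step 6 follows a fixed pattern, so a small sketch can illustrate how the pieces fit together. The host name registry.example.com below is a hypothetical placeholder, not a value from this guide:

```python
# Sketch: assemble the registry.url value described in step 6.
# The host name used here is a hypothetical placeholder.

def registry_url(fqdn_registry_host: str, registry_port: int = 7788) -> str:
    """Build http://$FQDN_REGISTRY_HOST:$REGISTRY_PORT/api/v1 from its parts."""
    return f"http://{fqdn_registry_host}:{registry_port}/api/v1"

# 7788 is the example Schema Registry port shown above; confirm the actual
# port in the REGISTRY_CONFIG section of the Registry tab.
print(registry_url("registry.example.com"))
# http://registry.example.com:7788/api/v1
```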

Example

MySQL example:

jdbc:mysql://FQDN_MYSQL:3306/streamline


Postgres Example:

jdbc:postgresql://FQDN_POSTGRES:5432/streamline
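The two connector URLs above differ only in the JDBC subprotocol and default port, which a short sketch can make explicit. This is an illustration of the URL pattern only, using the placeholder host names from the examples:

```python
# Sketch: build the streamline storage connector URL for the two
# databases shown in the examples above. Default ports match those examples.

DEFAULT_PORTS = {"mysql": 3306, "postgresql": 5432}

def connector_url(db_type: str, fqdn_host: str, database: str = "streamline") -> str:
    """Return a JDBC connector URL such as jdbc:mysql://host:3306/streamline."""
    port = DEFAULT_PORTS[db_type]
    return f"jdbc:{db_type}://{fqdn_host}:{port}/{database}"

print(connector_url("mysql", "FQDN_MYSQL"))
# jdbc:mysql://FQDN_MYSQL:3306/streamline
print(connector_url("postgresql", "FQDN_POSTGRES"))
# jdbc:postgresql://FQDN_POSTGRES:5432/streamline
```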

More Information

Installing Databases for HDF Components