Installation

Streaming Data into HCP

To prepare HCP to ingest data from a data source, you must stream each raw event stream from the telemetry data source into its own Kafka topic. This applies to the telemetry data sources for which HCP includes parsers (for example, Bro, Snort, and YAF). Although HCP includes parsers for these data sources, it does not install the data sources or ingest the raw data; you must do both yourself.
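
For example, you can create a dedicated topic for each sensor with the Kafka command-line tools before you start streaming. The following is a minimal sketch, assuming a default HDP Kafka installation under /usr/hdp/current/kafka-broker and a ZooKeeper quorum at $ZOOKEEPER_HOST:2181; the topic name bro, partition count, and replication factor are illustrative and should be adjusted for your environment. Repeat the command for each sensor topic (for example, snort and yaf).

# Create a Kafka topic for the Bro event stream (one topic per sensor)
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh \
  --zookeeper $ZOOKEEPER_HOST:2181 \
  --create \
  --topic bro \
  --partitions 1 \
  --replication-factor 1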

Note

When you install and configure Snort, you must set it to include the year in the timestamp by modifying the snort.conf file as follows:

# Configure Snort to show year in timestamps
config show_year

Depending on the type of data you are streaming into HCP, you can use one of the following methods:

NiFi

This type of streaming method works for most types of data sources. For information on installing NiFi, see Installing HDF Services on an Existing HDP Cluster. For information on using NiFi to ingest data sources into HCP, see Building a DataFlow.

Note

Ensure that the NiFi web application is using port 8089.
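
If NiFi is currently listening on a different port, you can change it in the nifi.properties file and then restart NiFi. This is a minimal sketch assuming an unsecured (HTTP) NiFi instance; a secured instance uses the nifi.web.https.port property instead.

# nifi.properties: set the NiFi web UI/API port expected by HCP
nifi.web.http.port=8089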

Performant network ingestion probes

This type of streaming method is ideal for streaming high-volume packet data. See Setting Up pcap to View Your Raw Data for more information.
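
As an illustration, the pycapa probe can sniff packets from a network interface and publish them to a Kafka topic; for very high packet rates the DPDK-based fastcapa probe is typically used instead. The flags and the broker port 6667 below are assumptions based on one pycapa version; confirm them with pycapa --help on your build.

# Capture packets from eth0 and publish them to the 'pcap' Kafka topic
pycapa --producer \
  --interface eth0 \
  --kafka-broker $KAFKA_BROKER:6667 \
  --kafka-topic pcap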

Real-time and batch threat intelligence feed loaders

This type of streaming method is used for real-time and batch threat intelligence feed loaders. For more information, see Using Threat Intel Feed Sources.
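
For example, a flat file of threat indicators can be pushed into the HBase threat intelligence store with the flatfile_loader.sh utility that ships with HCP. This is a sketch only: the input file name and extractor configuration below are illustrative assumptions, and the extractor configuration in particular depends on your feed format (see Using Threat Intel Feed Sources for the supported options).

# Load a flat file of malicious domains into the HBase threat intel table
$METRON_HOME/bin/flatfile_loader.sh \
  -i domainblocklist.csv \
  -t threatintel \
  -c t \
  -e extractor_config.json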