Hortonworks Cybersecurity Platform

Bulk Loading Threat Intelligence Sources

Hortonworks Cybersecurity Platform (HCP) is designed to work with STIX/Taxii threat feeds, but can also be bulk loaded with threat data from a CSV file.

You can bulk load threat intelligence information from the following sources:

  • CSV File

  • HDFS via MapReduce

  • Taxii Loader

CSV File

The shell script $METRON_HOME/bin/flatfile_loader.sh reads data from local disk and loads the threat intelligence data into an HBase table. This loader uses the configuration parameter inputFormatHandler to specify how the input data should be read. The two implementations are BY_LINE and org.apache.metron.dataloads.extractor.inputformat.WholeFileFormat.

The default is BY_LINE, which is appropriate for a CSV file in which each line is a unit of information to be imported. However, if you are importing a set of STIX documents, you want each whole document to be passed as input to the Extractor.
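
As a hedged illustration, an extractor configuration for a CSV file containing a single column of malicious IP addresses might look like the following; the column name, indicator type, and separator are assumptions you would adapt to your own feed, and because BY_LINE is the default handler it does not need to be specified here:

{
  "config" : {
    "columns" : { "ip" : 0 },
    "indicator_column" : "ip",
    "type" : "malicious_ip",
    "separator" : ","
  },
  "extractor" : "CSV"
}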

Start the user parser topology by running the following:

$METRON_HOME/bin/start_parser_topology.sh -s user -z $ZOOKEEPER_HOST:2181 -k $KAFKA_HOST:6667

The parser topology listens for data streaming in and pushes the data to HBase. Now you have data flowing into the HBase table, but you need to ensure that the enrichment topology can be used to enrich the data flowing past.

The parameters for the utility are as follows:

Short Code | Long Code | Is Required? | Description
-h | | No | Generate the help screen/set of options
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source
-t | --hbase_table | Yes | The HBase table to import into
-c | --hbase_cf | Yes | The HBase table column family to import into
-i | --input | Yes | The input data location on local disk. If this is a file, then that file will be loaded. If this is a directory, then the files will be loaded recursively under that directory.
-l | --log4j | No | The log4j properties file to load
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
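
Assuming the extractor configuration above is saved as extractor.json, a local CSV file could then be loaded with a command along these lines; the input file, table, and column family names are illustrative:

$METRON_HOME/bin/flatfile_loader.sh -i ./threat_ips.csv -t threat_intel -c t -e ./extractor.json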

HDFS via MapReduce

The shell script $METRON_HOME/bin/flatfile_loader.sh will kick off a MapReduce job to load data staged in HDFS into an HBase table. The following is an example of the syntax:

$METRON_HOME/bin/flatfile_loader.sh -i /tmp/top-10k.csv -t enrichment -c t -e ./extractor.json -m MR
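
Because this mode reads its input from HDFS, the file must be staged there first. A minimal sketch, assuming top-10k.csv is in the current local directory:

hdfs dfs -mkdir -p /tmp
hdfs dfs -put top-10k.csv /tmp/top-10k.csv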

The parameters for the utility are as follows:

Short Code | Long Code | Is Required? | Description
-h | | No | Generate the help screen/set of options
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source
-t | --hbase_table | Yes | The HBase table to import into
-c | --hbase_cf | Yes | The HBase table column family to import into
-i | --input | Yes | The input data location on local disk. If this is a file, then that file will be loaded. If this is a directory, then the files will be loaded recursively under that directory.
-l | --log4j | No | The log4j properties file to load
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
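
The loaders accept an optional enrichment configuration through -n/--enrichment_config. The following is a hedged sketch of such a document, assuming a bro sensor whose ip_dst_addr field should be mapped to the malicious_ip threat intelligence type loaded above; the sensor, field, and type names are illustrative:

{
  "zkQuorum" : "localhost:2181",
  "sensorToFieldList" : {
    "bro" : {
      "type" : "THREAT_INTEL",
      "fieldToEnrichmentTypes" : {
        "ip_dst_addr" : [ "malicious_ip" ]
      }
    }
  }
}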

Taxii Loader

The shell script $METRON_HOME/bin/threatintel_taxii_load.sh can be used to poll a Taxii server for STIX documents and ingest them into HBase. The Taxii loader is a stand-alone Java application that runs continuously, polling the server at the configured interval.

It is quite common for this Taxii server to be an aggregation server such as Soltra Edge.

In addition to the Enrichment and Extractor configs used by the other loaders, this loader requires a configuration file describing the connection information for the Taxii server. The following is an example of a configuration file:

{
  "endpoint" : "http://localhost:8282/taxii-discovery-service",
  "type" : "DISCOVER",
  "collection" : "guest.Abuse_ch",
  "table" : "threat_intel",
  "columnFamily" : "cf",
  "allowedIndicatorTypes" : [ "domainname:FQDN", "address:IPV_4_ADDR" ]
}

where:

  • endpoint: The URL of the endpoint

  • type: POLL or DISCOVER, depending on the endpoint

  • collection: The Taxii collection to ingest

  • table: The HBase table to import into

  • columnFamily: The HBase column family to import into

  • allowedIndicatorTypes: An array of acceptable threat intelligence types (see the "Enrichment Type Name" column of the STIX table above for the possibilities)

The parameters for the utility are as follows:

Short Code | Long Code | Is Required? | Description
-h | | No | Generate the help screen/set of options
-e | --extractor_config | Yes | JSON document describing the extractor for this input data source
-c | --taxii_connection_config | Yes | The JSON config file to configure the connection
-p | --time_between_polls | No | The time between polling the Taxii server in milliseconds (default: 1 hour)
-b | --begin_time | No | Start time to poll the Taxii server (all data from that point will be gathered in the first pull). The format for the date is yyyy-MM-dd HH:mm:ss
-l | --log4j | No | The log4j properties file to load
-n | --enrichment_config | No | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
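
Putting the pieces together, a typical invocation might look like the following, where connection.json is the connection configuration shown above and the begin time and polling interval (one hour, in milliseconds) are illustrative:

$METRON_HOME/bin/threatintel_taxii_load.sh -e ./extractor.json -c ./connection.json -b "2019-01-01 00:00:00" -p 3600000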