Administration

Bulk Loading Threat Intelligence Information

You can bulk load threat intelligence information from the following sources:

  • Taxii Loader

  • HDFS via MapReduce

  • Flat File Ingestion (CSV)

  • GeoLite2 Loader

Taxii Loader

The shell script $METRON_HOME/bin/threatintel_taxii_load.sh can be used to poll a Taxii server for STIX documents and ingest them into HBase.

It is quite common for this Taxii server to be an aggregation server such as Soltra Edge.

In addition to the Enrichment and Extractor configs described in the following sections, this loader requires a configuration file describing the connection information to the Taxii server. The following is an example of a configuration file:

{
   "endpoint" : "http://localhost:8282/taxii-discovery-service"
  ,"type" : "DISCOVER"
  ,"collection" : "guest.Abuse_ch"
  ,"table" : "threatintel"
  ,"columnFamily" : "t"
  ,"allowedIndicatorTypes" : [ "domainname:FQDN", "address:IPV_4_ADDR" ]
}

where:

endpoint

The URL of the endpoint

type

POLL or DISCOVER depending on the endpoint.

collection

The Taxii collection to ingest

table

The HBase table to import into

columnFamily

The column family to import into

allowedIndicatorTypes

An array of acceptable threat intel types (see the "Enrichment Type Name" column of the STIX table above for the possibilities).

The parameters for the utility are as follows:

Short Code | Long Code                 | Is Required? | Description
-h         |                           | No           | Generate the help screen/set of options
-e         | --extractor_config        | Yes          | JSON document describing the extractor for this input data source
-c         | --taxii_connection_config | Yes          | The JSON config file to configure the connection
-p         | --time_between_polls      | No           | The time between polling the Taxii server in milliseconds (default: 1 hour)
-b         | --begin_time              | No           | Start time to poll the Taxii server (all data from that point will be gathered in the first pull). The format for the date is yyyy-MM-dd HH:mm:ss
-l         | --log4j                   | No           | The Log4j properties to load
-n         | --enrichment_config       | No           | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
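
For reference, a minimal invocation might look like the following sketch. The file names connection.json and extractor.json are hypothetical placeholders for your own Taxii connection and Extractor configs, and the poll interval shown simply restates the one-hour default in milliseconds:

# poll the Taxii server described in connection.json once per hour
$METRON_HOME/bin/threatintel_taxii_load.sh \
  -c connection.json \
  -e extractor.json \
  -p 3600000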

HDFS via MapReduce

The shell script $METRON_HOME/bin/threatintel_bulk_load.sh will kick off a MapReduce job to load data staged in HDFS into an HBase table.

Note

Despite its name, this utility works for enrichment data as well as threat intel, because the underlying infrastructure is the same.

The parameters for the utility are as follows:

Short Code | Long Code           | Is Required? | Description
-h         |                     | No           | Generate the help screen/set of options
-e         | --extractor_config  | Yes          | JSON document describing the extractor for this input data source
-t         | --table             | Yes          | The HBase table to import into
-f         | --column_family     | Yes          | The HBase table column family to import into
-i         | --input             | Yes          | The input data location on HDFS
-n         | --enrichment_config | No           | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
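
As a sketch, the following loads data staged under a hypothetical HDFS directory /tmp/threatintel_staging into the threatintel table and t column family used in the Taxii example above; extractor.json is again a placeholder for your own Extractor config:

# kick off the MapReduce bulk load from HDFS into HBase
$METRON_HOME/bin/threatintel_bulk_load.sh \
  -e extractor.json \
  -t threatintel \
  -f t \
  -i /tmp/threatintel_staging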

CSV File

The shell script $METRON_HOME/bin/flatfile_loader.sh will read data from local disk and load the enrichment or threat intel data into an HBase table.

Note that the Extractor config supports one configuration parameter that is considered only by this loader:

inputFormatHandler

Specifies how the input data is presented to the Extractor. The two implementations are BY_LINE and org.apache.metron.dataloads.extractor.inputformat.WholeFileFormat.

The default is BY_LINE, which makes sense for a CSV file in which each line is a unit of information to import. However, if you are importing a set of STIX documents, you want each whole document to be passed to the Extractor.
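
As a rough sketch, assuming inputFormatHandler sits at the top level of the Extractor config next to the extractor type, a config for ingesting whole STIX documents might look like this; omit the property to keep the default BY_LINE behavior:

{
   "extractor" : "STIX"
  ,"inputFormatHandler" : "org.apache.metron.dataloads.extractor.inputformat.WholeFileFormat"
}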

The parameters for the utility are as follows:

Short Code | Long Code           | Is Required? | Description
-h         |                     | No           | Generate the help screen/set of options
-e         | --extractor_config  | Yes          | JSON document describing the extractor for this input data source
-t         | --hbase_table       | Yes          | The HBase table to import into
-c         | --hbase_cf          | Yes          | The HBase table column family to import into
-i         | --input             | Yes          | The input data location on local disk. If this is a file, then that file will be loaded. If this is a directory, then the files will be loaded recursively under that directory.
-l         | --log4j             | No           | The log4j properties file to load
-n         | --enrichment_config | No           | The JSON document describing the enrichments to configure. Unlike other loaders, this is run first if specified.
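
For example, a load of a hypothetical local CSV file threat_ips.csv into the same table and column family used earlier might look like the following; extractor.json is a placeholder for your own Extractor config:

# read the CSV from local disk and write it into HBase
$METRON_HOME/bin/flatfile_loader.sh \
  -e extractor.json \
  -t threatintel \
  -c t \
  -i threat_ips.csv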

GeoLite2 Loader

The shell script $METRON_HOME/bin/geo_enrichment_load.sh will retrieve MaxMind GeoLite2 data, load it into HDFS, and update the configuration.

Important

This script updates only the ZooKeeper configuration, not Ambari's global.json. Changes take effect immediately, but will not persist past an Ambari restart until the global.json file is updated as well.

The parameters for the utility are as follows:

Short Code | Long Code    | Is Required? | Description
-h         |              | No           | Generate the help screen/set of options
-g         | --geo_url    | No           | GeoIP URL - defaults to http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.mmdb.gz
-r         | --remote_dir | No           | HDFS directory to land formatted GeoIP file - defaults to /apps/metron/geo/<epoch millis>/
-t         | --tmp_dir    | No           | Directory for landing the temporary GeoIP data - defaults to /tmp
-z         | --zk_quorum  | Yes          | ZooKeeper Quorum URL (zk1:port,zk2:port,...)
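
Because only the ZooKeeper quorum is required, a minimal run can rely on the defaults listed above; node1:2181 is a placeholder for your own quorum:

# download GeoLite2 data, land it in HDFS, and update the ZooKeeper configuration
$METRON_HOME/bin/geo_enrichment_load.sh -z node1:2181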