Release Notes

Chapter 1. Hortonworks DataFlow 3.0.3 Release Notes

This document provides you with the latest information about the HDF 3.0.3 release and its product documentation.

Component Support

HDF 3.0.3 includes the following components:

  • Apache Ambari 2.6.0

  • Apache Kafka

  • Apache NiFi 1.2.0

  • Apache Ranger 0.7.0

  • Apache Storm 1.1.0

  • Apache ZooKeeper 3.4.6

  • Hortonworks Schema Registry 0.3.0

  • Hortonworks Streaming Analytics Manager 0.5.0

What's New in HDF 3.0.3

HDF 3.0.3 is a maintenance release that certifies HDF for use on IBM Power Systems.


You cannot install HDF 3.0.3 on systems other than IBM Power.

For complete information about the HDF 3.0.x release line, see the Release Notes for each HDF 3.0.x release.

Unsupported Features

Some features exist within HDF 3.0.3, but Hortonworks does not currently support these capabilities.

Technical Preview Features

The following features are available within HDF 3.0.3 but are not ready for production deployment. Hortonworks encourages you to explore these technical preview features in non-production environments and provide feedback on your experiences through the Hortonworks Community Forums.

Table 1.1. Technical Previews

NiFi

  • Knox proxy integration

Streaming Analytics Manager
  • SAM Stream Insights module (Druid and Apache Superset)

  • Sinks

    • Cassandra

    • OpenTSDB

    • Solr

  • Sources

    • HDFS

Community Driven Features

The following features are developed and tested by the Hortonworks community but are not officially supported by Hortonworks. These features are excluded for a variety of reasons, including insufficient reliability or incomplete test case coverage, declaration of non-production readiness by the community at large, and feature deviation from Hortonworks best practices. Do not use these features in your production environments.

Community Driven Kafka features

  • Kafka Connect

  • Kafka Streams

Community Driven NiFi Tools and Services

  • Embedded ZooKeeper

  • Sensitive key migration toolkit

  • Docker image for Apache NiFi

Community Driven NiFi Processors

  • AttributeRollingWindow

  • AWSCredentialsProviderControllerService

  • CompareFuzzyHash

  • ConsumeEWS

  • ConsumeIMAP

  • ConsumePOP3

  • ConvertExcelToCSVProcessor

  • DebugFlow

  • DeleteDynamoDB

  • DeleteGCSObject

  • DeleteHDFS

  • ExtractCCDAAttributes

  • ExtractEmailAttachments

  • ExtractEmailHeaders

  • ExtractMediaMetadata

  • ExtractTNEFAttachments

  • FetchAzureBlobStorage

  • FetchGCSObject

  • FuzzyHashContent

  • GetDynamoDB

  • GetHDFSEvents

  • GetSNMP

  • ISPEnrichIP

  • InferAvroSchema

  • ListenBeats

  • ListenLumberjack

  • ListenSMTP

  • ListAzureBlobStorage

  • ListGCSBucket

  • ListS3

  • ModifyBytes

  • OrcFormatConversion

  • PutAzureBlobStorage

  • PutDynamoDB

  • PutGCSObject

  • PutIgniteCache

  • PutKinesisFirehose

  • PutKinesisStream

  • PutLambda

  • PutSlack

  • PutTCP

  • PutUDP

  • QueryDNS

  • SetSNMP

  • SpringContextProcessor

  • StoreInKiteDataset

Community Driven NiFi Controller Services

  • AWSCredentialsProviderControllerService

  • GCPCredentialsControllerService

Community Driven NiFi Reporting Tasks

  • SiteToSiteBulletinReportingTask

  • SiteToSiteStatusReportingTask

  • DataDogReportingTask

Unsupported Customizations

Hortonworks cannot guarantee that default NiFi processors are compatible with proprietary protocol implementations or proprietary interface extensions. For example, we support interfaces like JMS and JDBC that are built around standards, specifications, or open protocols. But we do not support customizations of those interfaces, or proprietary extensions built on top of those interfaces.

HDF Repository Locations

Use the following table to identify the HDF 3.0.3 repository location for your operating system and operational objectives. HDF 3.0.3 supports the following operating systems:

Common Vulnerabilities and Exposures

The following CVEs have been fixed in HDF 3.0.3.


Summary: Apache NiFi XXE issue in template XML upload
Severity: Medium
Versions Affected: Apache NiFi 1.0.0 - 1.3.0; HDF 2.x, 3.0.0 -
Impact: An authorized user could upload a template that contained malicious code and accessed sensitive files via an XML External Entity (XXE) attack.
Recommended Action: The fix to properly handle XML External Entities was applied on the Apache NiFi 1.4.0 release. Users running a prior 1.x release should upgrade to the appropriate release. HDF users should upgrade to HDF 3.0.2.
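The 1.4.0 fix properly handles external entities during template parsing. Purely as an illustration (this is not NiFi's actual code, and the template structure shown is hypothetical), the following Python sketch shows how a parser that refuses to resolve external entities neutralizes such a template:

```python
import xml.etree.ElementTree as ET

# Hypothetical template payload carrying an XML External Entity (XXE)
# that tries to read a sensitive local file if the parser resolves
# external entities.
malicious_template = """<?xml version="1.0"?>
<!DOCTYPE template [
  <!ENTITY xxe SYSTEM "file:///etc/passwd">
]>
<template><name>&xxe;</name></template>"""

def parse_template(xml_text):
    """Parse a template, rejecting documents that reference entities.

    Python's ElementTree never fetches external entities; it raises
    ParseError when the unexpanded entity occurs, so the XXE payload
    is rejected instead of leaking file contents.
    """
    try:
        return ET.fromstring(xml_text)
    except ET.ParseError:
        return None  # treat entity-bearing/unparseable templates as invalid

rejected = parse_template(malicious_template) is None
accepted = parse_template("<template><name>ok</name></template>") is not None
```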

Known Issues

BUG-87353 (SAM/Kafka/Ambari)

Issue: If you are upgrading from HDF 3.0.0, SAM may be unable to configure Kafka sources.

Result: SAM displays the following error message:

Output stream fields cannot be blank

Workaround: To work around this issue, manually upgrade the Kafka bundle using the following steps:

1. Fetch the Kafka source bundle:

   curl -i --negotiate -u:anyUser -b /tmp/cookiejar.txt -c /tmp/cookiejar.txt -sS -i HOST:8080/api/v1/catalog/streams/componentbundles/SOURCE?subType=KAFKA

2. Get the value of "Id" from the response to the above request.

3. Check that /usr/hdf/current/streamline/bootstrap/components/sources/kafka-source-topology-component.json has the field "readerSchemaVersion".

4. Upload the bundle again, appending the ID from step 2 to the URL:

   curl -i --negotiate -u:anyUser -b /tmp/cookiejar.txt -c /tmp/cookiejar.txt -sS -X PUT -i -F topologyComponentBundle=@/usr/hdf/current/streamline/bootstrap/components/sources/kafka-source-topology-component.json HOST:8080/api/v1/catalog/streams/componentbundles/SOURCE/{ID from the earlier curl command}
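The "Id" lookup in step 2 can be scripted. The sketch below is an assumption-laden illustration only: the exact shape of the catalog response (an "entities" array whose items carry "id" and "subType" fields) is a guess, so verify it against the actual output from your cluster before relying on it.

```python
import json

def kafka_bundle_id(catalog_response):
    """Pull the Kafka source bundle id out of the JSON returned by
    /api/v1/catalog/streams/componentbundles/SOURCE?subType=KAFKA.

    Assumes the catalog wraps bundles in an "entities" list; adjust
    the field names if your response differs.
    """
    doc = json.loads(catalog_response)
    entities = doc["entities"] if isinstance(doc, dict) and "entities" in doc else [doc]
    for bundle in entities:
        if bundle.get("subType") == "KAFKA":
            return bundle["id"]
    raise KeyError("no KAFKA source bundle in response")

# Hypothetical example of the response shape:
sample = '{"entities": [{"id": 7, "name": "Kafka", "subType": "KAFKA"}]}'
bundle_id = kafka_bundle_id(sample)
```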

Issue: NiFi/Knox integration does not support HA.

In NiFiHaDispatch, the executeRequest method is overridden but does not include the try/catch block that DefaultHaDispatch's executeRequest method uses to catch exceptions and begin the failover process.

Workaround: There is no workaround for this issue.
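For context, the missing pattern looks roughly like the following. This is a minimal Python sketch of dispatch-with-failover, not the actual Knox Java implementation; all names here are invented for illustration.

```python
class AllServersFailed(Exception):
    """Raised when every configured backend rejects the request."""

def execute_with_failover(send, urls):
    # Try each backend in turn; an exception from one backend triggers
    # failover to the next instead of aborting the whole request. This is
    # the try/catch-and-retry structure the issue says NiFiHaDispatch lacks.
    for url in urls:
        try:
            return send(url)
        except OSError:
            continue  # this backend is down; fail over to the next
    raise AllServersFailed("no reachable backend")

# Usage with a stand-in transport: the first node is down, the second answers.
def fake_send(url):
    if "nifi-1" in url:
        raise OSError("connection refused")
    return "ok from " + url

resp = execute_with_failover(fake_send, ["https://nifi-1:9091", "https://nifi-2:9091"])
```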




Summary: Solr bolt does not run in a Kerberos environment.

Associated error message: The following is an example:

[ERROR] Request to collection hadoop_logs failed due to (401) org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http:[...] Error 401 Authentication required

Workaround: None at this time.


Summary: In a Kerberized environment, SAM topologies do not write data to HDFS.

Workaround: There is no workaround for this issue.

Third-Party Licenses

HDF 3.0.3 ships with numerous third-party dependencies, all of which are licensed under terms compatible with the Apache software license. For complete third-party license information, see the licenses and notice files contained within the distribution.