Non-Ambari Cluster Installation Guide

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites.

Table 19.1. Prerequisites for running Spark 1.5.2

Cluster stack version
  • HDP 2.3.4 or later

(Optional) Ambari version
  • 2.2 or later

Software dependencies
  • Spark requires HDFS and YARN.

  • PySpark requires Python to be installed on all nodes.


Note

HDP 2.3.4 supports several versions of Apache Spark. When you install HDP 2.3.4, Spark 1.5.2 is installed. If you prefer to use an earlier version of Spark, follow the Spark Manual Downgrade procedure in the Release Notes.