Non-Ambari Cluster Installation Guide

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites:

Table 19.1. Spark Cluster Prerequisites

Cluster Stack Version

  • HDP 2.2.6 or later

(Optional) Ambari Version

  • Ambari 2.1 or later

Components

  • Spark requires HDFS and YARN.

  • PySpark requires Python to be installed on all nodes.

Note

If you installed the Spark tech preview, save any configuration changes you made to the tech preview environment. Install Spark, and then update the configuration with your changes.
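
As a convenience, the following sketch backs up a tech preview configuration directory before you reinstall. The /etc/spark/conf path is an assumption; adjust it to wherever the tech preview stored its configuration on your cluster.

#!/usr/bin/env python
# Minimal sketch: copy the existing Spark configuration aside so it can be
# reapplied after installing Spark. The source path below is an assumption.
import shutil
import time

conf_dir = "/etc/spark/conf"
backup_dir = "/tmp/spark-conf-backup-%d" % int(time.time())

shutil.copytree(conf_dir, backup_dir)
print("Saved %s to %s; reapply your changes after installing Spark." % (conf_dir, backup_dir))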