About

Spark is agnostic to the underlying cluster manager.

The installation procedure therefore depends on which cluster manager is used.
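Because Spark abstracts the cluster manager away, an application is submitted the same way everywhere; only the --master URL changes. A sketch with spark-submit (hostnames, ports, and the app.py script are placeholders, not from this document):

```shell
# Same entry point for every cluster manager; only --master differs.
spark-submit --master local[4] app.py                  # local mode, 4 threads
spark-submit --master spark://host:7077 app.py         # Spark standalone cluster
spark-submit --master yarn app.py                      # Hadoop YARN
spark-submit --master k8s://https://host:6443 app.py   # Kubernetes
```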

Installation Types / Cluster Managers

Docker

Configuration

See Spark - Configuration

HDFS

To enable HDFS support, set HADOOP_CONF_DIR in SPARK_HOME/conf/spark-env.sh to the directory containing Hadoop's client configuration files (e.g. core-site.xml and hdfs-site.xml).
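A minimal sketch of the spark-env.sh entry; the /etc/hadoop/conf path is an assumed example and should be adjusted to the local Hadoop installation:

```shell
# In $SPARK_HOME/conf/spark-env.sh
# (copy conf/spark-env.sh.template to conf/spark-env.sh if it does not exist)

# Point Spark at the Hadoop client configuration so it can resolve
# hdfs:// URIs (reads core-site.xml and hdfs-site.xml from this directory).
export HADOOP_CONF_DIR=/etc/hadoop/conf   # example path; adjust to your install
```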