Spark is agnostic to the underlying cluster manager, so the installation steps depend on which cluster manager you choose.
See Spark - Configuration
To enable HDFS access, set HADOOP_CONF_DIR in SPARK_HOME/conf/spark-env.sh to the directory containing the Hadoop configuration files (core-site.xml and hdfs-site.xml).
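A minimal sketch of the spark-env.sh entry, assuming a typical Hadoop configuration location of /etc/hadoop/conf (adjust the path to your installation):

```shell
# SPARK_HOME/conf/spark-env.sh
# Point Spark at the directory holding core-site.xml and hdfs-site.xml
# so it can resolve hdfs:// URIs. /etc/hadoop/conf is an assumed path.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

After restarting the Spark application or shell, paths such as `hdfs:///user/data` will resolve against the HDFS cluster described by those configuration files.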