In a Spark distribution, the version is recorded in the file “spark-2.2.0-bin-hadoop2.7\python\pyspark\version.py”.
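As a minimal sketch of how such a version file can be read programmatically: the snippet below parses a sample line of the form found in pyspark/version.py (the sample string here is an assumption for illustration, not the file's verbatim contents).

```python
import re

# Hypothetical sample of the kind of line found in pyspark/version.py.
version_py_contents = '__version__ = "2.2.0"'

# Extract the quoted version string from the assignment.
match = re.search(r'__version__\s*=\s*"([^"]+)"', version_py_contents)
if match:
    print(match.group(1))  # → 2.2.0
```

In practice, an installed PySpark exposes the same value as `pyspark.__version__`.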
Dependencies
At its core, PySpark depends only on Py4J, but some sub-packages have additional requirements for certain features, including NumPy, pandas, and PyArrow.
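A sketch of how an environment can be checked for these optional extras before relying on the features they enable (the `available` helper is hypothetical, written here for illustration):

```python
import importlib.util

def available(name: str) -> bool:
    """Return True if the named package can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# Core dependency plus the optional extras mentioned above.
for pkg in ["py4j", "numpy", "pandas", "pyarrow"]:
    status = "available" if available(pkg) else "missing"
    print(f"{pkg}: {status}")
```

This avoids a hard ImportError at startup: a missing optional package simply means the corresponding PySpark features are unavailable.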