How to add a jar file when executing a PySpark script.
When pyspark starts, the directories on its classpath (see the SPARK_CLASSPATH page below) are visible to the JVM. Adding your jar to one of them makes it available.
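For example, assuming a standard Spark 2.x+ installation where $SPARK_HOME/jars is on the classpath (my-udfs.jar is a placeholder name):

cp my-udfs.jar $SPARK_HOME/jars/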
With the pyspark client and the --jars option:
pyspark --jars file1.jar,file2.jar
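The same option works with spark-submit when running a script non-interactively (my_script.py is a placeholder):

spark-submit --jars file1.jar,file2.jar my_script.py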
Programmatically, set the spark.jars property on the SparkConf; it is only read at startup, so it must be set before the SparkContext is created:

from pyspark import SparkConf, SparkContext

conf = SparkConf().set("spark.jars", "/path/to.jar")
sc = SparkContext(conf=conf)
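On Spark 2.x and later, the same setting can be passed through the SparkSession builder; a minimal sketch, with the jar path as a placeholder:

from pyspark.sql import SparkSession

# spark.jars must be configured before the session (and its
# underlying SparkContext) is created
spark = SparkSession.builder \
    .config("spark.jars", "/path/to.jar") \
    .getOrCreate()
sc = spark.sparkContext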
See Spark - Classpath (SPARK_CLASSPATH)