About
A SparkContext is the connection object to a Spark cluster; it is the entry point used to create and operate on RDDs.
Articles Related
Spark - Connection (Context)
RDD - Pipe
Jupyter - SparkMagic
RDD - Calling a Worker (Local|External) Process
Management
Get
from a SparkSession
sc = spark.sparkContext
API
pyspark.SparkContext