A SparkContext is the connection object to a Spark cluster and the entry point for creating and operating on RDDs.
You can obtain it from an existing SparkSession:
sc = spark.sparkContext
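
A minimal PySpark sketch, assuming pyspark is installed and using a hypothetical app name: it builds a local SparkSession, pulls the SparkContext from it, and parallelizes a small list into an RDD.

from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession running locally
spark = (
    SparkSession.builder
    .appName("rdd-example")   # hypothetical app name
    .master("local[*]")
    .getOrCreate()
)

# The SparkContext is exposed on the session
sc = spark.sparkContext

# Create an RDD from a local Python list and run a simple transformation + action
rdd = sc.parallelize([1, 2, 3, 4])
print(rdd.map(lambda x: x * 2).collect())  # [2, 4, 6, 8]

spark.stop()

Going through the SparkSession this way avoids constructing a second SparkContext by hand; only one active SparkContext is allowed per JVM.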