Spark RDD - Spark Context (sc, sparkContext)

A SparkContext is the connection object used to create and operate on RDDs.



From a SparkSession:

sc = spark.sparkContext


Discover More

Jupyter - SparkMagic
Sparkmagic is a kernel that provides IPython magics for working with Spark clusters through Livy in Jupyter notebooks. Installation ...

RDD - Calling a Worker (Local|External) Process
How to call a (forked) external process from Spark. Example with a ...

RDD - Pipe
pipe is a transformation that returns an RDD created by piping elements to a forked external process. Example with a ...

Spark - Connection (Context)
A Spark connection is a context object (also known as a connection) and the first step when creating a script. This object is called an SQLContext for an RDD (in Spark 1.x) and a SparkSession for a ...
