flytekitplugins.spark.Spark#
- class flytekitplugins.spark.Spark(spark_conf=None, hadoop_conf=None, executor_path=None, applications_path=None, driver_pod=None, executor_pod=None)#
Use this to configure a SparkContext for your task. Tasks marked with this config will automatically execute natively on Kubernetes as a distributed Spark job (see the usage sketch after the parameter list).
- Parameters:
spark_conf (Dict[str, str] | None)
hadoop_conf (Dict[str, str] | None)
executor_path (str | None)
applications_path (str | None)
driver_pod (PodTemplate | None)
executor_pod (PodTemplate | None)
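A minimal usage sketch, assuming the flytekitplugins-spark plugin is installed and the target cluster has the Spark backend enabled. The spark_conf keys and the estimate_pi task name are illustrative, not required values:

```python
import flytekit
from flytekit import task
from flytekitplugins.spark import Spark


@task(
    task_config=Spark(
        # Illustrative Spark settings; tune these for your cluster.
        spark_conf={
            "spark.driver.memory": "1000M",
            "spark.executor.memory": "1000M",
            "spark.executor.cores": "1",
            "spark.executor.instances": "2",
        },
    ),
)
def estimate_pi(partitions: int) -> float:
    # The plugin injects a SparkSession into the task's execution context.
    sess = flytekit.current_context().spark_session
    n = 100000 * partitions

    def inside(_) -> int:
        import random

        x, y = random.random(), random.random()
        return 1 if x * x + y * y <= 1 else 0

    count = (
        sess.sparkContext.parallelize(range(n), partitions)
        .map(inside)
        .reduce(lambda a, b: a + b)
    )
    return 4.0 * count / n
```

When this task runs on Kubernetes, the configured values are passed through to the Spark driver and executors rather than being applied to a locally created SparkContext.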
- driver_pod#
The pod template for the Spark driver pod.
- Type:
Optional[PodTemplate]
- executor_pod#
The pod template for the Spark executor pod.
- Type:
Optional[PodTemplate]
Methods
Attributes
- driver_pod: PodTemplate | None = None
- executor_pod: PodTemplate | None = None
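The driver_pod and executor_pod fields accept a flytekit PodTemplate, which can carry pod-level settings (labels, container resources, and so on) that spark_conf does not cover. A hedged sketch, assuming the kubernetes Python client is available; the label, memory limit, and task name are placeholder values:

```python
from flytekit import PodTemplate, task
from flytekitplugins.spark import Spark
from kubernetes.client import V1Container, V1PodSpec, V1ResourceRequirements

# Hypothetical pod template for the Spark driver pod: adds a label and a
# memory limit on the primary container.
driver_template = PodTemplate(
    labels={"team": "data-platform"},  # illustrative label
    pod_spec=V1PodSpec(
        containers=[
            V1Container(
                name="primary",
                resources=V1ResourceRequirements(limits={"memory": "2Gi"}),
            )
        ]
    ),
)


@task(task_config=Spark(driver_pod=driver_template))
def my_spark_task() -> None:
    ...
```

The same pattern applies to executor_pod for customizing the executor pods.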