fix: update Spark plugin for compatibility with Spark 3.x and 4.x #3350
Why are the changes needed?
The current Spark plugin imports and uses Spark 4-specific data types that do not exist in Spark 3.x (including Spark 3.4).
This causes a runtime ImportError as soon as the plugin module is loaded under Spark 3.x.
Because Flyte users run Spark workloads across mixed versions (Spark 3.x or Spark 4.x), the plugin must not assume Spark 4 APIs exist at runtime.
Without this patch, Spark 3.x tasks fail immediately, even if their logic does not depend on Spark-4-only features.
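One common way to avoid assuming Spark 4 APIs at import time is to resolve version-specific types lazily. The sketch below is illustrative only (the helper name and structure are assumptions, not the actual plugin code): it returns a type if the running Spark installation provides it, and None otherwise, so the module still imports cleanly on Spark 3.x.

```python
import importlib


def optional_type(module_name: str, type_name: str):
    """Return a class from a module if it exists, else None.

    Lets plugin code reference Spark-4-only data types (e.g. types
    added to pyspark.sql.types in 4.0) without raising ImportError
    on Spark 3.x. Hypothetical helper, not the plugin's actual API.
    """
    try:
        module = importlib.import_module(module_name)
        return getattr(module, type_name)
    except (ImportError, AttributeError):
        return None
```

Call sites can then check for None before registering the type in the schema transformer, instead of failing at module load.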
What changes were proposed in this pull request?
How was this patch tested?
Tested PySpark locally with Spark 3.4 and Spark 4.x.
Built a Flyte sandbox image with the updated transformer and schema.
Ran Spark tasks in Flyte using:
Spark 3.4 base image → passed
Spark 4.x base image → passed
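Running the same task code against both base images requires branching on the runtime Spark version somewhere. A minimal helper along these lines (hypothetical, not taken from the PR) can gate Spark-4-only code paths:

```python
def spark_major_version(version: str) -> int:
    """Parse the major component of a Spark version string.

    In a real task this would typically be fed pyspark.__version__;
    the helper itself is just string parsing, shown here so it is
    self-contained.
    """
    return int(version.split(".", 1)[0])


# Example gate (assumed usage pattern):
# if spark_major_version(pyspark.__version__) >= 4:
#     ... use Spark-4-only data types ...
```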