unify SparkApplication defaulting logic #2671
Conversation
Verification Report:

```shell
$ kubectl get pods --namespace spark-operator
NAME                                         READY   STATUS    RESTARTS   AGE
spark-operator-controller-74b96777cd-jwcnn   1/1     Running   0          45m
spark-operator-webhook-996c9c77b-9mmpg      1/1     Running   0          45m

$ kubectl apply -f - <<EOF
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: docker.io/library/spark:4.0.0
  imagePullPolicy: IfNotPresent
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples.jar
  arguments:
  - "5000"
  sparkVersion: 4.0.0
  driver:
    labels:
      version: 4.0.0
    memory: 512m
    memoryOverhead: 100m
    serviceAccount: spark-operator-spark
    securityContext:
      capabilities:
        drop:
        - ALL
      runAsGroup: 185
      runAsUser: 185
      runAsNonRoot: true
      allowPrivilegeEscalation: false
      seccompProfile:
        type: RuntimeDefault
  executor:
    labels:
      version: 4.0.0
    cores: 1
    memory: 512m
    securityContext:
      capabilities:
        drop:
        - ALL
      runAsGroup: 185
      runAsUser: 185
      runAsNonRoot: true
      allowPrivilegeEscalation: false
      seccompProfile:
        type: RuntimeDefault
EOF

$ kubectl get sparkapplications.sparkoperator.k8s.io spark-pi -oyaml | grep instance
  instances: 1
```
The current CI workflow is broken; we will fix it and then review this PR again.
Signed-off-by: zhzhuang-zju <[email protected]>
/ok-to-test
@zhzhuang-zju Thank you for your contribution to improving the consistency of the defaulting logic.
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: ChenYi015

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Details: Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing
Purpose of this PR
This PR consolidates the SparkApplication defaulting logic that was previously maintained in two separate places (API declaration level and webhook-specific defaults) into a unified approach using Kubernetes SchemeBuilder pattern.
Proposed changes:
- Moved the SparkApplication defaulting functions to the API package (`api/v1beta2/defaults.go`)
- Registered them via `runtime.NewSchemeBuilder(addKnownTypes, addDefaultingFuncs)`
- The webhook now calls `operatorscheme.WebhookScheme.Default(app)` for consistent default application

Change Category
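To make the pattern concrete, here is a minimal, self-contained sketch of scheme-based defaulting. The real operator registers its defaulting functions on a `k8s.io/apimachinery` `runtime.Scheme`; the `miniScheme` type and helper names below are illustrative stand-ins, not the operator's actual API.

```go
package main

import "fmt"

// SparkApp is a heavily reduced stand-in for the v1beta2 SparkApplication.
type SparkApp struct {
	ExecutorInstances *int32
}

// miniScheme maps a kind name to its registered defaulting function,
// mimicking what runtime.NewSchemeBuilder(addKnownTypes, addDefaultingFuncs)
// wires up on a real scheme.
type miniScheme struct {
	defaulters map[string]func(obj any)
}

func newMiniScheme() *miniScheme {
	return &miniScheme{defaulters: map[string]func(obj any){}}
}

// addDefaultingFunc registers the single source of truth for a kind's defaults.
func (s *miniScheme) addDefaultingFunc(kind string, fn func(obj any)) {
	s.defaulters[kind] = fn
}

// Default applies the registered defaulting function for the kind,
// analogous to the webhook calling WebhookScheme.Default(app).
func (s *miniScheme) Default(kind string, obj any) {
	if fn, ok := s.defaulters[kind]; ok {
		fn(obj)
	}
}

func main() {
	scheme := newMiniScheme()
	scheme.addDefaultingFunc("SparkApplication", func(obj any) {
		app := obj.(*SparkApp)
		if app.ExecutorInstances == nil {
			one := int32(1) // matches the "instances: 1" seen in the report
			app.ExecutorInstances = &one
		}
	})

	app := &SparkApp{}
	scheme.Default("SparkApplication", app)
	fmt.Println("executor instances:", *app.ExecutorInstances)
}
```

Because both the webhook and any other caller go through the same `Default` entry point, the defaults cannot drift between code paths.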
Rationale
Previously, SparkApplication default values were maintained in two different locations, which created potential inconsistencies and maintenance overhead. By consolidating the defaulting logic into the SchemeBuilder pattern, defaults are defined in a single place and applied consistently wherever the scheme is used.
Checklist
Additional Notes
This change maintains backward compatibility while improving the internal architecture. The webhook behavior remains the same from an external perspective.