[SPARK-40417][K8S][DOCS] Use YuniKorn v1.1+ #37872
Closed
Conversation
dongjoon-hyun (Member, Author):
Could you review this, @viirya?
viirya approved these changes on Sep 13, 2022.
viirya (Member):
lgtm
dongjoon-hyun (Member, Author):
Thank you, @viirya. Merged to master/3.3.
dongjoon-hyun added a commit that referenced this pull request on Sep 13, 2022:
### What changes were proposed in this pull request?

This PR aims to update the K8s documentation to declare support for YuniKorn v1.1+.

### Why are the changes needed?

YuniKorn 1.1.0 resolves 87 JIRAs and is the first release to officially support multi-arch images.

- https://yunikorn.apache.org/release-announce/1.1.0

```
$ docker inspect apache/yunikorn:scheduler-1.0.0 | grep Architecture
    "Architecture": "amd64",

$ docker inspect apache/yunikorn:scheduler-1.1.0 | grep Architecture
    "Architecture": "arm64",
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manually tested with Apache YuniKorn v1.1.0+.

```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
  -Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test" \
  -Dtest.exclude.tags=minikube,local,decom \
  -Dtest.default.exclude.tags=
...
[info] KubernetesSuite:
[info] - Run SparkPi with no resources (11 seconds, 238 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 58 milliseconds)
[info] - Run SparkPi with a very long application name. (9 seconds, 948 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (9 seconds, 884 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (9 seconds, 834 milliseconds)
[info] - Run SparkPi with an argument. (9 seconds, 870 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (9 seconds, 887 milliseconds)
[info] - All pods have the same service account by default (9 seconds, 891 milliseconds)
[info] - Run extraJVMOptions check on driver (5 seconds, 888 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (10 seconds, 261 milliseconds)
[info] - Run SparkPi with env and mount secrets. (18 seconds, 702 milliseconds)
[info] - Run PySpark on simple pi.py example (10 seconds, 944 milliseconds)
[info] - Run PySpark to test a pyfiles example (13 seconds, 934 milliseconds)
[info] - Run PySpark with memory customization (10 seconds, 853 milliseconds)
[info] - Run in client mode. (11 seconds, 301 milliseconds)
[info] - Start pod creation from template (9 seconds, 853 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (9 seconds, 923 milliseconds)
[info] - Run SparkR on simple dataframe.R example (13 seconds, 929 milliseconds)
[info] YuniKornSuite:
[info] - Run SparkPi with no resources (9 seconds, 769 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (9 seconds, 776 milliseconds)
[info] - Run SparkPi with a very long application name. (9 seconds, 856 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (9 seconds, 803 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (10 seconds, 783 milliseconds)
[info] - Run SparkPi with an argument. (10 seconds, 771 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (9 seconds, 868 milliseconds)
[info] - All pods have the same service account by default (10 seconds, 811 milliseconds)
[info] - Run extraJVMOptions check on driver (6 seconds, 858 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (11 seconds, 171 milliseconds)
[info] - Run SparkPi with env and mount secrets. (18 seconds, 221 milliseconds)
[info] - Run PySpark on simple pi.py example (11 seconds, 970 milliseconds)
[info] - Run PySpark to test a pyfiles example (13 seconds, 990 milliseconds)
[info] - Run PySpark with memory customization (11 seconds, 992 milliseconds)
[info] - Run in client mode. (11 seconds, 294 milliseconds)
[info] - Start pod creation from template (11 seconds, 10 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (9 seconds, 956 milliseconds)
[info] - Run SparkR on simple dataframe.R example (12 seconds, 992 milliseconds)
[info] Run completed in 10 minutes, 15 seconds.
[info] Total number of tests run: 36
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 36, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 751 s (12:31), completed Sep 13, 2022, 11:47:24 AM
```

Closes #37872 from dongjoon-hyun/SPARK-40417.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit a934cab)
Signed-off-by: Dongjoon Hyun <[email protected]>
What changes were proposed in this pull request?
This PR aims to update the K8s documentation to declare support for YuniKorn v1.1+.
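For context, the page touched here documents running Spark with YuniKorn as a custom K8s scheduler. A rough sketch of that workflow is below; the Helm chart options and the `queue` label are illustrative assumptions, not part of this PR, so the published `running-on-kubernetes` page should be treated as the source of truth.

```
# Install the YuniKorn scheduler (v1.1.0, multi-arch) via Helm.
helm repo add yunikorn https://apache.github.io/yunikorn-release
helm repo update
helm install yunikorn yunikorn/yunikorn \
  --namespace yunikorn --version 1.1.0 --create-namespace

# Submit a Spark application whose pods are scheduled by YuniKorn.
# spark.kubernetes.scheduler.name is available since Spark 3.3.0; the
# "queue" label value below is only an illustrative default.
bin/spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kubernetes.scheduler.name=yunikorn \
  --conf spark.kubernetes.driver.label.queue=root.default \
  --conf spark.kubernetes.executor.label.queue=root.default \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar 1000
```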
Why are the changes needed?
YuniKorn 1.1.0 resolves 87 JIRAs and is the first release to officially support multi-arch images.
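- https://yunikorn.apache.org/release-announce/1.1.0

The multi-arch change is visible on the published scheduler images (from the commit message above):

```
$ docker inspect apache/yunikorn:scheduler-1.0.0 | grep Architecture
    "Architecture": "amd64",

$ docker inspect apache/yunikorn:scheduler-1.1.0 | grep Architecture
    "Architecture": "arm64",
```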
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Manually tested with Apache YuniKorn v1.1.0+.
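The K8s integration tests were run against Docker Desktop; all 36 tests across `KubernetesSuite` and `YuniKornSuite` passed (full log in the commit message above):

```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
  -Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test" \
  -Dtest.exclude.tags=minikube,local,decom \
  -Dtest.default.exclude.tags=
```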