
dongjoon-hyun (Member) commented Sep 13, 2022

### What changes were proposed in this pull request?

This PR aims to update the K8s documentation to declare support for Apache YuniKorn v1.1.0+.
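
For context, running Spark on K8s with YuniKorn as the scheduler is configured roughly as in the sketch below; the configuration keys follow the running-on-kubernetes guide, while the queue label value and the trailing application arguments are placeholders, not part of this PR.

```
$ spark-submit \
    --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
    --deploy-mode cluster \
    --conf spark.kubernetes.scheduler.name=yunikorn \
    --conf spark.kubernetes.driver.label.queue=root.default \
    --conf spark.kubernetes.executor.label.queue=root.default \
    --conf spark.kubernetes.driver.annotation.yunikorn.apache.org/app-id={{APP_ID}} \
    --conf spark.kubernetes.executor.annotation.yunikorn.apache.org/app-id={{APP_ID}} \
    ...
```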

### Why are the changes needed?

YuniKorn 1.1.0 resolves 87 JIRAs and is the first release to officially support multi-arch images.
- https://yunikorn.apache.org/release-announce/1.1.0

```
$ docker inspect apache/yunikorn:scheduler-1.0.0 | grep Architecture
        "Architecture": "amd64",
$ docker inspect apache/yunikorn:scheduler-1.1.0 | grep Architecture
        "Architecture": "arm64",
```

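Note that `docker inspect` only reports the architecture of the image variant pulled onto the local machine; to list every platform published in the multi-arch manifest, something like the following can be used (a sketch, assuming a Docker CLI with `docker manifest` support):

```
$ docker manifest inspect apache/yunikorn:scheduler-1.1.0 | grep -A 2 '"platform"'
```
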
### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manually tested with Apache YuniKorn v1.1.0+.

```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
-Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test" \
-Dtest.exclude.tags=minikube,local,decom \
-Dtest.default.exclude.tags=
...
[info] KubernetesSuite:
[info] - Run SparkPi with no resources (11 seconds, 238 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 58 milliseconds)
[info] - Run SparkPi with a very long application name. (9 seconds, 948 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (9 seconds, 884 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (9 seconds, 834 milliseconds)
[info] - Run SparkPi with an argument. (9 seconds, 870 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (9 seconds, 887 milliseconds)
[info] - All pods have the same service account by default (9 seconds, 891 milliseconds)
[info] - Run extraJVMOptions check on driver (5 seconds, 888 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (10 seconds, 261 milliseconds)
[info] - Run SparkPi with env and mount secrets. (18 seconds, 702 milliseconds)
[info] - Run PySpark on simple pi.py example (10 seconds, 944 milliseconds)
[info] - Run PySpark to test a pyfiles example (13 seconds, 934 milliseconds)
[info] - Run PySpark with memory customization (10 seconds, 853 milliseconds)
[info] - Run in client mode. (11 seconds, 301 milliseconds)
[info] - Start pod creation from template (9 seconds, 853 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (9 seconds, 923 milliseconds)
[info] - Run SparkR on simple dataframe.R example (13 seconds, 929 milliseconds)
[info] YuniKornSuite:
[info] - Run SparkPi with no resources (9 seconds, 769 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (9 seconds, 776 milliseconds)
[info] - Run SparkPi with a very long application name. (9 seconds, 856 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (9 seconds, 803 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (10 seconds, 783 milliseconds)
[info] - Run SparkPi with an argument. (10 seconds, 771 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (9 seconds, 868 milliseconds)
[info] - All pods have the same service account by default (10 seconds, 811 milliseconds)
[info] - Run extraJVMOptions check on driver (6 seconds, 858 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (11 seconds, 171 milliseconds)
[info] - Run SparkPi with env and mount secrets. (18 seconds, 221 milliseconds)
[info] - Run PySpark on simple pi.py example (11 seconds, 970 milliseconds)
[info] - Run PySpark to test a pyfiles example (13 seconds, 990 milliseconds)
[info] - Run PySpark with memory customization (11 seconds, 992 milliseconds)
[info] - Run in client mode. (11 seconds, 294 milliseconds)
[info] - Start pod creation from template (11 seconds, 10 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (9 seconds, 956 milliseconds)
[info] - Run SparkR on simple dataframe.R example (12 seconds, 992 milliseconds)
[info] Run completed in 10 minutes, 15 seconds.
[info] Total number of tests run: 36
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 36, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 751 s (12:31), completed Sep 13, 2022, 11:47:24 AM
```
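
For reference, the YuniKornSuite above requires a YuniKorn scheduler already installed on the cluster; the documentation touched by this PR installs it via Helm along the following lines (a sketch based on the doc snippet; the namespace and `--set embedAdmissionController=false` option should be checked against running-on-kubernetes.md):

```
$ helm repo add yunikorn https://apache.github.io/yunikorn-release
$ helm repo update
$ helm install yunikorn yunikorn/yunikorn --namespace yunikorn \
    --version 1.1.0 --create-namespace --set embedAdmissionController=false
```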

github-actions bot added the DOCS label Sep 13, 2022
dongjoon-hyun (Member, Author) commented:

Could you review this, @viirya?

viirya (Member) commented Sep 13, 2022

lgtm

dongjoon-hyun (Member, Author) commented:

Thank you, @viirya. Merged to master/3.3.

dongjoon-hyun added a commit that referenced this pull request Sep 13, 2022

Closes #37872 from dongjoon-hyun/SPARK-40417.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit a934cab)
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun deleted the SPARK-40417 branch September 13, 2022 19:53