[DCOS-38138] Update Spark CLI for shell-escape fix #388
```diff
@@ -2,7 +2,7 @@
   "spark_version": "2.2.1",
   "default_spark_dist": {
     "hadoop_version": "2.7",
-    "uri": "https://svt-dev.s3.amazonaws.com/spark/spark-2.2.1-bin-2.7.3-dcos-38138.tgz"
+    "uri": "svt-dev.s3.amazonaws.com/spark/spark-2.2.1-bin-2.7.3-dcos-38138.tgz"
   },
   "spark_dist": [
     {
```
```diff
@@ -13,4 +13,4 @@ object MultiConfs {
     val conf = new SparkConf().setAppName(APPNAME)
     conf.getAll.foreach(println)
   }
 }
```

**Contributor:** One additional check that tests the whole thing end-to-end: pass in

**Author:** Good idea, I'll update.
```diff
@@ -108,11 +108,12 @@ def test_sparkPi(service_name=utils.SPARK_SERVICE_NAME):
 @pytest.mark.sanity
 def test_spark_with_multi_configs(service_name=utils.SPARK_SERVICE_NAME):
     utils.run_tests(
-        app_url="https://s3-us-west-1.amazonaws.com/svt-dev/jars/dcos-spark-scala-tests-assembly-0.2-DCOS-38138.jar",
+        app_url=utils.dcos_test_jar_url(),
         app_args="",
-        expected_output="spark.executor.extraJavaOptions,-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dparam3=val3 -Dparam4=\"This one with spaces\"'",
+        expected_output="spark.executor.extraJavaOptions,-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dparam3=val3",
         service_name=service_name,
-        args=["--conf spark.driver.extraJavaOptions='-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dparam3=val3 -Dparam4=\"This one with spaces\"'",
+        args=["--conf spark.driver.extraJavaOptions='-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dparam3=val3'",
              "--conf spark.mesos.containerizer=mesos",
              "--class MultiConfs"])
```
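The test case that was dropped here embedded an `extraJavaOptions` value containing spaces and nested double quotes, which is exactly the kind of input a shell-escape bug mangles. As a hedged illustration only (this is not the CLI's actual code; `build_conf_arg` is a hypothetical helper), a minimal Python sketch of quoting such a value so it survives one round of shell parsing:

```python
import shlex

def build_conf_arg(key, value):
    # Hypothetical helper: quote the value so embedded spaces and double
    # quotes survive one round of shell word-splitting.
    return "--conf {}={}".format(key, shlex.quote(value))

opts = '-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Dparam4="This one with spaces"'
arg = build_conf_arg("spark.driver.extraJavaOptions", opts)

# A shell would split this back into exactly two tokens: "--conf" and the
# key=value pair, with the spaces and inner double quotes intact.
tokens = shlex.split(arg)
```

`shlex.quote` wraps the value in single quotes (escaping any embedded single quotes), which is the same strategy the hand-written `'-XX:...'` quoting in the test above uses.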
**Contributor:** What if we pass `--supervise false` or something similar? The code doesn't seem to handle this case explicitly. (It seems as if this will be joined in the `default` case, though.)

**Author:** You're right, `false` will get joined with `--supervise` and the job will fail; I just tried it and we get a parse error. In this case, it might make sense to check for boolean flags and, if one is found, check the next argument. If it's a boolean literal like `true`/`false`, we could just move the pointer forward and skip it (`i += 2`).

More generally, though, there are a lot of ways to mess up confs. Does it make sense to have some kind of blacklist that guards against "user error"?
true/false, we could just move the pointer forward and skip it (i+=2).More generally though, there are a lot of ways to mess up confs. Does it make sense to have some kind of blacklist that guards against 'user error'?