@@ -25,9 +25,6 @@ private[spark] object Constants {
val SPARK_POD_DRIVER_ROLE = "driver"
val SPARK_POD_EXECUTOR_ROLE = "executor"

// Annotations
val SPARK_APP_NAME_ANNOTATION = "spark-app-name"

// Credentials secrets
val DRIVER_CREDENTIALS_SECRETS_BASE_DIR =
"/mnt/secrets/spark-kubernetes-credentials"
@@ -50,17 +47,14 @@ private[spark] object Constants {
val DEFAULT_BLOCKMANAGER_PORT = 7079
val DRIVER_PORT_NAME = "driver-rpc-port"
val BLOCK_MANAGER_PORT_NAME = "blockmanager"
val EXECUTOR_PORT_NAME = "executor"

// Environment Variables
val ENV_EXECUTOR_PORT = "SPARK_EXECUTOR_PORT"
val ENV_DRIVER_URL = "SPARK_DRIVER_URL"
val ENV_EXECUTOR_CORES = "SPARK_EXECUTOR_CORES"
val ENV_EXECUTOR_MEMORY = "SPARK_EXECUTOR_MEMORY"
val ENV_APPLICATION_ID = "SPARK_APPLICATION_ID"
val ENV_EXECUTOR_ID = "SPARK_EXECUTOR_ID"
val ENV_EXECUTOR_POD_IP = "SPARK_EXECUTOR_POD_IP"
val ENV_MOUNTED_CLASSPATH = "SPARK_MOUNTED_CLASSPATH"
val ENV_JAVA_OPT_PREFIX = "SPARK_JAVA_OPT_"
val ENV_CLASSPATH = "SPARK_CLASSPATH"
val ENV_DRIVER_BIND_ADDRESS = "SPARK_DRIVER_BIND_ADDRESS"
@@ -46,8 +46,6 @@ private[spark] class KubernetesClusterManager extends ExternalClusterManager wit
sc: SparkContext,
masterURL: String,
scheduler: TaskScheduler): SchedulerBackend = {
val executorSecretNamesToMountPaths = KubernetesUtils.parsePrefixedKeyValuePairs(
sc.conf, KUBERNETES_EXECUTOR_SECRETS_PREFIX)
Contributor Author:
It does not create any side effects so it can be removed.

Contributor:
Do we not have secrets -> mountpaths support right now? @mccheah @liyinan926

Contributor Author (@skonto, Jun 28, 2018):
@foxish you are right, we do have it, but this statement has no effect. For example, it does not modify sparkConf.
It was used in the past by the following statement, which was removed:

-    val mountSecretBootstrap = if (executorSecretNamesToMountPaths.nonEmpty) {
-      Some(new MountSecretsBootstrap(executorSecretNamesToMountPaths))
-    } else {
-      None
-    }

Here is the relevant PR:
a83ae0d#diff-7d9979c0153744eafa24348ecbfa1671
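
For context, the helper only reads prefixed entries out of the conf and returns a new map, so calling it without using the result has no effect. A rough sketch of that behavior (illustrative only; the example prefix and key are made up, and this is not the exact Spark implementation):

import org.apache.spark.SparkConf

// Sketch of what KubernetesUtils.parsePrefixedKeyValuePairs effectively does:
// collect "<prefix><name> -> value" entries into a Map, stripping the prefix.
def parsePrefixedKeyValuePairs(conf: SparkConf, prefix: String): Map[String, String] =
  conf.getAllWithPrefix(prefix).toMap  // purely a read; the SparkConf is never modified

val conf = new SparkConf(loadDefaults = false)
  .set("spark.kubernetes.executor.secrets.db-creds", "/mnt/secrets/db")

// Yields Map("db-creds" -> "/mnt/secrets/db"); discarding the result changes nothing in conf.
val executorSecretNamesToMountPaths =
  parsePrefixedKeyValuePairs(conf, "spark.kubernetes.executor.secrets.")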

Contributor Author:
@mccheah @liyinan926 anything more on this?

Contributor:
This might be a bug, shouldn't we just be mounting the secrets here?

Contributor Author (@skonto, Jun 30, 2018):
@mccheah @felixcheung I think the logic has changed; otherwise the tests I have in this PR (#21652) would have failed when I removed that part and re-ran them. They should also have failed if there were a bug, since the tests there cover mounted secrets.

val kubernetesClient = SparkKubernetesClientFactory.createKubernetesClient(
KUBERNETES_MASTER_INTERNAL_URL,
Some(sc.conf.get(KUBERNETES_NAMESPACE)),
@@ -45,12 +45,10 @@ shift 1

SPARK_CLASSPATH="$SPARK_CLASSPATH:${SPARK_HOME}/jars/*"
env | grep SPARK_JAVA_OPT_ | sort -t_ -k4 -n | sed 's/[^=]*=\(.*\)/\1/g' > /tmp/java_opts.txt
readarray -t SPARK_JAVA_OPTS < /tmp/java_opts.txt
if [ -n "$SPARK_MOUNTED_CLASSPATH" ]; then
SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_MOUNTED_CLASSPATH"
fi
if [ -n "$SPARK_MOUNTED_FILES_DIR" ]; then
cp -R "$SPARK_MOUNTED_FILES_DIR/." .
readarray -t SPARK_EXECUTOR_JAVA_OPTS < /tmp/java_opts.txt
Contributor Author:
Rename for historical reasons.

Member:
This is local, right? It shouldn't matter what the name is. Also, this might be an image running the driver, not an executor?

Contributor:
@felixcheung I believe this is because we are running spark-submit from the driver, so the JAVA_OPTS are now only applicable to the executor.
I +1 this change, but would like a weigh-in from @foxish.

Contributor Author:
Yeah spark-submit detects whatever it needs via the spark-launcher.

Contributor:
+1, that makes sense. Those env vars are set by the launcher at submission time.
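
To make the mechanism concrete (hypothetical values; the pipeline is the one from the script above): the launcher sets numbered SPARK_JAVA_OPT_* variables on the executor container, and the script collects their values, ordered by the numeric suffix, into the array that is later spliced into the java command line.

# Hypothetical values, as the launcher might set them on the executor container:
export SPARK_JAVA_OPT_0="-Dspark.driver.port=7078"
export SPARK_JAVA_OPT_1="-Dspark.executor.memory=1g"

# Same pipeline as above: sort by the numeric suffix, keep only the values.
env | grep SPARK_JAVA_OPT_ | sort -t_ -k4 -n | sed 's/[^=]*=\(.*\)/\1/g' > /tmp/java_opts.txt
readarray -t SPARK_EXECUTOR_JAVA_OPTS < /tmp/java_opts.txt

printf '%s\n' "${SPARK_EXECUTOR_JAVA_OPTS[@]}"
# prints:
# -Dspark.driver.port=7078
# -Dspark.executor.memory=1g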


if [ -n "$SPARK_EXTRA_CLASSPATH" ]; then
Contributor Author:
This var exists in docs but not in code.

SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_EXTRA_CLASSPATH"
fi

case "$SPARK_K8S_CMD" in
@@ -66,7 +64,7 @@ case "$SPARK_K8S_CMD" in
executor)
CMD=(
${JAVA_HOME}/bin/java
"${SPARK_JAVA_OPTS[@]}"
"${SPARK_EXECUTOR_JAVA_OPTS[@]}"
-Xms$SPARK_EXECUTOR_MEMORY
-Xmx$SPARK_EXECUTOR_MEMORY
-cp "$SPARK_CLASSPATH"