
Commit 28a6afb

ivanwick authored and pdeyhim committed
Set spark.executor.uri from environment variable (needed by Mesos)
The Mesos backend uses this property when setting up a slave process. It is similarly set in the Scala repl (org.apache.spark.repl.SparkILoop), but I couldn't find anything analogous for pyspark.

Author: Ivan Wick <[email protected]>

This patch had conflicts when merged, resolved by
Committer: Matei Zaharia <[email protected]>

Closes apache#311 from ivanwick/master and squashes the following commits:

da0c3e4 [Ivan Wick] Set spark.executor.uri from environment variable (needed by Mesos)
1 parent bc387ff commit 28a6afb

File tree: 1 file changed (+3, -0)


python/pyspark/shell.py

Lines changed: 3 additions & 0 deletions
@@ -29,6 +29,9 @@
 # this is the equivalent of ADD_JARS
 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None

+if os.environ.get("SPARK_EXECUTOR_URI"):
+    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
+
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)

 print """Welcome to

0 commit comments