Use a single config.
ueshin committed Jul 30, 2020
commit 9feb7e1cffc51e53f1ed815a769bb46defb769d5
2 changes: 1 addition & 1 deletion docs/core-migration-guide.md
@@ -24,7 +24,7 @@ license: |

## Upgrading from Core 3.0 to 3.1

-- In Spark 3.0 and below, `SparkContext` can be created in executors. Since Spark 3.1, an exception will be thrown when creating `SparkContext` in executors. If you need to create `SparkContext` in executors, you can allow it by setting the configuration `spark.driver.allowSparkContextInExecutors` in Scala/Java executors or `spark.python.allowSparkContextInExecutors` in PySpark executors.
+- In Spark 3.0 and below, `SparkContext` can be created in executors. Since Spark 3.1, an exception will be thrown when creating `SparkContext` in executors. You can allow it by setting the configuration `spark.driver.allowSparkContextInExecutors` when creating `SparkContext` in executors.

## Upgrading from Core 2.4 to 3.0

2 changes: 1 addition & 1 deletion python/pyspark/context.py
@@ -118,7 +118,7 @@ def __init__(self, master=None, appName=None, sparkHome=None, pyFiles=None,
         ValueError:...
         """
         if (conf is None or
-                conf.get("spark.python.allowSparkContextInExecutors", "false").lower() != "true"):
+                conf.get("spark.driver.allowSparkContextInExecutors", "false").lower() != "true"):
             # In order to prevent SparkContext from being created in executors.
             SparkContext._assert_on_driver()

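The hunk above collapses the two opt-in flags into the single `spark.driver.allowSparkContextInExecutors` config for PySpark as well. A minimal pure-Python sketch of the resulting guard logic, using a plain dict in place of a real `SparkConf` and a hypothetical helper name:

```python
def allow_spark_context_in_executors(conf):
    """Sketch of the guard in pyspark/context.py after this change.

    `conf` stands in for a SparkConf; any object with a
    `get(key, default)` method (such as a plain dict) models the lookup.
    Unless the single config is explicitly "true",
    SparkContext._assert_on_driver() would run and raise on an executor.
    """
    if conf is None:
        return False
    return conf.get("spark.driver.allowSparkContextInExecutors",
                    "false").lower() == "true"
```

Note that under this check the old PySpark-only key, `spark.python.allowSparkContextInExecutors`, no longer enables the behavior; only the single `spark.driver.*` key does.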
2 changes: 1 addition & 1 deletion python/pyspark/tests/test_context.py
@@ -279,7 +279,7 @@ def test_allow_to_create_spark_context_in_executors(self):
         # SPARK-32160: SparkContext can be created in executors if the config is set.

         def create_spark_context():
-            conf = SparkConf().set("spark.python.allowSparkContextInExecutors", "true")
+            conf = SparkConf().set("spark.driver.allowSparkContextInExecutors", "true")
             with SparkContext(conf=conf):
                 pass
