[SPARK-25004][CORE] Add spark.executor.pyspark.memory limit. #21977
Changes from 1 commit
@@ -179,6 +179,15 @@ of the most common options to set are:
   (e.g. <code>2g</code>, <code>8g</code>).
   </td>
 </tr>
+<tr>
+  <td><code>spark.executor.pyspark.memory</code></td>
+  <td>Not set</td>
+  <td>
+    The amount of memory to be allocated to PySpark in each executor, in MiB
+    unless otherwise specified. If set, PySpark memory for an executor will be
+    limited to this amount. If not set, Spark will not limit Python's memory use.
+  </td>
+</tr>
 <tr>
   <td><code>spark.executor.memoryOverhead</code></td>
   <td>executorMemory * 0.10, with minimum of 384</td>
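The documentation above says the value is interpreted "in MiB unless otherwise specified", i.e. the setting accepts either a bare number of MiB or a suffixed byte string such as <code>512m</code> or <code>2g</code>. A minimal sketch of that unit convention, using a hypothetical simplified parser (not Spark's actual byte-string parsing, which also handles more suffixes):

```python
def pyspark_memory_to_mib(value):
    """Convert a Spark-style memory string to MiB.

    Hypothetical helper illustrating the documented convention for
    spark.executor.pyspark.memory: a bare number means MiB; otherwise
    a k/m/g/t suffix (case-insensitive) gives the unit. Simplified
    relative to Spark's real parser.
    """
    units = {"k": 1 / 1024, "m": 1, "g": 1024, "t": 1024 * 1024}
    v = value.strip().lower()
    if v and v[-1] in units:
        return int(float(v[:-1]) * units[v[-1]])
    return int(v)  # bare numbers are interpreted as MiB

# Example: "2g" and "2048" denote the same limit.
print(pyspark_memory_to_mib("2g"), pyspark_memory_to_mib("2048"))
```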
We should probably mention that this is added to the executor memory request in Yarn mode.
I've added "When PySpark is run in YARN, this memory is added to executor resource requests."
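Putting the two settings in the diff together with that review comment: in YARN mode the executor container request becomes the JVM heap plus the overhead (default <code>executorMemory * 0.10</code> with a 384 MiB floor) plus, with this change, the PySpark memory. A sketch of that arithmetic, under the assumption that all three components are simply summed (function name and signature are hypothetical, for illustration only):

```python
def yarn_executor_request_mib(executor_memory_mib,
                              pyspark_memory_mib=0,
                              overhead_mib=None):
    """Sketch of the YARN container sizing described in this PR discussion.

    Assumptions (hypothetical helper, not Spark code):
    - overhead defaults to max(10% of executor memory, 384 MiB), per the
      spark.executor.memoryOverhead default shown in the diff;
    - spark.executor.pyspark.memory, when set, is added on top of the
      JVM heap and overhead in the resource request.
    """
    if overhead_mib is None:
        overhead_mib = max(int(executor_memory_mib * 0.10), 384)
    return executor_memory_mib + overhead_mib + pyspark_memory_mib

# Example: a 4 GiB executor with 1 GiB of PySpark memory requests
# 4096 + 409 (10% overhead) + 1024 MiB from YARN.
print(yarn_executor_request_mib(4096, pyspark_memory_mib=1024))
```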