Address comments
HyukjinKwon committed Nov 21, 2018
commit fd92a4e1cee9f666d7ee6f9c9fcb45367c8132a8
4 changes: 3 additions & 1 deletion docs/configuration.md
@@ -189,7 +189,9 @@ of the most common options to set are:
   limited to this amount. If not set, Spark will not limit Python's memory use
   and it is up to the application to avoid exceeding the overhead memory space
   shared with other non-JVM processes. When PySpark is run in YARN or Kubernetes, this memory
-  is added to executor resource requests. This configuration is not supported on Windows.
+  is added to executor resource requests.
+
+  NOTE: This configuration is not supported on Windows.
   </td>
 </tr>
 <tr>
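For context, here is a hedged example of how the documented setting is typically supplied at submit time. Only the config key `spark.executor.pyspark.memory` comes from the docs above; the application file name and the `2g` value are illustrative assumptions, not part of this PR:

```shell
# Illustrative only: my_app.py and the 2g value are placeholders.
spark-submit \
  --conf spark.executor.pyspark.memory=2g \
  my_app.py
```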
10 changes: 5 additions & 5 deletions python/pyspark/worker.py
@@ -22,12 +22,12 @@
 import os
 import sys
 import time
-# 'resource' is a Unix specific package.
-has_resource_package = True
+# 'resource' is a Unix specific module.
+has_resource_module = True
 try:
     import resource
 except ImportError:
-    has_resource_package = False
+    has_resource_module = False
 import socket
 import traceback
 
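The hunk above is the standard import-guard pattern for a platform-specific module: probe for it once at import time and record the result in a flag, so later code can branch on the flag instead of wrapping every use in its own try/except. A minimal self-contained sketch:

```python
# 'resource' is a Unix-specific stdlib module; it does not exist on
# Windows, so guard the import and remember whether it succeeded.
has_resource_module = True
try:
    import resource
except ImportError:
    # e.g. on Windows, where the module is absent
    has_resource_module = False

# Later code checks the flag rather than re-attempting the import:
if has_resource_module:
    # safe to call resource.* APIs here
    pass
```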
@@ -274,8 +274,8 @@ def main(infile, outfile):
     # set up memory limits
     memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
     # 'PYSPARK_EXECUTOR_MEMORY_MB' should be undefined on Windows because it depends on
-    # resource package which is a Unix specific package.
-    if memory_limit_mb > 0 and has_resource_package:
+    # resource module which is a Unix specific module.
+    if memory_limit_mb > 0 and has_resource_module:
         total_memory = resource.RLIMIT_AS
         try:
             (soft_limit, hard_limit) = resource.getrlimit(total_memory)
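The memory cap applied in this hunk can be sketched in isolation as follows, assuming a Unix platform where the stdlib `resource` module is available. The env var name and the `"-1"` (disabled) default match the diff; updating only the soft limit is a simplification of what `worker.py` actually does, shown here because lowering a soft limit never requires privileges:

```python
import os
import resource  # Unix-only; this sketch assumes a non-Windows platform

# Same env var and disabled-by-default convention as the worker above.
memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
if memory_limit_mb > 0:
    (soft_limit, hard_limit) = resource.getrlimit(resource.RLIMIT_AS)
    new_limit = memory_limit_mb * 1024 * 1024  # MiB -> bytes
    # Only tighten the soft limit: a process may lower it freely,
    # but raising the hard limit requires elevated privileges.
    if soft_limit == resource.RLIM_INFINITY or new_limit < soft_limit:
        resource.setrlimit(resource.RLIMIT_AS, (new_limit, hard_limit))
```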