[SPARK-26539][CORE] Remove spark.memory.useLegacyMode and StaticMemoryManager #23457
This file was deleted.
core/src/main/scala/org/apache/spark/memory/package.scala

```diff
@@ -61,15 +61,10 @@ package org.apache.spark
  * }}}
  *
- * There are two implementations of [[org.apache.spark.memory.MemoryManager]] which vary in how
- * they handle the sizing of their memory pools:
+ * There is one implementation of [[org.apache.spark.memory.MemoryManager]]:
  *
  *  - [[org.apache.spark.memory.UnifiedMemoryManager]], the default in Spark 1.6+, enforces soft
  *    boundaries between storage and execution memory, allowing requests for memory in one region
  *    to be fulfilled by borrowing memory from the other.
- *  - [[org.apache.spark.memory.StaticMemoryManager]] enforces hard boundaries between storage
- *    and execution memory by statically partitioning Spark's memory and preventing storage and
- *    execution from borrowing memory from each other. This mode is retained only for legacy
- *    compatibility purposes.
  */
 package object memory
```
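The removed doc bullet described the difference between the two managers: `StaticMemoryManager` enforced hard boundaries between storage and execution memory, while `UnifiedMemoryManager` lets one region borrow unused memory from the other. As a toy illustration of that distinction (a sketch with assumed names and numbers, not Spark's actual accounting code), an execution request larger than the free execution region can only be granted under the unified scheme:

```java
// Toy model of static vs. unified memory sizing (illustrative only;
// not the real StaticMemoryManager/UnifiedMemoryManager logic).
public class MemoryModelDemo {
    // Static partitioning: execution can never exceed its own region.
    static long staticGrant(long execRegion, long execUsed, long request) {
        return Math.min(request, execRegion - execUsed);
    }

    // Unified: execution may also borrow storage memory that is not in use.
    static long unifiedGrant(long execRegion, long execUsed,
                             long storageRegion, long storageUsed, long request) {
        long free = (execRegion - execUsed) + (storageRegion - storageUsed);
        return Math.min(request, free);
    }

    public static void main(String[] args) {
        // Execution region nearly full, storage region mostly idle.
        long execRegion = 100, execUsed = 90;
        long storageRegion = 100, storageUsed = 20;
        // A request for 50 bytes of execution memory:
        System.out.println(staticGrant(execRegion, execUsed, 50));       // prints 10
        System.out.println(unifiedGrant(execRegion, execUsed,
                                        storageRegion, storageUsed, 50)); // prints 50
    }
}
```

With the static scheme the request is capped at the 10 bytes left in the execution region; the unified scheme fills it entirely by borrowing from storage, which is why only the unified manager remains after this change.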
core/src/test/java/org/apache/spark/memory/TaskMemoryManagerSuite.java

```diff
@@ -29,7 +29,7 @@ public class TaskMemoryManagerSuite {
   @Test
   public void leakedPageMemoryIsDetected() {
     final TaskMemoryManager manager = new TaskMemoryManager(
-      new StaticMemoryManager(
+      new UnifiedMemoryManager(
         new SparkConf().set("spark.memory.offHeap.enabled", "false"),
         Long.MAX_VALUE,
         Long.MAX_VALUE,
```
Member
What do these mean after the change: maxHeapMemory = Long.MAX_VALUE and onHeapStorageRegionSize = Long.MAX_VALUE?
Tests and other internals now need to use this constructor directly.
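The test being updated above, `leakedPageMemoryIsDetected`, relies on the manager reporting any memory that was allocated but never freed. A toy sketch of that mechanism (illustrative only; assumed names — Spark's real `TaskMemoryManager` tracks `MemoryBlock` pages, but the idea is the same: anything still registered at cleanup time counts as a leak):

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of leaked-page detection (not Spark's actual implementation).
public class LeakDetectorDemo {
    private final Map<Long, Long> pages = new HashMap<>(); // page id -> size in bytes
    private long nextPage = 0;

    long allocatePage(long size) {
        long page = nextPage++;
        pages.put(page, size);
        return page;
    }

    void freePage(long page) {
        pages.remove(page);
    }

    // Analogous in spirit to cleanUpAllAllocatedMemory(): sums the bytes
    // still registered (i.e. never freed), then releases everything.
    long cleanUpAllAllocatedMemory() {
        long leaked = pages.values().stream().mapToLong(Long::longValue).sum();
        pages.clear();
        return leaked;
    }

    public static void main(String[] args) {
        LeakDetectorDemo m = new LeakDetectorDemo();
        m.allocatePage(4096); // allocated but never freed -> counted as leaked
        System.out.println(m.cleanUpAllAllocatedMemory()); // prints 4096
    }
}
```

Because the test only exercises this leak-accounting path, passing `Long.MAX_VALUE` for the pool sizes simply makes the memory limits effectively unbounded so they never interfere with the assertion.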