Covers OffHeap.
viirya committed Dec 10, 2018
commit 4c621d2bd36c50a10591d93ccd77bd7c0432a873
@@ -33,6 +33,7 @@

 import org.apache.spark.SparkConf;
 import org.apache.spark.executor.ShuffleWriteMetrics;
+import org.apache.spark.memory.MemoryMode;
 import org.apache.spark.memory.TestMemoryConsumer;
 import org.apache.spark.memory.TaskMemoryManager;
 import org.apache.spark.memory.TestMemoryManager;
@@ -671,7 +672,8 @@ public void testPeakMemoryUsed() {
@Test
public void avoidDeadlock() throws InterruptedException {
Member:
Hi, @viirya. Since this test case reproduces a deadlock situation, we need timeout logic. Otherwise, it will hang (instead of failing) when we hit this issue later.

@viirya (Member Author), Dec 10, 2018:
I've tried several ways to set up timeout logic, but they don't work. The deadlock always hangs both the test and the timeout logic.
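For context, the usual watchdog pattern (run the suspect code on a separate thread and bound the wait with `Future.get`) looks like the sketch below. As the exchange above suggests, it can report a timeout but cannot forcibly unwind threads stuck in an uninterruptible deadlock, which is presumably why it did not help here. The `runWithTimeout` helper is illustrative, not part of the Spark test suite.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimeoutDemo {
  // Illustrative watchdog: run the work on a separate thread and bound the wait.
  static boolean runWithTimeout(Runnable work, long millis) throws Exception {
    ExecutorService pool = Executors.newSingleThreadExecutor();
    try {
      Future<?> f = pool.submit(work);
      try {
        f.get(millis, TimeUnit.MILLISECONDS);
        return true;                 // work finished within the bound
      } catch (TimeoutException e) {
        f.cancel(true);              // interrupts the worker; a no-op for
        return false;                // threads blocked uninterruptibly
      }
    } finally {
      pool.shutdownNow();
    }
  }

  public static void main(String[] args) throws Exception {
    // A task that returns immediately completes within the bound.
    System.out.println(runWithTimeout(() -> { }, 1000));  // true
    // A task that outlives the bound is reported as timed out.
    System.out.println(runWithTimeout(() -> {
      try { Thread.sleep(60_000); } catch (InterruptedException ie) { }
    }, 200));                                             // false
  }
}
```

Note that even when `get` times out, the JVM will not exit while a non-daemon worker thread remains deadlocked, which matches the "hangs the test and the timeout logic" behavior described above.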

 memoryManager.limit(PAGE_SIZE_BYTES);
-TestMemoryConsumer c1 = new TestMemoryConsumer(taskMemoryManager);
+MemoryMode mode = useOffHeapMemoryAllocator() ? MemoryMode.OFF_HEAP : MemoryMode.ON_HEAP;
+TestMemoryConsumer c1 = new TestMemoryConsumer(taskMemoryManager, mode);
 BytesToBytesMap map =
   new BytesToBytesMap(taskMemoryManager, blockManager, serializerManager, 1, 0.5, 1024);
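The change above derives the memory mode from the allocator configuration so the test consumer accounts its allocations against the matching pool. A minimal, self-contained sketch of that selection (the enum and method here are illustrative stand-ins, not Spark's actual classes):

```java
public class MemoryModeDemo {
  // Stand-in for org.apache.spark.memory.MemoryMode.
  enum MemoryMode { ON_HEAP, OFF_HEAP }

  // Mirrors the ternary in the patched test: pick the mode that matches
  // the allocator under test so memory is accounted against the right pool.
  static MemoryMode chooseMode(boolean useOffHeapMemoryAllocator) {
    return useOffHeapMemoryAllocator ? MemoryMode.OFF_HEAP : MemoryMode.ON_HEAP;
  }

  public static void main(String[] args) {
    System.out.println(chooseMode(true));   // OFF_HEAP
    System.out.println(chooseMode(false));  // ON_HEAP
  }
}
```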
