Conversation

@kthui
Contributor

@kthui kthui commented Sep 18, 2025

Overview:

Use a pytest marker to indicate whether a particular test case needs to run in a subprocess, for DistributedRuntime singleton isolation.
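The constraint motivating this is that a DistributedRuntime can only exist once per process, so each test that builds one needs a fresh process. The isolation can be illustrated with a plain `os.fork` sketch; the `create_runtime` guard here is hypothetical, standing in for the real binding's singleton check:

```python
import os

_SINGLETON = None  # stands in for the process-wide DistributedRuntime


def create_runtime():
    """Create the (hypothetical) per-process runtime singleton."""
    global _SINGLETON
    if _SINGLETON is not None:
        raise RuntimeError("runtime already created in this process")
    _SINGLETON = object()
    return _SINGLETON


def run_in_fork(fn):
    """Run fn in a forked child process, as pytest-forked does per marked test."""
    pid = os.fork()
    if pid == 0:  # child: run the test body, report success via exit code
        code = 0
        try:
            fn()
        except Exception:
            code = 1
        os._exit(code)
    _, status = os.waitpid(pid, 0)
    return os.WEXITSTATUS(status) == 0
```

Two forked runs each see a fresh process, so both succeed; two in-process calls would trip the singleton guard.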

Details:

  • Moved all cancellation unit tests into a single file.
  • Defined the runtime fixture globally in conftest.py.
  • Added the pytest-forked library to the test requirements list.

Where should the reviewer start?

N/A

Related Issues: (use one of the action keywords Closes / Fixes / Resolves / Relates to)

N/A

Summary by CodeRabbit

  • Tests

    • Consolidated cancellation tests into a single async suite, replacing per-test subprocess runs for faster, more reliable execution.
    • Added a shared runtime fixture and stabilized namespaces to reduce flakiness.
    • Expanded coverage for client-initiated, loop-break, and server-initiated cancellation scenarios, with clear assertions on cancellation signals.
    • Adopted forked and asyncio markers for improved isolation and correctness.
  • Chores

    • Added pytest-forked to test dependencies to enhance test isolation and stability.

@kthui kthui self-assigned this Sep 18, 2025
@copy-pr-bot

copy-pr-bot bot commented Sep 18, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@kthui
Contributor Author

kthui commented Sep 18, 2025

/ok to test b7f1df5

@coderabbitai
Contributor

coderabbitai bot commented Sep 18, 2025

Walkthrough

Consolidates Python cancellation tests into a single module using a Context-based approach, removes subprocess-driven test wrappers, adds a runtime fixture in conftest, and updates test dependencies by adding pytest-forked.

Changes

Dependencies (container/deps/requirements.test.txt):
  Added pytest-forked to test requirements.

Cancellation tests consolidation (lib/bindings/python/tests/cancellation/test_cancellation.py, lib/bindings/python/tests/test_cancellation/*):
  Replaced multiple per-case test files and a subprocess harness with a single async test module. Introduced a fixed namespace, Context-based tests, and pytestmark = pytest.mark.pre_merge. Deleted four specific cancellation tests and their runner.

Runtime fixture (lib/bindings/python/tests/conftest.py):
  Added an async runtime fixture creating DistributedRuntime using the running event loop; ensures shutdown on teardown.
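The fixture described above can be sketched as an async generator with teardown after the yield. The class below is a stand-in, since the real DistributedRuntime comes from the dynamo bindings, and in the actual conftest.py the function carries the @pytest.fixture decorator:

```python
import asyncio


class DistributedRuntime:
    """Stand-in for the dynamo binding; the real one takes (loop, is_static)."""

    def __init__(self, loop, is_static):
        self.loop, self.is_static = loop, is_static
        self.is_shutdown = False

    def shutdown(self):
        self.is_shutdown = True


async def runtime():  # decorated with @pytest.fixture in the real conftest.py
    loop = asyncio.get_running_loop()
    rt = DistributedRuntime(loop, True)
    yield rt            # the test runs here
    rt.shutdown()       # teardown runs after the test finishes
```

Because teardown sits after the yield, shutdown runs on every exit path once the test completes.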

Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  actor Tester
  participant Client
  participant Server
  participant Context

  rect rgba(220,240,255,0.4)
  note over Tester,Server: Client-driven context cancellation
  Tester->>Context: create()
  Tester->>Client: generate(endpoint, context)
  Client->>Server: stream request (with context)
  loop streaming
    Server-->>Client: data chunk
    Client-->>Tester: yield chunk
    alt after N chunks
      Tester->>Context: stop_generating()
      Context-->>Server: cancellation signal
      Server-->>Client: end stream (cancelled)
      Client-->>Tester: raise ValueError (ended early)
    end
  end
  end
```
```mermaid
sequenceDiagram
  autonumber
  actor Tester
  participant Client
  participant Server

  rect rgba(235,255,235,0.4)
  note over Tester,Server: Loop break without explicit cancellation
  Tester->>Client: generate(endpoint)
  Client->>Server: stream request
  loop streaming
    Server-->>Client: data chunk
    Client-->>Tester: yield chunk
    alt Tester stops reading
      Tester-->>Client: break loop
      note right of Server: No context cancellation observed
    end
  end
  end
```
```mermaid
sequenceDiagram
  autonumber
  actor Tester
  participant Client
  participant Server

  rect rgba(255,240,220,0.4)
  note over Tester,Server: Server-initiated cancellation and raised CancelledError
  Tester->>Client: generate(_generate_and_cancel_context)
  Client->>Server: start stream
  Server->>Server: cancel its context
  Server-->>Client: terminate stream
  Client-->>Tester: ValueError("Stream ended before generation completed")

  Tester->>Client: generate(_generate_and_raise_cancelled)
  Client->>Server: start stream
  Server->>Server: raise CancelledError
  Server-->>Client: propagate termination
  Client-->>Tester: ValueError containing CancelledError message
  end
```

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Poem

A rabbit taps tests with a gentle thump,
Gathering streams in a single clump.
Context whispers, “stop,” the bytes obey—
Server sighs, “all right, we’ll end today.”
Forked we hop, no subprocess fuss,
Carrots compiled—green checks for us. 🥕

Pre-merge checks

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning: Docstring coverage is 54.55%, which is below the required threshold of 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.

✅ Passed checks (2 passed)

  • Title Check ✅ Passed: The title "refactor: PyTest subprocess for cancellation unit tests" accurately and concisely summarizes the primary change — refactoring how cancellation tests are run using subprocess/forking and adding pytest-forked — and aligns with the diff that consolidates tests, adds a runtime fixture, and updates requirements.
  • Description Check ✅ Passed: The PR description follows the repository template and clearly states the Overview and Details, correctly noting the consolidation of cancellation tests, the new global runtime fixture in conftest.py, and the addition of pytest-forked, which matches the provided changes. However, the "Where should the reviewer start?" section is marked "N/A" instead of pointing to the new test file and conftest.py.


Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (7)
lib/bindings/python/tests/conftest.py (1)

43-53: Enforce forked isolation for the runtime fixture.

Guard the fixture so tests using it won’t accidentally run in-process (singleton hazard). Also accept the request object.

```diff
-@pytest.fixture(scope="function", autouse=False)
-async def runtime():
+@pytest.fixture(scope="function", autouse=False)
+async def runtime(request):
@@
-    loop = asyncio.get_running_loop()
+    # Ensure isolation: require @pytest.mark.forked
+    if request.node.get_closest_marker("forked") is None:
+        pytest.skip("runtime fixture requires @pytest.mark.forked for process isolation")
+    loop = asyncio.get_running_loop()
     runtime = DistributedRuntime(loop, True)
     yield runtime
     runtime.shutdown()
```
lib/bindings/python/tests/cancellation/test_cancellation.py (6)

22-22: Register custom marker to avoid PytestUnknownMarkWarning.

Add pre_merge to pytest.ini (or top-level config).

Example:

```ini
# pytest.ini
[pytest]
markers =
    pre_merge: run in pre-merge test suite
```

129-131: Make namespace unique to avoid cross-test interference.

If these tests ever run concurrently (e.g., with xdist), a constant namespace can collide. Use PID or UUID.

```diff
-    """Namespace for this test file"""
-    return "cancellation_unit_test"
+    """Namespace for this test file"""
+    import os
+    return f"cancellation_unit_test_{os.getpid()}"
```

200-202: Close the stream to propagate cancellation upstream.

Explicitly close if supported; avoids dangling tasks.

```diff
-    # Give server a moment to process the cancellation
-    await asyncio.sleep(0.2)
+    # Ensure the client stream is closed so cancellation propagates
+    if hasattr(stream, "aclose"):
+        await stream.aclose()
+    await asyncio.sleep(0.2)  # allow server to observe the cancellation
```

255-260: Avoid assert False; relax brittle message match.

Use AssertionError and check message substring to reduce flakiness across platforms/backends.

```diff
-        assert False, "Stream completed without cancellation"
+        raise AssertionError("Stream completed without cancellation")
@@
-    except ValueError as e:
+    except ValueError as e:
@@
-        assert str(e) == "Stream ended before generation completed"
+        assert "Stream ended before generation completed" in str(e)
```

279-286: Same here: replace assert False and avoid exact error string.

Prefer robust checks; consider asserting that the underlying cause is CancelledError if it’s exposed.

```diff
-        assert False, "Stream completed without cancellation"
+        raise AssertionError("Stream completed without cancellation")
@@
-    except ValueError as e:
+    except ValueError as e:
@@
-        assert (
-            str(e)
-            == "a python exception was caught while processing the async generator: CancelledError: "
-        )
+        assert "CancelledError" in str(e)
```

69-69: Reduce sleep-based timing to cut flakiness.

The 0.1s sleeps per iteration can slow tests and increase flakiness under load. Consider shorter sleeps (e.g., 0.01) or guarding loops with asyncio.wait_for.

```diff
-            await asyncio.sleep(0.1)
+            await asyncio.sleep(0.01)
```

Or wrap stream consumption with asyncio.wait_for(..., timeout=2.0).

Also applies to: 86-86, 105-105, 121-121
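A timeout guard of the kind suggested above can be sketched like this; `consume` and `fake_stream` are illustrative helpers, not part of the PR:

```python
import asyncio


async def consume(stream, per_chunk_timeout=2.0):
    """Drain an async stream, bounding the wait for each chunk.

    Raises asyncio.TimeoutError if a chunk takes too long, instead of
    letting a stalled stream hang the test indefinitely.
    """
    chunks = []
    it = stream.__aiter__()
    while True:
        try:
            chunk = await asyncio.wait_for(it.__anext__(), per_chunk_timeout)
        except StopAsyncIteration:
            return chunks
        chunks.append(chunk)


async def fake_stream():
    """Illustrative stand-in for an endpoint's response stream."""
    for i in range(3):
        await asyncio.sleep(0.01)
        yield i
```

Bounding each chunk rather than the whole stream keeps the timeout meaningful regardless of how many chunks the server sends.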

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 20b7a8a and b7f1df5.

📒 Files selected for processing (8)
  • container/deps/requirements.test.txt (1 hunks)
  • lib/bindings/python/tests/cancellation/test_cancellation.py (3 hunks)
  • lib/bindings/python/tests/conftest.py (2 hunks)
  • lib/bindings/python/tests/test_cancellation/test_cancellation.py (0 hunks)
  • lib/bindings/python/tests/test_cancellation/test_client_context_cancel.py (0 hunks)
  • lib/bindings/python/tests/test_cancellation/test_client_loop_break.py (0 hunks)
  • lib/bindings/python/tests/test_cancellation/test_server_context_cancel.py (0 hunks)
  • lib/bindings/python/tests/test_cancellation/test_server_raise_cancelled.py (0 hunks)
💤 Files with no reviewable changes (5)
  • lib/bindings/python/tests/test_cancellation/test_server_context_cancel.py
  • lib/bindings/python/tests/test_cancellation/test_client_context_cancel.py
  • lib/bindings/python/tests/test_cancellation/test_server_raise_cancelled.py
  • lib/bindings/python/tests/test_cancellation/test_cancellation.py
  • lib/bindings/python/tests/test_cancellation/test_client_loop_break.py
🧰 Additional context used
🧬 Code graph analysis (2)
lib/bindings/python/tests/conftest.py (2)
lib/bindings/python/src/dynamo/_core.pyi (1)
  • DistributedRuntime (31-55)
lib/bindings/python/rust/lib.rs (1)
  • shutdown (384-386)
lib/bindings/python/tests/cancellation/test_cancellation.py (1)
lib/bindings/python/tests/conftest.py (1)
  • runtime (44-53)
🪛 Ruff (0.12.2)
lib/bindings/python/tests/cancellation/test_cancellation.py

255-255: Do not assert False (python -O removes these calls), raise AssertionError()

Replace assert False

(B011)


279-279: Do not assert False (python -O removes these calls), raise AssertionError()

Replace assert False

(B011)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Build and Test - vllm
🔇 Additional comments (5)
lib/bindings/python/tests/conftest.py (2)

16-16: Import asyncio: OK.


22-22: Import DistributedRuntime: OK.

lib/bindings/python/tests/cancellation/test_cancellation.py (2)

20-20: Import Context: OK.


234-238: Loop-break semantics: assertions match current behavior.

Acknowledged that implicit cancellation isn’t implemented; expectations are correct.

container/deps/requirements.test.txt (1)

25-25: pytest-forked missing in verification environment — ensure CI installs it and runners support fork semantics

The check you ran returned False (pytest-forked not available).

  • Ensure CI installs container/deps/requirements.test.txt (or pip install pytest-forked) before running tests.
  • Run forked tests only on non‑Windows runners or mark/skip them on Windows; re-run the same Python check in CI to verify availability.
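The availability check the bot refers to can be reproduced with a small probe; the import name pytest_forked is assumed from the plugin's distribution:

```python
import importlib.util


def has_pytest_forked():
    """Return True if the pytest-forked plugin is importable in this environment."""
    return importlib.util.find_spec("pytest_forked") is not None


print(has_pytest_forked())
```

Running this in CI after installing container/deps/requirements.test.txt confirms the dependency landed before the forked tests execute.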

Contributor

@michaelfeil michaelfeil left a comment


Looks like a good idea!

@kthui kthui requested a review from nv-anants September 18, 2025 23:55
@kthui kthui force-pushed the jacky-pytest-subprocess branch from b7f1df5 to 0263bb4 Compare September 19, 2025 19:31
@kthui kthui requested review from a team as code owners September 19, 2025 19:31
@kthui
Copy link
Contributor Author

kthui commented Sep 19, 2025

/ok to test 0263bb4

@kthui kthui merged commit 31c78df into main Sep 22, 2025
12 of 13 checks passed
@kthui kthui deleted the jacky-pytest-subprocess branch September 22, 2025 17:56
jasonqinzhou pushed a commit that referenced this pull request Sep 24, 2025
Signed-off-by: michaelfeil <[email protected]>
Signed-off-by: Jacky <[email protected]>
Co-authored-by: michaelfeil <[email protected]>
Signed-off-by: Jason Zhou <[email protected]>
jasonqinzhou pushed a commit that referenced this pull request Sep 24, 2025
Signed-off-by: michaelfeil <[email protected]>
Signed-off-by: Jacky <[email protected]>
Co-authored-by: michaelfeil <[email protected]>
Signed-off-by: Jason Zhou <[email protected]>
kylehh pushed a commit that referenced this pull request Sep 25, 2025
Signed-off-by: michaelfeil <[email protected]>
Signed-off-by: Jacky <[email protected]>
Co-authored-by: michaelfeil <[email protected]>
Signed-off-by: Kyle H <[email protected]>