refactor: PyTest subprocess for cancellation unit tests #3127
Conversation
/ok to test b7f1df5
Walkthrough

Consolidates Python cancellation tests into a single module using a Context-based approach, removes subprocess-driven test wrappers, adds a runtime fixture in conftest, and updates test dependencies by adding pytest-forked.
Sequence Diagram(s)

sequenceDiagram
autonumber
actor Tester
participant Client
participant Server
participant Context
rect rgba(220,240,255,0.4)
note over Tester,Server: Client-driven context cancellation
Tester->>Context: create()
Tester->>Client: generate(endpoint, context)
Client->>Server: stream request (with context)
loop streaming
Server-->>Client: data chunk
Client-->>Tester: yield chunk
alt after N chunks
Tester->>Context: stop_generating()
Context-->>Server: cancellation signal
Server-->>Client: end stream (cancelled)
Client-->>Tester: raise ValueError (ended early)
end
end
end
sequenceDiagram
autonumber
actor Tester
participant Client
participant Server
rect rgba(235,255,235,0.4)
note over Tester,Server: Loop break without explicit cancellation
Tester->>Client: generate(endpoint)
Client->>Server: stream request
loop streaming
Server-->>Client: data chunk
Client-->>Tester: yield chunk
alt Tester stops reading
Tester-->>Client: break loop
note right of Server: No context cancellation observed
end
end
end
sequenceDiagram
autonumber
actor Tester
participant Client
participant Server
rect rgba(255,240,220,0.4)
note over Tester,Server: Server-initiated cancellation and raised CancelledError
Tester->>Client: generate(_generate_and_cancel_context)
Client->>Server: start stream
Server->>Server: cancel its context
Server-->>Client: terminate stream
Client-->>Tester: ValueError("Stream ended before generation completed")
Tester->>Client: generate(_generate_and_raise_cancelled)
Client->>Server: start stream
Server->>Server: raise CancelledError
Server-->>Client: propagate termination
Client-->>Tester: ValueError containing CancelledError message
end
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 1
🧹 Nitpick comments (7)
lib/bindings/python/tests/conftest.py (1)
43-53: Enforce forked isolation for the runtime fixture.

Guard the fixture so tests using it won't accidentally run in-process (singleton hazard). Also accept the request object.

```diff
-@pytest.fixture(scope="function", autouse=False)
-async def runtime():
+@pytest.fixture(scope="function", autouse=False)
+async def runtime(request):
@@
-    loop = asyncio.get_running_loop()
+    # Ensure isolation: require @pytest.mark.forked
+    if request.node.get_closest_marker("forked") is None:
+        pytest.skip("runtime fixture requires @pytest.mark.forked for process isolation")
+    loop = asyncio.get_running_loop()
     runtime = DistributedRuntime(loop, True)
     yield runtime
     runtime.shutdown()
```

lib/bindings/python/tests/cancellation/test_cancellation.py (6)
22-22: Register the custom marker to avoid PytestUnknownMarkWarning.

Add pre_merge to pytest.ini (or the top-level config). Example:

```ini
# pytest.ini
[pytest]
markers =
    pre_merge: run in pre-merge test suite
```
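Once the marker is registered, a test can opt into both the custom marker and fork isolation. A minimal sketch (the test name and body are hypothetical; the forked marker comes from the pytest-forked plugin):

```python
import pytest

@pytest.mark.pre_merge  # custom marker registered in pytest.ini above
@pytest.mark.forked     # pytest-forked: run this test in a forked child process
def test_runtime_singleton_isolated():
    # In the real suite this would create the DistributedRuntime singleton,
    # which is safe here because the test runs in its own process.
    assert True
```

Because each forked test gets its own process, the singleton never leaks between test cases.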
129-131: Make the namespace unique to avoid cross-test interference.

If these tests ever run concurrently (e.g., with xdist), a constant namespace can collide. Use the PID or a UUID.

```diff
-    """Namespace for this test file"""
-    return "cancellation_unit_test"
+    """Namespace for this test file"""
+    import os
+    return f"cancellation_unit_test_{os.getpid()}"
```
200-202: Close the stream to propagate cancellation upstream.

Explicitly close the stream if it supports aclose; this avoids dangling tasks.

```diff
-    # Give server a moment to process the cancellation
-    await asyncio.sleep(0.2)
+    # Ensure the client stream is closed so cancellation propagates
+    if hasattr(stream, "aclose"):
+        await stream.aclose()
+    await asyncio.sleep(0.2)  # allow server to observe the cancellation
```
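The same idea in isolation: explicitly closing an async generator runs its cleanup promptly instead of leaving it to the garbage collector. A stand-alone sketch where chunks() is a hypothetical stand-in for the client stream:

```python
import asyncio

async def chunks():
    # Hypothetical stand-in for a streamed client response
    for i in range(100):
        yield i
        await asyncio.sleep(0)

async def main():
    stream = chunks()
    received = []
    async for chunk in stream:
        received.append(chunk)
        if len(received) == 3:
            break  # stop reading early, as the cancellation tests do
    # Explicitly close the async generator so its cleanup runs now
    if hasattr(stream, "aclose"):
        await stream.aclose()
    return received

print(asyncio.run(main()))  # [0, 1, 2]
```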
255-260: Avoid assert False; relax the brittle message match.

Raise AssertionError instead (python -O strips plain asserts), and check for a message substring to reduce flakiness across platforms/backends.

```diff
-    assert False, "Stream completed without cancellation"
+    raise AssertionError("Stream completed without cancellation")
@@
-    assert str(e) == "Stream ended before generation completed"
+    assert "Stream ended before generation completed" in str(e)
```
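The suggested pattern, sketched stand-alone (consume_stream is a hypothetical stand-in for the client call that surfaces the cancellation):

```python
def consume_stream():
    # Hypothetical stand-in for the client call; the real one raises
    # ValueError when the stream ends before generation completes.
    raise ValueError("Stream ended before generation completed")

def check_cancelled():
    try:
        consume_stream()
    except ValueError as e:
        # Substring check is robust to platform/backend message differences
        assert "Stream ended before generation completed" in str(e)
        return True
    # Raised explicitly: plain `assert False` is stripped under python -O
    raise AssertionError("Stream completed without cancellation")

print(check_cancelled())  # True
```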
279-286: Same here: replace assert False and avoid the exact error string.

Prefer robust checks; consider asserting that the underlying cause is CancelledError if it's exposed.

```diff
-    assert False, "Stream completed without cancellation"
+    raise AssertionError("Stream completed without cancellation")
@@
-    assert (
-        str(e)
-        == "a python exception was caught while processing the async generator: CancelledError: "
-    )
+    assert "CancelledError" in str(e)
```
69-69: Reduce sleep-based timing to cut flakiness.

The 0.1s sleeps per iteration can slow tests and increase flakiness under load. Consider shorter sleeps (e.g., 0.01) or guarding loops with asyncio.wait_for.

```diff
-    await asyncio.sleep(0.1)
+    await asyncio.sleep(0.01)
```

Or wrap stream consumption with asyncio.wait_for(..., timeout=2.0).
Also applies to: 86-86, 105-105, 121-121
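Bounding the whole consumption with asyncio.wait_for is an alternative to per-iteration sleeps. A sketch with a hypothetical slow_chunks stream:

```python
import asyncio

async def slow_chunks():
    # Hypothetical stand-in for the streamed response
    for i in range(3):
        await asyncio.sleep(0.01)
        yield i

async def consume_with_timeout():
    async def _drain():
        return [chunk async for chunk in slow_chunks()]
    # Fail fast if the stream stalls, instead of padding each iteration
    return await asyncio.wait_for(_drain(), timeout=2.0)

print(asyncio.run(consume_with_timeout()))  # [0, 1, 2]
```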
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- container/deps/requirements.test.txt (1 hunks)
- lib/bindings/python/tests/cancellation/test_cancellation.py (3 hunks)
- lib/bindings/python/tests/conftest.py (2 hunks)
- lib/bindings/python/tests/test_cancellation/test_cancellation.py (0 hunks)
- lib/bindings/python/tests/test_cancellation/test_client_context_cancel.py (0 hunks)
- lib/bindings/python/tests/test_cancellation/test_client_loop_break.py (0 hunks)
- lib/bindings/python/tests/test_cancellation/test_server_context_cancel.py (0 hunks)
- lib/bindings/python/tests/test_cancellation/test_server_raise_cancelled.py (0 hunks)
💤 Files with no reviewable changes (5)
- lib/bindings/python/tests/test_cancellation/test_server_context_cancel.py
- lib/bindings/python/tests/test_cancellation/test_client_context_cancel.py
- lib/bindings/python/tests/test_cancellation/test_server_raise_cancelled.py
- lib/bindings/python/tests/test_cancellation/test_cancellation.py
- lib/bindings/python/tests/test_cancellation/test_client_loop_break.py
🧰 Additional context used
🧬 Code graph analysis (2)
lib/bindings/python/tests/conftest.py (2)
lib/bindings/python/src/dynamo/_core.pyi (1)
DistributedRuntime (31-55)

lib/bindings/python/rust/lib.rs (1)
shutdown(384-386)
lib/bindings/python/tests/cancellation/test_cancellation.py (1)
lib/bindings/python/tests/conftest.py (1)
runtime(44-53)
🪛 Ruff (0.12.2)
lib/bindings/python/tests/cancellation/test_cancellation.py
255-255: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
279-279: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Build and Test - vllm
🔇 Additional comments (5)
lib/bindings/python/tests/conftest.py (2)
16-16: Import asyncio: OK.
22-22: Import DistributedRuntime: OK.

lib/bindings/python/tests/cancellation/test_cancellation.py (2)
20-20: Import Context: OK.
234-238: Loop-break semantics: assertions match current behavior.

Acknowledged that implicit cancellation isn't implemented; the expectations are correct.
container/deps/requirements.test.txt (1)
25-25: pytest-forked missing in verification environment — ensure CI installs it and runners support fork semantics.

The check you ran returned False (pytest-forked not available).
- Ensure CI installs container/deps/requirements.test.txt (or pip install pytest-forked) before running tests.
- Run forked tests only on non‑Windows runners or mark/skip them on Windows; re-run the same Python check in CI to verify availability.
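The availability probe the review suggests re-running in CI can be as small as the following (the module name pytest_forked is assumed from the package's import name):

```python
import importlib.util
import sys

def forked_tests_supported():
    # pytest-forked must be installed and the platform must support os.fork()
    plugin_present = importlib.util.find_spec("pytest_forked") is not None
    can_fork = sys.platform != "win32"
    return plugin_present and can_fork

print(forked_tests_supported())
```

The printed result depends on the environment, which is exactly why the review asks for the check to run in CI rather than locally.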
michaelfeil
left a comment
Looks like a good idea!
Signed-off-by: michaelfeil <[email protected]>
Signed-off-by: Jacky <[email protected]>
Force-pushed b7f1df5 to 0263bb4
/ok to test 0263bb4
Signed-off-by: michaelfeil <[email protected]> Signed-off-by: Jacky <[email protected]> Co-authored-by: michaelfeil <[email protected]> Signed-off-by: Jason Zhou <[email protected]>
Signed-off-by: michaelfeil <[email protected]> Signed-off-by: Jacky <[email protected]> Co-authored-by: michaelfeil <[email protected]> Signed-off-by: Kyle H <[email protected]>
Overview:
Use a pytest marker to indicate whether a particular test case needs to be run in a subprocess, for DistributedRuntime singleton isolation.

Details:
- Added the runtime fixture in conftest.py globally.
- Added the pytest-forked library to the requirements list.

Where should the reviewer start?
N/A
Related Issues: (use one of the action keywords Closes / Fixes / Resolves / Relates to)
N/A
Summary by CodeRabbit
Tests
Chores