feat: OpenAI compatible route /api/v1/responses #792
Conversation
Description

DCO Affirmation

I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.

Co-authored-by: Vasilije <[email protected]>
Important: Review skipped. Auto reviews are disabled on base/target branches other than the default branch. Please check the settings in the CodeRabbit UI or the configuration file. This status message can be disabled via a configuration setting.

Walkthrough: This update introduces a new "responses" API endpoint under `/api/v1/responses`.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant FastAPI_Server
    participant ResponsesRouter
    participant OpenAIClient
    participant DispatchFunction

    Client->>FastAPI_Server: POST /api/v1/responses (ResponseRequest)
    FastAPI_Server->>ResponsesRouter: Handle request
    ResponsesRouter->>OpenAIClient: Call OpenAI responses API
    OpenAIClient-->>ResponsesRouter: Return API response
    alt Function call(s) present in response
        ResponsesRouter->>DispatchFunction: dispatch_function(tool_call)
        DispatchFunction-->>ResponsesRouter: Function result or error
    end
    ResponsesRouter-->>Client: Return ResponseResponse (with tool call outputs and usage)
```
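For orientation, a minimal sketch of what a client call to the new endpoint might look like. The request-body fields (`model`, `input`) and the local URL are assumptions based on the OpenAI Responses API shape and the `cognee-v1` model name mentioned in the review below; they are not copied from the PR's actual `ResponseRequest` model.

```python
# Hedged sketch: POST to the new route with the `requests` library.
# The payload fields and server URL are assumptions, not taken from the PR.
import requests

payload = {
    "model": "cognee-v1",  # CogneeModel.COGNEEV1 per the review notes below
    "input": "What do we know about neural networks?",
}

resp = requests.post(
    "http://localhost:8000/api/v1/responses",  # assumed local deployment
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # ResponseResponse payload with tool call outputs and usage
```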
c8e79e5 to f713511 (Compare)
f713511 to 2018850 (Compare)
Actionable comments posted: 9
🧹 Nitpick comments (6)
cognee/api/v1/responses/default_tools.py (1)
24-28: Validate `top_k` bounds
`top_k` is declared with a default of 10 but no numerical guardrails. Supplying an excessively large or negative value could exhaust resources or raise errors deeper in the stack.

```diff
-    "top_k": {
-        "type": "integer",
-        "description": "Maximum number of results to return",
-        "default": 10,
-    },
+    "top_k": {
+        "type": "integer",
+        "description": "Maximum number of results to return",
+        "default": 10,
+        "minimum": 1,
+        "maximum": 100,
+    },
```
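Beyond the schema constraint, the dispatcher could also clamp the value defensively at call time. A minimal sketch, assuming `top_k` arrives in the tool-call `arguments` dict; the helper name and bounds are illustrative, not part of the PR:

```python
# Hypothetical helper: clamp top_k from tool-call arguments before searching.
# The 1-100 bounds mirror the suggested JSON-schema minimum/maximum above.
def clamp_top_k(arguments: dict, default: int = 10, low: int = 1, high: int = 100) -> int:
    try:
        top_k = int(arguments.get("top_k", default))
    except (TypeError, ValueError):
        return default
    return max(low, min(high, top_k))

print(clamp_top_k({"top_k": 5000}))   # 100
print(clamp_top_k({"top_k": "abc"}))  # 10
```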
cognee/api/client.py (1)

17-18: Import order nitpick

To keep alphabetical grouping (present in nearby imports), place the new import after `get_search_router`.

```diff
-from cognee.api.v1.responses.routers import get_responses_router
```

This is purely stylistic – feel free to ignore if the team does not enforce ordering.
cognee/api/v1/responses/routers/default_tools.py (1)
1-86: Consider adding examples for each tool parameter

The tool definitions are well-structured, but adding examples would significantly improve developer experience when integrating with this API.
For example, you could enhance the search_query parameter like this:
"search_query": { "type": "string", "description": "The query to search for in the knowledge graph", + "example": "How does neural network training work?" },Similar examples could be added for other parameters across all tools to provide clear guidance on expected inputs.
cognee/api/v1/responses/models.py (2)
11-16: Consider adding more model options to the CogneeModel enum

Currently, only one model is defined in the `CogneeModel` enum. As you expand your API offerings, you might want to support multiple model types.

```diff
 class CogneeModel(str, Enum):
     """Enum for supported model types"""

     COGNEEV1 = "cognee-v1"
+    # Future model versions can be added here
+    # COGNEEV2 = "cognee-v2"
```
102-103: Initialize metadata with an empty dictionary

The `metadata` field in `ResponseResponse` is defined without a default value, which could lead to None reference errors if not explicitly set.

```diff
-    metadata: Dict[str, Any] = None
+    metadata: Dict[str, Any] = Field(default_factory=dict)
```

This ensures that `metadata` is always a dictionary, consistent with the pattern used for other fields in the model.
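For reference, a minimal self-contained sketch of the suggested pattern; the model name here is illustrative, not the PR's `ResponseResponse`:

```python
from typing import Any, Dict
from pydantic import BaseModel, Field

class ExampleResponse(BaseModel):
    # default_factory gives every instance its own fresh dict instead of None
    # (and avoids the shared-mutable-default pitfall of `= {}`).
    metadata: Dict[str, Any] = Field(default_factory=dict)

print(ExampleResponse().metadata)  # {} rather than None
```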
cognee/api/v1/responses/dispatch_function.py (1)

58-68: Use SearchType enum for validating search types

The code manually validates search types against a hardcoded list, but you already import SearchType, which could be used for validation.

```diff
+from cognee.modules.search.types import SearchType

 # Later in the handle_search function:
 search_type_str = arguments.get("search_type", "GRAPH_COMPLETION")
-valid_search_types = (
-    search_tool["parameters"]["properties"]["search_type"]["enum"]
-    if search_tool
-    else ["INSIGHTS", "CODE", "GRAPH_COMPLETION", "SEMANTIC", "NATURAL_LANGUAGE"]
-)
+valid_search_types = [t.value for t in SearchType]
 if search_type_str not in valid_search_types:
     logger.warning(f"Invalid search_type: {search_type_str}, defaulting to GRAPH_COMPLETION")
     search_type_str = "GRAPH_COMPLETION"
```

This approach is more maintainable as it will automatically stay in sync with the SearchType enum.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- cognee/api/client.py (2 hunks)
- cognee/api/v1/responses/__init__.py (1 hunks)
- cognee/api/v1/responses/default_tools.py (1 hunks)
- cognee/api/v1/responses/dispatch_function.py (1 hunks)
- cognee/api/v1/responses/models.py (1 hunks)
- cognee/api/v1/responses/routers/__init__.py (1 hunks)
- cognee/api/v1/responses/routers/default_tools.py (1 hunks)
- cognee/api/v1/responses/routers/get_responses_router.py (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (4)
- cognee/api/client.py (1)
  - cognee/api/v1/responses/routers/get_responses_router.py (1)
    - get_responses_router (23-146)
- cognee/api/v1/responses/routers/get_responses_router.py (2)
  - cognee/api/v1/responses/models.py (6)
    - ResponseRequest (64-73)
    - ResponseResponse (92-102)
    - ResponseToolCall (83-89)
    - ChatUsage (56-61)
    - FunctionCall (41-45)
    - ToolCallOutput (76-80)
  - cognee/api/v1/responses/dispatch_function.py (1)
    - dispatch_function (19-44)
- cognee/api/v1/responses/__init__.py (1)
  - cognee/api/v1/responses/routers/get_responses_router.py (1)
    - get_responses_router (23-146)
- cognee/api/v1/responses/dispatch_function.py (5)
  - cognee/api/v1/responses/models.py (1)
    - ToolCall (48-53)
  - cognee/modules/search/types/SearchType.py (1)
    - SearchType (4-13)
  - cognee/modules/users/methods/get_default_user.py (1)
    - get_default_user (12-37)
  - cognee/shared/logging_utils.py (2)
    - info (121-122)
    - warning (124-125)
  - cognee/api/v1/datasets/datasets.py (1)
    - datasets (7-40)
🪛 Ruff (0.8.2)
cognee/api/v1/responses/dispatch_function.py
6-6: cognee.modules.search.types.SearchType imported but unused
Remove unused import: cognee.modules.search.types.SearchType
(F401)
🔇 Additional comments (5)
cognee/api/v1/responses/default_tools.py (1)
60-65: Keep tool list in sync with dispatcher
`prune` is commented out here but still handled inside `dispatch_function.py`. If a request explicitly chooses the `prune` tool, OpenAI will return a tool call the router cannot validate against `tools`, resulting in a 400 error from the OpenAI API. Either:

- Restore the `prune` entry (preferably with a `"danger": true` flag you check server-side), or
- Remove its handling branch from `dispatch_function.py`.
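Either way, a small consistency check could catch this kind of drift early. A hedged sketch, assuming the tool list is importable as `DEFAULT_TOOLS` and that the dispatcher handles a fixed set of names; both the key layout and the name set below are assumptions about this PR, not verified code:

```python
# Hypothetical test: every advertised tool should be handled by the dispatcher.
from cognee.api.v1.responses.default_tools import DEFAULT_TOOLS  # assumed export name

DISPATCHABLE = {"search", "cognify", "prune"}  # names assumed from dispatch_function.py

def test_every_advertised_tool_is_dispatchable():
    advertised = {
        tool.get("name") or tool.get("function", {}).get("name")  # tolerate both tool layouts
        for tool in DEFAULT_TOOLS
    }
    missing = advertised - DISPATCHABLE
    assert not missing, f"Tools advertised but not handled by dispatch_function: {missing}"
```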
cognee/api/v1/responses/routers/__init__.py (1)

1-3: Re-export looks good

Simple barrel import correctly exposes `get_responses_router`. No issues spotted.

cognee/api/v1/responses/__init__.py (1)

1-3: LGTM

The package-level re-export aligns with neighbouring packages and improves DX.
cognee/api/client.py (1)
171-172: Verify router prefix collision & OpenAPI tag
`/api/v1/responses` is new. Ensure no other router (e.g., a legacy `/responses`) exists to avoid ambiguous routes, and that `"responses"` is added to `tags_metadata` if you generate custom OpenAPI docs.
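For context, a hedged sketch of the registration pattern this comment is about, with an explicit prefix and tag; the real wiring in `cognee/api/client.py` may differ:

```python
from fastapi import FastAPI
from cognee.api.v1.responses.routers import get_responses_router

app = FastAPI()  # stands in for the application object in cognee/api/client.py

# One registration with an explicit prefix and OpenAPI tag keeps the route
# unambiguous and groups it under "responses" in the generated docs.
app.include_router(
    get_responses_router(),
    prefix="/api/v1/responses",
    tags=["responses"],
)
```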
cognee/api/v1/responses/routers/get_responses_router.py (1)

92-102: Confirm the `output` key matches the OpenAI Responses spec

The current implementation assumes the API payload resembles:

```
{ "output": [ { "type": "function_call", ... } ] }
```

OpenAI's public examples instead nest tool calls inside `choices[].message.tool_calls`. If the key path is wrong, no function will ever be dispatched. Please verify with the latest SDK docs or a live response and adjust the traversal logic accordingly.
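Until the payload shape is confirmed, one option is a defensive traversal that accepts both layouts. A sketch based only on the two shapes quoted above, not on the PR's actual parsing code:

```python
from typing import Any, Dict, List

def extract_tool_calls(payload: Dict[str, Any]) -> List[Dict[str, Any]]:
    """Collect calls from a Responses-style 'output' list and, as a fallback,
    from Chat-Completions-style choices[].message.tool_calls."""
    calls = [item for item in payload.get("output", []) if item.get("type") == "function_call"]
    for choice in payload.get("choices", []):
        calls.extend(choice.get("message", {}).get("tool_calls", []))
    return calls
```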
Hey @borisarzentar @Vasilije1990, I have made a base version of the handler; it now has tools for cognify+add, search, and prune. Could you take a look, pls?

@borisarzentar sorry, it was a raw/draft version.
Description

This branch contains the Portuguese translation of the README (`README.pt.md`). Happy to work on any suggestions you may have.

DCO Affirmation

I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.

Signed-off-by: Diego B Theuerkauf <[email protected]>
Co-authored-by: Hande <[email protected]>
Co-authored-by: Boris <[email protected]>
Co-authored-by: Boris <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Hey @dm1tryG, thanks! I will take a look today.
cf28ebf to 683af24 (Compare)
```python
if text:
    await add(data=text, user=user)

await cognify(user=user, ontology_file_path=graph_model_file if graph_model_file else None)
```
I see mixed `ontology_file_path` and `graph_model` usage.
If you want to set a custom graph model, you can do it with the `graph_model` argument. But that one expects a Pydantic model, so it is not suitable for this case. If you want to set a custom ontology, then `ontology_file_path` is the right property, but then the argument name is wrong.
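To make the distinction concrete, a hedged sketch of the two call shapes; the argument names come from the diff and comment above, while the import path and the wrapper function are assumptions:

```python
import cognee  # assumed top-level API exposing cognify

async def run_cognify(user, ontology_file_path: str | None = None):
    # Custom ontology: pass a file path via ontology_file_path.
    if ontology_file_path:
        return await cognee.cognify(user=user, ontology_file_path=ontology_file_path)
    # A custom graph model would instead be passed as graph_model=<Pydantic model class>,
    # which is a different concept from an ontology file path.
    return await cognee.cognify(user=user)
```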
ohh, yes, I missed that
@borisarzentar @Vasilije1990 I have made the fixes, please take a look again.
Description
This PR adds an OpenAI-compatible route, `/api/v1/responses`, which manages function calls.
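Since the route is OpenAI-compatible, a client could in principle point the official SDK at it. A hedged sketch assuming a local deployment on port 8000 and that the route mirrors the SDK's expected `/responses` path; the API-key handling is also an assumption:

```python
from openai import OpenAI

# The SDK appends /responses to base_url, so calls land on /api/v1/responses.
client = OpenAI(base_url="http://localhost:8000/api/v1", api_key="unused-locally")

response = client.responses.create(
    model="cognee-v1",  # CogneeModel.COGNEEV1
    input="Search the knowledge graph for insights about neural networks.",
)
print(response)
```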
Next steps
DCO Affirmation
I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.