diff --git a/docs/docs/API-Reference/api-build.mdx b/docs/docs/API-Reference/api-build.mdx index dc6d8e2959fe..be75662186d3 100644 --- a/docs/docs/API-Reference/api-build.mdx +++ b/docs/docs/API-Reference/api-build.mdx @@ -22,68 +22,60 @@ You might need to use or understand these endpoints when contributing to the Lan This endpoint builds and executes a flow, returning a job ID that can be used to stream execution events. -1. Send a POST request to the `/build/$FLOW_ID/flow` endpoint. - - - - -```bash -curl -X POST \ - "$LANGFLOW_URL/api/v1/build/$FLOW_ID/flow" \ - -H "accept: application/json" \ - -H "Content-Type: application/json" \ - -H "x-api-key: $LANGFLOW_API_KEY" \ - -d '{ - "inputs": { - "input_value": "Tell me a story" +1. Send a POST request to the `/build/$FLOW_ID/flow` endpoint: + + ```shell + curl -X POST \ + "$LANGFLOW_URL/api/v1/build/$FLOW_ID/flow" \ + -H "accept: application/json" \ + -H "Content-Type: application/json" \ + -H "x-api-key: $LANGFLOW_API_KEY" \ + -d '{ + "inputs": { + "input_value": "Tell me a story" + } + }' + ``` + +
+ Result + + ```json + { + "job_id": "123e4567-e89b-12d3-a456-426614174000" } - }' -``` + ``` - - - -```json -{ - "job_id": "123e4567-e89b-12d3-a456-426614174000" -} -``` - - - +
2. After receiving a job ID from the build endpoint, use the `/build/$JOB_ID/events` endpoint to stream the execution results: - - + ```shell + curl -X GET \ + "$LANGFLOW_URL/api/v1/build/123e4567-e89b-12d3-a456-426614174000/events" \ + -H "accept: application/json" \ + -H "x-api-key: $LANGFLOW_API_KEY" + ``` -```text -curl -X GET \ - "$LANGFLOW_URL/api/v1/build/123e4567-e89b-12d3-a456-426614174000/events" \ - -H "accept: application/json" \ - -H "x-api-key: $LANGFLOW_API_KEY" -``` +
+ Result - - + ```json + {"event": "vertices_sorted", "data": {"ids": ["ChatInput-XtBLx"], "to_run": ["Prompt-x74Ze", "ChatOutput-ylMzN", "ChatInput-XtBLx", "OpenAIModel-d1wOZ"]}} -```json -{"event": "vertices_sorted", "data": {"ids": ["ChatInput-XtBLx"], "to_run": ["Prompt-x74Ze", "ChatOutput-ylMzN", "ChatInput-XtBLx", "OpenAIModel-d1wOZ"]}} + {"event": "add_message", "data": {"timestamp": "2025-03-03T17:42:23", "sender": "User", "sender_name": "User", "session_id": "d2bbd92b-187e-4c84-b2d4-5df365704201", "text": "Tell me a story", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "28879bd8-6a68-4dd5-b658-74d643a4dd92", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}} -{"event": "add_message", "data": {"timestamp": "2025-03-03T17:42:23", "sender": "User", "sender_name": "User", "session_id": "d2bbd92b-187e-4c84-b2d4-5df365704201", "text": "Tell me a story", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "28879bd8-6a68-4dd5-b658-74d643a4dd92", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}} + // ... Additional events as the flow executes ... -// ... Additional events as the flow executes ... + {"event": "end", "data": {}} + ``` -{"event": "end", "data": {}} -``` - - - +
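Each line of the event stream above is a standalone JSON object, so client code can parse the stream line by line and stop at the `end` event. A minimal Python sketch of such a consumer (the helper name and sample lines are illustrative, not part of the Langflow API):

```python
import json

def collect_events(lines):
    """Parse newline-delimited build events, stopping at the "end" event."""
    events = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank keep-alive lines between events
        event = json.loads(line)
        events.append(event)
        if event.get("event") == "end":
            break
    return events

# Sample lines modeled on the stream shown above.
sample = [
    '{"event": "vertices_sorted", "data": {"ids": ["ChatInput-XtBLx"]}}',
    '',
    '{"event": "end", "data": {}}',
]
events = collect_events(sample)
print([e["event"] for e in events])  # ['vertices_sorted', 'end']
```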
The `/build/$JOB_ID/events` endpoint accepts an optional `stream` query parameter that defaults to `true`.
To disable streaming and get all events at once, set `stream` to `false`.

-```text
+```shell
curl -X GET \
  "$LANGFLOW_URL/api/v1/build/123e4567-e89b-12d3-a456-426614174000/events?stream=false" \
  -H "accept: application/json" \
@@ -115,7 +107,7 @@ The `/build` endpoint accepts optional values for `start_component_id` and `stop

Setting `stop_component_id` for a component triggers the same behavior as clicking the **Play** button on that component, where all dependent components leading up to that component are also run. For example, to stop flow execution at the OpenAI model component, run the following command:

-```bash
+```shell
curl -X POST \
  "$LANGFLOW_URL/api/v1/build/$FLOW_ID/flow" \
  -H "accept: application/json" \
@@ -129,10 +121,7 @@ curl -X POST \

The `/build` endpoint also accepts inputs for `data` directly, instead of using the values stored in the Langflow database. This is useful for running flows without having to pass custom values through the UI.

- - -

-```bash
+```shell
curl -X POST \
  "$LANGFLOW_URL/api/v1/build/$FLOW_ID/flow" \
  -H "accept: application/json" \
@@ -150,15 +139,14 @@ curl -X POST \
  }'
```

- - +
+Result ```json { "job_id": "0bcc7f23-40b4-4bfa-9b8a-a44181fd1175" } ``` - - +
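The request bodies used throughout this section can be assembled programmatically before being sent with curl or an HTTP client. A small Python sketch that builds the JSON body for the `/build/$FLOW_ID/flow` endpoint from its optional parts (the component ID below is a placeholder, not a real component in your flow):

```python
import json

def build_request_body(input_value, data=None, stop_component_id=None):
    """Assemble the JSON body for POST /api/v1/build/{flow_id}/flow."""
    body = {"inputs": {"input_value": input_value}}
    if data is not None:
        body["data"] = data  # overrides values stored in the Langflow database
    if stop_component_id is not None:
        body["stop_component_id"] = stop_component_id
    return body

body = build_request_body("Tell me a story", stop_component_id="OpenAIModel-xxxxx")
print(json.dumps(body))
```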
 

## See also

diff --git a/docs/docs/API-Reference/api-files.mdx b/docs/docs/API-Reference/api-files.mdx
index a01bf4a64414..a487754a41a7 100644
--- a/docs/docs/API-Reference/api-files.mdx
+++ b/docs/docs/API-Reference/api-files.mdx
@@ -30,13 +30,9 @@ Use the `/files` endpoints to move files between your local machine and Langflow

### Upload file (v1)

-Upload a file to the `v1/files/upload/` endpoint of your flow.
+Upload a file to the `v1/files/upload/$FLOW_ID` endpoint:

-Replace **FILE_NAME** with the uploaded file name.

- - - -

```bash
curl -X POST \
  "$LANGFLOW_URL/api/v1/files/upload/$FLOW_ID" \
@@ -46,8 +42,11 @@ curl -X POST \
  -F "file=@FILE_NAME.txt"
```

+Replace `FILE_NAME.txt` with the name and extension of the file you want to upload.
+Not all file types are supported.
+
+
+Result ```json { @@ -56,8 +55,7 @@ curl -X POST \ } ``` - - +
### Upload image files (v1) @@ -116,9 +114,6 @@ To change this limit, set the `LANGFLOW_MAX_FILE_SIZE_UPLOAD` [environment varia List all files associated with a specific flow. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/files/list/$FLOW_ID" \ @@ -126,8 +121,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -135,16 +130,12 @@ curl -X GET \ } ``` - - +
### Download file (v1) Download a specific file from a flow. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/files/download/$FLOW_ID/2024-12-30_15-19-43_your_file.txt" \ @@ -153,23 +144,19 @@ curl -X GET \ --output downloaded_file.txt ``` - - +
+Result ```text File contents downloaded to downloaded_file.txt ``` - - +
### Delete file (v1) Delete a specific file from a flow. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/files/delete/$FLOW_ID/2024-12-30_15-19-43_your_file.txt" \ @@ -177,8 +164,8 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -186,8 +173,7 @@ curl -X DELETE \ } ``` - - +
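Note that the v1 endpoints address files by a server-generated name that prefixes the original filename with an upload timestamp, as in `2024-12-30_15-19-43_your_file.txt`. If you need the original name back, a small Python sketch (assuming the `YYYY-MM-DD_HH-MM-SS_` prefix convention shown in these examples holds):

```python
import re

# Matches the upload-timestamp prefix used in the examples above.
TIMESTAMP_PREFIX = re.compile(r"^\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}_")

def original_name(stored_name: str) -> str:
    """Strip the upload-timestamp prefix from a v1 stored filename."""
    return TIMESTAMP_PREFIX.sub("", stored_name, count=1)

print(original_name("2024-12-30_15-19-43_your_file.txt"))  # your_file.txt
```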
## Files/V2 endpoints @@ -202,42 +188,47 @@ Upload a file to your user account. The file can be used across multiple flows. The file is uploaded in the format `USER_ID/FILE_ID.FILE_EXTENSION`, such as `07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.pdf`. -To retrieve your current `user_id`, call the `/whoami` endpoint. -```bash -curl -X GET \ - "$LANGFLOW_URL/api/v1/users/whoami" \ - -H "accept: application/json" \ - -H "x-api-key: $LANGFLOW_API_KEY" -``` +1. To retrieve your current `user_id`, call the `/whoami` endpoint: -Result: -``` -{"id":"07e5b864-e367-4f52-b647-a48035ae7e5e","username":"langflow","profile_image":null,"store_api_key":null,"is_active":true,"is_superuser":true,"create_at":"2025-05-08T17:59:07.855965","updated_at":"2025-05-28T19:00:42.556460","last_login_at":"2025-05-28T19:00:42.554338","optins":{"github_starred":false,"dialog_dismissed":true,"discord_clicked":false,"mcp_dialog_dismissed":true}} -``` + ```bash + curl -X GET \ + "$LANGFLOW_URL/api/v1/users/whoami" \ + -H "accept: application/json" \ + -H "x-api-key: $LANGFLOW_API_KEY" + ``` + +
+ Result + + ``` + {"id":"07e5b864-e367-4f52-b647-a48035ae7e5e","username":"langflow","profile_image":null,"store_api_key":null,"is_active":true,"is_superuser":true,"create_at":"2025-05-08T17:59:07.855965","updated_at":"2025-05-28T19:00:42.556460","last_login_at":"2025-05-28T19:00:42.554338","optins":{"github_starred":false,"dialog_dismissed":true,"discord_clicked":false,"mcp_dialog_dismissed":true}} + ``` + +
 

-In the POST request to `v2/files`, replace **@FILE_NAME.EXTENSION** with the uploaded file name and its extension.
+2. In the POST request to `v2/files`, replace **@FILE_NAME.EXTENSION** with the uploaded file name and its extension.
 You must include the at sign (`@`) in the request to instruct curl to upload the contents of the file, not the string `FILE_NAME.EXTENSION`.
 
-```bash
-curl -X POST \
-  "$LANGFLOW_URL/api/v2/files" \
-  -H "accept: application/json" \
-  -H "Content-Type: multipart/form-data" \
-  -H "x-api-key: $LANGFLOW_API_KEY" \
-  -F "file=@FILE_NAME.EXTENSION"
-```
+   ```bash
+   curl -X POST \
+     "$LANGFLOW_URL/api/v2/files" \
+     -H "accept: application/json" \
+     -H "Content-Type: multipart/form-data" \
+     -H "x-api-key: $LANGFLOW_API_KEY" \
+     -F "file=@FILE_NAME.EXTENSION"
+   ```
 
-The file is uploaded in the format `USER_ID/FILE_ID.FILE_EXTENSION`, and the API returns metadata about the uploaded file:
+   The file is uploaded in the format `USER_ID/FILE_ID.FILE_EXTENSION`, and the API returns metadata about the uploaded file:
 
-```json
-{
-  "id":"d44dc2e1-9ae9-4cf6-9114-8d34a6126c94",
-  "name":"engine_manual",
-  "path":"07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.pdf",
-  "size":851160,
-  "provider":null
-}
-```
+   ```json
+   {
+     "id":"d44dc2e1-9ae9-4cf6-9114-8d34a6126c94",
+     "name":"engine_manual",
+     "path":"07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.pdf",
+     "size":851160,
+     "provider":null
+   }
+   ```
 
 ### Send files to your flows (v2)
 
@@ -253,66 +244,68 @@ The default file limit is 100 MB. To configure this value, change the `LANGFLOW_
 For more information, see [Supported environment variables](/environment-variables#supported-variables).
 
 1. To send a file to your flow with the API, POST the file to the `/api/v2/files` endpoint.
- This is the same step described in [Upload file (v2)](#upload-file-v2), but since you need the filename to upload to your flow, it is included here. -```bash -curl -X POST \ - "$LANGFLOW_URL/api/v2/files" \ - -H "accept: application/json" \ - -H "Content-Type: multipart/form-data" \ - -H "x-api-key: $LANGFLOW_API_KEY" \ - -F "file=@FILE_NAME.EXTENSION" -``` + Replace **FILE_NAME.EXTENSION** with the name and extension of the file you want to upload. + This is the same step described in [Upload file (v2)](#upload-file-v2), but since you need the filename to upload to your flow, it is included here. -The file is uploaded in the format `USER_ID/FILE_ID.FILE_EXTENSION`, and the API returns metadata about the uploaded file: + ```bash + curl -X POST \ + "$LANGFLOW_URL/api/v2/files" \ + -H "accept: application/json" \ + -H "Content-Type: multipart/form-data" \ + -H "x-api-key: $LANGFLOW_API_KEY" \ + -F "file=@FILE_NAME.EXTENSION" + ``` -```json -{ - "id":"d44dc2e1-9ae9-4cf6-9114-8d34a6126c94", - "name":"engine_manual", - "path":"07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.pdf", - "size":851160, - "provider": null -} -``` + The file is uploaded in the format `USER_ID/FILE_ID.FILE_EXTENSION`, and the API returns metadata about the uploaded file: + + ```json + { + "id":"d44dc2e1-9ae9-4cf6-9114-8d34a6126c94", + "name":"engine_manual", + "path":"07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.pdf", + "size":851160, + "provider": null + } + ``` 2. To use this file in your flow, add a [File](/components-data#file) component to load a file into the flow. 3. To load the file into your flow, send it to the **File** component. -To retrieve the **File** component's full name with the UUID attached, call the [Read flow](/api-flows#read-flow) endpoint, and then include your **File** component and the file path as a tweak with the `/v1/run` POST request. 
-In this example, the file uploaded to `/v2/files` is included with the `/v1/run` POST request. -```text -curl --request POST \ - --url "$LANGFLOW_URL/api/v1/run/$FLOW_ID" \ - --header "Content-Type: application/json" \ - --header "x-api-key: $LANGFLOW_API_KEY" \ - --data '{ - "input_value": "what do you see?", - "output_type": "chat", - "input_type": "text", - "tweaks": { - "File-1olS3": { - "path": [ - "07e5b864-e367-4f52-b647-a48035ae7e5e/3a290013-fe1e-4d3d-a454-cacae81288f3.pdf" - ] - } - } -}' -``` + To retrieve the **File** component's full name with the UUID attached, call the [Read flow](/api-flows#read-flow) endpoint, and then include your **File** component and the file path as a tweak with the `/v1/run` POST request. + In this example, the file uploaded to `/v2/files` is included with the `/v1/run` POST request. + + ```text + curl --request POST \ + --url "$LANGFLOW_URL/api/v1/run/$FLOW_ID" \ + --header "Content-Type: application/json" \ + --header "x-api-key: $LANGFLOW_API_KEY" \ + --data '{ + "input_value": "what do you see?", + "output_type": "chat", + "input_type": "text", + "tweaks": { + "File-1olS3": { + "path": [ + "07e5b864-e367-4f52-b647-a48035ae7e5e/3a290013-fe1e-4d3d-a454-cacae81288f3.pdf" + ] + } + } + }' + ``` -Result: -```text -"text":"This document provides important safety information and instructions for selecting, installing, and operating Briggs & Stratton engines. It includes warnings and guidelines to prevent injury, fire, or damage, such as choosing the correct engine model, proper installation procedures, safe fuel handling, and correct engine operation. The document emphasizes following all safety precautions and using authorized parts to ensure safe and effective engine use." -``` +
+ Result + + ```text + "text":"This document provides important safety information and instructions for selecting, installing, and operating Briggs & Stratton engines. It includes warnings and guidelines to prevent injury, fire, or damage, such as choosing the correct engine model, proper installation procedures, safe fuel handling, and correct engine operation. The document emphasizes following all safety precautions and using authorized parts to ensure safe and effective engine use." + ``` +
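The `tweaks` object in the run request above can be generated from the upload response instead of being written by hand. A minimal Python sketch (the `File-1olS3` component ID and the file path are the illustrative values from this example; look up your own with the Read flow endpoint):

```python
import json

def run_body_with_file(message, file_component_id, file_paths):
    """Build the POST /api/v1/run/{flow_id} body that points a File
    component at files previously uploaded to /api/v2/files."""
    return {
        "input_value": message,
        "output_type": "chat",
        "input_type": "text",
        "tweaks": {file_component_id: {"path": list(file_paths)}},
    }

upload_path = "07e5b864-e367-4f52-b647-a48035ae7e5e/3a290013-fe1e-4d3d-a454-cacae81288f3.pdf"
body = run_body_with_file("what do you see?", "File-1olS3", [upload_path])
print(json.dumps(body, indent=2))
```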
### List files (v2) List all files associated with your user account. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v2/files" \ @@ -320,8 +313,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json [ @@ -335,19 +328,13 @@ curl -X GET \ ] ``` - - +
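When you download one of the listed files, the `--output` filename should carry the file's real extension, which can be recovered from the `path` field of a list entry. A small Python sketch (the record values are illustrative; the field names follow the upload response shown earlier):

```python
import posixpath

def local_filename(file_record: dict) -> str:
    """Derive a local filename with the correct extension for curl --output."""
    ext = posixpath.splitext(file_record["path"])[1]  # e.g. ".pdf" or ".txt"
    return file_record["name"] + ext

record = {
    "name": "your_file",
    "path": "07e5b864-e367-4f52-b647-a48035ae7e5e/d44dc2e1-9ae9-4cf6-9114-8d34a6126c94.txt",
}
print(local_filename(record))  # your_file.txt
```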
### Download file (v2) Download a specific file by its ID and file extension. -:::tip You must specify the file type you expect in the `--output` value. -::: - - - ```bash curl -X GET \ @@ -357,23 +344,19 @@ curl -X GET \ --output downloaded_file.txt ``` - - +
+Result ```text File contents downloaded to downloaded_file.txt ``` - - +
### Edit file name (v2) Change a file name. - - - ```bash curl -X PUT \ "$LANGFLOW_URL/api/v2/files/$FILE_ID?name=new_file_name" \ @@ -381,8 +364,8 @@ curl -X PUT \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -394,15 +377,12 @@ curl -X PUT \ } ``` - - +
+ ### Delete file (v2) Delete a specific file by its ID. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v2/files/$FILE_ID" \ @@ -410,8 +390,8 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -419,16 +399,12 @@ curl -X DELETE \ } ``` - - +
### Delete all files (v2) Delete all files associated with your user account. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v2/files" \ @@ -436,8 +412,8 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -445,8 +421,7 @@ curl -X DELETE \ } ``` - - +
## Create upload file (Deprecated) diff --git a/docs/docs/API-Reference/api-flows-run.mdx b/docs/docs/API-Reference/api-flows-run.mdx index 22c4493f7a43..497a0cc826e3 100644 --- a/docs/docs/API-Reference/api-flows-run.mdx +++ b/docs/docs/API-Reference/api-flows-run.mdx @@ -41,7 +41,7 @@ curl -X POST \ The response from `/v1/run/$FLOW_ID` includes metadata, inputs, and outputs for the run.
- Result +Result The following example illustrates a response from a Basic Prompting flow: @@ -102,7 +102,7 @@ curl -X POST \ LLM chat responses are streamed back as `token` events, culminating in a final `end` event that closes the connection.
- Result +Result The following example is truncated to illustrate a series of `token` events as well as the final `end` event that closes the LLM's token streaming response: @@ -127,6 +127,7 @@ The following example is truncated to illustrate a series of `token` events as w {"event": "end", "data": {"result": {"session_id": "chat-123", "message": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"?..."}}} ``` +
### Run endpoint headers @@ -190,12 +191,14 @@ curl -X POST \
Result + ```json { "message": "Task started in the background", "status": "in progress" } ``` +
For more information, see [Webhook component](/components-data#webhook) and [Trigger flows with webhooks](/webhook). diff --git a/docs/docs/API-Reference/api-flows.mdx b/docs/docs/API-Reference/api-flows.mdx index d23dabe30e4b..d4c9f6d9c8a8 100644 --- a/docs/docs/API-Reference/api-flows.mdx +++ b/docs/docs/API-Reference/api-flows.mdx @@ -14,9 +14,6 @@ If you want to use the Langflow API to run a flow, see [Flow trigger endpoints]( Creates a new flow. - - - ```bash curl -X POST \ "$LANGFLOW_URL/api/v1/flows/" \ @@ -40,8 +37,8 @@ curl -X POST \ }' ``` - - +
+Result ```json { @@ -63,8 +60,7 @@ curl -X POST \ } ``` - - +
## Create flows @@ -122,9 +118,6 @@ curl -X POST \ Retrieves a specific flow by its ID. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/flows/$FLOW_ID" \ @@ -132,9 +125,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - - +
+Result ```json { @@ -151,8 +143,7 @@ curl -X GET \ } ``` - - +
## Read flows @@ -193,9 +184,6 @@ Updates an existing flow by its ID. This example changes the value for `endpoint_name` from a random UUID to `my_new_endpoint_name`. - - - ```bash curl -X PATCH \ "$LANGFLOW_URL/api/v1/flows/$FLOW_ID" \ @@ -212,8 +200,8 @@ curl -X PATCH \ }' ``` - - +
+Result ```json { @@ -235,16 +223,12 @@ curl -X PATCH \ } ``` - - +
## Delete flow Deletes a specific flow by its ID. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/flows/$FLOW_ID" \ @@ -252,9 +236,8 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - - +
+Result ```json { @@ -262,8 +245,7 @@ curl -X DELETE \ } ``` - - +
## Export flows @@ -271,9 +253,6 @@ Exports specified flows to a ZIP file. This endpoint downloads a ZIP file containing [Langflow JSON files](/concepts-flows-import#langflow-json-file-contents) for each flow ID listed in the request body. - - - ```bash curl -X POST \ "$LANGFLOW_URL/api/v1/flows/download/" \ @@ -287,8 +266,8 @@ curl -X POST \ --output langflow-flows.zip ``` - - +
+Result ```text % Total % Received % Xferd Average Speed Time Time Time Current @@ -296,8 +275,7 @@ curl -X POST \ 100 76437 0 76353 100 84 4516k 5088 --:--:-- --:--:-- --:--:-- 4665k ``` - - +
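The downloaded archive contains one Langflow JSON file per exported flow. A short Python sketch that lists and parses the flows in such an archive; it builds a ZIP in memory rather than calling the API, and the flow name is invented for the demonstration:

```python
import io
import json
import zipfile

def read_exported_flows(zip_bytes: bytes) -> dict:
    """Map each JSON member of an exported archive to its parsed contents."""
    flows = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for member in archive.namelist():
            if member.endswith(".json"):
                flows[member] = json.loads(archive.read(member))
    return flows

# Simulate a downloaded langflow-flows.zip containing one flow.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("basic-prompting.json", json.dumps({"name": "Basic Prompting"}))

flows = read_exported_flows(buffer.getvalue())
print(list(flows))  # ['basic-prompting.json']
```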
## Import flows @@ -306,10 +284,7 @@ Imports flows by uploading a [Langflow-compatible JSON file](/concepts-flows-imp To specify a target project for the flow, include the query parameter `project_id`. The target `project_id` must already exist before uploading a flow. Call the [/api/v1/projects/](/api-projects#read-projects) endpoint for a list of available projects. -This example uploads a local file named `agent-with-astra-db-tool.json` to a project specified by a `PROJECT_ID` variable. - - - +This example uploads a local file named `agent-with-astra-db-tool.json` to a project specified by a `PROJECT_ID` variable: ```bash curl -X POST \ @@ -320,8 +295,8 @@ curl -X POST \ -F "file=@agent-with-astra-db-tool.json;type=application/json" ``` - - +
+Result ```json [ @@ -337,5 +312,4 @@ curl -X POST \ ] ``` - - \ No newline at end of file +
\ No newline at end of file diff --git a/docs/docs/API-Reference/api-logs.mdx b/docs/docs/API-Reference/api-logs.mdx index b7b543549c5c..caa4fff55a41 100644 --- a/docs/docs/API-Reference/api-logs.mdx +++ b/docs/docs/API-Reference/api-logs.mdx @@ -32,9 +32,6 @@ The `/logs` endpoint requires log retrieval to be enabled in your Langflow insta Stream logs in real-time using Server Sent Events (SSE). - - - ```bash curl -X GET \ "$LANGFLOW_URL/logs-stream" \ @@ -42,8 +39,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```text keepalive @@ -65,8 +62,7 @@ keepalive keepalive ``` - - +
 

## Retrieve logs with optional parameters

Retrieve logs with optional query parameters:

The default value for all three parameters is `0`.
With default values, the endpoint returns the last 10 lines of logs.

- - -

```bash
curl -X GET \
  "$LANGFLOW_URL/logs?lines_before=0&lines_after=0&timestamp=0" \
  -H "accept: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY"
```

- - +
+Result ```text { @@ -103,9 +96,8 @@ curl -X GET \ "1736354770588": "2025-01-08T11:46:10.588105-0500 DEBUG Create service ServiceType.CHAT_SERVICE\n", "1736354771021": "2025-01-08T11:46:11.021817-0500 DEBUG Telemetry data sent successfully.\n", "1736354775619": "2025-01-08T11:46:15.619545-0500 DEBUG Create service ServiceType.STORE_SERVICE\n", - "1736354775699": "2025-01-08T11:46:15.699661-0500 DEBUG File 046-rocket.svg retrieved successfully from flow /Users/mendon.kissling/Library/Caches/langflow/profile_pictures/Space.\n" + "1736354775699": "2025-01-08T11:46:15.699661-0500 DEBUG File 046-rocket.svg retrieved successfully from flow /Users/USER/Library/Caches/langflow/profile_pictures/Space.\n" } ``` - - \ No newline at end of file +
\ No newline at end of file diff --git a/docs/docs/API-Reference/api-monitor.mdx b/docs/docs/API-Reference/api-monitor.mdx index b76379bab2d4..f0839349a4df 100644 --- a/docs/docs/API-Reference/api-monitor.mdx +++ b/docs/docs/API-Reference/api-monitor.mdx @@ -12,9 +12,6 @@ Use the `/monitor` endpoint to monitor and modify messages passed between Langfl Retrieve Vertex builds for a specific flow. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/monitor/builds?flow_id=$FLOW_ID" \ @@ -22,8 +19,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -385,16 +382,12 @@ curl -X GET \ } ``` - - +
## Delete Vertex builds Delete Vertex builds for a specific flow. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/monitor/builds?flow_id=$FLOW_ID" \ @@ -402,15 +395,14 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```text 204 No Content ``` - - +
## Get messages @@ -429,9 +421,6 @@ To sort the results, use the `order_by` query parameter. This example retrieves messages sent by `Machine` and `AI` in a given chat session (`session_id`) and orders the messages by timestamp. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/monitor/messages?flow_id=$FLOW_ID&session_id=01ce083d-748b-4b8d-97b6-33adbb6a528a&sender=Machine&sender_name=AI&order_by=timestamp" \ @@ -439,8 +428,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json [ @@ -475,17 +464,13 @@ curl -X GET \ ] ``` - - +
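Long filter strings like the one in the request above are easier to build with the standard library than by hand. A minimal Python sketch that assembles the query string for the messages endpoint and drops unset filters (the parameter names are taken from the example request):

```python
from urllib.parse import urlencode

def messages_query(**filters):
    """Build the query string for GET /api/v1/monitor/messages."""
    return urlencode({k: v for k, v in filters.items() if v is not None})

query = messages_query(
    flow_id="d2bbd92b-187e-4c84-b2d4-5df365704201",
    session_id="01ce083d-748b-4b8d-97b6-33adbb6a528a",
    sender="Machine",
    sender_name="AI",
    order_by="timestamp",
)
print(query)
```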
## Delete messages Delete specific messages by their IDs. -This example deletes the message retrieved in the previous Get messages example. - - - +This example deletes the message retrieved in the previous `GET /messages` example. ```bash curl -v -X DELETE \ @@ -496,15 +481,14 @@ curl -v -X DELETE \ -d '["MESSAGE_ID_1", "MESSAGE_ID_2"]' ``` - - +
+Result ```text 204 No Content ``` - - +
## Update message @@ -512,9 +496,6 @@ Update a specific message by its ID. This example updates the `text` value of message `3ab66cc6-c048-48f8-ab07-570f5af7b160`. - - - ```bash curl -X PUT \ "$LANGFLOW_URL/api/v1/monitor/messages/3ab66cc6-c048-48f8-ab07-570f5af7b160" \ @@ -526,8 +507,8 @@ curl -X PUT \ }' ``` - - +
+Result ```json { @@ -557,8 +538,7 @@ curl -X PUT \ } ``` - - +
 

## Update session ID

Update the session ID for messages.

This example updates the `session_id` value `01ce083d-748b-4b8d-97b6-33adbb6a528a` to `different_session_id`.

- - -

```bash
curl -X PATCH \
  "$LANGFLOW_URL/api/v1/monitor/messages/session/01ce083d-748b-4b8d-97b6-33adbb6a528a?new_session_id=different_session_id" \
  -H "accept: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY"
```

- - +
+Result ```json [ @@ -612,16 +589,12 @@ curl -X PATCH \ ] ``` - - +
## Delete messages by session Delete all messages for a specific session. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/monitor/messages/session/different_session_id_2" \ @@ -629,23 +602,19 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```text HTTP/1.1 204 No Content ``` - - +
## Get transactions Retrieve all transactions, which are interactions between components, for a specific flow. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/monitor/transactions?flow_id=$FLOW_ID&page=1&size=50" \ @@ -653,8 +622,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -678,8 +647,7 @@ curl -X GET \ } ``` - - +
## See also diff --git a/docs/docs/API-Reference/api-projects.mdx b/docs/docs/API-Reference/api-projects.mdx index da02164e155c..3b762d20dd24 100644 --- a/docs/docs/API-Reference/api-projects.mdx +++ b/docs/docs/API-Reference/api-projects.mdx @@ -14,9 +14,6 @@ Projects store your flows and components. Get a list of Langflow projects, including project IDs, names, and descriptions. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/projects/" \ @@ -24,8 +21,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json [ @@ -38,16 +35,12 @@ curl -X GET \ ] ``` - - +
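Several requests in this section need a project's UUID rather than its name. A small Python sketch that looks the ID up in the read-projects response (the field names follow the example response; the values are illustrative):

```python
def project_id_by_name(projects, name):
    """Return the ID of the first project whose name matches, or None."""
    for project in projects:
        if project.get("name") == name:
            return project.get("id")
    return None

projects = [
    {
        "name": "My Projects",
        "description": "Manage your own projects.",
        "id": "1415de42-8f01-4f36-bf34-539f23e47466",
        "parent_id": None,
    }
]
print(project_id_by_name(projects, "My Projects"))
```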
## Create project Create a new project. - - - ```bash curl -X POST \ "$LANGFLOW_URL/api/v1/projects/" \ @@ -61,8 +54,8 @@ curl -X POST \ }' ``` - - +
+Result ```json { @@ -73,8 +66,7 @@ curl -X POST \ } ``` - - +
To add flows and components at project creation, retrieve the `components_list` and `flows_list` values from the [`/all`](/api-reference-api-examples#get-all-components) and [/flows/read](/api-flows#read-flows) endpoints and add them to the request body. @@ -104,9 +96,6 @@ Retrieve details of a specific project. To find the UUID of your project, call the [read projects](#read-projects) endpoint. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/projects/$PROJECT_ID" \ @@ -114,8 +103,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json [ @@ -128,8 +117,7 @@ curl -X GET \ ] ``` - - +
## Update project @@ -139,9 +127,6 @@ Each PATCH request updates the project with the values you send. Only the fields you include in your request are updated. If you send the same values multiple times, the update is still processed, even if the values are unchanged. - - - ```bash curl -X PATCH \ "$LANGFLOW_URL/api/v1/projects/b408ddb9-6266-4431-9be8-e04a62758331" \ @@ -160,8 +145,8 @@ curl -X PATCH \ }' ``` - - +
+Result ```json { @@ -172,16 +157,12 @@ curl -X PATCH \ } ``` - - +
## Delete project Delete a specific project. - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/projects/$PROJECT_ID" \ @@ -189,15 +170,14 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```text 204 No Content ``` - - +
## Export a project @@ -207,7 +187,7 @@ The `--output` flag is optional. ```bash curl -X GET \ - "$LANGFLOW_URL/api/v1/projects/download/b408ddb9-6266-4431-9be8-e04a62758331" \ + "$LANGFLOW_URL/api/v1/projects/download/$PROJECT_ID" \ -H "accept: application/json" \ -H "x-api-key: $LANGFLOW_API_KEY" \ --output langflow-project.zip @@ -224,4 +204,4 @@ curl -X POST \ -H "Content-Type: multipart/form-data" \ -H "x-api-key: $LANGFLOW_API_KEY" \ -F "file=@20241230_135006_langflow_flows.zip;type=application/zip" -``` +``` \ No newline at end of file diff --git a/docs/docs/API-Reference/api-reference-api-examples.mdx b/docs/docs/API-Reference/api-reference-api-examples.mdx index 3328cc88ed31..29f30b14077e 100644 --- a/docs/docs/API-Reference/api-reference-api-examples.mdx +++ b/docs/docs/API-Reference/api-reference-api-examples.mdx @@ -138,6 +138,7 @@ curl -X GET \
Result + ```text { "version": "1.1.1", @@ -145,6 +146,7 @@ curl -X GET \ "package": "Langflow" } ``` +
### Get configuration @@ -160,6 +162,7 @@ curl -X GET \
Result + ```json { "feature_flags": { @@ -172,6 +175,7 @@ curl -X GET \ "max_file_size_upload": 100 } ``` +
### Get all components diff --git a/docs/docs/API-Reference/api-users.mdx b/docs/docs/API-Reference/api-users.mdx index ee4a364e979e..4aa65eee683c 100644 --- a/docs/docs/API-Reference/api-users.mdx +++ b/docs/docs/API-Reference/api-users.mdx @@ -17,9 +17,6 @@ Create a new user account with a username and password. This creates a new UUID for the user's `id`, which is mapped to `user_id` in the Langflow database. - - - ```bash curl -X POST \ "$LANGFLOW_URL/api/v1/users/" \ @@ -31,8 +28,8 @@ curl -X POST \ }' ``` - - +
+Result ```json { @@ -53,16 +50,12 @@ curl -X POST \ } ``` - - +
## Get current user Retrieve information about the currently authenticated user. - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/users/whoami" \ @@ -70,8 +63,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -87,17 +80,13 @@ curl -X GET \ } ``` - - +
## List all users Get a paginated list of all users in the system. Only superusers can use this endpoint (`is_superuser: true`). - - - ```bash curl -X GET \ "$LANGFLOW_URL/api/v1/users/?skip=0&limit=10" \ @@ -105,8 +94,8 @@ curl -X GET \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -165,8 +154,7 @@ curl -X GET \ } ``` - - +
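The `skip` and `limit` query parameters page through the user list. A tiny Python sketch for turning a page number into those parameters (page numbering starting at 1 is an assumption of this sketch, not part of the API):

```python
import math

def page_params(page: int, limit: int = 10) -> dict:
    """Query parameters for the given page of GET /api/v1/users/."""
    return {"skip": (page - 1) * limit, "limit": limit}

def total_pages(total_count: int, limit: int = 10) -> int:
    """Number of pages needed to list total_count users."""
    return math.ceil(total_count / limit)

print(page_params(3))    # {'skip': 20, 'limit': 10}
print(total_pages(23))   # 3
```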
## Update user @@ -174,9 +162,6 @@ Modify an existing user's information with a PATCH request. This example makes the user `10c1c6a2-ab8a-4748-8700-0e4832fd5ce8` an active superuser. - - - ```bash curl -X PATCH \ "$LANGFLOW_URL/api/v1/users/10c1c6a2-ab8a-4748-8700-0e4832fd5ce8" \ @@ -188,8 +173,8 @@ curl -X PATCH \ }' ``` - - +
+Result ```json { @@ -210,8 +195,7 @@ curl -X PATCH \ } ``` - - +
## Reset password @@ -219,9 +203,6 @@ Change a user's password to a new secure value. You can't change another user's password. - - - ```bash curl -X PATCH \ "$LANGFLOW_URL/api/v1/users/10c1c6a2-ab8a-4748-8700-0e4832fd5ce8/reset-password" \ @@ -232,8 +213,8 @@ curl -X PATCH \ }' ``` - - +
+Result ```json { @@ -254,8 +235,7 @@ curl -X PATCH \ } ``` - - +
## Delete user @@ -263,9 +243,6 @@ Remove a user account from the system. Only superusers can use this endpoint (`is_superuser: true`). - - - ```bash curl -X DELETE \ "$LANGFLOW_URL/api/v1/users/10c1c6a2-ab8a-4748-8700-0e4832fd5ce8" \ @@ -273,8 +250,8 @@ curl -X DELETE \ -H "x-api-key: $LANGFLOW_API_KEY" ``` - - +
+Result ```json { @@ -282,5 +259,4 @@ curl -X DELETE \ } ``` - - \ No newline at end of file +
\ No newline at end of file diff --git a/docs/docs/Agents/agents-tools.mdx b/docs/docs/Agents/agents-tools.mdx index 67d8c3a2d0bb..ac0487dca53e 100644 --- a/docs/docs/Agents/agents-tools.mdx +++ b/docs/docs/Agents/agents-tools.mdx @@ -46,92 +46,92 @@ For example, the default tool name is `Agent`. Edit the name to `Agent-gpt-41`, ## Add custom components as tools {#components-as-tools} -An agent can use custom components as tools. +An agent can use [custom components](/components-custom-components) as tools. 1. To add a custom component to the agent flow, click **New Custom Component**. -2. Add custom Python code to the custom component. -For example, to create a text analyzer component, paste the below code into the custom component's **Code** pane. - -
-Python - -```python -from langflow.custom import Component -from langflow.io import MessageTextInput, Output -from langflow.schema import Data -import re - -class TextAnalyzerComponent(Component): - display_name = "Text Analyzer" - description = "Analyzes and transforms input text." - documentation: str = "http://docs.langflow.org/components/custom" - icon = "chart-bar" - name = "TextAnalyzerComponent" - - inputs = [ - MessageTextInput( - name="input_text", - display_name="Input Text", - info="Enter text to analyze", - value="Hello, World!", - tool_mode=True, - ), - ] - - outputs = [ - Output(display_name="Analysis Result", name="output", method="analyze_text"), - ] - - def analyze_text(self) -> Data: - text = self.input_text - - # Perform text analysis - word_count = len(text.split()) - char_count = len(text) - sentence_count = len(re.findall(r'\w+[.!?]', text)) - - # Transform text - reversed_text = text[::-1] - uppercase_text = text.upper() - - analysis_result = { - "original_text": text, - "word_count": word_count, - "character_count": char_count, - "sentence_count": sentence_count, - "reversed_text": reversed_text, - "uppercase_text": uppercase_text - } - - data = Data(value=analysis_result) - self.status = data - return data -``` -
+2. Enter Python code into the **Code** pane to create the custom component. + + If you don't already have code for a custom component, you can use the following code snippet as an example before creating your own. + +
+ Text Analyzer custom component + + This code creates a text analyzer component. + + ```python + from langflow.custom import Component + from langflow.io import MessageTextInput, Output + from langflow.schema import Data + import re + + class TextAnalyzerComponent(Component): + display_name = "Text Analyzer" + description = "Analyzes and transforms input text." + documentation: str = "http://docs.langflow.org/components/custom" + icon = "chart-bar" + name = "TextAnalyzerComponent" + + inputs = [ + MessageTextInput( + name="input_text", + display_name="Input Text", + info="Enter text to analyze", + value="Hello, World!", + tool_mode=True, + ), + ] + + outputs = [ + Output(display_name="Analysis Result", name="output", method="analyze_text"), + ] + + def analyze_text(self) -> Data: + text = self.input_text + + # Perform text analysis + word_count = len(text.split()) + char_count = len(text) + sentence_count = len(re.findall(r'\w+[.!?]', text)) + + # Transform text + reversed_text = text[::-1] + uppercase_text = text.upper() + + analysis_result = { + "original_text": text, + "word_count": word_count, + "character_count": char_count, + "sentence_count": sentence_count, + "reversed_text": reversed_text, + "uppercase_text": uppercase_text + } + + data = Data(value=analysis_result) + self.status = data + return data + ``` +
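If you want to sanity-check the component's analysis logic outside of Langflow, the pure-text part of `analyze_text` can be exercised as plain Python. This sketch repeats only the string handling from the example above, leaving out the Langflow imports and the `Data` wrapper:

```python
import re

def analyze_text(text: str) -> dict:
    """Standalone version of the Text Analyzer component's core logic."""
    return {
        "original_text": text,
        "word_count": len(text.split()),
        "character_count": len(text),
        "sentence_count": len(re.findall(r"\w+[.!?]", text)),
        "reversed_text": text[::-1],
        "uppercase_text": text.upper(),
    }

print(analyze_text("Hello, World!")["word_count"])  # 2
```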
3. To use the custom component as a tool, click **Tool Mode**. 4. Connect the custom component's tool output to the agent's tools input. 5. Open the
+ 2. Add an **SQL Database** component to your flow. diff --git a/docs/docs/Components/components-vector-stores.mdx b/docs/docs/Components/components-vector-stores.mdx index a21abe05b24a..e8c8bafd1a26 100644 --- a/docs/docs/Components/components-vector-stores.mdx +++ b/docs/docs/Components/components-vector-stores.mdx @@ -576,12 +576,6 @@ For more information, see the [Chroma documentation](https://docs.trychroma.com/ - - - - - - ## Milvus This component creates a Milvus vector store with search capabilities. diff --git a/docs/docs/Concepts/concepts-file-management.mdx b/docs/docs/Concepts/concepts-file-management.mdx index 6a91212a5489..376c00b7cc07 100644 --- a/docs/docs/Concepts/concepts-file-management.mdx +++ b/docs/docs/Concepts/concepts-file-management.mdx @@ -18,7 +18,7 @@ You can also manage all files that have been uploaded to your Langflow server. 1. Navigate to Langflow file management: - * In the Langflow UI, on the [**Projects** page](#projects) page, click **My Files** below the list of projects. + * In the Langflow UI, on the [**Projects** page](/concepts-flows#projects) page, click **My Files** below the list of projects. * From a browser, navigate to your Langflow server's `/files` endpoint, such as `http://localhost:7860/files`. Modify the base URL as needed for your Langflow server. * For programmatic file management, use the [Langflow API files endpoints](/api-files). However, the following steps assume you're using the file management UI. 
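For scripted file management, a minimal sketch of an authenticated request against the files endpoints might look like the following. The `/api/v2/files` path and the `x-api-key` header are assumptions based on the Langflow API files docs; check your server's `/docs` page for the authoritative reference.

```python
import urllib.request

BASE_URL = "http://localhost:7860"  # assumption: default local Langflow server
API_KEY = "YOUR_LANGFLOW_API_KEY"   # placeholder: create one under Settings > Langflow API Keys

def list_files_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for the file-management list endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/v2/files",
        headers={"accept": "application/json", "x-api-key": api_key},
    )

req = list_files_request(BASE_URL, API_KEY)
print(req.full_url)  # http://localhost:7860/api/v2/files
# urllib.request.urlopen(req)  # requires a running server; returns the file list as JSON
```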
diff --git a/docs/docs/Concepts/concepts-overview.mdx b/docs/docs/Concepts/concepts-overview.mdx index 9530d308167b..bce9cd7b75d8 100644 --- a/docs/docs/Concepts/concepts-overview.mdx +++ b/docs/docs/Concepts/concepts-overview.mdx @@ -22,7 +22,7 @@ This is where you add [components](/concepts-components), configure them, and at ![Empty Langflow workspace](/img/workspace.png) -From the **Workspace**, you can also access the [**Playground**](#playground), [**Share** menu](#share-menu), and [**Logs**](/concepts-flows#flow-logs). +From the **Workspace**, you can also access the [**Playground**](#playground), [**Share** menu](#share-menu), and [**Logs**](/concepts-flows#flow-storage-and-logs). ### Workspace gestures and interactions diff --git a/docs/docs/Concepts/concepts-playground.mdx b/docs/docs/Concepts/concepts-playground.mdx index 591deaa628e6..54b799328efb 100644 --- a/docs/docs/Concepts/concepts-playground.mdx +++ b/docs/docs/Concepts/concepts-playground.mdx @@ -87,7 +87,7 @@ Custom session IDs are helpful for multiple reasons: You can set custom session IDs in the visual editor and programmatically. - + In your [input and output components](/components-io), use the **Session ID** field: @@ -101,8 +101,8 @@ If the field is empty, the flow uses the default session ID. Make sure to change the **Session ID** when you want to start a new chat session or continue an earlier chat session with a different session ID. - - + + When you trigger a flow with the Langflow API, include the `session_id` parameter in the request payload. For example: @@ -121,7 +121,7 @@ curl -X POST "http://$LANGFLOW_SERVER_ADDRESS/api/v1/run/$FLOW_ID" \ This command starts a new chat sessions with the specified `session_id` or it retrieves an existing session with that ID, if one exists. 
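To sketch the same session-continuation pattern programmatically, the following Python helper builds the `/run` payload with an explicit `session_id`, mirroring the curl example above. The server address, flow ID, and API key are placeholders:

```python
import json
import urllib.request

def build_run_request(base_url, flow_id, api_key, message, session_id):
    """Build a POST /api/v1/run/{flow_id} request that pins the chat session."""
    payload = {
        "input_value": message,
        "output_type": "chat",
        "input_type": "chat",
        "session_id": session_id,  # same ID on every call = same conversation
    }
    return urllib.request.Request(
        f"{base_url}/api/v1/run/{flow_id}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )

req = build_run_request("http://localhost:7860", "FLOW_ID", "YOUR_API_KEY",
                        "What did I just ask you?", "customer-123")
print(req.full_url)  # http://localhost:7860/api/v1/run/FLOW_ID
# urllib.request.urlopen(req)  # requires a running Langflow server
```

Reusing `"customer-123"` on a later call continues that conversation; changing the ID starts a fresh session.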
- + :::tip diff --git a/docs/docs/Concepts/concepts-publish.mdx b/docs/docs/Concepts/concepts-publish.mdx index c624ca0e0858..2570e5c6b953 100644 --- a/docs/docs/Concepts/concepts-publish.mdx +++ b/docs/docs/Concepts/concepts-publish.mdx @@ -169,7 +169,7 @@ For more information, see the [langflow-embedded-chat README](https://github.com The following examples show how to use embedded chat widget in React, Angular, and plain HTML. - + To use the chat widget in your React application, create a component that loads the widget script and renders the chat interface: @@ -258,8 +258,8 @@ Modify the following reference for your React component's name and the desired ` ``` - - + + To use the chat widget in your Angular application, create a component that loads the widget script and renders the chat interface. @@ -330,8 +330,8 @@ You must add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration to enable t } ``` - - + + ```html @@ -348,7 +348,7 @@ You must add `CUSTOM_ELEMENTS_SCHEMA` to your module's configuration to enable t ``` - + ### Configure the langflow-chat web component {#configure-the-langflow-chat-web-component} diff --git a/docs/docs/Concepts/data-types.mdx b/docs/docs/Concepts/data-types.mdx index 6ac2ce354be5..1703a717fac9 100644 --- a/docs/docs/Concepts/data-types.mdx +++ b/docs/docs/Concepts/data-types.mdx @@ -274,15 +274,15 @@ The following example shows how to inspect the output of a **Type Convert** comp The default output is `Message` data, which is the same as the input coming from the **Chat Input** component. To see the `Message` data converted to `Data` or `DataFrame`, change the **Output Type** on the **Type Convert** component, and then rerun the component. 
-
-
+
+

 ```text
 Input text
 ```

-
-
+
+

 ```json
 {
@@ -317,8 +317,8 @@ The following example shows how to inspect the output of a **Type Convert** comp
 }
 ```

-
-
+
+

 ```text

 | Field | Value |
@@ -339,7 +339,7 @@ The following example shows how to inspect the output of a **Type Convert** comp
 | duration | (empty) |
 ```

-
+

 ## See also

diff --git a/docs/docs/Concepts/mcp-server.mdx b/docs/docs/Concepts/mcp-server.mdx
index 343153fcb945..1f3dc51c115e 100644
--- a/docs/docs/Concepts/mcp-server.mdx
+++ b/docs/docs/Concepts/mcp-server.mdx
@@ -101,19 +101,21 @@ The following procedure describes how to connect [Cursor](https://www.cursor.com

 However, you can connect any [MCP-compatible client](https://modelcontextprotocol.io/clients) following similar steps.

-
+
+
+:::important
+Auto installation only works if your HTTP client and Langflow server are on the same local machine.
+If this is not the case, configure the client with the code in the **JSON** tab.
+:::

 1. Install [Cursor](https://docs.cursor.com/get-started/installation).
 2. In the Langflow dashboard, select the project that contains the flows you want to serve, and then click the **MCP Server** tab.
 3. To auto install your current Langflow project as an MCP server, click **Add**.
-   The installation adds the server's configuration file to Cursor's `mcp.json` configuration file.
-   :::important
-   Auto installation only works if your HTTP client and Langflow server are on the same local machine.
-   In this is not the case, configure the client with the code in the **JSON** tab.
-   :::
-
-
+The installation adds the server's configuration file to Cursor's `mcp.json` configuration file.
+
+
+

 1. Install [Cursor](https://docs.cursor.com/get-started/installation).
 2. In Cursor, go to **Cursor Settings > MCP**, and then click **Add New Global MCP Server**.
@@ -149,7 +151,7 @@ For example:

 5. Save and close the `mcp.json` file in Cursor. The newly added MCP server will appear in the **MCP Servers** section. 
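For reference, a manually configured `mcp.json` entry typically mirrors the `uvx`-based configuration shown later on this page for Claude for Desktop. In this sketch, `PROJECT_NAME`, `LANGFLOW_SERVER_ADDRESS`, and `PROJECT_ID` are placeholders you replace with your own values:

```json
{
  "mcpServers": {
    "PROJECT_NAME": {
      "command": "uvx",
      "args": [
        "mcp-proxy",
        "http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse"
      ]
    }
  }
}
```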
- + Cursor is now connected to your project's MCP server and your flows are registered as tools. @@ -250,47 +252,49 @@ The default address is `http://localhost:6274`. If Claude for Desktop is not using your server's tools correctly, you may need to explicitly define the path to your local `uvx` or `npx` executable file in the `claude_desktop_config.json` configuration file. 1. To find your UVX path, run `which uvx`. -To find your NPX path, run `which npx`. + + To find your NPX path, run `which npx`. 2. Copy the path, and then replace `PATH_TO_UVX` or `PATH_TO_NPX` in your `claude_desktop_config.json` file. - - + + -```json -{ - "mcpServers": { - "PROJECT_NAME": { - "command": "PATH_TO_UVX", - "args": [ - "mcp-proxy", - "http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse" - ] + ```json + { + "mcpServers": { + "PROJECT_NAME": { + "command": "PATH_TO_UVX", + "args": [ + "mcp-proxy", + "http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse" + ] + } + } } - } -} -``` - + ``` - + + -```json -{ - "mcpServers": { - "PROJECT_NAME": { - "command": "PATH_TO_NPX", - "args": [ - "-y", - "supergateway", - "--sse", - "http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse" - ] + ```json + { + "mcpServers": { + "PROJECT_NAME": { + "command": "PATH_TO_NPX", + "args": [ + "-y", + "supergateway", + "--sse", + "http://LANGFLOW_SERVER_ADDRESS/api/v1/mcp/project/PROJECT_ID/sse" + ] + } + } } - } -} -``` - - + ``` + + + ## See also diff --git a/docs/docs/Configuration/configuration-api-keys.mdx b/docs/docs/Configuration/configuration-api-keys.mdx index 74622c2482a6..1716c0ea6f12 100644 --- a/docs/docs/Configuration/configuration-api-keys.mdx +++ b/docs/docs/Configuration/configuration-api-keys.mdx @@ -33,15 +33,15 @@ You can generate a Langflow API key with the UI or the CLI. The UI-generated key is appropriate for most cases. The CLI-generated key is needed when your Langflow server is running in `--backend-only` mode. - + 1. 
In the Langflow UI header, click your profile icon, and then select **Settings**. 2. Click **Langflow API Keys**, and then click **Add New**. 3. Name your key, and then click **Create API Key**. 4. Copy the API key and store it securely. - - + + If you're serving your flow with `--backend-only=true`, you can't create API keys in the UI, because the frontend is not running. @@ -69,7 +69,7 @@ To create an API key for a user from the CLI, do the following: -H "x-api-key: $LANGFLOW_API_KEY" ``` -
+
Result ```json @@ -93,7 +93,8 @@ To create an API key for a user from the CLI, do the following: ```shell uv run langflow api-key ``` - + + ## Authenticate requests with the Langflow API key @@ -103,7 +104,7 @@ Include your API key in API requests to authenticate requests to Langflow. API keys allow access only to the flows and components of the specific user who created the key. - + To use the API key when making API requests, include the API key in the HTTP header: @@ -115,8 +116,8 @@ curl -X POST \ -d '{"inputs": {"text":""}, "tweaks": {}}' ``` - - + + To pass the API key as a query parameter: @@ -126,7 +127,8 @@ curl -X POST \ -H 'Content-Type: application/json' \ -d '{"inputs": {"text":""}, "tweaks": {}}' ``` - + + ## Generate a Langflow secret key diff --git a/docs/docs/Configuration/configuration-authentication.mdx b/docs/docs/Configuration/configuration-authentication.mdx index 4db5741c95a7..ad0c29b1f35f 100644 --- a/docs/docs/Configuration/configuration-authentication.mdx +++ b/docs/docs/Configuration/configuration-authentication.mdx @@ -70,38 +70,39 @@ To generate a `LANGFLOW_SECRET_KEY`, follow these steps: 1. Run the command to generate and copy a secret to the clipboard. 
- - + + -```bash -# Copy to clipboard (macOS) -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | pbcopy + ```bash + # Copy to clipboard (macOS) + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | pbcopy -# Copy to clipboard (Linux) -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | xclip -selection clipboard + # Copy to clipboard (Linux) + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | xclip -selection clipboard -# Or just print -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" -``` - + # Or just print + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" + ``` - + + -```bash -# Copy to clipboard -python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | clip + ```bash + # Copy to clipboard + python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | clip -# Or just print -python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" -``` + # Or just print + python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" + ``` - - + + 2. Paste the value into your `.env` file: -```text -LANGFLOW_SECRET_KEY=dBuuuB_FHLvU8T9eUNlxQF9ppqRxwWpXXQ42kM2_fb -``` + + ```text + LANGFLOW_SECRET_KEY=dBuuuB_FHLvU8T9eUNlxQF9ppqRxwWpXXQ42kM2_fb + ``` ### LANGFLOW_NEW_USER_IS_ACTIVE @@ -134,71 +135,73 @@ LANGFLOW_NEW_USER_IS_ACTIVE=False 2. Generate a secret key for encrypting sensitive data. 
-Generate your secret key using one of the following commands: + Generate your secret key using one of the following commands: - - + + -```bash -# Copy to clipboard (macOS) -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | pbcopy + ```bash + # Copy to clipboard (macOS) + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | pbcopy -# Copy to clipboard (Linux) -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | xclip -selection clipboard + # Copy to clipboard (Linux) + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | xclip -selection clipboard -# Or just print -python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" -``` - + # Or just print + python3 -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" + ``` - + + -```bash -# Copy to clipboard -python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | clip + ```bash + # Copy to clipboard + python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" | clip -# Or just print -python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" -``` + # Or just print + python -c "from secrets import token_urlsafe; print(f'LANGFLOW_SECRET_KEY={token_urlsafe(32)}')" + ``` - - + + 3. Paste your `LANGFLOW_SECRET_KEY` into the `.env` file. 4. Start Langflow with the configuration from your `.env` file. -```text -uv run langflow run --env-file .env -``` + ```text + uv run langflow run --env-file .env + ``` 5. Verify the server is running. The default location is `http://localhost:7860`. ### Manage users as an administrator 1. To complete your first-time login as a superuser, go to `http://localhost:7860/login`. -2. 
Log in with your superuser credentials: -* Username: Value of `LANGFLOW_SUPERUSER` (for example, `administrator`) -* Password: Value of `LANGFLOW_SUPERUSER_PASSWORD` (for example, `securepassword`) +2. Log in with your superuser credentials. -:::info -The default values are `langflow` and `langflow`. -::: + * Username: Value of `LANGFLOW_SUPERUSER` (for example, `administrator`) + * Password: Value of `LANGFLOW_SUPERUSER_PASSWORD` (for example, `securepassword`) + + The default values are both `langflow`. 3. To manage users on your server, navigate to the `/admin` page at `http://localhost:7860/admin`. -Click your user profile image, and then click **Admin Page**. -As a superuser, you can create users, set permissions, reset passwords, and delete accounts. + Click your user profile image, and then click **Admin Page**. + + As a superuser, you can create users, set permissions, reset passwords, and delete accounts. 4. To create a user, in the Langflow UI, click **New User**, and then complete the following fields: -* **Username** -* **Password** and **Confirm Password** -* Select **Active** and deselect **Superuser** for the new user. -**Active** users can log into the system and access their flows. **Inactive** users cannot log in or see their flows. -A **Superuser** has full administrative privileges. + + * **Username** + * **Password** and **Confirm Password** + * Select **Active** and deselect **Superuser** for the new user. + **Active** users can log into the system and access their flows. **Inactive** users cannot log in or see their flows. + A **Superuser** has full administrative privileges. 5. To complete user creation, click **Save**. Your new user appears in the **Admin Page**. + 6. To confirm your new user's functionality, log out of Langflow, and log back in with your new user's credentials. Attempt to access the `/admin` page. You should be redirected to the `/flows` page, because the new user is not a superuser. 
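As a quick check of which account a given API key resolves to, you can query the current-user endpoint. This is a sketch: the `/api/v1/users/whoami` path is an assumption based on Langflow's user API, and the address and key are placeholders.

```python
import urllib.request

def whoami_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request that asks Langflow which user owns the API key."""
    return urllib.request.Request(
        f"{base_url}/api/v1/users/whoami",
        headers={"accept": "application/json", "x-api-key": api_key},
    )

req = whoami_request("http://localhost:7860", "YOUR_LANGFLOW_API_KEY")
print(req.full_url)  # http://localhost:7860/api/v1/users/whoami
# urllib.request.urlopen(req)  # requires a running server; returns the user record
```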
diff --git a/docs/docs/Configuration/configuration-global-variables.mdx b/docs/docs/Configuration/configuration-global-variables.mdx index 142816a621c3..d9d5d7bf42c0 100644 --- a/docs/docs/Configuration/configuration-global-variables.mdx +++ b/docs/docs/Configuration/configuration-global-variables.mdx @@ -77,7 +77,6 @@ Langflow's [default global variables](#default-environment-variables) are alread You can extend this list by setting `LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT` with your additional variables. - If you installed Langflow locally, you must define the `LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT` environment variable in a `.env` file. @@ -107,11 +106,9 @@ If you installed Langflow locally, you must define the `LANGFLOW_VARIABLES_TO_GE VARIABLE1="VALUE1" VARIABLE2="VALUE2" python -m langflow run --env-file .env ``` - :::note In this example, the environment variables (`VARIABLE1="VALUE1"` and `VARIABLE2="VALUE2"`) are prefixed to the startup command. This is a rudimentary method for exposing environment variables to Python on the command line, and is meant for illustrative purposes. Make sure to expose your environment variables to Langflow in a manner that best suits your own environment. - ::: 5. Confirm that Langflow successfully sourced the global variables from the environment: @@ -120,7 +117,6 @@ If you installed Langflow locally, you must define the `LANGFLOW_VARIABLES_TO_GE 2. Click **Global Variables**, and then make sure that your environment variables appear in the **Global Variables** list. - If you're using Docker, you can pass `LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT` directly from the command line or from a `.env` file. @@ -148,7 +144,6 @@ docker run -it --rm \ ``` - :::info @@ -160,7 +155,6 @@ When adding global variables from the environment, the following limitations app - Global variables that you add from the environment always have the **Credential** type. 
::: - If you want to explicitly prevent Langflow from sourcing global variables from the environment, set `LANGFLOW_STORE_ENVIRONMENT_VARIABLES` to `false` in your `.env` file: ```text @@ -205,5 +199,4 @@ Langflow automatically detects and converts some environment variables into glob - `VECTARA_CORPUS_ID` - `VECTARA_CUSTOMER_ID` - -For information about other environment variables and their usage, see [Environment Variables](/environment-variables). +For information about other environment variables and their usage, see [Environment Variables](/environment-variables). \ No newline at end of file diff --git a/docs/docs/Configuration/environment-variables.mdx b/docs/docs/Configuration/environment-variables.mdx index 6583b9971143..1b9d86adb471 100644 --- a/docs/docs/Configuration/environment-variables.mdx +++ b/docs/docs/Configuration/environment-variables.mdx @@ -25,28 +25,30 @@ If you choose to use both sources together, be aware that environment variables Run the following commands to set environment variables for your current terminal session: - + ```bash export VARIABLE_NAME='VALUE' ``` - + + ``` set VARIABLE_NAME='VALUE' ``` - + + ```bash docker run -it --rm \ -p 7860:7860 \ -e VARIABLE_NAME='VALUE' \ langflowai/langflow:latest ``` - + When you start Langflow, it looks for environment variables that you've set in your terminal. @@ -249,7 +251,7 @@ The following table lists the environment variables supported by Langflow. The following examples show how to configure Langflow using environment variables in different scenarios. - + The `.env` file is a text file that contains key-value pairs of environment variables. @@ -406,8 +408,8 @@ For macOS, this means any GUI-based app launched from Finder, Spotlight, Launchp To set environment variables for Langflow Desktop, you need to use specific commands or files, depending on your OS. 
- - + + Langflow Desktop for macOS cannot automatically use variables set in your terminal, such as those in`.zshrc` or `.bash_profile`, when launched from the macOS GUI. @@ -415,43 +417,45 @@ To make environment variables available to GUI apps on macOS, you need to use `l 1. Create the `LaunchAgents` directory if it doesn't exist: -```bash -mkdir -p ~/Library/LaunchAgents -``` + ```bash + mkdir -p ~/Library/LaunchAgents + ``` 2. In the `LaunchAgents` directory, create a `.plist` file called `dev.langflow.env`. 3. Add the following content to `dev.langflow.env.plist`, and then add, change, or remove Langflow environment variables as needed for your configuration. -This example sets the `LANGFLOW_CONFIG_DIR` environment variable for all GUI apps launched from the macOS GUI. - -```xml - - - - - Label - dev.langflow.env - ProgramArguments - - launchctl - setenv - LANGFLOW_CONFIG_DIR - /Users/your_user/custom/config - - RunAtLoad - - - -``` + + This example sets the `LANGFLOW_CONFIG_DIR` environment variable for all GUI apps launched from the macOS GUI. + + ```xml + + + + + Label + dev.langflow.env + ProgramArguments + + launchctl + setenv + LANGFLOW_CONFIG_DIR + /Users/your_user/custom/config + + RunAtLoad + + + + ``` 4. Load the file with `launchctl`: -```bash -launchctl load ~/Library/LaunchAgents/dev.langflow.env.plist -``` - - + ```bash + launchctl load ~/Library/LaunchAgents/dev.langflow.env.plist + ``` + + + Langflow Desktop for Windows cannot automatically use variables set in your terminal, such as those defined with `set` in `cmd` or `$env:VAR=...` in PowerShell, when launched from the Windows GUI. @@ -462,16 +466,18 @@ To set environment variables using the System Properties interface, do the follo 1. Press Win + R, enter `SystemPropertiesAdvanced`, and then press Enter. 2. Click **Environment Variables**. 3. Under **User variables**, click **New**. -:::tip -To apply the setting to all users, select **System variables**. 
-::: + + :::tip + To apply the setting to all users, select **System variables**. + ::: + 4. Enter the name of the Langflow variable you want to set, such as `LANGFLOW_CONFIG_DIR`, and the desired value, such as `C:\Users\your_user\.langflow_config`. 5. Click **OK** to save the variable. 6. Repeat until you have set all necessary Langflow environment variables. 7. Launch or restart Langflow Desktop to apply the environment variables. - - + + To define environment variables for Windows using PowerShell, do the following: @@ -489,5 +495,6 @@ To define environment variables for Windows using PowerShell, do the following: 2. Repeat until you have set all necessary Langflow environment variables. 3. Launch or restart Langflow Desktop to apply the environment variables. - - + + + \ No newline at end of file diff --git a/docs/docs/Contributing/contributing-how-to-contribute.mdx b/docs/docs/Contributing/contributing-how-to-contribute.mdx index ae1d7517f872..64cd0990b586 100644 --- a/docs/docs/Contributing/contributing-how-to-contribute.mdx +++ b/docs/docs/Contributing/contributing-how-to-contribute.mdx @@ -39,73 +39,65 @@ Replace the following: ### Run Langflow from source -If you're not developing, but want to run Langflow from source after cloning the repo, run the following commands. +You can run Langflow from source after cloning the repository, even if you're not contributing to the codebase. - - +
+Run from source on macOS/Linux - 1. To run Langflow from source, run the following command: - ```bash - make run_cli - ``` - - This command does the following: - - Installs frontend and backend dependencies - - Builds the frontend static files - - Starts the application with default settings - - The Langflow frontend is available at `http://localhost:7860/`. +In your terminal, navigate to the root of the Langflow directory, and then run `make run_cli`. - - +This command does the following: - To run Langflow from source on Windows, you can use the Langflow project's included scripts, or run the commands in the terminal. +- Installs frontend and backend dependencies +- Builds the frontend static files +- Starts the application with default settings - 1. To run Langflow with the included scripts, navigate to the `scripts/windows` directory. - Two scripts are available to install and start Langflow. +The Langflow frontend is served at `http://localhost:7860`. - 2. Run Langflow with one of the scripts. +
- - +
+Run from source with Windows CMD - To install and start Langflow with a Windows Batch file, double-click `build_and_run.bat`. +To run Langflow from source on Windows, you can use the Langflow project's included scripts, or run the commands in the terminal. - - +Do one of the following: - To install and start Langflow with the Powershell script, run: +* To install and run Langflow with the included Windows Batch file, navigate to the `scripts/windows` directory, and then run the `build_and_run.bat` file. - ```ps - .\build_and_run.ps1 - ``` +* To run Langflow from the Windows Command Line: - - + 1. Build the Langflow frontend: - **Alternatively**, to run Langflow from source with the Windows Command Line or Powershell, do the following. - - - - - 1. Run the following commands to build the Langflow frontend. ``` cd src/frontend && npm install && npm run build && npm run start ``` - 2. Copy the contents of the built `src/frontend/build` directory to `src/backend/base/langflow/frontend`. + 2. Copy the contents of the built `src/frontend/build` directory to `src/backend/base/langflow/frontend`. + + 3. Start Langflow: - 3. To start Langflow, run the following command. ``` uv run langflow run ``` - The frontend is served at http://localhost:7860. +The Langflow frontend is served at `http://localhost:7860`. - - +
+ +
+Run from source with Powershell + +To run Langflow from source on Windows, you can use the Langflow project's included scripts, or run the commands in the terminal. + +Do one of the following: + +* To install and run Langflow with the included scripts, navigate to the `scripts/windows` directory, and then run the `build_and_run.ps1` file. + +* To run Langflow from a Powershell terminal: + + 1. Run the following commands separately to build the Langflow frontend: - 1. Run the following commands to build the Langflow frontend. ``` cd src/frontend npm install @@ -113,32 +105,31 @@ If you're not developing, but want to run Langflow from source after cloning the npm run start ``` - 2. Copy the contents of the built `src/frontend/build` directory to `src/backend/base/langflow/frontend`. + 2. Copy the contents of the built `src/frontend/build` directory to `src/backend/base/langflow/frontend`. + + 3. Start Langflow: - 3. To start Langflow, run the following command. ``` uv run langflow run ``` - The frontend is served at http://localhost:7860. +The Langflow frontend is served at `http://localhost:7860`. - - - - - +
### Set up your Langflow development environment - - + + + +1. Set up the Langflow development environment: -1. To set up the Langflow development environment, run the following command: ```bash make init ``` This command sets up the development environment by doing the following: + - Checking for uv and npm. - Installing backend and frontend dependencies. - Installing pre-commit hooks. @@ -148,37 +139,39 @@ If you're not developing, but want to run Langflow from source after cloning the ```bash # Run backend in development mode (includes hot reload) make backend + ``` + ```bash # In another terminal, run frontend in development mode (includes hot reload) make frontend ``` + The `make backend` and `make frontend` commands automatically install dependencies, so you don't need to run install commands separately. + The frontend is served at `http://localhost:7860`. - The `make backend` and `make frontend` commands automatically install dependencies, so you don't need to run install commands separately. +3. Optional: Install pre-commit hooks to help keep your changes clean and well-formatted. + + With pre-commit hooks installed, you must use `uv run git commit` instead of `git commit` directly. -3. Optional: Install pre-commit hooks to help keep your changes clean and well-formatted. `make init` installs pre-commit hooks automatically. + `make init` installs pre-commit hooks automatically, or you can run the following command to install them manually: ```bash uv sync uv run pre-commit install ``` - :::note - With pre-commit hooks installed, you need to use `uv run git commit` instead of `git commit` directly. - ::: +4. To test your changes before pushing commits, run `make lint`, `make format`, and `make unit_tests`. +To run all tests, including coverage, unit, and integration, tests, run `make tests`. -4. To test your changes, run `make lint`, `make format`, and `make unit_tests` before pushing to the repository. 
-To run all tests, including unit tests, integration tests, and coverage, run `make tests`. - - - + + Since Windows does not include `make`, building and running Langflow from source uses `npm` and `uv`. -To set up the Langflow development environment, run the frontend and backend in separate terminals. +To set up the Langflow development environment, run the frontend and backend in separate terminals: -1. To run the frontend, run the following commands. +1. To run the frontend, run the following commands: ```bash cd src/frontend @@ -186,7 +179,7 @@ To set up the Langflow development environment, run the frontend and backend in npm run start ``` -2. To run the backend, run the following command. +2. In a separate terminal, run the following command to run the backend: ```bash uv run langflow run --backend-only @@ -194,7 +187,7 @@ To set up the Langflow development environment, run the frontend and backend in The frontend is served at `http://localhost:7860`. - + ### Debug @@ -213,40 +206,49 @@ For more information, see the [VSCode documentation](https://code.visualstudio.c ## Contribute documentation The documentation is built using [Docusaurus](https://docusaurus.io/) and written in [Markdown](https://docusaurus.io/docs/markdown-features). -Contributions should follow the [Google Developer Documentation Style Guide](https://developers.google.com/style). +For style guidance, see the [Google Developer Documentation Style Guide](https://developers.google.com/style). -### Prerequisites +1. Install [Node.js](https://nodejs.org/en/download/package-manager) and [Yarn](https://yarnpkg.com/) -* [Node.js](https://nodejs.org/en/download/package-manager) -* [Yarn](https://yarnpkg.com/) +2. Fork the [Langflow GitHub repository](https://github.com/langflow-ai/langflow). -### Clone the Langflow repository +3. Add the new remote to your local repository on your local machine: -1. 
Fork the [Langflow GitHub repository](https://github.com/langflow-ai/langflow). + ```bash + git remote add FORK_NAME https://github.com/GIT_USERNAME/langflow.git + ``` -2. Add the new remote to your local repository on your local machine: -```bash -git remote add FORK_NAME https://github.com/GIT_USERNAME/langflow.git -``` -Replace the following: -* `FORK_NAME`: A name for your fork of the repository -* `GIT_USERNAME`: Your Git username + Replace the following: -3. From your Langflow fork's root, change directory to the `langflow/docs` folder with the following command: -```bash -cd docs -``` + * `FORK_NAME`: A name for your fork of the repository + * `GIT_USERNAME`: Your Git username -4. To install dependencies and start a local Docusaurus static site with hot reloading, run: -```bash -yarn install -yarn start -``` +4. From the root of your local Langflow fork, change to the `/docs` directory: + + ```bash + cd docs + ``` + +5. Install dependencies and start a local Docusaurus static site with hot reload: + + ```bash + yarn install + yarn start + ``` + + The documentation is served at `localhost:3000`. + +6. To edit and create content, work with the `.mdx` files in the `langflow/docs/docs` directory. + + Create new files in `.mdx` format. + + Navigation is defined in `langflow/docs/sidebars.js`. -The documentation is available at `localhost:3000`. -The Markdown content files are located in the `langflow/docs/docs` folder. + Most pages use a `slug` for shorthand cross-referencing, rather than supplying the full or relative directory path. + For example, if a page has a `slug` of `/cool-page`, you can link to it with `[Cool page](/cool-page)` from any other `/docs` page. -5. Optional: Run `yarn build` to build the site locally and ensure there are no broken links. +7. Recommended: After making some changes, run `yarn build` to build the site locally with more robust logging. 
+This can help you find broken links before creating a PR. ## Open a pull request diff --git a/docs/docs/Deployment/deployment-kubernetes-prod.mdx b/docs/docs/Deployment/deployment-kubernetes-prod.mdx index 58b8349a34cb..4bcfef5f3c75 100644 --- a/docs/docs/Deployment/deployment-kubernetes-prod.mdx +++ b/docs/docs/Deployment/deployment-kubernetes-prod.mdx @@ -111,7 +111,7 @@ For example, the [example flow JSON](https://raw.githubusercontent.com/langflow- Instead, when importing the flow in the Langflow runtime, you can set the global variable in one of the following ways: - + ```yaml env: @@ -131,7 +131,7 @@ env: ``` - + 1. Create the secret: ```shell diff --git a/docs/docs/Develop/Clients/typescript-client.mdx b/docs/docs/Develop/Clients/typescript-client.mdx index 8a855a5fb1f1..b00a6fc1f79c 100644 --- a/docs/docs/Develop/Clients/typescript-client.mdx +++ b/docs/docs/Develop/Clients/typescript-client.mdx @@ -16,7 +16,7 @@ For the npm package, see [@datastax/langflow-client](https://www.npmjs.com/packa To install the Langflow typescript client package, use one of the following commands: - + ```bash @@ -44,77 +44,79 @@ pnpm add @datastax/langflow-client 1. Import the client into your code. -```tsx -import { LangflowClient } from "@datastax/langflow-client"; -``` + ```tsx + import { LangflowClient } from "@datastax/langflow-client"; + ``` -2. Initialize a client object to interact with your server. -The `LangflowClient` object allows you to interact with the Langflow API. +2. Initialize a `LangflowClient` object to interact with your server: -Replace `BASE_URL` and `API_KEY` with values from your deployment. -The default Langflow base URL is `http://localhost:7860`. -To create an API key, see [API keys](/configuration-api-keys). 
+ ```tsx + const baseUrl = "BASE_URL"; + const apiKey = "API_KEY"; + const client = new LangflowClient({ baseUrl, apiKey }); + ``` -```tsx -const baseUrl = "BASE_URL"; -const apiKey = "API_KEY"; -const client = new LangflowClient({ baseUrl, apiKey }); -``` + Replace `BASE_URL` and `API_KEY` with values from your deployment. + The default Langflow base URL is `http://localhost:7860`. + To create an API key, see [API keys](/configuration-api-keys). ## Langflow TypeScript client quickstart -1. With your Langflow client initialized, submit a message to your Langflow server and receive a response. -This example uses the minimum values for sending a message and running your flow on a Langflow server, with no API keys. -Replace `baseUrl` and `flowId` with values from your deployment. -The `input` string is the message you're sending to your flow. +1. With your Langflow client initialized, test the connection by calling your Langflow server. -```tsx -import { LangflowClient } from "@datastax/langflow-client"; + The following example runs a flow (`runFlow`) by sending the flow ID and a chat input string: -const baseUrl = "http://localhost:7860"; -const client = new LangflowClient({ baseUrl }); + ```tsx + import { LangflowClient } from "@datastax/langflow-client"; -async function runFlow() { - const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; - const flow = client.flow(flowId); - const input = "Is anyone there?"; + const baseUrl = "http://localhost:7860"; + const client = new LangflowClient({ baseUrl }); - const response = await flow.run(input); - console.log(response); -} + async function runFlow() { + const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; + const flow = client.flow(flowId); + const input = "Is anyone there?"; -runFlow().catch(console.error); -``` + const response = await flow.run(input); + console.log(response); + } -
-Response + runFlow().catch(console.error); + ``` -``` -FlowResponse { - sessionId: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf', - outputs: [ { inputs: [Object], outputs: [Array] } ] -} -``` + Replace the following: -
+ * `baseUrl`: The URL of your Langflow server + * `flowId`: The ID of the flow you want to run + * `input`: The chat input message you want to send to trigger the flow -This confirms your client is connecting to Langflow. -* The `sessionID` value is a unique identifier for the client-server session. For more information, see [Session ID](/session-id). -* The `outputs` array contains the results of your flow execution. -2. To get the full response objects from your server, change the `console.log` code to stringify the returned JSON object: +2. Review the result to confirm that the client connected to your Langflow server. -```tsx -console.log(JSON.stringify(response, null, 2)); -``` + The following example shows the response from a well-formed `runFlow` request that reached the Langflow server and successfully started the flow: -The exact structure of the returned `inputs` and `outputs` depends on how your flow is configured in Langflow. + ``` + FlowResponse { + sessionId: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf', + outputs: [ { inputs: [Object], outputs: [Array] } ] + } + ``` -3. To get the first chat message returned from the chat output component, change `console.log` to use the `chatOutputText` convenience function. + In this case, the response includes a [`sessionID`](/session-id) that is a unique identifier for the client-server session and an `outputs` array that contains information about the flow run. -```tsx -console.log(response.chatOutputText()); -``` +3. If you want to get full response objects from the server, change `console.log` to stringify the returned JSON object: + + ```tsx + console.log(JSON.stringify(response, null, 2)); + ``` + + The exact structure of the returned `inputs` and `outputs` objects depends on the components and configuration of your flow. + +4. 
If you want the response to include only the chat message from the **Chat Output** component, change `console.log` to use the `chatOutputText` convenience function: + + ```tsx + console.log(response.chatOutputText()); + ``` ## Use advanced TypeScript client features @@ -123,132 +125,147 @@ The TypeScript client can do more than just connect to your server and run a flo This example builds on the quickstart with additional features for interacting with Langflow. 1. Pass tweaks to your code as an object with the request. -Tweaks change values within components for all calls to your flow. -This example tweaks the Open-AI model component to enforce using the `gpt-4o-mini` model. -```tsx -const tweaks = { model_name: "gpt-4o-mini" }; -``` -2. Pass a [session ID](/session-id) with the request to maintain the same conversation with the LLM from this application. -```tsx -const session_id = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; -``` -3. Instead of calling `run` on the Flow object, call `stream` with the same arguments. -The response is a [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) of objects. -For more information on streaming Langflow responses, see the [/run endpoint](/api-flows-run#run-flow). -```tsx -const response = await client.flow(flowId).stream(input); -for await (const event of response) { - console.log(event); -} -``` -4. Run the completed TypeScript application to call your server with `tweaks` and `session_id`, and stream the response back. -Replace `baseUrl` and `flowId` with values from your deployment. + Tweaks change values within components for all calls to your flow. 
-```tsx -import { LangflowClient } from "@datastax/langflow-client"; + This example tweaks the Open-AI model component to enforce using the `gpt-4o-mini` model: -const baseUrl = "http://localhost:7860"; -const client = new LangflowClient({ baseUrl }); - -async function runFlow() { - const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; - const input = "Is anyone there?"; + ```tsx const tweaks = { model_name: "gpt-4o-mini" }; - const session_id = "test-session"; + ``` + +2. Pass a [session ID](/session-id) with the request to separate the conversation from other flow runs, and to be able to continue this conversation by calling the same session ID in the future: + + ```tsx + const session_id = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; + ``` + +3. Instead of calling `run` on the Flow object, call `stream` with the same arguments: - const response = await client.flow(flowId).stream(input, { - session_id, - tweaks, - }); + ```tsx + const response = await client.flow(flowId).stream(input); for await (const event of response) { - console.log(event); + console.log(event); } + ``` -} -runFlow().catch(console.error); -``` + The response is a [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) of objects. + For more information on streaming Langflow responses, see the [`/run` endpoint](/api-flows-run#run-flow). -
-Response +4. Run the modified TypeScript application to run the flow with `tweaks` and `session_id`, and then stream the response back. -```text -{ - event: 'add_message', - data: { - timestamp: '2025-05-23 15:52:48 UTC', - sender: 'User', - sender_name: 'User', - session_id: 'test-session', - text: 'Is anyone there?', - files: [], - error: false, - edit: false, - properties: { - text_color: '', - background_color: '', - edited: false, - source: [Object], - icon: '', - allow_markdown: false, - positive_feedback: null, - state: 'complete', - targets: [] - }, - category: 'message', - content_blocks: [], - id: '7f096715-3f2d-4d84-88d6-5e2f76bf3fbe', - flow_id: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf', - duration: null - } -} -{ - event: 'token', - data: { - chunk: 'Absolutely', - id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', - timestamp: '2025-05-23 15:52:48 UTC' - } -} -{ - event: 'token', - data: { - chunk: ',', - id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', - timestamp: '2025-05-23 15:52:48 UTC' - } -} -{ - event: 'token', - data: { - chunk: " I'm", - id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', - timestamp: '2025-05-23 15:52:48 UTC' - } -} -{ - event: 'token', - data: { - chunk: ' here', - id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', - timestamp: '2025-05-23 15:52:48 UTC' - } -} +Replace `baseUrl` and `flowId` with values from your deployment. -// this response is abbreviated + ```tsx + import { LangflowClient } from "@datastax/langflow-client"; -{ - event: 'end', - data: { result: { session_id: 'test-session', outputs: [Array] } } -} -``` + const baseUrl = "http://localhost:7860"; + const client = new LangflowClient({ baseUrl }); -
+ async function runFlow() { + const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf"; + const input = "Is anyone there?"; + const tweaks = { model_name: "gpt-4o-mini" }; + const session_id = "test-session"; + + const response = await client.flow(flowId).stream(input, { + session_id, + tweaks, + }); + + for await (const event of response) { + console.log(event); + } + + } + runFlow().catch(console.error); + ``` + + Replace `baseUrl` and `flowId` with your server URL and flow ID, as you did in the previous run. + +
+ Result + + With streaming enabled, the response includes the flow metatadata and timestamped events for flow activity. + For example: + + ```text + { + event: 'add_message', + data: { + timestamp: '2025-05-23 15:52:48 UTC', + sender: 'User', + sender_name: 'User', + session_id: 'test-session', + text: 'Is anyone there?', + files: [], + error: false, + edit: false, + properties: { + text_color: '', + background_color: '', + edited: false, + source: [Object], + icon: '', + allow_markdown: false, + positive_feedback: null, + state: 'complete', + targets: [] + }, + category: 'message', + content_blocks: [], + id: '7f096715-3f2d-4d84-88d6-5e2f76bf3fbe', + flow_id: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf', + duration: null + } + } + { + event: 'token', + data: { + chunk: 'Absolutely', + id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', + timestamp: '2025-05-23 15:52:48 UTC' + } + } + { + event: 'token', + data: { + chunk: ',', + id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', + timestamp: '2025-05-23 15:52:48 UTC' + } + } + { + event: 'token', + data: { + chunk: " I'm", + id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', + timestamp: '2025-05-23 15:52:48 UTC' + } + } + { + event: 'token', + data: { + chunk: ' here', + id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5', + timestamp: '2025-05-23 15:52:48 UTC' + } + } + + // this response is abbreviated + + { + event: 'end', + data: { result: { session_id: 'test-session', outputs: [Array] } } + } + ``` + +
## Retrieve Langflow logs with the TypeScript client -To retrieve Langflow logs, you must enable log retrieval on your Langflow server by including the following values in your server's `.env` file: +To retrieve [Langflow logs](/logging), you must enable log retrieval on your Langflow server by including the following values in your server's `.env` file: ```text LANGFLOW_ENABLE_LOG_RETRIEVAL=true @@ -256,10 +273,7 @@ LANGFLOW_LOG_RETRIEVER_BUFFER_SIZE=10000 LANGFLOW_LOG_LEVEL=DEBUG ``` -For more information, see [Logs endpoints](/api-logs). - -This complete example starts streaming logs in the background, and then runs a flow so you can see how a flow executes. -Replace `baseUrl` and `flowId` with values from your deployment. +The following example script starts streaming logs in the background, and then runs a flow so you can monitor the flow run: ```tsx import { LangflowClient } from "@datastax/langflow-client"; @@ -290,12 +304,16 @@ async function main() { main().catch(console.error); ``` +Replace `baseUrl` and `flowId` with your server URL and flow ID, as you did in the previous run. + Logs begin streaming indefinitely, and the flow runs once. -The logs below are abbreviated, but you can monitor how the flow instantiates its components, configures its model, and processes the outputs. +The following example result is truncated for readability, but you can follow the messages to see how the flow instantiates its components, configures its model, and processes the outputs. + +The `FlowResponse` object, at the end of the stream, is returned to the client with the flow result in the `outputs` array.
-Response +Result ```text Starting log stream... @@ -359,4 +377,4 @@ Log: Log {
-The `FlowResponse` object is returned to the client, with the `outputs` array including your flow result. \ No newline at end of file +For more information, see [Logs endpoints](/api-logs). \ No newline at end of file diff --git a/docs/docs/Develop/develop-application.mdx b/docs/docs/Develop/develop-application.mdx index b46ed78669f1..3fea84ecddbb 100644 --- a/docs/docs/Develop/develop-application.mdx +++ b/docs/docs/Develop/develop-application.mdx @@ -206,6 +206,7 @@ For information about publishing your image on Docker Hub and running a Langflow "session_id": "charizard_test_request" }' ``` +
About this example diff --git a/docs/docs/Get-Started/get-started-installation.mdx b/docs/docs/Get-Started/get-started-installation.mdx index f42074e5e9bf..c988f57d1368 100644 --- a/docs/docs/Get-Started/get-started-installation.mdx +++ b/docs/docs/Get-Started/get-started-installation.mdx @@ -23,29 +23,29 @@ This option offers more control over the environment, dependencies, and versioni Langflow Desktop is a desktop version of Langflow that simplifies dependency management and upgrades. However, some features aren't available for Langflow Desktop, such as the **Shareable Playground**. - - + + - 1. Navigate to [Langflow Desktop](https://www.langflow.org/desktop). - 2. Click **Download Langflow**, enter your contact information, and then click **Download**. - 3. Mount and install the Langflow application. - 4. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart). +1. Navigate to [Langflow Desktop](https://www.langflow.org/desktop). +2. Click **Download Langflow**, enter your contact information, and then click **Download**. +3. Mount and install the Langflow application. +4. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart). - - + + - 1. Navigate to [Langflow Desktop](https://www.langflow.org/desktop). - 2. Click **Download Langflow**, enter your contact information, and then click **Download**. - 3. Open the **File Explorer**, and then navigate to **Downloads**. - 4. Double-click the downloaded `.msi` file, and then use the install wizard to install Langflow Desktop. +1. Navigate to [Langflow Desktop](https://www.langflow.org/desktop). +2. Click **Download Langflow**, enter your contact information, and then click **Download**. +3. Open the **File Explorer**, and then navigate to **Downloads**. +4. 
Double-click the downloaded `.msi` file, and then use the install wizard to install Langflow Desktop. - :::important - Windows installations of Langflow Desktop require a C++ compiler that may not be present on your system. If you receive a `C++ Build Tools Required!` error, follow the on-screen prompt to install Microsoft C++ Build Tools, or [install Microsoft Visual Studio](https://visualstudio.microsoft.com/downloads/). - ::: + :::important + Windows installations of Langflow Desktop require a C++ compiler that may not be present on your system. If you receive a `C++ Build Tools Required!` error, follow the on-screen prompt to install Microsoft C++ Build Tools, or [install Microsoft Visual Studio](https://visualstudio.microsoft.com/downloads/). + ::: - 5. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart). +5. When the installation completes, open the Langflow application, and then create your first flow with the [Quickstart](/get-started-quickstart). - + For upgrade information, see the [Release notes](/release-notes). @@ -78,57 +78,83 @@ For more information, see [Deploy Langflow on Docker](/deployment-docker). - Windows: Version 3.10 to 3.12 - [uv](https://docs.astral.sh/uv/getting-started/installation/) - Sufficient infrastructure: - - Minimum: Dual-core CPU and 2 GB RAM - - Recommended: Multi-core CPU and at least 4 GB RAM + - Minimum: Dual-core CPU and 2 GB RAM + - Recommended: Multi-core CPU and at least 4 GB RAM 2. Create a virtual environment with [uv](https://docs.astral.sh/uv/pip/environments). -
-Need help with virtual environments? - -Virtual environments ensure Langflow is installed in an isolated, fresh environment. -To create a new virtual environment, do the following. - - - - 1. Navigate to where you want your virtual environment to be created, and create it with `uv`. -Replace `VENV_NAME` with your preferred name for your virtual environment. -``` -uv venv VENV_NAME -``` -2. Start the virtual environment. -``` -source VENV_NAME/bin/activate -``` -Your shell's prompt changes to display that you're currently working in a virtual environment. -``` -(VENV_NAME) ➜ langflow git:(main) ✗ -``` -3. To deactivate the virtual environment and return to your regular shell, type `deactivate`. - When activated, the virtual environment temporarily modifies your PATH variable to prioritize packages installed within the virtual environment, so always deactivate it when you're done to avoid conflicts with other projects. -To delete the virtual environment, type `rm -rf VENV_NAME`. - - -1. Navigate to where you want your virtual environment to be created, and create it with `uv`. -Replace `VENV_NAME` with your preferred name for your virtual environment. -``` -uv venv VENV_NAME -``` -2. Start the virtual environment. -```shell -VENV_NAME\Scripts\activate -``` -Your shell's prompt changes to display that you're currently working in a virtual environment. -``` -(VENV_NAME) PS C:/users/username/langflow-dir> -``` -3. To deactivate the virtual environment and return to your regular shell, type `deactivate`. - When activated, the virtual environment temporarily modifies your PATH variable to prioritize packages installed within the virtual environment, so always deactivate it when you're done to avoid conflicts with other projects. -To delete the virtual environment, type `Remove-Item VENV_NAME`. - - - -
+
+ Need help with virtual environments? + + Virtual environments ensure Langflow is installed in an isolated, fresh environment. + To create a new virtual environment, do the following. + + + + + 1. Navigate to where you want your virtual environment to be created, and then create it with `uv`: + + ```shell + uv venv VENV_NAME + ``` + + Replace `VENV_NAME` with a name for your virtual environment. + + 2. Start the virtual environment: + + ```shell + source VENV_NAME/bin/activate + ``` + + Your shell's prompt changes to display that you're currently working in a virtual environment: + + ```text + (VENV_NAME) ➜ langflow git:(main) ✗ + ``` + + 3. To deactivate the virtual environment and return to your regular shell, type `deactivate`. + + When activated, the virtual environment temporarily modifies your `PATH` variable to prioritize packages installed within the virtual environment. + To avoid conflicts with other projects, it's a good idea to deactivate your virtual environment when you're done working in it. + + To delete the virtual environment, type `rm -rf VENV_NAME`. + This completely removes the virtual environment directory and its contents. + + + + + 1. Navigate to where you want your virtual environment to be created, and create it with `uv`. + + ```shell + uv venv VENV_NAME + ``` + + Replace `VENV_NAME` with a name for your virtual environment. + + 2. Start the virtual environment: + + ```shell + VENV_NAME\Scripts\activate + ``` + + Your shell's prompt changes to display that you're currently working in a virtual environment: + + ```text + (VENV_NAME) PS C:/users/username/langflow-dir> + ``` + + 3. To deactivate the virtual environment and return to your regular shell, type `deactivate`. + + When activated, the virtual environment temporarily modifies your `PATH` variable to prioritize packages installed within the virtual environment. 
+ To avoid conflicts with other projects, it's a good idea to deactivate your virtual environment when you're done working in it. + + To delete the virtual environment, type `Remove-Item VENV_NAME`. + This completely removes the virtual environment directory and its contents. + + + + +
3. In your virtual environment, install Langflow: diff --git a/docs/docs/Get-Started/get-started-quickstart.mdx b/docs/docs/Get-Started/get-started-quickstart.mdx index f866ae68a0f8..3ddb4d02cbf6 100644 --- a/docs/docs/Get-Started/get-started-quickstart.mdx +++ b/docs/docs/Get-Started/get-started-quickstart.mdx @@ -12,39 +12,40 @@ Get started with Langflow by loading a template flow, running it, and then servi ## Prerequisites - [Install and start Langflow](/get-started-installation) -- [Create an OpenAI API key](https://platform.openai.com/api-keys) -- [Create a Langflow API key](/configuration-api-keys) +- Create an [OpenAI API key](https://platform.openai.com/api-keys) +- Create a [Langflow API key](/configuration-api-keys) -
-Create a Langflow API key +
+ Create a Langflow API key -A Langflow API key is a user-specific token you can use with Langflow. + A Langflow API key is a user-specific token you can use with Langflow. -To create a Langflow API key, do the following: + To create a Langflow API key, do the following: -1. In Langflow, click your user icon, and then select **Settings**. -2. Click **Langflow API Keys**, and then click
+ # Send request + curl --request POST \ + --url 'http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID' \ + --header 'Content-Type: application/json' \ + --header 'x-api-key: LANGFLOW_API_KEY' \ + --data '{ + "output_type": "chat", + "input_type": "chat", + "input_value": "Hello" + }' + ``` + +
## Run the Simple Agent template flow @@ -99,8 +100,8 @@ Langflow provides code snippets to help you get started with the Langflow API. Replace these values if you're using the code for a different server or flow. The default Langflow server address is `http://localhost:7860`. - - + + ```python import requests @@ -134,8 +135,8 @@ Langflow provides code snippets to help you get started with the Langflow API. print(f"Error parsing response: {e}") ``` - - + + ```js const payload = { @@ -160,9 +161,8 @@ Langflow provides code snippets to help you get started with the Langflow API. .catch(err => console.error(err)); ``` - - - + + ```bash curl --request POST \ @@ -178,8 +178,7 @@ Langflow provides code snippets to help you get started with the Langflow API. # A 200 response confirms the call succeeded. ``` - - + 2. Copy the snippet, paste it in a script file, and then run the script to send the request. @@ -188,8 +187,8 @@ If you are using the curl snippet, you can run the command directly in your term If the request is successful, the response includes many details about the flow run, including the session ID, inputs, outputs, components, durations, and more. The following is an example of a response from running the **Simple Agent** template flow: -
-Response +
+Result ```json { @@ -367,8 +366,8 @@ The following example builds on the API pane's example code to create a question 1. Incorporate your **Simple Agent** flow's `/run` snippet into the following script. This script runs a question-and-answer chat in your terminal and stores the Agent's previous answer so you can compare them. - - + + ```python import requests @@ -432,8 +431,8 @@ This script runs a question-and-answer chat in your terminal and stores the Agen previous_answer = result ``` - - + + ```js const readline = require('readline'); @@ -512,7 +511,7 @@ This script runs a question-and-answer chat in your terminal and stores the Agen startChat(); ``` - + 2. To view the Agent's previous answer, type `compare`. To close the terminal chat, type `exit`. diff --git a/docs/docs/Integrations/Arize/integrations-arize.mdx b/docs/docs/Integrations/Arize/integrations-arize.mdx index 7db804b108a4..e641d4fa94b8 100644 --- a/docs/docs/Integrations/Arize/integrations-arize.mdx +++ b/docs/docs/Integrations/Arize/integrations-arize.mdx @@ -26,7 +26,7 @@ Instructions for integrating Langflow and Arize are also available in the Arize ## Connect Arize to Langflow - + 1. In your [Arize dashboard](https://app.arize.com/), copy your **Space ID** and [**API Key (Ingestion Service Account Key)**](https://arize.com/docs/ax/security-and-settings/api-keys). @@ -49,8 +49,8 @@ Instructions for integrating Langflow and Arize are also available in the Arize uv run langflow run --env-file .env ``` - - + + 1. In your [Arize Phoenix dashboard](https://app.phoenix.arize.com/), copy your **API Key**. 
@@ -70,7 +70,7 @@ Instructions for integrating Langflow and Arize are also available in the Arize uv run langflow run --env-file .env ``` - + ## Run a flow and view metrics in Arize diff --git a/docs/docs/Integrations/Docling/integrations-docling.mdx b/docs/docs/Integrations/Docling/integrations-docling.mdx index e36241ef7382..395ab151083f 100644 --- a/docs/docs/Integrations/Docling/integrations-docling.mdx +++ b/docs/docs/Integrations/Docling/integrations-docling.mdx @@ -49,12 +49,11 @@ The following sections describe the purpose and configuration options for each c ### Docling -This component uses Docling to process input documents running the Docling models locally. +The **Docling** component ingest documents, and then uses Docling to process them by running the Docling models locally. -
-Parameters +It outputs `files`, which is the processed files with `DoclingDocument` data. -**Inputs** +#### Docling parameters | Name | Type | Description | |------|------|-------------| @@ -62,22 +61,13 @@ This component uses Docling to process input documents running the Docling model | pipeline | String | Docling pipeline to use (standard, vlm). | | ocr_engine | String | OCR engine to use (easyocr, tesserocr, rapidocr, ocrmac). | -**Outputs** - -| Name | Type | Description | -|------|------|-------------| -| files | File | The processed files with DoclingDocument data. | - -
- ### Docling Serve -This component uses Docling to process input documents connecting to your instance of Docling Serve. +The **Docling Serve** component ingests documents, and then uses Docling to process them by connecting to your instance of Docling Serve. -
-Parameters +It outputs `files`, which is the processed files with `DoclingDocument` data. -**Inputs** +#### Docling Serve parameters | Name | Type | Description | |------|------|-------------| @@ -88,22 +78,13 @@ This component uses Docling to process input documents connecting to your instan | api_headers | Dict | Optional dictionary of additional headers required for connecting to Docling Serve. | | docling_serve_opts | Dict | Optional dictionary of additional options for Docling Serve. | -**Outputs** - -| Name | Type | Description | -|------|------|-------------| -| files | File | The processed files with DoclingDocument data. | - -
- ### Chunk DoclingDocument -This component uses the DoclingDocument chunkers to split a document into chunks. +The **Chunk DoclingDocument** component uses the `DoclingDocument` chunkers to split a document into chunks. -
-Parameters +It outputs the chunked documents as a [`DataFrame`](/data-types#dataframe). -**Inputs** +#### Chunk DoclingDocument parameters | Name | Type | Description | |------|------|-------------| @@ -115,22 +96,13 @@ This component uses the DoclingDocument chunkers to split a document into chunks | max_tokens | Integer | Maximum number of tokens for the HybridChunker. | | doc_key | String | The key to use for the DoclingDocument column. | -**Outputs** - -| Name | Type | Description | -|------|------|-------------| -| dataframe | DataFrame | The chunked documents as a DataFrame. | - -
- ### Export DoclingDocument -This component exports DoclingDocument to Markdown, HTML, and other formats. +The **Export DoclingDocument** component exports `DoclingDocument` to Markdown, HTML, and other formats. -
-Parameters +It can output the exported data as either [`Data`](/data-types#data) or [`DataFrame`](/data-types#dataframe). -**Inputs** +#### Export DoclingDocument parameters | Name | Type | Description | |------|------|-------------| @@ -139,13 +111,4 @@ This component exports DoclingDocument to Markdown, HTML, and other formats. | image_mode | String | Specify how images are exported in the output (placeholder, embedded). | | md_image_placeholder | String | Specify the image placeholder for markdown exports. | | md_page_break_placeholder | String | Add this placeholder between pages in the markdown output. | -| doc_key | String | The key to use for the DoclingDocument column. | - -**Outputs** - -| Name | Type | Description | -|------|------|-------------| -| data | Data | The exported data. | -| dataframe | DataFrame | The exported data as a DataFrame. | - -
\ No newline at end of file +| doc_key | String | The key to use for the DoclingDocument column. | \ No newline at end of file diff --git a/docs/docs/Integrations/Google/integrations-google-big-query.mdx b/docs/docs/Integrations/Google/integrations-google-big-query.mdx index 232ea0d3e3b0..d8df9c655dfc 100644 --- a/docs/docs/Integrations/Google/integrations-google-big-query.mdx +++ b/docs/docs/Integrations/Google/integrations-google-big-query.mdx @@ -48,12 +48,12 @@ The BigQuery component can now query your datasets and tables using your service With your component credentials configured, query your BigQuery datasets and tables to confirm connectivity. 1. Connect **Chat Input** and **Chat Output** components to the BigQuery component. -The flow looks like this: -![BigQuery component connected to chat input and output](/img/google/integrations-bigquery.png) + + ![BigQuery component connected to chat input and output](/img/google/integrations-bigquery.png) + 2. Open the **Playground**, and then submit a valid SQL query. -This example queries a table of Oscar winners stored within a BigQuery dataset called `the_oscar_award`. - - + + This example queries a table of Oscar winners stored within a BigQuery dataset called `the_oscar_award`: ```sql SELECT film, category, year_film @@ -62,8 +62,8 @@ This example queries a table of Oscar winners stored within a BigQuery dataset c LIMIT 10 ``` - - +
+ Result ```text film category year_film @@ -78,7 +78,7 @@ This example queries a table of Oscar winners stored within a BigQuery dataset c Wings OUTSTANDING PICTURE 1927 Sunrise UNIQUE AND ARTISTIC PICTURE 1927 ``` - - + +
A successful chat confirms the component can access the BigQuery table. \ No newline at end of file diff --git a/docs/docs/Tutorials/agent.mdx b/docs/docs/Tutorials/agent.mdx index 95e846742566..5714579dd1ee 100644 --- a/docs/docs/Tutorials/agent.mdx +++ b/docs/docs/Tutorials/agent.mdx @@ -75,75 +75,77 @@ With your flow operational, connect it to a JavaScript application to use the ag If you're using the `customer_orders.csv` example file, you can run this example as-is with the example email address in the code sample. If not, modify the `const email = "isabella.rodriguez@example.com"` to search for a value in your dataset. - ```js - import { LangflowClient } from "@datastax/langflow-client"; - - const LANGFLOW_SERVER_ADDRESS = 'LANGFLOW_SERVER_ADDRESS'; - const FLOW_ID = 'FLOW_ID'; - const LANGFLOW_API_KEY = 'LANGFLOW_API_KEY'; - const email = "isabella.rodriguez@example.com"; - - async function runAgentFlow(): Promise { - try { - // Initialize the Langflow client - const client = new LangflowClient({ - baseUrl: LANGFLOW_SERVER_ADDRESS, - apiKey: LANGFLOW_API_KEY - }); - - console.log(`Connecting to Langflow server at: ${LANGFLOW_SERVER_ADDRESS} `); - console.log(`Flow ID: ${FLOW_ID}`); - console.log(`Email: ${email}`); - - // Get the flow instance - const flow = client.flow(FLOW_ID); - - // Run the flow with the email as input - console.log('\nSending request to agent...'); - const response = await flow.run(email, { - session_id: email // Use email as session ID for context - }); - - console.log('\n=== Response from Langflow ==='); - console.log('Session ID:', response.sessionId); - - // Extract URLs from the chat message - const chatMessage = response.chatOutputText(); - console.log('\n=== URLs from Chat Message ==='); - const messageUrls = chatMessage.match(/https?:\/\/[^\s"')\]]+/g) || []; - const cleanMessageUrls = [...new Set(messageUrls)].map(url => url.trim()); - console.log('URLs from message:'); - cleanMessageUrls.slice(0, 3).forEach(url => 
console.log(url)); - - } catch (error) { - console.error('Error running flow:', error); - - // Provide error messages - if (error instanceof Error) { - if (error.message.includes('fetch')) { - console.error('\nMake sure your Langflow server is running and accessible at:', LANGFLOW_SERVER_ADDRESS); - } - if (error.message.includes('401') || error.message.includes('403')) { - console.error('\nCheck your API key configuration'); - } - if (error.message.includes('404')) { - console.error('\nCheck your Flow ID - make sure it exists and is correct'); - } + ```js + import { LangflowClient } from "@datastax/langflow-client"; + + const LANGFLOW_SERVER_ADDRESS = 'LANGFLOW_SERVER_ADDRESS'; + const FLOW_ID = 'FLOW_ID'; + const LANGFLOW_API_KEY = 'LANGFLOW_API_KEY'; + const email = "isabella.rodriguez@example.com"; + + async function runAgentFlow(): Promise { + try { + // Initialize the Langflow client + const client = new LangflowClient({ + baseUrl: LANGFLOW_SERVER_ADDRESS, + apiKey: LANGFLOW_API_KEY + }); + + console.log(`Connecting to Langflow server at: ${LANGFLOW_SERVER_ADDRESS} `); + console.log(`Flow ID: ${FLOW_ID}`); + console.log(`Email: ${email}`); + + // Get the flow instance + const flow = client.flow(FLOW_ID); + + // Run the flow with the email as input + console.log('\nSending request to agent...'); + const response = await flow.run(email, { + session_id: email // Use email as session ID for context + }); + + console.log('\n=== Response from Langflow ==='); + console.log('Session ID:', response.sessionId); + + // Extract URLs from the chat message + const chatMessage = response.chatOutputText(); + console.log('\n=== URLs from Chat Message ==='); + const messageUrls = chatMessage.match(/https?:\/\/[^\s"')\]]+/g) || []; + const cleanMessageUrls = [...new Set(messageUrls)].map(url => url.trim()); + console.log('URLs from message:'); + cleanMessageUrls.slice(0, 3).forEach(url => console.log(url)); + + } catch (error) { + console.error('Error running flow:', error); + 
+ // Provide error messages + if (error instanceof Error) { + if (error.message.includes('fetch')) { + console.error('\nMake sure your Langflow server is running and accessible at:', LANGFLOW_SERVER_ADDRESS); + } + if (error.message.includes('401') || error.message.includes('403')) { + console.error('\nCheck your API key configuration'); + } + if (error.message.includes('404')) { + console.error('\nCheck your Flow ID - make sure it exists and is correct'); } } } + } - // Run the function - console.log('Starting Langflow Agent...\n'); - runAgentFlow().catch(console.error); - ``` + // Run the function + console.log('Starting Langflow Agent...\n'); + runAgentFlow().catch(console.error); + ``` 3. Save and run the script to send the request and test the flow. + Your application receives three URLs for recommended used items based on a customer's previous orders in your local CSV, all without changing any code. -
- Response - The following is an example of a response returned from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. +
+ Result + + The following is an example response from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. ``` Starting Langflow Agent... @@ -166,7 +168,7 @@ If not, modify the `const email = "isabella.rodriguez@example.com"` to search fo
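The URL handling in the script above can be sketched in isolation. The regular expression, `Set`-based deduplication, and three-URL limit are taken from the script; the sample message text is invented for illustration:

```javascript
// Sketch: the URL extraction from the tutorial script, run on a sample
// message. The message text below is illustrative only.
const chatMessage =
  'Check https://example.com/item-1 and https://example.com/item-2 ' +
  '(also https://example.com/item-1 again).';

// Match URLs, deduplicate with a Set, then keep at most three,
// exactly as the script does.
const messageUrls = chatMessage.match(/https?:\/\/[^\s"')\]]+/g) || [];
const cleanMessageUrls = [...new Set(messageUrls)].map(url => url.trim());
cleanMessageUrls.slice(0, 3).forEach(url => console.log(url));
// Logs the two unique URLs from the sample message.
```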
4. To quickly check traffic to your flow, open the **Playground**. - New sessions appear named after the user's email address. + New sessions are named after the user's email address. Keeping sessions distinct helps the agent maintain context. For more on session IDs, see [Session ID](/session-id). ## Next steps diff --git a/docs/docs/Tutorials/chat-with-files.mdx b/docs/docs/Tutorials/chat-with-files.mdx index be699fe82bf7..25bfb42a7f46 100644 --- a/docs/docs/Tutorials/chat-with-files.mdx +++ b/docs/docs/Tutorials/chat-with-files.mdx @@ -146,10 +146,13 @@ For help with constructing file upload requests in Python, JavaScript, and curl, 3. Save and run the script to send the requests and test the flow. + The initial output contains the JSON response object from the file upload endpoint, including the internal path where Langflow stores the file. + Then, the LLM retrieves the file and evaluates its content, in this case the suitability of the resume for a job position. +
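As a small sketch of working with that upload response, the following parses the example JSON returned by the `/v2/files/` endpoint in this tutorial and recovers the stored path; your IDs and paths will differ:

```javascript
// Sketch: pulling the stored file path out of the upload response.
// The JSON below is the example response from this tutorial.
const uploadResponse = JSON.parse(
  '{"id":"793ba3d8-5e7a-4499-8b89-d9a7b6325fee","name":"fake-resume (1)",' +
  '"path":"02791d46-812f-4988-ab1c-7c430214f8d5/fake-resume.txt",' +
  '"size":1779,"provider":null}'
);

// The "path" field is the internal location Langflow uses to retrieve the file.
const storedPath = uploadResponse.path;
console.log(storedPath); // 02791d46-812f-4988-ab1c-7c430214f8d5/fake-resume.txt
```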
- Response + Result - The following is an example of a response returned from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. + The following is an example response from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. ``` {"id":"793ba3d8-5e7a-4499-8b89-d9a7b6325fee","name":"fake-resume (1)","path":"02791d46-812f-4988-ab1c-7c430214f8d5/fake-resume.txt","size":1779,"provider":null} @@ -179,10 +182,6 @@ For help with constructing file upload requests in Python, JavaScript, and curl,
- The initial output contains the JSON response object from the file upload endpoint, including the internal path where Langflow stores the file. - - The LLM then retrieves this file and evaluates its content, in this case the suitability of the resume for a job position. - ## Next steps To continue building on this tutorial, try these next steps. diff --git a/docs/docs/Tutorials/chat-with-rag.mdx b/docs/docs/Tutorials/chat-with-rag.mdx index adbf6dfb1dfc..d0687514bbd7 100644 --- a/docs/docs/Tutorials/chat-with-rag.mdx +++ b/docs/docs/Tutorials/chat-with-rag.mdx @@ -22,7 +22,7 @@ This tutorial demonstrates how you can use Langflow to create a chatbot applicat 1. In Langflow, click **New Flow**, and then select the **Vector Store RAG** template.
- About the Vector Store RAG template + About the Vector Store RAG template This template has two flows. @@ -60,62 +60,62 @@ The Langflow UI option is simpler, but it is only recommended for scenarios wher In situations where many users load data or you need to load data programmatically, use the Langflow API option. - - - 1. In your RAG chatbot flow, click the **File component**, and then click **File**. - 2. Select the local file you want to upload, and then click **Open**. - The file is loaded to your Langflow server. - 3. To load the data into your vector store, click the vector store component, and then click - - - To load data programmatically, use the `/v2/files/` and `/v1/run/$FLOW_ID` endpoints. The first endpoint loads a file to your Langflow server, and then returns an uploaded file path. The second endpoint runs the **Load Data Flow**, referencing the uploaded file path, to chunk, embed, and load the data into the vector store. - - The following script demonstrates this process. - For help with creating this script, use the [Langflow File Upload Utility](https://langflow-file-upload-examples.onrender.com/). - - ```js - // Node 18+ example using global fetch, FormData, and Blob - import fs from 'fs/promises'; - - // 1. Prepare the form data with the file to upload - const fileBuffer = await fs.readFile('FILE_NAME'); - const data = new FormData(); - data.append('file', new Blob([fileBuffer]), 'FILE_NAME'); - const headers = { 'x-api-key': 'LANGFLOW_API_KEY' }; - - // 2. Upload the file to Langflow - const uploadRes = await fetch('LANGFLOW_SERVER_ADDRESS/api/v2/files/', { - method: 'POST', - headers, - body: data - }); - const uploadData = await uploadRes.json(); - const uploadedPath = uploadData.path; - - // 3. 
Call the Langflow run endpoint with the uploaded file path - const payload = { - input_value: "Analyze this file", - output_type: "chat", - input_type: "text", - tweaks: { - 'FILE_COMPONENT_NAME': { - path: uploadedPath - } - } - }; - const runRes = await fetch('LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID', { - method: 'POST', - headers: { 'Content-Type': 'application/json', 'x-api-key': 'LANGFLOW_API_KEY' }, - body: JSON.stringify(payload) - }); - const langflowData = await runRes.json(); - // Output only the message - console.log(langflowData.outputs?.[0]?.outputs?.[0]?.results?.message?.data?.text); - ``` - - + + +1. In your RAG chatbot flow, click the **File component**, and then click **File**. +2. Select the local file you want to upload, and then click **Open**. + The file is loaded to your Langflow server. +3. To load the data into your vector store, click the vector store component, and then click + + +To load data programmatically, use the `/v2/files/` and `/v1/run/$FLOW_ID` endpoints. The first endpoint loads a file to your Langflow server, and then returns an uploaded file path. The second endpoint runs the **Load Data Flow**, referencing the uploaded file path, to chunk, embed, and load the data into the vector store. + +The following script demonstrates this process. +For help with creating this script, use the [Langflow File Upload Utility](https://langflow-file-upload-examples.onrender.com/). + +```js +// Node 18+ example using global fetch, FormData, and Blob +import fs from 'fs/promises'; + +// 1. Prepare the form data with the file to upload +const fileBuffer = await fs.readFile('FILE_NAME'); +const data = new FormData(); +data.append('file', new Blob([fileBuffer]), 'FILE_NAME'); +const headers = { 'x-api-key': 'LANGFLOW_API_KEY' }; + +// 2. 
Upload the file to Langflow +const uploadRes = await fetch('LANGFLOW_SERVER_ADDRESS/api/v2/files/', { + method: 'POST', + headers, + body: data +}); +const uploadData = await uploadRes.json(); +const uploadedPath = uploadData.path; + +// 3. Call the Langflow run endpoint with the uploaded file path +const payload = { + input_value: "Analyze this file", + output_type: "chat", + input_type: "text", + tweaks: { + 'FILE_COMPONENT_NAME': { + path: uploadedPath + } + } +}; +const runRes = await fetch('LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID', { + method: 'POST', + headers: { 'Content-Type': 'application/json', 'x-api-key': 'LANGFLOW_API_KEY' }, + body: JSON.stringify(payload) +}); +const langflowData = await runRes.json(); +// Output only the message +console.log(langflowData.outputs?.[0]?.outputs?.[0]?.results?.message?.data?.text); +``` + + When the flow runs, the flow ingests the selected file, chunks the data, loads the data into the vector store database, and then generates embeddings for the chunks, which are also stored in the vector store. @@ -199,9 +199,9 @@ This tutorial uses JavaScript for demonstration purposes. 3. Save and run the script to send the requests and test the flow.
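The nested lookup used in the load data script earlier to print only the message text can be sketched against a mock `/v1/run` response. The field names on the path come from that script; the mock's text value is illustrative:

```javascript
// Sketch: digging the chat text out of a /v1/run response with optional
// chaining. The mock object is illustrative and only includes the fields
// on that path.
const langflowData = {
  outputs: [
    {
      outputs: [
        { results: { message: { data: { text: "Yes - one document mentions engines." } } } }
      ]
    }
  ]
};

// Optional chaining returns undefined instead of throwing if any level is missing.
const text = langflowData.outputs?.[0]?.outputs?.[0]?.results?.message?.data?.text;
console.log(text); // Yes - one document mentions engines.
```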
- Response + Result - The following is an example of a response returned from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. + The following is an example response from this tutorial's flow. Due to the nature of LLMs and variations in your inputs, your response might be different. ``` 👤 You: Do you have any documents about engines? @@ -215,7 +215,6 @@ This tutorial uses JavaScript for demonstration purposes.
- ## Next steps For more information on building or extending this tutorial, see the following: