376 changes: 376 additions & 0 deletions docs/docs/Flows/lfx.mdx
---
title: Run flows with Langflow Executor (LFX)
slug: /lfx-stateless-flows
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

The Langflow Executor (LFX) is a command-line tool that serves and runs flows statelessly from [flow JSON files](/concepts-flows-import) with minimal dependencies.

Flows run without the visual builder or a database, and any dependencies the flow needs are installed automatically to complete the run.
The flow graph is held in memory for the entire run, so there is no overhead from loading the graph from a database.
Running a flow with LFX is similar to running Langflow with the [`--backend-only` environment variable](/environment-variables#server) enabled, but even more lightweight, because the full Langflow package and its dependencies don't need to be installed.

Use LFX to share flows with other developers, test flows in different environments, and run flows in production applications without requiring the full Langflow UI or database setup.

LFX includes three commands for working with flows:

* [`lfx serve`](#serve): Starts a FastAPI server that hosts a Langflow API endpoint, with your flow available at `/flows/{flow_id}/run`.
* [`lfx run`](#run): Executes a flow locally and writes the results to `stdout`.
* [`lfx check`](#check): Checks flows for outdated components and optionally updates them, similar to the version check in the UI.

## Prerequisites

- Install [Python](https://www.python.org/downloads/release/python-3100/)
- Install [uv](https://docs.astral.sh/uv/getting-started/installation/)
- Create or download a [flow JSON file](/concepts-flows)
- Create an [OpenAI API key](https://platform.openai.com/api-keys)
- Create a [Langflow API key](/api-keys-and-authentication)

## Install LFX

LFX can be installed in multiple ways.

<Tabs>
<TabItem value="source" label="Clone repository" default>

1. Clone the Langflow repository:
```bash
git clone https://github.com/langflow-ai/langflow
```

2. Change directory to `langflow/src/lfx`:
```bash
cd langflow/src/lfx
```

3. Run LFX commands using `uv run`:
```bash
uv run lfx serve simple-agent-flow.json
```

</TabItem>
<TabItem value="pypi" label="Install from PyPI">

1. Create and activate a virtual environment.

```bash
uv venv lfx-venv
source lfx-venv/bin/activate
```

2. Install the LFX package from PyPI:

```bash
uv pip install lfx
```

3. Run LFX commands using `uv run`:

```bash
uv run lfx serve simple-agent-flow.json
```

</TabItem>
<TabItem value="uvx" label="Run without installing">

Run LFX without installing it using `uvx`:

```bash
uvx lfx serve simple-agent-flow.json
```

This command downloads and runs LFX in a temporary environment without permanent installation.

</TabItem>
</Tabs>

## Serve the simple agent starter flow with `lfx serve` {#serve}

To serve a flow as a REST API endpoint, set a `LANGFLOW_API_KEY` and run the flow JSON.
The API key is required for security because `lfx serve` can create a publicly accessible FastAPI server.
To create a Langflow API key, see [API keys and authentication](/api-keys-and-authentication).

This example uses the **Agent** component's built-in OpenAI model, which requires an OpenAI API key.
If you want to use a different provider, edit the model provider, model name, and credentials accordingly.

1. Set up your environment variables.

<Tabs>
<TabItem value="env-file" label=".env file" default>

Create a `.env` file and populate it with your flow's variables.
The `LANGFLOW_API_KEY` is required.
This example assumes the flow requires an OpenAI API key.

```bash
LANGFLOW_API_KEY="sk..."
OPENAI_API_KEY="sk-..."
```

</TabItem>
<TabItem value="export" label="Export variables">

Export your variables in the same terminal session where you'll start the server.
You must export the variables before starting the server so that the server process picks them up.

```bash
export LANGFLOW_API_KEY="sk..."
export OPENAI_API_KEY="sk-..."
```

</TabItem>
</Tabs>

2. Start the server with your variable values.

<Tabs>
<TabItem value="env-file" label=".env file" default>

This example assumes your flow file and `.env` file are in the current directory:

```bash
uv run lfx serve simple-agent-flow.json --env-file .env
```

If your `.env` file is in a different location, provide the full or relative path:

```bash
uv run lfx serve simple-agent-flow.json --env-file /path/to/.env
```

</TabItem>
<TabItem value="export" label="Export variables">

If you exported your variables, the server automatically picks up their values when it starts.

```bash
uv run lfx serve simple-agent-flow.json
```

To export new values, stop the server, export the variables, and start the server again.

</TabItem>
</Tabs>



3. The startup process displays a `flow_id` value in the output.
Copy the `flow_id` to use in the test API call in the next step.
In this example, the `flow_id` is `c1dab29d-3364-58ef-8fef-99311d32ee42`.

```bash
╭───────────────────────────── LFX Server ─────────────────────────────╮
│ 🎯 Single Flow Served Successfully! │
│ │
│ Source: /Users/mendonkissling/Downloads/simple-agent-flow.json │
│ Server: http://127.0.0.1:8000 │
│ API Key: sk-... │
│ │
│ Send POST requests to: │
│ http://127.0.0.1:8000/flows/c1dab29d-3364-58ef-8fef-99311d32ee42/run │
│ │
│ With headers: │
│ x-api-key: sk-... │
│ │
│ Or query parameter: │
│ ?x-api-key=sk-... │
│ │
│ Request body: │
│ {'input_value': 'Your input message'} │
╰──────────────────────────────────────────────────────────────────────╯
```

4. In a new terminal, export your `flow_id` and Langflow API key values as variables.
```bash
export LANGFLOW_API_KEY="sk..."
export FLOW_ID="c1dab29d-3364-58ef-8fef-99311d32ee42"
```

5. Test the server with an API call to the `/flows/{flow_id}/run` endpoint.

```bash
curl -X POST http://localhost:8000/flows/$FLOW_ID/run \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{"input_value": "Hello, world!"}'
```

Successful response:
```json
{
  "result": "Hello world! 👋\n\nHow can I help you today? If you have any questions or need assistance, just let me know!",
  "success": true,
  "logs": "\n\n\u001b[1m> Entering new None chain...\u001b[0m\n\u001b[32;1m\u001b[1;3mHello world! 👋\n\nHow can I help you today? If you have any questions or need assistance, just let me know!\u001b[0m\n\n\u001b[1m> Finished chain.\u001b[0m\n",
  "type": "message",
  "component": "Chat Output"
}
```

Your flow is now running as a lightweight API endpoint, with only the flow's required dependencies and no visual builder installed.
Users who call your endpoint don't need to install Langflow or configure their own LLM provider keys.

To make your server publicly accessible, use a [tunneling service like ngrok](/deployment-public-server), or deploy to a public cloud provider such as [DigitalOcean](/deployment-nginx-ssl).
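For programmatic access, the same request can be made from Python. The following is a minimal sketch using only the standard library; it assumes the server from the previous steps is running at `http://localhost:8000` and that `FLOW_ID` and `LANGFLOW_API_KEY` are exported in the environment:

```python
import json
import os
import urllib.request


def build_run_request(base_url, flow_id, api_key, input_value):
    """Build the POST request for the /flows/{flow_id}/run endpoint."""
    payload = json.dumps({"input_value": input_value}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/flows/{flow_id}/run",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "x-api-key": api_key,  # same header as the curl example
        },
        method="POST",
    )


def run_flow(input_value):
    """Call the served flow and return the `result` field of the response."""
    request = build_run_request(
        base_url="http://localhost:8000",
        flow_id=os.environ["FLOW_ID"],
        api_key=os.environ["LANGFLOW_API_KEY"],
        input_value=input_value,
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response).get("result")
```

Calling `run_flow("Hello, world!")` mirrors the curl call above and returns the `result` string from the JSON response.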

### LFX serve options

| Option | Description |
|-----------------------------------------|-----------------------------------------------------------------------------------------------|
| `--check-variables`/`--no-check-variables` | Validate the flow's global variables against the environment. Default: check. |
| `--env-file` | Path to the `.env` file. |
| `--host`, `-h` | Host to bind the server to. Default: `127.0.0.1` (localhost only). Use `0.0.0.0` to make the server accessible from other machines. |
| `--log-level` | Set the logging level. One of `debug`, `info`, `warning`, `error`, or `critical`. |
| `--port`, `-p` | Port to bind the server to. Default: `8000`. |
| `--verbose`, `-v` | Display diagnostic output. |

## Run the simple agent flow with `lfx run` {#run}

The `lfx run` command runs a flow from a JSON file without serving it, writing the output to `stdout`.
The flow definition can come from a path to a JSON file, inline JSON passed with `--flow-json`, or `stdin`.
No Langflow API key is required.

This example uses the **Agent** component's built-in OpenAI model, which requires an OpenAI API key.
If you want to use a different provider, edit the model provider, model name, and credentials accordingly.

1. Export your variables in the same terminal session where you'll run the flow.
```bash
export OPENAI_API_KEY="sk-..."
```

2. Run the flow from a flow JSON file.
```bash
uv run lfx run simple-agent-flow.json "Hello world"
```

This flow expects a [Message](/data-types#message) input, which is a simple text string. Because the simple agent flow includes Calculator and URL tools, it can answer questions such as `"What is 15 multiplied by 23?"` or `"Can you fetch information from https://example.com?"`.

If your flow expects multiple structured input fields, you can pass structured JSON with the `--input-value` flag. The field names must match what your flow expects:
```bash
uv run lfx run structured-input-flow.json \
--input-value '{"question": "What is the weather in Paris?", "context": "weather"}'
```
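If you're driving `lfx run` from another program, you can build the same invocation with Python's `subprocess` module. This is a sketch, not part of LFX itself; it assumes the `--input-value` and `--format json` flags shown in this section and that the JSON result is written to `stdout`:

```python
import json
import subprocess


def build_lfx_run_command(flow_path, input_payload):
    """Build the argv for `uv run lfx run` with a structured JSON input."""
    return [
        "uv", "run", "lfx", "run", flow_path,
        "--input-value", json.dumps(input_payload),
        "--format", "json",
    ]


def run_flow(flow_path, input_payload):
    """Run the flow and parse the JSON that `lfx run` prints to stdout."""
    completed = subprocess.run(
        build_lfx_run_command(flow_path, input_payload),
        capture_output=True,
        text=True,
        check=True,  # raise if lfx run exits non-zero
    )
    return json.loads(completed.stdout)
```

For example, `run_flow("structured-input-flow.json", {"question": "What is the weather in Paris?", "context": "weather"})` issues the same command as the shell example above.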

In addition to running flows from JSON files, `lfx run` supports other input methods, which are described in the sections below.

### Run flows from stdin

The `--stdin` option lets you run flows that come from dynamic sources, such as APIs or databases, or modify a flow before execution.
The command reads the flow's JSON definition from `stdin`, validates the JSON structure, and runs the flow.

This example reads a flow JSON from stdin.
Provide the input value to the flow with the `--input-value` flag.
```bash
cat simple-agent-flow.json | uv run lfx run --stdin \
--input-value "Hello world" \
--format json | jq '.result'
```

This example fetches a flow JSON from a remote API endpoint and runs it:
```bash
curl https://api.example.com/flows/my-agent-flow | uv run lfx run --stdin \
--input-value "Hello world"
```

Running a flow with `stdin` allows you to modify flows created in the visual builder before execution.
This example demonstrates changing the OpenAI model to `gpt-4o` before running the flow:
```bash
cat simple-agent-flow.json | jq '(.data.nodes[] | select(.data.node.template.model_name.value) | .data.node.template.model_name.value) = "gpt-4o"' | \
uv run lfx run --stdin \
--input-value "Hello world" \
--format json | jq '.result'
```
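The same pre-execution edit can be done in Python before piping the flow to `lfx run --stdin`. This sketch walks the node-template path used by the `jq` filter above on an in-memory flow dictionary:

```python
def set_model_name(flow, model_name):
    """Set `model_name` on every node whose template defines that field.

    Mirrors the jq path: .data.nodes[].data.node.template.model_name.value
    """
    for node in flow.get("data", {}).get("nodes", []):
        template = node.get("data", {}).get("node", {}).get("template", {})
        field = template.get("model_name")
        # Only touch nodes where the field exists and has a value,
        # matching the jq select() above.
        if field and field.get("value"):
            field["value"] = model_name
    return flow
```

After editing, serialize the dictionary with `json.dumps()` and write it to the `stdin` of the `lfx run --stdin` process.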

### Run flows with inline JSON

Instead of piping from `stdin` or reading from a JSON file, you can pass the flow JSON directly as a string argument:
```bash
uv run lfx run --flow-json '{"data": {"nodes": [...], "edges": [...]}}' \
--input-value "Hello world"
```

### LFX run options

| Option | Description |
|------------------------------------------------|--------------------------------------------------------------------------------------------------|
| `--check-variables`/`--no-check-variables` | Validates the flow's global variables. Default: check. |
| `--flow-json` | Loads inline JSON flow content as a string. |
| `--format`, `-f` | Output format. Accepts `json`, `text`, `message`, or `result`. Default: `json`. |
| `--input-value` | Input value to pass to the graph. |
| `--stdin` | Read JSON flow content from `stdin`. |
| `--timing` | Include detailed timing information in output. |
| `--verbose`, `-v` | Show basic progress information and diagnostic output. |
| `-vv` | Show detailed progress and debug information. |
| `-vvv` | Show full debugging output including component logs. |

### Use LFX run to create an application

In addition to running flows from JSON files, you can use `lfx run` with Python scripts that define flows programmatically.
This approach allows you to create flows directly in Python code without the visual builder.

For a complete example of creating an agent flow programmatically using LFX components, see the [Complete Agent Example on PyPI](https://pypi.org/project/lfx/0.1.13/#complete-agent-example).

## Check and update outdated flow components with `lfx check` {#check}

The `lfx check` command checks if a flow JSON file contains outdated components.
`lfx check` is similar to the [version check feature](/concepts-components#component-versions) available in the Langflow UI, but can be run from the command line.

To check a flow file for outdated components, run the `lfx check` command.
The command checks the flow for outdated components and displays information about any components that need to be updated.
If outdated components are found, the command reports them but does not modify the flow file.

```bash
uv run lfx check simple-agent-flow.json
```

Result:
```result
Built lfx @ file:///Users/mendonkissling/Documents/GitHub/langflow/src/lfx
Uninstalled 29 packages in 342ms
Installed 29 packages in 49ms

Checking flow: simple-agent-flow.json
Total nodes: 5
Outdated components: 0
✅ All components are up to date!
```
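The `Total nodes` count in this output reflects the flow file's `data.nodes` array. If you want that number without invoking `lfx`, a minimal sketch follows; it illustrates the flow JSON shape only and is not the `lfx check` implementation:

```python
import json
from pathlib import Path


def count_nodes(flow_path):
    """Count the entries in a flow JSON file's `data.nodes` array."""
    flow = json.loads(Path(flow_path).read_text())
    return len(flow.get("data", {}).get("nodes", []))
```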

To check and automatically apply safe updates to the flow file, include the `--update` flag with the `lfx check` command.

```bash
uv run lfx check simple-agent-flow.json --update
```

To check and apply all updates, including breaking changes, include the `--force` flag with the `lfx check` command.

```bash
uv run lfx check simple-agent-flow.json --update --force
```

To check multiple flow files at once, pass them as arguments to the `lfx check` command.

```bash
uv run lfx check flow1.json flow2.json flow3.json
```

To check a flow interactively, with prompts for each component update, include the `--interactive` flag with the `lfx check` command:

```bash
uv run lfx check simple-agent-flow.json --interactive
```

To check a flow and save the updates to a new flow file, include the `--output` flag with a file path to a `.json` file.

```bash
uv run lfx check simple-agent-flow.json --update --output updated-flow.json
```

### LFX check options

| Option | Description |
|------------------------------------------------|--------------------------------------------------------------------------------------------------|
| `--update` | Apply safe updates automatically without prompting. |
| `--force` | Apply all updates including breaking changes. Use with caution and test thoroughly. |
| `--interactive`, `-i` | Prompt for each component update individually. |
| `--output`, `-o` | Output file path for the updated flow (defaults to input file when updates are applied). |
| `--verbose`, `-v` | Show detailed information about component updates and changes. |
4 changes: 4 additions & 0 deletions docs/docs/Support/release-notes.mdx

### New features and enhancements

- Langflow Executor (LFX)

Langflow Executor (LFX) is a new command-line tool for serving and running flows statelessly from JSON files without requiring the full Langflow UI or database setup. Use `lfx serve` to create lightweight REST API endpoints from your flows, or `lfx run` to execute flows locally and get results immediately. LFX automatically installs flow dependencies and runs flows with minimal overhead. For more information, see [Run flows with Langflow Executor (LFX)](/lfx-stateless-flows).

- Webhook authentication

Added the `LANGFLOW_WEBHOOK_AUTH_ENABLE` environment variable for authenticating requests to the [`/webhook` endpoint](/api-flows-run#webhook-run-flow). When `LANGFLOW_WEBHOOK_AUTH_ENABLE=TRUE`, webhook endpoints require API key authentication and validate that the authenticated user owns the flow being executed. When `FALSE`, no Langflow API key is required and all requests to the webhook endpoint are treated as being sent by the flow owner.