# cursor-agent-api-proxy


中文文档 (Chinese documentation)

An OpenAI-compatible API proxy for the Cursor CLI. It lets any OpenAI client use your Cursor subscription.

## Prerequisites

- Node.js 20+
- An active Cursor subscription (Pro / Business)

## Install

1. Install the Cursor CLI and log in:

```shell
# macOS / Linux
curl https://cursor.com/install -fsS | bash

# Windows PowerShell
irm 'https://cursor.com/install?win32=true' | iex

agent login          # opens a browser; sign in with your Cursor account
agent --list-models  # verify it works
```

Headless? Skip `agent login`, generate a key at cursor.com/settings, and `export CURSOR_API_KEY=<key>`.

2. Install and start the proxy:

```shell
npm install -g cursor-agent-api-proxy
cursor-agent-api          # starts in the background on http://localhost:4646
cursor-agent-api status   # check whether it is running
```

3. Verify:

```shell
curl http://localhost:4646/health
```

Other commands:

```shell
cursor-agent-api stop           # stop
cursor-agent-api restart        # restart
cursor-agent-api start 8080     # start on a custom port
cursor-agent-api run            # run in the foreground (for debugging)
```

Logs: `~/.cursor-agent-api/server.log`

## Use with OpenClaw

### First-time setup (onboarding wizard)

If you haven't set up OpenClaw yet, run the onboarding wizard:

```shell
openclaw onboard
```

When the wizard asks you to configure Model/Auth:

1. Provider type → choose Custom Provider (OpenAI-compatible)
2. Base URL → `http://localhost:4646/v1`
3. API Key → type `not-needed` (if you ran `agent login`)
4. Default model → `auto` (or any model from `agent --list-models`)

### Existing setup (edit config)

Already have OpenClaw running? Edit the config file directly:

```json5
{
  env: {
    // "not-needed" = already logged in via `agent login`
    // or set your Cursor API key here to forward it per-request
    OPENAI_API_KEY: "not-needed",
    OPENAI_BASE_URL: "http://localhost:4646/v1",
  },
  agents: {
    defaults: {
      model: { primary: "openai/auto" },
    },
  },
}
```

## Models

Model IDs match `agent --list-models` output directly:

```text
auto                  # auto-select
gpt-5.2               # GPT-5.2
gpt-5.3-codex         # GPT-5.3 Codex
opus-4.6-thinking     # Claude Opus 4.6 (thinking)
sonnet-4.5-thinking   # Claude Sonnet 4.5 (thinking)
gemini-3-pro          # Gemini 3 Pro
```

Full list: `curl http://localhost:4646/v1/models` or `agent --list-models`.
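The `/v1/models` endpoint returns the standard OpenAI list shape. A minimal sketch of pulling the model IDs out of such a response; the payload below is illustrative, not captured from the proxy:

```python
import json

def model_ids(payload: str) -> list[str]:
    """Extract model IDs from an OpenAI-style /v1/models response body."""
    body = json.loads(payload)
    return [m["id"] for m in body["data"]]

# Illustrative response body (real IDs come from `agent --list-models`):
sample = '{"object": "list", "data": [{"id": "auto"}, {"id": "gpt-5.2"}]}'
print(model_ids(sample))  # → ['auto', 'gpt-5.2']
```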

## API

| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Health check |
| `/v1/models` | GET | List models |
| `/v1/chat/completions` | POST | Chat completion (supports `stream: true`) |
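With `stream: true`, responses arrive as Server-Sent Events carrying OpenAI-style chunks. A sketch of reassembling the reply text from such a stream; the sample lines below follow the standard OpenAI chunk format and are illustrative, not captured from the proxy:

```python
import json

def collect_stream_content(sse_lines):
    """Concatenate delta content from OpenAI-style SSE chunk lines."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

# Illustrative chunks in the OpenAI streaming shape:
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo!"}}]}',
    'data: [DONE]',
]
print(collect_stream_content(sample))  # → Hello!
```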

## Configuration

| Env variable | Default | Description |
|---|---|---|
| `PORT` | `4646` | Listen port (or `cursor-agent-api start 8080`) |
| `CURSOR_API_KEY` | – | Alternative to `agent login` |

## Auto-start (boot)

To start the proxy automatically on system boot:

```shell
cursor-agent-api install    # register as a system service
cursor-agent-api uninstall  # remove it
```

- macOS → LaunchAgent
- Windows → Task Scheduler
- Linux → systemd user service

## Other Clients

### Python (openai SDK)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4646/v1",
    api_key="not-needed",
)

resp = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```
### Continue.dev

```json
{
  "models": [{
    "title": "Cursor",
    "provider": "openai",
    "model": "auto",
    "apiBase": "http://localhost:4646/v1",
    "apiKey": "not-needed"
  }]
}
```
### curl

```shell
curl -X POST http://localhost:4646/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"auto","messages":[{"role":"user","content":"Hello!"}]}'
```

## How it Works

```text
Client  →  POST /v1/chat/completions (OpenAI format)
        →  cursor-agent-api-proxy
        →  spawn agent CLI (stream-json)
        →  Cursor subscription
        →  AI response → OpenAI format → Client
```
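The last arrow, converting the CLI's reply back to OpenAI format, can be sketched as wrapping plain reply text in the standard chat-completion shape. This is a hypothetical helper under assumed names; the proxy's actual internals may differ:

```python
import time
import uuid

def to_openai_response(text: str, model: str = "auto") -> dict:
    """Wrap plain reply text in the OpenAI chat-completion response shape.
    (Illustrative sketch of the proxy's final translation step.)"""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",  # synthetic response id
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    }

resp = to_openai_response("Hello!")
print(resp["choices"][0]["message"]["content"])  # → Hello!
```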

## Contributing

```shell
git clone https://github.com/tageecc/cursor-agent-api-proxy.git
cd cursor-agent-api-proxy
pnpm install && pnpm run build
pnpm start
```

## License

MIT
