Closed
Changes from 1 commit
73 commits
0d47c43
gguf: add GGUFReader.read_field(field) method + read template example
Apr 27, 2024
0d1d46e
grammars: add troubleshooting section to readme
Apr 8, 2024
63d1324
server.py: hacky code
Mar 25, 2024
ffc7436
agents: scripts to run scripts as sandboxed fastapi servers
Mar 26, 2024
d5d9993
server.py: default tools work!
Mar 26, 2024
8afd4de
server.py: make tools work w/ mixtral-8x7b-instruct
Mar 27, 2024
aa9605c
server.py: kinda api-compliant output, disabled grammar
Mar 27, 2024
a406293
server.py: reenable grammar, accommodate mistral's escaped underscores
Mar 27, 2024
63a384d
server.py: raise n_predict
Mar 28, 2024
5f3de16
server.py: pass all request options, comments in ts sigs, render tool…
Mar 28, 2024
59b4114
server.py: refactor chat handlers
Mar 29, 2024
253b68d
server.py: crude reactor
Mar 29, 2024
e874565
agent: split code from openai example
Mar 29, 2024
b63f91a
Update agent.py
Mar 29, 2024
c340e8c
Update example_weather_tools.py
Mar 29, 2024
ce2fb01
agent: add --allow_parallel_calls
Mar 29, 2024
ea34bd3
agent/openai:nits
Mar 29, 2024
80c7930
openai: fix message merging for mixtral (parallel calls)
Mar 29, 2024
9ab493f
Update prompting.py
Mar 29, 2024
e0c8af4
agent: --style
Mar 29, 2024
b4e292e
Create requirements.txt
Mar 29, 2024
d1d8602
agent: disable parallel by default
Mar 29, 2024
eb9a552
agent: nits
Mar 29, 2024
3da30ed
agent: fix functionary tool_calls templating
Mar 29, 2024
ff6563a
Delete test.sh
Mar 29, 2024
dd11bb6
agent: format still broken
Mar 29, 2024
22b980f
agent: update readme
Mar 29, 2024
61f35e0
agent: prepare to test various templates
Mar 29, 2024
d8a53ea
openai: test features of templates at runtime, to make sure no bits o…
Mar 30, 2024
ad2f4c1
Update test_chat_handlers.py
Mar 30, 2024
3c3eff5
openai: quiet + update prompt output
Mar 30, 2024
6935503
openai: refactor chat handler vs. template
Mar 30, 2024
d9f30f8
Update test_chat_handlers.md
Mar 30, 2024
da2067a
openai: only special-format assistant in thoughtful mode
Mar 30, 2024
09de4eb
openai: actually use thoughtful examples in tests
Mar 30, 2024
19811a4
openai: tests didn't catch output format
Mar 30, 2024
22fe86d
openai tools: TS signatures work well too at a fraction of the eval cost
Mar 30, 2024
6e52a9c
Update test_chat_handlers.md
Apr 8, 2024
701a66d
agent: fix response_format
Apr 9, 2024
b447a74
agent: revert to json schemas (ts not ready for refs)
Apr 9, 2024
85820f4
agent: fix sandbox dockerfile
Apr 9, 2024
6880f1d
agent: support basic openapi tools (incl. from fastify sandbox)
Apr 9, 2024
0532680
agent: nits
Apr 9, 2024
a634e03
agent: cache_prompt=True
Apr 10, 2024
9fe269e
openai: nit
Apr 10, 2024
a61ebeb
agent: hint at math import in python tool
Apr 10, 2024
24e34f1
agent: nit
Apr 10, 2024
1475b1e
agent: fix killing of subprocesses
Apr 10, 2024
6c00378
agent: nits
Apr 10, 2024
082d54d
agent: rename fake weather tools
Apr 10, 2024
f9afb04
agent: python tool: test serializability of variables
Apr 10, 2024
a98f483
agent: python tool: return errors
Apr 10, 2024
ea0c31b
agent: ensure DATA_DIR exists
Apr 10, 2024
89dcc06
agent: mypy type fixes
Apr 10, 2024
0120f7c
agent: fix wait --std-tools
Apr 10, 2024
09c2565
grammars: early exit when no next_candidates to reject
Apr 21, 2024
00c709e
grammars: cache decoded tokens
Apr 21, 2024
8d503ef
grammars: faster llama_grammar_copy
Apr 21, 2024
b4a00ce
Merge branch 'gguf-read' into agent-example
Apr 27, 2024
7675ac6
Merge remote-tracking branch 'origin/master' into agent-example
Apr 30, 2024
312e20b
openai: update after merge
Apr 30, 2024
ca1a640
server: tool call grammar-constraints
May 2, 2024
2b2127c
agent: url params
May 2, 2024
e41b6ce
server: update tool calling, introduce system prompt for json schema
May 2, 2024
a1d64cf
openai: function call arguments must be returned stringified!
May 18, 2024
3f5a25f
Merge remote-tracking branch 'origin/master' into agent-example
May 18, 2024
5ea637e
openai: fix merge
May 21, 2024
6dadcd2
Merge remote-tracking branch 'origin/master' into agent-example
May 21, 2024
c8458fa
openai: make content optional for tool call grammar gen
May 22, 2024
a39e6e0
openai: pretty indent json response
May 22, 2024
793f4ff
agent: support OpenAI: --endpoint https://api.openai.com --auth "Bear…
May 22, 2024
a1c4aac
server: ultra basic tools, tool_choice, tool_calls support
May 22, 2024
298c098
Merge remote-tracking branch 'origin/master' into agent-example
Jun 9, 2024
server.py: hacky code
ochafik committed Apr 27, 2024
commit 63d13245e1668b01533765e00958c19b27df29fc
53 changes: 53 additions & 0 deletions examples/openai/README.md
@@ -0,0 +1,53 @@
# examples.openai: OpenAI API-compatible server

A simple Python server that sits on top of the C++ [examples/server](../server) and offers improved OAI compatibility.

## Usage

```bash
python -m examples.openai -m some-model.gguf
```

## Features

The new examples/openai/server.py:

- Uses the llama.cpp C++ server as a backend (spawns it or connects to an existing one)

- Uses actual jinja2 chat templates read from the models

- Supports grammar-constrained output for both JSON response format and tool calls

- Tool calling "works" w/ all models (even non-specialized ones like Mixtral 8x7B)

- Optimised support for Functionary & Nous Hermes, easy to extend to other tool-calling fine-tunes
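As an illustration of the tool-calling wire format the server accepts (this snippet is not part of the PR; the tool name and the URL in the final comment are assumptions, and the code only builds the JSON body, it does not contact a server):

```python
import json

# OpenAI-style chat completion payload with a single tool.
# The tool ("get_weather") and the endpoint URL below are hypothetical.
payload = {
    "model": "some-model.gguf",
    "messages": [
        {"role": "user", "content": "What is the weather in Istanbul?"},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

body = json.dumps(payload)
# POST `body` to the server, e.g. http://localhost:8080/v1/chat/completions
# (host, port and path are assumptions here).
```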

## TODO

- Embedding endpoint w/ distinct server subprocess

- Automatic/manual session caching

  - Spawns the main C++ CLI under the hood

  - Support precaching long prompts from CLI

  - Instant incremental inference in long threads

- Improve examples/agent:

  - Interactive agent CLI that auto-discovers tools from OpenAPI endpoints

  - Script that wraps any Python source as a container-sandboxed OpenAPI endpoint (allowing running ~unsafe code w/ tools)

  - Basic memory / RAG / python interpreter tools

- Follow-ups

  - Remove OAI support from server

  - Remove non-Python json schema to grammar converters

  - Reach out to frameworks to advertise the new option.
8 changes: 8 additions & 0 deletions examples/openai/__main__.py
@@ -0,0 +1,8 @@

from jsonargparse import CLI

from examples.openai.server import main

if __name__ == "__main__":
    CLI(main)

27 changes: 27 additions & 0 deletions examples/openai/api.py
@@ -0,0 +1,27 @@
from typing import Any, Optional
from pydantic import BaseModel, Json

class Message(BaseModel):
    role: str
    content: str

class ToolFunction(BaseModel):
    name: str
    description: str
    parameters: Any

class Tool(BaseModel):
    type: str
    function: ToolFunction

class ResponseFormat(BaseModel):
    type: str
    json_schema: Optional[Any] = None

class ChatCompletionRequest(BaseModel):
    model: str
    tools: Optional[list[Tool]] = None
    messages: list[Message]
    response_format: Optional[ResponseFormat] = None
    temperature: float = 1.0
    stream: bool = False
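As a quick illustration of how these pydantic models validate and coerce an incoming JSON body (a trimmed-down re-declaration so the sketch runs standalone; it is not the module itself):

```python
from pydantic import BaseModel

# Minimal stand-ins for the Message/ChatCompletionRequest models above,
# re-declared so this sketch is self-contained.
class Message(BaseModel):
    role: str
    content: str

class ChatCompletionRequest(BaseModel):
    model: str
    messages: list[Message]
    temperature: float = 1.0
    stream: bool = False

# Plain dicts (e.g. a parsed JSON request body) are coerced into Message
# instances, and omitted fields take their defaults.
req = ChatCompletionRequest(
    model="some-model.gguf",
    messages=[{"role": "user", "content": "Hello"}],
)
```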
59 changes: 59 additions & 0 deletions examples/openai/chat_format.py
@@ -0,0 +1,59 @@
from enum import StrEnum
import jinja2

from examples.openai.gguf_kvs import GGUFKeyValues, Keys

def raise_exception(msg: str):
    raise Exception(msg)

class ToolStyle(StrEnum):
    # Note: no trailing commas here, as they would turn the values into
    # tuples and break StrEnum.
    # https://cookbook.openai.com/examples/how_to_call_functions_with_chat_models
    DEFAULT = "Default"
    # https://github.com/MeetKai/functionary
    # TODO: look at https://github.com/ggerganov/llama.cpp/pull/5695
    # https://github.com/MeetKai/functionary/blob/main/functionary/prompt_template/prompt_template_v2.py
    FUNCTIONARY_V2 = "Functionary V2"
    # https://github.com/NousResearch/Hermes-Function-Calling
    NOUS_RESEARCH_HERMES = "Nous-Research-Hermes-Function-Calling"

class ChatFormat: #(BaseModel):
    def __init__(self, template: str, eos_token: str, bos_token: str):
        env = jinja2.Environment(loader=jinja2.BaseLoader(), trim_blocks=True, lstrip_blocks=True)
        self.template = env.from_string(template)
        self.eos_token = eos_token
        self.bos_token = bos_token

        self.strict_user_assistant_alternation = "{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception" in template

        if "<|recipient|>' + tool_call['function']['name']" in template:
            self.tool_style = ToolStyle.FUNCTIONARY_V2
        else:
            self.tool_style = ToolStyle.DEFAULT

    def __str__(self):
        return f"ChatFormat(template={self.template}, eos_token={self.eos_token}, bos_token={self.bos_token})"

    @staticmethod
    def from_gguf(metadata: GGUFKeyValues):
        return ChatFormat(
            template = metadata[Keys.Tokenizer.CHAT_TEMPLATE],
            bos_token = metadata[Keys.Tokenizer.BOS_ID],
            eos_token = metadata[Keys.Tokenizer.EOS_ID])
    # @staticmethod
    # def from_gguf(model: Path):
    #     reader = GGUFReader(model.as_posix())
    #     return ChatFormat(
    #         template = reader.fields[Keys.Tokenizer.CHAT_TEMPLATE].read(),
    #         bos_token = reader.fields[Keys.Tokenizer.BOS_ID].read(),
    #         eos_token = reader.fields[Keys.Tokenizer.EOS_ID].read())

    def render(self, messages: list[dict], add_generation_prompt: bool, omit_bos: bool = False):
        return self.template.render(
            messages=messages,
            eos_token=self.eos_token,
            bos_token='' if omit_bos else self.bos_token,
            raise_exception=raise_exception,
            add_generation_prompt=add_generation_prompt,
        )
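The `render()` method above delegates to a jinja2 chat template read from the model's metadata. A self-contained sketch of the same mechanism, using a hand-written ChatML-style template rather than one read from a GGUF file:

```python
import jinja2

# Hand-written ChatML-style template for illustration only; real templates
# come from the model's tokenizer.chat_template metadata.
CHATML = (
    "{% for m in messages %}"
    "<|im_start|>{{ m['role'] }}\n{{ m['content'] }}<|im_end|>\n"
    "{% endfor %}"
    "{% if add_generation_prompt %}<|im_start|>assistant\n{% endif %}"
)

# Same environment settings as ChatFormat.__init__ above.
env = jinja2.Environment(loader=jinja2.BaseLoader(), trim_blocks=True, lstrip_blocks=True)
prompt = env.from_string(CHATML).render(
    messages=[{"role": "user", "content": "Hi"}],
    add_generation_prompt=True,
)
# prompt now ends with an open assistant turn, ready for generation.
```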
20 changes: 20 additions & 0 deletions examples/openai/gguf_kvs.py
@@ -0,0 +1,20 @@
from pathlib import Path
import sys

sys.path.insert(0, str(Path(__file__).parent.parent.parent / "gguf-py"))

from gguf.gguf_reader import GGUFReader
from gguf.constants import Keys

class GGUFKeyValues:
    def __init__(self, model: Path):
        reader = GGUFReader(model.as_posix())
        self.fields = reader.fields

    def __getitem__(self, key: str):
        if '{arch}' in key:
            key = key.replace('{arch}', self[Keys.General.ARCHITECTURE])
        return self.fields[key].read()

    def __contains__(self, key: str):
        return key in self.fields

    def keys(self):
        return self.fields.keys()
28 changes: 28 additions & 0 deletions examples/openai/llama_cpp_server_api.py
@@ -0,0 +1,28 @@
from typing import Optional
from pydantic import BaseModel, Json

class LlamaCppServerCompletionRequest(BaseModel):
    prompt: str
    stream: Optional[bool] = None
    cache_prompt: Optional[bool] = None
    n_predict: Optional[int] = None
    top_k: Optional[int] = None
    top_p: Optional[float] = None
    min_p: Optional[float] = None
    tfs_z: Optional[float] = None
    typical_p: Optional[float] = None
    temperature: Optional[float] = None
    dynatemp_range: Optional[float] = None
    dynatemp_exponent: Optional[float] = None
    repeat_last_n: Optional[int] = None
    repeat_penalty: Optional[float] = None
    frequency_penalty: Optional[float] = None
    presence_penalty: Optional[float] = None
    mirostat: Optional[bool] = None
    mirostat_tau: Optional[float] = None
    mirostat_eta: Optional[float] = None
    penalize_nl: Optional[bool] = None
    n_keep: Optional[int] = None
    seed: Optional[int] = None
    grammar: Optional[str] = None
    json_schema: Optional[Json] = None
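Since every sampling field defaults to `None`, serializing a request with `exclude_none` keeps the JSON sent to the C++ server minimal. A sketch with a trimmed-down stand-in model so it runs standalone (the full field set is the one above):

```python
from typing import Optional
from pydantic import BaseModel

# Trimmed-down stand-in for LlamaCppServerCompletionRequest, for illustration.
class CompletionRequest(BaseModel):
    prompt: str
    n_predict: Optional[int] = None
    temperature: Optional[float] = None
    grammar: Optional[str] = None

req = CompletionRequest(prompt="Hello", n_predict=32)
# model_dump is pydantic v2; fall back to .dict() on v1.
dump = req.model_dump(exclude_none=True) if hasattr(req, "model_dump") else req.dict(exclude_none=True)
# Unset sampling options are omitted from the payload entirely.
```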
7 changes: 7 additions & 0 deletions examples/openai/requirements.txt
@@ -0,0 +1,7 @@
fastapi[all]
gguf
jinja2
jsonargparse
pydantic
sse-starlette
uvicorn[all]