16 changes: 12 additions & 4 deletions docs/quickstart.md
@@ -184,15 +184,23 @@ when running the above command.
This section answers the question *"I already have a model downloaded locally by application X, can I use it with llamafile?"*. The general answer is "yes, as long as those models are stored locally in GGUF format", but the procedure can be more or less hacky depending on the application. A few examples (tested on a Mac) follow.

### LM Studio
[LM Studio](https://lmstudio.ai/) (recent versions) stores downloaded models in `~/.lmstudio/models/`, in subdirectories named after the publisher, e.g. `~/.lmstudio/models/lmstudio-community/`, `~/.lmstudio/models/mlx-community/`, `~/.lmstudio/models/nightmedia/`. Older versions stored them in `~/.cache/lm-studio/models`, in subdirectories with the same names as the models (following HuggingFace's `account_name/model_name` format). According to the documentation, LM Studio aims to preserve the directory structure of models downloaded from Hugging Face. The expected directory structure is as follows:

```
~/.lmstudio/models/
└── publisher/
└── model/
└── model-file.gguf
```

So if you have downloaded e.g. the `Phi-4-reasoning-plus-Q8_0.gguf` file for `lmstudio-community/Phi-4-reasoning-plus-GGUF`, you can run `llamafile` as follows:

```
llamafile -m ~/.lmstudio/models/lmstudio-community/Phi-4-reasoning-plus-GGUF/Phi-4-reasoning-plus-Q8_0.gguf
```

For more details check [LM Studio Documentation](https://lmstudio.ai/docs/app/advanced/import-model).
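
If you are unsure which GGUF files LM Studio has stored locally, a quick way to list them is to walk the models directory. This is a minimal sketch assuming the default `~/.lmstudio/models` location; the `LMSTUDIO_MODELS` override variable is purely illustrative, not an official LM Studio setting:

```sh
#!/bin/sh
# List every GGUF model file under the LM Studio models directory.
# LMSTUDIO_MODELS is an illustrative override, not an official variable.
MODELS_DIR="${LMSTUDIO_MODELS:-$HOME/.lmstudio/models}"
find "$MODELS_DIR" -type f -name '*.gguf'
```

Any path printed by the command can be passed directly to `llamafile -m`.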

### Ollama

When you download a new model with [ollama](https://ollama.com), all its metadata is stored in a manifest file under `~/.ollama/models/manifests/registry.ollama.ai/library/`. The directory and manifest file names follow the model name as returned by `ollama list`. For instance, for `llama3:latest` the manifest file will be `~/.ollama/models/manifests/registry.ollama.ai/library/llama3/latest`.
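
The manifest is a JSON document whose layers point at content-addressed blobs under `~/.ollama/models/blobs/`; the layer whose `mediaType` is `application/vnd.ollama.image.model` is the GGUF weights. The following sketch, which assumes `jq` is installed and is based on observed ollama storage layouts (verify against your local installation), resolves that blob and hands it to `llamafile`:

```sh
#!/bin/sh
# Extract the GGUF blob digest from an ollama manifest and run llamafile on it.
MANIFEST=~/.ollama/models/manifests/registry.ollama.ai/library/llama3/latest
DIGEST=$(jq -r '.layers[] | select(.mediaType == "application/vnd.ollama.image.model") | .digest' "$MANIFEST")
# Blob files are typically named sha256-<hex>; older versions used sha256:<hex>,
# hence the tr substitution below.
llamafile -m ~/.ollama/models/blobs/"$(printf '%s' "$DIGEST" | tr ':' '-')"
```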