Merged
Changes from 1 commit
FEAT: Add Ollama local model support for tab autocomplete
Document how to use Ollama local models for tab autocomplete. Provides examples for Codegemma and StarCoder2 models.
cwilliams committed Oct 12, 2024
commit 297ed73578a005ac67301029aae18c6677df2933
20 changes: 17 additions & 3 deletions docs/tab-autocomplete.md
@@ -31,8 +31,22 @@ PearAI supports tab autocomplete, which predicts and suggests what you would type

3. **Enjoy the development speed up with autocomplete!**

## Alternative

- You can also use [Supermaven](https://supermaven.com/) for tab autocomplete. It is currently one of the best autocomplete AI on the market, and provides a free tier. You can get started by installing Supermaven directly as an extension within PearAI.
## Alternatives

- You can use [Supermaven](https://supermaven.com/) for tab autocomplete. It is currently one of the best autocomplete AIs on the market and provides a free tier. You can get started by installing Supermaven directly as an extension within PearAI.

![Supermaven extension](../static/img/supermaven.png)

- Also, you can use [Ollama](https://ollama.ai/) to run local models. Download the model you want, then add it to your config.json file. Examples:

```json
"tabAutocompleteModel": {
  "title": "Codegemma",
  "provider": "ollama",
  "model": "codegemma:2b"
}
```

To use StarCoder2 instead:

```json
"tabAutocompleteModel": {
  "title": "StarCoder2",
  "provider": "ollama",
  "model": "starcoder2:latest"
}
```
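A model must be downloaded locally before Ollama can serve it to the editor. A minimal sketch using the standard Ollama CLI, assuming Ollama is already installed and on your PATH:

```shell
# Download the models referenced in the config examples above
ollama pull codegemma:2b
ollama pull starcoder2:latest

# List locally available models to confirm the download succeeded
ollama list
```

Smaller tags such as `codegemma:2b` tend to give lower-latency completions, which matters more for tab autocomplete than raw model quality.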