Local model support and no telemetry collection
Run with ollama
You can run Aide locally without sending any data outside your computer. To do so:
- Install ollama by visiting this webpage: https://ollama.ai/install
- Run any of the local models (we recommend `wizardcoder:13b-python` or `codellama:latest`):

  ```
  ollama run wizardcoder:13b-python
  ```

  and in another window run:

  ```
  ollama serve
  ```

  To confirm the server is up, see the curl sketch after this list.
- In Aide, open your settings, search for codestory, and set the local model address to the address shown by `ollama serve` (see the settings sketch after this list).
- Disable all telemetry collection in your settings, or set the following in your USER settings JSON:

  ```json
  {
    "telemetry.telemetryLevel": "off"
  }
  ```
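
Once `ollama serve` is running, you can check that the model is reachable before pointing Aide at it. Here is a minimal sketch using ollama's REST API; http://localhost:11434 is ollama's default listen address, so substitute whatever address `ollama serve` actually printed:

```sh
# Request a single (non-streamed) completion from the locally served model;
# nothing here leaves your machine.
curl http://localhost:11434/api/generate -d '{
  "model": "wizardcoder:13b-python",
  "prompt": "Write a Python function that reverses a string.",
  "stream": false
}'
```

If this returns a JSON response containing a completion, the server is ready for Aide to use.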
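For reference, both settings from the steps above can sit together in your USER settings JSON. This is a sketch, not the authoritative key name: `codestory.localModelAddress` is a hypothetical placeholder for whichever codestory setting the settings search surfaces, while `telemetry.telemetryLevel` is taken directly from the step above:

```json
{
  // Hypothetical key: use the actual "codestory" local-model setting that
  // Aide's settings search shows, pointed at the address from `ollama serve`.
  "codestory.localModelAddress": "http://localhost:11434",

  // Turns off all telemetry collection.
  "telemetry.telemetryLevel": "off"
}
```

VS Code-style settings files accept `//` comments, so the annotations above can stay in the file.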