🦙 Local model support

Local model support and no telemetry collection

Run with ollama

You can run Aide entirely locally, without sending any data outside your computer. To do so:

  1. Install ollama by visiting https://ollama.ai/install

  2. Run any of the local models (we recommend wizardcoder:13b-python or codellama:latest):

ollama run wizardcoder:13b-python

and, in another terminal window, run:

ollama serve
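To check that the server is reachable before pointing Aide at it, you can list the models it has pulled. This is only a sanity check and assumes Ollama's default address of http://localhost:11434; use whatever address ollama serve printed if it differs.

curl http://localhost:11434/api/tags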
  3. In Aide, open your settings, search for codestory, and set the local model address to the address shown by ollama serve

  4. Disable all telemetry collection in settings by following this link or by adding the following to your USER settings json:

{
    "telemetry.telemetryLevel": "off"
}
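
For reference, a USER settings json combining both changes might look like the sketch below. The telemetry key is the one shown above; codestory.ollamaEndpoint is only a placeholder for whatever setting name the codestory search surfaces in your version of Aide, and the address assumes Ollama's default.

{
    "telemetry.telemetryLevel": "off",
    // Placeholder key: use the actual setting shown when you search for codestory
    "codestory.ollamaEndpoint": "http://localhost:11434"
}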