Ollama

📋 Prerequisites

  • Windows, macOS, or Linux computer
  • Caret installed in VS Code

🚀 Setup Steps

1. Install Ollama

  • Visit ollama.com
  • Download and install for your operating system
[Screenshot: Ollama download page]

2. Choose and Download a Model

  • Browse models at ollama.com/search

  • Select a model and copy its command:

    ollama run [model-name]
[Screenshot: Selecting a model in Ollama]
  • Open your terminal and run the command:

    • Example:

      ollama run llama2
[Screenshot: Running Ollama in the terminal]

✨ Your model is now ready to use within Caret!
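Once the download finishes, you can confirm the model is available locally with `ollama list`, or programmatically through Ollama's REST API. A minimal Python sketch, assuming Ollama's default address of `http://localhost:11434`:

```python
import json
import urllib.request
from urllib.error import URLError

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def list_local_models(base_url=OLLAMA_URL, timeout=3):
    """Return the names of locally downloaded models, or None if
    the Ollama server is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError):
        return None

models = list_local_models()
if models is None:
    print("Ollama is not running -- start it first")
else:
    print("Downloaded models:", models)
```

The `/api/tags` endpoint returns every model Ollama has stored locally, so any model you expect Caret to see should appear in this list.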

3. Configure Caret

  1. Open VS Code
  2. Click the Caret settings icon
  3. Select "Ollama" as your API provider
  4. Enter configuration:
    • Base URL: http://localhost:11434/ (default value, can be left as is)
    • Select the model from your available options
[Screenshot: Configuring Caret with Ollama]

⚠️ Important Notes

  • Start Ollama before using it with Caret
  • Keep Ollama running in the background
  • The first model download may take several minutes

🔧 Troubleshooting

If Caret can't connect to Ollama:

  1. Verify Ollama is running
  2. Check that the base URL is correct
  3. Ensure the model is downloaded
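The checklist above can be run as one small diagnostic. A sketch assuming the default base URL; set `MODEL` to the model you downloaded in step 2 (`llama2` here is just that example):

```python
import json
import urllib.request
from urllib.error import URLError

BASE_URL = "http://localhost:11434"  # check 2: the base URL configured in Caret
MODEL = "llama2"                     # the model you expect to have downloaded

def diagnose(base_url=BASE_URL, model=MODEL, timeout=3):
    """Return a short status string covering the three troubleshooting checks."""
    # Check 1: is the Ollama server running and reachable at this URL?
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
    except (URLError, OSError):
        return "Ollama is not reachable -- start it, then re-check the base URL"
    # Check 3: is the expected model downloaded? (names may carry a tag, e.g. llama2:latest)
    names = [m["name"] for m in data.get("models", [])]
    if any(n == model or n.startswith(model + ":") for n in names):
        return f"OK: server is up and '{model}' is available"
    return f"Server is up, but '{model}' is missing -- run: ollama run {model}"

print(diagnose())
```

A "not reachable" result points at checks 1 and 2 (server down or wrong URL); a "missing" result points at check 3 (model not downloaded).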

Need more info? Read the Ollama Docs.