# Ollama

## 📋 Prerequisites
- Windows, macOS, or Linux computer
- Caret installed in VS Code
## 🚀 Setup Steps

### 1. Install Ollama
- Visit ollama.com
- Download and install for your operating system
### 2. Choose and Download a Model
- Browse models at ollama.com/search
- Select a model and copy its command: `ollama run [model-name]`
- Open your terminal and run the command. For example:
  `ollama run llama2`
✨ Your model is now ready to use within Caret!
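Once the download finishes, you can confirm Ollama sees the model by querying its REST API on the default port. A minimal Python sketch — `/api/tags` is Ollama's model-listing endpoint, while `parse_model_names` and `list_local_models` are illustrative helper names, not part of Caret:

```python
import json
from urllib import request

OLLAMA_BASE_URL = "http://localhost:11434"  # Ollama's default base URL


def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body returned by /api/tags."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]


def list_local_models() -> list[str]:
    """Ask the running Ollama server which models are downloaded locally."""
    with request.urlopen(f"{OLLAMA_BASE_URL}/api/tags", timeout=5) as resp:
        return parse_model_names(resp.read().decode())
```

If `list_local_models()` includes the model you just pulled (e.g. `llama2:latest`), it is ready for Caret to use.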
### 3. Configure Caret
- Open VS Code
- Click the Caret settings icon
- Select "Ollama" as the API provider
- Enter the configuration:
  - Base URL: `http://localhost:11434/` (default value; can be left as is)
  - Select the model from your available options
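Behind that base URL, requests go to Ollama's HTTP API. A minimal sketch of the same kind of call, assuming the default base URL and a downloaded `llama2` model — `build_generate_payload` and `generate` are illustrative helper names, not Caret's internals:

```python
import json
from urllib import request

BASE_URL = "http://localhost:11434"  # the default base URL entered in Caret


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a one-off prompt to the local Ollama server and return its reply."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = request.Request(
        f"{BASE_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("llama2", "Hello!")` with Ollama running is a quick way to verify the server responds before pointing Caret at it.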
## ⚠️ Important Notes
- Start Ollama before using it with Caret
- Keep Ollama running in the background
- The first model download may take several minutes
## 🔧 Troubleshooting
If Caret can't connect to Ollama:
- Verify Ollama is running
- Check that the base URL is correct
- Ensure the model is downloaded
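The checklist above boils down to two observable failure modes: the server is unreachable, or the model isn't downloaded. A small sketch of that logic — `diagnose` is a hypothetical helper for illustration, not part of Caret or Ollama:

```python
def diagnose(server_reachable: bool, local_models: list[str], wanted_model: str) -> str:
    """Map the troubleshooting checklist onto its two failure modes."""
    if not server_reachable:
        # Covers both "Ollama isn't running" and "base URL is wrong".
        return "Start Ollama and verify the base URL (default http://localhost:11434/)."
    if wanted_model not in local_models:
        return f"Model missing - download it with: ollama run {wanted_model}"
    return "OK - Caret should be able to connect."
```

In practice: `diagnose(False, [], "llama2")` points you at the server/base URL, while `diagnose(True, [], "llama2")` tells you the model still needs downloading.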
Need more info? Read the Ollama Docs.