# CodeCompanion + Ollama Integration Summary

## What Was Done

Your CodeCompanion configuration has been updated to support Ollama models alongside your existing Claude adapters. Here's what changed:

### 1. **Added Ollama Adapter** (`lua/shelbybark/plugins/codecompanion.lua`)

- Reads the `OLLAMA_ENDPOINT` environment variable
- Defaults to `http://localhost:11434` if the variable is not set
- Uses `mistral` as the default model (configurable); see the sketch after this list

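
For reference, a minimal sketch of what this adapter block can look like, assuming CodeCompanion's built-in `ollama` adapter and its `adapters.extend` helper; field names can vary between plugin versions, so treat this as an illustration rather than the committed code:

```lua
-- Sketch only: assumes CodeCompanion's built-in "ollama" adapter.
require("codecompanion").setup({
  adapters = {
    ollama = function()
      return require("codecompanion.adapters").extend("ollama", {
        env = {
          -- Use the Tailscale endpoint when set, otherwise localhost
          url = os.getenv("OLLAMA_ENDPOINT") or "http://localhost:11434",
        },
        schema = {
          model = { default = "mistral" }, -- change to your preferred model
        },
      })
    end,
  },
})
```
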
### 2. **Added Ollama Keymaps**

- `<leader>cll` - Toggle the Ollama chat (normal and visual modes); a keymap sketch follows this list
- Works alongside the existing Claude keymaps

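
The mapping itself lives in the plugin file; a hypothetical standalone equivalent is below, assuming `:CodeCompanionChat` accepts an adapter name in your plugin version (check `:h codecompanion` to confirm):

```lua
-- Hypothetical standalone version of the <leader>cll mapping;
-- the real one is defined in lua/shelbybark/plugins/codecompanion.lua.
vim.keymap.set({ "n", "v" }, "<leader>cll", function()
  vim.cmd("CodeCompanionChat ollama")
end, { desc = "Toggle Ollama chat" })
```
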
### 3. **Created Documentation**

- `docs/OLLAMA_SETUP.md` - Comprehensive setup guide
- `docs/OLLAMA_QUICK_SETUP.md` - Quick reference for other machines
- `docs/ollama_env_example.sh` - Shell configuration example

## How It Works

### On Your Main Machine (with Ollama)

```bash
# Ollama runs locally; CodeCompanion uses http://localhost:11434 by default
nvim
# Press <leader>cll to chat with Ollama
```

### On Other Machines (without Ollama)

```bash
# Set the environment variable to your Ollama server's Tailscale IP
export OLLAMA_ENDPOINT="http://100.123.45.67:11434"
nvim
# Press <leader>cll to chat with Ollama via Tailscale
```

## Key Features

- ✅ **Network Access**: Reach Ollama from any machine on your Tailscale network
- ✅ **Fallback Support**: Keep Claude as the primary adapter and use Ollama as an alternative
- ✅ **Easy Switching**: Use the keymaps to switch between models instantly
- ✅ **Environment-Based**: Configuration adapts to each machine automatically
- ✅ **No Code Changes**: Just set an environment variable on other machines

## Next Steps

### 1. **On Your Ollama Server Machine**

Ensure Ollama is exposed to the network:

```bash
# Check that Ollama is running and how it was started
ps aux | grep ollama

# If needed, set it to listen on all interfaces
sudo systemctl edit ollama
# Add: Environment="OLLAMA_HOST=0.0.0.0:11434"
# Save and exit, then:
sudo systemctl restart ollama

# Find your Tailscale IP
tailscale ip -4
```

### 2. **On Other Machines**

Add to your shell config (`~/.zshrc`, `~/.bashrc`, etc.):

```bash
export OLLAMA_ENDPOINT="http://YOUR_TAILSCALE_IP:11434"
```

### 3. **Test It**

```bash
# Verify the connection
curl http://YOUR_TAILSCALE_IP:11434/api/tags

# Start Neovim and press <leader>cll
nvim
```

## Configuration Details

### Ollama Adapter Settings

- **Location**: `lua/shelbybark/plugins/codecompanion.lua` (lines 35-45)
- **Default Model**: `mistral` (change to your preference)
- **Endpoint**: Read from the `OLLAMA_ENDPOINT` environment variable
- **Fallback**: `http://localhost:11434`

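
To confirm which endpoint a given machine will resolve, you can evaluate the same fallback expression from inside Neovim; this is a quick diagnostic, not part of the committed config:

```lua
-- Run with :lua after opening Neovim; mirrors the adapter's fallback logic.
print(os.getenv("OLLAMA_ENDPOINT") or "http://localhost:11434")
```
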
### Available Models to Try

- `mistral` - Fast, good quality (recommended)
- `neural-chat` - Optimized for conversation
- `dolphin-mixtral` - Larger, higher quality
- `llama2` - General purpose
- `orca-mini` - Very fast, lightweight

Pull models with: `ollama pull <model-name>`

## Troubleshooting

### Connection Issues

```bash
# Test that Ollama is running
curl http://localhost:11434/api/tags

# Test Tailscale connectivity
ping 100.x.x.x  # Use your Tailscale IP

# Check that Ollama is bound to the network interface
sudo netstat -tlnp | grep 11434
```

### Model Issues

```bash
# List available models
curl http://localhost:11434/api/tags | jq '.models[].name'

# Pull a model
ollama pull mistral
```

## Files Modified/Created

- ✏️ `lua/shelbybark/plugins/codecompanion.lua` - Added the Ollama adapter and keymaps
- ✨ `docs/OLLAMA_SETUP.md` - Comprehensive setup guide
- ✨ `docs/OLLAMA_QUICK_SETUP.md` - Quick reference
- ✨ `docs/ollama_env_example.sh` - Shell config example
- 📄 `docs/INTEGRATION_SUMMARY.md` - This file

## Support

For issues or questions:

1. Check the troubleshooting section in `docs/OLLAMA_SETUP.md`
2. Verify Ollama is running: `curl http://localhost:11434/api/tags`
3. Verify Tailscale connectivity: `tailscale status`
4. Check recent Neovim messages for errors: `:messages`

## References

- [Ollama GitHub](https://github.com/ollama/ollama)
- [Tailscale Documentation](https://tailscale.com/kb/)
- [CodeCompanion.nvim](https://github.com/olimorris/codecompanion.nvim)