Sandbox: WSL2 on Windows 11
Two pieces:
- Ollama Desktop on Windows, using your GPU.
- Codehamr inside WSL2, a real Linux sandbox.
They talk to each other over host.docker.internal. No Docker Desktop, no GPU passthrough needed.
For the plain non-sandboxed path see Run codehamr on Windows 11.
1. Install Ollama Desktop on Windows
Follow steps 1 and 2 of the Windows quickstart: install Ollama, raise the context length, and pull qwen3.6:27b. Then come back here.
2. Install WSL2
In PowerShell as administrator:
wsl --install
Reboot when Windows asks. Ubuntu launches and prompts for a Linux username and password; any values work.
Confirm:
wsl --status
You want Default Version: 2.
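If you want to check this in a script rather than by eye, a minimal sketch of parsing the status output for the default version (the sample output below is an assumption; the real text varies by Windows build and locale):

```python
import re

def default_wsl_version(status_output):
    """Extract the default WSL version number from `wsl --status` output."""
    m = re.search(r"Default Version:\s*(\d+)", status_output)
    return int(m.group(1)) if m else None

# Abridged sample of what `wsl --status` prints on an English system.
sample = "Default Distribution: Ubuntu\nDefault Version: 2\n"
print(default_wsl_version(sample))  # 2
```

Anything other than 2 means the distro is running under WSL1, which has no real Linux kernel and won't give you the sandbox this guide assumes.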
3. Install codehamr inside WSL2
Inside the Ubuntu terminal:
curl -fsSL https://codehamr.com/install.sh | bash
codehamr --version
4. Point codehamr at the host's Ollama
In your project:
.codehamr/config.yaml

active: local
models:
  local:
    llm: qwen3.6:27b
    url: http://host.docker.internal:11434
    key: ""
    context_size: 65536
host.docker.internal resolves to your Windows host from WSL2 automatically.
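Before launching codehamr, you can probe the same endpoint the config points at. A minimal sketch, assuming Ollama's OpenAI-compatible API under /v1 (the helper below is illustrative, not codehamr's own code):

```python
import json
import urllib.request

# Mirrors the url: field from .codehamr/config.yaml above.
base_url = "http://host.docker.internal:11434"

def models_endpoint(base):
    """Build the OpenAI-compatible model-listing URL that Ollama exposes."""
    return base.rstrip("/") + "/v1/models"

url = models_endpoint(base_url)
print(url)  # http://host.docker.internal:11434/v1/models

# Uncomment inside WSL2 to verify the host's Ollama is actually reachable:
# with urllib.request.urlopen(url, timeout=5) as resp:
#     print(json.dumps(json.load(resp), indent=2))
```

If the listed models include qwen3.6:27b, the config is pointing at the right place.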
5. Run
codehamr
GPU on Windows, agent in Linux, your Windows files untouched.
If something doesn't work
From the WSL terminal:
curl -s http://host.docker.internal:11434/v1/models
- JSON back → all good; recheck url: in your config.
- Connection refused → Ollama Desktop isn't running on Windows.
- Hostname doesn't resolve → older Windows 10 build. Update Windows.
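If you can't update Windows right away, the Windows host IP is usually also the nameserver WSL2 writes into /etc/resolv.conf under its default NAT networking (an assumption; mirrored-networking setups differ). A sketch of pulling it out:

```python
import re

def host_ip_from_resolv(text):
    """Return the first nameserver address found in resolv.conf content."""
    m = re.search(r"^nameserver\s+(\S+)", text, re.MULTILINE)
    return m.group(1) if m else None

# On a real WSL2 system: open("/etc/resolv.conf").read()
sample = "# generated by WSL\nnameserver 172.29.144.1\n"
print(host_ip_from_resolv(sample))  # 172.29.144.1
```

Substituting that IP for host.docker.internal in the config's url: is a workaround, not the documented path, and the address can change across reboots.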
VS Code
Install the WSL extension on Windows, then Ctrl+Shift+P → WSL: Open Folder in WSL. Run codehamr in the integrated terminal.