Mistral Vibe with local Ollama

Mistral Vibe runs well locally against Ollama; for example, ministral-3 performs nicely on an Apple Silicon MacBook. Adapt ~/.vibe/config.toml as follows:

[[providers]]
name = "llamacpp"
api_base = "http://localhost:11434/v1"
api_key_env_var = ""
api_style = "openai"
backend = "generic"

[[models]]
name = "ministral-3"
provider = "llamacpp"
alias = "local"
temperature = 0.2
input_price = 0.0
output_price = 0.0

[[models]]
name = "devstral-small-2"
provider = "llamacpp"
alias = "local-devstral-small-2"
temperature = 0.2
input_price = 0.0
output_price = 0.0

[[models]]
name = "devstral-2:123b-cloud"
provider = "llamacpp"
alias = "ollama-devstral-2:123b-cloud"
temperature = 0.2
input_price = 0.0
output_price = 0.0