building[4].cloud


Running LLMs locally with Ollama
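Before the walkthrough, here is a minimal sketch of talking to a locally running Ollama instance over its HTTP API. It assumes Ollama is already installed and serving on its default port (11434); the model name "llama3" is just an example, substitute whatever model you have pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # stream=False asks Ollama to return one JSON object
    # instead of streaming newline-delimited chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3") -> str:
    # POST to /api/generate, the Ollama completion endpoint.
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full text in "response".
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why run an LLM locally? Answer in one sentence."))
```

Nothing here depends on a third-party client library; the standard library is enough because Ollama speaks plain JSON over HTTP.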