Setting Up opencode with oh-my-opencode for Local AI-Powered Development
After certain AI coding assistants began throttling performance when paired with domestic large language models, a fully self-hosted and controllable alternative became necessary. Among the available options, opencode, an open-source AI coding agent, emerged as a promising solution, especially when extended with the oh-my-opencode plugin framework.
opencode offers multiple interfaces, including a terminal UI, IDE integrations, and a VS Code extension. For this setup, the terminal-based approach was chosen. Installation is a single command:
curl -fsSL https://opencode.ai/install | bash
The environment used here is Windows with WSL2 (Ubuntu distribution). After installation, the tool can be launched by simply typing opencode in the terminal.
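Under WSL2, a freshly opened shell can miss PATH updates made by the installer, so it helps to confirm the binary is actually reachable before going further. A minimal sketch in plain POSIX shell (the helper name check_on_path is ours; no opencode-specific flags are assumed):

```shell
# check_on_path NAME: print where NAME resolves, or complain on stderr
check_on_path() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found at $(command -v "$1")"
  else
    echo "$1 not on PATH; re-source ~/.bashrc or restart the shell" >&2
    return 1
  fi
}

# After the installer finishes, verify with:
#   check_on_path opencode
```

The same helper is reusable later for bun.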
Model integration is handled through built-in commands. Upon launching opencode, users can run /connect to select a model provider (e.g., DeepSeek, Kimi, GLM, or MiniMax) and input the corresponding API key. Model selection is then done via /models, eliminating the need for manual environment variable configuration.
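The interactive /connect flow stores credentials for you, but a default model can also be pinned in opencode's global config file at ~/.config/opencode/opencode.json. A sketch, assuming opencode's documented config format (the $schema URL follows the project's convention, and the model ID here is purely illustrative):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "deepseek/deepseek-chat"
}
```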
By default, opencode includes two agents: Plan and Build. To unlock advanced multi-agent collaboration, the oh-my-opencode extension is recommended. This plugin transforms opencode into a coordinated team of specialized AI agents capable of parallel execution, automated orchestration, and complex task handling.
Installation of oh-my-opencode requires bun, a fast JavaScript runtime. Install it with:
curl -fsSL https://bun.sh/install | bash
Then reload the shell configuration:
source ~/.bashrc
Finally, install the plugin:
bunx oh-my-opencode install
During installation, the system verifies the presence of opencode and prompts for model subscription details.
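Since both installers above are one-liners, a small guard that skips steps already completed makes the setup safe to rerun. A sketch (the helper name ensure_cmd is ours; the install commands are the ones shown earlier):

```shell
# ensure_cmd NAME INSTALL_CMD: run INSTALL_CMD only if NAME is not on PATH
ensure_cmd() {
  command -v "$1" >/dev/null 2>&1 || eval "$2"
}

# Usage for this setup:
#   ensure_cmd bun 'curl -fsSL https://bun.sh/install | bash && source ~/.bashrc'
#   ensure_cmd opencode 'curl -fsSL https://opencode.ai/install | bash'
```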
oh-my-opencode provides a set of core agents for planning, implementation, and orchestration, plus additional utility agents for specific workflows; the configuration example below names them (sisyphus, prometheus, hephaestus, atlas, oracle, librarian, explore, metis, momus, and multimodal-looker).
Each agent’s underlying model can be customized by editing the configuration file at ~/.config/opencode/oh-my-opencode.json. Below is an example configuration leveraging Kimi and DeepSeek models:
{
  "$schema": "https://raw.githubusercontent.com/code-yeongyu/oh-my-opencode/dev/assets/oh-my-opencode.schema.json",
  "agents": {
    "sisyphus": { "model": "kimi-for-coding/k2p5" },
    "prometheus": { "model": "kimi-for-coding/k2p5" },
    "hephaestus": { "model": "kimi-for-coding/k2p5" },
    "atlas": { "model": "kimi-for-coding/k2p5" },
    "oracle": { "model": "deepseek/deepseek-reasoner" },
    "librarian": { "model": "deepseek/deepseek-reasoner" },
    "explore": { "model": "deepseek/deepseek-reasoner" },
    "metis": { "model": "kimi-for-coding/k2p5" },
    "momus": { "model": "deepseek/deepseek-reasoner" },
    "multimodal-looker": { "model": "kimi-for-coding/k2p5" }
  },
  "categories": {
    "visual-engineering": { "model": "kimi-for-coding/k2p5" },
    "ultrabrain": { "model": "deepseek/deepseek-reasoner" },
    "quick": { "model": "deepseek/deepseek-reasoner" },
    "writing": { "model": "kimi-for-coding/k2p5" },
    "unspecified-low": { "model": "deepseek/deepseek-reasoner" },
    "unspecified-high": { "model": "deepseek/deepseek-reasoner" }
  }
}
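A malformed edit to this file can be silently ignored or cause a confusing startup, so it is worth checking that the JSON still parses before launching. A minimal sketch (python3 ships with stock Ubuntu on WSL2; this only confirms the file is valid JSON, not that it matches the schema):

```shell
# validate_json FILE: succeed if FILE parses as JSON, fail otherwise
validate_json() {
  python3 -m json.tool "$1" >/dev/null 2>&1
}

# Example:
#   validate_json ~/.config/opencode/oh-my-opencode.json && echo "config OK"
```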
With the environment now configured, further exploration—such as using custom skills within opencode—can begin.