Your AI Productivity Stack: Ollama Cloud + Goose + Obsidian
- What You’re Building
- Step 1: Install Obsidian
- Step 2: Install Ollama
- Step 3: Install Goose
- Step 4: Connect Obsidian to Goose (The Magic Part ✨)
- What You Can Do Now
- Troubleshooting
- Recommended Models by Use Case
- Quick Reference
- Links & Resources
What You’re Building
Three tools working together:
| Tool | What It Does | Cost |
|---|---|---|
| Ollama | Runs AI models — either on your computer or in the cloud | Free tier available |
| Goose | An AI assistant that can actually do things — read files, write notes, run commands | Free & open source |
| Obsidian | Your notes and knowledge base — where everything lives | Free for personal use |
The magic: Goose connects to your Obsidian vault through the Local REST API plugin, so your AI assistant can read your notes, search them, create new ones, edit existing ones, and organize your knowledge — all while you chat with it naturally.
Here’s what the flow looks like:
```
You ←→ Goose ←→ Ollama Cloud (AI brain)
         ↕
     Obsidian (your notes)
```
Step 1: Install Obsidian
Obsidian is where all your notes live. It’s a beautiful, free note-taking app that stores everything as plain Markdown files on your computer.
Download
- Mac: https://obsidian.md/download?os=mac
- Windows: https://obsidian.md/download?os=win
- Linux: https://obsidian.md/download?os=linux
Install & Set Up
- Download and install Obsidian for your operating system.
- Open Obsidian and click “Create new vault”.
- Name your vault (e.g., “My Notes”) and choose a folder on your computer. Remember this location — you’ll need it later.
- Create a test note: click the ✏️ icon or press `Ctrl+N` (Windows/Linux) or `Cmd+N` (Mac) and type something.
💡 Tip: Obsidian stores notes as plain `.md` files in a regular folder on your computer. No cloud service required — your data stays on your machine.
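Because a vault is just a folder of Markdown files, you can inspect it with ordinary tools. Here's a minimal Python sketch (the vault path is a placeholder for the folder you chose above):

```python
# A vault is just a folder: list its Markdown notes with the standard library.
from pathlib import Path

def list_notes(vault_path: str) -> list[str]:
    """Return the relative paths of all .md files in the vault, sorted."""
    vault = Path(vault_path)
    return sorted(str(p.relative_to(vault)) for p in vault.rglob("*.md"))

# "My Notes" is a placeholder -- substitute the vault folder you created.
# print(list_notes("My Notes"))
```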
Step 2: Install Ollama
Ollama is the engine that runs AI models. With Ollama Cloud, you can use powerful AI models without needing a fancy graphics card.
Download
- Mac: https://ollama.com/download (downloads the Mac app)
- Windows: Open PowerShell and run `irm https://ollama.com/install.ps1 | iex`, or download the installer from https://ollama.com/download
- Linux: Run `curl -fsSL https://ollama.com/install.sh | sh` in your terminal.
Set Up Your Ollama Account
- After installing, open a terminal and sign in with `ollama signin`.
- This will open your browser to create an account or sign in at ollama.com.
- That’s it — the free tier gives you access to cloud models at no cost.
💡 Free tier: Unlimited local model usage + light cloud model usage. The Pro tier ($20/mo) gives you heavier cloud usage and access to larger models.
Test It Works
Open a terminal and run:

```shell
ollama run qwen3:4b
```
This downloads and runs a small, fast model locally. Type a message and press Enter. You should get a response! Type /bye to exit.
For a cloud model (runs on Ollama’s servers — no GPU needed):

```shell
ollama pull gpt-oss:120b-cloud
ollama run gpt-oss:120b-cloud
```
💡 Cloud models have `-cloud` in the name. They run on Ollama’s servers, so they work even on older laptops without a dedicated GPU.
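Both local and cloud models are served through the same local Ollama REST API on port 11434, which is also the endpoint Goose talks to. As a sketch, here's how a client would build a call to Ollama's documented `/api/generate` endpoint (the model name is just an example; actually sending the request requires Ollama to be running):

```python
# Build a non-streaming request against the local Ollama server's
# /api/generate endpoint. Nothing is sent until you call urlopen.
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Return a ready-to-send request for the local Ollama REST API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("qwen3:4b", "Say hello in five words.")
# To actually send it (Ollama must be running):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```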
Step 3: Install Goose
Goose is an AI agent made by Block (the company behind Cash App). It can use tools, read/write files, and connect to services like Obsidian.
Download
Desktop App (recommended for beginners):
- Mac: Download from GitHub Releases (look for the `.dmg` file) or install via Homebrew:
  ```shell
  brew install --cask block-goose
  ```
- Windows: Download the installer from GitHub Releases
- Linux: Download the `.deb` file from GitHub Releases
CLI (command-line, for more advanced users):

```shell
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash
```
First-Time Setup
- Open Goose (the desktop app, or run `goose session` in your terminal).
- On first launch, Goose asks you to configure an AI provider. Choose Ollama.
- Confirm the API host is `http://localhost:11434` and click Submit.
- Enter a model name. For local use, try `qwen3:4b`. For cloud use, pick a cloud model like `gpt-oss:120b-cloud`.
💡 To switch to Ollama Cloud (so you don’t need a local GPU):
- Create an API key at ollama.com/settings/keys
- In Goose, go to Settings → Configure Provider → Ollama
- Change the API Host to `https://ollama.com`
- Set your API key as an environment variable: `export OLLAMA_API_KEY=your_key_here`
Test It Works
Type a message in Goose’s chat. You should get a response from the AI model. Try something like:
“What can you do?”
You should see a response explaining Goose’s capabilities.
Step 4: Connect Obsidian to Goose (The Magic Part ✨)
This is where it all comes together. We’ll install a plugin in Obsidian that exposes your vault as an API, then tell Goose to connect to it.
4a. Install the Obsidian Local REST API Plugin
- Open Obsidian.
- Go to Settings (gear icon in the bottom left).
- Click Community Plugins in the left sidebar.
- If you see a message about restricted mode, click “Turn off restricted mode” and confirm.
- Click Browse and search for “Local REST API”.
- Click Install, then Enable.
- After enabling, you’ll see “Local REST API” in your installed plugins list. Click the ⚙️ icon next to it.
4b. Configure the Local REST API Plugin
- In the Local REST API settings, find the API Key section.
- Copy the auto-generated API key (or click “Generate” to create a new one). Save this — you’ll need it in the next step.
- Make sure the “Non-encrypted (HTTP) Server” option is enabled. This allows Goose to connect on `http://127.0.0.1:27123`.
- The server should now be running at `http://127.0.0.1:27123`.
⚠️ Important: The HTTP (non-encrypted) server is only accessible from your own computer — it’s safe for local use. Don’t expose it to the internet.
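To picture what Goose does behind the scenes, here's a sketch of an authenticated request to the Local REST API. The `/vault/` listing path and the Bearer-token scheme follow the plugin's documentation, but treat them as assumptions and check your plugin version if requests fail:

```python
# Sketch: how an MCP-style client authenticates to Obsidian's Local REST API.
import urllib.request

def vault_request(api_key: str, path: str = "/vault/",
                  base_url: str = "http://127.0.0.1:27123") -> urllib.request.Request:
    """Build an authenticated request against the Local REST API."""
    req = urllib.request.Request(base_url + path)
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

req = vault_request("YOUR_API_KEY_FROM_STEP_4B")
print(req.full_url)                     # http://127.0.0.1:27123/vault/
# To actually send it (requires Obsidian running with the plugin enabled):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```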
4c. Add the Obsidian MCP Server to Goose
The MCP (Model Context Protocol) server is what lets Goose talk to Obsidian.
In Goose Desktop:
- Open Goose’s sidebar (☰ icon top-left).
- Click Settings.
- Go to the Integrations tab.
- Click “Add New Server” (under MCP Servers).
- Add the following configuration:
```json
{
  "mcpServers": {
    "obsidian vault": {
      "command": "npx",
      "args": ["-y", "obsidian-mcp-server@latest"],
      "env": {
        "OBSIDIAN_API_KEY": "YOUR_API_KEY_FROM_STEP_4B",
        "OBSIDIAN_BASE_URL": "http://127.0.0.1:27123",
        "OBSIDIAN_VERIFY_SSL": "false",
        "OBSIDIAN_ENABLE_CACHE": "true"
      }
    }
  }
}
```

Replace `YOUR_API_KEY_FROM_STEP_4B` with the API key you copied from Obsidian.
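If you'd rather not paste the key by hand, a small sketch that builds the same config with the key read from an environment variable (using `OBSIDIAN_API_KEY` as the variable name is just a convention for this snippet):

```python
# Build the Goose MCP server config with the API key injected from the
# environment instead of hard-coded in the JSON.
import json
import os

def build_mcp_config(api_key: str) -> dict:
    """Return the Goose MCP config for the Obsidian vault, as a dict."""
    return {
        "mcpServers": {
            "obsidian vault": {
                "command": "npx",
                "args": ["-y", "obsidian-mcp-server@latest"],
                "env": {
                    "OBSIDIAN_API_KEY": api_key,
                    "OBSIDIAN_BASE_URL": "http://127.0.0.1:27123",
                    "OBSIDIAN_VERIFY_SSL": "false",
                    "OBSIDIAN_ENABLE_CACHE": "true",
                },
            }
        }
    }

key = os.environ.get("OBSIDIAN_API_KEY", "YOUR_API_KEY_FROM_STEP_4B")
print(json.dumps(build_mcp_config(key), indent=2))
```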
In Goose CLI:
Run the configure command:

```shell
goose configure
```
Select “Configure Extensions”, then add the MCP server configuration above.
4d. Verify the Connection
- Make sure Obsidian is running with the Local REST API plugin enabled.
- Start a new Goose session.
- Try asking Goose something about your notes:
“List all the notes in my vault.”
“Search my vault for any notes about meetings.”
“Create a new note called ‘AI Setup Notes’ with today’s date.”
If Goose responds with your vault contents or creates a note — 🎉 it’s working!
What You Can Do Now
Here are some practical things to try:
📝 Note Management
- “Create a new note called ‘Project Ideas’ with the following bullet points…”
- “Append to my daily note: Today I set up my AI productivity stack”
- “Search my vault for anything about ‘budget’ and summarize what you find”
🔍 Research & Organization
- “Find all notes tagged #project and list them”
- “Search my vault for ‘meeting notes’ from this week”
- “Organize my inbox folder — move notes to appropriate folders based on their content”
✍️ Writing & Editing
- “Read my ‘Draft Blog Post’ note and suggest improvements”
- “Replace ‘utilize’ with ‘use’ throughout my ‘Technical Documentation’ note”
- “Add tags #ai #productivity to my ‘AI Setup Notes’ note”
🧠 Knowledge Work
- “Summarize all my notes about machine learning into a single overview”
- “Find connections between my notes on ‘design patterns’ and ‘software architecture’”
- “Create a weekly review note that pulls highlights from my daily notes”
Troubleshooting
“Goose can’t connect to Obsidian”
- Make sure Obsidian is running
- Check that the Local REST API plugin is enabled
- Verify the HTTP server is on (Settings → Community Plugins → Local REST API → Enable “Non-encrypted HTTP Server”)
- Check that your API key in the Goose config matches the one in Obsidian
“Ollama model not found”
- Run `ollama list` to see which models you have downloaded
- Use `ollama pull qwen3:4b` (or any model name) to download a model
- For cloud models, make sure you’ve signed in with `ollama signin`
“Goose isn’t using the Obsidian tools”
- Make sure the MCP server is configured in Goose’s settings
- Try restarting Goose after adding the MCP server
- Check that `npx` is available (it comes with Node.js — install from nodejs.org)
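You can also check for `npx` programmatically; this sketch just looks it up on your PATH:

```python
# Dependency check: the obsidian-mcp-server package is launched via npx,
# so Node.js must be installed and on your PATH.
import shutil

def check_npx() -> str:
    """Return the path to npx, or an install hint if it's missing."""
    path = shutil.which("npx")
    return path if path else "npx not found - install Node.js from nodejs.org"

print(check_npx())
```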
“My model is too slow”
- Try a smaller model: `qwen3:1.7b` or `gemma3:1b` run fast even on older hardware
- For cloud models, latency depends on your internet connection
- Local models benefit from having a GPU, but many run fine on CPU
Recommended Models by Use Case
| Use Case | Recommended Model | Local or Cloud |
|---|---|---|
| Quick tasks, fast responses | `qwen3:4b` | Local |
| Good all-rounder | `qwen3:8b` | Local (needs 8GB+ RAM) |
| Powerful reasoning | `gpt-oss:120b-cloud` | Cloud |
| Coding assistance | `qwen3-coder:8b` | Local |
| Long documents, research | `gpt-oss:120b-cloud` | Cloud |
💡 Rule of thumb: Models with `-cloud` in the name run on Ollama’s servers. Everything else runs locally on your machine. Cloud models are smarter but need internet; local models work offline but need RAM/GPU.
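That rule of thumb is simple enough to express in code:

```python
# Classify an Ollama model name as cloud or local purely from its name:
# models with "-cloud" in the name run on Ollama's servers.
def where_it_runs(model_name: str) -> str:
    return "cloud" if "-cloud" in model_name else "local"

print(where_it_runs("gpt-oss:120b-cloud"))  # cloud
print(where_it_runs("qwen3:4b"))            # local
```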
Quick Reference
| What | Command / Action |
|---|---|
| Install Ollama (Linux) | `curl -fsSL https://ollama.com/install.sh \| sh` |
| Install Ollama (Mac) | Download from ollama.com/download |
| Install Ollama (Windows) | `irm https://ollama.com/install.ps1 \| iex` in PowerShell |
| Sign in to Ollama Cloud | `ollama signin` |
| Download a model | `ollama pull qwen3:4b` |
| Run a model | `ollama run qwen3:4b` |
| List downloaded models | `ollama list` |
| Install Goose Desktop | github.com/block/goose/releases |
| Install Goose CLI | `curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh \| bash` |
| Configure Goose provider | `goose configure` (CLI) or Settings → Configure Provider (Desktop) |
| Obsidian download | obsidian.md/download |
| Local REST API plugin | Obsidian → Settings → Community Plugins → Browse → “Local REST API” |
| Ollama Cloud API keys | ollama.com/settings/keys |
| Ollama Cloud pricing | Free tier available, Pro $20/mo — ollama.com/cloud |
Links & Resources
- Ollama: ollama.com · docs.ollama.com · GitHub
- Goose: block.github.io/goose · GitHub · Discord
- Obsidian: obsidian.md · Help Docs
- Obsidian MCP Server: github.com/cyanheads/obsidian-mcp-server
- Obsidian Local REST API Plugin: github.com/coddingtonbear/obsidian-local-rest-api
Last updated: May 2026