Your AI Productivity Stack: Ollama Cloud + Goose + Obsidian

A step-by-step guide for setting up a powerful, private AI workflow — no coding experience needed.

What You’re Building

Three tools working together:

| Tool | What It Does | Cost |
| --- | --- | --- |
| Ollama | Runs AI models — either on your computer or in the cloud | Free tier available |
| Goose | An AI assistant that can actually do things — read files, write notes, run commands | Free & open source |
| Obsidian | Your notes and knowledge base — where everything lives | Free for personal use |

The magic: Goose connects to your Obsidian vault through the Local REST API plugin, so your AI assistant can read your notes, search them, create new ones, edit existing ones, and organize your knowledge — all while you chat with it naturally.

Here’s what the flow looks like:

You ←→ Goose ←→ Ollama Cloud (AI brain)
         ↕
     Obsidian (your notes)

Step 1: Install Obsidian

Obsidian is where all your notes live. It’s a beautiful, free note-taking app that stores everything as plain Markdown files on your computer.

Download

Get the installer from obsidian.md/download for your operating system.

Install & Set Up

  1. Download and install Obsidian for your operating system.
  2. Open Obsidian and click “Create new vault”.
  3. Name your vault (e.g., “My Notes”) and choose a folder on your computer. Remember this location — you’ll need it later.
  4. Create a test note: click the ✏️ icon or press Ctrl+N (Windows/Linux) or Cmd+N (Mac) and type something.

💡 Tip: Obsidian stores notes as plain .md files in a regular folder on your computer. No cloud service required — your data stays on your machine.
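You can confirm this yourself from a terminal. A quick sketch (the path below is an example — substitute the vault folder you chose in step 3):

```shell
# List the Markdown notes in your vault (example path — use your own)
ls ~/Documents/"My Notes"/*.md
```

Each file you see is a note, readable by any text editor.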


Step 2: Install Ollama

Ollama is the engine that runs AI models. With Ollama Cloud, you can use powerful AI models without needing a fancy graphics card.

Download

  • Windows: Run in PowerShell:
irm https://ollama.com/install.ps1 | iex
Or download the installer from https://ollama.com/download
  • macOS: Download the installer from https://ollama.com/download
  • Linux: Run in your terminal:
curl -fsSL https://ollama.com/install.sh | sh

Set Up Your Ollama Account

  1. After installing, open a terminal and sign in:
ollama signin
  2. This will open your browser to create an account or sign in at ollama.com.
  3. That’s it — the free tier gives you access to cloud models at no cost.

💡 Free tier: Unlimited local model usage + light cloud model usage. The Pro tier ($20/mo) gives you heavier cloud usage and access to larger models.

Test It Works

Open a terminal and run:

ollama run qwen3:4b

This downloads and runs a small, fast model locally. Type a message and press Enter. You should get a response! Type /bye to exit.

For a cloud model (runs on Ollama’s servers — no GPU needed):

ollama pull gpt-oss:120b-cloud

ollama run gpt-oss:120b-cloud

💡 Cloud models have -cloud in the name. They run on Ollama’s servers, so they work even on older laptops without a dedicated GPU.
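Under the hood, both local and cloud models are served through the same local Ollama server on port 11434, so you can also talk to it directly over HTTP. A minimal sketch, assuming Ollama is running and qwen3:4b is already downloaded:

```shell
# Send a single non-streaming prompt to the local Ollama REST API
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3:4b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

This is the same API Goose will use in the next step, which makes it a handy sanity check.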


Step 3: Install Goose

Goose is an AI agent made by Block (the company behind Cash App). It can use tools, read/write files, and connect to services like Obsidian.

Download

Desktop App (recommended for beginners):

  • macOS: Install via Homebrew:
brew install --cask block-goose
  • Windows/Linux: Download the installer from github.com/block/goose/releases

CLI (command-line, for more advanced users):

curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash

First-Time Setup

  1. Open Goose (the desktop app or run goose session in your terminal).
  2. On first launch, Goose asks you to configure an AI provider. Choose Ollama.
  3. Confirm the API host is http://localhost:11434 and click Submit.
  4. Enter a model name. For local use, try qwen3:4b. For cloud use, use a cloud model like gpt-oss:120b-cloud.

💡 To switch to Ollama Cloud (so you don’t need a local GPU):

  1. Create an API key at ollama.com/settings/keys
  2. In Goose, go to Settings → Configure Provider → Ollama
  3. Change the API Host to https://ollama.com
  4. Set your API key as an environment variable: export OLLAMA_API_KEY=your_key_here
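Note that export only lasts for the current terminal session. To keep the key across sessions, append it to your shell's startup file (zsh shown; bash users would use ~/.bashrc, and your_key_here is a placeholder for your real key):

```shell
# Persist the Ollama API key across terminal sessions (zsh example)
echo 'export OLLAMA_API_KEY=your_key_here' >> ~/.zshrc
source ~/.zshrc
echo "$OLLAMA_API_KEY"   # prints the key if it loaded correctly
```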

Test It Works

Type a message in Goose’s chat. You should get a response from the AI model. Try something like:

“What can you do?”

You should see a response explaining Goose’s capabilities.


Step 4: Connect Obsidian to Goose (The Magic Part ✨)

This is where it all comes together. We’ll install a plugin in Obsidian that exposes your vault as an API, then tell Goose to connect to it.

4a. Install the Obsidian Local REST API Plugin

  1. Open Obsidian.
  2. Go to Settings (gear icon in the bottom left).
  3. Click Community Plugins in the left sidebar.
  4. If you see a message about restricted mode, click “Turn off restricted mode” and confirm.
  5. Click Browse and search for “Local REST API”.
  6. Click Install, then Enable.
  7. After enabling, you’ll see “Local REST API” in your installed plugins list. Click the ⚙️ icon next to it.

4b. Configure the Local REST API Plugin

  1. In the Local REST API settings, find the API Key section.
  2. Copy the auto-generated API key (or click “Generate” to create a new one). Save this — you’ll need it in the next step.
  3. Enable the “Non-encrypted (HTTP) Server” option. This is what allows Goose to connect.
  4. The server should now be running at http://127.0.0.1:27123.

⚠️ Important: The HTTP (non-encrypted) server is only accessible from your own computer — it’s safe for local use. Don’t expose it to the internet.
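Before involving Goose, you can sanity-check the plugin from a terminal. A hedged sketch, assuming the HTTP server is on port 27123 and $OBSIDIAN_API_KEY holds the key from step 4b (the /vault/ listing endpoint is this plugin's convention — check the plugin's built-in API docs if the path differs):

```shell
# Ask the Local REST API plugin to list the files at the vault root
curl -H "Authorization: Bearer $OBSIDIAN_API_KEY" http://127.0.0.1:27123/vault/
```

If you get back a JSON list of your notes, the plugin is working.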

4c. Add the Obsidian MCP Server to Goose

The MCP (Model Context Protocol) server is what lets Goose talk to Obsidian.

In Goose Desktop:

  1. Open Goose’s sidebar (☰ icon top-left).
  2. Click Settings.
  3. Go to the Integrations tab.
  4. Click “Add New Server” (under MCP Servers).
  5. Add the following configuration:
{
  "mcpServers": {
    "obsidian vault": {
      "command": "npx",
      "args": ["-y", "obsidian-mcp-server@latest"],
      "env": {
        "OBSIDIAN_API_KEY": "YOUR_API_KEY_FROM_STEP_4B",
        "OBSIDIAN_BASE_URL": "http://127.0.0.1:27123",
        "OBSIDIAN_VERIFY_SSL": "false",
        "OBSIDIAN_ENABLE_CACHE": "true"
      }
    }
  }
}

Replace YOUR_API_KEY_FROM_STEP_4B with the API key you copied from Obsidian.

In Goose CLI:

Run the configure command:

goose configure

Select “Configure Extensions”, then add the MCP server configuration above.

4d. Verify the Connection

  1. Make sure Obsidian is running with the Local REST API plugin enabled.
  2. Start a new Goose session.
  3. Try asking Goose something about your notes:

“List all the notes in my vault.”

“Search my vault for any notes about meetings.”

“Create a new note called ‘AI Setup Notes’ with today’s date.”

If Goose responds with your vault contents or creates a note — 🎉 it’s working!


What You Can Do Now

Here are some practical things to try:

📝 Note Management

  • “Create a new note called ‘Project Ideas’ with the following bullet points…”
  • “Append to my daily note: Today I set up my AI productivity stack”
  • “Search my vault for anything about ‘budget’ and summarize what you find”

🔍 Research & Organization

  • “Find all notes tagged #project and list them”
  • “Search my vault for ‘meeting notes’ from this week”
  • “Organize my inbox folder — move notes to appropriate folders based on their content”

✍️ Writing & Editing

  • “Read my ‘Draft Blog Post’ note and suggest improvements”
  • “Replace ‘utilize’ with ‘use’ throughout my ‘Technical Documentation’ note”
  • “Add tags #ai #productivity to my ‘AI Setup Notes’ note”

🧠 Knowledge Work

  • “Summarize all my notes about machine learning into a single overview”
  • “Find connections between my notes on ‘design patterns’ and ‘software architecture’”
  • “Create a weekly review note that pulls highlights from my daily notes”

Troubleshooting

“Goose can’t connect to Obsidian”

  • Make sure Obsidian is running
  • Check that the Local REST API plugin is enabled
  • Verify the HTTP server is on (Settings → Community Plugins → Local REST API → Enable “Non-encrypted HTTP Server”)
  • Check that your API key in the Goose config matches the one in Obsidian

“Ollama model not found”

  • Run ollama list to see which models you have downloaded
  • Use ollama pull qwen3:4b (or any model name) to download a model
  • For cloud models, make sure you’ve signed in with ollama signin

“Goose isn’t using the Obsidian tools”

  • Make sure the MCP server is configured in Goose’s settings
  • Try restarting Goose after adding the MCP server
  • Check that npx is available (it comes with Node.js — install from nodejs.org)
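A quick way to check that last point from a terminal:

```shell
# Verify that Node.js and npx are installed and on your PATH
if command -v npx >/dev/null 2>&1; then
  echo "npx found: $(npx --version)"
else
  echo "npx missing: install Node.js from nodejs.org"
fi
```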

“My model is too slow”

  • Try a smaller model: qwen3:1.7b or gemma3:1b run fast even on older hardware
  • For cloud models, latency depends on your internet connection
  • Local models benefit from having a GPU, but many run fine on CPU

Recommended Models by Use Case

| Use Case | Recommended Model | Local or Cloud |
| --- | --- | --- |
| Quick tasks, fast responses | qwen3:4b | Local |
| Good all-rounder | qwen3:8b | Local (needs 8GB+ RAM) |
| Powerful reasoning | gpt-oss:120b-cloud | Cloud |
| Coding assistance | qwen3-coder:8b | Local |
| Long documents, research | gpt-oss:120b-cloud | Cloud |

💡 Rule of thumb: Models with -cloud in the name run on Ollama’s servers. Everything else runs locally on your machine. Cloud models are smarter but need internet; local models work offline but need RAM/GPU.
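Since this is purely a naming convention, a tiny shell helper can classify any model name (is_cloud is an illustrative name, not an Ollama command):

```shell
# Classify a model name by its "-cloud" suffix
is_cloud() { case "$1" in *-cloud) echo cloud ;; *) echo local ;; esac; }
is_cloud gpt-oss:120b-cloud   # prints "cloud"
is_cloud qwen3:4b             # prints "local"
```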


Quick Reference

| What | Command / Action |
| --- | --- |
| Install Ollama (Linux) | curl -fsSL https://ollama.com/install.sh \| sh |
| Install Ollama (Mac) | Download from ollama.com/download |
| Install Ollama (Windows) | irm https://ollama.com/install.ps1 \| iex in PowerShell |
| Sign in to Ollama Cloud | ollama signin |
| Download a model | ollama pull qwen3:4b |
| Run a model | ollama run qwen3:4b |
| List downloaded models | ollama list |
| Install Goose Desktop | github.com/block/goose/releases |
| Install Goose CLI | curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh \| bash |
| Configure Goose provider | goose configure (CLI) or Settings → Configure Provider (Desktop) |
| Obsidian download | obsidian.md/download |
| Local REST API plugin | Obsidian → Settings → Community Plugins → Browse → “Local REST API” |
| Ollama Cloud API keys | ollama.com/settings/keys |
| Ollama Cloud pricing | Free tier available, Pro $20/mo — ollama.com/cloud |


Links & Resources

  • Obsidian: obsidian.md/download
  • Ollama: ollama.com/download
  • Ollama Cloud pricing: ollama.com/cloud
  • Ollama Cloud API keys: ollama.com/settings/keys
  • Goose releases: github.com/block/goose/releases

Last updated: May 2026

