Quick Answer: Block Goose is an open-source, local-first AI agent that acts as an intelligent traffic controller for your AI requests. It offers a single CLI that analyzes each prompt and routes it to the most suitable service, such as ChatGPT for explanations, Copilot for code, or your own custom APIs.
The AI Tool Fragmentation Problem
I was spending far too much time switching between ChatGPT for explanations, GitHub Copilot for code generation, and a handful of other AI tools. Each one has its own interface, login process, and quirks.
- Most developers face this same problem.
- We’re spending more time switching between tools than actually using them.
- The productivity loss adds up quickly.
Here’s what happens when you’re debugging a complex issue. You need both code analysis and explanation, so you copy the error to ChatGPT, then switch to Copilot for the fix, then back to ChatGPT to understand why it worked.
Block Goose solves this by being your intelligent traffic controller.
What Makes Block Goose Different?
- Local First — Your code stays on your machine unless you explicitly send it to an AI service.
- Smart Routing — Automatically routes prompts to the right AI tool based on context.
- Extensible — Connect to any MCP server or API to expand capabilities.
- Open Source — Built with complete transparency.
The local-first approach is what sold me on it. I’m tired of sending my proprietary code to every AI service under the sun. With Block Goose, I control exactly what gets sent where, and I can audit every interaction.
How Block Goose Works
Block Goose uses a plugin architecture built on the Model Context Protocol (MCP). The CLI accepts natural language prompts and routes them based on:
- Prompt analysis — Keywords, file types, and context.
- Provider capabilities — What each AI service does best.
- MCP servers — Extensions for custom APIs and services.
The system processes your prompt, selects the most suitable AI service, adds context, and returns the results. It’s like having a brilliant assistant who actually understands your workflow, rather than just guessing.
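Goose's real routing logic is considerably more sophisticated, but the idea can be sketched as a simple keyword-based dispatcher. The provider names and keyword rules below are made up for illustration; this is not Goose's actual implementation:

```python
# Illustrative sketch of prompt routing -- NOT Goose's actual implementation.
# Provider names and keyword rules here are invented for demonstration.

def route_prompt(prompt: str) -> str:
    """Pick a provider based on simple keyword analysis of the prompt."""
    lowered = prompt.lower()
    if any(kw in lowered for kw in ("fix", "refactor", "implement")):
        return "code-provider"   # e.g. a code-focused model
    if any(kw in lowered for kw in ("explain", "why", "what is")):
        return "chat-provider"   # e.g. a conversational model
    return "default-provider"

print(route_prompt("explain why this test fails"))  # prints: chat-provider
```

A real router would also weigh file types, conversation context, and each provider's declared capabilities, but the dispatch shape is the same.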
Extensions and How They Work
Block Goose’s real power comes from its extensible architecture. The system uses the Model Context Protocol (MCP) to connect to external services, APIs, and custom tools through extensions.
What Are Extensions?
Extensions are MCP servers that add new capabilities to Block Goose.
Think of them as plugins that can:
- Connect to external APIs — Integrate with your company’s internal services.
- Access databases — Query your data warehouses or knowledge bases.
- Run custom scripts — Execute specialized business logic.
- Interface with tools — Connect to your existing development workflow.
How Extensions Work
Extensions follow a simple pattern:
- MCP Server — Your extension runs as a server that implements the MCP protocol.
- Registration — Block Goose discovers and registers available extensions.
- Routing — When a prompt matches an extension’s capabilities, Goose routes the request.
- Execution — The extension processes the request and returns results.
- Integration — Results flow back through Goose’s unified interface.
Building Your Own Extension
Goose lets you extend its functionality by creating custom extensions, which are built as MCP servers. Because Goose adheres to the Model Context Protocol (MCP), any MCP server can plug in as an extension. MCP is an open protocol that standardizes how applications provide context to LLMs, giving you a consistent way to connect LLMs to data sources and tools.
In this guide, I’ll walk you through building the simplest possible MCP server—one that reads markdown files from a notes directory. This provides a foundation for building more complex extensions in the future.
Note: Goose currently supports Tools and Resources for MCP Server features. Support for MCP Prompts will be added soon.
Step 1: Create Your Project
Let’s create a simple MCP server that reads markdown files. Run these commands:
```shell
mkdir mcp_notes
cd mcp_notes
uv init --lib
```
Step 2: Install Dependencies
First, ensure you have Python 3.11+ (mcp-cli requires it):

```shell
uv python install 3.11
```
Then add the MCP dependency to your `pyproject.toml`:

```shell
uv add mcp
```

That's it! The `uv add` command automatically handles the dependency management for you.
Step 3: Create Your MCP Server
First, create the directory structure:

```shell
mkdir -p src/mcp_notes
```

Then create the server file. You can copy and paste this content into `src/mcp_notes/server.py`:
```python
from pathlib import Path

from mcp.server.fastmcp import FastMCP
from mcp.types import ErrorData, INTERNAL_ERROR, INVALID_PARAMS
from mcp.shared.exceptions import McpError

mcp = FastMCP("notes")


@mcp.tool()
def list_notes(directory: str = "~/notes") -> str:
    """
    List all markdown files in the specified directory.

    Args:
        directory: Path to notes directory (default: ~/notes)

    Returns:
        List of markdown files found
    """
    try:
        # Expand user path and resolve
        notes_dir = Path(directory).expanduser().resolve()

        if not notes_dir.exists():
            return f"Directory {notes_dir} does not exist."

        # Find all markdown files
        md_files = list(notes_dir.glob("*.md"))

        if not md_files:
            return f"No markdown files found in {notes_dir}"

        result = f"Found {len(md_files)} markdown files in {notes_dir}:\n\n"
        for file in sorted(md_files):
            result += f"* {file.name}\n"

        return result
    except Exception as e:
        raise McpError(ErrorData(code=INTERNAL_ERROR, message=f"Error listing notes: {e}"))


@mcp.tool()
def read_note(filename: str, directory: str = "~/notes") -> str:
    """
    Read the contents of a specific markdown file.

    Args:
        filename: Name of the markdown file to read
        directory: Path to notes directory (default: ~/notes)

    Returns:
        Contents of the markdown file
    """
    try:
        # Expand user path and resolve
        notes_dir = Path(directory).expanduser().resolve()
        file_path = notes_dir / filename

        if not file_path.exists():
            raise McpError(ErrorData(code=INVALID_PARAMS, message=f"File {filename} not found in {notes_dir}"))

        if file_path.suffix != ".md":
            raise McpError(ErrorData(code=INVALID_PARAMS, message=f"File {filename} is not a markdown file"))

        # Read and return file contents
        with open(file_path, 'r', encoding='utf-8') as f:
            content = f.read()

        return f"# {filename}\n\n{content}"
    except McpError:
        raise
    except Exception as e:
        raise McpError(ErrorData(code=INTERNAL_ERROR, message=f"Error reading note: {e}"))
```
Next, create `src/mcp_notes/__init__.py` with the following content:

```python
from .server import mcp


def main():
    """Simple MCP server for reading markdown notes."""
    mcp.run()


if __name__ == "__main__":
    main()
```
Finally, create `src/mcp_notes/__main__.py` with the following content:

```python
from mcp_notes import main

main()
```
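For the `mcp_notes` command used in the next steps to exist, your `pyproject.toml` needs a console-script entry point. `uv init --lib` generates most of this file for you; the `[project.scripts]` section is the part you may need to add yourself (the exact name and version pins below are examples, and will vary):

```toml
[project]
name = "mcp-notes"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "mcp",
]

[project.scripts]
mcp_notes = "mcp_notes:main"
```

With this in place, installing the package puts an `mcp_notes` executable in the environment's `bin` directory.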
Step 4: Test Your Server
Using MCP Inspector
Install your package locally:

```shell
uv pip install .
```

Install the MCP CLI tool:

```shell
uv pip install mcp-cli
```

Test with MCP Inspector:

```shell
mcp dev src/mcp_notes/server.py
```
Go to the URL shown in your terminal (usually http://localhost:6274) to open the MCP Inspector UI. In the UI, click "Connect" to initialize your MCP server, then open the "Tools" tab and click "List Tools"; you should see the `list_notes` and `read_note` tools.

Test the tools:

- Use `list_notes` with the directory set to `./notes` to see what files are in your notes directory.
- Use `read_note` with filename set to `test-note.md` and directory set to `./notes` to read a specific note.
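If you don't have a notes directory yet, you can create one with a sample file before testing (the directory and filename here are just examples matching the steps above):

```shell
# Create a local notes directory with a sample markdown file
mkdir -p notes
cat > notes/test-note.md <<'EOF'
# Test Note

This is a sample note for testing the MCP server.
EOF

ls notes/
```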
Testing the Tools Directly
You can also test the tools by importing them directly:
```python
# test_tools.py
import sys

sys.path.insert(0, 'src')

from mcp_notes.server import list_notes, read_note

# Test listing notes
result = list_notes("./notes")
print(result)

# Test reading a note
content = read_note("test-note.md", "./notes")
print(content)
```
Note: The `mcp_notes` command is an MCP server that waits for input, not a regular CLI tool. It's designed to be used by MCP clients, such as Goose, rather than run directly in the terminal.
Step 5: Add to Goose
- Open Goose and go to Extensions in the sidebar.
- Set Type to `STDIO`.
- Give it a name like "Notes Reader".
- Set the command to: `uv run /full/path/to/mcp_notes/.venv/bin/mcp_notes`.
That’s it! You now have a working MCP server that can read your markdown notes.
This simple example shows the basic pattern:
- Create tools with the `@mcp.tool()` decorator.
- Handle errors gracefully with proper MCP error types.
- Return simple, valuable data.
You can extend this by adding more tools like:
- Search notes by content.
- Create new notes.
- Organize notes by tags.
- Convert notes to different formats.
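As a sketch of the first idea, a content-search tool could follow the same shape as the two tools above. The `search_notes` function below is a hypothetical addition, not part of the tutorial's server; in `server.py` you would register it with the `@mcp.tool()` decorator just like `list_notes` and `read_note`:

```python
from pathlib import Path

# Hypothetical search tool, shown without the @mcp.tool() decorator so it
# can be read (and tested) standalone. Add the decorator in server.py.
def search_notes(query: str, directory: str = "~/notes") -> str:
    """Return the names of markdown files whose contents match the query."""
    notes_dir = Path(directory).expanduser().resolve()
    if not notes_dir.exists():
        return f"Directory {notes_dir} does not exist."

    # Case-insensitive substring match over each markdown file's contents
    matches = [
        f.name
        for f in sorted(notes_dir.glob("*.md"))
        if query.lower() in f.read_text(encoding="utf-8").lower()
    ]

    if not matches:
        return f"No notes matching '{query}' found in {notes_dir}"
    return "Matching notes:\n" + "\n".join(f"* {name}" for name in matches)
```

Error handling with `McpError` would look the same as in the tutorial's tools, and is omitted here for brevity.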
The beauty of MCP is that once you understand this pattern, you can build extensions for almost anything—databases, APIs, file systems, or custom tools specific to your workflow.
Real-World Use Cases That Actually Work
Everyday use cases include:
Debugging Made Simple
Instead of spending hours tracing through error logs, you can ask Block Goose to analyze the problem:
```shell
# Run your script and pipe errors to goose for debugging
uv run index.py "some args" 2>&1 | goose run --instructions -

# Or create a debug instruction file
echo "debug this error and fix the problem in the script" > debug.txt
uv run index.py "some args" >> debug.txt 2>&1
goose run --instructions debug.txt
```
Code Migration
```shell
# Convert shell script to Python
echo "Convert this shell script to Python:" > migration.txt
cat script.sh >> migration.txt
goose run --instructions migration.txt
rm migration.txt
```
API Testing
```shell
# Test an API endpoint and analyze the response
curl -s https://jsonplaceholder.typicode.com/posts/1 | goose run --instructions -

# Generate test cases from API response
echo "Generate a Python test file for this API endpoint:" > api_test.txt
curl -s https://jsonplaceholder.typicode.com/posts/1 >> api_test.txt
goose run --instructions api_test.txt
```
Why Local AI Matters (And Why I’m Passionate About It)
Block Goose’s local-first approach keeps your proprietary code on your machine while still leveraging AI services. You control what gets sent where and can audit exactly what’s happening with your data.
This isn’t just about privacy—it’s about control. I’ve seen too many developers accidentally expose sensitive code to AI services because they couldn’t control what got sent where. Block Goose gives you that control back.
Getting Started with Block Goose
The setup is surprisingly straightforward:
- Install Goose — Follow the installation guide on their website.
- Configure your AI services — Connect your ChatGPT, Copilot, or other API keys.
- Start automating — Begin with simple tasks and gradually tackle more complex workflows.
When to Use Block Goose
Block Goose works best for:
- Complex debugging that requires multiple AI perspectives.
- Code migration and systematic refactoring.
- API integration with complex business logic.
- Documentation generation from existing codebases.
- Test data creation with validation requirements.
The Future of Local AI
Block Goose represents a shift toward intelligent, privacy-conscious development tools. The open-source nature means the community can extend it in ways the original developers never imagined.
Want to connect it to your internal APIs? Build an extension. Do you need it to work with your custom AI models? Add an MCP server.
Final Thoughts
By combining local execution with intelligent AI routing, Block Goose gives developers the best of both worlds: privacy and power.
Block Goose is open source and built by the team at Block (formerly Square), which gives me confidence in its long-term viability.
So, now that you know what Block Goose is, give it a whirl.