Goose (Block)
Goose by Block is an open-source AI agent for developers and teams who need flexible automation without locking into rigid platforms.
Reviewed by Mathijs Bronsdijk · Updated Apr 13, 2026

What is Goose?
Goose is an open source AI agent by Block that runs locally on your machine as a desktop app, CLI, or embeddable API. It goes beyond code suggestions to handle research, writing, automation, data analysis, and general problem-solving through natural language instructions. Built in Rust for speed and portability, Goose works with 15+ LLM providers and connects to 70+ extensions through the Model Context Protocol (MCP) standard. The project recently moved to the Agentic AI Foundation at the Linux Foundation, and the GitHub repository has over 41,000 stars.
Key Features
- Multi-Interface Access: Run Goose as a native desktop app (macOS, Linux, Windows), a command-line tool, or an embedded API in your own applications
- 15+ LLM Provider Support: Switch between Anthropic, OpenAI, Google, Ollama, OpenRouter, Azure, Bedrock, and more without changing your workflow
- 70+ MCP Extensions: Connect to external tools and services through the Model Context Protocol open standard, with built-in extensions for development, web scraping, memory, and data visualization
- Bring Your Own Subscription: Sign in with existing Claude, ChatGPT, or Gemini subscriptions instead of managing separate API keys
- Agent Autonomy Controls: Set boundaries for what Goose can do on its own, from fully autonomous operation to manual approval for each action
- Session Memory: The built-in memory extension lets Goose retain your preferences and context across conversations
- Mid-Session Extension Swapping: Enable or disable extensions during a conversation without restarting, and Goose auto-detects which extensions a task needs
- Built-in Security Scanning: Extensions are automatically scanned for malware before activation
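Most of these features surface through a few CLI commands. A minimal sketch of a first session (command names follow the Goose CLI; exact flags may vary between versions):

```shell
# Interactive setup: choose a provider, model, and extensions
goose configure

# Start an interactive session in the current project directory
goose session

# Run a one-shot task without opening an interactive session
goose run -t "summarize the TODO comments in this repo"
```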
Use Cases
- Software developers: Build complete applications from natural language prompts, refactor across files, and run terminal commands without leaving the conversation
- DevOps engineers: Automate deployment scripts, infrastructure changes, and monitoring setups through conversational instructions
- Data analysts: Process datasets, generate visualizations, and produce reports by describing the analysis in plain language
- Solo builders and indie hackers: Prototype full web applications from a single prompt and iterate quickly without deep expertise in every technology involved
- Teams with mixed LLM preferences: Standardize on one agent tool while letting individual team members pick their preferred model provider
Strengths and Weaknesses
Strengths:
- Fully open source under the permissive Apache 2.0 license, so you can inspect, modify, and self-host it
- The MCP extension ecosystem with 70+ integrations is one of the broadest among open source AI agents
- Works with any major LLM provider, avoiding vendor lock-in to a single model
- Active development backed by Block (formerly Square) with strong community momentum and 41,000+ GitHub stars
- Desktop, CLI, and API interfaces cover different workflows without needing separate tools
Weaknesses:
- Being a general-purpose agent, it may lack the depth of specialized coding agents like Cursor or Windsurf for IDE-specific workflows
- The extension ecosystem, while broad, includes community-built options that vary in quality and maintenance
- Setting up custom extensions requires familiarity with MCP server development, which has a learning curve for non-developers
- Documentation moved during the transition to the Linux Foundation, and some users report broken links or outdated guides
Getting Started
Install on macOS via Homebrew: brew install --cask block-goose (desktop) or brew install block-goose-cli (CLI)
Linux packages are available as DEB, RPM, and Flatpak. Windows supports direct download or WSL.
Configure your preferred LLM provider with an API key or connect an existing Claude, ChatGPT, or Gemini subscription.
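Taken together, the macOS CLI path above looks like this (Homebrew formula names as given; the configure step is interactive and prompts for provider, model, and API key):

```shell
# Install the CLI via Homebrew
brew install block-goose-cli

# Point Goose at your preferred LLM provider
goose configure

# Verify the install, then start a first session
goose --version
goose session
```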
License: Apache 2.0 (fully open source)
GitHub: github.com/block/goose
Documentation: goose-docs.ai
FAQ
Is Goose free to use?
Goose itself is completely free and open source under Apache 2.0. You do need access to an LLM provider, which may have its own costs, but you can use free providers like Ollama with local models for zero-cost operation.
What LLM providers does Goose support?
Goose supports 15+ providers including Anthropic, OpenAI, Google, Ollama, OpenRouter, Azure, and Bedrock. You can also connect existing Claude, ChatGPT, or Gemini subscriptions instead of using separate API keys.
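Switching providers does not require reinstalling anything. A sketch using environment-variable overrides (the GOOSE_PROVIDER and GOOSE_MODEL variable names follow Goose's configuration conventions; treat them as an assumption if your version differs):

```shell
# Override the configured provider and model for this shell only
export GOOSE_PROVIDER=anthropic
export GOOSE_MODEL=claude-sonnet-4-20250514
goose session

# Switch to a different provider without touching the saved config
export GOOSE_PROVIDER=openai
export GOOSE_MODEL=gpt-4o
goose session
```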
How does Goose compare to Cursor?
Goose is a general-purpose AI agent that handles coding, research, writing, and automation tasks across desktop, CLI, and API interfaces. Cursor is a dedicated AI code editor focused specifically on the IDE experience. Goose offers broader task coverage and model flexibility, while Cursor provides deeper integration with the coding workflow.
Can Goose run completely offline?
Yes. Pair Goose with a local model through Ollama or another local inference provider, and it runs entirely on your machine without sending data to external services.
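An offline setup can be sketched in two steps: pull a model into Ollama, then point Goose at it (the model name here is only an example, and the configure step is an interactive menu):

```shell
# Download a local model with Ollama (example model name)
ollama pull llama3.2

# In the interactive menu, select Ollama as the provider
# and the pulled model as the default
goose configure

# From here, sessions run against the local model only
goose session
```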
What are Goose extensions?
Extensions are add-ons built on the Model Context Protocol (MCP) that connect Goose to external tools and services. Over 70 extensions are available, covering development, web scraping, file management, memory, and data visualization. You can also build custom extensions as MCP servers.
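Extensions can be attached when a session starts. A sketch of both styles (the flags and the example MCP server command reflect current Goose CLI conventions and should be checked against your installed version):

```shell
# Start a session with an ad-hoc MCP extension,
# given as the command that launches the stdio server
goose session --with-extension "uvx mcp-server-fetch"

# Enable a built-in extension by name
goose session --with-builtin developer
```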
Is Goose only for developers?
No. While Goose started as a developer tool, it handles research, writing, data analysis, and general automation tasks. The desktop app provides a conversational interface that does not require programming knowledge for basic use.