Getting Started with NeuronDB

Introduction

📌 Branch & Version Selection

NeuronDB has three branches with different versions. Choose based on your needs:

Branch        Version       Status   Use When
main          3.0.0-devel   Latest   New projects, development, latest features (default)
REL2_STABLE   2.0.0         Stable   Production, stable v2.0 features
REL1_STABLE   1.0.0         Stable   Production, maximum stability required

Note: This documentation reflects version 3.0.0-devel from the main branch. For stable releases, use REL2_STABLE (v2.0.0) or REL1_STABLE (v1.0.0). See GitHub repository for branch details.

NeuronDB is a PostgreSQL AI ecosystem that provides GPU-accelerated vector search, ML inference, hybrid retrieval, and complete agent infrastructure. The ecosystem includes:

  • NeuronDB - PostgreSQL extension with vector search, ONNX model inference, and GPU acceleration
  • NeuronAgent - REST API and WebSocket agent runtime with long-term memory and tool execution
  • NeuronMCP - Model Context Protocol server with 100+ tools for MCP-compatible clients (like Claude Desktop)
  • NeuronDesktop - Unified web interface for managing all components

What you can build: Semantic search, RAG pipelines, AI agents with PostgreSQL-backed memory, and MCP integrations - all in one unified ecosystem.
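
To give a feel for what vector search in SQL looks like, here is a minimal, illustrative sketch run through psql. The extension name neurondb, the vector(3) column type, and the <-> distance operator are pgvector-style assumptions, not confirmed NeuronDB syntax; consult the NeuronDB SQL reference for the actual types and operators.

# Illustrative sketch only: assumes a database where the extension is installed,
# and pgvector-style vector types/operators; real NeuronDB syntax may differ.
psql <<'SQL'
CREATE EXTENSION IF NOT EXISTS neurondb;
CREATE TABLE docs (id bigserial PRIMARY KEY, embedding vector(3));
INSERT INTO docs (embedding) VALUES ('[1,2,3]'), ('[4,5,6]');
SELECT id FROM docs ORDER BY embedding <-> '[3,1,2]' LIMIT 5;  -- nearest neighbours
SQL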

Choose Your Path

Pick the installation method that best fits your needs:

Method               Best For                                Time          Difficulty
Simple Start         Beginners, fastest setup                5 minutes     ⭐ Easy
Docker Quick Start   Complete ecosystem, Docker users        5 minutes     ⭐ Easy
Quick Start Guide    Technical users, first queries          10 minutes    ⭐⭐ Medium
Source Build         Production, custom builds, developers   30+ minutes   ⭐⭐⭐ Advanced

💡 Note: New here? Start with Simple Start for a beginner-friendly walkthrough. For the complete ecosystem with Docker, use Docker Quick Start; technical users can jump straight to the Quick Start Guide.

Docker Quick Start

Run the complete NeuronDB ecosystem in under 5 minutes with Docker Compose. This method includes all components with GPU support (CUDA, ROCm, Metal) and requires no manual configuration.
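
Check prerequisites

Before starting, confirm that Docker Engine and the Compose v2 plugin are available, and (for CUDA hosts) that the GPU driver is visible. The checks below are a generic sketch using standard Docker and NVIDIA commands, not anything NeuronDB-specific; skip the GPU check on ROCm or Metal hosts.

# Generic pre-flight checks (not NeuronDB-specific)
docker --version
docker compose version      # Compose v2 plugin is required for "docker compose"

# Optional: confirm the NVIDIA driver is visible (CUDA hosts only)
nvidia-smi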

Start complete ecosystem

# Clone repository (main branch = 3.0.0-devel)
git clone https://github.com/neurondb-ai/neurondb.git
cd neurondb

# For a stable release, check out REL2_STABLE (2.0.0) or REL1_STABLE (1.0.0), e.g.:
# git checkout REL1_STABLE

# Start all services
docker compose up -d

# Verify services
docker compose ps

This starts:

  • NeuronDB (PostgreSQL with extension) on port 5433
  • NeuronAgent (REST API) on port 8080
  • NeuronMCP (MCP server)
  • NeuronDesktop (Web UI) on port 3000
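
Verify the services

Once docker compose ps shows the containers as running, you can probe the published ports to confirm each service is reachable. The commands below are a sketch based on the ports listed above; database credentials and any HTTP paths beyond the root are assumptions, so check the compose file or component docs for the actual values.

# Sketch: probe the published ports (credentials and HTTP paths are assumptions)
pg_isready -h localhost -p 5433                                    # NeuronDB (PostgreSQL)
curl -sS -o /dev/null -w '%{http_code}\n' http://localhost:8080/   # NeuronAgent REST API
# NeuronDesktop web UI: open http://localhost:3000 in a browser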

Continue to Quick Start guide →

Source Build (Advanced)

For production deployments or custom builds, install from source. This requires PostgreSQL 16-18, a C toolchain, and build dependencies.

See the Installation Guide for detailed platform-specific instructions (Ubuntu/Debian, macOS, Rocky Linux/RHEL).

Quick reference (Ubuntu/Debian)

sudo apt-get install -y postgresql-17 postgresql-server-dev-17 build-essential

git clone https://github.com/neurondb-ai/neurondb.git
cd neurondb

# For a stable release, check out REL2_STABLE (2.0.0) or REL1_STABLE (1.0.0), e.g.:
# git checkout REL1_STABLE

cd NeuronDB
make PG_CONFIG=/usr/lib/postgresql/17/bin/pg_config
sudo make install PG_CONFIG=/usr/lib/postgresql/17/bin/pg_config
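
Enable the extension

After make install, enable the extension in a database and confirm it shows up. The extension name neurondb is an assumption here; use the name reported by the build if it differs.

# Enable and verify the extension (assumes it is named "neurondb")
sudo -u postgres psql -c "CREATE EXTENSION IF NOT EXISTS neurondb;"
sudo -u postgres psql -c "\dx"   # the extension should appear in this list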

After installing the extension, you'll need to build and run NeuronAgent, NeuronMCP, and NeuronDesktop separately. See the component documentation for details.

Next Steps

After installation, use these guides: