mylocalai

Local AI Chat with LangGraph & MCP Tools

āš ļø Project Discontinued

This project is no longer being maintained. I’m moving on to work on something more interesting that people are actually excited about. Thanks to everyone who checked it out!


A Next.js chat application powered by LangGraph with MCP (Model Context Protocol) tools for real-time web search and data access. Features Server-Sent Events (SSE) streaming for real-time AI responses. Completely local and open source.

GitHub

šŸŽ„ Watch Demo
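
For a sense of what the SSE streaming described above looks like from the client side, here is a minimal Node sketch that consumes the stream. It assumes the backend route at app/langraph_backend/route.ts is exposed as POST /langraph_backend and accepts a JSON body with message and thread_id fields; those names are illustrative, see schemas.ts for the actual request contract.

// Consume the SSE response stream and print tokens as they arrive.
async function streamChat(message: string, threadId: string): Promise<void> {
  const res = await fetch("http://localhost:3000/langraph_backend", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Hypothetical body shape; the real schema lives in app/langraph_backend/schemas.ts.
    body: JSON.stringify({ message, thread_id: threadId }),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}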

Architecture

Core Components

Data Flow

User Message → LangGraph Agent → MCP Tools (Web Search) → LLM → SSE Stream → UI
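
The flow above can be wired up with LangChain's prebuilt ReAct agent. The sketch below is illustrative only: it assumes the @langchain/ollama, @langchain/langgraph, and @langchain/mcp-adapters packages and a streamable-HTTP MCP endpoint at /mcp_server/mcp (derived from app/mcp_server/[transport]/route.ts, not confirmed here); the app's actual wiring lives in app/langraph_backend/route.ts.

import { HumanMessage } from "@langchain/core/messages";
import { ChatOllama } from "@langchain/ollama";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MultiServerMCPClient } from "@langchain/mcp-adapters";

async function answer(question: string): Promise<string> {
  // Local LLM served by Ollama (see Quick Start below).
  const llm = new ChatOllama({ model: "qwen3:4b" });

  // Load the app's MCP tools (search, scrape, roll_dice). The URL and transport
  // value are assumptions about this repo's MCP route.
  const mcp = new MultiServerMCPClient({
    mcpServers: {
      mylocalai: { transport: "http", url: "http://localhost:3000/mcp_server/mcp" },
    },
  });
  const tools = await mcp.getTools();

  // ReAct-style agent: the LLM decides when to call the MCP tools.
  const agent = createReactAgent({ llm, tools });
  const result = await agent.invoke({ messages: [new HumanMessage(question)] });
  await mcp.close();

  const last = result.messages[result.messages.length - 1];
  return typeof last.content === "string" ? last.content : JSON.stringify(last.content);
}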

Features

šŸš€ Advanced AI Capabilities

šŸ’¬ Chat Interface

šŸ”’ Privacy & Performance

Quick Start

1. Install Ollama

# macOS
brew install ollama

# Windows/Linux
# Visit https://ollama.com for installer

2. Install Required Model

ollama pull qwen3:4b
# Alternative: ollama pull llama3.1:70b (larger model; see Model Configuration below)

3. Start Ollama Service

ollama serve

4. Run the Application

make prod

5. Open Browser

Navigate to http://localhost:3000

Technical Requirements

Hardware

Software

Key Dependencies

Configuration

Model Configuration

Edit app/page.tsx to change the model:

const requiredModel = 'qwen3:4b';
// or 'llama3.1:70b', etc.
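
One way to check that the configured model is actually available is to query Ollama's /api/tags endpoint (the same check used under Troubleshooting below). A minimal sketch; the helper is illustrative, not the app's own code:

// Returns true if Ollama is running and the required model is installed.
async function isModelInstalled(requiredModel: string): Promise<boolean> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) return false; // Ollama not reachable
  const data: { models: { name: string }[] } = await res.json();
  return data.models.some((m) => m.name.startsWith(requiredModel));
}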

MCP Tools Available

Web Scraper (scrape)

Dice Roller (roll_dice)

Development

Available Commands

make prod            # Install, build, and start production server
make dev             # Development server with hot reload
make clean           # Clean build artifacts
make help            # Show all available commands

Project Structure

app/
ā”œā”€ā”€ components/           # React components
│   ā”œā”€ā”€ ChatInterface.tsx # Main chat UI
│   ā”œā”€ā”€ ChatList.tsx     # Conversation sidebar
│   └── StatusBanner.tsx # Connection status indicator
ā”œā”€ā”€ langraph_backend/    # LangGraph API routes
│   ā”œā”€ā”€ route.ts        # Main SSE streaming endpoint
│   ā”œā”€ā”€ schemas.ts      # Request/response validation
│   ā”œā”€ā”€ lib/            # Utilities and checkpointer
│   └── conversations/  # Thread management API
│       ā”œā”€ā”€ route.ts    # List conversations
│       └── [thread_id]/route.ts # Get/delete specific conversation
ā”œā”€ā”€ mcp_server/         # MCP tool implementations
│   ā”œā”€ā”€ [transport]/    # MCP protocol handler
│   │   └── route.ts    # Tool registration and routing
│   ā”œā”€ā”€ tools/         # Individual tool definitions
│   │   ā”œā”€ā”€ googleSearch.ts # Google search tool
│   │   ā”œā”€ā”€ scrape.ts   # Web scraping tool
│   │   └── rollDice.ts # Random number generator
│   ā”œā”€ā”€ search/        # Google search implementation
│   └── scrape/        # Web scraping implementation
ā”œā”€ā”€ utils/             # Shared utilities
│   └── localStorage.ts # Browser storage helpers
ā”œā”€ā”€ layout.tsx         # Root layout component
└── page.tsx           # Main chat page
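
Under Next.js App Router conventions, the conversations/ routes above map to endpoints for listing, fetching, and deleting threads. A hedged sketch of calling them; the response shapes are not documented here and the thread ID is a placeholder:

// Base URL derived from app/langraph_backend/conversations/ (App Router convention).
const BASE = "http://localhost:3000/langraph_backend/conversations";

async function demoConversationApi(): Promise<void> {
  // List conversations.
  const conversations = await fetch(BASE).then((r) => r.json());
  console.log(conversations);

  // Fetch a specific thread, then delete it ("some-thread-id" is a placeholder).
  const thread = await fetch(`${BASE}/some-thread-id`).then((r) => r.json());
  console.log(thread);
  await fetch(`${BASE}/some-thread-id`, { method: "DELETE" });
}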

Adding New MCP Tools

  1. Create tool definition in app/mcp_server/tools/
  2. Register in app/mcp_server/[transport]/route.ts
  3. The tool will automatically be available to the LangGraph agent (see the sketch below)
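
A sketch of steps 1 and 2, assuming the MCP server uses the official TypeScript SDK's McpServer registration API (the [transport] route segment suggests an adapter such as mcp-handler); the file path matches the repo, but the exact registration helper may differ:

// app/mcp_server/tools/rollDice.ts (illustrative)
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

export function registerRollDice(server: McpServer): void {
  server.tool(
    "roll_dice",
    "Roll an n-sided die and return the result",
    { sides: z.number().int().min(2).default(6) },
    async ({ sides }) => {
      const roll = 1 + Math.floor(Math.random() * sides);
      return { content: [{ type: "text" as const, text: `Rolled ${roll} on a d${sides}` }] };
    }
  );
}

// Step 2: in app/mcp_server/[transport]/route.ts, call registerRollDice(server)
// wherever the existing tools are registered.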

Troubleshooting

Ollama Issues

# Check if Ollama is running
curl http://localhost:11434/api/tags

# List installed models
ollama list

# Re-pull the required model if it is missing from the list
ollama pull qwen3:4b

Performance Optimization

SSE Streaming Issues

MCP Tool Errors

API Endpoints

Chat Streaming

Conversation Management

MCP Tools

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License - See LICENSE file for details.

Acknowledgments