
Build a Local LLM API with Node.js, MongoDB, and Ollama (deepseek-r1)

This technical tutorial walks you through creating a local LLM API with a Node.js backend that connects to a MongoDB database (including joining dependent collections), exposes a REST API for real-time inference, and uses the open-source Ollama platform to run the deepseek-r1 LLM. The result is an AI API you can build and run entirely locally, with proper database and model integration.

Overview

  • LLM Integration: Run and query the DeepSeek-R1 model locally with Ollama. This forms the base of a self-hosted LLM API.
  • Database: Connect your server to MongoDB, including advanced queries using joins across collections with $lookup to achieve seamless MongoDB LLM integration.
  • Tech Stack: Node.js (Express), Ollama (with ollama NPM package), MongoDB (with mongodb or mongoose package). This setup enables robust Node.js LLM API development.

Ollama deepseek-r1 Setup Guide: Prerequisites

  • Install Node.js and npm. On Ubuntu/Debian:
sudo apt update
sudo apt install nodejs npm -y
node -v
npm -v
  • Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
  • Verify the installation:
ollama --version

Download the DeepSeek-R1 Model

ollama pull deepseek-r1

With this command, the model is downloaded to your local computer from Ollama's model library.

  • Start the Ollama server with the deepseek-r1 model:
ollama run deepseek-r1

This launches the Ollama deepseek-r1 API service locally, ready to accept API calls.
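
Before wiring up Node.js, you can confirm the model is reachable by calling Ollama's built-in REST endpoint directly (it listens on port 11434 by default):

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1",
  "prompt": "Say hello in one sentence.",
  "stream": false
}'

With "stream": false, Ollama returns a single JSON object whose response field contains the model's answer.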

Project Boilerplate

mkdir deepseek-node-llm
cd deepseek-node-llm
npm init -y
npm install express ollama mongodb

File Structure and Code by File

Your working directory should look like:

deepseek-node-llm/
│
├── db.js
├── joinCollections.js
├── ollamaClient.js
├── index.js
├── package.json

db.js — MongoDB Connection

const { MongoClient } = require('mongodb');

const uri = "mongodb://localhost:27017";
const dbName = "mydatabase";

let client; // cached client so every request reuses one connection pool

async function getDb() {
  if (!client) {
    client = new MongoClient(uri); // useUnifiedTopology is no longer needed in driver v4+
    await client.connect();
  }
  return client.db(dbName);
}

module.exports = getDb;

joinCollections.js — Join Collections by Dependency

const getDb = require('./db');

// Example joins orders with products and then with suppliers.
async function joinOrdersWithProductsSuppliers() {
  const db = await getDb();
  const results = await db.collection('orders').aggregate([
    // Join each order with its product (orders.product_id -> products._id).
    {
      $lookup: {
        from: 'products',
        localField: 'product_id',
        foreignField: '_id',
        as: 'product_details'
      }
    },
    // $lookup produces an array; unwind it so each order carries one product document.
    { $unwind: '$product_details' },
    // Join the embedded product with its supplier.
    {
      $lookup: {
        from: 'suppliers',
        localField: 'product_details.supplier_id',
        foreignField: '_id',
        as: 'supplier_details'
      }
    }
  ]).toArray();
  return results;
}

module.exports = { joinOrdersWithProductsSuppliers };
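
If your database is empty, the pipeline above has nothing to join. Here is a minimal, hypothetical seed script (the collection and field names match the pipeline; the sample values are made up) that you can run once with node seed.js:

// seed.js — one-off script to insert sample documents for the join example
const { MongoClient, ObjectId } = require('mongodb');

async function seed() {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  const db = client.db("mydatabase");

  const supplierId = new ObjectId();
  const productId = new ObjectId();

  await db.collection('suppliers').insertOne({ _id: supplierId, name: "Acme Supplies" });
  await db.collection('products').insertOne({ _id: productId, name: "Widget", supplier_id: supplierId });
  await db.collection('orders').insertOne({ product_id: productId, quantity: 3, ordered_at: new Date() });

  console.log("Sample data inserted");
  await client.close();
}

seed().catch(console.error);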

ollamaClient.js — Query DeepSeek-R1 via Ollama

const { Ollama } = require('ollama');

// Client for the locally running Ollama server (default host shown explicitly).
const ollama = new Ollama({ host: 'http://localhost:11434' });

// Sends a prompt to deepseek-r1 and returns the model's reply text.
async function queryDeepSeek(prompt) {
  const response = await ollama.chat({
    model: 'deepseek-r1',
    messages: [{ role: 'user', content: prompt }]
  });
  return response.message.content;
}

module.exports = { queryDeepSeek };
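
Note that deepseek-r1 is a reasoning model: its replies typically begin with a <think>…</think> block containing the chain of thought before the final answer. If you only want the final answer in your API responses, a small post-processing helper (my own addition, not required) does the trick:

// Removes the <think>...</think> reasoning block that deepseek-r1 prepends.
function stripThinking(text) {
  return text.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
}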

index.js — Main Express API Server

const express = require('express');
const { joinOrdersWithProductsSuppliers } = require('./joinCollections');
const { queryDeepSeek } = require('./ollamaClient');

const app = express();
app.use(express.json()); // Express's built-in JSON parser; no body-parser dependency needed
app.get('/api/joined-orders', async (req, res) => {
  try {
    const data = await joinOrdersWithProductsSuppliers();
    res.json(data);
  } catch (err) {
    res.status(500).json({ error: "DB error" });
  }
});
app.post('/api/llm', async (req, res) => {
  const prompt = req.body?.prompt;
  if (!prompt) {
    return res.status(400).json({ error: "Missing 'prompt' in request body" });
  }
  try {
    const answer = await queryDeepSeek(prompt);
    res.json({ answer });
  } catch (err) {
    res.status(500).json({ error: "LLM error" });
  }
});
app.listen(3000, () => console.log("API running on http://localhost:3000"));

This completes the Node.js AI API development for your local LLM setup.
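
Because the point of this stack is combining database context with model output, a natural next step is a route that feeds the joined MongoDB data into the prompt. Here is a hypothetical sketch (the route name and prompt wording are illustrative, not part of the files above):

// Hypothetical route: answer a question using the joined orders as context.
app.post('/api/llm-with-context', async (req, res) => {
  try {
    const orders = await joinOrdersWithProductsSuppliers();
    const prompt =
      `Using this order data:\n${JSON.stringify(orders, null, 2)}\n\n` +
      `Answer the question: ${req.body.prompt}`;
    const answer = await queryDeepSeek(prompt);
    res.json({ answer });
  } catch (err) {
    res.status(500).json({ error: "Combined query failed" });
  }
});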

Run the Full Stack

Start MongoDB

Make sure your local MongoDB instance is running (for example, via sudo systemctl start mongod on Ubuntu).

Start Ollama Server and DeepSeek-R1:

ollama run deepseek-r1 

Start Node.js API:

node index.js 
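
If everything is wired correctly, the console prints the startup message from index.js:

API running on http://localhost:3000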

API Examples

  • Join Orders, Products, Suppliers:
GET http://localhost:3000/api/joined-orders
  • Get LLM Response:
POST http://localhost:3000/api/llm
Content-Type: application/json
{
  "prompt": "Explain what is MongoDB aggregation?"
}
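
For quick testing from a terminal, the equivalent curl calls are:

curl http://localhost:3000/api/joined-orders

curl -X POST http://localhost:3000/api/llm \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Explain what is MongoDB aggregation?"}'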

Summary Table — Files and Purposes

  • db.js — Sets up and exports the MongoDB connection so you can reuse it across files without duplicating setup code.
  • joinCollections.js — Joins multiple MongoDB collections (orders, products, suppliers) using MongoDB's $lookup aggregation, letting you fetch related data as a single combined result.
  • ollamaClient.js — Connects to the locally running Ollama deepseek-r1 API and returns language model responses for your prompts.
  • index.js — The Express.js server that exposes two routes:
    • GET /api/joined-orders — Returns joined MongoDB data.
    • POST /api/llm — Takes a prompt and returns model output from deepseek-r1 via Ollama.
  • package.json — Project metadata and dependencies.

Summary:

This setup lets you build an AI API locally that serves model answers and combines them with relational (joined) data from MongoDB, all through a simple RESTful API built from readable, modular files.

Whether you are working with a local AI model API or scaling toward a self-hosted LLM API, this stack keeps things simple and efficient. For tailored solutions, reach out to us for Custom AI Support.

Practical tip:

Before interacting with the API (from Node.js), just open a terminal and start Ollama:

 ollama serve 

or ensure it’s launched via the Desktop app or as a background service.
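
You can confirm the server is up (assuming the default port 11434) with:

curl http://localhost:11434

which should reply with "Ollama is running".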

Interested in talking more?

Let's brew something together!

GET IN TOUCH