Deployment
📊 Model Deployment Memory and Supported Model Size Reference Guide
Note: "B" in the table represents "billion parameters model". Data shown are examples only; actual supported model sizes may vary depending on system optimization, deployment environment, and other hardware/software conditions.
| Memory (GB) | Example supported model sizes |
|---|---|
| 8 | ~0.8B / ~0.4B / ~1.0B / ~0.6B |
| 16 | 1.5B / 0.5B / ~2.0B / ~0.8B |
| 32 | ~2.8B / ~1.2B / ~3.5B / ~1.5B |
Note: Models below 0.5B may not deliver satisfactory performance on complex tasks. We are continuously improving cross-platform support; please submit an issue for feedback or compatibility problems on other operating systems.
MLX Acceleration: Mac M-series users can use MLX to run larger models (CLI-only).
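To decide which row of the table applies to your machine, check how much physical memory is actually available. A minimal sketch using standard OS tools (nothing project-specific):
# Linux: total and available memory, human-readable
free -h
# macOS: total physical memory in bytes
sysctl -n hw.memsize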
🐳 Option 1: Docker Setup
Note: On Mac M-series chips, the Docker setup carries a 25-30% performance overhead compared to the integrated setup, but offers an easier installation process.
Prerequisites
Docker and Docker Compose installed on your system
For Docker installation: Get Docker
For Docker Compose installation: Install Docker Compose
For Windows users: you can use MinGW to run make commands. You may need to modify the Makefile by replacing Unix-specific commands with Windows-compatible alternatives.
Memory Usage Settings (important):
Configure memory in Docker Desktop (macOS or Windows) under Dashboard -> Settings -> Resources
Make sure to allocate sufficient memory resources (at least 8GB recommended); you can verify the allocation as shown below.
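A quick way to confirm that Docker, Docker Compose, and the memory allocation are in place (a minimal sketch; the last command reports the memory available to the Docker engine in bytes):
# Confirm Docker and Docker Compose are installed
docker --version
docker compose version   # or: docker-compose --version on older installs
# Memory available to the Docker engine, in bytes (should cover the 8GB recommendation)
docker info --format '{{.MemTotal}}'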
Setup Steps
Clone the repository
git clone git@github.com:Mindverse/Second-Me.git
cd Second-Me
Start the containers
make docker-up
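If the containers do not come up cleanly, you can inspect them directly. This sketch assumes make docker-up wraps Docker Compose, as the Docker Compose prerequisite implies; run the commands from the Second-Me directory:
# List the project's containers and their current status
docker compose ps
# Follow container logs to diagnose startup failures (Ctrl+C to stop)
docker compose logs -f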
After starting the service (either with local setup or Docker), open your browser and visit:
http://localhost:3000
View help and more commands
make help
For custom Ollama model configuration, please refer to: Custom Model Config (Ollama)
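As a rough orientation before that guide, the Ollama side usually involves pulling a model and confirming the Ollama API is reachable. A minimal sketch, assuming Ollama is installed and running on its default port 11434 (llama3.2:1b is only an example model tag):
# Pull an example model and confirm it appears in the local model list
ollama pull llama3.2:1b
ollama list
# The Ollama API lists locally available models at /api/tags
curl http://localhost:11434/api/tags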
🚀 Option 2: Integrated Setup (Non-Docker)
Note: The Integrated Setup provides the best performance, especially for larger models, because it runs directly on your host system without containerization overhead.
Prerequisites
Python 3.12+ available on your system (managed via uv)
Node.js 23+ and npm installed
Basic build tools (cmake, make, etc.); version checks are shown below
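A quick sanity check for the toolchain (Python itself is installed and pinned by uv in the next step):
# Node.js 23+ and npm
node --version
npm --version
# Build tools
cmake --version
make --version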
Setup Steps
Clone the repository
git clone git@github.com:Mindverse/Second-Me.git
cd Second-Me
Setup Python Environment Using uv
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create virtual environment with Python 3.12
uv venv --python 3.12
# Activate the virtual environment
source .venv/bin/activate # Unix/macOS
# or
# .venv\Scripts\activate # Windows
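Once the environment is activated, a quick check confirms which interpreter is in use:
# Should report Python 3.12.x from the .venv just created
python --version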
Install dependencies
make setup
Start all services
make restart
After services are started, open your browser and visit:
http://localhost:3000
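If the page does not load, a quick check that the frontend is actually listening on port 3000 (assumes curl is available):
# Expect HTTP response headers if the service is up
curl -I http://localhost:3000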
💡 Advantages: This method offers better performance than Docker on macOS and Linux while still providing a simple setup process, because it installs directly on your host system without containerization overhead. (Windows has not been tested.)