# Connect Databases
Mako connects to your databases directly — no data leaves your infrastructure. You get AI-powered SQL, schema exploration, and query collaboration on top of your existing databases.
## Supported Databases

| Database | Protocol | Query Language |
|---|---|---|
| PostgreSQL | TCP (port 5432) | SQL |
| MySQL | TCP (port 3306) | SQL |
| MongoDB | MongoDB protocol | MongoDB queries |
| BigQuery | Google Cloud API | SQL |
| ClickHouse | HTTP / native | SQL |
| Amazon Redshift | TCP (port 5439) | SQL |
| Cloud SQL (Postgres) | Cloud SQL Auth Proxy | SQL |
| Cloudflare D1 | Cloudflare API | SQL |
| Cloudflare KV | Cloudflare API | JavaScript |
## Adding a Database

1. Go to **Settings → Databases** in your workspace
2. Select the database type
3. Enter your connection details (host, port, credentials, database name)
4. Click **Test Connection** to verify
5. Save — Mako will discover your schema automatically
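Before clicking Test Connection, it can save a round trip to verify that the host and port are even reachable from your network. A minimal sketch using Python's standard library (the hostname and port in the usage comment are placeholders):

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check a Postgres host before adding it in Mako.
# is_reachable("db.example.com", 5432)
```

If this returns `False`, the problem is network-level (firewall, wrong host/port) rather than credentials.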
## IP Whitelisting

If your database requires IP whitelisting, add the following address to your allowlist:

`34.79.190.46`

This is the static outbound IP used by Mako’s cloud service for all database connections.
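If your allowlist is expressed as CIDR blocks (a firewall rule or cloud security group), you can sanity-check that it actually covers Mako's outbound IP. A small sketch using Python's `ipaddress` module; the CIDR ranges shown are examples only:

```python
import ipaddress

MAKO_OUTBOUND_IP = ipaddress.ip_address("34.79.190.46")

def allowlist_covers(cidrs: list[str], ip=MAKO_OUTBOUND_IP) -> bool:
    """Return True if any CIDR block in the allowlist contains the IP."""
    return any(ip in ipaddress.ip_network(cidr) for cidr in cidrs)

# A /32 entry for the exact address is the tightest possible rule.
print(allowlist_covers(["10.0.0.0/8", "34.79.190.46/32"]))  # True
print(allowlist_covers(["10.0.0.0/8"]))                     # False
```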
## Database-Specific Notes

### PostgreSQL

Standard connection string format:

`postgresql://user:password@host:5432/database`

Works with any Postgres-compatible database (Supabase, Neon, etc.). SSL connections are supported.
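The individual connection fields Mako asks for (host, port, credentials, database name) map directly onto the parts of this URL. A quick way to see that mapping, using Python's `urllib.parse` (the credentials are placeholders):

```python
from urllib.parse import urlparse

url = "postgresql://user:password@host:5432/database"
parts = urlparse(url)

print(parts.username)          # user
print(parts.password)          # password
print(parts.hostname)          # host
print(parts.port)              # 5432
print(parts.path.lstrip("/"))  # database
```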
### MySQL

Standard connection string format:

`mysql://user:password@host:3306/database`

### MongoDB

When connecting via MongoDB Atlas, the connection string format differs from what Atlas shows.

Atlas gives you:

`mongodb+srv://user:password@cluster.server.mongodb.net/?appName=MyCluster`

Mako needs:

`mongodb+srv://user:password@cluster.server.mongodb.net/MyCluster`

Note that the query parameter (`?appName=`) becomes a path segment (`/MyCluster`).
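The Atlas-to-Mako rewrite above can be done mechanically: drop the query string and move the `appName` value into the path. A sketch using Python's `urllib.parse` (your cluster hostname and credentials will differ):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qs

def atlas_to_mako(uri: str) -> str:
    """Move the ?appName= query parameter into the path segment Mako expects."""
    parts = urlsplit(uri)
    app_name = parse_qs(parts.query).get("appName", [""])[0]
    return urlunsplit((parts.scheme, parts.netloc, "/" + app_name, "", ""))

atlas = "mongodb+srv://user:password@cluster.server.mongodb.net/?appName=MyCluster"
print(atlas_to_mako(atlas))
# mongodb+srv://user:password@cluster.server.mongodb.net/MyCluster
```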
**Atlas IP whitelisting:** Go to **Network Access** in your Atlas project → **Add IP Address** → enter `34.79.190.46` → **Confirm**.
### BigQuery

Requires a Google Cloud service account with BigQuery access. Upload the service account JSON key file when adding the connection.
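Before uploading the JSON key, you can check that it is actually a service-account key rather than some other Google credential type. An illustrative check with Python's `json` module; the field names are the ones Google includes in service-account key files:

```python
import json

# Fields present in a Google service-account JSON key file.
REQUIRED = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(raw: str) -> bool:
    """True if the JSON has the fields a BigQuery service-account key needs."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return key.get("type") == "service_account" and REQUIRED <= key.keys()

sample = '{"type": "service_account", "project_id": "p", "private_key": "k", "client_email": "e"}'
print(looks_like_service_account_key(sample))  # True
```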
### ClickHouse

Connects over HTTP. The default ports are 8123 (HTTP) and 9000 (native TCP). Cloud-hosted ClickHouse (e.g. ClickHouse Cloud) is supported.
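ClickHouse's HTTP interface accepts a query as a URL parameter, which is what "connects over HTTP" means in practice. A sketch of building such a request URL with Python's standard library (the hostname is a placeholder, and this only constructs the URL rather than sending it):

```python
from urllib.parse import urlencode

def clickhouse_query_url(host: str, query: str, port: int = 8123) -> str:
    """Build a ClickHouse HTTP-interface URL for a single query."""
    return f"http://{host}:{port}/?{urlencode({'query': query})}"

print(clickhouse_query_url("ch.example.com", "SELECT 1"))
# http://ch.example.com:8123/?query=SELECT+1
```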
### Amazon Redshift

Standard Redshift connection using the Postgres wire protocol:

`postgresql://user:password@cluster.region.redshift.amazonaws.com:5439/database`

### Cloud SQL (Postgres)

For Google Cloud SQL instances using the Cloud SQL Auth Proxy. Provide the instance connection name (e.g. `project:region:instance`) and credentials.
### Cloudflare D1

Connects via the Cloudflare API. Requires your Cloudflare account ID and an API token with D1 permissions.
### Cloudflare KV

Connects via the Cloudflare API. Uses JavaScript for key-value operations rather than SQL. Requires your Cloudflare account ID, namespace ID, and an API token.
## Multiple Databases

You can connect as many databases as you need — even different types. Mako handles the driver differences transparently. The AI agent knows which dialect to use for each database.
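One way to picture the dialect routing described above: each connected database carries its type, and the type determines the query language the agent uses. A simplified illustration that mirrors the table at the top of this page (the type keys are made up for this sketch; Mako's internals are not shown here):

```python
# Query language per database type, as listed in the Supported Databases table.
DIALECTS = {
    "postgresql": "SQL",
    "mysql": "SQL",
    "mongodb": "MongoDB queries",
    "bigquery": "SQL",
    "clickhouse": "SQL",
    "redshift": "SQL",
    "cloudsql-postgres": "SQL",
    "cloudflare-d1": "SQL",
    "cloudflare-kv": "JavaScript",
}

def dialect_for(db_type: str) -> str:
    """Look up the query language for a connected database's type."""
    return DIALECTS[db_type]

print(dialect_for("mongodb"))        # MongoDB queries
print(dialect_for("cloudflare-kv"))  # JavaScript
```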