n8n Integration Guide
Quick answer: n8n is open-source, self-hosted workflow automation (an alternative to Zapier/Make). Perfect for privacy-focused teams, developers, and agencies that need full control. Connect Surmado Intelligence Tokens to PostgreSQL, MySQL, custom APIs, AI models, and internal tools. Free forever when self-hosted, runs on your infrastructure, unlimited workflows. 5 complete n8n workflows included.
Reading time: 20 minutes
In this guide:
- Install self-hosted n8n with Docker (docker run n8nio/n8n, or docker-compose.yml for production), access it at http://localhost:5678, and get unlimited workflows with full data privacy on your infrastructure, saving $20-240/month versus Zapier's cloud-only model while keeping HIPAA/GDPR compliance in-house
- Build a Signal → PostgreSQL workflow: create the signal_reports table, watch hi@surmado.com with an Email IMAP trigger (imap.gmail.com), extract the token with a regex in a Code node, fetch the JSON with an HTTP Request node, flatten the metrics in a Transform Data node, and insert presence_rate, authority_score, competitors, and platform_variance (as JSONB) with a Postgres node for custom analytics
- Create scheduled competitive monitoring: a Cron trigger (0 9 * * 1 for Monday 9 AM) loads an array of 4 tokens, Split In Batches runs an HTTP GET per token, a Code node aggregates and ranks the results, Slack receives the ranked summary, and a competitive_snapshots PostgreSQL table stores historical trends
- Integrate GPT-4 strategic analysis: fetch token data with HTTP Request, prompt an OpenAI Chat Model node with Presence Rate, Authority Score, Ghost Influence, and competitor metrics, generate a 3-sentence assessment plus 2 strategic priorities and a timeline, email the result to the marketing team, and store it in an ai_analyses PostgreSQL table to track recommendations over time
- Set up multi-client agency automation: a Schedule Trigger (1st of the month, 10 AM) queries 20 client tokens from a clients table, Split In Batches loops through them, a Puppeteer Code node generates PDF reports with metric charts, Google Drive uploads land in client-specific folders, and templated notification emails go out (saving 20-40 hours/month of manual reporting)
Why n8n: Self-hosted (data stays private), code-based transforms, database connections, AI agent integration
Prerequisites
What You Need
- n8n installation (self-hosted or n8n Cloud)
- Surmado reports with Intelligence Tokens
- Email access (IMAP for watching emails)
- Destination services (database, API, webhooks)
n8n Hosting Options
Self-hosted (Free, recommended):
- Docker container on your server
- Full control, unlimited workflows
- Data stays on your infrastructure
n8n Cloud ($20/month):
- Managed hosting by n8n
- No server setup required
- 2,500 workflow executions/month
This guide assumes a self-hosted Docker setup.
Installation (Self-Hosted)
Docker Installation
Prerequisites: Docker installed on Linux/Mac/Windows
# Pull n8n Docker image
docker pull n8nio/n8n
# Run n8n container
docker run -it --rm \
--name n8n \
-p 5678:5678 \
-v ~/.n8n:/home/node/.n8n \
n8nio/n8n
Access: Open browser to http://localhost:5678
Create account: Set up credentials (stored locally)
Docker Compose (Production)
File: docker-compose.yml
version: '3.8'
services:
  n8n:
    image: n8nio/n8n
    container_name: n8n
    restart: unless-stopped
    ports:
      - "5678:5678"
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=admin
      - N8N_BASIC_AUTH_PASSWORD=your_password
      - N8N_HOST=n8n.yourcompany.com
      - N8N_PROTOCOL=https
      - WEBHOOK_URL=https://n8n.yourcompany.com/
    volumes:
      - ~/.n8n:/home/node/.n8n
Start:
docker-compose up -d
Access: https://n8n.yourcompany.com (with reverse proxy)
Workflow 1: Signal to PostgreSQL Database
Use Case
Store all Signal report data in PostgreSQL database for custom analytics, dashboards, historical tracking.
Setup Steps
Step 1: Create PostgreSQL Table
CREATE TABLE signal_reports (
id SERIAL PRIMARY KEY,
token VARCHAR(50) UNIQUE NOT NULL,
business_name VARCHAR(255),
report_date TIMESTAMP,
presence_rate DECIMAL(5,2),
authority_score INTEGER,
ghost_influence DECIMAL(5,2),
category_share DECIMAL(5,2),
top_competitor_name VARCHAR(255),
top_competitor_rate DECIMAL(5,2),
competitive_gap DECIMAL(5,2),
tier VARCHAR(20),
platform_variance JSONB,
competitors JSONB,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
Step 2: Create n8n Workflow
In n8n:
- Click New Workflow
- Name: Signal → PostgreSQL
Step 3: Email Trigger Node
Node: Email (IMAP)
Credentials:
- Host: imap.gmail.com
- Port: 993
- SSL: Yes
- User: you@gmail.com
- Password: (Gmail app password)
Settings:
- Mailbox: INBOX
- From: hi@surmado.com
- Subject: Signal Report
- Check Interval: 1 minute
Output: Email data including subject, body, attachments
Step 4: Extract Token Node
Node: Code (JavaScript)
Code:
const subject = $input.first().json.subject;
// Extract token with regex
const pattern = /SIG-\d{4}-\d{2}-[A-Z0-9]{5}/;
const match = subject.match(pattern);
if (match) {
return {
json: {
token: match[0]
}
};
} else {
throw new Error('Token not found in subject');
}
Output: { "token": "SIG-2025-11-A1B2C" }
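The extraction logic can be sanity-checked outside n8n with plain Node.js; the sample subject line below is illustrative:

```javascript
// Standalone version of the Extract Token step above.
// The sample subject line is illustrative, not a real report.
function extractToken(subject) {
  const pattern = /SIG-\d{4}-\d{2}-[A-Z0-9]{5}/;
  const match = subject.match(pattern);
  if (!match) throw new Error('Token not found in subject');
  return match[0];
}

console.log(extractToken('Signal Report - SIG-2025-11-A1B2C is ready'));
// → SIG-2025-11-A1B2C
```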
Step 5: HTTP Request Node
Node: HTTP Request
Method: GET
URL: https://api.surmado.com/intelligence/{{ $json.token }}
Authentication: None
Response Format: JSON
Output: Full Intelligence Token JSON response
Step 6: Transform Data Node
Node: Code (JavaScript)
Code:
const data = $input.first().json;
// Extract and transform metrics
const metrics = data.metrics || {};
const business = data.business || {};
const competitors = data.competitors || [];
const platformVariance = data.platform_variance || {};
// Calculate competitive gap
const yourRate = metrics.presence_rate * 100;
const topCompetitorRate = competitors.length > 0
? competitors[0].presence_rate * 100
: 0;
const competitiveGap = yourRate - topCompetitorRate;
return {
json: {
token: data.token,
business_name: business.name,
report_date: data.created_at,
presence_rate: Math.round(yourRate * 100) / 100,
authority_score: metrics.authority_score || null,
ghost_influence: Math.round((metrics.ghost_influence * 100) * 100) / 100,
category_share: Math.round((metrics.category_share * 100) * 100) / 100,
top_competitor_name: competitors.length > 0 ? competitors[0].name : null,
top_competitor_rate: Math.round(topCompetitorRate * 100) / 100,
competitive_gap: Math.round(competitiveGap * 100) / 100,
tier: data.tier,
platform_variance: platformVariance,
competitors: competitors
}
};
Output: Flattened data ready for PostgreSQL insert
Step 7: PostgreSQL Node
Node: Postgres
Credentials:
- Host:
localhost(or database host) - Database:
surmado - User:
postgres - Password: (database password)
- Port: 5432
Operation: Insert
Table: signal_reports
Columns (mapped from previous node):
- token: {{ $json.token }}
- business_name: {{ $json.business_name }}
- report_date: {{ $json.report_date }}
- presence_rate: {{ $json.presence_rate }}
- authority_score: {{ $json.authority_score }}
- ghost_influence: {{ $json.ghost_influence }}
- category_share: {{ $json.category_share }}
- top_competitor_name: {{ $json.top_competitor_name }}
- top_competitor_rate: {{ $json.top_competitor_rate }}
- competitive_gap: {{ $json.competitive_gap }}
- tier: {{ $json.tier }}
- platform_variance: {{ JSON.stringify($json.platform_variance) }}
- competitors: {{ JSON.stringify($json.competitors) }}
Step 8: Error Handler Node
Add Error Workflow:
- Click workflow settings → Add Error Workflow
- Create new workflow: Error Handler - Signal
Error Workflow Nodes:
- Webhook (receives error data)
- Send Email (notify admin)
Email content:
Subject: n8n Error - Signal to PostgreSQL Failed
Workflow: Signal → PostgreSQL
Error: {{ $json.error.message }}
Token: {{ $json.execution.data.token }}
Timestamp: {{ $now }}
Check n8n logs for details.
Step 9: Activate and Test
- Click Activate (toggle top right)
- Send test Signal email
- Verify row inserted in PostgreSQL
Query to verify:
SELECT * FROM signal_reports ORDER BY created_at DESC LIMIT 5;
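Once a few reports accumulate, month-over-month movement can be computed in a Code node. A minimal sketch, assuming rows shaped like the signal_reports table above and sorted oldest-first (the sample values are invented):

```javascript
// Hypothetical helper: given report rows sorted oldest-first,
// compute the change in presence_rate between consecutive reports.
// Row shape mirrors the signal_reports table columns.
function presenceRateTrend(rows) {
  return rows.map((row, i) => ({
    report_date: row.report_date,
    presence_rate: row.presence_rate,
    change: i === 0 ? null : +(row.presence_rate - rows[i - 1].presence_rate).toFixed(2)
  }));
}

const trend = presenceRateTrend([
  { report_date: '2025-10-01', presence_rate: 38.5 },
  { report_date: '2025-11-01', presence_rate: 42.0 }
]);
console.log(trend[1].change); // → 3.5
```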
Complete Workflow Diagram
[Email IMAP Trigger]
↓
[Extract Token (Code)]
↓
[HTTP Request (Intelligence Token API)]
↓
[Transform Data (Code)]
↓
[PostgreSQL Insert]
↓ (on error)
[Error Workflow → Send Email]
Workflow 2: Scheduled Competitive Monitoring
Use Case
Every Monday 9 AM, fetch your Signal token + 3 competitors, compare metrics, store in database, send Slack summary.
Setup Steps
Step 1: Cron Trigger Node
Node: Schedule Trigger
Trigger Rules:
- Mode: Cron Expression
- Cron: 0 9 * * 1 (every Monday at 9 AM)
- Timezone: America/New_York (or your timezone)
Step 2: Set Tokens Node
Node: Set
Values:
- Name: tokens
- Value: ["SIG-2025-11-A1B2C", "SIG-2025-11-COMP1", "SIG-2025-11-COMP2", "SIG-2025-11-COMP3"]
Step 3: Loop Through Tokens
Node: Split In Batches
Batch Size: 1
Input: {{ $json.tokens }}
Step 4: HTTP Request for Each Token
Node: HTTP Request
Method: GET
URL: https://api.surmado.com/intelligence/{{ $json.value }}
Output: JSON for each token (runs 4 times for 4 tokens)
Step 5: Aggregate Results
Node: Code (JavaScript)
Code:
// Get all items from previous node
const items = $input.all();
// Transform to comparison table
const results = items.map(item => {
const data = item.json;
const metrics = data.metrics || {};
return {
business_name: data.business?.name || 'Unknown',
presence_rate: Math.round(metrics.presence_rate * 100 * 10) / 10,
authority_score: metrics.authority_score || 0,
token: data.token
};
});
// Sort by presence_rate descending
results.sort((a, b) => b.presence_rate - a.presence_rate);
// Add rank
results.forEach((r, idx) => {
r.rank = idx + 1;
});
return {
json: {
results: results,
summary: results.map(r =>
`${r.rank}. ${r.business_name}: ${r.presence_rate}% (Authority: ${r.authority_score})`
).join('\n')
}
};
Output: Ranked comparison with summary text
Step 6: Send Slack Message
Node: Slack
Credentials: (Connect Slack workspace)
Channel: #marketing
Message:
*Weekly Competitive Presence Rate Summary*
{{ $json.summary }}
---
Your rank: #{{ $json.results.find(r => r.token === 'SIG-2025-11-A1B2C')?.rank || 'N/A' }}
Run new Signal reports if data is outdated.
Step 7: Store in Database
Node: Postgres
Operation: Insert
Table: competitive_snapshots
Schema:
CREATE TABLE competitive_snapshots (
id SERIAL PRIMARY KEY,
snapshot_date DATE,
business_name VARCHAR(255),
presence_rate DECIMAL(5,2),
authority_score INTEGER,
rank INTEGER,
token VARCHAR(50),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
Insert (Loop through results): Use Loop Over Items node before PostgreSQL to insert each result as separate row
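The expansion from one aggregated item back to per-business rows can be done in a small Code node before the Postgres insert. A sketch, assuming the results shape produced in Step 5 (the sample row is invented):

```javascript
// Expands the aggregated results array into one item per business,
// matching the competitive_snapshots columns above.
function toSnapshotRows(results, snapshotDate) {
  return results.map(r => ({
    json: {
      snapshot_date: snapshotDate,
      business_name: r.business_name,
      presence_rate: r.presence_rate,
      authority_score: r.authority_score,
      rank: r.rank,
      token: r.token
    }
  }));
}

const rows = toSnapshotRows(
  [{ business_name: 'Acme', presence_rate: 42.5, authority_score: 61, rank: 1, token: 'SIG-2025-11-A1B2C' }],
  '2025-11-03'
);
```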
Step 8: Activate
Set active: Yes
Result: Every Monday 9 AM, automated competitive summary in Slack + database
Workflow 3: AI Agent Integration (LangChain)
Use Case
Pass Signal Intelligence Token data to AI agent (GPT-4) for strategic analysis and recommendations.
Setup Steps
Step 1: HTTP Request for Token Data
Node: HTTP Request
URL: https://api.surmado.com/intelligence/SIG-2025-11-A1B2C
Step 2: OpenAI Node
Node: OpenAI (Chat Model)
Credentials: OpenAI API key
Model: gpt-4
Prompt:
You are a strategic marketing advisor analyzing AI visibility metrics.
Here is Signal report data:
- Business: {{ $json.business.name }}
- Presence Rate: {{ $json.metrics.presence_rate * 100 }}%
- Authority Score: {{ $json.metrics.authority_score }}
- Ghost Influence: {{ $json.metrics.ghost_influence * 100 }}%
- Top Competitor: {{ $json.competitors[0].name }} ({{ $json.competitors[0].presence_rate * 100 }}%)
Provide:
1. 3-sentence assessment of competitive position
2. Top 2 strategic priorities for improving Presence Rate
3. Expected timeline for improvement (optimistic and realistic)
Keep response concise and actionable.
Output: GPT-4 strategic analysis
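If you prefer assembling the prompt in a Code node before the OpenAI call (useful for adding conditional context), here is a sketch; the sample business data is invented, and the field names follow the Intelligence Token JSON used throughout this guide:

```javascript
// Builds the analysis prompt from Intelligence Token data.
// Field names follow the token JSON used elsewhere in this guide;
// the sample data below is invented for illustration.
function buildAnalysisPrompt(data) {
  const m = data.metrics;
  const top = data.competitors[0];
  return [
    'You are a strategic marketing advisor analyzing AI visibility metrics.',
    'Here is Signal report data:',
    `- Business: ${data.business.name}`,
    `- Presence Rate: ${(m.presence_rate * 100).toFixed(1)}%`,
    `- Authority Score: ${m.authority_score}`,
    `- Ghost Influence: ${(m.ghost_influence * 100).toFixed(1)}%`,
    `- Top Competitor: ${top.name} (${(top.presence_rate * 100).toFixed(1)}%)`,
    'Provide a 3-sentence assessment, top 2 strategic priorities, and an improvement timeline.'
  ].join('\n');
}

const prompt = buildAnalysisPrompt({
  business: { name: 'Acme Dental' },
  metrics: { presence_rate: 0.42, authority_score: 61, ghost_influence: 0.18 },
  competitors: [{ name: 'SmileCo', presence_rate: 0.55 }]
});
```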
Step 3: Send Analysis via Email
Node: Send Email
To: marketing@yourcompany.com
Subject: AI Strategic Analysis - {{ $('HTTP Request').first().json.business.name }} Signal Report
Body:
Signal report for {{ $('HTTP Request').first().json.business.name }} has been analyzed by AI.
## Strategic Analysis
{{ $('OpenAI').first().json.message.content }}
## Raw Metrics
Presence Rate: {{ $('HTTP Request').first().json.metrics.presence_rate * 100 }}%
Authority Score: {{ $('HTTP Request').first().json.metrics.authority_score }}
Ghost Influence: {{ $('HTTP Request').first().json.metrics.ghost_influence * 100 }}%
Intelligence Token: {{ $('HTTP Request').first().json.token }}
Step 4: Store Analysis in Database
Table: ai_analyses
CREATE TABLE ai_analyses (
id SERIAL PRIMARY KEY,
token VARCHAR(50),
analysis TEXT,
model VARCHAR(50),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
Insert:
- token: {{ $json.token }}
- analysis: {{ $('OpenAI').first().json.message.content }}
- model: gpt-4
Use Cases for AI Integration
Scenario 1: Automatic gap analysis (compare metrics, generate action plan)
Scenario 2: Multi-report synthesis (analyze 10 clients, identify patterns)
Scenario 3: Custom alerting (AI determines if metrics require urgent attention)
Workflow 4: Scan to Webhook for Custom Dashboard
Use Case
Send Scan report data to custom internal dashboard via webhook (real-time updates).
Setup Steps
Step 1: Email Trigger
Node: Email IMAP
Filter: Subject contains “Scan Report”
Step 2: Extract Token
Node: Code
Pattern: SCAN-\d{4}-\d{2}-[A-Z0-9]{5}
Step 3: Fetch Scan Data
Node: HTTP Request
URL: https://api.surmado.com/intelligence/{{ $json.token }}
Step 4: Transform for Dashboard
Node: Code
const data = $input.first().json;
const scores = data.category_scores || {};
const issues = data.issues || [];
// Filter high-severity issues
const highIssues = issues.filter(i => i.severity === 'high');
// Format for dashboard API
return {
json: {
website: data.website?.url,
token: data.token,
timestamp: data.created_at,
scores: {
overall: data.overall_score,
technical_seo: scores.technical_seo,
performance: scores.performance,
accessibility: scores.accessibility,
security: scores.security
},
alerts: highIssues.map(issue => ({
page: issue.page,
issue: issue.issue,
severity: 'high'
}))
}
};
Step 5: Send to Custom Webhook
Node: HTTP Request
Method: POST
URL: https://dashboard.yourcompany.com/api/scan-update
Headers:
- Content-Type: application/json
- Authorization: Bearer YOUR_API_KEY
Body: {{ JSON.stringify($json) }}
Result: Dashboard receives real-time Scan update
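For reference, the request that the HTTP Request node sends can be modeled in plain JavaScript; the URL and API key below are placeholders for your own values:

```javascript
// Models the webhook request from Step 5.
// The URL and API key are placeholders - substitute your own.
function buildScanWebhookRequest(payload, apiKey) {
  return {
    method: 'POST',
    url: 'https://dashboard.yourcompany.com/api/scan-update',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`
    },
    body: JSON.stringify(payload)
  };
}

const req = buildScanWebhookRequest(
  { token: 'SCAN-2025-11-X1Y2Z', scores: { overall: 87 } },
  'YOUR_API_KEY'
);
```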
Workflow 5: Multi-Client Reporting (Agencies)
Use Case
Process 20 client Signal reports, generate individual PDFs, upload to client-specific Google Drive folders.
Setup Steps
Step 1: Trigger on Schedule
Node: Schedule Trigger
Cron: 0 10 1 * * (1st of month, 10 AM for monthly reports)
Step 2: Fetch Tokens from Database
Node: Postgres
Operation: Execute Query
Query:
SELECT client_name, latest_signal_token
FROM clients
WHERE active = true
ORDER BY client_name;
Output: Array of clients with tokens
Step 3: Loop Through Clients
Node: Split In Batches
Batch Size: 1
Step 4: Fetch Signal Data
Node: HTTP Request
URL: https://api.surmado.com/intelligence/{{ $json.latest_signal_token }}
Step 5: Generate PDF Report
Node: Code (with Puppeteer)
Requires: Puppeteer npm package in n8n
Code:
const puppeteer = require('puppeteer');
const data = $input.first().json;
const html = `
<html>
<head><style>
body { font-family: Arial; padding: 40px; }
h1 { color: #333; }
.metric { margin: 20px 0; padding: 15px; background: #f5f5f5; }
</style></head>
<body>
<h1>${data.business.name} - Signal Report</h1>
<div class="metric">
<h2>Presence Rate</h2>
<p>${(data.metrics.presence_rate * 100).toFixed(1)}%</p>
</div>
<div class="metric">
<h2>Authority Score</h2>
<p>${data.metrics.authority_score}/100</p>
</div>
<p>Intelligence Token: ${data.token}</p>
</body>
</html>
`;
const browser = await puppeteer.launch({ args: ['--no-sandbox'] }); // '--no-sandbox' is typically required inside Docker
const page = await browser.newPage();
await page.setContent(html);
const pdf = await page.pdf({ format: 'A4' });
await browser.close();
return {
json: {
client_name: data.business.name,
pdf_base64: pdf.toString('base64')
}
};
Output: Base64-encoded PDF
Step 6: Upload to Google Drive
Node: Google Drive
Credentials: (Connect Google account)
Operation: Upload a File
File Name: {{ $json.client_name }}_Signal_Report_{{ $now.format('YYYY-MM') }}.pdf
File Content: {{ Buffer.from($json.pdf_base64, 'base64') }}
Folder: (Client-specific folder ID from database)
Step 7: Send Email to Client
Node: Send Email
To: {{ $('Postgres').first().json.client_email }}
Subject: Your Monthly Signal Report - {{ $json.client_name }}
Body:
Hi {{ $json.client_name }},
Your Signal report for this month is ready.
Presence Rate: {{ $('HTTP Request').first().json.metrics.presence_rate * 100 }}%
Authority Score: {{ $('HTTP Request').first().json.metrics.authority_score }}
Full report has been uploaded to your Google Drive folder.
Questions? Reply to this email.
Best regards,
[Your Agency]
Advanced n8n Features
1. Custom Nodes (JavaScript)
Use case: Reusable logic for Intelligence Token parsing
Create: Tools → Custom Nodes → New Node
Code:
// Simplified sketch: a production custom node also declares an
// INodeType `description` (displayName, name, inputs, outputs, properties)
class IntelligenceTokenParser {
  async execute() {
    // 'token' is the node parameter configured in the editor
    const token = this.getNodeParameter('token', 0);
    const response = await this.helpers.request({
      method: 'GET',
      url: `https://api.surmado.com/intelligence/${token}`,
      json: true
    });
    return [[{ json: response }]];
  }
}
module.exports.nodeClass = IntelligenceTokenParser;
Usage: Add custom node to workflows, configure token input
2. Sub-Workflows
Use case: Reusable “fetch and parse” logic across multiple workflows
Create: Workflow → Save as Sub-Workflow
Call from parent: Execute Workflow node
3. Conditional Execution
Use case: Different actions based on Presence Rate tier
Node: IF
Condition: {{ $json.metrics.presence_rate }} Greater than 0.5
True path: Send congratulations email
False path: Send improvement recommendations
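The same branch decision can be expressed in a Code node if you later need more than two paths; a sketch using the 0.5 threshold from the condition above:

```javascript
// Mirrors the IF-node logic: classify a report by presence_rate.
// The 0.5 threshold matches the condition above.
function nextAction(presenceRate) {
  return presenceRate > 0.5
    ? 'send_congratulations'
    : 'send_improvement_recommendations';
}

console.log(nextAction(0.62)); // → send_congratulations
console.log(nextAction(0.31)); // → send_improvement_recommendations
```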
4. Queue Mode (High Volume)
For agencies processing 100+ reports/day:
Enable: n8n settings → Queue Mode: On
Benefits: Parallel execution, better performance, Redis-backed queue
n8n vs Zapier vs Make.com
When to Use n8n
Advantages:
- Self-hosted (data privacy, compliance)
- Unlimited workflows (no per-workflow fees)
- Database connections (PostgreSQL, MySQL, MongoDB)
- Custom code (full JavaScript/Python support)
- AI model integration (OpenAI, Anthropic, local LLMs)
- Free forever (self-hosted)
Use n8n when:
- Privacy-critical data (HIPAA, GDPR)
- Complex transformations (custom logic)
- Database-heavy workflows
- AI agent integrations
- High volume (1000+ executions/day)
- Budget constraints (free self-hosted)
When to Use Zapier/Make.com
Use Zapier/Make when:
- No technical team (can’t manage servers)
- Quick setup needed (no infrastructure)
- Common integrations only (Gmail, Sheets, Slack)
- Low volume (< 100 executions/month)
Self-Hosting Best Practices
1. Security
Enable HTTPS: Use Nginx reverse proxy with Let’s Encrypt
Basic auth: Set N8N_BASIC_AUTH_ACTIVE=true
Firewall: Restrict port 5678 to internal network only
2. Backups
Backup n8n data:
tar -czf n8n-backup-$(date +%Y%m%d).tar.gz ~/.n8n
Schedule: Daily backups via cron
Restore:
tar -xzf n8n-backup-YYYYMMDD.tar.gz -C ~/
3. Monitoring
Check n8n health: curl http://localhost:5678/healthz
Docker logs:
docker logs n8n --tail 100 -f
Alerting: Use monitoring tools (Prometheus, Grafana) to track execution failures
4. Updates
Pull latest n8n:
docker pull n8nio/n8n:latest
docker-compose down
docker-compose up -d
Check release notes: n8n changelog
Troubleshooting
Issue 1: “Cannot connect to database”
Symptom: PostgreSQL node fails with connection error
Causes:
- Database not running
- Wrong credentials
- Firewall blocking port 5432
Fix:
- Verify database is running: psql -U postgres -c "SELECT 1;"
- Check credentials in n8n Credentials section
- Allow n8n container network access to database
Issue 2: “Module not found” in Code node
Symptom: Custom npm packages not available
Causes:
- Package not installed in n8n Docker container
Fix:
- Create custom Dockerfile with npm packages:
FROM n8nio/n8n
RUN npm install -g puppeteer
- Build custom image: docker build -t n8n-custom .
- Use custom image in docker-compose
Issue 3: Workflow Execution Timeout
Symptom: Long-running workflows fail after 2 minutes
Causes:
- Default timeout too short
Fix:
- Set environment variable: EXECUTIONS_TIMEOUT=600 (10 minutes)
- Or: EXECUTIONS_TIMEOUT_MAX=1800 (30 minutes max)
Frequently Asked Questions
Is n8n truly free forever?
Yes. Self-hosted n8n is open-source (Apache 2.0 license). No usage limits, no hidden fees. Pay only for infrastructure (server costs). n8n Cloud ($20/month) is optional managed service.
Can n8n handle 1000+ executions per day?
Yes. With queue mode and Redis, n8n scales to high volumes. Agencies process 10,000+ client reports monthly on a single server.
Does n8n work with local LLMs (Llama, Mistral)?
Yes. Use HTTP Request node to call local LLM API (Ollama, LM Studio), or custom Code node with Python bindings.
Can I migrate Zapier workflows to n8n?
Partially. n8n has equivalents for most Zapier apps. Complex workflows may need code adjustments. Use n8n community templates as starting point.
How do I backup Intelligence Token data in n8n?
Use PostgreSQL COPY command to export signal_reports table to CSV, or set up automated backups with pg_dump. n8n workflows themselves are backed up with ~/.n8n directory.
Need help setting up self-hosted n8n for Surmado integrations? Contact hi@surmado.com for deployment consulting, custom workflow templates, or database schema recommendations.