The Ultimate n8n Automation Cheatsheet 2025
Field Manual for Automation Engineers
This cheatsheet distills 20+ years of enterprise automation experience into actionable patterns, shortcuts, and best practices for building lightning-fast, reliable, and scalable workflows in n8n.
1. Introduction
What is n8n?
n8n is an open-source, self-hostable workflow automation platform that enables you to connect APIs, services, and databases with little or no code, dropping into JavaScript when you need it. Unlike Zapier or Make, n8n gives you complete control over your data, infrastructure, and execution logic.
Key Differentiators:
- ✅ Self-hostable - Run on your infrastructure
- ✅ Open-source - Full transparency and customization
- ✅ Visual workflow builder - Intuitive drag-and-drop interface
- ✅ Code nodes - Write custom JavaScript/TypeScript when needed
- ✅ Webhook-first - Built for event-driven architectures
- ✅ Cost-effective - No per-task pricing, unlimited executions
The Automation Mindset
⚠️ Pro Tip: Before building, ask: "What manual process am I replacing, and what's the failure cost?"
Core Principles:
- Modularity - Build reusable workflow components
- Resilience - Design for failure with retries and fallbacks
- Observability - Log everything, monitor execution times
- Scalability - Batch operations, avoid rate limits
- Security - Encrypt credentials, validate webhook signatures
2. Setup & Configuration
Installation Methods
Desktop App (Quick Start)
# Download from n8n.io/download
# Or via Homebrew (macOS)
brew install n8n
Docker (Recommended for Production)
docker run -it --rm \
--name n8n \
-p 5678:5678 \
-v ~/.n8n:/home/node/.n8n \
n8nio/n8n
Docker Compose (Full Stack)
version: '3.8'
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=admin
      - N8N_BASIC_AUTH_PASSWORD=${N8N_PASSWORD}
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - postgres
  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=n8n
      - POSTGRES_USER=n8n
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
volumes:
  n8n_data:
  postgres_data:
Cloud (n8n Cloud)
- Sign up at n8n.io
- Instant setup, managed infrastructure
- Free tier: 2,000 executions/month
Environment Configuration
Essential Environment Variables:
# Authentication
N8N_BASIC_AUTH_ACTIVE=true
N8N_BASIC_AUTH_USER=admin
N8N_BASIC_AUTH_PASSWORD=your-secure-password
# Database (PostgreSQL recommended for production)
DB_TYPE=postgresdb
DB_POSTGRESDB_HOST=localhost
DB_POSTGRESDB_DATABASE=n8n
DB_POSTGRESDB_USER=n8n
DB_POSTGRESDB_PASSWORD=your-db-password
# Encryption (generate with: openssl rand -base64 32)
N8N_ENCRYPTION_KEY=your-32-char-encryption-key
# Webhook URL (for self-hosted)
WEBHOOK_URL=https://your-domain.com/
# Timezone
TZ=UTC
# Execution Settings
EXECUTIONS_PROCESS=main
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=336 # 14 days in hours
✅ Best Practice: Use environment variables for all sensitive data. Never hardcode credentials in workflows.
Credential Management
Secure Credential Storage:
- Environment Variables - Use for API keys, secrets
  # In .env file
  OPENAI_API_KEY=sk-...
  SLACK_WEBHOOK_URL=https://hooks.slack.com/...
- n8n Credentials - Encrypted storage in database
  - Access via: Settings → Credentials
  - Supports OAuth2, API keys, basic auth
- External Vaults (Advanced)
  - HashiCorp Vault integration
  - AWS Secrets Manager
  - Azure Key Vault
Version Control & Backup
Workflow Versioning:
# Export workflows as JSON
# Via UI: Workflow → Download
# Via API:
curl -X GET \
"https://your-n8n-instance.com/api/v1/workflows" \
-H "X-N8N-API-KEY: your-api-key" \
> workflows-backup.json
Automated Backup Script:
#!/bin/bash
# backup-n8n.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="./backups"
mkdir -p $BACKUP_DIR
# Export workflows
curl -X GET \
"http://localhost:5678/api/v1/workflows" \
-H "X-N8N-API-KEY: $N8N_API_KEY" \
> "$BACKUP_DIR/workflows_$DATE.json"
# Backup database (PostgreSQL)
pg_dump -h localhost -U n8n n8n > "$BACKUP_DIR/db_$DATE.sql"
echo "Backup completed: $DATE"
🧠 Pro Tip: Set up daily automated backups. Workflows are code—treat them like code.
3. Core Concepts
Nodes, Connections, and Execution Flow
Node Types:
- Trigger Nodes - Start workflows (Webhook, Cron, Schedule)
- Action Nodes - Perform operations (HTTP Request, Database, API calls)
- Transform Nodes - Modify data (Set, Code, Merge, Split)
- Control Nodes - Control flow (IF, Switch, Wait, Error Trigger)
Execution Flow:
Trigger → Transform → Action → Transform → Action → End
Data Passing:
Each node receives data from the previous node as an array of items. Access data using:
// In Code Node
const inputData = $input.all();
const firstItem = $input.first();
const itemValue = $input.item.json.fieldName;
JSON Structure Handling
Understanding n8n Data Format:
[
{
"json": {
"id": 1,
"name": "John",
"email": "john@example.com"
},
"binary": {},
"pairedItem": null
}
]
Common Access Patterns:
// Get all items
const items = $input.all();
// Get first item's JSON
const first = $input.first().json;
// Map over items
const emails = items.map(item => item.json.email);
// Filter items
const filtered = items.filter(item => item.json.status === 'active');
Workflow Activation
Activation Methods:
- Manual Execution - Click "Execute Workflow"
- Production Mode - Toggle "Active" switch (workflow runs automatically)
- Webhook Trigger - Workflow runs when its HTTP endpoint is called
- Schedule Trigger - Cron-based activation
- API Call - Trigger via REST API
Production Activation Checklist:
- [ ] Test workflow in manual mode
- [ ] Add error handling nodes
- [ ] Set up monitoring/alerting
- [ ] Document workflow purpose
- [ ] Enable "Active" toggle
4. Node Reference
Trigger Nodes
Webhook
- Use Case: HTTP POST/GET endpoints
- Pro Tip: Use for real-time integrations
- Example Config: Method: POST, Path: /webhook/my-workflow
Cron
- Use Case: Scheduled tasks
- Pro Tip: Use cron syntax: 0 9 * * * (daily at 9 AM)
- Example Config: Expression: 0 */6 * * * (every 6 hours)
Schedule Trigger
- Use Case: Simple recurring tasks
- Pro Tip: Easier than Cron for basic schedules
- Example Config: Every: 1 hour, Starting at: 09:00
IMAP Email
- Use Case: Email-triggered workflows
- Pro Tip: Filter by subject/from to reduce noise
- Example Config: Folder: INBOX, Options: Unread only
WebSocket
- Use Case: Real-time bidirectional communication
- Pro Tip: Use for live data streams
- Example Config: URL: wss://api.example.com/stream
Polling
- Use Case: Check APIs at intervals
- Pro Tip: Set reasonable intervals to avoid rate limits
- Example Config: Interval: 5 minutes
Data Transformation Nodes
Code (JavaScript)
- Use Case: Custom logic, data mapping
- Pro Tip: Use $input.all() for batch processing
- Example: See Code Node Mastery section
Set
- Use Case: Add/modify fields
- Pro Tip: Use dot notation: user.email
- Example: Field: fullName, Value: {{$json.firstName}} {{$json.lastName}}
Merge
- Use Case: Combine data from multiple nodes
- Pro Tip: Choose merge mode: "Merge By Index" or "Merge By Key"
- Example: Mode: Merge By Key, Key: id
Split In Batches
- Use Case: Process large datasets
- Pro Tip: Set batch size based on API limits
- Example: Batch Size: 100, Options: Reset
IF
- Use Case: Conditional branching
- Pro Tip: Use expressions: {{$json.status === 'active'}}
- Example: Condition: {{$json.amount > 1000}}
Switch
- Use Case: Multi-path routing
- Pro Tip: Use multiple rules for complex routing
- Example: Rule 1: {{$json.type === 'email'}}, Rule 2: {{$json.type === 'sms'}}
Regex
- Use Case: Extract/transform text
- Pro Tip: Test regex patterns before using
- Example: Pattern: (\d{4}), Replace: ****
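Before wiring the pattern into a node, the same masking logic can be prototyped as a plain function (a sketch; in a Code node you would map it over $input.all()):

```javascript
// Mask every 4-digit run (e.g. card segments) with ****,
// matching the (\d{4}) → **** example above.
function maskDigits(text) {
  return text.replace(/\d{4}/g, '****');
}
```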
Integration Nodes
HTTP Request
- Use Case: Call any REST API
- Pro Tip: Use authentication in headers
- Example: Method: POST, URL: https://api.example.com/data, Headers: Authorization: Bearer {{$env.API_KEY}}
MySQL/Postgres
- Use Case: Database operations
- Pro Tip: Use parameterized queries to prevent SQL injection
- Example: Query: SELECT * FROM users WHERE status = :status, Parameters: {"status": "active"}
Google Sheets
- Use Case: Spreadsheet operations
- Pro Tip: Use named ranges for better performance
- Example: Operation: Append, Sheet: "Data", Range: "A1:D100"
Slack
- Use Case: Team notifications
- Pro Tip: Use blocks for rich formatting
- Example: Channel: #alerts, Message: {{$json.message}}
Telegram
- Use Case: Personal/bot notifications
- Pro Tip: Use Markdown for formatting
- Example: Chat ID: {{$env.TELEGRAM_CHAT_ID}}, Text: *Alert*: {{$json.title}}
Notion
- Use Case: Knowledge base operations
- Pro Tip: Use page IDs, not URLs
- Example: Operation: Create Page, Parent: {{$json.parentId}}
GitHub
- Use Case: Repository automation
- Pro Tip: Use personal access tokens with minimal scopes
- Example: Operation: Create Issue, Repository: owner/repo, Title: {{$json.title}}
Utility Nodes
Wait
- Use Case: Pause execution
- Pro Tip: Use for rate limiting or delays
- Example: Wait For: 5 seconds
Delay
- Use Case: Time-based delays
- Pro Tip: Use for scheduled retries
- Example: Wait For: 1 hour
Error Trigger
- Use Case: Catch and handle failures from other workflows
- Pro Tip: Use as the first node of a dedicated error workflow, then point other workflows at it via Settings → Error Workflow
- Example: See "Error Alerting System" in Real-World Examples
Execute Workflow
- Use Case: Call other workflows
- Pro Tip: Use for modular design
- Example: Workflow ID: {{$env.MAIN_WORKFLOW_ID}}
Function Item
- Use Case: Per-item processing
- Pro Tip: Use for item-level transformations
- Example: See Code Node examples
5. Code Node Mastery
Basic Patterns
Accessing Input Data:
// Get all items as array
const items = $input.all();
// Get first item
const firstItem = $input.first().json;
// Get specific field
const email = $input.item.json.email;
// Get binary data
const imageData = $input.item.binary.data;
Returning Data:
// Return single item
return {
json: {
id: 1,
name: "John",
processed: true
}
};
// Return multiple items
return items.map(item => ({
json: {
...item.json,
processed: true,
timestamp: new Date().toISOString()
}
}));
Data Transformation Examples
Mapping and Filtering:
// Filter active users
const activeUsers = $input.all()
.filter(item => item.json.status === 'active')
.map(item => ({
json: {
id: item.json.id,
email: item.json.email,
name: item.json.name
}
}));
return activeUsers;
API Response Parsing:
// Parse nested API response
const response = $input.first().json;
const transformed = {
json: {
userId: response.data.user.id,
userName: response.data.user.name,
email: response.data.user.email,
metadata: {
createdAt: response.data.user.created_at,
lastLogin: response.data.user.last_login
}
}
};
return transformed;
Batch Processing:
// Process items in chunks
const items = $input.all();
const batchSize = 10;
const batches = [];
for (let i = 0; i < items.length; i += batchSize) {
batches.push(items.slice(i, i + batchSize));
}
// Return first batch (use Split In Batches node for full solution)
return batches[0].map(item => ({
json: {
...item.json,
batchNumber: 1
}
}));
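The chunking loop above can be factored into a reusable helper (a sketch; in n8n itself, the Split In Batches node remains the preferred way to drive batched API calls):

```javascript
// Split an array into consecutive chunks of at most `size` items.
// The last chunk holds whatever remains.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```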
Reusable Functions
Helper Functions:
// Date formatting helper
function formatDate(dateString, format = 'YYYY-MM-DD') {
const date = new Date(dateString);
const year = date.getFullYear();
const month = String(date.getMonth() + 1).padStart(2, '0');
const day = String(date.getDate()).padStart(2, '0');
return format
.replace('YYYY', year)
.replace('MM', month)
.replace('DD', day);
}
// Email validation
function isValidEmail(email) {
return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}
// Use helpers
const items = $input.all();
return items.map(item => ({
json: {
...item.json,
formattedDate: formatDate(item.json.createdAt),
isValidEmail: isValidEmail(item.json.email)
}
}));
Error Handling Patterns
Try-Catch in Code Node:
const items = $input.all();
const results = [];
for (const item of items) {
try {
// Process item
const processed = {
json: {
...item.json,
processed: true,
error: null
}
};
results.push(processed);
} catch (error) {
// Handle error gracefully
results.push({
json: {
...item.json,
processed: false,
error: error.message
}
});
}
}
return results;
Validation Pattern:
const item = $input.first().json;
// Validate required fields
const required = ['email', 'name', 'phone'];
const missing = required.filter(field => !item[field]);
if (missing.length > 0) {
throw new Error(`Missing required fields: ${missing.join(', ')}`);
}
// Validate format
if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(item.email)) {
throw new Error('Invalid email format');
}
return { json: item };
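The checks above can be generalized into a helper that collects every problem instead of throwing on the first one, so a single run reports all failures (a sketch, not an official n8n utility):

```javascript
// Collect all validation failures for an item rather than
// throwing on the first; an empty array means the item is valid.
function validateItem(item, requiredFields) {
  const errors = [];
  for (const field of requiredFields) {
    if (!item[field]) errors.push(`Missing required field: ${field}`);
  }
  if (item.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(item.email)) {
    errors.push('Invalid email format');
  }
  return errors;
}
```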
Dynamic Variable Usage
Environment Variables:
// Access environment variables
const apiKey = $env.OPENAI_API_KEY;
const webhookUrl = $env.SLACK_WEBHOOK_URL;
// Use in API calls
return {
json: {
apiKey: apiKey, // Don't expose in production!
webhookUrl: webhookUrl
}
};
Node Output References:
// Reference previous node outputs
const webhookData = $('Webhook').first().json;
const dbResult = $('Postgres').all();
// Combine data
return {
json: {
webhookId: webhookData.id,
dbRecords: dbResult.length,
combined: {
...webhookData,
records: dbResult
}
}
};
6. Workflow Design Patterns (Expert Section)
Parallel Execution Strategy
Running Multiple Operations Simultaneously:
Webhook Trigger
├─→ HTTP Request (API 1)
├─→ HTTP Request (API 2)
└─→ Database Query
↓
Merge Node (Wait for all)
↓
Process Combined Results
✅ Best Practice: Use Merge node with "Wait for all inputs" to synchronize parallel operations.
Example: Parallel API Calls
// In Merge node, combine results
const api1Data = $input.all()[0].json; // First input
const api2Data = $input.all()[1].json; // Second input
const dbData = $input.all()[2].json; // Third input
return {
json: {
api1: api1Data,
api2: api2Data,
database: dbData,
timestamp: new Date().toISOString()
}
};
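An alternative to parallel branches is collapsing the calls into one Code node with Promise.all; a sketch where the loader functions stand in for real fetch() calls (e.g. () => fetch(url).then(r => r.json())):

```javascript
// Run several async operations concurrently and combine the
// results. Promise.all rejects if any loader rejects, which
// surfaces the failure to the workflow.
async function fetchAll(loaders) {
  const results = await Promise.all(loaders.map(load => load()));
  return {
    results,
    timestamp: new Date().toISOString()
  };
}
```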
Error-Resilient Automation
Try/Catch Pattern:
HTTP Request (Settings → On Error: Continue)
↓
IF Node (check whether the item carries an error field)
├─→ Success Path → Process Data
└─→ Error Path → Log Error → Send Alert → Fallback Action
Note: the Error Trigger node runs in a separate error workflow; link it via the calling workflow's Settings → Error Workflow.
Retry Mechanism:
// Retry logic in Code Node
async function retryOperation(operation, maxRetries = 3) {
for (let i = 0; i < maxRetries; i++) {
try {
return await operation();
} catch (error) {
if (i === maxRetries - 1) throw error;
await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)));
}
}
}
// Use in workflow (fetch only rejects on network errors,
// so treat non-2xx responses as failures too)
const result = await retryOperation(async () => {
  const res = await fetch('https://api.example.com/data');
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
});
Deduplication Pattern:
// Track processed items
const processedIds = new Set();
const items = $input.all();
const uniqueItems = items.filter(item => {
const id = item.json.id;
if (processedIds.has(id)) {
return false; // Duplicate
}
processedIds.add(id);
return true;
});
return uniqueItems.map(item => ({ json: item.json }));
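When uniqueness depends on more than one field, the same Set technique works with a derived key (a sketch):

```javascript
// Deduplicate items by an arbitrary key function, keeping the
// first occurrence of each key.
function dedupeBy(items, keyFn) {
  const seen = new Set();
  return items.filter(item => {
    const key = keyFn(item);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```

For example, keying on `item => `${item.json.email}:${item.json.date}`` deduplicates on the email/date pair rather than a single id.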
Event-Driven Architecture
Webhook-First Design:
External Service → Webhook → n8n → Process → Trigger Actions
↓
Multiple Workflows
Webhook Security:
// Verify webhook signature (in a Code Node; allow the crypto
// built-in via NODE_FUNCTION_ALLOW_BUILTIN=crypto)
const crypto = require('crypto');
const secret = $env.WEBHOOK_SECRET;
// The Webhook node nests the request under json: headers, body, query
const signature = $input.first().json.headers['x-signature'];
// Most providers sign the raw request body - enable "Raw Body" on
// the Webhook node so the bytes match what was signed
const payload = JSON.stringify($input.first().json.body);
const expectedSignature = crypto
  .createHmac('sha256', secret)
  .update(payload)
  .digest('hex');
if (signature !== expectedSignature) {
  throw new Error('Invalid webhook signature');
}
return $input.all();
Modular Workflow Design
Using Execute Workflow Node:
Main Workflow
├─→ Execute Workflow: Data Processing
├─→ Execute Workflow: Notification
└─→ Execute Workflow: Reporting
Benefits:
- Reusability across workflows
- Easier testing and debugging
- Better organization
- Independent versioning
Passing Data Between Workflows:
// In calling workflow
{
"workflowId": "{{$env.PROCESSING_WORKFLOW_ID}}",
"data": {
"items": $input.all(),
"config": {
"mode": "production",
"notify": true
}
}
}
AI-Powered Automations
Integrating OpenAI:
// OpenAI API call pattern
const items = $input.all();
const results = [];
for (const item of items) {
const response = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: {
'Authorization': `Bearer ${$env.OPENAI_API_KEY}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
model: 'gpt-4',
messages: [
{
role: 'system',
content: 'You are a helpful assistant that summarizes data.'
},
{
role: 'user',
content: `Summarize this: ${JSON.stringify(item.json)}`
}
],
temperature: 0.7
})
});
const data = await response.json();
results.push({
json: {
...item.json,
summary: data.choices[0].message.content
}
});
}
return results;
AI-Driven Decision Making:
// Use AI to classify or route items. classifyItem is a placeholder
// wrapping the OpenAI call pattern shown above.
const item = $input.first().json;
const classification = await classifyItem(item);
if (classification.category === 'urgent') {
  // Route to urgent handling workflow
} else if (classification.category === 'normal') {
  // Route to standard processing
}
7. Debugging & Optimization
Common Mistakes and Fixes
Missing error handling
- Symptom: Workflow fails silently
- Fix: Add Error Trigger nodes
Not handling empty arrays
- Symptom: "Cannot read property of undefined"
- Fix: Check $input.all().length > 0
Rate limit issues
- Symptom: API calls fail intermittently
- Fix: Add Wait nodes, implement retry logic
Memory leaks
- Symptom: Workflow slows over time
- Fix: Use Split In Batches, limit data retention
Hardcoded values
- Symptom: Workflow breaks when data changes
- Fix: Use expressions: {{$json.field}}
Infinite loops
- Symptom: Workflow runs forever
- Fix: Add execution time limits, break conditions
Performance Tuning
Batch Size Optimization:
// Optimal batch size depends on:
// 1. API rate limits
// 2. Memory constraints
// 3. Processing time
// For most APIs: 50-100 items per batch
// For heavy processing: 10-20 items per batch
// For simple operations: 200-500 items per batch
Lazy Loading Pattern:
// Only fetch what you need
const items = $input.all();
const ids = items.map(item => item.json.id);
// Fetch details only for active items
const activeItems = items.filter(item => item.json.status === 'active');
// Process only active items
Data Trimming:
// Remove unnecessary fields to reduce memory
const items = $input.all();
return items.map(item => ({
json: {
id: item.json.id,
email: item.json.email,
// Only keep essential fields
}
}));
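Field trimming can be generalized with a small pick helper so the keep-list lives in one place (a sketch):

```javascript
// Keep only the listed fields from an object; everything else
// is dropped to reduce the memory each item carries.
function pick(obj, fields) {
  const out = {};
  for (const field of fields) {
    if (field in obj) out[field] = obj[field];
  }
  return out;
}
```

In a Code node this becomes `items.map(item => ({ json: pick(item.json, ['id', 'email']) }))`.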
Monitoring and Logging
Execution Logging:
// Log to console (visible in n8n execution log)
console.log('Processing item:', item.json.id);
console.error('Error occurred:', error.message);
// Log to external service
await fetch($env.LOGGING_WEBHOOK, {
method: 'POST',
body: JSON.stringify({
level: 'info',
message: 'Workflow executed',
data: item.json,
timestamp: new Date().toISOString()
})
});
Performance Monitoring:
// Track execution time
const startTime = Date.now();
// ... your processing ...
const executionTime = Date.now() - startTime;
return {
json: {
...item.json,
_metadata: {
executionTimeMs: executionTime,
timestamp: new Date().toISOString()
}
}
};
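The timing boilerplate can be written once as a wrapper and reused around any processing function (a sketch):

```javascript
// Wrap a synchronous processing function and attach execution
// metadata to whatever it returns.
function withTiming(processFn) {
  return (input) => {
    const startTime = Date.now();
    const result = processFn(input);
    return {
      ...result,
      _metadata: {
        executionTimeMs: Date.now() - startTime,
        timestamp: new Date().toISOString()
      }
    };
  };
}
```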
Alerting on Failures:
Error Trigger
↓
Code Node (Format Error)
↓
Slack/Telegram/Email Notification
// Format error for notification. The Error Trigger payload nests
// details under execution and workflow.
const errorData = $input.first().json;
return {
  json: {
    alert: true,
    severity: 'high',
    message: `Workflow failed: ${errorData.execution?.error?.message}`,
    workflow: errorData.workflow?.name || 'Unknown',
    node: errorData.execution?.lastNodeExecuted || 'Unknown',
    timestamp: new Date().toISOString(),
    details: errorData
  }
};
8. Real-World Examples
Example 1: Google Sheets → Airtable → Slack Pipeline
Use Case: Sync data from Google Sheets to Airtable and notify team in Slack.
Schedule Trigger (Daily at 9 AM)
↓
Google Sheets (Read Data)
↓
Code Node (Transform Data)
↓
Airtable (Create/Update Records)
↓
IF Node (Check for new records)
├─→ Yes → Slack (Send notification)
└─→ No → End
Code Node Transformation:
const sheetsData = $input.all();
return sheetsData.map(item => ({
json: {
fields: {
'Name': item.json.name,
'Email': item.json.email,
'Status': item.json.status,
'Last Updated': new Date().toISOString()
}
}
}));
Example 2: Daily Report Generation & Email
Use Case: Generate daily report from multiple data sources and email to stakeholders.
Cron Trigger (Every day at 8 AM)
↓
HTTP Request (Fetch API Data 1)
HTTP Request (Fetch API Data 2)
Database Query (Fetch DB Data)
↓
Merge Node (Combine all data)
↓
Code Node (Generate Report)
↓
Gmail (Send Email with Report)
Report Generation Code:
const api1Data = $input.all()[0].json;
const api2Data = $input.all()[1].json;
const dbData = $input.all()[2].json;
const report = {
date: new Date().toLocaleDateString(),
summary: {
totalUsers: dbData.users.length,
activeSubscriptions: api1Data.subscriptions.active,
revenue: api2Data.revenue.total
},
details: {
newUsers: dbData.users.filter(u => u.createdToday),
topProducts: api1Data.products.slice(0, 5)
}
};
return {
json: {
to: 'team@example.com',
subject: `Daily Report - ${report.date}`,
htmlBody: `
<h1>Daily Report</h1>
<h2>Summary</h2>
<ul>
<li>Total Users: ${report.summary.totalUsers}</li>
<li>Active Subscriptions: ${report.summary.activeSubscriptions}</li>
<li>Revenue: $${report.summary.revenue}</li>
</ul>
<h2>Details</h2>
<pre>${JSON.stringify(report.details, null, 2)}</pre>
`
}
};
Example 3: Automated Social Media Posting
Use Case: Post to Twitter/X and LinkedIn from a content database.
Schedule Trigger (Every 6 hours)
↓
Database Query (Get next post)
↓
Code Node (Format for each platform)
↓
HTTP Request (Post to Twitter)
HTTP Request (Post to LinkedIn)
↓
Database Query (Mark as posted)
Platform-Specific Formatting:
const post = $input.first().json;
// Twitter format (280 chars)
const twitterPost = {
text: post.content.substring(0, 280),
media_ids: post.imageId ? [post.imageId] : undefined
};
// LinkedIn format
const linkedInPost = {
author: `urn:li:person:${$env.LINKEDIN_PERSON_URN}`,
lifecycleState: 'PUBLISHED',
specificContent: {
'com.linkedin.ugc.ShareContent': {
shareCommentary: {
text: post.content
},
shareMediaCategory: 'ARTICLE',
media: post.imageUrl ? [{
status: 'READY',
media: post.imageUrl
}] : undefined
}
},
visibility: {
'com.linkedin.ugc.MemberNetworkVisibility': 'PUBLIC'
}
};
return [
{ json: { platform: 'twitter', data: twitterPost } },
{ json: { platform: 'linkedin', data: linkedInPost } }
];
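Note that `substring(0, 280)` can cut a word (or a URL) in half; a truncation helper that breaks on the last space and appends an ellipsis is safer (a sketch — exact limits and character-counting rules vary by platform):

```javascript
// Truncate text to at most maxLen characters without splitting
// the final word, appending an ellipsis when anything was removed.
function truncateForPost(text, maxLen = 280) {
  if (text.length <= maxLen) return text;
  const cut = text.slice(0, maxLen - 1); // reserve room for the ellipsis
  const lastSpace = cut.lastIndexOf(' ');
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut) + '…';
}
```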
Example 4: Auto-Translation Workflow
Use Case: Translate content from DeepL and sync to Notion.
Webhook Trigger (Receive content)
↓
DeepL (Translate)
↓
Notion (Create/Update Page)
↓
Webhook Response (Return translated content)
Translation Code:
const content = $input.first().json;
// DeepL API call (via HTTP Request node)
// Then process response:
const translated = {
json: {
original: content.text,
translated: $('DeepL').first().json.translations[0].text,
sourceLang: content.sourceLang || 'EN',
targetLang: content.targetLang || 'ES',
timestamp: new Date().toISOString()
}
};
return translated;
Example 5: Error Alerting System
Use Case: Monitor workflows and send alerts via Telegram and email.
Error Trigger (From any workflow)
↓
Code Node (Format Alert)
↓
Telegram (Send Alert)
Gmail (Send Email Alert)
↓
Database (Log Alert)
Alert Formatting:
const errorData = $input.first().json;
// Error Trigger payload nests details under execution and workflow
const wfName = errorData.workflow?.name || 'Unknown';
const nodeName = errorData.execution?.lastNodeExecuted || 'Unknown';
const errMsg = errorData.execution?.error?.message || 'Unknown';
const alert = {
  json: {
    telegram: {
      chatId: $env.TELEGRAM_CHAT_ID,
      text: `🚨 *Workflow Error*\n\n` +
        `*Workflow:* ${wfName}\n` +
        `*Node:* ${nodeName}\n` +
        `*Error:* ${errMsg}\n` +
        `*Time:* ${new Date().toLocaleString()}`
    },
    email: {
      to: 'alerts@example.com',
      subject: `Workflow Error: ${wfName}`,
      htmlBody: `
        <h2>Workflow Error Alert</h2>
        <p><strong>Workflow:</strong> ${wfName}</p>
        <p><strong>Node:</strong> ${nodeName}</p>
        <p><strong>Error:</strong> ${errMsg}</p>
        <pre>${JSON.stringify(errorData, null, 2)}</pre>
      `
    },
    log: {
      workflowId: errorData.workflow?.id,
      node: nodeName,
      error: errMsg,
      timestamp: new Date().toISOString(),
      resolved: false
    }
  }
};
return alert;
9. Security, Scalability & Maintenance
Webhook Security
Signature Verification:
// Verify webhook signature (GitHub-style "sha256=" prefix)
const crypto = require('crypto');
const secret = $env.WEBHOOK_SECRET;
// Webhook node output nests the request under json
const signature = $input.first().json.headers['x-signature'];
const payload = JSON.stringify($input.first().json.body);
const expectedSignature = crypto
  .createHmac('sha256', secret)
  .update(payload)
  .digest('hex');
if (signature !== `sha256=${expectedSignature}`) {
  throw new Error('Invalid webhook signature');
}
IP Whitelisting:
// Check allowed IPs (headers are under json in Webhook node output)
const allowedIPs = $env.ALLOWED_IPS.split(',');
const headers = $input.first().json.headers;
// x-forwarded-for may hold a chain; the first entry is the client
const clientIP = (headers['x-forwarded-for'] || '').split(',')[0].trim() ||
  headers['x-real-ip'];
if (!allowedIPs.includes(clientIP)) {
  throw new Error('IP not allowed');
}
Rate Limiting:
// Simple rate limiting (use Redis for production)
const rateLimitKey = `ratelimit:${clientIP}`;
// Check against stored count
// Increment on each request
// Block if exceeds limit
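The comment sketch above, made concrete as an in-memory fixed-window counter (per-instance only — it resets on restart, so back it with Redis when running multiple n8n instances):

```javascript
// Fixed-window rate limiter: allow at most `limit` requests per
// `windowMs` per key (e.g. client IP). The `now` parameter exists
// so the logic is testable; it defaults to the current time.
function createRateLimiter(limit, windowMs) {
  const hits = new Map(); // key -> { count, windowStart }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= limit;
  };
}
```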
Scaling n8n
Docker Compose for High Availability:
version: '3.8'
# Horizontal scaling requires queue mode plus worker containers
# running `n8n worker`
services:
  n8n:
    image: n8nio/n8n
    deploy:
      replicas: 3
    environment:
      - EXECUTIONS_MODE=queue
      - QUEUE_BULL_REDIS_HOST=redis
    depends_on:
      - postgres
      - redis
  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data
Kubernetes Deployment:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: n8n
spec:
  replicas: 3
  selector:
    matchLabels:
      app: n8n
  template:
    metadata:
      labels:
        app: n8n
    spec:
      containers:
        - name: n8n
          image: n8nio/n8n
          env:
            - name: DB_TYPE
              value: postgresdb
            - name: DB_POSTGRESDB_HOST
              value: postgres-service
Database Maintenance
Backup Strategy:
# Daily PostgreSQL backup
0 2 * * * pg_dump -h localhost -U n8n n8n | gzip > /backups/n8n_$(date +\%Y\%m\%d).sql.gz
# Retention: Keep 30 days
find /backups -name "n8n_*.sql.gz" -mtime +30 -delete
Data Pruning:
# Environment variables for automatic pruning
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=336 # 14 days
EXECUTIONS_DATA_PRUNE_MAX_COUNT=10000
High-Throughput Optimizations
Batch Processing:
// Process in optimal batch sizes
const BATCH_SIZE = 100; // Adjust based on API limits
const items = $input.all();
for (let i = 0; i < items.length; i += BATCH_SIZE) {
const batch = items.slice(i, i + BATCH_SIZE);
// Process batch
// Use Split In Batches node for automatic handling
}
Connection Pooling:
// Reuse HTTP connections with a keep-alive agent
const https = require('https');
const agent = new https.Agent({
  keepAlive: true,
  maxSockets: 50
});
// Pass the agent to https.request or axios ({ httpsAgent: agent }).
// Node's built-in fetch ignores `agent`; it pools connections
// automatically via undici.
10. Power Shortcuts & Productivity Hacks
n8n Hotkeys
- Save workflow: Cmd/Ctrl + S
- Execute workflow: Cmd/Ctrl + Enter
- Add node: Space (when node selected)
- Delete node: Delete or Backspace
- Undo: Cmd/Ctrl + Z
- Redo: Cmd/Ctrl + Shift + Z
- Zoom in: Cmd/Ctrl + +
- Zoom out: Cmd/Ctrl + -
- Reset zoom: Cmd/Ctrl + 0
Workflow Templates
Create Reusable Templates:
- Build workflow with placeholder values
- Export as JSON
- Store in version control
- Import and customize for new use cases
Template Structure:
{
"name": "Template: Data Sync",
"nodes": [
{
"name": "Source",
"type": "n8n-nodes-base.httpRequest",
"parameters": {
"url": "{{$env.SOURCE_API_URL}}"
}
}
]
}
Community Nodes
Built-in vs Community Nodes:
- Redis, AWS, and Google Cloud nodes ship with n8n core (the n8n-nodes-base package) - no extra install needed
- Community nodes are separate npm packages named n8n-nodes-*, published by the community
Installation:
# Via UI (recommended): Settings → Community Nodes → Install
# Or manually, then restart n8n:
npm install n8n-nodes-<package-name>
VS Code Integration
Workflow Development Workflow:
- Export workflow as JSON
- Edit in VS Code with JSON schema validation
- Import back to n8n
- Use Git for version control
Recommended Extensions:
- JSON Tools
- Prettier
- GitLens
Productivity Tips
🧠 Pro Tip: Use workflow tags and naming conventions for easy discovery.
Naming Conventions:
[Category] - [Purpose] - [Environment]
Examples:
- "Marketing - Email Campaign - Production"
- "Data - Sync Sheets to DB - Staging"
- "Monitoring - Error Alerts - Production"
Workflow Organization:
- Group by department/team
- Use tags: production, staging, deprecated
- Document purpose in workflow notes
- Set up workflow folders
11. Bonus: Advanced Automation Frameworks
Integrating with LangChain
n8n + LangChain Pattern:
// Call LangChain API from n8n
const response = await fetch('http://localhost:8000/chain/invoke', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
input: {
question: $input.first().json.question,
context: $input.first().json.context
}
})
});
const result = await response.json();
return {
json: {
answer: result.output,
sources: result.sources
}
};
Python Script Integration
Calling Python Scripts:
// Execute Python script via HTTP Request
const pythonResponse = await fetch('http://localhost:5000/process', {
method: 'POST',
body: JSON.stringify($input.first().json)
});
return { json: await pythonResponse.json() };
Python Script Example:
# process.py
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/process', methods=['POST'])
def process():
    data = request.json
    # Your processing logic
    result = {
        'processed': True,
        'data': data
    }
    return jsonify(result)

if __name__ == '__main__':
    app.run(port=5000)
GitHub Actions Integration
Trigger n8n from GitHub Actions:
# .github/workflows/trigger-n8n.yml
name: Trigger n8n Workflow
on:
  push:
    branches: [main]
jobs:
  trigger:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger n8n
        run: |
          curl -X POST \
            "${{ secrets.N8N_WEBHOOK_URL }}" \
            -H "Content-Type: application/json" \
            -d '{
              "event": "push",
              "branch": "${{ github.ref }}",
              "commit": "${{ github.sha }}"
            }'
Firebase Functions Integration
n8n → Firebase Functions:
// Firebase Function calling n8n
const functions = require('firebase-functions');
const axios = require('axios');
exports.triggerN8n = functions.https.onRequest(async (req, res) => {
await axios.post(process.env.N8N_WEBHOOK_URL, {
data: req.body,
source: 'firebase'
});
res.json({ success: true });
});
CLI Integration
n8n CLI Commands:
# Install n8n CLI
npm install -g n8n
# Execute workflow via CLI
n8n execute --id=WORKFLOW_ID
# Export workflow
n8n export:workflow --id=WORKFLOW_ID --output=workflow.json
# Import workflow
n8n import:workflow --input=workflow.json
AI-Driven Automation Orchestration
Pattern: AI Decision Maker → n8n Executor
External Event
↓
AI Classifier (OpenAI/Claude)
↓
Route to Appropriate n8n Workflow
├─→ Workflow A (High Priority)
├─→ Workflow B (Normal)
└─→ Workflow C (Low Priority)
AI Classification Code:
const item = $input.first().json;
const classification = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: {
'Authorization': `Bearer ${$env.OPENAI_API_KEY}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
model: 'gpt-4',
messages: [{
role: 'system',
content: 'Classify items into: urgent, normal, low-priority'
}, {
role: 'user',
content: JSON.stringify(item)
}]
})
}).then(r => r.json());
const priority = classification.choices[0].message.content.toLowerCase();
return {
json: {
...item,
priority: priority,
workflowId: priority === 'urgent'
? $env.URGENT_WORKFLOW_ID
: $env.NORMAL_WORKFLOW_ID
}
};
Quick Reference Card
Essential Expressions
// Access data
{{$json.fieldName}}
{{$json.nested.field}}
{{$input.first().json.id}}
{{$input.all()}}
// Environment variables
{{$env.API_KEY}}
// Node outputs
{{$('Node Name').first().json.field}}
{{$('Node Name').all()}}
// Functions
{{$now}} // Current timestamp
{{$today}} // Today's date
{{$max(1, 2, 3)}} // Maximum value
{{$min(1, 2, 3)}} // Minimum value
Common Code Patterns
// Filter items
$input.all().filter(item => item.json.status === 'active')
// Map items
$input.all().map(item => ({ json: { id: item.json.id } }))
// Reduce items
$input.all().reduce((sum, item) => sum + item.json.amount, 0)
// Find item
$input.all().find(item => item.json.id === targetId)
// Group by
const grouped = {};
$input.all().forEach(item => {
const key = item.json.category;
if (!grouped[key]) grouped[key] = [];
grouped[key].push(item.json);
});
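The group-by loop above can be packaged as a reusable helper (a sketch):

```javascript
// Group an array of objects by the value returned from keyFn,
// e.g. groupBy(items, item => item.json.category) in a Code node.
function groupBy(items, keyFn) {
  const grouped = {};
  for (const item of items) {
    const key = keyFn(item);
    (grouped[key] = grouped[key] || []).push(item);
  }
  return grouped;
}
```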
Error Handling Checklist
- [ ] Add Error Trigger nodes after critical operations
- [ ] Implement retry logic for transient failures
- [ ] Set up alerting for persistent errors
- [ ] Log errors with context
- [ ] Provide fallback actions
- [ ] Test error scenarios
Performance Checklist
- [ ] Use Split In Batches for large datasets
- [ ] Implement rate limiting
- [ ] Trim unnecessary data fields
- [ ] Use parallel execution where possible
- [ ] Monitor execution times
- [ ] Set up data retention policies
Conclusion
This cheatsheet represents decades of automation engineering wisdom distilled into actionable patterns and practices. Use it as your daily reference while building workflows, and remember:
The best automation is invisible - it works so reliably that you forget it exists.
Key Takeaways:
- Start simple, iterate complex - Build MVP workflows first
- Design for failure - Errors will happen, plan for them
- Monitor everything - You can't optimize what you don't measure
- Document as you build - Future you will thank present you
- Reuse and modularize - Don't repeat yourself
Next Steps:
- Set up your n8n instance with proper security
- Build your first workflow using the patterns above
- Join the n8n community for support
- Contribute your own patterns and share knowledge
Happy Automating! 🚀
This cheatsheet is a living document. Update it as you discover new patterns and best practices.
