Beginner's Guide to Workflow Automation with OpenAI, LangChain & API Integrations
How it works
This beginner-friendly workflow demonstrates the core building blocks of n8n. It guides you through:
- Triggers – Start workflows manually, on a schedule, via webhooks, or through chat.
- Data processing – Use Set and Code nodes to create, transform, and enrich data.
- Logic and branching – Apply conditions with IF nodes and merge different branches back together.
- API integrations – Fetch external data (e.g., users from an API), split arrays into individual items, and extract useful fields.
- AI-powered steps – Connect to OpenAI for generating fun facts or build interactive assistants with chat triggers, memory, and tools.
- Responses – Return structured results via webhooks or summary nodes.
By the end, it demonstrates a full flow: creating data → transforming it → making decisions → calling APIs → using AI → responding with outputs.
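The "Data processing" step above is the one beginners most often ask about, so here is a minimal sketch of what an n8n Code node ("Run Once for All Items" mode) might look like when enriching data. The input field names (firstName, lastName) are illustrative assumptions, not fields from this template:

```javascript
// n8n Code node ("Run Once for All Items"): enrich every incoming item.
// firstName/lastName are hypothetical input fields for illustration only.
const items = $input.all();

return items.map((item) => ({
  json: {
    ...item.json,
    // Derive a new field from existing ones.
    fullName: `${item.json.firstName ?? ""} ${item.json.lastName ?? ""}`.trim(),
    processedAt: new Date().toISOString(), // timestamp the transformation
  },
}));
```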
Set up steps
Time required: 5–10 minutes.
What you need:
- An n8n instance (cloud or self-hosted).
- Optional: API credentials (e.g., OpenAI) if you want to test AI features.
Setup flow:
- Import this workflow.
- Add your API keys where needed (OpenAI, etc.).
- Trigger the workflow manually or test with webhooks.
> 👉 Detailed node explanations and examples are already included as sticky notes inside the workflow itself, so you can learn step by step as you explore.
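If you want to exercise the webhook path from outside the n8n editor, a call like the following works from any modern JavaScript runtime. The URL is a placeholder assumption; copy the real Test URL from your Webhook node:

```javascript
// Hypothetical webhook test; replace the URL with your Webhook node's Test URL.
const res = await fetch("https://your-n8n.example.com/webhook-test/demo", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ hello: "world" }), // any sample payload
});
console.log(res.status, await res.text()); // whatever the workflow responds with
```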
n8n AI Agent Workflow
This n8n workflow demonstrates a basic setup for an AI Agent using LangChain components, triggered by a chat message or manually, and capable of executing code.
What it does
This workflow orchestrates an AI Agent to respond to chat messages or manual triggers, utilizing a chat model, memory, and a code execution tool.
- Listens for Chat Messages or Manual Trigger: The workflow starts either when a chat message is received through the Chat Trigger node or when manually executed via the Manual Trigger node.
- Initializes AI Agent: The AI Agent node is the core of the workflow, configured to use a chat model, simple memory, and a code tool.
- Configures AI Chat Model: The OpenAI Chat Model node defines the large language model (LLM) that the AI Agent uses for processing and generating responses.
- Manages Conversation Memory: The Simple Memory node provides a buffer that stores and recalls past conversational turns, enabling the AI Agent to maintain context.
- Provides Code Execution Capability: The Code Tool node allows the AI Agent to execute custom JavaScript code, extending its capabilities beyond simple text generation (a sketch of such a tool follows this list).
- Prepares Response: The Edit Fields (Set) node prepares the AI Agent's response for output.
- Responds to Webhook: The Respond to Webhook node sends the final processed response back to the originating chat platform or webhook caller.
- Unconnected Nodes (Optional): The Split Out, HTTP Request, If, Merge, Code, and Schedule Trigger nodes are present in the provided JSON but not connected, suggesting potential future steps for splitting structured output, making external API calls, conditional routing based on AI output, merging data streams, custom logic execution, and scheduled triggering, respectively.
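As a concrete illustration of the Code Tool bullet above, here is a minimal sketch of the JavaScript such a tool might contain, in this case a small calculator the agent can call. Treat the `query` input variable as an assumption to verify against your n8n version's Code Tool documentation:

```javascript
// Sketch of a Code Tool body: evaluate a simple arithmetic expression passed
// in by the agent. `query` as the agent-supplied input is an assumption.
const expression = String(query).replace(/[^0-9+\-*/(). ]/g, ""); // drop unsafe chars
try {
  // Function constructor keeps scope clean; still only safe for trusted input.
  const result = Function(`"use strict"; return (${expression});`)();
  return `Result: ${result}`;
} catch (err) {
  return `Could not evaluate "${query}": ${err.message}`;
}
```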
Prerequisites/Requirements
- n8n Instance: A running n8n instance.
- OpenAI API Key: Required for the OpenAI Chat Model node to function; configure it as an n8n credential.
- Chat Platform Integration: If using the Chat Trigger, you'll need to configure the specific chat platform (e.g., Slack, Telegram) that will send messages to n8n.
- Basic JavaScript Knowledge: For customizing the Code Tool functionality.
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Credentials: Set up an OpenAI API credential for the OpenAI Chat Model node.
- Activate Chat Trigger: If using the Chat Trigger, ensure it's configured to listen for messages from your desired chat platform.
- Customize AI Agent:
  - Adjust the OpenAI Chat Model settings (e.g., model, temperature) as needed (a hedged example follows these steps).
  - Modify the Code Tool to define specific functions the AI Agent can execute.
- Activate the Workflow: Once configured, activate the workflow to start processing chat messages or allow manual execution.
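For the "Customize AI Agent" step, the values below are the kind of settings you would adjust on the OpenAI Chat Model node. This is a sketch only; the exact option names depend on your n8n version, and the model name is an assumption:

```javascript
// Illustrative values for the OpenAI Chat Model node's settings; option
// names and the model are assumptions to adapt to your n8n version.
const chatModelSettings = {
  model: "gpt-4o-mini", // any chat-capable OpenAI model
  temperature: 0.7,     // lower values give more deterministic replies
  maxTokens: 512,       // caps the length of each generated response
};
```

A lower temperature tends to make tool-calling agents more predictable, which is usually what you want when the agent drives a Code Tool.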
Related Templates
Create verified user profiles with email validation, PDF generation & Gmail delivery
Verified User Profile Creation - Automated Email Validation & PDF Generation

Overview
This automation workflow streamlines user onboarding by validating email addresses, generating professional profile PDFs, and delivering them to verified users.

🎯 What This Workflow Does:
- Receives User Data - Webhook trigger accepts user signup information (name, email, city, profession, bio); a hedged test call appears after this template description.
- Validates Email Addresses - Uses the VerifiEmail API to ensure only legitimate email addresses proceed.
- Conditional Branching - Smart logic splits the workflow based on email verification results.
- Generates HTML Profile - Creates beautifully styled HTML templates with user information.
- Converts to PDF - Transforms the HTML into a professional, downloadable PDF document.
- Email Delivery - Sends personalized welcome emails with PDF attachments to verified users.
- Data Logging - Records all verified users in Google Sheets for analytics and tracking.
- Rejection Handling - Notifies users with invalid emails and provides guidance.

✨ Key Features:
- ✅ Email Verification - Prevents fake registrations and maintains data quality
- 📄 Professional PDF Generation - Beautiful, branded profile documents
- 📧 Automated Email Delivery - Personalized welcome messages with attachments
- 📊 Google Sheets Logging - Complete audit trail of all verified users
- 🔀 Smart Branching - Separate paths for valid and invalid emails
- 🎨 Modern Design - Clean, responsive HTML/CSS templates
- 🔒 Secure Webhook - POST endpoint for seamless form integration

🎯 Perfect Use Cases: user registration systems, community membership verification, professional certification programs, event registration with verified attendees, customer onboarding, newsletter signup verification, educational platform enrollments, and membership card generation.

📦 What's Included:
- Complete workflow with 12 informative sticky notes
- Pre-configured webhook endpoint
- Email verification integration
- PDF generation setup
- Gmail sending configuration
- Google Sheets logging
- Error handling guidelines
- Rejection email template

🛠️ Required Integrations:
- VerifiEmail - for email validation (https://verifi.email)
- HTMLcsstoPDF - for PDF generation (https://htmlcsstopdf.com)
- Gmail OAuth2 - for email delivery
- Google Sheets OAuth2 - for data logging

⚡ Quick Setup Time: 15-20 minutes
🎓 Skill Level: Beginner to Intermediate

Benefits:
- ✅ Eliminates manual verification work
- ✅ Prevents spam and fake registrations
- ✅ Delivers professional branded documents automatically
- ✅ Maintains a complete audit trail
- ✅ Scales effortlessly with user growth
- ✅ Provides an excellent user experience
- ✅ Integrates easily with any form or application

Technical Details:
- Trigger Type: Webhook (POST)
- Total Nodes: 11, plus 12 documentation sticky notes
- Execution Time: ~3-5 seconds per user
- API Calls: 3 external (VerifiEmail, HTMLcsstoPDF, Google Sheets)
- Email Format: HTML with binary PDF attachment
- Data Storage: Google Sheets (optional)

License: MIT (free to use and modify)

🎁 Bonus Features: a mobile-responsive email template, professional PDF styling with a modern design, easy branding customization, a ready-to-use webhook endpoint, and included error-handling guidelines.

Perfect for: developers, no-code enthusiasts, business owners, SaaS platforms, community managers, and event organizers. Start automating your user verification process today! 🚀
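Because this template's trigger is a POST webhook with the fields listed above, a quick way to test it end to end is a call like the following from any modern JavaScript runtime. The URL is a placeholder assumption; the field names match those in the template description:

```javascript
// Hypothetical test call; replace the URL with your n8n Webhook node's URL.
const response = await fetch("https://your-n8n.example.com/webhook/user-signup", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    name: "Ada Lovelace",
    email: "ada@example.com",
    city: "London",
    profession: "Engineer",
    bio: "Short bio for the profile PDF.",
  }),
});
console.log(response.status, await response.text()); // verified or rejected path
```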
Automate financial transaction tracking with Gmail, GPT, Notion & Telegram alerts
📩 Automatically Log Transactions from Gmail into Notion and Get Telegram Alerts

Who's it for
This workflow is for individuals or entrepreneurs who receive bank alerts, invoices, and payment emails in Gmail and want them automatically organized in Notion, while also receiving quick Telegram notifications for each transaction. If you manage personal or business finances and find it tedious to manually record every debit, credit, or invoice, this automation does it all for you.

How it works
The workflow acts as an AI-powered Accountant Agent that reads incoming Gmail messages and decides whether each email represents a Debit Transaction, Credit Transaction, Debit Invoice, or Credit Invoice.
- The Gmail Trigger watches your selected inboxes (like forwarding@bayesian-labs.com, support@bayesian-labs, anoop.karnik@bayesian-labs).
- The Classifier (GPT-5-nano) determines the correct transaction type.
- The appropriate Agent (GPT-5) then extracts amount, currency, and description details (a sketch of such a record follows this template description).
- The Agent uses Notion API tools to log structured data into your Personal Finance System Notion template (Financial Transactions & Income databases).
- Finally, a Telegram notification is sent summarizing the entry (From, To, Subject, Snippet).
In short: every time your bank emails you, Notion gets updated and you get notified.

How to set up
- Duplicate the Personal Finance System Notion template into your workspace.
- Create a Telegram Bot with BotFather, then copy the bot token and your chat ID.
- Generate an OpenRouter API key for GPT-5 / GPT-5-nano.
- Create a Notion Integration Token and connect it to your duplicated finance databases.
- Add your Gmail accounts (forwarding@, support@, and/or personal Gmail) under Gmail OAuth2 credentials in n8n.
- Import the workflow JSON into n8n and fill in the credential names as listed below:
  - n8ncloudregular_usage → OpenRouter
  - Notion account → Notion API
  - Accountant AI → Telegram Bot
  - Gmail OAuth2 for each inbox trigger
Once active, n8n polls Gmail every minute, classifies emails, updates Notion, and sends Telegram updates.

Requirements
- n8n instance (self-hosted or cloud)
- Gmail accounts connected via OAuth2
- OpenRouter API key
- Telegram bot token & chat ID
- Notion integration token
- Your duplicated Personal Finance System Notion template

How to customize the workflow
You can extend this workflow to:
- Track credit card statements, subscriptions, or payroll notifications.
- Add Slack or WhatsApp alerts alongside Telegram.
- Include live FX rates for USD→INR conversion using an API node.
- Connect Google Sheets as a backup ledger or export target.
- Add error-handling branches to mark Gmail messages as processed or label them "Logged to Notion".
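To make the extraction step concrete, here is a hedged sketch of the kind of structured record the GPT-5 agent would hand to the Notion tools after classifying an email. The field names are illustrative, not the template's actual Notion property names:

```javascript
// Illustrative shape only; real property names come from the Personal
// Finance System template's Financial Transactions & Income databases.
const transaction = {
  type: "Debit Transaction", // Debit/Credit Transaction or Debit/Credit Invoice
  amount: 1499.0,            // extracted from the email body by the agent
  currency: "INR",
  description: "Card payment to ACME Subscriptions", // hypothetical merchant
  source: {                  // used for the Telegram summary
    from: "alerts@bank.example",
    subject: "Debit alert",
    snippet: "Your card ending 1234 was charged...",
  },
};
```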
Qualify leads with Salesforce, Explorium data & Claude AI analysis of API usage
Inbound Agent - AI-Powered Lead Qualification with Product Usage Intelligence

This n8n workflow automatically qualifies and scores inbound leads by combining their product usage patterns with deep company intelligence. The workflow pulls new leads from your CRM, analyzes which API endpoints they've been testing, enriches them with firmographic data, and generates comprehensive qualification reports with personalized talking points, giving your sales team everything they need to prioritize and convert high-quality leads.

Demo
Template Demo

Credentials Required
To use this workflow, set up the following credentials in your n8n environment (go to Settings → Credentials, create them, and assign them in the respective nodes before running the workflow):
- Salesforce - Type: OAuth2 or Username/Password. Used for pulling lead reports and creating follow-up tasks. Alternative CRM options: HubSpot, Zoho, Pipedrive. Get credentials at Salesforce Setup.
- Databricks (or Analytics Platform) - Type: HTTP Request with Bearer Token (Header: Authorization, Value: Bearer YOURDATABRICKSTOKEN). Used for querying product usage and API endpoint data. Alternatives: Datadog, Mixpanel, Amplitude, or a custom data warehouse.
- Explorium API - Type: Generic Header Auth (Header: Authorization, Value: Bearer YOURAPIKEY). Used for business matching and firmographic enrichment. Get your API key at the Explorium Dashboard.
- Explorium MCP - Type: HTTP Header Auth. Used for real-time company intelligence and supplemental research. Connect to: https://mcp.explorium.ai/mcp
- Anthropic API - Type: API Key. Used for AI-powered lead qualification and analysis. Get your API key at the Anthropic Console.

Workflow Overview
- Node 1: When clicking 'Execute workflow' - Manual trigger that initiates the lead qualification process, for on-demand execution or testing. Alternative triggers: Schedule Trigger (hourly, daily, weekly), Webhook (on CRM updates or new lead events), or CRM Trigger (real-time activation when leads are created).
- Node 2: GET SF Report - Pulls lead data from a pre-configured Salesforce report via the Salesforce Analytics Reports API (GET, Salesforce OAuth2). Returns lead contact information, company names, lead source and status, created dates, and custom fields. This node can be replaced with HubSpot, Zoho, or any CRM's reporting API.
- Node 3: Extract Records - Parses the Salesforce report structure and extracts individual lead records by navigating the report's factMap['T!T'].rows structure and mapping data cells to named fields (a hedged sketch appears at the end of this template description).
- Node 4: Extract Tenant Names - Formats tenant names as SQL-compatible strings for the Databricks query. Output: a comma-separated, quoted list such as 'tenant1', 'tenant2', 'tenant3'.
- Node 5: Query Databricks - Queries your analytics platform to retrieve API usage data for each lead (POST /api/2.0/sql/statements, Bearer token in headers, your Databricks warehouse ID). Platform alternatives: Datadog Logs API, Mixpanel event segmentation API, Amplitude behavioral cohorts API, or custom warehouse queries (PostgreSQL, Snowflake, BigQuery).
- Node 6: Split Out - Splits the Databricks result array (field: result.data_array) into individual items, transforming a single response with multiple rows into separate items.
- Node 7: Rename Keys - Normalizes column names from the database query to readable field names: 0 → TenantNames, 1 → endpoints, 2 → endpointsNum.
- Node 8: Extract Business Names - Prepares company names for Explorium enrichment.
- Node 9: Loop Over Items - Iterates through each company for individual enrichment.
- Node 10: Explorium API: Match Businesses - Matches company names to Explorium's business entity database (POST /v1/businesses/match, Header Auth). Returns business_id (unique Explorium identifier), matched_businesses (array of potential matches), and match confidence scores.
- Node 11: Explorium API: Firmographics - Enriches matched businesses with comprehensive company data (POST /v1/businesses/firmographics/bulk_enrich, Header Auth). Returns company name, website, and description; industry categories (NAICS, SIC, LinkedIn); employee count and revenue ranges; headquarters location (address, city, region, country); company age and founding information; social profiles (LinkedIn, Twitter); and logo and branding assets.
- Node 12: Merge - Combines API usage data with firmographic enrichment data.
- Node 13: Organize Data as Items - Structures merged data into clean, standardized lead objects: maps API usage by tenant name, maps enrichment data by company name, combines both with the original lead information, and creates a complete lead profile for analysis.
- Node 14: Loop Over Items1 - Iterates through each qualified lead for AI analysis (batch size 1, so leads are analyzed individually).
- Node 15: Get many accounts1 - Fetches the associated Salesforce account for context (resource: Account, operation: Get All, filtered by company name, limit 1) to link lead qualification back to Salesforce for task creation.
- Node 16: AI Agent - Analyzes each lead to generate a comprehensive qualification report. Inputs: lead contact information, API usage patterns (which endpoints were tested), firmographic data, and lead source and status. The agent evaluates lead quality based on usage, company fit, and signals; identifies which Explorium APIs the lead explored; assesses company size, industry, and potential value; detects quality signals (legitimate company email, active usage) and red flags; and determines the optimal sales approach and timing. It is connected to Explorium MCP for supplemental company research if needed. Output: a structured report with a Lead Score (High Priority, Medium Priority, Low Priority, or Nurture), Quick Summary, API Usage Analysis (endpoints used, usage insights, potential use case), Company Profile (overview, fit assessment, potential value), Quality Signals (positive indicators and concerns), Recommended Actions (next steps, timing, approach), and Talking Points (personalized conversation starters based on actual API usage).
- Node 18: Clean Outputs - Formats the AI qualification report for Salesforce task creation.
- Node 19: Update Salesforce Records - Creates follow-up tasks in Salesforce with the qualification intelligence (resource: Task, operation: Create, Salesforce OAuth2). Alternative outputs: HubSpot tasks or deal-stage updates, Outreach/SalesLoft sequences with custom messaging, Slack qualification reports to sales channels, email reports to account owners, or Google Sheets logging.

Workflow Flow Summary
- Trigger: Manual execution or scheduled run
- Pull Leads: Fetch new/updated leads from the Salesforce report
- Extract: Parse lead records and tenant identifiers
- Query Usage: Retrieve API endpoint usage data from the analytics platform
- Prepare: Format data for enrichment
- Match: Identify companies in the Explorium database
- Enrich: Pull comprehensive firmographic data
- Merge: Combine usage patterns with company intelligence
- Organize: Structure complete lead profiles
- Analyze: AI evaluates each lead with quality scoring
- Format: Structure qualification reports for the CRM
- Create Tasks: Automatically populate Salesforce with actionable intelligence
This workflow eliminates manual lead research and qualification, automatically analyzing product engagement patterns alongside company fit to help sales teams prioritize and personalize their outreach to the highest-value inbound leads.

Customization Options
- Flexible Triggers: Replace the manual trigger with a Schedule (run hourly/daily to continuously qualify new leads), a Webhook (real-time qualification when leads are created), or a CRM Trigger (activate on specific lead status changes).
- Analytics Platform Integration: The Databricks query can be adapted for Datadog (application logs and events), Mixpanel (user behavior and feature adoption), Amplitude (product engagement metrics), or custom databases (PostgreSQL, MySQL, Snowflake, BigQuery).
- CRM Flexibility: Works with Salesforce (full integration: pull reports, create tasks), HubSpot (contact properties and deal updates), Zoho (lead enrichment and task creation), and Pipedrive (deal qualification and activity creation).
- Enrichment Depth: Add more Explorium endpoints, such as technographics (tech stack and product usage), news and events (recent company announcements), funding data (investment rounds and financial events), and hiring signals (job postings and growth indicators).
- Output Destinations: Route qualification reports to CRM updates (Salesforce, HubSpot lead scores/fields), task creation in any CRM, team notifications (Slack, Microsoft Teams, email), sales tools (Outreach, SalesLoft sequences), or reporting (Google Sheets, Data Studio dashboards).
- AI Model Options: The default is Anthropic Claude (Sonnet 4); alternatives include OpenAI GPT-4 and Google Gemini.

Setup Notes
- Salesforce Report Configuration: Create a report with the required fields (name, email, company, tenant ID) and use its API endpoint.
- Tenant Identification: Ensure your product usage data includes identifiers that link to CRM leads.
- Usage Data Query: Customize the SQL query to match your database schema and table structure.
- MCP Configuration: Explorium MCP requires Header Auth; configure the credentials properly.
- Lead Scoring Logic: Adjust the AI system prompts to match your ideal customer profile and qualification criteria.
- Task Assignment: Configure Salesforce task assignment rules or add logic to route tasks to specific sales reps.

This workflow acts as an intelligent lead qualification system that combines behavioral signals (what they're testing) with firmographic fit (who they are) to give sales teams actionable intelligence for every inbound lead.
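The factMap parsing in Node 3 and the tenant-list formatting in Node 4 are the least obvious steps, so here is a hedged sketch of how they might look in an n8n Code node. The column positions are assumptions that must be matched to your own Salesforce report layout:

```javascript
// Sketch for Nodes 3-4: parse a Salesforce Analytics report and build the
// quoted tenant list for the Databricks SQL query.
const report = $input.first().json;

// Each row in factMap['T!T'].rows carries a dataCells array of { label, value }.
const leads = report.factMap["T!T"].rows.map((row) => ({
  name: row.dataCells[0]?.label,    // assumed column order; align these
  email: row.dataCells[1]?.label,   // indexes with your report's columns
  company: row.dataCells[2]?.label,
  tenant: row.dataCells[3]?.label,
}));

// Produce 'tenant1', 'tenant2', ... with single quotes escaped for SQL.
const tenantList = leads
  .map((l) => `'${String(l.tenant).replace(/'/g, "''")}'`)
  .join(", ");

return [{ json: { leads, tenantList } }];
```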