Convert an XML file to JSON via webhook call
**Who this template is for**

This template is for anyone who regularly works with XML data and wants to convert it to JSON.

**Use case**

Many products still use XML as their main data format. However, not all software supports XML, since many tools have moved to more modern formats such as JSON. This workflow handles the conversion of XML data to JSON via a webhook call, with error handling and Slack notifications integrated into the process.

**How this workflow works**

- Triggering the workflow: the workflow initiates upon receiving an HTTP POST request at the webhook endpoint specified in the "POST" node. The endpoint, designated as <WEBHOOK_URL>, can be accessed externally by sending a POST request to that URL.
- Data routing and processing: upon receiving the POST request, the Switch node routes the workflow's path based on the content type of the incoming data or any encountered errors. The Extract From File and Edit Fields (Set) nodes handle XML input processing, adapting their actions to the data's content type.
- XML to JSON conversion: the extracted XML data is passed through the "XML" node, which converts it into JSON format.
- Response handling: if the XML-to-JSON conversion succeeds, a success response is returned with a status of "OK" and the converted JSON data. If any errors occur during conversion, an error response is returned with a status of "error" and an error message.
- Error handling: in case of an error during processing, the workflow sends a notification to a Slack channel designated for error reporting.

**Set up steps**

- Set up your own <WEBHOOK_URL> in the Webhook node. While building or testing the workflow, use the test webhook URL; when the workflow is ready, switch to the production webhook URL.
- Set credentials for Slack.
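For reference, a call to the finished endpoint might look like this minimal client sketch (Node 18+ with built-in fetch is assumed; the XML payload is purely illustrative, and `<WEBHOOK_URL>` stays whatever your Webhook node generates):

```javascript
// Minimal sketch: POST an XML payload to the workflow's webhook (Node 18+).
// The payload below is illustrative; replace <WEBHOOK_URL> with the test or
// production URL from your Webhook node.
const xml = `<?xml version="1.0"?><order><id>42</id><status>shipped</status></order>`;

const response = await fetch("<WEBHOOK_URL>", {
  method: "POST",
  headers: { "Content-Type": "application/xml" },
  body: xml,
});

// On success the workflow responds with status "OK" plus the converted JSON;
// on failure it responds with status "error" and an error message.
console.log(await response.json());
```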
Personalize resumes & cover letters with AI, GitHub Pages and Google Drive
**Automated Resume & Cover Letter Generator**

This project is an automation workflow that generates a personalized resume and cover letter for each job listing.

---

**Features**

- Automated resume crafting: generates an HTML resume from your data, hosts it live on GitHub Pages, converts it to PDF using Gotenberg, and saves it to Google Drive.
- Automated cover letter generation: uses an LLM to create a tailored cover letter for each job listing.
- Simple input database agent: stores your experience in an n8n Data Table with the fields role, summary, task, skills, tools, and industry. The main agent pulls this data using RAG (Retrieval-Augmented Generation) to personalize the outputs (a sample record is sketched after this section).
- One-time GitHub setup: initializes a blank GitHub repository to host HTML files online, allowing Gotenberg to access and convert them.

**Tech Stack**

- Gotenberg: converts HTML to PDF
- GitHub Pages: hosts live HTML files
- n8n: handles data tables and workflow automation
- LLM (OpenAI / Cohere / etc.): generates cover letters
- Google Drive: stores the final PDFs

---

**Installation & Setup**

1. Create a GitHub repository. This repo will host your HTML resume through GitHub Pages.
2. Set the webhook URL. In the notify-n8n.yml file, replace the placeholder webhook URL with your own n8n webhook URL.
3. Create the n8n Data Table. Add the following columns: role | summary | task | skills | tools | industry
4. Create a Google Spreadsheet. Add these columns: company | cover_letter | resume
5. Install Gotenberg. Follow the installation instructions on the Gotenberg GitHub repository: https://github.com/thecodingmachine/gotenberg
6. Customize the HTML template. Modify the HTML resume to your liking. You can use an LLM to locate and edit specific sections.
7. Add authentication and link your GitHub repo. Ensure your workflow has permission to push updates to your GitHub Pages branch.
8. Run the workflow. Once everything is connected, trigger the workflow to automatically generate and save personalized resumes and cover letters.

**How to Use**

- Copy and paste the job listing description into the Telegram bot.
- Wait for the "Done" notification before submitting another job; do not use the bot again until the notification appears.
- The process usually takes a few minutes to complete.

---

**Notes**

This workflow is designed to save time and personalize your job applications efficiently. By combining n8n automation, LLMs, and open-source tools like Gotenberg, you maintain full control over your data while generating high-quality resumes and cover letters for every job opportunity.
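For illustration, one Data Table row the agent retrieves might look like the following sketch. Only the column names come from the template; every value here is made up:

```javascript
// Illustrative example of one n8n Data Table record pulled via RAG.
// Column names match the template; the values are placeholders.
const experienceRow = {
  role: "Backend Developer",
  summary: "Built and maintained REST APIs for a logistics platform",
  task: "Migrated a monolith to microservices with zero-downtime deploys",
  skills: "Node.js, PostgreSQL, Docker",
  tools: "GitHub Actions, Grafana",
  industry: "Logistics",
};
```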
Monitor server uptime & get email alerts with Google Sheets
**Web Server Monitor & Alert System**

This automation pings web servers at regular intervals, logs their status, and sends email alerts if a server goes down. It's perfect for maintaining visibility over server uptime without complex monitoring tools.

**How It Works**

This workflow performs minute-by-minute checks on all listed servers in a Google Sheet and:

- Logs all reachable servers in an "Alive" log.
- Sends an email alert if a server is unreachable.
- Logs failed servers in a "Down" sheet with timestamps.

**Key Components**

1. Schedule Trigger: runs the workflow every minute for real-time monitoring.
2. Web Servers List (Google Sheets): pulls server IPs or hostnames from a Google Sheet named Server_List. Each row = one server to monitor. This makes adding or removing servers effortless: just update the sheet.
3. Servers Alive Check (HTTP Request): performs an HTTP GET request to each server (e.g., http://your-server.com). If the request fails, it automatically triggers the error path (handled via continueOnFail). A sketch of this check appears after this section.
4. Web Server Alive Log (Google Sheets): records successful pings in ServerStatusAlive with timestamp, server IP, and Status = Alive. This log can be used for uptime reports or audits.
5. Server Down Notification (Gmail): if a server fails, this node sends an email to the admin including the server address, timestamp, and a suggested action.
6. Web Server Down Log (Google Sheets): logs failed pings in a separate sheet for historical tracking and debugging.

**Main Advantages**

- Live server monitoring: stay informed about server health in near real-time.
- No-code configuration: add or remove servers from the Google Sheet, no need to touch the workflow.
- Email alerts on failure: proactively notifies you before users report the issue.
- Audit-ready logging: maintains logs for both healthy and failed checks for documentation or reporting.
- Flexible & scalable: monitor 1 or 100 servers with the same template, just scale the list.

**Setup Steps**

Prerequisites:

- Google Sheet with server list (column name = "Server")
- Gmail OAuth2 connection for alerts
- n8n instance running regularly

Configuration:

- Google Sheets: Sheet 1 (Server_List) is your list of servers; Sheet 2 (ServerStatusAlive) logs reachable servers; Sheet 3 (ServerStatusDown) logs unreachable servers.
- Gmail integration: connect your Gmail account in the Server Down Notification node. Edit the recipient email and message content as needed.
- HTTP check: adjust the HTTP request URL template if using port numbers or paths (e.g., http://{{Server}}:8080/status).
- Schedule: the default is every 1 minute. Change it via the Schedule Trigger if needed.

**Testing**

- Input a reachable server (e.g., example.com) and an unreachable IP.
- Run the workflow manually or wait for the next scheduled run.
- Check that the Alive log updates correctly, the Down log records failures, and the email alert is received.

**Deployment**

Activate the workflow, and it will quietly run in the background, notifying you of any server downtime instantly while keeping logs for future review.
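The per-server check boils down to the following plain-JavaScript sketch (Node 18+ assumed; the 5-second timeout is an assumption, and the column names mirror the sheets described above):

```javascript
// Plain JavaScript sketch of the per-server check, mirroring what the HTTP
// Request node plus the two log sheets do. The timeout value is an assumption.
async function checkServer(server) {
  const timestamp = new Date().toISOString();
  try {
    const res = await fetch(`http://${server}`, { signal: AbortSignal.timeout(5000) });
    // Reachable: this row would be appended to the ServerStatusAlive sheet
    return { Timestamp: timestamp, Server: server, Status: res.ok ? "Alive" : "Down" };
  } catch {
    // Unreachable: the continueOnFail error path logs to ServerStatusDown
    // and triggers the Gmail alert
    return { Timestamp: timestamp, Server: server, Status: "Down" };
  }
}

console.log(await checkServer("example.com"));
```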
Automated real estate property lead scoring with BatchData
**How It Works**

This workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these steps:

1. When a new lead is received through your CRM webhook, the workflow captures their address information.
2. It then makes an API call to BatchData to retrieve comprehensive property details.
3. A scoring algorithm evaluates the lead based on property characteristics such as (see the illustrative sketch after this section):
   - Property value (higher values earn more points)
   - Square footage (larger properties score higher)
   - Property age (newer constructions score higher)
   - Investment status (non-owner-occupied properties earn bonus points)
   - Lot size (larger lots receive additional score)
4. Leads are automatically classified into categories (high-value, qualified, potential, or unqualified).
5. The workflow updates your CRM with enriched property data and qualification scores.
6. High-value leads trigger immediate follow-up tasks for your team.
7. Notifications are sent to your preferred channel (Slack in this example).

The entire process happens within seconds of receiving a new lead, ensuring your sales team can prioritize the most valuable opportunities immediately.

**Who It's For**

This workflow is perfect for:

- Real estate agents and brokers looking to prioritize high-value property leads
- Mortgage lenders who need to qualify borrowers based on property assets
- Home service providers (renovators, contractors, solar installers) targeting specific property types
- Property investors seeking specific investment opportunities
- Real estate marketers who want to segment audiences by property value
- Home insurance agents qualifying leads based on property characteristics

Any business that bases lead qualification on property details will benefit from this automated qualification system.

**About BatchData**

BatchData is a comprehensive property data provider that offers detailed information about residential and commercial properties across the United States. Their API provides:

- Property valuation and estimates
- Ownership information
- Property characteristics (size, age, bedrooms, bathrooms)
- Tax assessment data
- Transaction history
- Occupancy status (owner-occupied vs. investment)
- Lot details and dimensions

By integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling more intelligent lead scoring and routing based on actual property characteristics rather than just contact information. This workflow demonstrates how to leverage BatchData's property API to transform your lead qualification process from manual research into an automated, data-driven system that ensures high-value leads receive immediate attention.
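To make the scoring concrete, here is a minimal sketch of how such a classifier could look. The point values, thresholds, and field names are all assumptions for illustration, not the template's published weights:

```javascript
// Illustrative scoring sketch only -- point values, cutoffs, and property
// field names are assumptions; tune them to your own qualification rules.
function scoreLead(property) {
  let score = 0;
  if (property.estimatedValue > 500_000) score += 30;                   // property value
  if (property.squareFeet > 2_500) score += 20;                         // square footage
  if (new Date().getFullYear() - property.yearBuilt < 10) score += 15;  // property age
  if (!property.ownerOccupied) score += 20;                             // investment bonus
  if (property.lotSizeAcres > 0.5) score += 15;                         // lot size

  if (score >= 70) return { score, category: "high-value" };
  if (score >= 50) return { score, category: "qualified" };
  if (score >= 30) return { score, category: "potential" };
  return { score, category: "unqualified" };
}
```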
Detect toxic language in Telegram messages
This workflow detects toxic language (such as profanity, insults, or threats) in messages sent via Telegram. This blog tutorial explains how to configure the workflow nodes step by step.

- Telegram Trigger: triggers the workflow when a new message is sent in a Telegram chat.
- Google Perspective: analyzes the text of the message and returns a probability value between 0 and 1 indicating how likely it is that the content is toxic.
- IF: filters messages with a toxicity probability above 0.7 (see the sketch below).
- Telegram: sends a message in the chat with the text "I don't tolerate toxic language" if the probability is above 0.7.
- NoOp: takes no action if the probability is below 0.7.
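The IF-node condition amounts to the following sketch. The response path follows the public Perspective API shape (`attributeScores`), which is an assumption here; the n8n node may expose the score slightly differently:

```javascript
// Sketch of the IF-node threshold as plain JavaScript. The response path
// below assumes the public Perspective API shape.
const toxicity = $json.attributeScores.TOXICITY.summaryScore.value;
const isToxic = toxicity > 0.7;

// true  -> reply "I don't tolerate toxic language"
// false -> NoOp branch, nothing happens
return [{ json: { isToxic } }];
```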
Personal assistant bot with multi-agent system using Telegram & Google Gemini
**How It Works**

1. Telegram Trigger receives incoming messages (text, voice, photo, document).
2. Switch routes by message type to the appropriate processor:
   - Text → forwarded as-is.
   - Voice → downloaded and sent to Transcribe a recording.
   - Photo → downloaded, converted to base64, then sent to Analyze image (see the sketch after this section).
   - Document → routed to the document handler.
3. Merge collects the processed input and passes a unified prompt to the Manager Agent.
4. Manager Agent (LM: Google Gemini) orchestrates specialized agents/tools:
   - memory_base (Airtable) → saving & retrieving personal/company memory
   - todoandtask_manager (Todoist / Google Sheets) → tasks
   - email_agent (Gmail) → composing/sending emails
   - calendar_agent (Google Calendar) → scheduling
   - research_agent (SerpAPI / Wikipedia / Wolfram) → web research
   - project_management (Google Sheets) → project updates
5. Manager Agent updates memory windows and sends the final reply back to Telegram.

---

**Setup Steps**

1. Create and configure the Telegram bot; set the bot token/webhook in the Telegram Trigger and Telegram nodes. Update chatId placeholders.
2. Add Google Gemini (PaLM) credentials in the Gemini model nodes.
3. Configure the Airtable knowledge base: set the base ID & table IDs used by the memory_base nodes.
4. Connect Google APIs: Sheets, Calendar, and Gmail credentials, and set document/sheet IDs.
5. Configure Todoist, SerpAPI, and WolframAlpha credentials and any other tool API keys.
6. Verify Window Buffer Memory sessionKey values (they must match user sessions).
7. Check schedule triggers (cron expressions) and adjust times/timezone.
8. Run quick tests: send text, voice, and image messages, and confirm replies and memory writes.

---

**Estimated Setup Time**

- 30–60 minutes if credentials & IDs are ready.
- 2–4 hours for a full setup (API keys, spreadsheets, Airtable, detailed permissions).
- 4–8 hours for a complex deployment (team permissions, multiple calendars, advanced tool tuning, production testing).
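The photo branch's base64 conversion can be sketched as an n8n Code-node step like the one below. The binary property name "data" is an assumption; match it to your Telegram download node's output:

```javascript
// Hedged sketch of the photo branch in an n8n Code node: read the downloaded
// Telegram photo (binary property "data" is an assumption) and convert it to
// a base64 string for the Analyze image step.
const buffer = await this.helpers.getBinaryDataBuffer(0, "data");
return [{ json: { imageBase64: buffer.toString("base64") } }];
```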
Predict customer churn with AI analysis of HubSpot and Google Sheets data
**Who it's for**

Built for Customer Success and Account Management teams focused on proactive retention. This workflow helps you automatically identify at-risk customers before they churn by combining CRM, usage, and sentiment data into one actionable alert.

**What it does**

This end-to-end workflow continuously monitors customer health by consolidating data from HubSpot and Google Sheets. Here's how it works:

1. Fetch deals from HubSpot.
2. Collect context: linked support tickets and feature usage from a Google Sheet.
3. Run sentiment analysis on the tickets to generate a customer health score (a sketch of this step follows this section).
4. Evaluate risk: an AI agent reviews deal age, sentiment score, and usage trends against predefined thresholds.
5. Send alerts: if churn risk is detected, it automatically sends a clear, data-driven email to the responsible team member with next-step recommendations.

**How to set it up**

To get started, configure your credentials and parameters in the following nodes:

Credentials:

- HubSpot: connect your account (HubSpot: Get All Deals).
- LLM model: add credentials for your preferred provider (Config: Set LLM for Agent & Chains).
- Google Sheets: connect your account (Tool: Get Feature Usage from Sheets).
- Email: set up your SMTP credentials (Email: Send Churn Alert).

Tool URLs:

- In Tool: Calculate Sentiment Score, enter the Webhook URL from the Trigger: Receive Tickets for Scoring node within this same workflow.
- In Tool: Get HubSpot Data, enter the Endpoint URL for your MCP HubSpot data workflow. (Note: this tool calls an external workflow.)

Google Sheet:

- In Tool: Get Feature Usage from Sheets, enter the Document ID for your own Google Sheet.

Email details:

- In Email: Send Churn Alert, change the From and To email addresses.

**Requirements**

- HubSpot account with Deals API access
- LLM provider account (e.g. OpenAI)
- Google Sheets tracking customer feature usage
- n8n with LangChain community nodes enabled
- A separate n8n workflow set up to act as an MCP endpoint for fetching HubSpot data (called by Tool: Get HubSpot Data)

**How to customize it**

Tailor this workflow to match your business logic:

- Scoring logic: adjust the JavaScript in the Code: Convert Sentiment to Score node to redefine how customer scores are calculated.
- Alert thresholds: update the prompt in the AI Chain: Analyze for Churn Risk node to fine-tune when alerts trigger (e.g. deal age, score cutoff, or usage drop).
- Data sources: swap HubSpot or Google Sheets for your CRM or database of choice, like Salesforce or Airtable.

Outcome: a proactive customer health monitoring system that surfaces risks before it's too late, keeping your team focused on prevention, not firefighting.
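As a starting point for the Code: Convert Sentiment to Score node, here is a minimal sketch of how ticket sentiment could be turned into a health score. The input shape, sentiment labels, and weights are all assumptions, since the template leaves the scoring logic up to you:

```javascript
// Minimal sketch of a sentiment-to-score conversion for the Code node.
// The input shape ($json.tickets) and the weights are assumptions.
const tickets = $json.tickets ?? [];
const weights = { positive: 1, neutral: 0.5, negative: 0 };

const total = tickets.reduce((sum, t) => sum + (weights[t.sentiment] ?? 0.5), 0);
// Scale to 0-100; a customer with no tickets defaults to a healthy score.
const healthScore = tickets.length ? Math.round((total / tickets.length) * 100) : 100;

return [{ json: { healthScore } }];
```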
Check Tron wallet USDT blacklist status via Telegram
**Description**

This n8n workflow template lets users check whether a Tron wallet address is blacklisted on the USDT contract via a Telegram bot. When a user sends the /sorgu command with a wallet address through the Telegram bot, the workflow queries the Tronscan API to determine if the provided wallet address is blacklisted. The result is then sent back to the user via the Telegram bot.

**Workflow Overview**

This workflow interacts with users through a Telegram bot and checks if a given Tron wallet address is blacklisted on the USDT contract. It consists of four main nodes:

1. Telegram Trigger node: listens for messages from the Telegram bot.
2. HTTP Request node: sends a GET request to the Tronscan API to check the blacklist status of the provided wallet address.
3. Code node: processes the API response and formats the message to be sent back to the user.
4. Telegram Send Message node: sends the formatted message back to the user via the Telegram bot.

**Nodes Configuration**

1. Telegram Trigger node
   - Event: Message
   - Update Types: Message
   - Command: /sorgu
   - Description: listens for the /sorgu command followed by a wallet address from the user.

2. HTTP Request node
   - Method: GET
   - URL: https://apilist.tronscanapi.com/api/stableCoin/blackList?blackAddress={{ $json.message.text }}
   - Response Format: JSON
   - Description: sends a GET request to the Tronscan API using the wallet address provided by the user.

3. Code node (check API response):

```javascript
const response = items[0].json;
let message;

// A non-zero "total" means the Tronscan blacklist query found a match
if (response.total && response.total > 0) {
  message = `This Wallet is Blacklisted: ${response.data[0].blackAddress}`;
} else {
  message = `This Wallet is NOT Blacklisted.`;
}

return [
  {
    json: {
      text: message,
    },
  },
];
```

   Description: processes the API response to determine if the wallet address is blacklisted and formats the message to be sent back to the user.

4. Telegram Send Message node
   - Resource: Message
   - Operation: Send
   - Chat ID: ={{$json["chat_id"]}}
   - Text: ={{$json["text"]}}
   - Description: sends the formatted message back to the user via the Telegram bot.

**How to Use**

1. Set up the Telegram bot: create a Telegram bot and obtain the API token. Configure the bot to listen for the /sorgu command.
2. Import the workflow into your n8n instance.
3. Configure credentials: add your Telegram API credentials to the Telegram Trigger and Telegram Send Message nodes.
4. Start the workflow. Users can now send the /sorgu command with a Tron wallet address to check whether it is blacklisted.

**Example Usage**

User Telegram command: /sorgu TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t

API request: https://apilist.tronscanapi.com/api/stableCoin/blackList?blackAddress=TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t

API response:

```json
{
  "total": 1,
  "data": [
    {
      "blackAddress": "TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t",
      "tokenName": "USDT",
      "num": "367583344429",
      "time": 1593184959,
      "transHash": "af4bc4d793f82ca5ba500cf13cf93ca3e7a56fccc2aabf8b09e55fc756500ea8",
      "contractAddress": "TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t"
    }
  ]
}
```

Bot response: "This Wallet is Blacklisted: TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t"

> This workflow provides a simple and efficient way to check the blacklist status of Tron wallet addresses via a Telegram bot, making it easy for users to stay informed about the status of their wallets.
Parse and extract invoice data with Nanonets OCR and export to Excel
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Description**

This workflow automates document processing and structured table extraction using the Nanonets API. You can submit a PDF file via an n8n form trigger or webhook; the workflow then forwards the document to Nanonets, waits for asynchronous parsing to finish, retrieves the results (including header fields and line items/tables), and returns the output as an Excel file. Ideal for automating invoice, receipt, or order data extraction with downstream business use.

**How It Works**

1. A document is uploaded (via n8n form or webhook).
2. The PDF is sent to the Nanonets Workflow API for parsing.
3. The workflow waits until processing is complete.
4. Parsed results are fetched.
5. Both top-level fields and any table rows/line items are extracted and restructured (see the sketch after this section).
6. Data is exported to Excel format and delivered to the requester.

**Setup Steps**

1. Nanonets account: register for a Nanonets account and set up a workflow for your specific document type (e.g., invoice, receipt).
2. Credentials in n8n: add HTTP Basic Auth credentials in n8n for the Nanonets API (never store credentials directly in node parameters).
3. Webhook/form configuration:
   - Option 1: configure and enable the included n8n Form Trigger node for document uploads.
   - Option 2: use the included Webhook node to accept external POSTs with a PDF file.
4. Adjust the workflow: update any HTTP nodes to use your credential profile, and insert your Nanonets Workflow ID in all relevant nodes.
5. Test the workflow: enable it and try it with a sample document.

**Features**

- Accepts documents via n8n Form Trigger or direct webhook POST.
- Securely sends files to Nanonets for document parsing (credentials stored in the n8n credentials manager).
- Automatically waits for async processing, polling Nanonets until results are ready.
- Extracts both header data and all table/line items into a tabular format.
- Exports results as an Excel file download.
- Modular nodes allow easy customization or extension.

**Prerequisites**

- Nanonets account with a workflow configured for your document type.
- n8n instance with HTTP Request, Webhook/Form, Code, and Excel/Spreadsheet nodes enabled.
- Valid HTTP Basic Auth credentials saved in n8n for API access.

**Example Use Cases**

| Scenario | Benefit |
|-----------------------|--------------------------------------------------|
| Invoice Processing | Automated extraction of line items and totals |
| Receipt Digitization | Parse amounts and charges for expense reports |
| Purchase Orders | Convert scanned POs into structured Excel sheets |

**Notes**

- You must set up credentials in the n8n credentials manager; do not store API keys directly in nodes.
- All configuration and endpoints are explained with inline sticky notes in the workflow editor.
- Easily adaptable for other document types or similar APIs: just modify endpoints and result mapping.
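The restructuring step can be pictured with the sketch below: flatten the parsed line items into one spreadsheet row per item, repeating the invoice-level fields on each. The field names (`result`, `fields`, `lineItems`) are assumptions, so match them to the schema your Nanonets workflow actually returns:

```javascript
// Hedged sketch of the restructuring Code-node step. Field names are
// assumptions -- adapt them to your Nanonets workflow's response schema.
const result = $json.result ?? {};
const header = result.fields ?? {};      // invoice-level fields (number, date, total, ...)

const rows = (result.lineItems ?? []).map((item) => ({
  ...header,  // repeat header fields on every row for a flat Excel sheet
  ...item,    // description, quantity, unitPrice, amount, ...
}));

// One n8n item per spreadsheet row
return rows.map((row) => ({ json: row }));
```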
Automated news monitoring with Claude 4 AI analysis for Discord & Google News
**Who's it for**

Marketing teams, business intelligence professionals, competitive analysts, and executives who need consistent industry monitoring with AI-powered analysis and automated team distribution via Discord.

**What it does**

This workflow automatically monitors multiple industry topics, scrapes and analyzes relevant news articles using Claude AI, and delivers professionally formatted intelligence reports to your Discord channel. The system provides weekly automated monitoring cycles with personalized bot communication and comprehensive content analysis.

**How it works**

The workflow follows a 7-phase automation process:

1. Scheduled activation: triggers weekly monitoring cycles (default: Mondays at 9 AM).
2. Query management: retrieves monitoring topics from a centralized Google Sheets configuration.
3. News discovery: executes comprehensive Google News searches using SerpAPI for each configured topic.
4. Content extraction: scrapes full article content from the top 3 sources per topic using Firecrawl.
5. AI analysis: processes scraped content using Claude 4 Sonnet for intelligent synthesis and formatting.
6. Discord optimization: automatically segments content to comply with Discord's 2000-character message limit (see the chunking sketch at the end of this section).
7. Automated delivery: posts formatted intelligence reports to the Discord channel with the branded "Claptrap" bot personality.

**Requirements**

- Google Sheets account for query management
- SerpAPI account for Google News access
- Firecrawl account for article content extraction
- Anthropic API access for Claude 4 Sonnet
- Discord bot with proper channel permissions
- Scheduled execution capability (cron-based trigger)

**How to set up**

Step 1: Configure Google Sheets query management

- Create the monitoring sheet: set up a Google Sheets document with a "Query" sheet.
- Add search topics: include industry keywords, competitor names, and relevant search terms.
- Sheet structure: a simple column format with a "Query" header containing the search terms.
- Access permissions: ensure n8n has read access to the Google Sheets document.

Step 2: Configure API credentials

Set up the following credentials in n8n:

- Google Sheets OAuth2: for accessing the query configuration sheet
- SerpAPI: for Google News search functionality with proper rate limits
- Firecrawl API: for reliable article content extraction across various websites
- Anthropic API: for Claude 4 Sonnet access with sufficient token limits
- Discord Bot API: with message posting permissions in the target channel

Step 3: Customize scheduling settings

- Cron expression: the default is "0 9 * * 1" (Mondays at 9 AM).
- Frequency options: adjust for daily, weekly, or custom monitoring cycles.
- Timezone considerations: configure according to the team's working hours.
- Execution timing: ensure adequate processing time for multiple topics.

Step 4: Configure Discord integration

- Guild ID: target Discord server (currently: 919951151888236595)
- Channel ID: specific monitoring channel (currently: 1334455789284364309)
- Bot permissions: message posting, embed suppression capabilities
- Brand personality: customize the "Claptrap" bot messaging style and tone

Step 5: Customize content analysis

- Analysis depth: currently processes the top 3 articles per topic.
- Content format: structured markdown with consistent styling.
- Language settings: currently configured for French output (easily customizable).
- Quality controls: error handling for inaccessible articles and content.

**How to customize the workflow**

Query management expansion:

- Topic categories: organize queries by industry, competitor, or strategic focus areas.
- Keyword optimization: refine search terms based on result quality and relevance.
- Dynamic queries: implement time-based or event-triggered query modifications.
- Multi-language support: add international keyword variations for global monitoring.

Advanced content processing:

- Article quantity: raise the count from 3 articles per topic based on analysis needs.
- Content filtering: add quality scoring and relevance filtering for article selection.
- Source preferences: implement preferred publisher lists or source quality weighting.
- Content enrichment: add sentiment analysis, trend identification, or competitive positioning.

Discord delivery enhancements:

- Rich formatting: implement Discord embeds, reactions, or interactive elements.
- Multi-channel distribution: route different topics to specialized Discord channels.
- Alert levels: add priority-based messaging for urgent industry developments.
- Archive functionality: create searchable message threads or database storage.

Integration expansions:

- Slack compatibility: replace or supplement Discord with Slack notifications.
- Email reports: add formatted email distribution for executive summaries.
- Database storage: implement persistent storage for historical analysis and trending.
- API endpoints: create webhook endpoints for third-party system integration.

AI analysis customization:

- Analysis templates: create topic-specific analysis frameworks and formatting.
- Competitive focus: enhance competitor mention detection and analysis depth.
- Trend identification: implement cross-topic trend analysis and strategic insights.
- Summary levels: create executive summaries alongside detailed technical analysis.

**Advanced monitoring features**

Intelligent content curation:

- Relevance scoring: automatic ranking of articles by topic relevance and publication authority.
- Duplicate detection: prevents redundant coverage of the same story across different sources.
- Content quality assessment: filters out low-quality or promotional content automatically.
- Source diversity: ensures coverage from multiple perspectives and publication types.

Error handling and reliability:

- Graceful degradation: continues processing even if individual articles fail to scrape.
- Retry mechanisms: automatic retry logic for temporary API failures or network issues.
- Content fallbacks: uses article snippets when full content extraction fails.
- Notification continuity: ensures Discord delivery even with partial content processing.

**Results interpretation**

Each monitoring cycle delivers:

- Topic-specific summaries: individual analysis for each configured search query.
- Source attribution: complete citations with publication date, source, and URL.
- Structured formatting: consistent presentation optimized for quick scanning.
- Professional analysis: AI-generated insights maintaining factual accuracy and business context.

Monitor system effectiveness through:

- Processing metrics: track successful article extraction and analysis rates.
- Content quality: assess the relevance and usefulness of delivered intelligence.
- Team engagement: monitor Discord channel activity and report utilization.
- System reliability: track execution success rates and error patterns.

**Use cases**

Competitive intelligence:

- Market monitoring: track competitor announcements, product launches, and strategic moves.
- Industry trends: identify emerging technologies, regulatory changes, and market shifts.
- Partnership tracking: monitor alliance formations, acquisitions, and strategic partnerships.
- Leadership changes: track executive movements and organizational restructuring.

Strategic planning support:

- Market research: continuous intelligence gathering for strategic decision-making.
- Risk assessment: early warning system for industry disruptions and regulatory changes.
- Opportunity identification: spot emerging markets, technologies, and business opportunities.
- Brand monitoring: track industry perception and competitive positioning.

Team collaboration enhancement:

- Knowledge sharing: centralized distribution of relevant industry intelligence.
- Discussion facilitation: provide a common information baseline for strategic discussions.
- Decision support: deliver timely intelligence for business planning and strategy sessions.
- Competitive awareness: keep teams informed about competitive landscape changes.

**Workflow limitations**

- Language dependency: currently optimized for French analysis output (easily customizable).
- Processing capacity: limited to 3 articles per query (configurable based on API limits).
- Platform specificity: configured for Discord delivery (adaptable to other platforms).
- Scheduling constraints: fixed weekly schedule (customizable via cron expressions).
- Content access: dependent on article accessibility and website compatibility with Firecrawl.
- API dependencies: requires active subscriptions and proper rate limit management for all integrated services.
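The Discord segmentation step can be sketched as a small chunking function like the one below, splitting a long report into messages under the 2000-character limit and preferring newline boundaries so markdown formatting survives. The input field name (`report`) is an assumption:

```javascript
// Sketch of the Discord segmentation Code-node step: split long reports into
// chunks under Discord's 2000-character message limit, breaking on newlines
// where possible.
function splitForDiscord(text, limit = 2000) {
  const chunks = [];
  let rest = text;
  while (rest.length > limit) {
    let cut = rest.lastIndexOf("\n", limit);
    if (cut <= 0) cut = limit; // no usable newline: hard cut at the limit
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).replace(/^\n/, "");
  }
  if (rest) chunks.push(rest);
  return chunks;
}

// One n8n item per Discord message; $json.report is an assumed field name.
return splitForDiscord($json.report ?? "").map((content) => ({ json: { content } }));
```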
Audience problem keyword research workflow with OpenAI, Ahrefs and Google Sheets
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Generates relevant keywords and questions from a customer profile. Keyword data is enriched via Ahrefs, and everything is stored in a Google Sheet. This is great for market and customer research: it surfaces search intent for a well-defined audience and gives relevant, actionable data in a fraction of the time that manual research takes.

**How it works**

1. We define a customer profile in the 'Data' node (an illustrative shape is sketched after this list).
2. We use an OpenAI LLM to fetch relevant search intent as keywords and questions.
3. We use an SEO MCP server to fetch keyword data from Ahrefs' free tooling.
4. The fetched data is stored in the Google Sheet.

**Set up steps**

1. Copy the Google Sheet template and add it in all Google Sheets nodes.
2. Make sure that n8n has read & write permissions for your Google Sheet.
3. Add your list of domains in the first column of the Google Sheet.
4. Add MCP credentials for seo-mcp.
5. Add OpenAI API credentials.
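For illustration, the 'Data' node's customer profile might take a shape like this sketch; the field names and values are entirely up to you:

```javascript
// Illustrative customer profile for the 'Data' node. All fields and values
// are placeholders -- define whatever describes your audience best.
return [{
  json: {
    audience: "Freelance web designers in the EU",
    problems: ["finding recurring clients", "pricing fixed-scope projects"],
    industry: "Web design",
    experienceLevel: "2-5 years",
  },
}];
```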
Convert JSON objects to base64 strings with file processing
**Encode JSON to Base64 String in n8n**

This example workflow demonstrates how to convert a JSON object into a base64-encoded string using n8n's built-in file processing capabilities. This is a common requirement when working with APIs, webhooks, or SaaS integrations that expect payloads to be base64-encoded.

> Tip: The three green-highlighted nodes (Stringify → Convert to File → Extract from File) can be wrapped in a subworkflow to create a reusable Base64 encoder in your own projects.

---

**Requirements**

- Any running n8n instance (local or cloud)
- No credentials or external services required

---

**What This Workflow Does**

1. Generates example JSON data.
2. Converts the JSON to a string.
3. Saves the string as a binary file.
4. Extracts the file's contents as a base64 string.
5. Outputs the base64 string on the final node.

---

**Step-by-Step Setup**

1. Manual Trigger: start the workflow using the Manual Execution node. This is useful for testing and development.
2. Create JSON Data: the Create Json Data node uses raw mode to construct a sample object with all major JSON types: strings, numbers, booleans, nulls, arrays, nested objects, etc.
3. Convert to String: the Convert to String node uses the expression ={{ JSON.stringify($json) }} to flatten the object into a single string field named json_text.
4. Convert to File: the Convert to File node takes the json_text value and saves it to a UTF-8 encoded binary file in the property encodedtext.
5. Extract from File: this node takes the binary file and extracts its contents as a base64-encoded string. The result is saved in the base64_text field.

---

**Customization Tips**

- Replace the sample JSON in the Create Json Data node with your own payload structure.
- To make this reusable, extract the three core nodes into a subworkflow or wrap them in a custom Function.
- Use the base64_text output field to post to APIs, store in databases, or include in webhook responses.
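For comparison, the same encode can be done in a single Code-node step, assuming Node's Buffer is available (it is on standard n8n instances). The three-node file-based approach remains useful when you want to avoid Code nodes entirely:

```javascript
// Single-step equivalent of the Stringify -> Convert to File -> Extract from
// File chain, using Node's Buffer inside an n8n Code node.
const base64_text = Buffer.from(JSON.stringify($json), "utf8").toString("base64");
return [{ json: { base64_text } }];
```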