5 templates found

Organise an Event using Slack, Google Calendar and AI

This n8n workflow takes Slack conversations and turns them into Google Calendar events complete with accurate date, time and location information. Adding and removing attendees is also managed automatically.

How it works
- The workflow monitors a Slack channel for invite messages marked with a "📅" reaction and sends them to the AI agent.
- The AI agent parses the message to determine the time, date and location.
- Using its Location tool, the AI agent looks up the precise address on Google Maps.
- Using its Calendar tool, the AI agent creates a Google Calendar invite with the title, description and location address for the user.
- Back in the Slack channel, others can RSVP to the invite by reacting with the "✅" emoji.
- The workflow polls the message after a while and adds the users who have reacted as attendees of the Calendar invite; it also removes any attendees who have since removed their reaction (a sketch of this diff logic follows below).

Examples
Jill: "Hey team, I'm organising a round of Laser Tag (Bunker 51) next Thursday around 6pm. Please RSVP with a ✅"
AI: "I've helped you create an event in your calendar https://cal.google.com/..."
Jack: "✅"
AI: "I've added Jack to the event as an attendee."

Requirements
- A Slack channel to attach the workflow to
- An OpenAI account to use a GPT model
- Google Calendar to create and update events

Customising the Workflow
- This workflow can work with other messaging platforms that support reactions or tagging-like features, such as Discord.
- Don't use Google Calendar? Swap it out for Outlook or your own calendar.
- Use any combination of emoji reactions and add new rules, such as an "RSVP maybe" that sends reminder updates nearer the event date.
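The attendee update in the polling step is essentially a diff between the current ✅ reactions and the attendees already on the event. A minimal sketch of that diff, as it might look in an n8n Code node; the input field names reactionUserEmails and eventAttendees are illustrative, not taken from the template:

```javascript
// Illustrative Code-node sketch: reconcile ✅ reactions with event attendees.
// Assumes earlier nodes provide the emails of users who reacted and the
// attendees currently on the Google Calendar event (both names are hypothetical).
const reacted = new Set($json.reactionUserEmails ?? []);
const attending = new Set($json.eventAttendees ?? []);

const toAdd = [...reacted].filter(email => !attending.has(email));
const toRemove = [...attending].filter(email => !reacted.has(email));

// Downstream Google Calendar nodes can use these lists to update the event.
return [{ json: { toAdd, toRemove } }];
```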

By Jimleuk

Automated n8n workflow backup to GitHub with deletion tracking

Remixed from Solomon's "Backup your workflows to GitHub" template. Check out his templates.

How it works
This workflow backs up your workflows to GitHub. It uses the n8n API node to export all workflows, then loops over the data and checks GitHub to see whether a file already exists for each workflow's ID. Once checked, it will:
- update the file on GitHub if it exists;
- create a new file if it doesn't exist;
- ignore the workflow if the stored file is already identical (this decision is sketched below).
In addition, it checks whether any workflows have been deleted from n8n. If a workflow no longer exists in n8n, the corresponding file is removed from the repository to keep everything in sync.

Who is this for?
People who want to back up their workflows outside the server for safety, or to migrate them to another server.
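A minimal sketch of the create/update/skip decision as it could be expressed in a Code node, assuming the backup files are named <workflowId>.json and that the GitHub lookup result and the exported workflow are available on the incoming item (the field names githubFile and workflow are illustrative):

```javascript
// Illustrative sketch of the per-workflow backup decision.
// `workflow` is the workflow exported via the n8n API node; `githubFile` is the
// (possibly empty) result of looking the file up in the repository.
// Both names are hypothetical stand-ins for the template's actual fields.
const workflow = $json.workflow;
const githubFile = $json.githubFile; // null/undefined if the file was not found

const newContent = JSON.stringify(workflow, null, 2);

let action;
if (!githubFile) {
  action = 'create';                         // no backup yet for this workflow ID
} else if (githubFile.content !== newContent) {
  action = 'update';                         // backup exists but is stale
} else {
  action = 'skip';                           // backup already matches, nothing to do
}

return [{ json: { fileName: `${workflow.id}.json`, action, newContent } }];
```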

By Marcial Ambriz

Personalized email outreach with LinkedIn & Crunchbase data and Gemini AI review

AI-Enriched Cold Outreach: Research → Draft → QA → Write-back
============================================================

What this template does
-----------------------
Automates cold email drafting from a lead list by:
- Enriching each lead with LinkedIn profile, LinkedIn company, and Crunchbase data
- Generating a personalized subject + body with Gemini
- Auto-reviewing with a Judge agent and writing back only APPROVED drafts to your Data Table

Highlights
----------
- Hands-off enrichment via RapidAPI; raw JSON stored back on each row
- Two-agent pattern: Creative Outreach Agent (draft) + Outreach Email Judge (QA)
- Structured outputs guaranteed by LangChain Structured Output Parsers
- Data Table-native: reads "unprocessed" rows, writes results to the same row
- Async polling with Wait nodes for scraper task results

How it works (flow)
-------------------
1. Trigger: Manual (replace with Cron if needed)
2. Fetch leads: Data Table "Get row(s)" filters rows where email_subject is empty (pending)
3. Loop: Split in Batches iterates over the rows
4. Enrichment (runs in parallel):
   - LinkedIn profile: HTTP (companyurl) → Wait → Results → Data Table update → linkedinprofile_scrape
   - LinkedIn company: HTTP (companyurl) → Wait → Results → Data Table update → linkedincompany_scrape
   - Crunchbase company: HTTP (urlsearch) → Wait → Results → Data Table update → crunchbasecompany_scrape
   (All calls use the host cold-outreach-enrichment-scraper with a RapidAPI key.)
5. Draft (Gemini): "Agent One" composes a concise, personalized email using row fields + enrichment + the ABOUT ME block. A Structured Output Parser enforces:
   ```json
   { "email_subject": "text", "email_content": "text" }
   ```
6. Prep for QA: "Email Context" maps email_subject, email_content, and email for the judge.
7. QA (Judge): "Judge Agent" returns APPROVED or REVISE (brief feedback allowed).
8. Route:
   - If APPROVED → Data Table "Update row(s)" writes email_subject + email_body (a.k.a. email_content) back to the row (see the sketch after this description).
   - If REVISE → skipped; the loop continues.

Required setup
--------------
- Data Table: "emaillinkedinlist" (or your own) with at least: email, Firstname, Lastname, Title, Location, CompanyName, Companysite, LinkedinURL, companylinkedin (if used), Crunchbase_URL, email_subject, email_body, linkedinprofile_scrape, linkedincompany_scrape, crunchbasecompany_scrape (string fields for JSON).
- Credentials:
  - RapidAPI key for cold-outreach-enrichment-scraper (store it securely as a credential, not hardcoded)
  - Google Gemini (PaLM) API configured in the Google Gemini Chat Model node
- ABOUT ME block: Replace the sample persona (James / CEO / Company Sample / AI Automations) with your own.

Nodes used
----------
- Data Table
- HTTP Request
- AI Agent: Google Gemini Chat Model
- Split in Batches: Main Loop
- Set: RapidAPI-Key

Customization ideas
-------------------
- Process flags: Add an emailgeneratedat or processed boolean to prevent reprocessing.
- Human-in-the-loop: Send drafts to Slack/Email for a spot check before write-back.
- Delivery: After approval, optionally email the draft to the sender for review.

Quotas & costs
--------------
- RapidAPI: Multiple calls per row (three tasks + result polls).
- Gemini: Token usage for generator + judge per row.
Tune batch size and schedule accordingly.

Privacy & compliance
--------------------
You are scraping and storing person/company data. Ensure a lawful basis, respect ToS, and minimize stored data.
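As a rough illustration of the APPROVED/REVISE routing, the check before the Data Table write-back could look like the Code-node sketch below. The template itself routes with an If node; the verdict, email_subject and email_content fields mirror the structured outputs described above, but the exact wiring and field names are assumptions:

```javascript
// Illustrative sketch of routing on the Judge Agent's verdict before the
// Data Table "Update row(s)" step. Field names follow the structured outputs
// described above but are assumptions about the template's exact wiring.
const draft = $json;                                   // { email_subject, email_content, verdict }
const verdict = String($json.verdict ?? '').trim().toUpperCase();

if (verdict !== 'APPROVED') {
  // REVISE (or anything else): skip the write-back and let the loop continue.
  return [];
}

return [{
  json: {
    email_subject: draft.email_subject,
    email_body: draft.email_content,                   // written back as email_body on the row
  },
}];
```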

By Johnny Rafael

Automate RSS news to multi-platform social media publishing via PostPulse

This automation template allows you to automatically receive news from RSS feeds, process their content, and publish or schedule posts on various social media platforms using PostPulse.

⚠️ Disclaimer: This workflow uses the community node @postpulse/n8n-nodes-postpulse. Make sure community nodes are enabled in your n8n instance before importing and using this template.
👉 To install it: Go to Settings → Community Nodes → Install and enter: "@postpulse/n8n-nodes-postpulse".
💡 For more details, see the n8n Integration Guide: PostPulse Developers – n8n Integration.

Who Is This For?
- Marketers who want to automatically fill their content plan with relevant news.
- Content creators and editors who want to distribute news across different platforms without unnecessary effort.
- Media agencies that want to maintain a constant presence on social media by republishing content from reliable sources.

What Problem Does This Workflow Solve?
Instead of manually searching, copying, and publishing news, you get:
- Automated news collection: The workflow automatically reads the RSS feed and finds new content.
- Intelligent processing: It automatically extracts the text and, when possible, images from news articles, adapting the content for different social media platforms.
- Seamless publishing: PostPulse publishes posts simultaneously on TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, and Threads.
- Flexibility and customization: RSS feeds from different websites have unique structures. This workflow is designed as a flexible template that can publish news automatically (even without images) and is easy to adapt to any news source.
- Time saving: Automates routine processes, freeing up your time for more important tasks.

How It Works
This workflow runs on a schedule, reads news, and processes it before sending it to PostPulse.
1. Scheduled execution: The workflow is triggered at a set time, for example, daily at 9:00 AM.
2. RSS feed reading: The RSS Feed Read node connects to the specified RSS feed (default: https://rss.unian.ua/site/gplay56ukr.rss) and retrieves the latest news.
3. Filtering and media check: The If and Media Check IF nodes verify whether the news was published yesterday and whether it contains an image, looking for it in several possible fields (enclosure, media:content, or even <img> tags in the HTML); a sketch of this lookup follows after this description.
4. Media upload: If an image is found, the PostPulse Upload Media node uploads it to PostPulse. The Get Upload Status node then checks whether the media is ready for publishing.
5. Post creation: The content (with or without media) is sent to the Publish Post nodes, which create a draft post in PostPulse, adapting the text to each platform's limits (e.g., 280 characters for X/Twitter).
6. Publishing: PostPulse automatically publishes or schedules the posts across all connected platforms.

Setup
1. Connect PostPulse to n8n
   - Request your OAuth client key and secret from PostPulse support at support@post-pulse.com.
   - Add your PostPulse account in the Credentials section in n8n.
2. Find the RSS feed you need
   The easiest way is to check the page's source code:
   - Open the news website you are interested in.
   - Go to the page with a specific news category (e.g., "Sports").
   - Press Ctrl + U (or Cmd + Option + U on Mac) to open the page's source code.
   - Press Ctrl + F (or Cmd + F on Mac) to search the text.
   - Type "rss" and press Enter. Usually, you will find a link pointing to an XML page, which is the RSS feed.
3. Configure the RSS Feed Read node
   - Open the RSS Feed Read node.
   - Paste the URL of your RSS feed into the URL field.
4. Configure the Limit to N Post node
   This node limits the number of posts generated in a single run. By default, const limit = 1;. You can change the value from 1 to any number of posts you want to publish at once.

Requirements
- Connected PostPulse accounts (TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, Threads).
- OAuth client key and secret obtained from PostPulse.
- An n8n instance with community nodes enabled.

✨ With this workflow, PostPulse and n8n become your all-in-one automation hub for publishing news.

How To Customize The Workflow
This workflow is designed to be fully flexible and adaptable to your specific needs. While it works out of the box with the default RSS feed, you can easily optimize it for any news source:
- Adapt to different RSS feeds: Each website's RSS feed can have a unique structure. You can adjust the workflow to extract text, images, or additional fields as needed.
- Handle missing media: Some feeds may not include images in standard fields. The workflow is built to publish posts even without images, but you can customize it to extract images from other tags or HTML elements.
- Extend content extraction: If a feed stores the full text behind a separate link, you can add nodes or logic to pull more content for richer posts.
- Text trimming and platform-specific formatting: You can modify the trimming logic in the Publish Post nodes to fit platform limits or adjust content formatting as desired.
- Flexible scheduling and limits: Easily change the number of posts per run, the schedule, or date filters to match your workflow and publishing strategy.

💡 Tip: The workflow is meant to be a template: fully functional out of the box, but easily customizable to match any RSS feed or content source. Its main strength is flexibility, allowing you to adapt it to different feeds, extract more content, and adjust publishing rules without touching the core workflow.
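The media check described above looks for an image in several places on each RSS item. A minimal sketch of that lookup in a Code node, assuming a typical item shape from the RSS Feed Read node (an enclosure, a media:content style field, or an <img> tag inside the HTML content); the exact property names depend on the feed:

```javascript
// Illustrative sketch of the image lookup on a single RSS item.
// The exact property names depend on the feed; these cover common cases.
const item = $json;

let imageUrl =
  item.enclosure?.url ||                       // <enclosure url="...">
  item['media:content']?.url ||                // <media:content url="...">
  null;

if (!imageUrl) {
  // Fall back to the first <img> tag embedded in the HTML content.
  const html = item.content || item['content:encoded'] || '';
  const match = html.match(/<img[^>]+src=["']([^"']+)["']/i);
  imageUrl = match ? match[1] : null;
}

return [{ json: { ...item, imageUrl, hasImage: Boolean(imageUrl) } }];
```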

By Dmytro

Score telematics driving risk with Claude and adjust insurance premiums via HTTP, Gmail, and Slack

How It Works
This workflow automates insurance premium adjustments by analyzing telematics data with AI-driven risk assessment and syncing changes across underwriting systems. Designed for carriers, actuaries, and underwriting teams managing usage-based insurance programs, it eliminates manual review of driving patterns, speed, braking, and mileage while ensuring compliance.

A scheduled execution fetches telematics data via HTTP from vehicles or mobile apps. Anthropic Claude analyzes behavior with structured output parsing, generating risk scores from acceleration, harsh braking, speeding, and time-of-day driving. A Calculator node applies the scores to premiums (a sketch of this step follows below), and an HTTP node updates policy systems. High-risk cases trigger Gmail alerts to underwriting managers and Slack notifications to claims teams. A final HTTP sync ensures compliance across all systems.

Setup Steps
1. Configure the Schedule node for the desired analysis frequency
2. Set up the HTTP node with your telematics platform API
3. Add an Anthropic API key to the Chat Model node for behavioral risk analysis
4. Connect policy management system API credentials in the HTTP nodes
5. Integrate Gmail and Slack with underwriting team addresses

Prerequisites
Anthropic API key; telematics data platform API access

Use Cases
Auto insurance carriers implementing usage-based insurance programs

Customization
Modify the AI prompts to incorporate additional risk factors such as weather conditions

Benefits
Reduces premium calculation time from days to minutes
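The premium adjustment itself is a simple calculation once the AI returns a risk score. A hedged sketch of what the Calculator step might compute, assuming a 0-100 risk score and a linear discount/surcharge scheme; the thresholds and multipliers here are example values, not the template's actual rating rules:

```javascript
// Illustrative premium adjustment from an AI-generated risk score (0-100).
// The multipliers and thresholds below are example values only.
const basePremium = $json.basePremium;   // current policy premium (assumed field)
const riskScore = $json.riskScore;       // structured output from Claude (assumed field)

// Map the score to a multiplier: low risk earns a discount, high risk a surcharge.
let multiplier;
if (riskScore < 30) multiplier = 0.90;        // safe-driver discount
else if (riskScore < 70) multiplier = 1.00;   // no change
else multiplier = 1.15;                       // high-risk surcharge

const adjustedPremium = Math.round(basePremium * multiplier * 100) / 100;
const highRisk = riskScore >= 70;             // would drive the Gmail/Slack alerts

return [{ json: { adjustedPremium, multiplier, highRisk } }];
```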

By Cheng Siong Chin