Automate dental appointments with Google Calendar, AI assistant & email notifications
🦷 Dental Clinic Appointment Booking System (n8n)

This project is an automated appointment booking system for a dental clinic, built using n8n. It helps streamline clinic operations by checking availability, creating events in Google Calendar, and sending email notifications to both the doctor and the patient.

✨ Features
- ✅ Check available slots before booking
- ✅ Create event in Google Calendar after patient confirmation
- ✅ Send email notification to the Doctor with appointment details
- ✅ Send email confirmation to the Patient with their booking details
- ✅ Unique Appointment ID generation for every booking

🚀 Live Demo
https://devsabirul.github.io/Dental-Clinic-Receptionist-N8n/

🛠 Tech Used
- n8n.io (Automation Platform)
- Google Calendar API (Manage bookings)
- SMTP / Gmail (Send email notifications)

📂 Workflow Setup
1. Import Workflow JSON – In n8n, go to Workflows → Import from File.
2. Set Credentials – Configure Google Calendar API credentials in n8n and SMTP / Gmail credentials for sending emails.
3. Update Clinic Info – Replace the placeholder Doctor's email with the real email and customize the messages (email subject, body).

📧 Example Email Sent

To Doctor:
New Appointment Booked
Patient: John Doe
Phone: +123456789
Email: john@example.com
Date: 31st August 2025
Time: 2:15 PM
Appointment ID: APT-20250831-ABCD12

To Patient:
Hello John Doe,
Your appointment has been successfully booked.
📅 Date: 31st August 2025
🕒 Time: 2:15 PM
📍 Appointment ID: APT-20250831-ABCD12
Thank you, Dental Clinic

📌 Tags
n8n, automation, dentalclinic, calendar, appointment

🙌 Author
👨‍💻 Developed by MD Sabirul Islam
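For reference, the unique Appointment ID (e.g. APT-20250831-ABCD12) can be generated inside an n8n Code node. The snippet below is a minimal sketch of that idea, assuming the booking date arrives on the incoming item as a `date` field; the field names are illustrative and not taken from the published workflow.

```javascript
// Minimal sketch: build an appointment ID in the APT-YYYYMMDD-XXXXXX format
// shown in the example emails. Incoming field names are assumptions.
const item = $input.first().json;

// Date portion, e.g. 20250831
const datePart = new Date(item.date || Date.now())
  .toISOString()
  .slice(0, 10)
  .replace(/-/g, '');

// Six random alphanumeric characters for uniqueness
const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
let suffix = '';
for (let i = 0; i < 6; i++) {
  suffix += chars[Math.floor(Math.random() * chars.length)];
}

return [{ json: { ...item, appointmentId: `APT-${datePart}-${suffix}` } }];
```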
Social media sentiment analysis dashboard with custom AI for Twitter, Reddit & LinkedIn
Social Media Sentiment Analysis Dashboard with AI and Real-time Monitoring

🎯 Target Audience
- Social media managers and community managers
- Marketing teams monitoring brand reputation
- PR professionals tracking public sentiment
- Customer service teams identifying trending issues
- Business analysts measuring social media ROI
- Brand managers protecting brand reputation
- Product managers gathering user feedback

🚀 Problem Statement
Manual social media monitoring is overwhelming and often misses critical sentiment shifts or trending topics. This template solves the challenge of automatically collecting, analyzing, and visualizing social media sentiment data across multiple platforms to provide actionable insights for brand management and customer engagement.

🔧 How it Works
This workflow automatically monitors social media platforms using AI-powered sentiment analysis, processes mentions and conversations, and provides real-time insights through a comprehensive dashboard.

Key Components
- Scheduled Trigger – Runs the workflow at specified intervals to maintain real-time monitoring
- AI-Powered Sentiment Analysis – Uses advanced NLP to analyze sentiment, emotions, and topics
- Multi-Platform Integration – Monitors Twitter, Reddit, and other social platforms
- Real-time Alerting – Sends notifications for critical sentiment changes or viral content
- Dashboard Integration – Stores all data in Google Sheets for comprehensive analysis and reporting

📊 Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the mention was recorded | "2024-01-15T10:30:00Z" |
| platform | String | Social media platform | "Twitter" |
| username | String | User who posted the content | "@john_doe" |
| content | String | Full text of the post/comment | "Love the new product features!" |
| sentiment_score | Number | Sentiment score (-1 to 1) | 0.85 |
| sentiment_label | String | Sentiment classification | "Positive" |
| emotion | String | Primary emotion detected | "Joy" |
| topics | Array | Key topics identified | ["product", "features"] |
| engagement | Number | Likes, shares, comments | 1250 |
| reach_estimate | Number | Estimated reach | 50000 |
| influence_score | Number | User influence metric | 0.75 |
| alert_priority | String | Alert priority level | "High" |

🛠️ Setup Instructions
Estimated setup time: 20-25 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Social media API access (Twitter, Reddit, etc.)
Step-by-Step Configuration

1. Install Community Nodes
   ```bash
   # Install required community nodes
   npm install n8n-nodes-scrapegraphai
   npm install n8n-nodes-slack
   ```

2. Configure ScrapeGraphAI Credentials
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working

3. Set up Google Sheets Connection
   - Add Google Sheets OAuth2 credentials
   - Grant necessary permissions for spreadsheet access
   - Create a new spreadsheet for sentiment analysis data
   - Configure the sheet name (default: "Sentiment Analysis")

4. Configure Social Media Monitoring
   - Update the websiteUrl parameters in the ScrapeGraphAI nodes
   - Add URLs for the social media platforms you want to monitor
   - Customize the user prompt to extract specific sentiment data
   - Set up keywords, hashtags, and brand mentions to track

5. Set up Notification Channels
   - Configure Slack webhook or API credentials
   - Set up email service credentials for alerts
   - Define sentiment thresholds for different alert levels
   - Test notification delivery

6. Configure Schedule Trigger
   - Set monitoring frequency (every 15 minutes, hourly, etc.)
   - Choose appropriate time zones for your business hours
   - Consider social media platform rate limits

7. Test and Validate
   - Run the workflow manually to verify all connections
   - Check Google Sheets for proper data formatting
   - Test sentiment analysis with sample content

🔄 Workflow Customization Options

Modify Monitoring Targets
- Add or remove social media platforms
- Change keywords, hashtags, or brand mentions
- Adjust monitoring frequency based on platform activity

Extend Sentiment Analysis
- Add more sophisticated emotion detection
- Implement topic clustering and trend analysis
- Include influencer identification and scoring

Customize Alert System
- Set different thresholds for different sentiment levels
- Create tiered alert systems (info, warning, critical)
- Add sentiment trend analysis and predictions

Output Customization
- Add data visualization and reporting features
- Implement sentiment trend charts and graphs
- Create executive dashboards with key metrics
- Add competitor sentiment comparison

📈 Use Cases
- Brand Reputation Management: Monitor and respond to brand mentions
- Crisis Management: Detect and respond to negative sentiment quickly
- Customer Feedback Analysis: Understand customer satisfaction and pain points
- Product Launch Monitoring: Track sentiment around new product releases
- Competitor Analysis: Monitor competitor sentiment and engagement
- Influencer Identification: Find and engage with influential users

🚨 Important Notes
- Respect social media platforms' terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring keywords and parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider privacy implications and data protection regulations

🔧 Troubleshooting

Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Sentiment analysis errors: Review the Code node's JavaScript logic (see the sketch below)
- Rate limiting: Adjust monitoring frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Social media platform API documentation
- Sentiment analysis best practices and guidelines
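If you need to adjust the Code node's JavaScript logic mentioned in the troubleshooting list, the sketch below shows one way a scraped mention could be mapped onto the Google Sheets columns specified above. The incoming field names (`platform`, `username`, `text`, `score`, etc.) and the threshold-based labels are assumptions for illustration; substitute whatever your sentiment model actually returns.

```javascript
// Sketch: map analyzed mentions onto the sheet columns listed in the spec.
// Incoming field names are assumptions; adjust to your scraper/model output.
const rows = $input.all().map(({ json: m }) => {
  const score = Number(m.score) || 0; // expected range -1 to 1

  // Simple threshold-based labelling; replace with your model's own labels
  const label = score > 0.2 ? 'Positive' : score < -0.2 ? 'Negative' : 'Neutral';
  const priority =
    Math.abs(score) > 0.8 ? 'High' : Math.abs(score) > 0.5 ? 'Medium' : 'Low';

  return {
    json: {
      timestamp: new Date().toISOString(),
      platform: m.platform,
      username: m.username,
      content: m.text,
      sentiment_score: score,
      sentiment_label: label,
      emotion: m.emotion || 'Unknown',
      topics: JSON.stringify(m.topics || []),
      engagement: m.engagement || 0,
      reach_estimate: m.reach || 0,
      influence_score: m.influence || 0,
      alert_priority: priority,
    },
  };
});

return rows;
```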
Validate Auth0 JWT tokens using JWKS or signing cert
> Note: This template requires a self-hosted community edition of n8n. It does not work on n8n Cloud.

Try It Out
This n8n template shows how to validate API requests with Auth0 authorization tokens. Auth0 doesn't work with the standard JWT auth option because:
1) Auth0 tokens use the RS256 algorithm.
2) RS256 JWT credentials in n8n require the user to provide private and public keys rather than a secret phrase.
3) Auth0 does not give you access to your Auth0 instance's private keys.

The solution is to handle JWT validation after the webhook is received, using the Code node.

How it works
There are two approaches to validating Auth0 tokens: using your application's JWKS file or using your signing cert. Both solutions use the Code node to access Node.js libraries that verify the token.
- JWKS: the jwks-rsa library is used to validate the token against the application's JWKS URI hosted on Auth0.
- Signing cert: the application's signing cert is imported into the workflow and used to verify the token.

In both cases, when the token is found to be invalid, an error is thrown. However, because we can use error outputs for the Code node, the error does not stop the workflow and is instead redirected to a 401 Unauthorized webhook response. When the token is validated, the webhook response is forwarded on the success branch and the token's decoded payload is attached.

How to use
- Follow the instructions as stated in each scenario's sticky notes.
- Modify the Auth0 details with those of your application and Auth0 instance.

Requirements
- Self-hosted community edition of n8n
- Ability to install npm packages
- Auth0 application and some way to get either the JWKS URL or the signing cert
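As a rough illustration of the JWKS approach, the Code node logic might look something like the sketch below. It assumes the `jsonwebtoken` and `jwks-rsa` npm packages are installed and allowed via `NODE_FUNCTION_ALLOW_EXTERNAL`, and that the webhook passes the `Authorization` header through; the tenant domain is a placeholder.

```javascript
// Sketch: validate an Auth0 RS256 token against the tenant's JWKS.
// Requires jsonwebtoken and jwks-rsa as allowed external modules.
const jwt = require('jsonwebtoken');
const jwksClient = require('jwks-rsa');

// Placeholder – replace with your Auth0 tenant's JWKS URI
const JWKS_URI = 'https://YOUR_TENANT.auth0.com/.well-known/jwks.json';

const authHeader = $input.first().json.headers?.authorization || '';
const token = authHeader.replace(/^Bearer\s+/i, '');

// Look up the signing key matching the token's key id (kid)
const decodedHeader = jwt.decode(token, { complete: true });
if (!decodedHeader) throw new Error('Malformed token');

const client = jwksClient({ jwksUri: JWKS_URI });
const key = await client.getSigningKey(decodedHeader.header.kid);

// Throws on an invalid signature, algorithm, or expiry, which sends the item
// down the Code node's error output and on to the 401 webhook response
const payload = jwt.verify(token, key.getPublicKey(), { algorithms: ['RS256'] });

return [{ json: { payload } }];
```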
Fetch All Shopify Orders (Handles 250-Limit Loop)
Who Is This For?
E-commerce managers, data analysts, and n8n beginners who need a hands-off way to pull all Shopify orders—even stores with thousands of orders—into Google Sheets for reporting or BI.

What Problem Does It Solve?
Shopify's GraphQL API only returns up to 250 orders per call, forcing you to manually manage cursors and loops. This template handles the "get next 250" logic for you, so you never miss an order.

What This Workflow Does
1. Schedule Trigger – Runs at your chosen cadence (daily, hourly, or manual).
2. Set Date Range – Defines startDay and endDay based on $now.
3. GraphQL Loop – Fetches orders 250 at a time, using pageInfo.hasNextPage and endCursor until complete.
4. Code Node – Flattens orders into line-item rows and summarizes by SKU/vendor.
5. Google Sheets – Appends results to your sheet for easy analysis.
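To make the Code node step concrete, here is a minimal sketch of the flattening logic, assuming the GraphQL node returns Shopify's usual `edges`/`node` shape and that your query requests `lineItems` with `sku`, `vendor`, `title`, and `quantity`; adjust the paths to whatever fields your query actually selects.

```javascript
// Sketch: flatten Shopify GraphQL order pages into one item per line item.
// Response paths assume the standard orders { edges { node { ... } } } shape.
const out = [];

for (const { json } of $input.all()) {
  const orderEdges = json.data?.orders?.edges || [];

  for (const { node: order } of orderEdges) {
    for (const { node: line } of order.lineItems?.edges || []) {
      out.push({
        json: {
          orderId: order.id,
          createdAt: order.createdAt,
          sku: line.sku,
          vendor: line.vendor,
          title: line.title,
          quantity: line.quantity,
        },
      });
    }
  }
}

return out;
```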
Live flight fare tracker with Aviation Stack API – alerts via Gmail & Telegram
This automated n8n workflow continuously tracks real-time flight fare changes by querying airline APIs (e.g., Amadeus, Skyscanner). It compares new prices with historical fares and sends instant notifications to users when a fare drop is detected. All tracked data is structured and logged for audit and analysis.

Key Insights
- Works post-booking to track price fluctuations for booked routes.
- Supports multiple fare sources for improved accuracy and comparison.
- Users are notified instantly via email, SMS, or Slack for high-value drops.
- Historical pricing data is stored for trend analysis and refund eligibility checks.
- Can be extended to monitor specific routes or apply airline-specific refund rules.

Workflow Process
1. Schedule Trigger – Initiates the fare check every 6 hours.
2. Fetch Flight Fare Data – Queries APIs (Amadeus, Skyscanner) for current flight fares.
3. Get Tracked Bookings – Retrieves tracked routes from the internal database.
4. Compare Fares – Detects price drops compared to original booking fares.
5. Update Fare History Table – Logs the new fare and timestamp into the fare_tracking table.
6. Classify Drops – Determines priority based on absolute and percentage savings.
7. Notify Users
   - Email Alerts: For all medium/high priority drops.
   - SMS Alerts: For savings > $100 or > 15%.
   - Slack Notifications: For internal alerts and rebooking suggestions.
8. Log Activity – Stores all sync actions and notifications in farealertlogs.

Usage Guide
1. Import the workflow into your n8n instance.
2. Set up API credentials for Amadeus and Skyscanner.
3. Configure email, SMS (Twilio), and Slack credentials.
4. Update the booking database with valid records (with route, fare, timestamp).
5. Set the schedule frequency (e.g., every 6 hours).
6. Review logs regularly to monitor fare alert activity and system health.

Prerequisites
Valid accounts and credentials for:
- Amadeus API
- Skyscanner API
- SendGrid (or SMTP) for email
- Twilio for SMS
- Slack workspace & bot token
- PostgreSQL or MySQL database for fare tracking
- Tracked booking dataset (with routes, fares, and user contacts)

Customization Options
- Adjust alert thresholds in the comparison logic (e.g., trigger only if fare drops > $50 or > 10%).
- Add new notification channels (e.g., WhatsApp, Telegram).
- Extend logic to track multi-leg or roundtrip fares.
- Integrate airline refund APIs (where supported) to auto-initiate refund or credit requests.

Excel Output Columns
When exporting or logging fare tracking data to Excel or CSV, use the following structure:

| flight_number | airline | departure | arrival | departure_time | arrival_time | current_fare | route | timestamp |
| ------------- | ------- | --------- | ------- | -------------- | ------------ | ------------ | ----- | --------- |
| AT5049 | Royal Air Maroc | John F Kennedy International | Los Angeles International | 2025-07-21T06:00:00+00:00 | 2025-07-21T08:59:00+00:00 | 235 | JFK-LAX | 2025-07-21T13:04:14.000Z |
| BA1905 | British Airways | John F Kennedy International | Los Angeles International | 2025-07-21T06:00:00+00:00 | 2025-07-21T08:59:00+00:00 | 479 | JFK-LAX | 2025-07-21T13:04:14.000Z |
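For orientation, the Classify Drops step could be expressed in a Code node roughly as sketched below, using the thresholds described above (SMS for savings over $100 or 15%, email for medium drops). The field names `original_fare` and `current_fare` are assumptions for illustration.

```javascript
// Sketch: classify fare drops by absolute and percentage savings.
// Field names on the incoming booking records are assumptions.
return $input.all().map(({ json: b }) => {
  const saving = b.original_fare - b.current_fare;
  const savingPct = (saving / b.original_fare) * 100;

  // High priority triggers SMS (> $100 or > 15%); medium triggers email only
  let priority = 'low';
  if (saving > 100 || savingPct > 15) priority = 'high';
  else if (saving > 50 || savingPct > 10) priority = 'medium';

  return {
    json: { ...b, saving, savingPct: Number(savingPct.toFixed(1)), priority },
  };
});
```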
Create images from text prompts using HeraNathalie model and Replicate
This workflow provides automated access to the Digitalhera Heranathalie AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for image generation tasks within your n8n automation workflows.

Overview
This workflow automatically handles the complete image generation process using the Digitalhera Heranathalie model. It manages API authentication, parameter configuration, request processing, and result retrieval with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Replicate API: Access to the Digitalhera/heranathalie AI model
- Digitalhera Heranathalie: The core AI model for image generation
- Built-in Error Handling: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- Specialized Processing: Handle specific AI tasks and workflows
- Custom Automation: Implement unique business logic and processing
- Data Processing: Transform and analyze various types of data
- AI Integration: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

Tags: n8n, automation, ai, replicate, aiautomation, workflow, nocode, aiprocessing, dataprocessing, machinelearning, artificialintelligence, aitools, digitalart, contentcreation, productivity, innovation
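For context, the request/poll cycle that the workflow automates against the Replicate API looks roughly like the sketch below. The model version hash and input fields are placeholders (check the digitalhera/heranathalie page on Replicate for the real values), and this is only an illustration of the API flow, not the workflow's exact nodes.

```javascript
// Sketch: create a Replicate prediction and poll until it finishes.
// Version hash and input fields are placeholders, not real values.
const REPLICATE_TOKEN = 'r8_...'; // from the 'Set API Token' node

let prediction = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.replicate.com/v1/predictions',
  headers: { Authorization: `Bearer ${REPLICATE_TOKEN}` },
  body: {
    version: 'MODEL_VERSION_HASH', // placeholder
    input: { prompt: 'a portrait photo, studio lighting' }, // placeholder
  },
  json: true,
});

// Poll until Replicate reports a terminal status
while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  prediction = await this.helpers.httpRequest({
    method: 'GET',
    url: `https://api.replicate.com/v1/predictions/${prediction.id}`,
    headers: { Authorization: `Bearer ${REPLICATE_TOKEN}` },
    json: true,
  });
}

return [{ json: prediction }];
```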
Extract business email addresses using Serper.dev and ScrapingBee from Google Sheets
Lead Enrichment & Email Discovery from Google Sheets

What this workflow does
This template automates the enrichment of business leads from a Google Sheet by:
- Triggering when a row is activated
- Searching for company information with Serper.dev
- Generating and validating potential contact pages
- Scraping company pages with ScrapingBee
- Extracting emails and updating the sheet
- Marking rows as finished

---

Prerequisites
- Google Sheet with columns: business type, city, state, activate
- Copy the ready-to-use template: Sheet Template
- Google Sheets API credentials (from Google Cloud)
- Serper.dev API key (free tier available)
- ScrapingBee API key (free tier available)

---

Inputs
- Google Sheet row: Must include business type, city, state, activate
- Set Information node: country, countrycode, language, resultcount (can also be provided via columns in the sheet)

---

Outputs
- Google Sheet update: Company names, URLs, found email addresses (comma-separated if multiple), and status updates (Running, Missing information, Finished)

---

Configuration Required
- Connect the Google Sheets node with your Google Cloud credentials
- Add your Serper.dev API key to the HTTP Request node
- Add your ScrapingBee API key to the scraping node
- Adjust search and filtering options as needed

---

How to customize the workflow
- Send country, countrycode, and resultcount from the sheet: Add these as columns in your sheet and update the workflow to read their values dynamically, making your search fully configurable per row.
- Add more blacklist terms: Update the Code node with additional company names or keywords you want to exclude from the search results.
- Extract more contact details: Modify the email extraction code to find other contact info (like phone numbers or social profiles) if needed.
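For the email extraction step, a Code node sketch along the lines below would pull addresses out of the scraped HTML and return them comma-separated, as described in the outputs. The `body` field and the blacklist entries are assumptions for illustration; adapt them to what ScrapingBee actually returns and to your own exclusion list.

```javascript
// Sketch: extract email addresses from scraped HTML and filter a blacklist.
// The incoming `body` field and blacklist domains are assumptions.
const EMAIL_RE = /[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,}/gi;
const BLACKLIST = ['example.com', 'sentry.io', 'wixpress.com']; // extend as needed

return $input.all().map(({ json }) => {
  const html = json.body || '';
  const emails = [...new Set(html.match(EMAIL_RE) || [])].filter(
    (email) => !BLACKLIST.some((domain) => email.toLowerCase().endsWith(`@${domain}`))
  );

  return { json: { ...json, emails: emails.join(', ') } };
});
```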
Find the best food deals automatically with Bright Data & n8n
Description
This workflow automatically scans food delivery platforms and restaurant websites to find the best deals and discounts. It helps you save money on meals by aggregating special offers and promotions in one place.

Overview
This workflow automates finding the best food deals and discounts from various websites. It uses Bright Data to scrape deal information and can be configured to send you notifications or save the deals to a spreadsheet.

Tools Used
- n8n: The automation platform that orchestrates the workflow.
- Bright Data: For scraping food deal websites without getting blocked.
- (Optional) Google Sheets/Discord/Telegram: To store or get notified about the deals.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications (Optional): Configure the nodes for Google Sheets, Discord, or Telegram if you want to receive notifications.
4. Customize: Specify the websites you want to scrape for deals.

Use Cases
- Foodies: Always be the first to know about the best restaurant deals in your city.
- Students: Save money by finding cheap eats and special offers.
- Families: Plan your meals around the best grocery and restaurant discounts.

---

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

Tags: n8n, automation, fooddeals, brightdata, webscraping, discounts, fooddiscounts, mealdeals, restaurantdeals, savemoney, foodoffers, n8nworkflow, workflow, nocode, foodtech, dealfinder, specialoffers, fooddelivery, budgetmeals, foodsavings, dealhunting, foodautomation, moneysaving, foodhacks, bestdeals, foodscraping
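If you save deals to a spreadsheet or send notifications, a small Code node like the sketch below could rank the scraped offers first. The deal fields (`title`, `restaurant`, `discount_percent`, `url`) are assumptions about what your Bright Data scraper returns, so adjust them to your actual output.

```javascript
// Sketch: keep only the best-discounted deals before notifying or saving.
// Field names are assumptions about the scraper's output.
const deals = $input.all().map((item) => item.json);

const top = deals
  .filter((d) => typeof d.discount_percent === 'number')
  .sort((a, b) => b.discount_percent - a.discount_percent)
  .slice(0, 10); // keep the ten best offers

return top.map((d) => ({
  json: {
    title: d.title,
    restaurant: d.restaurant,
    discount: `${d.discount_percent}%`,
    url: d.url,
  },
}));
```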