Automate hotel price comparison with multi-platform scraping and email reporting
This is a production-ready, end-to-end workflow that automatically compares hotel prices across multiple booking platforms and delivers beautiful email reports to users. Unlike basic building blocks, this workflow is a complete solution ready to deploy.
What Makes This Production-Ready
Complete End-to-End Automation
- Input: Natural language queries via webhook
- Processing: Multi-platform scraping & comparison
- Output: Professional email reports + analytics
- Feedback: Real-time webhook responses
Advanced Features
- Natural language processing for flexible queries
- Parallel scraping from multiple platforms
- Analytics tracking with Google Sheets integration
- Beautiful HTML email reports
- Error handling and graceful degradation
- Webhook responses for real-time feedback
Business Value
- For Travel Agencies: Instant price comparison service for clients
- For Hotels: Competitive pricing intelligence
- For Travelers: Save time and money with automated research
Setup Instructions
Step 1: Import Workflow
- Copy the workflow JSON from the artifact
- In n8n, go to Workflows → Import from File/URL
- Paste the JSON and click Import
Step 2: Configure Credentials
A. SMTP Email (Required)
Settings → Credentials → Add Credential → SMTP
Host: smtp.gmail.com (for Gmail)
Port: 587
User: your-email@gmail.com
Password: your-app-password (not regular password!)
Gmail Setup:
- Enable 2FA on your Google Account
- Generate App Password: https://myaccount.google.com/apppasswords
- Use the generated password in n8n
B. Google Sheets (Optional - for analytics)
Settings → Credentials → Add Credential → Google Sheets OAuth2
Follow the OAuth flow to connect your Google account
Sheet Setup:
- Create a new Google Sheet
- Name the first sheet "Analytics"
- Add headers:
  timestamp,query,hotel,city,checkIn,checkOut,bestPrice,platform,totalResults,userEmail
- Copy the Sheet ID from the URL and paste it into the "Save to Google Sheets" node
Step 3: Set Up Scraping Service
You need to create a scraping API that the workflow calls. Here are your options:
Option A: Use Your Existing Python Script
Create a simple Flask API wrapper:
# api_wrapper.py
from flask import Flask, request, jsonify
import subprocess
import json

app = Flask(__name__)

@app.route('/scrape/<platform>', methods=['POST'])
def scrape(platform):
    data = request.json
    query = f"{data['checkIn']} to {data['checkOut']}, {data['hotel']}, {data['city']}"
    try:
        result = subprocess.run(
            ['python3', 'price_scrap_2.py', query, platform],
            capture_output=True,
            text=True,
            timeout=30
        )
        # price_scrap_2.py is assumed to print a JSON object with at least
        # "price" and "url" keys -- adapt the parsing to your script's output
        output = json.loads(result.stdout)
        return jsonify({
            'price': output.get('price'),
            'currency': output.get('currency', 'USD'),
            'roomType': output.get('roomType', 'Standard Room'),
            'url': output.get('url'),
            'availability': output.get('availability', True)
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
Deploy:
pip install flask
python api_wrapper.py
Update n8n HTTP Request nodes:
URL: http://your-server-ip:5000/scrape/booking
URL: http://your-server-ip:5000/scrape/agoda
URL: http://your-server-ip:5000/scrape/expedia
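Each of these endpoints expects a JSON body carrying the fields the wrapper reads. A minimal example matching the Flask sketch above (values are placeholders):
{
  "hotel": "Marriott Hotel",
  "city": "Singapore",
  "checkIn": "2026-03-15",
  "checkOut": "2026-03-18"
}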
Option B: Use Third-Party Scraping Services
Recommended Services:
- ScraperAPI (scraperapi.com) - $49/month for 100k requests
- Bright Data (brightdata.com) - Pay as you go
- Apify (apify.com) - Has pre-built hotel scrapers
Example with ScraperAPI:
// In HTTP Request node
URL: http://api.scraperapi.com
Query Parameters:
api_key: YOUR_API_KEY
url: https://booking.com/search?hotel={{$json.hotelName}}...
Option C: Use n8n SSH Node (Like Your Original)
Keep your SSH approach but improve it:
- Replace HTTP Request nodes with SSH nodes
- Point to your server with the Python script
- Ensure error handling and timeouts
// SSH Node Configuration
Host: your-server-ip
Command: python3 /path/to/price_scrap_2.py "{{$json.hotelName}}" "{{$json.city}}" "{{$json.checkInISO}}" "{{$json.checkOutISO}}" "booking"
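The SSH node returns the command's output as text, so add a Code node after it to turn stdout into structured JSON. A minimal sketch, assuming the Python script prints a single JSON object to stdout:
// Code node after the SSH node -- parse the scraper's stdout
const raw = ($json.stdout || '').trim();
let parsed;
try {
  parsed = JSON.parse(raw);
} catch (err) {
  // Surface scraper failures instead of crashing the workflow
  return [{ json: { error: `Could not parse scraper output: ${err.message}`, platform: 'booking' } }];
}
return [{ json: { ...parsed, platform: 'booking' } }];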
Step 4: Activate Webhook
- Click on "Webhook - Receive Request" node
- Click "Listen for Test Event"
- Copy the webhook URL (e.g., https://your-n8n.com/webhook/hotel-price-check)
- Test with this curl command:
curl -X POST https://your-n8n.com/webhook/hotel-price-check \
-H "Content-Type: application/json" \
-d '{
"message": "I want to check Marriott Hotel in Singapore from 15th March to 18th March",
"email": "user@example.com",
"name": "John Doe"
}'
Step 5: Activate Workflow
- Toggle the workflow to Active
- The webhook is now live and ready to receive requests
Usage Examples
Example 1: Basic Query
{
"message": "Hilton Hotel in Dubai from 20th December to 23rd December",
"email": "traveler@email.com",
"name": "Sarah"
}
Example 2: Flexible Format
{
"message": "I need prices for Taj Hotel, Mumbai. Check-in: 5th January, Check-out: 8th January",
"email": "customer@email.com"
}
Example 3: Short Format
{
"message": "Hyatt Singapore March 10 to March 13",
"email": "user@email.com"
}
Customization Options
1. Add More Booking Platforms
Steps:
- Duplicate an existing "Scrape" node
- Update the platform parameter
- Connect it to "Aggregate & Compare"
- Update the aggregation logic to include the new platform
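A minimal sketch of what the updated "Aggregate & Compare" Code node might look like, assuming each scrape branch outputs price and platform fields, plus an error field on failure (adapt to your actual node output):
// Collect one result per platform, skipping failed branches
const results = $input.all()
  .map(item => item.json)
  .filter(r => r && !r.error && typeof r.price === 'number');

if (results.length === 0) {
  return [{ json: { status: 'no_results' } }];
}

// Pick the cheapest offer across all platforms
const bestDeal = results.reduce((best, r) => (r.price < best.price ? r : best));

return [{ json: { status: 'ok', bestDeal, allResults: results, totalResults: results.length } }];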
2. Change Email Template
Edit the "Format Email Report" node's JavaScript:
- Modify HTML structure
- Change colors (currently purple gradient)
- Add your company logo
- Include terms and conditions
3. Add SMS Notifications
Using Twilio:
- Add new node: Twilio β Send SMS
- Connect after "Aggregate & Compare"
- Format: "Best deal: ${hotel} at ${platform} for ${price}"
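If you'd rather build the message in a Code node placed before the Twilio node, a minimal sketch (field names follow the aggregation sketch earlier and are assumptions):
// Code node before Twilio -> Send SMS
const { bestDeal, hotelName } = $json;
const sms = `Best deal: ${hotelName} at ${bestDeal.platform} for ${bestDeal.currency || 'USD'} ${bestDeal.price}`;
return [{ json: { ...$json, sms } }];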
4. Add Slack Integration
- Add Slack node after "Aggregate & Compare"
- Send to #travel-deals channel
- Include quick booking links
5. Implement Caching
Add Redis, or use n8n's workflow static data (note: static data persists only while the workflow is active, not during manual test runs):
// In a Code node before scraping -- check the cache
const staticData = $getWorkflowStaticData('global');
const cacheKey = `${$json.hotelName}-${$json.city}-${$json.checkInISO}-${$json.checkOutISO}`;
const cached = staticData[cacheKey];
if (cached && Date.now() - cached.timestamp < 3600000) { // 1-hour cache
  return [{ json: cached.data }];
}
return [{ json: $json }];
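After a successful scrape, write the fresh result back under the same key so the next identical query can skip scraping. A sketch using the same static-data approach:
// In a Code node after "Aggregate & Compare" -- write to the cache
const staticData = $getWorkflowStaticData('global');
const cacheKey = `${$json.hotelName}-${$json.city}-${$json.checkInISO}-${$json.checkOutISO}`;
staticData[cacheKey] = { timestamp: Date.now(), data: $json };
return [{ json: $json }];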
Analytics & Monitoring
Google Sheets Dashboard
The workflow automatically logs to Google Sheets. Create a dashboard with:
Metrics to track:
- Total searches per day/week
- Most searched hotels
- Most searched cities
- Average price ranges
- Platform with best prices (frequency)
- User engagement (repeat users)
Example Sheet Formulas:
// Total searches today (timestamps include a time component, so match a date range)
=COUNTIFS(A:A, ">="&TODAY(), A:A, "<"&TODAY()+1)
// Most popular hotel
=INDEX(C:C, MODE(MATCH(C:C, C:C, 0)))
// Average best price
=AVERAGE(G:G)
Set Up Alerts
Add a node after "Aggregate & Compare":
// Alert if even the best available price is unusually high
// historicalAvg is an assumed baseline -- e.g., read it from your Google Sheets log
const { bestDeal, hotelName } = $json;
const historicalAvg = 200;
if (bestDeal.price > historicalAvg * 1.5) {
  // Send alert to admin
  return [{
    json: {
      alert: true,
      message: `High prices detected for ${hotelName}`
    }
  }];
}
return [];
Error Handling
The workflow includes comprehensive error handling:
1. Missing Information
If the user doesn't provide hotel/city/dates → responds with a helpful prompt
2. Scraping Failures
If all platforms fail → sends a "No results" email with suggestions
3. Partial Results
If some platforms work → shows available results + notes errors
4. Email Delivery Issues
Uses continueOnFail: true to prevent workflow crashes
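With continueOnFail enabled, a failed node still emits an item containing an error field, so downstream code can build the "notes errors" text for the email. A minimal sketch, assuming that error-item shape:
// Build a note about failed platforms for the email report
const items = $input.all().map(i => i.json);
const failed = items.filter(r => r.error).map(r => r.platform || 'unknown');
const errorNote = failed.length
  ? `Note: no results from ${failed.join(', ')} (temporarily unavailable).`
  : '';
return [{ json: { ...$json, errorNote } }];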
Security Best Practices
1. Rate Limiting
Add rate limiting to prevent abuse. n8n has no built-in $cache helper, so this sketch uses workflow static data (the key layout is an assumption; swap in Redis for multi-instance deployments):
// In the Parse & Validate Code node
const staticData = $getWorkflowStaticData('global');
const userEmail = $json.email;
const oneHourAgo = Date.now() - 3600000;

// Keep only this user's searches from the last hour
const recent = (staticData[`searches:${userEmail}`] || []).filter(t => t > oneHourAgo);
if (recent.length >= 10) {
  return [{
    json: {
      status: 'rate_limited',
      response: 'Too many requests. Please try again in 1 hour.'
    }
  }];
}
staticData[`searches:${userEmail}`] = [...recent, Date.now()];
return [{ json: $json }];
2. Input Validation
Already implemented - validates hotel names, cities, dates
3. Email Verification
Add email verification before first use. Code nodes can't send mail directly ($sendEmail doesn't exist), so generate the code here and route the item into a Send Email node:
// In a Code node -- generate a one-time verification code
const code = Math.random().toString(36).substring(2, 8);
return [{
  json: {
    to: $json.email,
    subject: 'Verify your email',
    body: `Your code: ${code}`,
    verificationCode: code
  }
}];
4. API Key Protection
Never expose scraping API keys in responses or logs
Deployment Options
Option 1: n8n Cloud (Easiest)
- Sign up at n8n.cloud
- Import workflow
- Configure credentials
- Activate
Pros: No maintenance, automatic updates
Cons: Monthly cost
Option 2: Self-Hosted (Most Control)
# Using Docker
docker run -it --rm \
--name n8n \
-p 5678:5678 \
-v ~/.n8n:/home/node/.n8n \
n8nio/n8n
# Using npm
npm install -g n8n
n8n start
Pros: Free, full control
Cons: You manage updates
Option 3: Cloud Platforms
- Railway.app (recommended for beginners)
- DigitalOcean App Platform
- AWS ECS
- Google Cloud Run
Scaling Recommendations
For < 100 searches/day
- Current setup is perfect
- Use n8n Cloud Starter or small VPS
For 100-1000 searches/day
- Add Redis caching (1-hour cache)
- Use queue system for scraping
- Upgrade to n8n Cloud Pro
For 1000+ searches/day
- Implement job queue (Bull/Redis)
- Use dedicated scraping service
- Load balance multiple n8n instances
- Consider microservices architecture
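For the job-queue tier, a minimal Bull sketch running alongside n8n (queue name and job fields are assumptions; the worker would call your scraping service and POST results back to an n8n webhook):
// queue.js -- requires: npm install bull
const Queue = require('bull');
const scrapeQueue = new Queue('hotel-scrapes', 'redis://127.0.0.1:6379');

// Producer: enqueue one job per platform for each search
scrapeQueue.add({ hotel: 'Marriott Hotel', city: 'Singapore', platform: 'booking' });

// Worker: process jobs with limited concurrency to respect scraping rate limits
scrapeQueue.process(5, async (job) => {
  const { hotel, city, platform } = job.data;
  // Call your scraping service here and return the result
  return { hotel, city, platform };
});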
Troubleshooting
Issue: Webhook not responding
Solution:
- Check workflow is Active
- Verify webhook URL is correct
- Check n8n logs: Settings → Log Streaming
Issue: No prices returned
Solution:
- Test scraping endpoints individually
- Check if hotel name matches exactly
- Verify the dates are in the future
- Try different date ranges
Issue: Emails not sending
Solution:
- Verify SMTP credentials
- Check "less secure apps" setting (Gmail)
- Use App Password instead of regular password
- Check spam folder
Issue: Slow response times
Solution:
- Enable parallel scraping (already configured)
- Add timeout limits (30 seconds recommended)
- Implement caching
- Use faster scraping service
n8n Workflow: Automate Hotel Price Comparison and Email Reporting
This n8n workflow provides a robust solution for automating the process of comparing hotel prices across various platforms and generating a consolidated report via email. It's designed to streamline market research, competitive analysis, or personal travel planning by fetching data, processing it, and delivering actionable insights directly to your inbox.
What it does
This workflow automates the following steps:
- Receives a Trigger: The workflow is initiated by an HTTP request to its Webhook node, which can be sent on a schedule or in response to an event from another system.
- Scrapes Hotel Data: It sends an HTTP request to an external API or web scraping service to gather hotel pricing and availability information.
- Processes Data: A "Code" node is used to process the raw data received from the scraping step. This likely involves parsing the JSON/HTML response, extracting relevant details (hotel name, price, platform, dates, etc.), and potentially standardizing the data format.
- Transforms Data for Reporting: An "Edit Fields (Set)" node prepares the extracted data into a structured format suitable for reporting, ensuring consistency and clarity for the subsequent steps.
- Stores Data in Google Sheets (Conditional): An "If" node checks a condition on the processed data.
- If True: The data is appended to a Google Sheet, allowing for historical tracking and further analysis.
- If False: The workflow proceeds without writing to Google Sheets.
- Sends Email Report: An email is composed and sent, containing the aggregated and processed hotel price comparison data. This report provides a quick overview of the findings.
- Responds to Webhook: Finally, the workflow sends a response back to the initial webhook trigger, indicating successful execution.
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running instance of n8n.
- Webhook Endpoint: An external system or scheduler configured to send HTTP requests to the n8n Webhook trigger.
- Web Scraping Service/API: Access to a web scraping service or API (e.g., a custom scraper, Bright Data, ScrapingBee, etc.) to fetch hotel data. This is configured within the "HTTP Request" node.
- Google Account: A Google account with access to Google Sheets if you intend to store the data. You'll need to set up Google Sheets credentials in n8n.
- SMTP Credentials: SMTP server details or an email service provider (e.g., Gmail, SendGrid) to send emails. You'll need to set up Send Email credentials in n8n.
Setup/Usage
- Import the Workflow:
- Download the provided JSON file.
- In your n8n instance, click "New" in the workflows sidebar, then "Import from JSON".
- Paste the JSON content or upload the file.
- Configure Credentials:
- Google Sheets: If you plan to use Google Sheets, set up your Google Sheets OAuth2 or Service Account credentials in n8n.
- Send Email: Configure your SMTP or email service credentials in n8n.
- Configure Nodes:
- Webhook: The "Webhook" node will automatically generate a unique URL. Copy this URL and configure your external system to send requests to it.
- HTTP Request:
  - Update the URL field with the endpoint of your web scraping service or hotel API.
  - Adjust the Headers and Body fields as needed to pass parameters like hotel names, locations, check-in/check-out dates, and API keys.
- Code: Review and modify the JavaScript code within this node to correctly parse the response from your specific web scraping service and extract the desired hotel data (see the parsing sketch after this list).
- Edit Fields (Set): Adjust the fields to be set or renamed based on the output of your "Code" node and the desired structure for your report.
- If: Modify the condition in the "If" node if you have specific criteria for when to write data to Google Sheets (e.g., only if prices are below a certain threshold).
- Google Sheets:
  - Select your Google Sheets credential.
  - Specify the Spreadsheet ID and Sheet Name where you want to store the data.
  - Ensure the Operation is set to "Append Row" and map the input fields correctly to your sheet columns.
- Send Email:
  - Select your email credential.
  - Set the From Email, To Email, Subject, and Body of the email. You can use expressions to dynamically include data from previous nodes in the email body (e.g., {{ $json.hotelName }} or a summary table).
- Activate the Workflow: Once configured, activate the workflow by toggling the "Active" switch in the top right corner of the n8n editor.
- Test: Trigger the webhook from your external system to test the workflow end-to-end. Monitor the execution in n8n to ensure data is processed, stored, and emailed as expected.
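For the Code node mentioned above, a minimal parsing sketch, assuming the scraping service returns a JSON array of offers with name, price, and platform fields (adapt the field names to your service's actual response):
// Code node: normalize raw scraper output into one n8n item per offer
const raw = $json.body ?? $json; // the HTTP Request output shape varies by service
const offers = Array.isArray(raw) ? raw : (raw.offers ?? []);
return offers.map(o => ({
  json: {
    hotelName: o.name,
    price: Number(o.price),
    platform: o.platform,
    checkedAt: new Date().toISOString()
  }
}));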