Find high-intent sales leads by scraping Glassdoor with Bright Data & GPT
🔍 Scrape Glassdoor with Bright Data

Designed for sales teams, recruiters, and marketers aiming to automate job discovery and prospecting. This workflow scrapes Glassdoor job listings using Bright Data and automatically generates targeted pitches using AI, streamlining lead identification and outreach.

---

🧩 How It Works

This automation leverages n8n, Bright Data, Google Sheets, and OpenAI:

Trigger
- Starts with a custom form input (Location, Keyword, Country).

Bright Data Job Scrape
- Triggers a Bright Data dataset snapshot via HTTP Request.
- Polls snapshot progress using a Wait node, ensuring data readiness.
- Retrieves the full job listings dataset once ready. (A hedged Python sketch of this trigger/poll/fetch pattern appears at the end of this description.)

Google Sheets Integration
- Writes detailed job data (company, role, location, overview, metrics) into a Google Sheet.
- Uses a pre-built template for organized data storage.

Automated Pitch Generation (AI)
- Splits listings into actionable parts: company name, title, and description.
- Sends data to OpenAI (via LangChain) to generate relevant pitches or icebreakers.
- Saves generated content back into the same sheet for easy access.

---

✅ Requirements

Ensure you have the following:

Google Sheets
- Google account
- Template Sheet with columns for job details and AI-generated pitches

Bright Data
- Active account with Dataset API access
- API key and dataset ID

OpenAI
- Valid OpenAI API key for GPT models

n8n Environment
- Nodes: HTTP Request, Wait, If, Google Sheets, Split Out, LangChain (OpenAI)
- Credentials: Google Sheets OAuth2, Bright Data API credentials, OpenAI API key

---

⚙️ Setup Instructions

Step 1: Prepare Google Sheets
- Copy the provided Google Sheets template.
- Do not change headers.

Step 2: Import & Configure Workflow in n8n
- Import the workflow JSON file.
- Set the Google Sheets node: link to your copied sheet and confirm the correct tab name.

Step 3: Configure Bright Data
- Replace <YOUR_BRIGHT_DATA_API_KEY> with your real key.
- Set your dataset ID in all HTTP Request nodes.

Step 4: Configure OpenAI (LangChain)
- Connect your OpenAI API key to the LangChain node.
- Customize the prompt to match your tone and outreach style.

Step 5: Testing & Scheduling
- Test via the manual form trigger.
- Schedule runs or leave the form enabled for on-demand use.

---

🧠 Tips & Best Practices
- Use specific keywords and locations for better results.
- Adjust polling intervals based on dataset size.
- Refine AI prompts regularly to improve pitch quality.
- Clean unused columns from your sheet to boost performance.

---

💬 Support & Feedback

For help or customization:
- 📧 Email: Yaron@nofluff.online
- 📺 YouTube: @YaronBeen
- 🔗 LinkedIn: linkedin.com/in/yaronbeen
- 📚 Bright Data Docs: docs.brightdata.com/introduction
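---

🧪 Reference: snapshot trigger & polling (sketch)

For readers who want to see what the HTTP Request, Wait, and If nodes do behind the scenes, here is a minimal Python sketch of the same trigger, poll, and fetch cycle. It is not part of the workflow: the endpoint paths follow Bright Data's public Dataset API documentation, while the dataset ID, request payload, and status value are assumptions to verify against docs.brightdata.com and your own account.

```python
# Minimal sketch of the trigger / poll / fetch pattern the workflow's HTTP Request,
# Wait and If nodes implement. Verify endpoints, payload fields and status values
# against Bright Data's docs before relying on this.
import time
import requests

API_KEY = "YOUR_BRIGHT_DATA_API_KEY"   # placeholder
DATASET_ID = "gd_xxxxxxxxxxxx"         # hypothetical Glassdoor dataset ID
HEADERS = {"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"}

# 1) Trigger a snapshot for the keyword/location collected by the n8n form.
#    Discovery-type datasets may need extra query parameters; see the docs.
trigger = requests.post(
    "https://api.brightdata.com/datasets/v3/trigger",
    params={"dataset_id": DATASET_ID},
    headers=HEADERS,
    json=[{"keyword": "data engineer", "location": "Berlin", "country": "DE"}],
)
snapshot_id = trigger.json()["snapshot_id"]

# 2) Poll until the snapshot is ready (the Wait node's job in the workflow).
while True:
    progress = requests.get(
        f"https://api.brightdata.com/datasets/v3/progress/{snapshot_id}",
        headers=HEADERS,
    ).json()
    if progress.get("status") == "ready":  # confirm the exact status string in the docs
        break
    time.sleep(30)  # adjust the interval to the dataset size, as noted in the tips

# 3) Download the finished job-listings dataset.
rows = requests.get(
    f"https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}",
    params={"format": "json"},
    headers=HEADERS,
).json()
print(f"Fetched {len(rows)} job listings")
```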
Automated email classification & response system with Groq AI and Pinecone
📝 Description

This workflow helps automatically classify incoming emails using a combination of conditional logic and minimal AI-based classification. The system checks email content, performs sentiment analysis, uses OpenAI for categorization, and routes emails accordingly, with smart but efficient use of LLMs and AI Agents.

⚙️ How it works

- Trigger: An IMAP Email Trigger initiates the workflow upon receiving a new email.
- Code Block: Parses essential data from the email.
- Switch Node: Routes emails based on classification.
- LLM Chain: Processes specific email cases (e.g., inquiries or complaints).
- AI Agent (Minimal): Used only when other methods cannot determine intent.
- Email Responses: Sends tailored replies or routes to support/sales teams accordingly.
- Sentiment Analysis: Assists with tone evaluation for better response routing.

(A hedged Python sketch of the classify-and-route step follows the setup steps.)

🧩 Set up steps

Estimated setup time: 10–15 minutes

You’ll need:
- An IMAP-compatible email account
- OpenAI or any compatible LLM provider
- Pinecone (optional, for vector memory)
- SMTP credentials for sending email

Replace placeholder credentials in sticky notes before running.
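🧪 Reference sketch: classify and route

The sketch below mirrors what the LLM Chain and Switch nodes do inside the workflow, using Groq's OpenAI-compatible endpoint (the provider named in the title). The model name, category labels, and routing targets are illustrative assumptions, not values taken from the workflow.

```python
# Hedged sketch of the classify-and-route step performed by the LLM Chain + Switch
# nodes. Model name, category labels and routing targets below are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible API
    api_key="YOUR_GROQ_API_KEY",                # placeholder
)

CATEGORIES = ["inquiry", "complaint", "sales", "other"]  # assumed labels


def classify_email(subject: str, body: str) -> str:
    """Ask the LLM for a single category label for the email."""
    resp = client.chat.completions.create(
        model="llama-3.1-8b-instant",  # any Groq-hosted model; adjust as needed
        messages=[
            {
                "role": "system",
                "content": f"Classify the email into exactly one of: {', '.join(CATEGORIES)}. "
                           "Reply with the label only.",
            },
            {"role": "user", "content": f"Subject: {subject}\n\n{body}"},
        ],
        temperature=0,
    )
    label = resp.choices[0].message.content.strip().lower()
    return label if label in CATEGORIES else "other"


def route(label: str) -> str:
    """Mirror of the Switch node: map a label to a mailbox/team (hypothetical addresses)."""
    return {
        "inquiry": "sales@example.com",
        "complaint": "support@example.com",
        "sales": "sales@example.com",
    }.get(label, "inbox@example.com")


label = classify_email("Order #123 arrived damaged", "The casing was cracked on delivery...")
print(label, "->", route(label))
```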
Deploy Docker MinIO, API backend for WHMCS/WISECP
Setting up n8n workflow

Overview

The Docker MinIO WHMCS module uses a specially designed workflow for n8n to automate deployment processes. The workflow provides an API interface for the module, receives specific commands, and connects via SSH to a server with Docker installed to perform predefined actions.

Prerequisites

You must have your own n8n server. Alternatively, you can use the official n8n cloud installations available at: n8n Official Site

Installation Steps

Install the Required Workflow on n8n

You have two options:

Option 1: Use the Latest Version from the n8n Marketplace
The latest workflow templates for our modules are available on the official n8n marketplace. Visit our profile to access all available templates: PUQcloud on n8n

Option 2: Manual Installation
Each module version comes with a workflow template file.
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974273723.png)
You need to manually import this template into your n8n server.
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741284912356.png)

n8n Workflow API Backend Setup for WHMCS/WISECP

Configure API Webhook and SSH Access

Create a Basic Auth Credential for the Webhook API Block in n8n.
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974396480.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974500641.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741285036996.png)

Create an SSH Credential for accessing a server with Docker installed.
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741285118412.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741285147192.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741285198822.png)

Modify Template Parameters

In the Parameters block of the template, update the following settings:
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974559641.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741285412110.png)

- server_domain – Must match the domain of the WHMCS/WISECP Docker server.
- clients_dir – Directory where user data related to Docker and disks will be stored.
- mount_dir – Default mount point for the container disk (recommended not to change).

Do not modify the following technical parameters:
- screen_left
- screen_right

Deploy-docker-compose

In the Deploy-docker-compose element, you have the ability to modify the Docker Compose configuration, which will be generated in the following scenarios:
- When the service is created
- When the service is unlocked
- When the service is updated

[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741875704524.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974602887.png)

nginx

In the nginx element, you can modify the configuration parameters of the web interface proxy server.

The main section allows you to add custom parameters to the server block in the proxy server configuration file.

The main_location section contains settings that will be added to the location / block of the proxy server configuration. Here, you can define custom headers and other parameters specific to the root location.

[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741875960357.png)
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741974633761.png)

Bash Scripts

Management of Docker containers and all related procedures on the server is carried out by executing Bash scripts generated in n8n. These scripts return either a JSON response or a string.

All scripts are located in elements directly connected to the SSH element. You have full control over any script and can modify or execute it as needed.
[](https://doc.puq.info/uploads/images/gallery/2025-03/image-1741876353319.png)
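Example webhook call

As a rough illustration of how the WHMCS/WISECP module talks to this workflow, the following Python snippet posts a command to the Basic Auth-protected webhook. The webhook path, command name, and payload fields are hypothetical placeholders; only the Basic Auth requirement and the JSON-or-string response format come from this guide.

```python
# Hypothetical example of the kind of request the WHMCS/WISECP module sends to the
# workflow's webhook. URL path, command name and payload fields are illustrative
# assumptions; only the Basic Auth requirement comes from this guide.
import requests
from requests.auth import HTTPBasicAuth

N8N_WEBHOOK_URL = "https://n8n.example.com/webhook/docker-minio"  # hypothetical path
AUTH = HTTPBasicAuth("api-user", "api-password")  # the Basic Auth credential created above

payload = {
    "command": "create",                # hypothetical; the module defines the real command set
    "domain": "client1.example.com",    # should correspond to server_domain in the Parameters block
    "disk_gb": 10,                      # illustrative service option
}

resp = requests.post(N8N_WEBHOOK_URL, json=payload, auth=AUTH, timeout=60)
resp.raise_for_status()
print(resp.text)  # the workflow's Bash scripts return JSON or a plain string
```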