Artur

Software Engineer & Automations Specialist

Total Views: 13,537
Templates: 5

Templates by Artur

Automated Upwork job alerts with MongoDB & Slack

Overview
This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via MongoDB, and sends new job opportunities to Slack.

Key Features:
- Automated job retrieval from Upwork via the Apify API
- Duplicate filtering using MongoDB to store only unique jobs
- Slack notifications for new job postings
- Runs every 20 minutes during working hours (9 AM - 5 PM)

This workflow requires an active Apify subscription, as it uses the Apify Upwork API to fetch job listings.

Who is This For?
This workflow is ideal for:
- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

What Problem Does This Solve?
Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:
- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

How the Workflow Works
1. Schedule Trigger (every 20 minutes): triggers the workflow at 20-minute intervals and ensures job searches run only during working hours (9 AM - 5 PM).
2. Query Upwork for jobs: uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python").
3. Find existing jobs in MongoDB: searches MongoDB to check whether a job (based on title and budget) already exists.
4. Filter out duplicate jobs: the Merge node compares Upwork jobs with MongoDB data, and the IF node filters out jobs that are already stored in the database (see the sketch after this description).
5. Save only new jobs in MongoDB: the Insert node adds only new job listings to the MongoDB collection.
6. Send a Slack notification: if a new job is found, a Slack message is sent with the job details.

Setup Guide
Required API keys:
- Upwork scraper (Apify token) – get your token from Apify
- MongoDB credentials – set up MongoDB in n8n using your connection string
- Slack API token – connect Slack to n8n and set the channel ID (default: general)

Configuration steps:
- Modify search keywords in the 'Assign Parameters' node (startUrls)
- Adjust the working hours in the 'If Working Hours' node
- Set your Slack channel in the Slack node
- Ensure MongoDB is connected properly
- Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates around the clock

How to Customize the Workflow
- Change keywords: update startUrls in the 'Assign Parameters' node to track different job categories
- Change 'If Working Hours': modify the conditions in the IF node to filter times based on your needs
- Modify Slack notifications: adjust the Slack message format to include additional job details

Why Use This Workflow?
- Automated job tracking without manual searches
- Prevents duplicate entries in MongoDB
- Instant Slack notifications for new job opportunities
- Customizable – adapt the workflow to different job categories

Next Steps
- Run the workflow and test with a small set of keywords
- Expand job categories for better coverage
- Enhance notifications by integrating Telegram, email, or a dashboard

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
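The duplicate filter is implemented with n8n's Merge and IF nodes, but the underlying comparison is simple. Below is a minimal TypeScript sketch of the same idea, assuming a hypothetical UpworkJob shape keyed by title and budget; it is an illustration only, not the template's own code.

```typescript
// Minimal sketch of the duplicate filter, assuming a hypothetical UpworkJob shape.
// The template does this with n8n Merge + IF nodes; this only illustrates the logic.
interface UpworkJob {
  title: string;
  budget: string;
  link: string;
}

// Build a dedupe key from the fields the workflow compares (title + budget).
function dedupeKey(job: UpworkJob): string {
  return `${job.title}::${job.budget}`;
}

// Keep only jobs whose key is not already present among the MongoDB results.
function filterNewJobs(fetched: UpworkJob[], existing: UpworkJob[]): UpworkJob[] {
  const seen = new Set(existing.map(dedupeKey));
  return fetched.filter((job) => !seen.has(dedupeKey(job)));
}

// Example usage with made-up data:
const fresh = filterNewJobs(
  [{ title: "n8n automation expert", budget: "$500", link: "https://upwork.com/job/1" }],
  []
);
console.log(fresh.length); // 1 new job to insert into MongoDB and announce in Slack
```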

By Artur
5080

Remove personally identifiable information (PII) from CSV files with OpenAI

What this workflow does
- Monitors Google Drive: the workflow triggers whenever a new CSV file is uploaded.
- Uses AI to identify PII columns: the OpenAI node analyzes the data and identifies columns containing PII (e.g., name, email, phone).
- Removes PII: the workflow filters those columns out of the dataset (see the sketch after this description).
- Uploads the cleaned file: the sanitized file is renamed and re-uploaded to Google Drive, leaving the original data intact.

How to customize this workflow to your needs
- Adjust PII identification: modify the prompt in the OpenAI node to align with your specific data compliance requirements.
- Include/exclude file types: adjust the Google Drive Trigger settings to monitor specific file types (e.g., CSV only).
- Output destination: change the Google Drive folder where the sanitized file is uploaded.

Setup
Prerequisites:
- A Google Drive account
- An OpenAI API key

Workflow configuration:
- Configure the Google Drive Trigger to monitor a folder for new files.
- Configure the OpenAI node to connect with your API key.
- Set the Google Drive upload folder to a different location than the trigger folder to prevent workflow loops.
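The column-removal step can be pictured as a small transformation over the parsed CSV rows. Below is a minimal TypeScript sketch, assuming the OpenAI node has already returned a list of PII column names; the helper name and data are hypothetical, not the template's own code.

```typescript
// Minimal sketch of the PII-column removal step, assuming the model returned
// a list of column names such as ["name", "email", "phone"]. Hypothetical helper.
function removePiiColumns(
  rows: Record<string, string>[],
  piiColumns: string[]
): Record<string, string>[] {
  const blocked = new Set(piiColumns.map((c) => c.toLowerCase()));
  return rows.map((row) =>
    Object.fromEntries(
      Object.entries(row).filter(([column]) => !blocked.has(column.toLowerCase()))
    )
  );
}

// Example: only the non-PII column survives.
const cleaned = removePiiColumns(
  [{ name: "Jane Doe", email: "jane@example.com", plan: "pro" }],
  ["name", "email"]
);
console.log(cleaned); // [ { plan: "pro" } ]
```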

By Artur
1739

Automated Upwork job alerts with Airtable & Slack

Overview
This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via Airtable, and sends new job opportunities to Slack.

Key Features:
- Automated job retrieval from Upwork via the Apify API
- Duplicate filtering using Airtable to store only unique jobs
- Slack notifications for new job postings
- Runs every 30 minutes during working hours (9 AM - 5 PM)

This workflow requires an active Apify subscription, as it uses the Apify Upwork API to fetch job listings.

Who is This For?
This workflow is ideal for:
- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

What Problem Does This Solve?
Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:
- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

How the Workflow Works
1. Schedule Trigger: triggers the workflow at regular intervals and ensures job searches run only during working hours (9 AM - 5 PM; see the sketch after this description).
2. Query Upwork for jobs: uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python").
3. Find existing jobs in Airtable: searches Airtable to check whether a job (based on title and link) already exists.
4. Filter out duplicate jobs: the Merge node compares Upwork jobs with Airtable data, and the IF node filters out jobs that are already stored.
5. Save only new jobs in Airtable: the Insert node adds only new job listings to the Airtable table.
6. Send a Slack notification: if a new job is found, a Slack message is sent with the job details.

Setup Guide
Required API keys:
- Upwork scraper (Apify token) – get your token from Apify
- Airtable credentials
- Slack API token – connect Slack to n8n and set the channel ID (default: general)

Configuration steps:
- Modify search keywords in the 'Assign Parameters' node (startUrls)
- Adjust the working hours in the 'If Working Hours' node
- Set your Slack channel in the Slack node
- Ensure Airtable is connected properly; you'll need to create a table with 'title' and 'link' columns
- Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates around the clock

How to Customize the Workflow
- Change keywords: update startUrls in the 'Assign Parameters' node to track different job categories
- Change 'If Working Hours': modify the conditions in the IF node to filter times based on your needs
- Modify Slack notifications: adjust the Slack message format to include additional job details

Why Use This Workflow?
- Automated job tracking without manual searches
- Prevents duplicate entries in Airtable
- Instant Slack notifications for new job opportunities
- Customizable – adapt the workflow to different job categories

Next Steps
- Run the workflow and test with a small set of keywords
- Expand job categories for better coverage
- Enhance notifications by integrating Telegram, email, or a dashboard

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
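The working-hours gate is just a time condition in the 'If Working Hours' node. Below is a minimal TypeScript sketch of that condition, assuming a 9 AM - 5 PM window in the server's local timezone; it only illustrates the check, it is not the node's configuration.

```typescript
// Minimal sketch of the 'If Working Hours' check, assuming a 9 AM - 5 PM window
// in the server's local timezone. The template uses an n8n IF node for this.
function isWithinWorkingHours(now: Date = new Date()): boolean {
  const hour = now.getHours();
  return hour >= 9 && hour < 17; // 9:00 inclusive, 17:00 exclusive
}

// Example: skip the Apify query and Slack post outside the window.
if (isWithinWorkingHours()) {
  console.log("Within working hours - fetch Upwork jobs and notify Slack");
} else {
  console.log("Outside working hours - skip this run");
}
```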

By Artur
196

Connect Pipedrive deal outcomes to GA4 & Google Ads via Measurement Protocol

Who's it for
Problem: your ads and GA4 often optimize toward shallow web events (form fills), while the real value sits in Pipedrive (Qualified, Closed Won). That gap means bidding chases cheap leads instead of revenue.
Solution: this template turns Pipedrive deal milestones into server-side GA4 events (via the Measurement Protocol), matched to the original visitor by client_id, with no external database. You mark them as Key events in GA4 and, when ready, import them into Google Ads.
Business value: Ads can optimize on actual CRM outcomes with value (and currency), improving CAC/ROAS and reducing lead spam. Everything stays GA4-centric, consent-aware, and deduped with a deterministic event_id, so you get reliable attribution without building a custom Ads integration. If you're serious about scaling paid spend toward high-quality, high-value deals, this is the missing link.
This is not plug-and-play. Expect ~2 hours of developer work to integrate it.

What it does
On Deal Updated, the workflow checks for target stages (e.g., Qualified, Closed Won), fetches the linked Person's client_id, builds a GA4 Measurement Protocol event (with value, currency, and a deterministic event_id), and posts it server-side (see the sketch after this description). Success updates deal-level dedupe flags; failures are logged to a Pipedrive Note. Credentials use n8n Credentials, with no hardcoded secrets.

How it works (high level)
A Pipedrive trigger listens for stage changes → an eligibility check gates sends → a payload builder composes the GA4 event → an HTTP call posts to GA4. Sticky notes on the canvas explain the whole setup and what to edit.

Requirements
- GA4 installed on your site; GA4 Measurement ID + API Secret (same data stream)
- A Pipedrive Person field for client_id, plus an optional consent_granted field
- Pipedrive Deal booleans for dedupe
- n8n credentials configured (no secrets in nodes)

How to customize the workflow
- Change which stages send events and the event names.
- Adjust the value logic (e.g., margin- or probability-weighted).
- Choose skip-on-no-consent vs. non_personalized_ads: true.
- Add extra params (deal_id, pipeline_stage, etc.) to GA4 as event-scoped custom dimensions.

Troubleshooting & debugging
- If client_id is null: GA4 not initialized yet, consent denied, the GTM variable not firing, or ad blockers. Fix site-side capture first.
- If GA4 shows events in DebugView but the Ads import is empty: events not marked as Key events, GA4↔Ads not linked or not imported, or a wrong stream/secret pairing.
- Use https://www.google-analytics.com/debug/mp/collect only for payload validation; it won't verify your API secret. Final sends go to /mp/collect.
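The Measurement Protocol call itself is a plain HTTP POST. Below is a minimal TypeScript sketch of the kind of payload the workflow's HTTP node sends, with a placeholder Measurement ID, API secret, and a hypothetical "deal_qualified" event name; treat it as an illustration of the format, not the template's exact node configuration.

```typescript
// Minimal sketch of a GA4 Measurement Protocol send. The Measurement ID, API secret,
// event name, and currency below are placeholders, not values from the template.
const MEASUREMENT_ID = "G-XXXXXXX";   // your GA4 data stream's Measurement ID
const API_SECRET = "your_api_secret"; // created under the same data stream

async function sendDealEvent(clientId: string, dealId: number, value: number) {
  const payload = {
    client_id: clientId,                        // captured on-site, stored on the Pipedrive Person
    non_personalized_ads: false,
    events: [
      {
        name: "deal_qualified",
        params: {
          value,
          currency: "USD",
          event_id: `deal_${dealId}_qualified`, // deterministic ID used for dedupe flags
          deal_id: String(dealId),
        },
      },
    ],
  };

  const res = await fetch(
    `https://www.google-analytics.com/mp/collect?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`,
    { method: "POST", body: JSON.stringify(payload) }
  );
  // MP accepts payloads silently; use the /debug/mp/collect endpoint to validate the body.
  return res.status;
}
```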

By Artur
50

GA4 anomaly detection with automated Slack & email alerts

Who's it for
Teams that monitor traffic, signups, or conversions in Google Analytics 4 and want automatic Slack/email alerts when a channel suddenly spikes or drops.

What it does
This n8n template pulls daily GA4 metrics, detects outliers with a rolling mean and z-score, and sends alerts with a sparkline chart. It supports per-channel analysis (e.g., sessionDefaultChannelGroup) and consolidates multiple anomalies into a single email while posting each one to Slack.

How it works
1. HTTP Request (GA4 Data API) fetches sessions, newUsers, conversions, and bounceRate by date + channel.
2. The Code node calculates the 7-day moving average and z-scores, flags anomalies, and builds QuickChart links (see the sketch after this description).
3. The IF node filters on alert === true and the optional ALERT_ME toggle.
4. Slack posts an alert + chart.
5. Email sends one summary email (subject + HTML table + charts).

Requirements
- GA4 OAuth2 credential in n8n
- Slack API credential (bot with chat:write)
- Email credential (SMTP or service)
- GA4 property ID and at least several recent days of data

Where to find your GA4 Property ID
In the GA UI: open Google Analytics → bottom-left Admin (gear). In the Property column, click Property settings and copy the Property ID; it's a numeric value (e.g., 481356553).
From the URL (quick way): when you're inside the GA4 property, the URL looks like …/analytics/web/#/p123456789/… and the digits after "p" are your Property ID (123456789 in this example).
What not to use:
- Measurement ID (looks like G-XXXXXXX): that's the data stream ID, not the property ID.
- Universal Analytics IDs (UA-XXXXX-Y): those are legacy and won't work with the GA4 Data API.
In this template: put that numeric ID into the Set → PROPERTY_ID field. The HTTP node path properties/{{ $json.PROPERTY_ID }}:runReport expects only the number, no prefixes.

How to set up
1. Open the Set (Define variables) node and fill in PROPERTY_ID, LOOKBACK_DAYS, ALERT_PCT, Z_THRESHOLD, CHANNEL_DIM, ALERT_ME.
2. Connect your Google Analytics OAuth2, Slack, and Email credentials.
3. In Email Send, map Subject → {{$json.emailSubject}} and HTML body → {{$json.emailHtml}}. Keep Execute once enabled.
4. Run the workflow.

How to customize
- Change the moving-average window (WINDOW/MA_WINDOW) and chart range (LAST_N_DAYS_CHART).
- Swap CHANNEL_DIM (e.g., source/medium) to analyze different dimensions.
- Add/remove metrics in the GA4 request and in the metrics list in the Code node.
- Tweak thresholds to reduce noise: raise Z_THRESHOLD or ALERT_PCT.
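The anomaly logic boils down to a rolling mean, a z-score, and a chart URL. Below is a minimal TypeScript sketch of that calculation, assuming a 7-day window, a z-threshold of 2, and made-up daily session counts; the names and thresholds are placeholders, not the template's exact Code-node implementation.

```typescript
// Minimal sketch of the anomaly check: compare today's value against the mean and
// standard deviation of the previous 7 days. Window and threshold are placeholders.
const WINDOW = 7;
const Z_THRESHOLD = 2;

function detectAnomaly(series: number[]): { z: number; alert: boolean } {
  const history = series.slice(-WINDOW - 1, -1); // the 7 days before today
  const today = series[series.length - 1];
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance = history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const std = Math.sqrt(variance) || 1; // avoid division by zero on flat series
  const z = (today - mean) / std;
  return { z, alert: Math.abs(z) >= Z_THRESHOLD };
}

// QuickChart renders a chart straight from a URL-encoded Chart.js config,
// which is what the Slack/email alerts embed as a sparkline-style image.
function chartUrl(series: number[]): string {
  const config = {
    type: "line",
    data: {
      labels: series.map((_, i) => i + 1),
      datasets: [{ label: "sessions", data: series, fill: false }],
    },
  };
  return `https://quickchart.io/chart?c=${encodeURIComponent(JSON.stringify(config))}`;
}

// Example with made-up daily sessions ending in a spike:
const sessions = [120, 118, 131, 125, 122, 128, 124, 240];
console.log(detectAnomaly(sessions)); // large positive z, alert: true
console.log(chartUrl(sessions));
```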

By Artur
42