Zacharia Kimotho
Automation expert with years of experience helping businesses improve their efficiency and productivity with smart automations that are affordable, scalable, and flexible.
Templates by Zacharia Kimotho
Creating an AI Slack bot with Google Gemini
This is an example of how we can build a Slack bot in a few easy steps.

Before you can start, you need to do a few things:
- Create a copy of this workflow
- Create a Slack bot
- Create a slash command on Slack and paste the webhook URL into the slash command

Note: make sure to configure this webhook over https:// and don't use the default http://localhost:5678, as that will not be accepted by your Slack webhook configuration.

Once the data has been sent to your webhook, the next step is passing it to an AI Agent that processes the query we send it. To give the bot some memory, set the Slack token on the memory node; this way it can refer back to earlier messages in the conversation history.

The final answer is relayed back to Slack as a new message. Since Slack will not wait longer than 3000 ms for a slash-command response, we create a new message that references the input we passed (a minimal sketch of this reply step is shown below). You can extend this with tools or data sources so the agent is more custom-tailored to your company.

Usage
To use the Slack bot, go to Slack, type your configured slash command, e.g. /Bob, and send your desired message. This sends the message to your endpoint and returns the processed result as a new message.

If you would like help setting this up, feel free to reach out to zacharia@effibotics.com
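For illustration, here is a minimal sketch of how the reply step could look if done in code: it posts the AI Agent's answer back to the channel via Slack's chat.postMessage Web API. The environment variable name and the payload field names are assumptions; adapt them to your own node names and bot scopes.

```javascript
// Hedged sketch: relaying the AI Agent's answer back to Slack as a new message.
// Assumes SLACK_BOT_TOKEN is a bot token with the chat:write scope, and that the
// slash-command payload (channel_id, text) was captured by the webhook node.
async function postReply(slashPayload, aiAnswer) {
  const res = await fetch("https://slack.com/api/chat.postMessage", {
    method: "POST",
    headers: {
      "Content-Type": "application/json; charset=utf-8",
      Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}`,
    },
    body: JSON.stringify({
      channel: slashPayload.channel_id,
      // Quote the original question so the reply has context, since the slash
      // command itself has already timed out after Slack's 3-second window.
      text: `> ${slashPayload.text}\n${aiAnswer}`,
    }),
  });
  const data = await res.json();
  if (!data.ok) throw new Error(`Slack API error: ${data.error}`);
  return data;
}
```

Posting a fresh message rather than answering the slash command directly is what lets the agent take longer than Slack's response timeout.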
Stock market daily digest with Bright Data scraping & Gemini AI email reports
This workflow makes it easier to keep track of the stock market: it sends you an email summarizing the daily highlights, covering what happened, key insights, and trends.

Setup Guide
- Schedule: define the schedule (days, times, intervals).
- Stock list: replace the sample stock data with your desired stock list (ticker, name, etc.) in JSON format.
- Split Out: split the fields to get a clean list of the stocks to monitor.
- set keyword node: extracts the stock ticker from each item and sets it on the keyword property.
- Financial times scraper: triggers the Bright Data Datasets API to scrape financial data. Set the node as below:
  - Method: POST
  - URL: https://api.brightdata.com/datasets/v3/trigger
  - Query Parameters: dataset_id (replace with your Bright Data dataset ID), include_errors: true, type: discover_new, discover_by: keyword
  - Headers: Authorization: Bearer YOUR_BRIGHT_DATA_API_KEY (replace with your Bright Data API key)
  - Body (JSON): ={{ $('set keyword').all().map(item => item.json) }}
  - Execute Once: checked.
- Get progress node: checks whether the Bright Data scraping job is complete or still running.
  - URL: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
  - Headers: Authorization: Bearer YOUR_BRIGHT_DATA_API_KEY
- Get snapshot + data: retrieves the scraped data from the Bright Data API.
  - URL: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
  - Query Parameters: format: json
  - Headers: Authorization: Bearer YOUR_BRIGHT_DATA_API_KEY
- Aggregate: combines the data from each stock item into a single object.
- Update to sheet: adds all items to This sheet. Make a copy before you map the data.
- create summary node: generates a summary of the scraped stock data using the Google Gemini AI model and notifies you via Gmail. Prompt Type: define; customize the Text prompt to define the AI's role, input format, tasks, output format (HTML email), and constraints.
- Google Sheets: appends the scraped data to the Google Sheet. Set the mapping to automap so it adjusts to the results found in the request.

A hedged sketch of the three Bright Data calls is shown after these notes.

Important Notes:
- Remember to replace placeholder values (API keys, dataset IDs, email addresses, Google Sheet IDs) with your actual values.
- Review and customize the AI prompt in the "create summary" node to achieve the desired email summary output.
- Consider adding error handling for a more robust workflow.
- Monitor API usage to avoid rate limits.
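To make the three HTTP Request nodes above concrete, here is a hedged sketch of the same calls as plain JavaScript. The endpoints and parameters mirror the node settings listed above; DATASET_ID and BRIGHT_DATA_API_KEY are placeholders for your own values, and the exact response field names may differ from what the comments suggest.

```javascript
// Hedged sketch of the Bright Data Datasets calls used by the workflow.
const BASE = "https://api.brightdata.com/datasets/v3";
const headers = {
  Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
  "Content-Type": "application/json",
};

// Trigger a discovery job for a list of stock tickers (the 'Financial times scraper' node).
async function triggerScrape(tickers) {
  const qs = new URLSearchParams({
    dataset_id: process.env.DATASET_ID,
    include_errors: "true",
    type: "discover_new",
    discover_by: "keyword",
  });
  const res = await fetch(`${BASE}/trigger?${qs}`, {
    method: "POST",
    headers,
    body: JSON.stringify(tickers.map((t) => ({ keyword: t }))),
  });
  return res.json(); // expected to contain a snapshot_id
}

// Check whether the job has finished (the 'Get progress' node).
async function getProgress(snapshotId) {
  const res = await fetch(`${BASE}/progress/${snapshotId}`, { headers });
  return res.json(); // e.g. { status: "running" | "ready" } — verify against a real response
}

// Download the scraped records (the 'Get snapshot + data' node).
async function getSnapshot(snapshotId) {
  const res = await fetch(`${BASE}/snapshot/${snapshotId}?format=json`, { headers });
  return res.json();
}
```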
Generate AI prompts with Google Gemini and store them in Airtable
This workflow is designed to generate prompts for AI agents and store them in Airtable. It starts by receiving a chat message, processes it to create a structured prompt, categorizes the prompt, and finally stores it in Airtable.

Setup Instructions

Prerequisites
- An AI model, e.g. Gemini, OpenAI, etc.
- An Airtable base and table, or another storage tool

Step-by-Step Guide
1. Clone the Workflow: copy the provided workflow JSON and import it into your n8n instance.
2. Configure Credentials: set up the Google Gemini (PaLM) API credentials and the Airtable Personal Access Token credentials.
3. Map Airtable Base and Table: create a copy of the Prompt Library in Airtable, then map the Airtable base and table in the Airtable node.
4. Customize Prompt Template: edit the 'Create prompt' node to customize the prompt template as needed.

Configuration Options
- Prompt Template: customize the prompt template in the 'Create prompt' node to fit your specific use case.
- Airtable Mapping: ensure the Airtable base and table are correctly mapped in the Airtable node (an illustrative record shape is sketched further below).

Use Case Examples
This workflow is particularly useful when you want to automate the generation and management of AI agent prompts. For example:
- Rapid Prototyping of AI Agents: quickly generate and test different prompts for AI agents in various applications.
- Content Creation: generate prompts for AI models that create blog posts, articles, or social media content.
- Customer Service Automation: develop prompts for AI-powered chatbots to handle customer inquiries and support requests.
- Educational Tools: create prompts for AI tutors or learning assistants.

Industries/Professionals
- Software Development: developers building AI-powered applications.
- Marketing: marketers automating content creation and social media management.
- Customer Service: customer service managers implementing AI-driven chatbots.
- Education: educators creating AI-based learning tools.

Practical Value
- Time Savings: automates the prompt generation process, saving significant time and effort.
- Improved Prompt Quality: leverages Google Gemini and structured prompt engineering principles to generate more effective prompts.
- Centralized Prompt Management: stores prompts in Airtable for easy access, organization, and reuse.
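As a purely illustrative aside, this is roughly the kind of structured record the workflow ends up appending to Airtable. The field names (name, category, prompt) are assumptions made for the sketch, not the exact output of the 'Create prompt' node; align them with the columns in your copy of the Prompt Library.

```javascript
// Hypothetical shape of one generated prompt record before it is written to Airtable.
// Field names are illustrative; match them to the columns in your own base.
const promptRecord = {
  name: "Customer support triage agent",
  category: "Customer Service",
  prompt:
    "You are a support triage assistant. Classify each incoming message by " +
    "urgency and topic, then draft a short, polite first response.",
};

// In an n8n Code node you would return it as an item so the Airtable node can map it:
// return [{ json: promptRecord }];
console.log(promptRecord);
```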
Running and Troubleshooting
- Running the Workflow: activate the workflow in n8n, then send a chat message to the webhook URL configured in the "When chat message received" node to trigger it.
- Monitoring Execution: check the execution log in the n8n editor to see the data flowing through each node and identify any errors.
- Checking for Successful Completion: verify that a new record is created in your Airtable base with the generated prompt, name, and category, and confirm that the "Return results" node sends confirmation of the prompt back to the chat interface.

Troubleshooting Tips
- API Issues: ensure that the AI model APIs and Airtable credentials are correctly configured.
- Data Mapping: verify that the Airtable base and table are correctly mapped.
- Prompt Template: check the prompt template for any errors or inconsistencies.
- Error: 400 Bad Request in the Google Gemini nodes. Cause: invalid API key or insufficient permissions. Solution: double-check your Google Gemini API key and ensure that the API is enabled for your project.
- Error: Airtable node fails to create a record. Cause: invalid Airtable credentials, incorrect Base ID or Table ID, or mismatched column names. Solution: verify your Airtable API key, Base ID, Table ID, and column names, and ensure that the data types in n8n match the data types in your Airtable columns.

Follow me on LinkedIn for more.
Posting from WordPress to Medium
Usage
This workflow gets all the posts from your WordPress site and sorts them into a clean format before publishing them to Medium.

Step 1. Set up the HTTP Request node with the URL of the source blog you want to use. We shall be using https://mailsafi.com/blog for this.
Step 2. Extract the URLs of all the blog posts on the page. This gets all the blog titles and their URLs, and it's an easy way to sort out which posts to share and which not to share (a hedged sketch of this extraction step follows the steps below).
Step 3. Split the entries for easier sorting and a cleaner view.
Step 4. Set up a new HTTP Request node with all the blog URLs we got from the previous steps.
Step 5. Extract the contents of each blog post.
Step 6. Add the Medium node and set the content that you want to be published.

Execute your workflow and you are good to go.
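If you prefer doing Step 2 in a Code node rather than with an HTML extraction node, here is a minimal sketch under the assumption that post links live under a /blog/ path; the regex and the path filter are illustrative and will likely need adjusting for your WordPress theme.

```javascript
// Hedged sketch of Step 2: pulling post titles and URLs out of the fetched blog index HTML.
function extractPostLinks(html) {
  const links = [];
  const anchorRe = /<a\s+[^>]*href="([^"]+)"[^>]*>(.*?)<\/a>/gis;
  for (const match of html.matchAll(anchorRe)) {
    const [, href, inner] = match;
    const title = inner.replace(/<[^>]+>/g, "").trim(); // strip nested tags from the link text
    // Keep only links that look like blog posts on the same site.
    if (href.includes("/blog/") && title) {
      links.push({ url: href, title });
    }
  }
  return links;
}
```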
Create new ClickUp tasks from Slack commands
This workflow aims to make it easy to create new tasks in ClickUp from normal Slack messages using a simple slash command. For example, a slash command such as:

/newTask Set task to update new contacts on CRM and assign them to the sales team

will create a new task in ClickUp with the same title and description.

For most teams, getting tasks from Slack into ClickUp means manually re-entering them in ClickUp. What if we could do this with a simple slash command?

Step 1
Create an endpoint URL for your Slack command by creating an Events API app from https://api.slack.com/apps/.

Step 2
Next, define the endpoint for your URL. Create a new webhook endpoint in n8n with the POST method and paste the endpoint URL into your Events API configuration. This will send all slash commands associated with the slash to the desired endpoint.

Step 3
Log in to the Slack API site (https://api.slack.com/) and create an application. This is the app we use to run all automation and commands from Slack. Once your app is ready, navigate to Slash Commands and create a new command, including the command itself, the webhook URL, and a description of what the slash command is all about. Once this is saved, you can do a test by sending a demo task to your endpoint.

Step 4
Once you have confirmed the slash command is reaching the webhook, create ClickUp API credentials that can be used to create new tasks in ClickUp. The workflow creates a new task with the start date in ClickUp that can be assigned to the respective team members (a hedged sketch of this final step is shown below).

More details about the setup can be found in the document below.

Happy Productivity
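For reference, here is a minimal sketch of what the ClickUp side of Step 4 could look like in code, using ClickUp's v2 create-task endpoint. LIST_ID and CLICKUP_TOKEN are placeholders, and splitting the slash-command text into title and description is just one possible convention.

```javascript
// Hedged sketch: turning the slash-command text into a ClickUp task.
async function createTaskFromSlash(slashText) {
  const res = await fetch(
    `https://api.clickup.com/api/v2/list/${process.env.LIST_ID}/task`,
    {
      method: "POST",
      headers: {
        Authorization: process.env.CLICKUP_TOKEN,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        // Use the first line of the command text as the title, the full text as description.
        name: slashText.split("\n")[0],
        description: slashText,
        start_date: Date.now(), // the workflow sets a start date on the new task
      }),
    }
  );
  if (!res.ok) throw new Error(`ClickUp API error: ${res.status}`);
  return res.json();
}
```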
Bookmarking URLs in your browser and saving them to Notion
Remember when you were doing some large research project and wanted to quickly bookmark a page and save it, only to find premium options? Worry not; n8n has you covered.

You can create a simple bookmarking app straight from your browser using small scripts called bookmarklets. A bookmarklet is a bookmark stored in a web browser that contains JavaScript commands that add new features to the browser. To create one, we need to add a short script to the bookmarks bar of the browser, as shown below. A simple trick is to open a new tab and click on the star that appears on the right side of the address bar.

Now that we have our bookmark, it's time for the fun part. Right-click on the bookmark we just created and select the edit option. This lets you set the name you want for your bookmark and the destination URL. The URL used here will be the script that "captures" the page we want to bookmark. The code below has been used and tested to work for this example:

```javascript
javascript:(() => {
  var currentUrl = window.location.href;
  var webhookUrl = 'https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c';
  var xhr = new XMLHttpRequest();
  xhr.open('POST', webhookUrl, true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  var data = JSON.stringify({ url: currentUrl });
  xhr.send(data);
})();
```

Your bookmark should now look something like this.

With that set up, we move to n8n to receive the data sent by this script. Create a new Webhook node that receives the POST request as in the workflow, and replace $yourN8nInstanceUrl with your actual n8n instance URL. The workflow can then be configured to send this data to a Notion database. Make sure the Notion database has all the required permissions before executing the workflow; otherwise the URLs will not be saved.
Extract Domain and verify email syntax on the go
What problem is this workflow solving?
This workflow is aimed at email marketing enthusiasts looking for an easy way to extract the domain from an email address and check whether its syntax is correct, without having to use the Code node.

How this works
- Replace the debugger node with your actual data source.
- Map your data to match the layout above.
- Run your workflow and check which emails are valid and which are not.

Once done, you will have a list of all your emails, their domains, and whether they are valid or not. A hedged code equivalent of the same check is sketched below.
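For comparison, this is roughly what the same check looks like if you did use a Code node; the regex is a simple illustration, not a full RFC 5322 validator.

```javascript
// Hedged sketch: split the domain off an address and run a basic syntax check.
function checkEmail(email) {
  const syntaxOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
  const domain = syntaxOk ? email.split("@")[1].toLowerCase() : null;
  return { email, domain, valid: syntaxOk };
}

// Example:
console.log(checkEmail("jane.doe@example.com"));
// -> { email: 'jane.doe@example.com', domain: 'example.com', valid: true }
```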
Generate SEO-optimized titles & meta descriptions with Bright Data & Gemini AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

What does this workflow do?
This workflow speeds up the analysis of the top-ranking titles and meta descriptions, identifying patterns and styles that can help you rank on Google for a given keyword.

How does it work?
We provide a keyword of interest in a Google Sheet. When executed, we scrape the top 10 pages using the Bright Data SERP API, analyse the style and patterns of the top-ranking pages, and generate a new title and meta description.

Technical setup
- Make a copy of this Google Sheet.
- Add your desired keywords to the cell/row.
- Set your Bright Data credentials on the Fetch Google Search Results JSON node and update the zone to your preset zone.
- We get the results as JSON. You can change this on the URL https://www.google.com/search?q={{ $json.searchterm.replaceAll(" ", "+") }}&start=0&brd_json=1 by removing the brd_json=1 query parameter.
- Store the generated results on the duplicated sheet.
- Run the workflow.

A hedged sketch of the SERP request is shown after the Bright Data setup notes below.

Setting up the SERP scraper in Bright Data
- On Bright Data, go to the Proxies & Scraping tab.
- Under SERP API, create a new zone.
- Give it a suitable name and description; the default is serp_api.
- Add this to your account.
- Add your API key as a header credential and rename it to Bright Data API.
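The sketch below shows one hedged way to issue the same search through Bright Data's direct request API. The endpoint and body shape follow Bright Data's documented direct-access route, but confirm them against your own account documentation, and treat the zone name and API key as placeholders.

```javascript
// Hedged sketch of fetching Google results through a Bright Data SERP API zone.
async function fetchSerp(keyword) {
  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      zone: "serp_api", // your zone name from the Proxies & Scraping tab
      url: `https://www.google.com/search?q=${encodeURIComponent(keyword)}&start=0&brd_json=1`,
      format: "raw",
    }),
  });
  // With brd_json=1 the response is parsed SERP JSON; without it, use res.text() for raw HTML.
  return res.json();
}
```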
Scrape & analyze Google Ads with Bright Data API and AI for email reports
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow is a gem for all PPC managers and experts out there looking to keep track of competitor ads and the campaigns they are running, and to generate an email report.

How does it work
We use the Bright Data API to scrape Google for a given keyword that can trigger an ad. We then extract and analyse the different components of the ads to get insights and data relevant to our processes.

Setting it up
- Make a copy of this workflow to your canvas.
- Make a copy of this Google Sheet.
- Add high-intent commercial keywords to your Google Sheet; these are the keywords likely to trigger ads.
- Set your Bright Data API credentials and update the zone to your respective zone as set on your Bright Data account.
- We filter only if ads are found and, if true, extract the top and bottom ads (a hedged sketch of this step follows this template). The results are then routed along different paths:
  - Store the raw ad results.
  - Process the ads to get new insights and data.
- Map the raw data to match your account. You can adjust the prompt to provide any data as needed.
- Connect your emailing platform or tool and update the recipient email address.

Setting up the Bright Data SERP API and zone
- On Bright Data, go to the Proxies & Scraping tab.
- Under SERP API, create a new zone.
- Give it a suitable name and description; the default is serp_api.
- Add this to your account.

If you have any questions, feel free to reach out via LinkedIn.
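As a companion to the "filter only if ads are found" step above, here is a hedged sketch of extracting the top and bottom ads from one parsed SERP response. The property names top_ads and bottom_ads are assumptions about the payload; inspect a real response from your zone and rename them accordingly.

```javascript
// Hedged sketch of the ad-filtering step; field names are assumed, not guaranteed.
function extractAds(serpJson) {
  const topAds = serpJson.top_ads ?? [];
  const bottomAds = serpJson.bottom_ads ?? [];
  if (topAds.length === 0 && bottomAds.length === 0) {
    return null; // no ads were triggered for this keyword, skip downstream processing
  }
  return {
    keyword: serpJson.keyword ?? "",
    adsFound: topAds.length + bottomAds.length,
    topAds,
    bottomAds,
  };
}
```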
Reddit Sentiment Analysis for Apple WWDC25 with Gemini AI and Google Sheets
This workflow automates sentiment analysis of Reddit posts related to Apple's WWDC25 event. It extracts data, categorizes posts, analyzes the sentiment of comments, and updates a Google Sheet with the results.

Prerequisites
- Bright Data account: you need a Bright Data account to scrape Reddit data, with the correct permissions to use their API. https://brightdata.com/
- Google Sheets API credentials: enable the Google Sheets API in your Google Cloud project and create credentials (OAuth 2.0 Client IDs).
- Google Gemini API credentials: you need a Gemini API key to run the sentiment analysis, with the correct permissions to use the API. https://ai.google.dev/ You can use any other model of choice.

Setup
- Import the Workflow: import the provided JSON workflow into your n8n instance.
- Configure Bright Data credentials: in the 'scrap reddit' and 'get status' nodes, find the Authorization field in Header Parameters and replace Bearer 1234 with your Bright Data API key. Apply this to every node that uses your Bright Data API key.
- Set up the Google Sheets API credentials: in the 'Append Sentiments' node, connect your Google Sheets account through OAuth 2 credentials.
- Configure the Google Gemini credentials: in the 'Sentiment Analysis per comment' node, connect your Google AI account through the API credentials.
- Configure additional parameters:
  - In the 'scrap reddit' node, modify the JSON body to adjust the search term, date, or sort method.
  - In the 'Wait' node, alter the 'Amount' to adjust the polling interval for the scraping status; it is set to 15 seconds by default (a minimal sketch of this polling loop appears after the target audiences below).
  - In the 'Text Classifier' node, customize the categories and descriptions to suit your sentiment analysis needs; review categories such as 'WWDC events' to ensure relevancy.
  - In the 'Sentiment Analysis per comment' node, modify the system prompt template to improve context.

Customization options
- Bright Data API parameters to adjust scraping behavior.
- Wait node duration to optimize polling.
- Text Classifier categories and descriptions.
- Sentiment Analysis system prompt.

Use Case Examples
- Brand Monitoring: track public sentiment towards Apple during and after the WWDC25 event.
- Product Feedback Analysis: gather insights into user reactions to new product announcements.
- Competitive Analysis: compare sentiment towards Apple's announcements versus competitors.
- Event Impact Assessment: measure the overall impact of the WWDC25 event on various aspects of Apple's business.

Target audiences
Marketing professionals in the tech industry, brand managers, product managers, market research analysts, social media managers.
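Here is the promised sketch of the scrape-status polling loop, under the assumption that the job exposes the Bright Data datasets progress endpoint with a status field. BRIGHT_DATA_API_KEY is a placeholder, and the exact status values may differ, so check one real response before relying on them.

```javascript
// Hedged sketch of the pattern implemented by the 'get status' and 'Wait' nodes.
const headers = {
  Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
  "Content-Type": "application/json",
};

async function pollUntilReady(snapshotId, intervalMs = 15000) {
  for (;;) {
    const res = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`,
      { headers }
    );
    const { status } = await res.json(); // e.g. "running", "ready", "failed" — verify on a real job
    if (status === "ready") return snapshotId;
    if (status === "failed") throw new Error("Scraping job failed");
    await new Promise((r) => setTimeout(r, intervalMs)); // the 'Wait' node equivalent
  }
}
```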
Troubleshooting
- Workflow fails to start: check that all necessary credentials (Bright Data and Google Sheets API) are correctly configured and that the Bright Data API key is valid.
- Data scraping fails: verify the Bright Data API key, ensure the dataset ID is correct, and inspect the Bright Data dashboard for any issues with the scraping job.
- Sentiment analysis is inaccurate: refine the categories and descriptions in the 'Text Classifier' node, and check that you have the correct Google Gemini API key, as the original is a placeholder.
- Google Sheets are not updating: ensure the Google Sheets API credentials have the necessary permissions to write to the specified spreadsheet and sheet, and check API usage limits.
- Workflow does not produce the correct output: check the data connections by clicking on them and looking at which data is being produced, and check all formulas for errors.

Happy productivity!
Convert n8n tags into folders and move workflows
n8n recently introduced folders, which are a big improvement for workflow management on top of tags. Existing workflows, however, need to be moved into folders manually. The simplest approach is to convert the current tags into folders and move all the workflows carrying each tag into the matching folder. This assumes the tag name will be used as the folder name.

To note: for workflows that use more than one tag, the workflow will be assigned to the folder of the last tag processed.

How does it work
I took the liberty of simplifying the setup needed on your part, so it is beginner-friendly:
- Copy and paste this workflow into your n8n canvas. You must have existing workflows and tags before you can run it.
- Set your n8n login details on the set Credentials node with the n8n URL, username, and password.
- Set up your n8n API credentials on the n8n node get workflows (a hedged sketch of this call is shown below).
- Run the workflow. This opens a form where you can select the number of tags to move; click submit.
- The workflow responds with the number of workflows that were successfully moved.

Read more about the template.
Built by Zacharia Kimotho - Imperol
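As a hedged illustration of the get workflows step, the snippet below lists workflows and their tags through n8n's public REST API. N8N_URL and N8N_API_KEY are placeholders for your own instance, and moving a workflow into a folder is not covered by the public API, which is why the template also asks for your n8n login details.

```javascript
// Hedged sketch: list workflows with their tags via the n8n public API.
async function listWorkflowsWithTags() {
  const res = await fetch(`${process.env.N8N_URL}/api/v1/workflows`, {
    headers: { "X-N8N-API-KEY": process.env.N8N_API_KEY },
  });
  const { data } = await res.json();
  // Each workflow object carries its tags; reduce it to what the folder mapping needs.
  return data.map((wf) => ({
    id: wf.id,
    name: wf.name,
    tags: (wf.tags ?? []).map((t) => t.name),
  }));
}
```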
Automated Airtable to Postgres migration with n8n
Overview

This ETL system automates the process of migrating data from Airtable to PostgreSQL with a single API request. It maps your Airtable schema into a Postgres-compatible structure, automatically creates new tables in your Postgres database, and migrates all the data while preserving formats and relationships.

> ⚙️ Originally built in-house to help us migrate off Airtable after exceeding usage limits.

---

🔧 How It Works

- Accepts Airtable and Postgres credentials via HTTP requests.
- Authenticates both services and validates schema compatibility.
- Fetches data from Airtable and maps each table and field to PostgreSQL equivalents.
- Creates the necessary tables in your Postgres database.
- Inserts all records in batches.
- Returns a success response with summary stats.

> Bonus operations: you can list or delete created tables using API endpoints.

---

Setup Instructions (n8n Workflow)

Step 1: Airtable Configuration
- Generate an Airtable access token from the Airtable developer hub.
- Copy your Base ID or URL.

Step 2: PostgreSQL Configuration
- Gather your PostgreSQL connection details: host, port, database name, username, password.

Step 3: Deploy in n8n
- Import the workflow into your n8n instance.
- Use a simple HTTP request tool like curl or Postman to trigger migration actions.

---

API Endpoints & Payloads

Here are the available HTTP endpoints and how to use them.

Test Airtable Credentials

```bash
curl -X POST "https://n8n.com/webhook/123/validate-airtable" \
  -H "Content-Type: application/json" \
  -d '{ "airtable": { "airtableId": "app12345", "airtableToken": "pjhy.iyhhs" } }'
```

Test PostgreSQL Credentials

```bash
curl -X POST "https://n8n.com/webhook/123/validate-postgres" \
  -H "Content-Type: application/json" \
  -d '{ "postgres": { "host": "aws-0-us-west-1.pooler.supabase.com", "port": "6543", "user": "postgres.username", "password": "gamjgnrkxetb", "database": "postgres" } }'
```

Sync Airtable Data to Postgres

```bash
curl -X POST "https://n8n.com/webhook/123/sync" \
  -H "Content-Type: application/json" \
  -d '{ "host": "aws-0-us-west-1.pooler.supabase.com", "port": "6543", "user": "postgres.username", "password": "gamjgnrkxetb", "database": "postgres", "airtableId": "app73PqALbM3AM0xN", "airtableToken": "patNCueRkrLI98fEq.9ae7f9786e9ad73ac21ca26d8046f08ad77e135ae950a6e2ff3760d85aca3db4", "action": "Move" }'
```

Expected Response:

```json
[
  {
    "statusCode": 200,
    "statusMessage": "Data migration successful",
    "recordsProcessed": 152,
    "tablesProcessed": 3
  }
]
```

List All Created Tables

```bash
curl -X POST "https://n8n.com/webhook/123/list-tables" \
  -H "Content-Type: application/json" \
  -d '{ "postgres": { "host": "aws-0-us-west-1.pooler.supabase.com", "port": "6543", "user": "postgres.username", "password": "gamjgnrkxetb", "database": "postgres" } }'
```

Delete Migrated Tables

```bash
curl -X POST "https://n8n.com/webhook/123/delete-tables" \
  -H "Content-Type: application/json" \
  -d '{ "postgres": { "host": "aws-0-us-west-1.pooler.supabase.com", "port": "6543", "user": "postgres.username", "password": "gamjgnrkxetb", "database": "postgres" } }'
```

---

Technical Notes

- Schema Mapping: field types from Airtable are mapped to PostgreSQL equivalents (e.g. singleLineText → VARCHAR, number → INTEGER, checkbox → BOOLEAN, etc.).
- Linked Records: relationships in Airtable bases are resolved and converted into foreign key-friendly formats.
- Batch Inserts: records are inserted in optimized chunks to improve performance and avoid payload limits.
- Error Handling: invalid credentials, schema mismatches, or connection issues return proper HTTP status codes and error messages.
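To make the Schema Mapping note concrete, here is a hedged sketch of how an Airtable field list could be turned into a Postgres CREATE TABLE statement. The type map covers only a few common field types, and the helper names are illustrative rather than the workflow's internal code.

```javascript
// Hedged sketch of the Airtable-to-Postgres type mapping idea from the Technical Notes.
const AIRTABLE_TO_POSTGRES = {
  singleLineText: "VARCHAR(255)",
  multilineText: "TEXT",
  number: "INTEGER",
  checkbox: "BOOLEAN",
  date: "DATE",
  email: "VARCHAR(255)",
};

function columnDefinition(field) {
  const pgType = AIRTABLE_TO_POSTGRES[field.type] ?? "TEXT"; // fall back to TEXT
  // Airtable field names can contain spaces, so quote them for Postgres.
  return `"${field.name}" ${pgType}`;
}

// Example: building a CREATE TABLE statement for one Airtable table's schema.
function createTableSql(tableName, fields) {
  const cols = fields.map(columnDefinition).join(", ");
  return `CREATE TABLE IF NOT EXISTS "${tableName}" (id SERIAL PRIMARY KEY, ${cols});`;
}

console.log(
  createTableSql("Contacts", [
    { name: "Full Name", type: "singleLineText" },
    { name: "Subscribed", type: "checkbox" },
  ])
);
```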
---

Usage Scenarios

- Airtable to Postgres migration during scale-up.
- Backup or sync Airtable records to a SQL environment.
- Use Postgres-powered dashboards while editing in Airtable.

---

Requirements

- Airtable Pro/Developer account
- PostgreSQL database (e.g. Supabase, Render, or a local instance)
- n8n instance with webhook exposure
- Basic familiarity with HTTP requests (curl, Postman, or integrations)

---

Need Help?

Feel free to reach out via LinkedIn or email if you need help adapting this workflow for your organization or extending it with extra transformations.

Happy productivity!