Arunava
Product Growth at Testline AI | Automating the Boring Stuff with AI Agents | Ex-RecordBook (YC W22)
Templates by Arunava
Auto-reply to Google Play Store reviews with GPT-4o & sentiment analysis
This n8n workflow automates replying to Google Play Store reviews using AI. It analyzes each review’s sentiment and tone and posts a human-like response — saving time for indie devs, founders, and PMs managing multiple apps.

---

💡 Use Cases
- Respond to reviews at scale without sounding robotic
- Prioritize negative-sentiment feedback
- Maintain a consistent tone and support messaging
- Free up time for teams to focus on product instead of ops

---

🧠 How it works
- Uses the Play Store API to fetch new app reviews
- Filters out reviews that have already been replied to
- Analyzes sentiment using OpenAI GPT-4o
- Passes sentiment and review context to an AI Agent node that crafts a reply
- Posts replies to the Play Store via the Google API
- (Optional) Logs the reply to Slack for visibility

---

⚡ Requirements
- Google Play Developer Console access
- Google Cloud project with a service account
- OpenAI account (GPT-4o or GPT-4o mini)
- (Optional) Slack workspace & app for logging

---

🙌 Don’t want to set this up yourself? I’ll do it for you. Just drop me an email: imarunavadas@gmail.com

Let’s automate the boring stuff so you can focus on growth. 🚀
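The “filters out reviews that have already been replied to” step above can be sketched as a small Code-node function. The review shape here (a `comments` array where a `developerComment` entry marks an existing reply) is an assumption loosely based on the androidpublisher v3 reviews response — verify the field names against your actual payload:

```javascript
// A review needs a reply when none of its comments carries a developerComment.
// The comments[]/developerComment shape is an assumption -- check your payload.
function needsReply(review) {
  const comments = review.comments || [];
  return !comments.some((c) => c.developerComment);
}

// In an n8n Code node you would typically apply this as a filter:
// return items.filter((item) => needsReply(item.json));
```

This keeps the filter logic in one pure function, which makes it easy to test outside n8n before wiring it into the workflow.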
Auto-comment on Reddit posts with AI brand mentions & Baserow tracking
This workflow finds fresh Reddit posts that match your keywords, decides whether they’re actually relevant to your brand, writes a short human-style reply using AI, posts it, and logs everything to Baserow.

💡 Perfect for
- Lead gen without spam: drop helpful replies where your audience hangs out.
- Getting discovered by AI surfaces (AI Overviews / SGE, AISEO/GSEO) via high-quality brand mentions.
- Customer support in the wild: answer troubleshooting threads fast.
- Community presence: steady, non-salesy contributions in niche subreddits.

🧠 What it does
- Searches Reddit for your keyword query on a schedule (e.g., every 30 min)
- Checks Baserow first so you don’t reply twice to the same post
- Uses an AI prompt tuned for short, no-fluff, subreddit-friendly comments
- Softly mentions your brand only when it’s clearly relevant
- Posts the comment via Reddit’s API
- Saves post_id, comment_id, reply, permalink, and status to Baserow
- Processes posts one by one, with an optional short wait to be nice to Reddit

⚡ Requirements
- Reddit developer API
- Baserow account, table, and API token
- AI provider API (OpenAI / Anthropic / Gemini)

⚙️ Setup Instructions

Create a Baserow table
Fields (user-field names exactly): post_id (unique), permalink, subreddit, title, created_utc, reply (long text), replied (boolean), created_on (datetime).

Add credentials in n8n
- Reddit OAuth2 (scopes: read, submit, identity) with a proper User-Agent string (Reddit requires it).
- LLM: Google Gemini and/or Anthropic (both can be added; one can serve as a fallback in the AI Agent).
- Baserow: API token.

Set the Schedule Trigger (Cron)
Start hourly (or every 2–3 h). Pacing is mainly enforced by the Wait nodes.

Update “Check duplicate row” (HTTP Request)
- URL: https://api.baserow.io/api/database/rows/table/{TABLE_ID}/?user_field_names=true&filter__post_id__equal={{$json.post_id}}
- Header: Authorization: Token YOUR_BASEROW_TOKEN
- (Use your own Baserow domain if self-hosted.)
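The two request payloads this part of the workflow builds can be sketched as plain functions — a sketch, assuming the Baserow filter syntax (`user_field_names`, `filter__{field}__equal`) and Reddit’s `t3_` post fullname prefix from the public API docs; `TABLE_ID` and the `post_id` field name are placeholders for your own setup:

```javascript
// Duplicate-check URL for Baserow: list rows whose post_id equals this post.
function baserowDuplicateUrl(tableId, postId) {
  return (
    `https://api.baserow.io/api/database/rows/table/${tableId}/` +
    `?user_field_names=true&filter__post_id__equal=${encodeURIComponent(postId)}`
  );
}

// Reddit's POST /api/comment expects the parent's fullname: t3_<id> for posts.
function redditCommentBody(postId, reply) {
  return { thing_id: `t3_${postId}`, text: reply };
}
```

Building these in a Code node (rather than inline expressions) makes the URL encoding explicit and keeps the HTTP Request nodes simple.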
Configure “Filter Replied Posts”
Ensure it skips items where your Baserow record shows replied === true (so you don’t comment twice).

Configure “Fetch Posts from Reddit”
Set your keyword/search query (and time/sort). Keep the User-Agent header present.

Configure “Write Reddit Comment (AI)”
Update your brand name (and optional link). Edit the prompt/tone to your voice; ensure it outputs a short reply field (≤80 words, helpful, non-salesy).

Configure “Post Reddit Comment” (HTTP Request)
- Endpoint: POST https://oauth.reddit.com/api/comment
- Body: thing_id: "t3_{{$json.post_id}}", text: "{{$json.reply}}"
- Uses your Reddit OAuth credential and the User-Agent header.
- Update the user_agent header value with your username: n8n:reddit-autoreply:1.0 (by /u/{reddit-username})

Store comment data in Baserow (HTTP Request)
- POST https://api.baserow.io/api/database/rows/table/{TABLE_ID}/?user_field_names=true
- Header: Authorization: Token YOUR_BASEROW_TOKEN
- Map: post_id, permalink, subreddit, title, created_utc, reply, replied, created_on={{$now}}.

Keep default pacing
Leave Wait 5m (cool-off) and Wait 6h (global pace) → ~4 comments/day. Reduce the waits gradually as account health allows.

Test & enable
Run once manually, verify a Baserow row and one test comment, then enable the schedule.

🤝 Need a hand?
I’m happy to help you get this running smoothly — or tailor it to your brand. Reach out to me via email: imarunavadas@gmail.com
Track Amazon prices & monitor competitors with Apify and Google Sheets
Amazon Price Tracker & Competitor Monitoring Workflow (Apify + Google Sheets)

This n8n workflow automates Amazon price tracking and competitor monitoring by scraping product pricing via Apify and updating your Google Sheet every day. It removes manual price checks, keeps your pricing data fresh, and helps Amazon sellers stay ahead in competitive pricing, Buy Box preparation, and daily audits.

💡 Use Cases
- Automatically track prices of your Amazon products
- Monitor competitor seller prices across multiple URLs
- Maintain a daily pricing database for reporting and insights
- Catch sudden competitor undercutting or pricing changes
- Support Buy Box analysis by comparing seller prices
- Scale from 10 to 1,000+ product URLs without manual effort

🧠 How It Works
- A Scheduled Trigger runs the workflow every morning
- The Google Sheets node loads all product rows with seller URLs
- A Loop node processes each item one by one
- The Apify Actor node triggers the Amazon scraper
- An HTTP Request node fetches the scraped result from Apify
- A JavaScript node extracts, cleans, and formats the price data
- The Update Sheet node writes the fresh prices back to the right row
- Supports additional price columns for more sellers or metrics

➕ Adding New Competitor Columns (Step-by-Step)

Add new columns in Google Sheets
Add two new columns: competitor_url3 and price_comp3.

---

Update the Apify Actor (inside n8n)
In the Apify Actor node, pass the new competitor URL:

"competitor_url3": {{$json.competitor_url3}}

This ensures Apify scrapes the additional competitor product page.

---

Update your Code (JavaScript) node
Inside the Code node, extract the new competitor’s price from the Apify JSON and attach it to the output:

```javascript
const price_comp3 = item?.offers?.[2]?.price || null;
item.price_comp3 = price_comp3;
return item;
```

(Adjust the index [2] based on the Apify output structure.)
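A slightly more general version of that Code-node snippet attaches one price_compN field per competitor, so adding a fourth or fifth competitor only means raising the count. The `offers[]` path is the same assumption about the Apify item shape as above — adjust it to your actor’s real output:

```javascript
// Attach price_comp1..price_compN to the item, one per competitor offer.
// Missing offers or prices are stored as null so the Sheet cell stays blank.
function attachCompetitorPrices(item, competitorCount) {
  for (let i = 0; i < competitorCount; i++) {
    const offer = item && item.offers ? item.offers[i] : undefined;
    item[`price_comp${i + 1}`] = offer && offer.price != null ? offer.price : null;
  }
  return item;
}
```

With this in place, the “Update Row” mapping is the only thing you still edit by hand when a competitor column is added.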
---

Update the Google Sheets “Update Row” node
To save the new values into your Sheet:
- Open your Google Sheets Update Row node
- Scroll to Field Mapping
- Map the columns to the new data
- Hit the “Save & Execute” button 🚀

⚡ Requirements
- Apify account (free tier is enough)
- Apify “Amazon Product Scraper” actor (costs $40/month; 14-day free trial)
- Google Sheet containing product URLs
- Basic credentials setup inside n8n

🙌 Want me to set it up for you?
I’ll configure the full automation — Apify scraper, n8n workflow, Sheets mapping, and error handling. Email me at: imarunavadas@gmail.com

Automate the boring work and focus on smarter selling. 🚀
Track play store app rankings with SerpApi, Baserow & Slack alerts
Automatically track your Android app’s keyword rankings on Google Play. This workflow checks ranks via SerpApi, updates a Baserow table, and posts a heads-up in Slack so your team can review changes quickly.

💡 Perfect for
- ASO teams tracking daily keyword positions
- Growth & marketing standups that want quick rank visibility
- Lightweight historical logging without a full BI stack

🧠 What it does
- Runs on a schedule (e.g., weekly)
- Queries SerpApi for each keyword’s Play Store ranking
- Saves results to Baserow: Current Rank, Previous Rank, Last modified
- Sends a Slack alert: “Ranks updated — review in Baserow”

⚡ Requirements
- SerpApi account & API key
- Baserow account + API token
- Slack connection (bot/app or credential in n8n)

⚙️ Setup Instructions

1) Create a Baserow table
Create a new table (any name) and add these user-field names exactly:
- Keywords (text)
- Current Rank (number)
- Previous Rank (number)
- Last modified (date/time)

Optional fields you can add later: Notes, Locale, Store Country, App Package ID.

2) Connect credentials in n8n
- Baserow: add your API token and select your Database and Table in the Baserow nodes.
- Slack: connect your Slack account/workspace in the Slack node.
- SerpApi: open the HTTP Request node and put your API key under Query Parameters → api_key = YOUR_KEY.

3) Verify field mapping
In the Baserow (Update Row) node, map:
- Row ID → {{$json.id}}
- Current Rank → {{$json["Current Rank"]}}
- Previous Rank → your Code node should set this (the template copies the old “Current Rank” into “Previous Rank” before writing the new one)
- Last modified → {{$now}} (or the timestamp you compute)

🛟 Notes & Tips
- If you prefer a single daily Slack summary instead of multiple pings, add a Code node after the updates to aggregate lines and send one message.
- Treat 0 or missing ranks as “not found” and flag them in Slack if helpful.
- For multi-country tracking, include hl/gl (locale/country) in your SerpApi query params and store them as columns.

🤝 Need a hand?
I’m happy to help you get this running smoothly—or tailor it to your brand. Reach out to me via email: imarunavadas@gmail.com
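The rank rotation described in the field-mapping step — old “Current Rank” moves to “Previous Rank” before the fresh SerpApi position is written — can be sketched as a small Code-node function; the column names match the Baserow table defined in the setup, and treating 0/missing as null follows the “not found” tip above:

```javascript
// Rotate ranks for one Baserow row: the old Current Rank becomes Previous
// Rank, and a 0 or missing new rank is stored as null ("not found").
function rotateRank(row, newRank) {
  return {
    ...row,
    "Previous Rank": row["Current Rank"] ?? null,
    "Current Rank": newRank > 0 ? newRank : null,
  };
}
```

Returning a new object instead of mutating the input keeps the previous state available if you later want to diff ranks for the Slack message.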