Generate personalized sales emails with LinkedIn data & Claude 3.7 via OpenRouter
How it works
- The automation loads rows from a Google Sheet of leads that you want to contact.
- It runs a Google search via Apify for LinkedIn links based on the lead's first name, last name, and company.
- Another Apify actor fetches the LinkedIn profile, using the first profile URL returned by the search.
- The same process is repeated for the company that the lead works for, giving extra context. If the lead has a current company listed on their LinkedIn profile, we use that URL for the lookup rather than doing a separate Google search.
- A call is made to OpenRouter to have an LLM generate an email from a prompt designed for personalized outreach.
- The email is sent via a Gmail node.

Set up steps
- Connect your Google Sheets and Gmail accounts to use these APIs.
- Make an account with Apify and enter your credentials.
- Set your details in the "Set My Data" node to customize the workflow around your company and value proposition.
- I would recommend changing the prompt in the "Generate Personalized Email" node to match the tone of voice that you want your agent to have. You can change the guidelines to, e.g., control whether the agent introduces itself, and add more examples in the style you want to improve the output.
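Outside n8n, the "Generate Personalized Email" call to OpenRouter can be sketched roughly as below (Python, stdlib only). The prompt wording, the lead/company field names, and the exact model slug are illustrative assumptions; the OpenRouter chat-completions endpoint and bearer-token auth are its standard API.

```python
import json
import os
import urllib.request

def build_email_request(lead, my_data):
    """Build an OpenRouter chat-completions payload for one lead.

    `lead` mirrors a sheet row enriched with LinkedIn data and `my_data`
    mirrors the "Set My Data" node; these field names are assumptions.
    """
    prompt = (
        f"Write a short, personalized outreach email to {lead['first_name']} "
        f"{lead['last_name']}, who works at {lead['company']}.\n"
        f"LinkedIn summary: {lead.get('linkedin_summary', 'n/a')}\n"
        f"Our company: {my_data['company']} - {my_data['value_proposition']}\n"
        "Keep it under 120 words and end with a soft call to action."
    )
    return {
        "model": "anthropic/claude-3.7-sonnet",  # assumed OpenRouter model slug
        "messages": [{"role": "user", "content": prompt}],
    }

def generate_email(lead, my_data):
    """POST the payload to OpenRouter and return the generated email text."""
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(build_email_request(lead, my_data)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

In the workflow itself this is an HTTP Request node; the sketch just makes the request shape explicit.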
Build a tax code assistant with Qdrant, Mistral.ai and OpenAI
This n8n workflow builds another example of a knowledgebase assistant, but demonstrates how a more deliberate and targeted approach to ingesting the data can produce much better results for your chatbot. In this example, a government tax code policy document is used. Whilst we could split the document into chunks by content length, we would often lose the context of chapters and sections which may be required by the user. Our approach is therefore to first split the document into chapters and sections before importing it into our vector store. Additionally, using metadata correctly is key to allowing filtering and scoped queries.

Example
Human: "Tell me about what the tax code says about cargo for international commerce?"
AI: "Section 11.25 of the Texas Property Tax Code pertains to "MARINE CARGO CONTAINERS USED EXCLUSIVELY IN INTERNATIONAL COMMERCE." In this section, a person who is a citizen of a foreign country or an en..."

How it works
- The tax code policy document is downloaded as a zip file from the government website and its pages are extracted as separate chapters.
- Each chapter is then parsed and split into its sections using data manipulation expressions.
- Each section is inserted into our Qdrant vector store, tagged with its source, chapter and section numbers as metadata.
- When our AI Agent needs to retrieve data from the vector store, we use a custom workflow tool to perform the query against Qdrant. Because we're relying on Qdrant's advanced filtering capabilities, we perform the search using the Qdrant API rather than the Qdrant node.
- When the AI Agent needs to pull full wording or extracts, we can use Qdrant's scroll API and metadata filtering to do so. This makes Qdrant behave like a key-value store for our document.

Requirements
- A Qdrant instance is required for the vector store and specifically for its filtering functionality.
- A Mistral.ai account for embeddings and AI models.
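The scoped "key-value" lookup against Qdrant's scroll API can be sketched as below (Python, stdlib only). The instance URL, collection name, and payload key names are assumptions; the `POST /collections/{name}/points/scroll` endpoint and the `filter`/`must`/`match` body shape are Qdrant's documented REST API.

```python
import json
import urllib.request

QDRANT_URL = "http://localhost:6333"  # assumed Qdrant instance
COLLECTION = "tax_code"               # assumed collection name

def build_scroll_body(chapter, section=None):
    """Filter points by the chapter/section metadata attached at ingest time."""
    must = [{"key": "metadata.chapter", "match": {"value": chapter}}]
    if section is not None:
        must.append({"key": "metadata.section", "match": {"value": section}})
    return {"filter": {"must": must}, "with_payload": True, "limit": 50}

def scroll_sections(chapter, section=None):
    """Fetch the full stored text of a chapter (or a single section) from Qdrant."""
    req = urllib.request.Request(
        f"{QDRANT_URL}/collections/{COLLECTION}/points/scroll",
        data=json.dumps(build_scroll_body(chapter, section)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["result"]["points"]
```

Because the filter matches exact metadata values rather than running a similarity search, this is what lets Qdrant act like a key-value store for whole sections.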
Customising this workflow
Depending on your use case, consider returning the actual PDF pages (or links to them) to the user for extra confirmation and to build trust. Not using Mistral? You can replace the embedding model, but make sure the distance metric and dimension size of the Qdrant collection match your chosen model.
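Matching the collection to the embedding model can be sketched like this (Python, stdlib only). Qdrant creates collections via `PUT /collections/{name}` with a `vectors` config; the 1024 dimension shown is `mistral-embed`'s output size, and the collection name is an assumption.

```python
import json
import urllib.request

def build_collection_config(dim, distance="Cosine"):
    """Vector params must match the embedding model's output dimension."""
    return {"vectors": {"size": dim, "distance": distance}}

def create_collection(qdrant_url, name, dim, distance="Cosine"):
    req = urllib.request.Request(
        f"{qdrant_url}/collections/{name}",
        data=json.dumps(build_collection_config(dim, distance)).encode(),
        headers={"Content-Type": "application/json"},
        method="PUT",  # Qdrant creates collections with PUT
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

# mistral-embed produces 1024-dimensional vectors; if you swap to, say,
# OpenAI's text-embedding-3-small you would need size=1536 instead.
```

If the collection's `size` does not match the embedding model, inserts will be rejected (or distances will be meaningless), which is why this is called out as a customisation gotcha.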
Google Play review intelligence with Bright Data & Telegram alerts
Overview
This n8n workflow automates the process of scraping Google Play Store reviews, analyzing app performance, and sending alerts for low-rated applications. It integrates with Bright Data for web scraping, Google Sheets for data storage, and Telegram for notifications.

Workflow Components

Trigger Input Form
Type: Form Trigger
Purpose: Initiates the workflow with user input
Input Fields: URL (Google Play Store app URL), Number of reviews to fetch
Function: Captures user requirements to start the scraping process

Start Scraping Request
Type: HTTP Request (POST)
Purpose: Sends scraping request to Bright Data API
Endpoint: https://api.brightdata.com/datasets/v3/trigger
Parameters: Dataset ID: gd_m6zagkt024uwvvwuyu; Include errors: true; Limit multiple results: 5
Custom Output Fields: url, review_id, reviewer_name, review_date, review_rating, review, app_url, app_title, app_developer, app_images, app_rating, app_number_of_reviews, app_what_new, app_content_rating, app_country, num_of_reviews

Check Scrape Status
Type: HTTP Request (GET)
Purpose: Monitors the progress of the scraping job
Endpoint: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
Function: Checks if the dataset scraping is complete

Wait for Response (45 sec)
Type: Wait Node
Purpose: Implements polling mechanism
Duration: 45 seconds
Function: Pauses workflow before checking status again

Verify Completion
Type: IF Condition
Purpose: Evaluates scraping completion status
Condition: status === "ready"
Logic: True: proceeds to fetch data; False: loops back to status check

Fetch Scraped Data
Type: HTTP Request (GET)
Purpose: Retrieves the final scraped data
Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
Format: JSON
Function: Downloads completed review and app data

Save to Google Sheet
Type: Google Sheets Node
Purpose: Stores scraped data for analysis
Operation: Append rows
Target: Specified Google Sheet document
Data Mapping: URL, Review ID, Reviewer Name, Review Date, Review Rating, Review Text, App Rating, App Number of Reviews, App What's New, App Country

Check Low Ratings
Type: IF Condition
Purpose: Identifies poor-performing apps
Condition: review_rating < 4
Logic: True: triggers alert notification; False: no action taken

Send Alert to Telegram
Type: Telegram Node
Purpose: Sends performance alerts
Message Format:
Low App Performance Alert
App: {app_title}
Developer: {app_developer}
Rating: {app_rating}
Reviews: {app_number_of_reviews}
View on Play Store

Workflow Flow
Input Form -> Start Scraping -> Check Status -> Wait 45s -> Verify Completion (loops back to Check Status until ready) -> Fetch Data -> Save to Sheet & Check Ratings -> Send Telegram Alert

Configuration Requirements

API Keys & Credentials
Bright Data API Key: required for web scraping
Google Sheets OAuth2: for data storage access
Telegram Bot Token: for alert notifications

Setup Parameters
Google Sheet ID: target spreadsheet identifier
Telegram Chat ID: destination for alerts
N8N Instance ID: workflow instance identifier

Key Features

Data Collection
Comprehensive app metadata extraction; review content and rating analysis; developer and country information; app store performance metrics

Quality Monitoring
Automated low-rating detection; real-time performance alerts; continuous data archiving

Integration Capabilities
Bright Data web scraping service; Google Sheets data persistence; Telegram instant notifications; polling-based status monitoring

Use Cases

App Performance Monitoring
Track rating trends over time; identify user sentiment patterns; monitor competitor performance

Quality Assurance
Early warning for rating drops; customer feedback analysis; market reputation management

Business Intelligence
Review sentiment analysis; performance benchmarking; strategic decision support

Technical Notes
Polling Interval: 45-second status checks
Rating Threshold: alerts triggered for ratings < 4
Data Format: JSON with structured field mapping
Error Handling: includes error tracking in dataset requests
Result Limiting: maximum 5 multiple results per request

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
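The trigger-poll-fetch-alert flow described in this workflow can be sketched as below (Python, stdlib only). The progress endpoint, the 45-second interval, the `status === "ready"` check, and the `review_rating < 4` threshold come straight from the workflow; the helper names and the injectable `fetch` parameter are assumptions made for testability.

```python
import json
import time
import urllib.request

API = "https://api.brightdata.com/datasets/v3"

def _get_json(url, token):
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def poll_until_ready(snapshot_id, token, interval=45, fetch=None):
    """Poll /progress/{snapshot_id} every `interval` seconds until ready.

    `fetch` is injectable for testing; it defaults to the real HTTP call.
    Mirrors the workflow's Wait + Verify Completion loop.
    """
    fetch = fetch or (lambda: _get_json(f"{API}/progress/{snapshot_id}", token))
    while True:
        if fetch()["status"] == "ready":
            return True
        time.sleep(interval)

def fetch_snapshot(snapshot_id, token):
    """Download the completed review data, as the Fetch Scraped Data node does."""
    return _get_json(f"{API}/snapshot/{snapshot_id}", token)

def low_rating_alert(row, threshold=4):
    """Mirror of the Check Low Ratings IF condition: alert when rating < 4."""
    return row["review_rating"] < threshold
```

In production the alert branch would then format the Telegram message from `app_title`, `app_developer`, and the other mapped fields.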