
Send daily recipe emails automatically

jason
2156 views
2/3/2026
Official Page

Not sure what to eat tonight? Have recipes emailed to you daily based on your criteria.

To run this workflow, you will need to have:

  1. A Recipe Search API key from Edamam
  2. An active email account with configured credentials

To set up your credentials:

  1. Set your Edamam AppID and AppKey in the Search Criteria node
  2. Select (or create) your email credentials in the Send Recipes node (and set up the to: and from: email addresses while you are at it)

To customize the recipes that you receive, open the Search Criteria node and modify one or more of the following (an example configuration follows the list):

  • RecipeCount - the number of recipes you would like to receive
  • IngredientCount - the maximum number of ingredients you would like each recipe to have
  • CaloriesMin - the minimum number of calories the recipe will have
  • CaloriesMax - the maximum number of calories the recipe will have
  • TimeMin - the minimum amount of time (in minutes) the recipe will take to prepare
  • TimeMax - the maximum amount of time (in minutes) the recipe will take to prepare
  • Diet - Select one of the following options:
    • balanced - Protein/Fat/Carb values in 15/35/50 ratio
    • high-fiber - More than 5g fiber per serving
    • high-protein - More than 50% of total calories from proteins
    • low-carb - Less than 20% of total calories from carbs
    • low-fat - Less than 15% of total calories from fat
    • low-sodium - Less than 140mg Na per serving
    • random - selects a different random diet each day
  • Health - Select one of the following options:
    • alcohol-free - No alcohol used or contained
    • immuno-supportive - Recipes which fit a science-based approach to eating to strengthen the immune system
    • celery-free - does not contain celery or derivatives
    • crustacean-free - does not contain crustaceans (shrimp, lobster etc.) or derivatives
    • dairy-free - No dairy; no lactose
    • egg-free - No eggs or products containing eggs
    • fish-free - No fish or fish derivatives
    • fodmap-free - Does not contain FODMAP foods
    • gluten-free - No ingredients containing gluten
    • keto-friendly - Maximum 7 grams of net carbs per serving
    • kidney-friendly - Per serving: phosphorus less than 250 mg, potassium less than 500 mg, and sodium less than 500 mg
    • kosher - Contains only ingredients allowed by the kosher diet; however, it does not guarantee kosher preparation of the ingredients themselves
    • low-potassium - Less than 150mg per serving
    • lupine-free - does not contain lupine or derivatives
    • mustard-free - does not contain mustard or derivatives
    • low-fat-abs - Less than 3g of fat per serving
    • no-oil-added - No oil added except to what is contained in the basic ingredients
    • low-sugar - No simple sugars – glucose, dextrose, galactose, fructose, sucrose, lactose, maltose
    • paleo - Excludes what are perceived to be agricultural products; grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils
    • peanut-free - No peanuts or products containing peanuts
    • pecatarian - Does not contain meat or meat-based products; can contain dairy and fish
    • pork-free - does not contain pork or derivatives
    • red-meat-free - does not contain beef, lamb, pork, duck, goose, game, horse, and other types of red meat or products containing red meat.
    • sesame-free - does not contain sesame seed or derivatives
    • shellfish-free - No shellfish or shellfish derivatives
    • soy-free - No soy or products containing soy
    • sugar-conscious - Less than 4g of sugar per serving
    • tree-nut-free - No tree nuts or products containing tree nuts
    • vegan - No meat, poultry, fish, dairy, eggs or honey
    • vegetarian - No meat, poultry, or fish
    • wheat-free - No wheat, can have gluten though
    • random - selects a different random health option each day
  • SearchItem - the general term that you are looking for e.g. chicken
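
As a reference point, a filled-in Search Criteria node might carry values along the following lines. The values are purely illustrative, and the AppID/AppKey placeholders stand in for your own Edamam credentials; the field names are the ones listed above.

```javascript
// Illustrative Search Criteria values only – adjust to taste.
const searchCriteria = {
  AppID: "YOUR_EDAMAM_APP_ID",   // from your Edamam dashboard
  AppKey: "YOUR_EDAMAM_APP_KEY",
  RecipeCount: 3,        // recipes per email
  IngredientCount: 10,   // max ingredients per recipe
  CaloriesMin: 300,
  CaloriesMax: 800,
  TimeMin: 10,           // minutes
  TimeMax: 45,
  Diet: "balanced",      // or "random"
  Health: "vegetarian",  // or "random"
  SearchItem: "chicken",
};
```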

Automated Daily Recipe Email Sender

This n8n workflow automates the process of fetching a daily recipe and sending it out via email. It is designed to run on a schedule, ensuring that you or your subscribers receive fresh recipe ideas regularly.

What it does

This workflow performs the following steps:

  1. Triggers on a schedule: The workflow is initiated by a Cron node, configured to run at specific intervals (e.g., daily).
  2. Fetches a recipe: It uses an HTTP Request node to call an external API or service to retrieve recipe data.
  3. Processes recipe data: A Function node manipulates or formats the raw recipe data obtained from the HTTP request. This could involve extracting specific fields, reformatting text, or generating an HTML email body (see the sketch after this list).
  4. Sets email fields: An Edit Fields (Set) node prepares the data for the email, likely setting the subject, recipient, and body of the email based on the processed recipe information.
  5. Sends the email: Finally, a Send Email node dispatches the prepared email to the specified recipient(s).
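
For step 3, a Function node along the following lines could turn the API response into an HTML body. This is a minimal sketch that assumes an Edamam-style response shape (`hits[].recipe` with `label`, `url`, `ingredientLines`, `calories`, and `totalTime`); check those field names against what your API actually returns.

```javascript
// Classic Function-node style: read from `items`, return an array of items.
// The response shape (hits[].recipe.*) is an assumption – adapt to your API.
const hits = items[0].json.hits || [];

const sections = hits.map(({ recipe }) => `
  <h2><a href="${recipe.url}">${recipe.label}</a></h2>
  <p>${Math.round(recipe.calories)} kcal &middot; ${recipe.totalTime} min</p>
  <ul>${recipe.ingredientLines.map(line => `<li>${line}</li>`).join("")}</ul>
`);

return [{
  json: {
    subject: `Your daily recipes (${hits.length})`,
    html: `<h1>Today's recipes</h1>${sections.join("<hr>")}`,
  },
}];
```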

Prerequisites/Requirements

To use this workflow, you will need:

  • An n8n instance: Self-hosted or cloud.
  • An API endpoint for recipes: The HTTP Request node is configured to fetch data from an external source. You will need the URL and any necessary authentication for this API.
  • SMTP credentials: The Send Email node requires credentials for an email service (e.g., Gmail, SendGrid, your own SMTP server) to send emails.

Setup/Usage

  1. Import the workflow: Download the JSON provided and import it into your n8n instance.
  2. Configure the Cron node: Adjust the schedule in the "Cron" node to your desired frequency (e.g., daily at a specific time).
  3. Configure the HTTP Request node:
    • Update the "URL" field to point to your recipe API (an example URL pattern is sketched after this list).
    • Add any necessary "Headers" or "Authentication" for your API.
  4. Configure the Function node: Review and modify the JavaScript code within the "Function" node to correctly parse and format the data returned by your recipe API into the desired structure for your email.
  5. Configure the Edit Fields (Set) node: Ensure the fields like to, subject, and html (or text) are correctly mapped to the data output by the Function node.
  6. Configure the Send Email node:
    • Select or create an SMTP credential.
    • Set the "From Email" and "To Email" addresses.
    • Ensure the "Subject" and "Body" (HTML or plain text) are correctly referencing the data from the previous nodes.
  7. Activate the workflow: Once all configurations are complete, activate the workflow. It will then run automatically according to the schedule defined in the Cron node.
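
For reference, if you point the workflow at the Edamam Recipe Search API (v2), the request URL generally follows the pattern sketched below. The query parameter names are Edamam's; the way they are read from the Search Criteria fields here is an assumption, so align it with your own nodes (an expression in the HTTP Request node's URL field works just as well as a Code/Function node).

```javascript
// Sketch only: assemble the Edamam v2 request URL from the Search Criteria
// values. The field names on `criteria` mirror the Search Criteria node above.
const criteria = items[0].json;

const params = new URLSearchParams({
  type: "public",
  q: criteria.SearchItem,
  app_id: criteria.AppID,
  app_key: criteria.AppKey,
  diet: criteria.Diet,
  health: criteria.Health,
  calories: `${criteria.CaloriesMin}-${criteria.CaloriesMax}`,
  time: `${criteria.TimeMin}-${criteria.TimeMax}`,
});

return [{ json: { url: `https://api.edamam.com/api/recipes/v2?${params}` } }];
```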

Related Templates

Auto-create TikTok videos with VEED.io AI avatars, ElevenLabs & GPT-4

πŸ’₯ Viral TikTok Video Machine: Auto-Create Videos with Your AI Avatar --- 🎯 Who is this for? This workflow is for content creators, marketers, and agencies who want to use Veed.io’s AI avatar technology to produce short, engaging TikTok videos automatically. It’s ideal for creators who want to appear on camera without recording themselves, and for teams managing multiple brands who need to generate videos at scale. --- βš™οΈ What problem this workflow solves Manually creating videos for TikTok can take hours β€” finding trends, writing scripts, recording, and editing. By combining Veed.io, ElevenLabs, and GPT-4, this workflow transforms a simple Telegram input into a ready-to-post TikTok video featuring your AI avatar powered by Veed.io β€” speaking naturally with your cloned voice. --- πŸš€ What this workflow does This automation links Veed.io’s video-generation API with multiple AI tools: Analyzes TikTok trends via Perplexity AI Writes a 10-second viral script using GPT-4 Generates your voiceover via ElevenLabs Uses Veed.io (Fabric 1.0 via FAL.ai) to animate your avatar and sync the lips to the voice Creates an engaging caption + hashtags for TikTok virality Publishes the video automatically via Blotato TikTok API Logs all results to Google Sheets for tracking --- 🧩 Setup Telegram Bot Create your bot via @BotFather Configure it as the trigger for sending your photo and theme Connect Veed.io Create an account on Veed.io Get your FAL.ai API key (Veed Fabric 1.0 model) Use HTTPS image/audio URLs compatible with Veed Fabric Other APIs Add Perplexity, ElevenLabs, and Blotato TikTok keys Connect your Google Sheet for logging results --- πŸ› οΈ How to customize this workflow Change your Avatar: Upload a new image through Telegram, and Veed.io will generate a new talking version automatically. Modify the Script Style: Adjust the GPT prompt for tone (educational, funny, storytelling). Adjust Voice Tone: Tweak ElevenLabs stability and similarity settings. Expand Platforms: Add Instagram, YouTube Shorts, or X (Twitter) posting nodes. Track Performance: Customize your Google Sheet to measure your most successful Veed.io-based videos. --- 🧠 Expected Outcome In just a few seconds after sending your photo and theme, this workflow β€” powered by Veed.io β€” creates a fully automated TikTok video featuring your AI avatar with natural lip-sync and voice. The result is a continuous stream of viral short videos, made without cameras, editing, or effort. --- βœ… Import the JSON file in n8n, add your API keys (including Veed.io via FAL.ai), and start generating viral TikTok videos starring your AI avatar today! πŸŽ₯ Watch This Tutorial --- πŸ“„ Documentation: Notion Guide Need help customizing? Contact me for consulting and support : Linkedin / Youtube

By Dr. Firas
39510

Automate invoice processing with OCR, GPT-4 & Salesforce opportunity creation

PDF Invoice Extractor (AI) End-to-end pipeline: Watch Drive ➜ Download PDF ➜ OCR text ➜ AI normalize to JSON ➜ Upsert Buyer (Account) ➜ Create Opportunity ➜ Map Products ➜ Create OLI via Composite API ➜ Archive to OneDrive. --- Node by node (what it does & key setup) 1) Google Drive Trigger Purpose: Fire when a new file appears in a specific Google Drive folder. Key settings: Event: fileCreated Folder ID: google drive folder id Polling: everyMinute Creds: googleDriveOAuth2Api Output: Metadata { id, name, ... } for the new file. --- 2) Download File From Google Purpose: Get the file binary for processing and archiving. Key settings: Operation: download File ID: ={{ $json.id }} Creds: googleDriveOAuth2Api Output: Binary (default key: data) and original metadata. --- 3) Extract from File Purpose: Extract text from PDF (OCR as needed) for AI parsing. Key settings: Operation: pdf OCR: enable for scanned PDFs (in options) Output: JSON with OCR text at {{ $json.text }}. --- 4) Message a model (AI JSON Extractor) Purpose: Convert OCR text into strict normalized JSON array (invoice schema). Key settings: Node: @n8n/n8n-nodes-langchain.openAi Model: gpt-4.1 (or gpt-4.1-mini) Message role: system (the strict prompt; references {{ $json.text }}) jsonOutput: true Creds: openAiApi Output (per item): $.message.content β†’ the parsed JSON (ensure it’s an array). --- 5) Create or update an account (Salesforce) Purpose: Upsert Buyer as Account using an external ID. Key settings: Resource: account Operation: upsert External Id Field: taxid_c External Id Value: ={{ $json.message.content.buyer.tax_id }} Name: ={{ $json.message.content.buyer.name }} Creds: salesforceOAuth2Api Output: Account record (captures Id) for downstream Opportunity. --- 6) Create an opportunity (Salesforce) Purpose: Create Opportunity linked to the Buyer (Account). Key settings: Resource: opportunity Name: ={{ $('Message a model').item.json.message.content.invoice.code }} Close Date: ={{ $('Message a model').item.json.message.content.invoice.issue_date }} Stage: Closed Won Amount: ={{ $('Message a model').item.json.message.content.summary.grand_total }} AccountId: ={{ $json.id }} (from Upsert Account output) Creds: salesforceOAuth2Api Output: Opportunity Id for OLI creation. --- 7) Build SOQL (Code / JS) Purpose: Collect unique product codes from AI JSON and build a SOQL query for PricebookEntry by Pricebook2Id. Key settings: pricebook2Id (hardcoded in script): e.g., 01sxxxxxxxxxxxxxxx Source lines: $('Message a model').first().json.message.content.products Output: { soql, codes } --- 8) Query PricebookEntries (Salesforce) Purpose: Fetch PricebookEntry.Id for each Product2.ProductCode. Key settings: Resource: search Query: ={{ $json.soql }} Creds: salesforceOAuth2Api Output: Items with Id, Product2.ProductCode (used for mapping). --- 9) Code in JavaScript (Build OLI payloads) Purpose: Join lines with PBE results and Opportunity Id ➜ build OpportunityLineItem payloads. Inputs: OpportunityId: ={{ $('Create an opportunity').first().json.id }} Lines: ={{ $('Message a model').first().json.message.content.products }} PBE rows: from previous node items Output: { body: { allOrNone:false, records:[{ OpportunityLineItem... }] } } Notes: Converts discount_total ➜ per-unit if needed (currently commented for standard pricing). Throws on missing PBE mapping or empty lines. --- 10) Create Opportunity Line Items (HTTP Request) Purpose: Bulk create OLIs via Salesforce Composite API. 
Key settings: Method: POST URL: https://<your-instance>.my.salesforce.com/services/data/v65.0/composite/sobjects Auth: salesforceOAuth2Api (predefined credential) Body (JSON): ={{ $json.body }} Output: Composite API results (per-record statuses). --- 11) Update File to One Drive Purpose: Archive the original PDF in OneDrive. Key settings: Operation: upload File Name: ={{ $json.name }} Parent Folder ID: onedrive folder id Binary Data: true (from the Download node) Creds: microsoftOneDriveOAuth2Api Output: Uploaded file metadata. --- Data flow (wiring) Google Drive Trigger β†’ Download File From Google Download File From Google β†’ Extract from File β†’ Update File to One Drive Extract from File β†’ Message a model Message a model β†’ Create or update an account Create or update an account β†’ Create an opportunity Create an opportunity β†’ Build SOQL Build SOQL β†’ Query PricebookEntries Query PricebookEntries β†’ Code in JavaScript Code in JavaScript β†’ Create Opportunity Line Items --- Quick setup checklist πŸ” Credentials: Connect Google Drive, OneDrive, Salesforce, OpenAI. πŸ“‚ IDs: Drive Folder ID (watch) OneDrive Parent Folder ID (archive) Salesforce Pricebook2Id (in the JS SOQL builder) 🧠 AI Prompt: Use the strict system prompt; jsonOutput = true. 🧾 Field mappings: Buyer tax id/name β†’ Account upsert fields Invoice code/date/amount β†’ Opportunity fields Product name must equal your Product2.ProductCode in SF. βœ… Test: Drop a sample PDF β†’ verify: AI returns array JSON only Account/Opportunity created OLI records created PDF archived to OneDrive --- Notes & best practices If PDFs are scans, enable OCR in Extract from File. If AI returns non-JSON, keep β€œReturn only a JSON array” as the last line of the prompt and keep jsonOutput enabled. Consider adding validation on parsing.warnings to gate Salesforce writes. For discounts/taxes in OLI: Standard OLI fields don’t support per-line discount amounts directly; model them in UnitPrice or custom fields. Replace the Composite API URL with your org’s domain or use the Salesforce node’s Bulk Upsert for simplicity.
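
To make the "Build SOQL" step above concrete, the Code node could look roughly like this. It is a sketch that reuses the expressions quoted in the description; the `products[].name` field and the Pricebook2Id placeholder are assumptions taken from that text.

```javascript
// Build a PricebookEntry SOQL query from the AI-extracted invoice lines (sketch).
const pricebook2Id = "01sxxxxxxxxxxxxxxx"; // replace with your org's Pricebook2 Id
const lines = $('Message a model').first().json.message.content.products || [];

// Unique product codes, quoted and escaped for SOQL.
const codes = [...new Set(lines.map(l => l.name))];
const quoted = codes.map(c => `'${String(c).replace(/'/g, "\\'")}'`).join(",");

const soql =
  `SELECT Id, Product2.ProductCode FROM PricebookEntry ` +
  `WHERE Pricebook2Id = '${pricebook2Id}' ` +
  `AND Product2.ProductCode IN (${quoted})`;

return [{ json: { soql, codes } }];
```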

By Le Nguyen
942

Document RAG & chat agent: Google Drive to Qdrant with Mistral OCR

Knowledge RAG & AI Chat Agent: Google Drive to Qdrant Description This workflow transforms a Google Drive folder into an intelligent, searchable knowledge base and provides a chat agent to query it. It’s composed of two distinct flows: An ingestion pipeline to process documents. A live chat agent that uses RAG (Retrieval-Augmented Generation) and optional web search to answer user questions. This system fully automates the creation of a β€œChat with your docs” solution and enhances it with external web-searching capabilities. --- Quick Implementation Steps Import the workflow JSON into your n8n instance. Set up credentials for Google Drive, Mistral AI, OpenAI, and Qdrant. Open the Web Search node and add your Tavily AI API key to the Authorization header. In the Google Drive (List Files) node, set the Folder ID you want to ingest. Run the workflow manually once to populate your Qdrant database (Flow 1). Activate the workflow to enable the chat trigger (Flow 2). Copy the public webhook URL from the When chat message received node and open it in a new tab to start chatting. --- What It Does The workflow is divided into two primary functions: Knowledge Base Ingestion (Manual Trigger) This flow populates your vector database. Scans Google Drive: Lists all files from a specified folder. Processes Files Individually: Downloads each file. Extracts Text via OCR: Uses Mistral AI OCR API for text extraction from PDFs, images, etc. Generates Smart Metadata: A Mistral LLM assigns metadata like documenttype, project, and assignedto. Chunks & Embeds: Text is cleaned, chunked, and embedded via OpenAI’s text-embedding-3-small model. Stores in Qdrant: Text chunks, embeddings, and metadata are stored in a Qdrant collection (docaiauto). AI Chat Agent (Chat Trigger) This flow powers the conversational interface. Handles User Queries: Triggered when a user sends a chat message. Internal RAG Retrieval: Searches Qdrant Vector Store first for answers. Web Search Fallback: If unavailable internally, the agent offers to perform a Tavily AI web search. Contextual Responses: Combines internal and external info for comprehensive answers. --- Who's It For Ideal for: Teams building internal AI knowledge bases from Google Drive. Developers creating AI-powered support, research, or onboarding bots. Organizations implementing RAG pipelines. Anyone making unstructured Google Drive documents searchable via chat. --- Requirements n8n instance (self-hosted or cloud). Google Drive Credentials (to list and download files). Mistral AI API Key (for OCR & metadata extraction). OpenAI API Key (for embeddings and chat LLM). Qdrant instance (cloud or self-hosted). Tavily AI API Key (for web search). --- How It Works The workflow runs two independent flows in parallel: Flow 1: Ingestion Pipeline (Manual Trigger) List Files: Fetch files from Google Drive using the Folder ID. Loop & Download: Each file is processed one by one. OCR Processing: Upload file to Mistral Retrieve signed URL Extract text using Mistral DOC OCR Metadata Extraction: Analyze text using a Mistral LLM. Text Cleaning & Chunking: Split into 1000-character chunks. Embeddings Creation: Use OpenAI embeddings. Vector Insertion: Push chunks + metadata into Qdrant. Flow 2: AI Chat Agent (Chat Trigger) Chat Trigger: Starts when a chat message is received. AI Agent: Uses OpenAI + Simple Memory to process context. RAG Retrieval: Queries Qdrant for related data. Decision Logic: Found β†’ Form answer. Not found β†’ Ask if user wants web search. 
Web Search: Performs Tavily web lookup. Final Response: Synthesizes internal + external info. --- How To Set Up Import the Workflow Upload the provided JSON into your n8n instance. Configure Credentials Create and assign: Google Drive β†’ Google Drive nodes Mistral AI β†’ Upload, Signed URL, DOC OCR, Cloud Chat Model OpenAI β†’ Embeddings + Chat Model nodes Qdrant β†’ Vector Store nodes Add Tavily API Key Open Web Search node β†’ Parameters β†’ Headers Add your key under Authorization (e.g., tvly-xxxx). Node Configuration Google Drive (List Files): Set Folder ID. Qdrant Nodes: Ensure same collection name (docaiauto). Run Ingestion (Flow 1) Click Test workflow to populate Qdrant with your Drive documents. Activate Chat (Flow 2) Toggle the workflow ON to enable real-time chat. Test Open the webhook URL and start chatting! --- How To Customize Change LLMs: Swap models in OpenAI or Mistral nodes (e.g., GPT-4o, Claude 3). Modify Prompts: Edit the system message in ai chat agent to alter tone or logic. Chunking Strategy: Adjust chunkSize and chunkOverlap in the Code node. Different Sources: Replace Google Drive with AWS S3, Local Folder, etc. Automate Updates: Add a Cron node for scheduled ingestion. Validation: Add post-processing steps after metadata extraction. Expand Tools: Add more functional nodes like Google Calendar or Calculator. --- Use Case Examples Internal HR Bot: Answer HR-related queries from stored policy docs. Tech Support Assistant: Retrieve troubleshooting steps for products. Research Assistant: Summarize and compare market reports. Project Management Bot: Query document ownership or project status. --- Troubleshooting Guide | Issue | Possible Solution | |------------|------------------------| | Chat agent doesn’t respond | Check OpenAI API key and model availability (e.g., gpt-4.1-mini). | | Known documents not found | Ensure ingestion flow ran and both Qdrant nodes use same collection name. | | OCR node fails | Verify Mistral API key and input file integrity. | | Web search not triggered | Re-check Tavily API key in Web Search node headers. | | Incorrect metadata | Tune Information Extractor prompt or use a stronger Mistral model. | --- Need Help or More Workflows? Want to customize this workflow for your business or integrate it with your existing tools? Our team at Digital Biz Tech can tailor it precisely to your use case from automation logic to AI-powered enhancements. We can help you set it up for free β€” from connecting credentials to deploying it live. Contact: shilpa.raju@digitalbiz.tech Website: https://www.digitalbiz.tech LinkedIn: https://www.linkedin.com/company/digital-biz-tech/ You can also DM us on LinkedIn for any help. ---
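
As an illustration of the chunking step mentioned above, a Code node could split the cleaned text like this. The 1000-character chunk size mirrors the description; the `text` input field name and the overlap value are assumptions.

```javascript
// Fixed-size chunking with overlap for the ingestion pipeline (sketch).
const chunkSize = 1000;
const chunkOverlap = 100; // assumption – tune as described above
const text = (items[0].json.text || "").replace(/\s+/g, " ").trim();

const chunks = [];
for (let start = 0; start < text.length; start += chunkSize - chunkOverlap) {
  chunks.push(text.slice(start, start + chunkSize));
}

return chunks.map(chunk => ({ json: { chunk } }));
```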

By DIGITAL BIZ TECH
1409