Workflow: Chat with local LLMs using n8n and Ollama

Workflow Details

{
    "id": "af8RV5b2TWB2LclA",
    "meta": {
        "instanceId": "95f2ab28b3dabb8da5d47aa5145b95fe3845f47b20d6343dd5256b6a28ba8fab",
        "templateCredsSetupCompleted": true
    },
    "name": "Chat with local LLMs using n8n and Ollama",
    "tags": [],
    "nodes": [
        {
            "id": "475385fa-28f3-45c4-bd1a-10dde79f74f2",
            "name": "When chat message received",
            "type": "@n8n/n8n-nodes-langchain.chatTrigger",
            "position": [
                700,
                460
            ],
            "webhookId": "ebdeba3f-6b4f-49f3-ba0a-8253dd226161",
            "parameters": {
                "options": {}
            },
            "typeVersion": 1.1
        },
        {
            "id": "61133dc6-dcd9-44ff-85f2-5d8cc2ce813e",
            "name": "Ollama Chat Model",
            "type": "@n8n/n8n-nodes-langchain.lmChatOllama",
            "position": [
                900,
                680
            ],
            "parameters": {
                "options": {}
            },
            "credentials": {
                "ollamaApi": {
                    "id": "MyYvr1tcNQ4e7M6l",
                    "name": "Local Ollama"
                }
            },
            "typeVersion": 1
        },
        {
            "id": "3e89571f-7c87-44c6-8cfd-4903d5e1cdc5",
            "name": "Sticky Note",
            "type": "n8n-nodes-base.stickyNote",
            "position": [
                160,
                80
            ],
            "parameters": {
                "width": 485,
                "height": 473,
                "content": "## Chat with local LLMs using n8n and Ollama\nThis n8n workflow allows you to seamlessly interact with your self-hosted Large Language Models (LLMs) through a user-friendly chat interface. By connecting to Ollama, a powerful tool for managing local LLMs, you can send prompts and receive AI-generated responses directly within n8n.\n\n### How it works\n1. When chat message received: Captures the user's input from the chat interface.\n2. Chat LLM Chain: Sends the input to the Ollama server and receives the AI-generated response.\n3. Delivers the LLM's response back to the chat interface.\n\n### Set up steps\n* Make sure Ollama is installed and running on your machine before executing this workflow.\n* Edit the Ollama address if different from the default.\n"
            },
            "typeVersion": 1
        },
        {
            "id": "9345cadf-a72e-4d3d-b9f0-d670744065fe",
            "name": "Sticky Note1",
            "type": "n8n-nodes-base.stickyNote",
            "position": [
                1040,
                660
            ],
            "parameters": {
                "color": 6,
                "width": 368,
                "height": 258,
                "content": "## Ollama setup\n* Connect to your local Ollama, usually on http://localhost:11434\n* If running in Docker, make sure that the n8n container has access to the host's network in order to connect to Ollama. You can do this by passing the `--net=host` option when starting the n8n Docker container"
            },
            "typeVersion": 1
        },
        {
            "id": "eeffdd4e-6795-4ebc-84f7-87b5ac4167d9",
            "name": "Chat LLM Chain",
            "type": "@n8n/n8n-nodes-langchain.chainLlm",
            "position": [
                920,
                460
            ],
            "parameters": {},
            "typeVersion": 1.4
        }
    ],
    "active": false,
    "pinData": {},
    "settings": {
        "executionOrder": "v1"
    },
    "versionId": "3af03daa-e085-4774-8676-41578a4cba2d",
    "connections": {
        "Ollama Chat Model": {
            "ai_languageModel": [
                [
                    {
                        "node": "Chat LLM Chain",
                        "type": "ai_languageModel",
                        "index": 0
                    }
                ]
            ]
        },
        "When chat message received": {
            "main": [
                [
                    {
                        "node": "Chat LLM Chain",
                        "type": "main",
                        "index": 0
                    }
                ]
            ]
        }
    }
}
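Exports like the JSON above are often edited by hand before re-import, so a quick structural check of the `connections` map can catch broken links early. A minimal sketch (the `unresolved_connections` helper and the trimmed `sample` dict below are illustrative, not part of the export):

```python
# Sketch: verify that every connection in an n8n workflow export
# references a node that is actually declared in the "nodes" list.
def unresolved_connections(workflow: dict) -> list[str]:
    """Return names referenced in "connections" but missing from "nodes"."""
    node_names = {node["name"] for node in workflow["nodes"]}
    missing = []
    for source, kinds in workflow.get("connections", {}).items():
        if source not in node_names:
            missing.append(source)
        # kinds maps a connection type (e.g. "main", "ai_languageModel")
        # to a list of outputs, each holding a list of target links.
        for outputs in kinds.values():
            for output in outputs:
                for link in output:
                    if link["node"] not in node_names:
                        missing.append(link["node"])
    return missing

# Trimmed version of the workflow above:
sample = {
    "nodes": [
        {"name": "When chat message received"},
        {"name": "Ollama Chat Model"},
        {"name": "Chat LLM Chain"},
    ],
    "connections": {
        "Ollama Chat Model": {
            "ai_languageModel": [[{"node": "Chat LLM Chain", "type": "ai_languageModel", "index": 0}]]
        },
        "When chat message received": {
            "main": [[{"node": "Chat LLM Chain", "type": "main", "index": 0}]]
        },
    },
}

print(unresolved_connections(sample))  # prints [] — all links resolve
```

An empty result means every connection target (here, "Chat LLM Chain") exists as a declared node, matching the wiring in the export above.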
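Per the Ollama setup note in the workflow, n8n must be able to reach Ollama over HTTP before the chat chain can run. A minimal reachability sketch, assuming the default address from the note (`model_names` and `list_ollama_models` are hypothetical helper names; `/api/tags` is Ollama's model-listing endpoint):

```python
import json
import urllib.request

# Default Ollama address from the sticky note above; change this if your
# Ollama listens elsewhere (e.g. a Docker host gateway address).
OLLAMA_URL = "http://localhost:11434"

def model_names(payload: dict) -> list[str]:
    """Extract model names from an /api/tags response body."""
    return [model["name"] for model in payload.get("models", [])]

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Fetch the list of models the Ollama server reports."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))
```

If `list_ollama_models()` raises a connection error from inside the n8n container, the container likely cannot see the host's network, which is what the `--net=host` hint in the note addresses.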

Related Workflows

BillBot
RAG Workflow For Stock Earnings Report Analysis
HDW Lead Geländewagen
Readbinaryfile Spreadsheetfile Create
Filter Telegram Send Triggered
HTTP Manual Create Webhook
Spot Workplace Discrimination Patterns with AI
Monitor USDT ERC-20 Wallet Balance with Etherscan and Telegram Notifications
Stripe Payment Order Sync – Auto Retrieve Customer & Product Purchased
Scrape Trustpilot Reviews to Google Sheets