Purpose
The LLM (Large Language Model) node is the brain of your AI chatbot. It connects to models like GPT-4o to provide intelligent responses, understand user intent, and search your trained knowledge base.
Configuration
- System Prompt: Define the AI's personality and core instructions. Example: "You are a helpful and friendly assistant for Zimflow."
- User Prompt: The main query sent to the AI. It's best to use variables here, such as {{last_message}}, to pass the user's most recent message.
- Enable Vector Search: When checked, the AI first searches your trained documents (FAQs, files, websites) for relevant context before generating a response. This is essential for building a knowledge base bot.
- Variable Name: The base name for the variables where the AI's response will be stored (e.g., ai_response). The node creates two variables: {{ai_response_message}} (the text to send to the user) and {{ai_response_intent}} (the detected intent).
- Intentions: Define specific user goals you want the AI to recognize (e.g., `book_appointment`, `check_status`). For each intention, provide a clear description of what it means.
- Intent Trigger Keywords: To make intent detection more reliable, you can provide specific keywords. If the AI's generated message contains one of these keywords, it will trigger the corresponding intent.
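Conceptually, the keyword fallback works like a case-insensitive match against the AI's generated message. A minimal sketch (the function and data names here are hypothetical; the node's actual matching rules are internal and may differ):

```python
# Hypothetical sketch of keyword-based intent triggering, assuming a simple
# case-insensitive substring match against the generated message.

def detect_intent(message: str, intent_keywords: dict[str, list[str]]) -> str:
    """Return the first intent whose trigger keywords appear in the message."""
    lowered = message.lower()
    for intent, keywords in intent_keywords.items():
        if any(kw.lower() in lowered for kw in keywords):
            return intent
    return ""  # no match: {{ai_response_intent}} stays empty

intents = {
    "book_appointment": ["book", "appointment", "schedule"],
    "check_status": ["status", "order"],
}

print(detect_intent("I'd like to schedule a visit", intents))  # book_appointment
print(detect_intent("hello there", intents))                   # (empty string)
```

The key point is that keywords act as a deterministic safety net on top of the model's own intent classification, which makes downstream branching more predictable.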
Usage Example
After an LLM node, you can use a Condition (Branch) node to check the value of {{ai_response_intent}}. If the intent is `book_appointment`, you can route the user to a booking flow. If the intent is empty, you can simply send the {{ai_response_message}} back to the user.
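The branch-on-intent pattern above can be sketched as a plain function (the flow names and the `route` helper are hypothetical, standing in for Condition-node branches):

```python
# Hypothetical sketch of routing after an LLM node: known intents go to a
# named sub-flow; an empty intent falls through to a direct reply.

def route(ai_response_intent: str, ai_response_message: str) -> str:
    """Decide the next step based on {{ai_response_intent}}."""
    if ai_response_intent == "book_appointment":
        return "booking_flow"   # hand the user to the booking flow
    if ai_response_intent == "check_status":
        return "status_flow"    # hand the user to the status-check flow
    # no intent detected: just send the AI's reply back to the user
    return f"reply:{ai_response_message}"

print(route("book_appointment", "Sure, let's book you in."))  # booking_flow
print(route("", "Hello! How can I help?"))                    # reply:Hello! How can I help?
```

In the flow builder, each `return` branch corresponds to one outgoing edge of the Condition node.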