
Tutorial · Customer Service Bot

Build an end-to-end customer-service Agent that looks up orders, answers FAQs, and hands off to humans

Build a production-worthy customer-service Agent from scratch in 45 minutes.

What You'll Get

A customer-service Agent that:

  • Answers FAQs (from a FAQ knowledge base)
  • Looks up orders (calls the order system API)
  • Detects satisfaction (decides whether to hand off to a human)
  • Hands off to humans (when out of scope)

Architecture

┌─────────────────────────────────────────────────────┐
│  Chatflow Agent · Customer Service Pro              │
│                                                     │
│  Start → Intent classification                      │
│           ├─ Order lookup → Tool (orders) → reply   │
│           ├─ FAQ → Knowledge retrieval → LLM reply  │
│           ├─ Unsatisfied / handoff → notify human   │
│           └─ Other → LLM fallback                   │
└─────────────────────────────────────────────────────┘

Prerequisites

  • SaaS 5-minute walkthrough completed
  • 5–10 FAQ documents (FAQ / return policy / shipping info)
  • An order-lookup API (if not, you can mock the "Order lookup" branch)
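
If you don't have a real order API, the mock branch can be backed by a tiny local server. This is a sketch using only Python's standard library; the endpoint shape (`/api/orders/<orderId>`) matches the tool config in Step 2, and all order data here is fabricated for testing:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Fabricated test data — replace with whatever orders you want to simulate.
FAKE_ORDERS = {
    "ABC12345": {"status": "shipped", "ship_date": "2024-05-01", "amount": 59.9},
}

def lookup_order(order_id: str) -> dict:
    """Return order fields, or a 'not_found' record for unknown IDs."""
    return FAKE_ORDERS.get(order_id, {"status": "not_found", "ship_date": "", "amount": 0})

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Path looks like /api/orders/ABC12345 — take the last segment as the ID.
        order_id = self.path.rstrip("/").split("/")[-1]
        body = json.dumps(lookup_order(order_id)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve locally on port 8000:
# HTTPServer(("localhost", 8000), OrderHandler).serve_forever()
```

Point the Step 2 tool at `http://localhost:8000/api/orders/{{orderId}}` while testing, then swap in the real URL.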

Step 1 · Build the Knowledge Base (5 minutes)

  1. Workspace → Data · Knowledge base → New: Product FAQ
  2. Upload FAQ docs and wait for "ready"
  3. For more options, see Knowledge base in detail

Step 2 · Connect the Order System Tool (10 minutes)

2.1 Register a Credential

Org · Credential management:

Name: order_system_token
Type: API Key
Value: <your token>

2.2 Add the Tool

Workspace → Capabilities · Tools → Add HTTP Plugin:

Name: Order System
Method: queryOrder
URL:  https://order.example.com/api/orders/{{orderId}}
Auth: Bearer {{credential:order_system_token}}
Input schema:
  orderId: string, required
Output schema:
  status: string
  ship_date: string
  amount: number

Test connection → pass.
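
"Test connection" amounts to an authenticated GET against the URL above. A sketch of the equivalent request, built with the standard library — the URL and token are the placeholders from this step, not real values:

```python
import json
import urllib.request

BASE = "https://order.example.com/api/orders"  # from the tool config above

def build_request(order_id: str, token: str) -> urllib.request.Request:
    """Build the authenticated GET the Tool node would send."""
    return urllib.request.Request(
        f"{BASE}/{order_id}",
        headers={"Authorization": f"Bearer {token}"},
    )

def query_order(order_id: str, token: str) -> dict:
    """Fire the request and decode the JSON body (status / ship_date / amount)."""
    with urllib.request.urlopen(build_request(order_id, token), timeout=10) as resp:
        return json.load(resp)
```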

Step 3 · Create the Chatflow Agent (20 minutes)

3.1 New

Workspace → Apps · Agent → New → Chatflow → name it CS Pro.

3.2 Configure the Start Node

inputs:
  - name: message
    type: string
    required: true

3.3 Add an Intent Classification Node

Drag the Intent classification node (in "Logic"), connect to Start:

input: {{message}}
intents:
  - id: query_order
    description: User wants to look up order status, ship date, amount
  - id: faq
    description: Asks about returns, shipping, product features, etc.
  - id: transfer
    description: User is unhappy or explicitly asks for a human
  - id: other
    description: Other

3.4 Branch · Order Lookup

From the classifier's query_order output, drag a Parameter extraction node to pull orderId (a string) out of the user's message:

extraction_target:
  - name: orderId
    description: Order number, typically 8–12 alphanumeric characters
    required: true

→ Connect to a Tool node:

tool: Order System.queryOrder
inputs:
  orderId: {{orderId}}
output_var: order

→ Connect to an LLM node:

prompt: |
  Reply to the user with the order info:
  Order #: {{orderId}}
  Status: {{order.status}}
  Ship date: {{order.ship_date}}
  Amount: {{order.amount}}
 
  Use a customer-service tone — warm and clear.

→ Connect to Direct reply → End.
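
The {{…}} placeholders in the prompt resolve against earlier node outputs — orderId from the extraction node, order from the tool node. A minimal sketch of that substitution, assuming simple dotted-path lookup (the platform's actual renderer may differ):

```python
import re

def render(template: str, ctx: dict) -> str:
    """Resolve {{var}} and {{var.field}} against a context dict."""
    def sub(m: re.Match) -> str:
        value = ctx
        for part in m.group(1).split("."):  # walk dotted paths like order.status
            value = value[part]
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", sub, template)
```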

3.5 Branch · FAQ

The faq output → drag a Knowledge retrieval node:

knowledge_bases: [Product FAQ]
query: {{message}}
top_k: 5
output_var: chunks
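
Conceptually, the retrieval node scores FAQ chunks against the query and returns the top_k best matches. A toy word-overlap version — the real node uses embeddings, so this is only a stand-in to make the data flow concrete:

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def top_k(query: str, chunks: list[str], k: int = 5) -> list[str]:
    """Rank chunks by shared-word count with the query; return the best k."""
    q = tokens(query)
    return sorted(chunks, key=lambda c: len(q & tokens(c)), reverse=True)[:k]
```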

→ Connect to an LLM node:

prompt: |
  You are a CS assistant. Answer based on the following snippets:
  {{chunks}}
 
  Rules:
  - Answer only from the snippets
  - If not found, say "I'm not sure — would you like to be transferred to a human?"
  - Concise and polite

→ Connect to Direct reply → End.

3.6 Branch · Handoff

The transfer output → drag an HTTP node (notify the CS system to create a ticket):

url: https://hr.example.com/api/tickets
method: POST
body:
  user_id: {{user.id}}
  summary: {{message}}

→ Connect to Direct reply:

text: Connecting you to a human agent. Please hold — your ticket # is {{response.ticket_id}}.

End.
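
The HTTP node above boils down to a JSON POST. A sketch of the equivalent request built with the standard library, using the endpoint and field names from the node config (the ticket API itself is the tutorial's example, not a real service):

```python
import json
import urllib.request

TICKETS_URL = "https://hr.example.com/api/tickets"  # from the HTTP node above

def build_ticket_request(user_id: str, summary: str) -> urllib.request.Request:
    """Build the POST that creates a handoff ticket."""
    payload = json.dumps({"user_id": user_id, "summary": summary}).encode()
    return urllib.request.Request(
        TICKETS_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it:
# with urllib.request.urlopen(build_ticket_request("u42", "wants a human")) as resp:
#     ticket = json.load(resp)  # expected to contain ticket_id
```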

3.7 Branch · Fallback

The other output → a plain LLM node (no knowledge base) → a friendly general reply.

Step 4 · Debug (5 minutes)

Test each intent in the right-side debug panel:

Input                                  Expected branch
When will my order ABC12345 arrive?    query_order
What's your return policy?             faq
Talk to a human                        transfer
How's the weather today?               other
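
Before trusting the LLM classifier, you can sanity-check the routing table with a throwaway keyword classifier. The keywords below are arbitrary choices made only to reproduce the four test rows — the real classifier is the LLM node from Step 3.3:

```python
def classify(message: str) -> str:
    """Crude keyword routing that mirrors the four intents from Step 3.3."""
    m = message.lower()
    if "human" in m or "agent" in m:
        return "transfer"
    if "order" in m:
        return "query_order"
    if any(w in m for w in ("return", "shipping", "policy", "product")):
        return "faq"
    return "other"

# The debug-panel test cases from the table above.
CASES = {
    "When will my order ABC12345 arrive?": "query_order",
    "What's your return policy?": "faq",
    "Talk to a human": "transfer",
    "How's the weather today?": "other",
}
```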

Step 5 · Publish (2 minutes)

  • Top right of the editor → Publish to Workbench
  • Visibility: entire org / CS role only / specific department
  • Channels: Workbench + Web embed (for the business team to drop into the product page)

Step 6 · Post-Launch Observability (3 minutes)

Workspace → Observability:

  • Intent accuracy (hit rate of the classifier node)
  • Unrecognized intents (cold start; tells you what to add)
  • Per-node conversion rate (where users go most)
  • Cost per conversation (Tokens × model unit price)
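
The last metric is plain arithmetic: prompt and completion tokens times the model's per-1k prices. A sketch — the prices used in the example are placeholders, not real model rates:

```python
def conversation_cost(prompt_tokens: int, completion_tokens: int,
                      price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Cost of one conversation given token counts and per-1k-token prices."""
    return (prompt_tokens / 1000 * price_in_per_1k
            + completion_tokens / 1000 * price_out_per_1k)

# e.g. 2000 prompt tokens at 0.5/1k plus 500 completion tokens at 1.5/1k
```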

Going Further

  • Auto-learn high-frequency questions and accumulate FAQs → a scheduled Workflow that clusters unrecognized intents from chat logs
  • Multi-language support → in the LLM node prompt, add "Reply in {{lang}}"
  • Multi-channel (DingTalk / WeChat) → Deploy · Channels
  • A/B test prompts → add a Conditional node to route to different LLM nodes

Next Steps