Agentic AI – Customer Service Agents

Table Of Contents:

  1. What Are Customer Service Agents?
  2. What Tasks Can Customer Service Agents Perform?
  3. Steps To Build Customer Service Agents
  4. Python Implementation

(1) What Are Customer Service Agents?

(2) What Tasks Can Customer Service Agents Perform?

(3) Steps To Build Customer Service Agents

(4) Python Implementation

(1) Set Up the Environment

pip install openai langchain pymilvus tiktoken
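
The OpenAI components used below (OpenAIEmbeddings, ChatOpenAI) read the API key from the OPENAI_API_KEY environment variable, and the vector store steps assume a Milvus instance is reachable on localhost:19530. A minimal way to set the key from Python (the key value is a placeholder you must supply):

import os

# Assumed setup step: the OpenAI classes below read this environment variable.
os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your own key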

(2) Ingest & Embed Support Documents

from langchain.document_loaders import DirectoryLoader, TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from pymilvus import connections
from langchain.vectorstores.milvus import Milvus

# 1. Connect to Milvus (assumes a local instance on the default port)
connections.connect(alias="default", host="localhost", port="19530")

# 2. Load plain-text support documents
loader = DirectoryLoader("./support_docs", glob="**/*.txt", loader_cls=TextLoader)
docs = loader.load()

# 3. Split text into overlapping chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.split_documents(docs)

# 4. Create the embedding model
embeddings = OpenAIEmbeddings()

# 5. Embed the chunks and store them in Milvus
vectorstore = Milvus.from_documents(
    chunks,
    embedding=embeddings,
    collection_name="customer_support_docs",
    connection_args={"host": "localhost", "port": "19530"}
)
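
Before building the chain, it is worth a quick sanity check that retrieval returns sensible chunks. A minimal sketch using the vector store's similarity_search (the sample question is illustrative):

# Fetch the three chunks most similar to a sample question.
hits = vectorstore.similarity_search("How do I return a damaged product?", k=3)
for hit in hits:
    print(hit.metadata.get("source"), "->", hit.page_content[:80])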

(3) Create The RAG Chain With LangChain

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI

# Expose the Milvus vector store as a retriever and choose the chat model.
retriever = vectorstore.as_retriever()
llm = ChatOpenAI(model="gpt-3.5-turbo")

# "stuff" puts the retrieved chunks directly into the prompt;
# return_source_documents=True also returns the chunks used for the answer.
rag_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=retriever,
    chain_type="stuff",
    return_source_documents=True
)
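
Optionally, the "stuff" chain also accepts a custom prompt through chain_type_kwargs, which helps enforce a consistent support tone. A minimal sketch, with an illustrative prompt (adjust the wording to your own policies):

from langchain.prompts import PromptTemplate

# Illustrative prompt: {context} receives the retrieved chunks, {question} the user query.
support_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "You are a helpful customer support assistant.\n"
        "Answer using only the context below. If the answer is not in the context, say so.\n\n"
        "Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
)

rag_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=retriever,
    chain_type="stuff",
    return_source_documents=True,
    chain_type_kwargs={"prompt": support_prompt}
)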

(4) Query The RAG Chain

query = "How can I return a damaged product?"
response = rag_chain(query)

print("Answer:", response["result"])
for doc in response["source_documents"]:
    print("Source:", doc.metadata["source"])

(5) Define Tools

from langchain.agents import Tool

# Tool 1: wrap the RetrievalQA chain from Step 3.
# Note: rag_chain.run would raise an error here because the chain returns two
# outputs (result + source_documents), so we extract the answer explicitly.
retrieval_tool = Tool(
    name="support_doc_lookup",
    func=lambda q: rag_chain({"query": q})["result"],
    description="Use this to answer customer support queries from FAQs, return policies, etc."
)

# Tool 2: Simulated product API call
def product_api_tool(query: str) -> str:
    # You can replace this with a real API call
    if "price" in query.lower():
        return "The current price of the product is $199."
    return "No product data found."

product_tool = Tool(
    name="product_api",
    func=product_api_tool,
    description="Use this to get product info such as pricing, availability, or features."
)

# Tool 3: Escalate to human
def escalate_tool(query: str) -> str:
    return f"This question may require human support. Escalating: {query}"

escalate = Tool(
    name="escalate_human",
    func=escalate_tool,
    description="Use this when the question seems out of scope or needs human help."
)
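
Each Tool can also be called directly before handing it to the agent, which makes debugging simpler. For example (the query strings are illustrative):

# Exercise the tools on their own, outside the agent loop.
print(product_tool.run("What is the price of the X100 wireless speaker?"))
print(escalate.run("I want to file a legal complaint."))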

(6) Initialize The Agent

from langchain.agents import initialize_agent, AgentType
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(model="gpt-3.5-turbo")

# Create memory so the agent can refer back to earlier turns
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# The chat-conversational ReAct agent actually consumes the chat_history
# supplied by the memory; the zero-shot variant would ignore it.
agent = initialize_agent(
    tools=[retrieval_tool, product_tool, escalate],
    llm=llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    memory=memory,
    verbose=True
)

(7) Ask A Question

# Policy question: expected to use the support_doc_lookup tool
response = agent.run("What is the return policy for damaged items?")
print(response)

# Product question: expected to use the product_api tool
response2 = agent.run("How much does the X100 wireless speaker cost?")
print(response2)

# Out-of-scope request: expected to be escalated to a human
response3 = agent.run("I want to file a legal complaint.")
print(response3)
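
To run this as an ongoing support session rather than single calls, the agent can be wrapped in a simple input loop. A minimal sketch:

# Minimal interactive loop; type "quit" to end the session.
while True:
    user_input = input("Customer: ")
    if user_input.strip().lower() in {"quit", "exit"}:
        break
    print("Agent:", agent.run(user_input))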
