AI Agent: A Personalized Chatbot Using LangGraph and LangChain

Imagine having your own smart AI assistant — like ChatGPT, but customized for your own needs! That’s exactly what this project is about.
We’ve built a custom AI agent using LangGraph, LangChain, and Tavily, which together let you interact with powerful AI models (like OpenAI’s GPT-4 or Groq’s LLaMA 3). You can define how the AI should behave, ask it questions, and even let it search the web for real-time answers.
Let’s break it down in simple terms!
🧠 What is an AI Agent?
An AI agent is like a smart virtual assistant that can:
- Understand your instructions
- Decide what actions to take
- Use tools (like a search engine)
- Give helpful responses
Instead of just answering questions like a plain chatbot, an AI agent can reason about a request, call tools when needed, and respond with current information.
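To make that loop concrete, here is a toy sketch of the reason-act-observe cycle an agent runs. This is not the real LangGraph implementation — the "search tool" here is just a hypothetical dictionary lookup used for illustration:

```python
# Stand-in for a real web-search tool like Tavily (illustration only).
FAKE_SEARCH_INDEX = {
    "capital of france": "Paris is the capital of France.",
}

def fake_search(query: str) -> str:
    """Return a canned 'search result' for the query."""
    return FAKE_SEARCH_INDEX.get(query.lower(), "No results found.")

def toy_agent(question: str) -> str:
    # 1. "Think": decide whether a tool is needed for this question.
    needs_search = "capital" in question.lower()
    if needs_search:
        # 2. "Act": call the tool and observe its output.
        observation = fake_search(question)
        # 3. "Respond": fold the observation into the final answer.
        return f"Based on a search: {observation}"
    return "I can answer that directly."

print(toy_agent("capital of France"))
```

Real agents replace the hard-coded decision with an LLM that chooses when and how to call tools, but the overall loop is the same.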
🧩 Tools We Used
1. LangChain
LangChain is like the “brain wiring” that connects different parts of your AI — such as the model (GPT or LLaMA), tools (like search), and how it behaves.
2. LangGraph
LangGraph builds on LangChain and lets you create agents that can reason and act step-by-step, just like a human. It’s perfect for building:
- Chatbots
- Search assistants
- Task automation bots
3. Tavily
Tavily is a search engine tool for AI. If your agent doesn’t know the answer, it can use Tavily to search the web and find up-to-date info.
💻 The Project Overview
🎯 Goal:
Build an AI assistant that you can control:
- Choose the AI model you want (like GPT-4 or LLaMA3)
- Give it a personality or role (like a teacher or travel guide)
- Let it search the web if needed
- Chat with it directly
Code:
ai_agent.py
```python
from dotenv import load_dotenv
load_dotenv()

import os

from langchain_groq import ChatGroq
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search import TavilySearchResults
from langgraph.prebuilt import create_react_agent
from langchain_core.messages.ai import AIMessage

GROQ_API_KEY = os.getenv("GROQ_API_KEY")
TAVILY_API_KEY = os.getenv("TAVILY_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

def get_response_from_ai_agent(llm_id, query, allow_search, system_prompt, provider):
    # Pick the chat model based on the selected provider.
    if provider == "Groq":
        llm = ChatGroq(model=llm_id)
    elif provider == "OpenAI":
        llm = ChatOpenAI(model=llm_id)
    else:
        raise ValueError(f"Unknown provider: {provider}")

    # Attach the Tavily web-search tool only if the user allowed it.
    tools = [TavilySearchResults(max_results=2)] if allow_search else []

    agent = create_react_agent(
        model=llm,
        tools=tools,
        state_modifier=system_prompt  # system prompt that shapes the agent's persona
    )

    # LangGraph agents take a state dict; "messages" holds the conversation.
    state = {"messages": query}
    response = agent.invoke(state)
    messages = response.get("messages")

    # Keep only the AI's replies and return the most recent one.
    ai_messages = [message.content for message in messages if isinstance(message, AIMessage)]
    return ai_messages[-1]
```
This file defines the core logic for creating and interacting with a personalized AI agent using LangGraph, Groq/OpenAI models, and optional web search (via Tavily). The agent responds to user queries based on a given system prompt.
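The last step of `get_response_from_ai_agent` — filtering the agent's message history down to the final AI reply — can be illustrated with plain stand-in classes. These are simplified mock-ups for demonstration, not the real LangChain message types:

```python
# Simplified stand-ins for LangChain's message classes (illustration only).
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content):
        self.content = content

def last_ai_reply(messages):
    """Mirrors the filtering logic at the end of get_response_from_ai_agent."""
    ai_contents = [m.content for m in messages if isinstance(m, AIMessage)]
    return ai_contents[-1]

history = [
    HumanMessage("What is LangGraph?"),
    AIMessage("LangGraph is a library for building stateful agents."),
    HumanMessage("Thanks!"),
    AIMessage("You're welcome!"),
]
print(last_ai_reply(history))  # -> You're welcome!
```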
backend.py
```python
from dotenv import load_dotenv
load_dotenv()

from fastapi import FastAPI
from pydantic import BaseModel
from typing import List

from ai_agent import get_response_from_ai_agent

# Schema of the JSON body the /chat endpoint expects.
class RequestState(BaseModel):
    model_name: str
    model_provider: str
    system_prompt: str
    messages: List[str]
    allow_search: bool

ALLOWED_MODEL_NAMES = [
    "llama3-70b-8192",
    "mixtral-8x7b-32768",
    "llama-3.3-70b-versatile",
    "gpt-4o-mini",
    "llama-3.1-8b-instant",
    "gemma2-9b-it"
]

app = FastAPI(title="LangGraph AI Agent")

@app.post("/chat")
def chat_endpoint(request: RequestState):
    # Reject model names the app doesn't support.
    if request.model_name not in ALLOWED_MODEL_NAMES:
        return {"error": "Invalid model name. Kindly select a valid AI model"}
    response = get_response_from_ai_agent(
        request.model_name,
        request.messages,
        request.allow_search,
        request.system_prompt,
        request.model_provider
    )
    return response

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="127.0.0.1", port=9999)
```
This file wraps the agent in a FastAPI server: it validates the requested model name and forwards the request to get_response_from_ai_agent.
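The JSON body a client sends to /chat mirrors the RequestState schema above. Here is a sketch of a valid payload and the same model-name check the backend performs before calling the agent (the prompt and query values are just examples):

```python
ALLOWED_MODEL_NAMES = [
    "llama3-70b-8192",
    "mixtral-8x7b-32768",
    "llama-3.3-70b-versatile",
    "gpt-4o-mini",
    "llama-3.1-8b-instant",
    "gemma2-9b-it",
]

# Example request body matching the RequestState schema.
payload = {
    "model_name": "llama-3.3-70b-versatile",
    "model_provider": "Groq",
    "system_prompt": "Act as a friendly travel guide.",
    "messages": ["Suggest a weekend trip near the mountains"],
    "allow_search": True,
}

# The same validation the /chat endpoint runs before invoking the agent.
def is_valid_model(name: str) -> bool:
    return name in ALLOWED_MODEL_NAMES

print(is_valid_model(payload["model_name"]))  # -> True
```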


frontend.py
```python
from dotenv import load_dotenv
load_dotenv()

import streamlit as st
import requests

st.set_page_config(page_title="LangGraph Agent UI", layout="centered")
st.title("AI Chatbot Agents")
st.write("Create and Interact with the AI Agents!")

system_prompt = st.text_area("Define your AI Agent:", height=70, placeholder="Type your system prompt here...")

MODEL_NAMES_GROQ = ["llama-3.3-70b-versatile", "llama-3.1-8b-instant"]
MODEL_NAMES_OPENAI = ["gpt-4o-mini"]

provider = st.radio("Select Provider:", ("Groq", "OpenAI"))
selected_model = st.selectbox(
    "Select Model:",
    MODEL_NAMES_GROQ if provider == "Groq" else MODEL_NAMES_OPENAI
)
allow_web_search = st.checkbox("Allow Web Search")
user_query = st.text_area("Enter your query:", height=150, placeholder="Ask Anything!")

API_URL = "http://127.0.0.1:9999/chat"

if st.button("Ask Agent!"):
    if user_query.strip():
        # Build the request body expected by the backend's RequestState schema.
        payload = {
            "model_name": selected_model,
            "model_provider": provider,
            "system_prompt": system_prompt,
            "messages": [user_query],
            "allow_search": allow_web_search
        }
        response = requests.post(API_URL, json=payload)
        if response.status_code == 200:
            data = response.json()
            # The backend returns a dict only on validation errors;
            # a successful call returns the agent's reply as a string.
            if isinstance(data, dict) and "error" in data:
                st.error(data["error"])
            else:
                st.subheader("Agent Response")
                st.markdown(f"**Final Response:** {data}")
        else:
            st.error("Request to the backend failed. Is backend.py running?")
```
This Streamlit UI collects the system prompt, model choice, and query, posts them to the FastAPI backend, and displays the agent's reply.
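To try it locally, you would typically start the backend first, then launch the Streamlit UI in a second terminal (assuming the dependencies are installed and the API keys are set in a .env file):

```shell
# Terminal 1: start the FastAPI backend on port 9999
python backend.py

# Terminal 2: launch the Streamlit frontend
streamlit run frontend.py
```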
GitHub Repo : https://github.com/balrajegorad/Chatbot—AI-Agent.git
🚀 Final Thoughts
This project is a powerful example of how you can build your own intelligent AI assistant using cutting-edge tools — and the best part is, you’re in control of how the AI behaves, what tools it uses, and what models it runs on.
Whether you’re a student, a developer, or just curious about AI — this is a great starting point to explore the world of AI agents. 🔥