
LangGraph Quickstart

Benchmarked against: Anthropic TypeScript V2 (preview)
Line: Line 2 (LangGraph Self-Built)
Prerequisites: Python 3.11+, a .venv with LangGraph installed

Build a LangGraph agent from scratch that connects to SuperPortia's infrastructure.


Step 1: Set up environment

# Create virtual environment
python -m venv .venv
source .venv/bin/activate

# Install LangGraph ecosystem
pip install langgraph langgraph-checkpoint-sqlite langchain-mcp-adapters

Step 2: Define the state

from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_core.messages import HumanMessage

# MessagesState tracks conversation history
graph_builder = StateGraph(MessagesState)
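MessagesState is essentially a dict with a `messages` key whose reducer appends each node's update to the running history instead of overwriting it. A pure-Python sketch of that accumulation behavior (a simplification for intuition, not LangGraph's actual `add_messages` implementation):

```python
# Toy reducer mimicking how MessagesState accumulates messages:
# each node returns a partial update, and the reducer appends it
# to the existing history rather than replacing it.

def add_messages(existing: list, update: list) -> list:
    """Append new messages to the running history."""
    return existing + update

state = {"messages": []}
state["messages"] = add_messages(state["messages"], ["user: hi"])
state["messages"] = add_messages(state["messages"], ["ai: hello!"])
print(state["messages"])  # ['user: hi', 'ai: hello!']
```

This is why every node below returns `{"messages": [...]}` with only its new messages: the graph merges them in, rather than each node rebuilding the whole history.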

Step 3: Create the agent node

from langchain_google_genai import ChatGoogleGenerativeAI

# Use Gemini for low-cost operations
llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash")

def agent_node(state: MessagesState):
    """The agent decides what to do next."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

graph_builder.add_node("agent", agent_node)

Step 4: Add tools

import requests
from langchain_core.tools import tool

@tool
def search_knowledge(query: str) -> str:
    """Search the Universal Brain for knowledge."""
    # This would connect to Cloud UB via API
    response = requests.post(
        "https://your-worker.workers.dev/brain/search",
        json={"query": query},
        timeout=30,
    )
    response.raise_for_status()
    return str(response.json())

# Bind the tools to the LLM. agent_node looks up `llm` at call time,
# so rebinding the same name lets the existing node emit tool calls;
# binding to a new name would leave the agent using the tool-less model.
tools = [search_knowledge]
llm = llm.bind_tools(tools)

Step 5: Add routing logic

from langgraph.prebuilt import ToolNode

tool_node = ToolNode(tools)
graph_builder.add_node("tools", tool_node)
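ToolNode reads the tool calls on the last AI message, runs the matching tools, and appends the results as new messages. A stdlib sketch of that dispatch loop (illustrative only, not the real ToolNode; plain dicts stand in for LangChain message objects):

```python
def run_tools(state: dict, registry: dict) -> dict:
    """Execute each requested tool call and append its result."""
    last = state["messages"][-1]
    results = []
    for call in last.get("tool_calls", []):
        fn = registry[call["name"]]  # look up the tool by name
        results.append({"role": "tool", "content": fn(**call["args"])})
    return {"messages": state["messages"] + results}

registry = {"search_knowledge": lambda query: f"results for {query!r}"}
state = {"messages": [{
    "role": "ai",
    "tool_calls": [{"name": "search_knowledge",
                    "args": {"query": "fleet architecture"}}],
}]}
out = run_tools(state, registry)
print(out["messages"][-1]["content"])  # results for 'fleet architecture'
```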

def should_use_tools(state: MessagesState):
    """Route to tools if the agent wants to use them."""
    last_message = state["messages"][-1]
    if hasattr(last_message, "tool_calls") and last_message.tool_calls:
        return "tools"
    return END

graph_builder.add_edge(START, "agent")
graph_builder.add_conditional_edges("agent", should_use_tools)
graph_builder.add_edge("tools", "agent")
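The three edges above form a loop: START → agent; agent → tools whenever the reply carries tool calls; tools → agent; and agent → END otherwise. The routing decision itself is easy to check in isolation (dicts stand in for message objects here):

```python
END = "__end__"  # sentinel, mirroring langgraph.graph.END

def route(last_message: dict) -> str:
    """Go to tools if the last message requested any, else finish."""
    return "tools" if last_message.get("tool_calls") else END

print(route({"tool_calls": [{"name": "search_knowledge"}]}))  # tools
print(route({"tool_calls": []}))                              # __end__
print(route({}))                                              # __end__
```

The loop terminates because the model eventually produces a reply with no tool calls, which routes to END.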

Step 6: Compile and run

# Add memory for state persistence
import sqlite3
from langgraph.checkpoint.sqlite import SqliteSaver

# SqliteSaver.from_conn_string() returns a context manager in current
# releases, so build the saver from a connection directly instead
memory = SqliteSaver(sqlite3.connect(":memory:", check_same_thread=False))
app = graph_builder.compile(checkpointer=memory)

# Run the agent
config = {"configurable": {"thread_id": "session-001"}}
result = app.invoke(
    {"messages": [HumanMessage(content="What do we know about fleet architecture?")]},
    config=config,
)

print(result["messages"][-1].content)
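The `thread_id` in config is the key the checkpointer stores state under: reusing it resumes the same conversation, while a fresh id starts a new one. A toy in-memory saver illustrating that keying (hypothetical, not the SqliteSaver API):

```python
class ToySaver:
    """Keeps one message history per thread_id, like a checkpointer."""
    def __init__(self):
        self._threads: dict[str, list] = {}

    def load(self, thread_id: str) -> list:
        return self._threads.get(thread_id, [])

    def save(self, thread_id: str, messages: list) -> None:
        self._threads[thread_id] = messages

saver = ToySaver()
saver.save("session-001", ["What do we know about fleet architecture?"])
saver.save("session-002", ["Unrelated question"])

# histories are isolated per thread
print(len(saver.load("session-001")))  # 1
print(saver.load("session-003"))       # [] (new thread starts empty)
```

Invoking the app again with `thread_id: "session-001"` would pick up the earlier messages automatically; a different id sees none of them.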

What you built

Component          | What it does
StateGraph         | Manages conversation state
agent_node         | LLM decides next action
ToolNode           | Executes tool calls
SqliteSaver        | Persists state across turns
Conditional edges  | Routes between agent and tools

Next steps

Enhancement             | How
Add more tools          | MCP adapters for Cloud UB, file ops
Multi-agent             | Use supervisor.py for orchestration
Better engine           | Swap Gemini for Claude for complex tasks
Persistent checkpoints  | Use file-based SQLite instead of in-memory
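For the persistent-checkpoints enhancement, the only change is pointing the SQLite connection at a file instead of `:memory:` when constructing the saver. A stdlib sqlite3 sketch of why that matters (table name and path are illustrative, not LangGraph's schema):

```python
import os
import sqlite3
import tempfile

# A file-backed database survives the process; :memory: does not.
path = os.path.join(tempfile.mkdtemp(), "checkpoints.db")

conn = sqlite3.connect(path)
conn.execute("CREATE TABLE IF NOT EXISTS checkpoints (thread_id TEXT, state TEXT)")
conn.execute("INSERT INTO checkpoints VALUES (?, ?)", ("session-001", "{}"))
conn.commit()
conn.close()

conn = sqlite3.connect(path)  # "restart": reopen and the row is still there
rows = conn.execute("SELECT thread_id FROM checkpoints").fetchall()
conn.close()
print(rows)  # [('session-001',)]
```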

Page                | Relationship
Python SDK          | Full Python SDK reference
Agent SDK Overview  | Dual lines context
Engine Overview     | Available engines for LangGraph