LangChain Study Notes · Day 7: An Intelligent Chat Assistant (Chain + Memory)
Tying together the previous six days: build a multi-turn chat assistant on top of LCEL and Memory (Buffer/Window/Summary), with minimal implementations as a CLI, a Streamlit web page, and a FastAPI endpoint, plus a brief look at packaging and release.
20-minute read
LangChain · Python · LCEL · Memory · RunnableWithMessageHistory · Streamlit · FastAPI · CLI
🎯 Learning Goals
- Review the core concepts of Prompt / LCEL / Memory
- Build a multi-turn chat assistant with Chain + RunnableWithMessageHistory
- Provide three memory strategies (Buffer / Window / Summary) with a switch to choose between them
- Ship minimal application forms: CLI, Streamlit web page, FastAPI endpoint
- Learn how to package and deploy (pipx / PyInstaller / Docker)
🔁 Quick Review
- LCEL pipeline: prompt | llm | parser, composed declaratively (see the minimal sketch after this list)
- MessagesPlaceholder("history"): injects the conversation history into the prompt
- RunnableWithMessageHistory: binds a history reader/writer (one history per session_id)
- Memory strategies:
  - Buffer: keep the full history (simple, but token usage balloons)
  - Window: keep only the most recent K messages (saves tokens, may forget earlier context)
  - Summary: roll the history up into a SystemMessage summary (preserves key long-term information)
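As a warm-up, here is a minimal sketch of the wiring described above; the model name, session id, and prompt text are placeholders, and the full script with all three memory strategies follows below.
# recap_sketch.py — minimal LCEL + message-history wiring (illustrative only)
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.chat_history import InMemoryChatMessageHistory

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant."),
    MessagesPlaceholder("history"),
    ("human", "{input}"),
])
chain = prompt | ChatOpenAI(model="kimi-k2-0711-preview") | StrOutputParser()  # LCEL pipeline

store = {}  # one InMemoryChatMessageHistory per session_id
chat = RunnableWithMessageHistory(
    chain,
    lambda sid: store.setdefault(sid, InMemoryChatMessageHistory()),
    input_messages_key="input",
    history_messages_key="history",
)
print(chat.invoke({"input": "Hi, I'm Alice."}, {"configurable": {"session_id": "demo"}}))
print(chat.invoke({"input": "What's my name?"}, {"configurable": {"session_id": "demo"}}))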
🧱 Architecture & Environment
Input (user message)
  └─> Prompt (with history) ── LCEL ──> Chat LLM ──> StrOutputParser
                 ↑
      RunnableWithMessageHistory (reads/writes history by session_id)
Environment variables (using Kimi's OpenAI-compatible endpoint as an example)
export OPENAI_API_KEY="your_KIMI_API_KEY"
export OPENAI_BASE_URL="https://api.moonshot.cn/v1"
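An optional quick smoke test to confirm the key and base URL are picked up (the model name matches the default used in the script below; adjust it to whatever your account offers):
# smoke_test.py — optional connectivity check, assuming the env vars above are set
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="kimi-k2-0711-preview", temperature=0)
print(llm.invoke("ping").content)  # any short reply means the endpoint is reachable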
🧩 Core Script: day7_chat_assistant.py
Features: three memory modes (Buffer/Window/Summary) with an interactive command-line REPL. Run:
python day7_chat_assistant.py (or see the CLI packaging below)
# file: day7_chat_assistant.py
import os
from typing import Dict, Any, Literal
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableLambda
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.messages import SystemMessage
ASSISTANT_STYLE = (
"你是一位友好、专业且简洁的中文助理;"
"优先给出直接答案,再给必要解释;"
"不确定时要诚实说明。"
)
MEMORY_MODE: Literal["buffer", "window", "summary"] = os.getenv("DAY7_MEMORY_MODE", "buffer")
WINDOW_K = int(os.getenv("DAY7_WINDOW_K", "8"))
SUMMARIZE_THRESHOLD = int(os.getenv("DAY7_SUM_THRESHOLD", "14"))
MODEL_NAME = os.getenv("DAY7_MODEL", "kimi-k2-0711-preview")
llm = ChatOpenAI(model=MODEL_NAME, temperature=0.5)
base_prompt = ChatPromptTemplate.from_messages([
("system", ASSISTANT_STYLE),
MessagesPlaceholder("history"),
("human", "{input}")
])
base_chain = base_prompt | llm | StrOutputParser()
# One in-memory chat history per session_id
_hist_store: Dict[str, InMemoryChatMessageHistory] = {}

def get_history(session_id: str) -> InMemoryChatMessageHistory:
    if session_id not in _hist_store:
        _hist_store[session_id] = InMemoryChatMessageHistory()
    return _hist_store[session_id]
# Window strategy: keep only the most recent K history messages before prompting
def trim_last_k(inputs: Dict[str, Any]) -> Dict[str, Any]:
    hist = inputs.get("history", [])
    return {"history": hist[-WINDOW_K:], "input": inputs["input"]}

trim = RunnableLambda(trim_last_k)
# Summary strategy: a small chain that condenses the conversation into a short summary
sum_prompt = ChatPromptTemplate.from_messages([
    ("system", "请将以下对话历史总结为不超过120字的摘要,保留姓名、职业、城市、偏好、目标等长期信息。语气客观精炼。"),
    ("human", "{history_text}")
])
summarizer = sum_prompt | llm | StrOutputParser()
def maybe_rollup_summary(session_id: str):
    """Once the history reaches the threshold, replace it with a single summary SystemMessage."""
    history = get_history(session_id)
    if len(history.messages) >= SUMMARIZE_THRESHOLD:
        text = "\n".join(f"{m.type.upper()}: {getattr(m, 'content', '')}" for m in history.messages)
        summary = summarizer.invoke({"history_text": text}).strip()
        history.clear()
        history.add_message(SystemMessage(content=f"【长期摘要】{summary}"))
def build_runnable():
    if MEMORY_MODE == "buffer":
        return base_chain          # full history, as injected
    elif MEMORY_MODE == "window":
        return trim | base_chain   # trim to the last K messages first
    elif MEMORY_MODE == "summary":
        return base_chain          # roll-up happens after each turn, see maybe_rollup_summary()
    else:
        raise ValueError(f"Unknown MEMORY_MODE: {MEMORY_MODE}")
chat_runnable = build_runnable()
chat = RunnableWithMessageHistory(
    chat_runnable,
    get_history,
    input_messages_key="input",
    history_messages_key="history",  # matches MessagesPlaceholder("history")
    # no output_messages_key needed: the chain returns a plain string via StrOutputParser
)
HELP = """指令:
/help 查看帮助
/reset 重置当前会话记忆
/mode 查看当前记忆模式
/quit 退出
"""
def main():
session_id = os.getenv("DAY7_SESSION_ID", "user-001")
cfg = {"configurable": {"session_id": session_id}}
print(f"🤖 智能聊天助理(Memory: {MEMORY_MODE})— 会话ID: {session_id}")
print(HELP)
while True:
try:
user = input("你:").strip()
except (EOFError, KeyboardInterrupt):
print("\n再见~")
break
if not user:
continue
if user == "/help":
print(HELP); continue
if user == "/quit":
print("再见~"); break
if user == "/mode":
print(f"当前记忆模式:{MEMORY_MODE};窗口K={WINDOW_K};摘要阈值={SUMMARIZE_THRESHOLD}"); continue
if user == "/reset":
get_history(session_id).clear(); print("✅ 已重置会话记忆。"); continue
        ai = chat.invoke({"input": user}, cfg)
        print("助理:", ai)
        if MEMORY_MODE == "summary":
            maybe_rollup_summary(session_id)  # roll long histories up into a summary
if __name__ == "__main__":
main()
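To switch memory strategies or tune the window size and summary threshold, set the environment variables read at the top of the script before launching, for example:
# Window mode, keeping only the last 4 messages
DAY7_MEMORY_MODE=window DAY7_WINDOW_K=4 python day7_chat_assistant.py
# Summary mode, rolling up once the history reaches 10 messages
DAY7_MEMORY_MODE=summary DAY7_SUM_THRESHOLD=10 python day7_chat_assistant.py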
🖥️ Web UI: day7_streamlit.py
A single-file web UI with no separate backend. Launch:
streamlit run day7_streamlit.py
# day7_streamlit.py
import streamlit as st
from day7_chat_assistant import chat, get_history, maybe_rollup_summary, MEMORY_MODE
st.set_page_config(page_title="Day7 Chat Assistant", page_icon="🤖", layout="centered")
st.title("🤖 Day7 智能聊天助理")
session_id = st.sidebar.text_input("会话ID", value="user-web-001")
st.sidebar.write(f"记忆模式:{MEMORY_MODE}")
if "messages" not in st.session_state:
st.session_state.messages = []
user_input = st.chat_input("输入你的消息…")
cfg = {"configurable": {"session_id": session_id}}
if user_input:
st.session_state.messages.append(("user", user_input))
ai = chat.invoke({"input": user_input}, cfg)
st.session_state.messages.append(("ai", ai))
    # In summary mode, try rolling the history up into a summary
    if MEMORY_MODE == "summary":
        maybe_rollup_summary(session_id)
for role, content in st.session_state.messages:
with st.chat_message("user" if role=="user" else "assistant"):
st.write(content)
🌐 Minimal HTTP API: day7_api.py
# day7_api.py
from fastapi import FastAPI
from pydantic import BaseModel
from day7_chat_assistant import chat, maybe_rollup_summary, MEMORY_MODE
app = FastAPI()
class ChatReq(BaseModel):
session_id: str
input: str
@app.post("/chat")
def chat_api(req: ChatReq):
    cfg = {"configurable": {"session_id": req.session_id}}
    ai = chat.invoke({"input": req.input}, cfg)
    if MEMORY_MODE == "summary":  # only roll up in summary mode
        maybe_rollup_summary(req.session_id)
    return {"output": ai}
Run
pip install fastapi uvicorn
uvicorn day7_api:app --reload --port 8000
# POST http://127.0.0.1:8000/chat { "session_id": "u1", "input": "你好" }
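For example, with curl:
curl -X POST http://127.0.0.1:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"session_id": "u1", "input": "你好"}'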
🗃️ Optional: Persisting History in Redis
from langchain_community.chat_message_histories import RedisChatMessageHistory

# Drop-in replacement for get_history() in day7_chat_assistant.py
def get_history(session_id: str):
    return RedisChatMessageHistory(
        session_id=session_id,
        url="redis://localhost:6379/0",
        key_prefix="chat_history:",
        ttl=60*60*24*7,  # TTL in seconds: 7 days
    )
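This assumes the langchain-community and redis packages are installed and a Redis server is reachable at the URL above:
pip install langchain-community redis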
📦 Packaging & Distribution
- CLI command (pipx): add [project.scripts] day7 = "day7_chat_assistant:main" to pyproject.toml (see the sketch after this list)
- PyInstaller: pyinstaller --onefile day7_chat_assistant.py
- Docker: containerized deployment (see the Dockerfile sketch after this list)
- Streamlit Cloud: one-click hosting
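A minimal pyproject.toml sketch for the pipx route; the project name, dependency list, and Python requirement are assumptions to adapt to your setup:
# pyproject.toml (sketch)
[project]
name = "day7-chat-assistant"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = ["langchain-core", "langchain-openai", "langchain-community"]

[project.scripts]
day7 = "day7_chat_assistant:main"

[build-system]
requires = ["setuptools>=68"]
build-backend = "setuptools.build_meta"
Install it with pipx install . and then run day7. For Docker, a minimal sketch that serves the FastAPI app; the base image, dependency list, and port are assumptions:
# Dockerfile (sketch)
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir fastapi uvicorn langchain-core langchain-openai langchain-community
EXPOSE 8000
CMD ["uvicorn", "day7_api:app", "--host", "0.0.0.0", "--port", "8000"]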
✅ Today's Exercises
- Switch between the three memory modes and compare whether the assistant remembers your name / job / city.
- Add "clear history" and "switch mode" buttons to the Streamlit UI.
- Add /reset and /mode endpoints to the FastAPI service and support multiple sessions.
- Replace the in-memory history with Redis and set a 7-day TTL.
- Package the CLI with pyinstaller and share it with a colleague to try.