?ingFang SC", "Hiragino Sans GB", "Microsoft YaHei UI", "Microsoft YaHei", Arial, sans-serif;font-size: 14px;letter-spacing: 0.544px;outline: 0px;visibility: visible;">DB-GPT 支持 Graph RAG 框架, 实现 TuGraph 上的知识图谱构建与检索docker pull tugraph/tugraph-runtime-centos7:latest
mkdir -p /tmp/tugraph/data && mkdir -p /tmp/tugraph/log && \
docker run -it -d -p 7001:7001 -p 7070:7070 -p 7687:7687 -p 8000:8000 -p 8888:8888 -p 8889:8889 -p 9090:9090 \
  -v /tmp/tugraph/data:/var/lib/lgraph/data \
  -v /tmp/tugraph/log:/var/log/lgraph_log \
  --name tugraph_demo tugraph/tugraph-runtime-centos7:latest /bin/bash && \
docker exec -d tugraph_demo bash /setup.sh
pip install "neo4j>=5.20.0"
Then configure the graph store in `.env`:

GRAPH_STORE_TYPE=TuGraph
TUGRAPH_HOST=127.0.0.1
TUGRAPH_PORT=7687
TUGRAPH_USERNAME=admin
TUGRAPH_PASSWORD=xxx

More tutorials: https://docs.dbgpt.site/docs/latest/cookbook/rag/graph_rag_app_develop

DB-GPT supports deploying local model services with ollama.

1. Install ollama: https://ollama.com/
2. Pull the embedding model:

ollama pull nomic-embed-text

3. Configure the ollama proxy in `.env`:

LLM_MODEL=ollama_proxyllm
PROXY_SERVER_URL=http://127.0.0.1:11434
PROXYLLM_BACKEND="qwen:0.5b"
PROXY_API_KEY=not_used
EMBEDDING_MODEL=proxy_ollama
proxy_ollama_proxy_server_url=http://127.0.0.1:11434
proxy_ollama_proxy_backend="nomic-embed-text:latest"

4. Start the server:

python dbgpt/app/dbgpt_server.py
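To sanity-check the local ollama service before (or after) starting the server, you can hit its HTTP API directly. This is a quick check outside DB-GPT; it assumes the chat backend has also been pulled (e.g. `ollama pull qwen:0.5b`) and uses the same URL and model names as the configuration above.

```python
# check_ollama.py - verify the local ollama HTTP API answers for both configured backends
import requests

OLLAMA_URL = "http://127.0.0.1:11434"  # same value as PROXY_SERVER_URL

# Chat backend configured as PROXYLLM_BACKEND
gen = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "qwen:0.5b", "prompt": "Say hello", "stream": False},
    timeout=120,
)
print(gen.json().get("response"))

# Embedding backend configured as proxy_ollama_proxy_backend
emb = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": "nomic-embed-text:latest", "prompt": "hello world"},
    timeout=120,
)
print(len(emb.json().get("embedding", [])), "embedding dimensions")
```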
ingFang SC", "Hiragino Sans GB", "Microsoft YaHei UI", "Microsoft YaHei", Arial, sans-serif;font-size: 11.2px;margin: 0.5em 8px;color: rgb(63, 63, 63);">
- A more flexible Profile module implementation that supports creating agent profiles from environment variables, databases, and other sources
- Support for multiple memory modes: sensory memory, short-term memory, long-term memory, and hybrid memory
- Multiple Resource types for agents, including database, knowledge, tool, pack, and more; a collection of resources is itself a special resource type called a Resource Pack (see the conceptual sketch after this list)
- Support for installing resources from dbgpts, for example installing a simple calculator tool with: dbgpt app install simple-calculator-example -U
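The sketch below only illustrates the composite idea behind a Resource Pack, i.e. a collection of resources that can be used anywhere a single resource can; the class names are made up for illustration and are not DB-GPT's actual Resource API.

```python
# Conceptual illustration only: NOT DB-GPT's real Resource classes.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Resource:
    """A single capability an agent can use (database, knowledge, tool, ...)."""
    name: str
    kind: str  # e.g. "database", "knowledge", "tool"

    def describe(self) -> str:
        return f"{self.kind}:{self.name}"


@dataclass
class ResourcePack(Resource):
    """A collection of resources that is itself usable wherever a Resource is expected."""
    kind: str = "pack"
    resources: List[Resource] = field(default_factory=list)

    def describe(self) -> str:
        inner = ", ".join(r.describe() for r in self.resources)
        return f"pack:{self.name}[{inner}]"


pack = ResourcePack(
    name="analyst_pack",
    resources=[
        Resource(name="orders_db", kind="database"),
        Resource(name="pricing_faq", kind="knowledge"),
        Resource(name="simple-calculator-example", kind="tool"),
    ],
)
print(pack.describe())  # pack:analyst_pack[database:orders_db, knowledge:pricing_faq, tool:simple-calculator-example]
```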
ChatKnowledge now supports rerank models, and a rerank model can also be published as a service. Configure it in `.env`:

## Rerank model
RERANK_MODEL=bge-reranker-base
## If you do not set RERANK_MODEL_PATH, DB-GPT will read the model path from EMBEDDING_MODEL_CONFIG based on RERANK_MODEL.
# RERANK_MODEL_PATH=
## The number of rerank results to return
RERANK_TOP_K=3

To serve the rerank model, first start the model controller:

dbgpt start controller --port 8000
Then start a rerank worker and register it with the controller:

dbgpt start worker --worker_type text2vec \
  --rerank \
  --model_path /app/models/bge-reranker-base \
  --model_name bge-reranker-base \
  --port 8004 \
  --controller_addr http://127.0.0.1:8000
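For intuition about what the rerank step does with retrieved chunks, the snippet below scores query/passage pairs with a cross-encoder via sentence-transformers and keeps the top RERANK_TOP_K. It is not DB-GPT's internal code path, and it assumes the model is available under the Hugging Face id BAAI/bge-reranker-base (substitute a local path such as the one above if you have downloaded it).

```python
# rerank_demo.py - illustrate cross-encoder reranking of retrieved passages
from sentence_transformers import CrossEncoder

query = "How do I deploy DB-GPT with a local ollama model?"
candidates = [
    "DB-GPT supports deploying local model services with ollama.",
    "TuGraph is a graph database used by the Graph RAG framework.",
    "The rerank model scores query/passage pairs and reorders retrieval results.",
]

# Assumed Hugging Face id; a local directory like /app/models/bge-reranker-base also works.
model = CrossEncoder("BAAI/bge-reranker-base", max_length=512)
scores = model.predict([(query, passage) for passage in candidates])

top_k = 3  # mirrors RERANK_TOP_K
ranked = sorted(zip(candidates, scores), key=lambda x: x[1], reverse=True)[:top_k]
for passage, score in ranked:
    print(f"{score:.4f}  {passage}")
```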
DB-GPT also supports DeepSeek proxy models. Configure them in `.env`:

LLM_MODEL=deepseek_proxyllm
DEEPSEEK_MODEL_VERSION=deepseek-chat
DEEPSEEK_API_BASE=https://api.deepseek.com/v1
DEEPSEEK_API_KEY={your-deepseek-api-key}

You can also call DeepseekLLMClient directly. Create test_proxyllm.py:

import asyncio
from dbgpt.core import ModelRequest
from dbgpt.model.proxy import DeepseekLLMClient

# You should set DEEPSEEK_API_KEY in your environment variables
client = DeepseekLLMClient()
print(asyncio.run(client.generate(ModelRequest._build("deepseek-chat", "你是谁?"))))

Then run it:

DEEPSEEK_API_KEY={your-deepseek-api-key} python test_proxyllm.py
Support for the Yi-1.5 series models is also included. Set the corresponding model in `.env`:

# [Yi-1.5-34B-Chat](https://huggingface.co/01-ai/Yi-1.5-34B-Chat)
LLM_MODEL=yi-1.5-34b-chat
# [Yi-1.5-9B-Chat](https://huggingface.co/01-ai/Yi-1.5-9B-Chat)
LLM_MODEL=yi-1.5-9b-chat
# [Yi-1.5-6B-Chat](https://huggingface.co/01-ai/Yi-1.5-6B-Chat)
LLM_MODEL=yi-1.5-6b-chat
- Fixed an issue when using CrossEncoderRanker with EmbeddingRetriever
- Support the Milvus autoflush feature, replacing manual flush calls